Prompt Engineering in Plain English

A Brief Introduction to Crafting Perfect Prompts

In partnership with

Automate Prospecting Local Businesses With Our AI BDR

Struggling to identify local prospects? Our AI BDR Ava taps into a database of 200M+ local Google businesses and does fully autonomous outreach—so you can focus on closing deals, not chasing leads.

Ava operates within the Artisan platform, which consolidates every tool you need for outbound:

  • 300M+ High-Quality B2B Prospects

  • Automated Lead Enrichment With 10+ Data Sources Included

  • Full Email Deliverability Management

  • Personalization Waterfall using LinkedIn, Twitter, Web Scraping & More

Hello everyone and welcome to my newsletter where I discuss real-world skills needed for the top data jobs. 👏

This week I’m discussing the very basics of prompts.  👀

Not a subscriber? Join the informed. Over 200K people read my content monthly. 

The actual number was 250K last month, and since February of 2017 I’ve amassed 73.6 million views across multiple platforms. I’m not trying to flex here, but that puts me in the top 1% of all authors in the world of artificial intelligence. Join the enlightened. 🤓

Thank you. 🎉

Almost all modern artificial intelligence is machine learning. A machine learning model is nothing more than computer code that looks through data to find patterns. In this article we will be discussing a specific type of deep learning model called a large language model (LLM).

An LLM is a machine learning model that reads a lot of text, learns patterns in language, and uses that knowledge to write or answer questions like a person would.

It’s called large because it’s trained on huge amounts of data and has billions of parameters (which are like adjustable settings in its brain). There are different vendor flavors of large language models.

  • Google has Gemini

  • Anthropic has Claude

  • OpenAI has GPT (best known through ChatGPT)

A large language model (LLM) is a machine learning model that ingests text, learns patterns in language, and uses that knowledge to output a response to an input.

You interact with these models using a prompt. I’ll be using ChatGPT for this article. Let’s ask ChatGPT what a prompt is: in the ChatGPT interface, the text box at the very bottom of the screen is the prompt window. I typed “what is a prompt” and the model returned its own definition.

Used by Execs at Google and OpenAI

Join 400,000+ professionals who rely on The AI Report to work smarter with AI.

Delivered daily, it breaks down tools, prompts, and real use cases—so you can implement AI without wasting time.

If they’re reading it, why aren’t you?

Prompt engineering is the practice of crafting precise prompts to help large language models correctly respond to questions and perform a wide range of tasks. This practice improves the model's ability to produce accurate and relevant responses.


Prompts dictate the quality of outputs from these LLMs. Creating solid prompts that yield relevant and usable results is the key to using generative AI successfully. Refining prompt engineering techniques helps generative AI systems learn from diverse data, minimize biases, reduce confusion, and produce accurate responses.

Prompt engineers craft queries that help large language models grasp the language, nuance, and intent behind a prompt. A well-crafted, thorough prompt significantly influences the quality of AI-generated content—whether it’s images, code, data summaries, or text.


Prompt engineers fine-tune prompts to improve the quality and relevance of model outputs, addressing both specific and general needs. This process reduces the need for manual review. In the next section let’s discuss the types of prompts.

Zero-Shot Prompting

This involves giving the model a direct task without providing any examples or context. There are several ways to use this method:

  • Question: This asks for a specific answer and is useful for obtaining straightforward, factual responses. Example: What is today’s date?

  • Instruction: This directs the AI to perform a particular task or provide information in a specific format. It’s effective for generating structured responses or completing defined tasks. Example: List the five most significant trends in large language models in 2025.

The success of zero-shot prompting depends on how well the model’s training covers the task, as well as on the complexity of the task itself.

Consider this example: Explain the rise of large language models over the past decade.

It’s possible the generated response will be thousands of words—too long and broad to be useful if you only need a single sentence. If that’s the case, it’s time to refine the approach with one-shot or few-shot prompting.
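When a refinement is that mechanical, you can even build the prompt in code. A minimal sketch in Python (the `zero_shot_prompt` helper is my own illustration, not part of any vendor’s API):

```python
def zero_shot_prompt(task, constraint=""):
    """Build a zero-shot prompt: the task alone, optionally followed by an output constraint."""
    return f"{task}\n\n{constraint}" if constraint else task

prompt = zero_shot_prompt(
    "Explain the rise of large language models over the past decade.",
    constraint="Answer in a single sentence.",
)
print(prompt)
```

The constraint line is where you encode the refinement, so you can tighten or loosen the output without rewriting the task.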

One-Shot Prompting

This provides a single example to illustrate the desired response format or style, helping guide the model more efficiently than zero-shot prompting.

Consider this example: Write a formal follow-up email to a client, using this example as a guide: “Thank you for meeting with us today. We discussed X and Y, and we believe moving forward will deliver Z. Could we target a signed contract by the end of the month?”

The single example shows the model the tone, the content to cover, and the next step to suggest. With that one reference point, it can create a polished response with no further explanation.

Few-Shot Prompting

This approach offers multiple examples to the model, enhancing its understanding of the task and expected output. It’s particularly useful for more complex queries or generating nuanced responses.

Consider this example: Classify the sentiment of the following movie reviews as Positive or Negative.

Review: This movie was fantastic! The plot was engaging and the acting was top-notch.

Sentiment: Positive

Review: I hated this film. It was boring and too long.

Sentiment: Negative

Review: The cinematography was beautiful, but the story didn’t make much sense.

Sentiment: Negative
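Prompts like this follow a regular enough structure to assemble in code. Here is a sketch (the `few_shot_prompt` helper is my own, not a library function; I leave the third review’s label blank so the model completes it, and with a single example pair the same pattern serves as one-shot prompting):

```python
def few_shot_prompt(instruction, examples, query):
    """Build a few-shot prompt: instruction, labeled examples, then an unlabeled query."""
    lines = [instruction, ""]
    for review, sentiment in examples:
        lines += [f"Review: {review}", f"Sentiment: {sentiment}", ""]
    # The final label is left blank for the model to fill in.
    lines += [f"Review: {query}", "Sentiment:"]
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Classify the sentiment of the following movie reviews as Positive or Negative.",
    [
        ("This movie was fantastic! The plot was engaging and the acting was top-notch.", "Positive"),
        ("I hated this film. It was boring and too long.", "Negative"),
    ],
    "The cinematography was beautiful, but the story didn’t make much sense.",
)
print(prompt)
```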

Prompt engineering techniques

Advanced prompting techniques help generative AI tools tackle complex tasks more successfully. Prompt engineers employ the following techniques for speed and efficiency.

Self-Ask Prompting

Use this when facing a complex, multi-faceted query that benefits from breaking the problem into smaller parts. Instruct the model to dissect the question into several sub-questions, address each part individually, and combine the insights into a more thorough answer.

Before Example:

Should I pursue a master’s degree in data science? (no)

After Example:

Should I pursue a master’s degree in data science? First, list and answer sub-questions on career prospects, financial implications, and current market demand. Then, based on these answers, provide a final recommendation. (still no)
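The before/after pattern above is formulaic enough to template. A sketch (the `self_ask_prompt` helper and its argument names are my own illustration):

```python
def self_ask_prompt(question, sub_topics):
    """Wrap a question with instructions to answer sub-questions first, then synthesize."""
    return (
        f"{question} First, list and answer sub-questions on "
        f"{', '.join(sub_topics)}. "
        "Then, based on these answers, provide a final recommendation."
    )

prompt = self_ask_prompt(
    "Should I pursue a master’s degree in data science?",
    ["career prospects", "financial implications", "current market demand"],
)
print(prompt)
```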

Chain-of-Thought (CoT) Prompting

Use this for queries requiring clear step-by-step reasoning, such as math, logic problems, or decisions with multiple variables. Explicitly instruct the model to think step by step as it works through the problem, so that each reasoning step is transparent and verifiable.

Before Example:

Calculate the total cost of a meal with a discount and tax.

After Example:

What is the total cost of a meal after applying a 10% discount followed by a 7% tax? Let’s think through each step carefully, starting with the discount calculation, then applying the tax.
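One benefit of asking for the steps explicitly is that you can verify them yourself. Here is the arithmetic the model’s chain of thought should reproduce, sketched in Python (the $50.00 base price is an assumed figure for illustration):

```python
def meal_total(base_price, discount=0.10, tax=0.07):
    """Step 1: apply the 10% discount; Step 2: apply 7% tax to the discounted price."""
    discounted = base_price * (1 - discount)  # e.g. 50.00 -> 45.00
    return round(discounted * (1 + tax), 2)   # e.g. 45.00 -> 48.15

print(meal_total(50.00))  # 48.15
```

If the model’s step-by-step answer disagrees with this calculation, you know exactly which step went wrong.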

Step-back Prompting

Use this for tasks that involve broad analysis followed by a focused recommendation or conclusion. Begin with a general inquiry to gather foundational insights, then narrow the scope with a follow-up question. This layered approach lets the model assess the overall situation before addressing a specific angle.

Before Example:

List the key factors for market expansion.

After Example:

Explain the key factors influencing market expansion for a business. Based on this analysis, should a tech firm consider expanding into Europe? Provide reasons for your recommendation.

Self-Consistency

Use this when a query could yield multiple plausible answers and consistency is critical. Ask the model to generate several responses and choose the one that appears most frequently; this reduces randomness and improves accuracy, especially when the answer is not immediately clear.

Before Example:

What is the most popular programming language for machine learning?

After Example:

What is the most popular programming language for machine learning? Generate five distinct answers and then select the answer that appears most often.
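The final selection step is just a majority vote over the sampled answers. A sketch in Python (the sample list is a hard-coded stand-in for five real model responses):

```python
from collections import Counter

def most_consistent(answers):
    """Return the answer that appears most often across sampled responses."""
    return Counter(answers).most_common(1)[0][0]

# Stand-ins for five independently sampled model answers.
samples = ["Python", "Python", "R", "Python", "Julia"]
print(most_consistent(samples))  # Python
```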

Thread-of-Thought (ThoT)


Use this for complex, context-rich scenarios where the model must navigate multiple layers of reasoning. Guide the model through the problem by asking it to break the analysis into manageable parts, and request a detailed walkthrough of its reasoning so each step connects logically to the next.

Before Example:

How many guests can attend the party based on their music preferences?

After Example:

Given a party where each guest has a unique music preference and only a limited number of genres can be played, walk me through the analysis in manageable steps. Determine the maximum number of guests that can be satisfied with their preferred music.

Benefits of prompt engineering

One of the main advantages of prompt engineering is the minimal revision and effort required after generating outputs. AI-powered results can vary in quality, often needing expert review and rework. However, well-written prompts help ensure the model output reflects the original intent, cutting down on extensive after-processing work.

Other notable benefits of prompt engineering include:

  • Efficiency in long-term AI interactions, as AI evolves through continued use

  • Innovative use of AI that goes beyond its original design and purpose

  • Future-proofing as AI systems increase in size and complexity

How Does Prompt Engineering Improve Systems?

Effective prompt engineering makes generative AI systems smarter by combining technical knowledge with a deep understanding of natural language, vocabulary, and context to yield usable outputs that require minimal revisions.

The foundation models that power generative AI are large language models (LLMs) built on transformer architectures: deep learning models that process all of the input at once instead of token by token in sequence, which makes them especially effective for tasks like language translation and text generation. The model’s parameters encode the language patterns and knowledge the rest of the AI system builds on.

Thanks and have a great day. 👏

There is only one entry level data role and it’s the data analyst.

I’ve created the courses, study guides and a GPT for preparing for this role. All you need to do is follow directions.

STEP 1:

  • Take the Transact-SQL course.

  • Download the must-know interview questions for data roles study guide and fill it in. (Included in the course)

  • Open the custom GPT and learn all the top SQL interview questions. A custom GPT is an assistant I built inside ChatGPT to help you ace the SQL Server interview. (Included in the course)

  • SQL is the single most important skill for all the data roles. Spend a lot of time here.

STEP 2: 

  • Learn Power BI. Take the Power BI course.

  • Study the exam cram guides included in the course for PL-300.

  • Use the interview prep sheet for acing the interview. Attain the PL-300; that’s the Power BI certification. I know it works because almost every one of my students has passed the exam on the first try.

  • You’ll want to learn the most recent versions, with a focus on the Fabric implementation.

STEP 3: 

  • Learn Fabric. It’s Microsoft’s new data-centric cloud ecosystem.

  • Take my Fabric course and pass DP-600. That’s the Fabric certification. Yes, I’ve included the cheat sheet for that certification too; it’s at the end of the Fabric course. Study it and you’ll pass the exam, I promise.

🥳 Here’s a code to receive $20 off the first month. Yep, that means the first month is only $30. This price includes:

  • All the courses

  • The study guides

  • The GPT

  • The exam crams

CODE: MEMDAY25

Take control of your future. Nobody else is going to. If you don’t fail often, you’re doing something wrong. 👏