Mastering Prompt Engineering with OpenAI’s API: How to Create an Effective Prompt

Caleb · Published in GoPenAI · Nov 6, 2023

In the rapidly advancing world of artificial intelligence, prompt engineering emerges as a pivotal skill for tapping into the potential of AI models, particularly those developed by OpenAI.

Prompt engineering isn’t about merely asking questions; it’s an art form that combines precision, foresight, and a deep understanding of how AI thinks and responds.

Let’s dive into some best practices.


Embrace the Vanguard of Innovation: Choose the Latest Model

Starting with the newest model available is akin to choosing the sharpest tool in the shed.

Newer models generally follow instructions more faithfully and produce higher-quality output, so reach for the most recent one first and only step down if cost or latency becomes a concern.
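
As a minimal sketch using the openai Python package (v1 or later), assuming an OPENAI_API_KEY is set in the environment; the model name here is only an example, so check OpenAI's model list for the newest option:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # swap in the newest model available to your account
    messages=[{"role": "user", "content": "Explain prompt engineering in one sentence."}],
)
print(response.choices[0].message.content)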

Structure is Key: Instructions First

Clarity and structure guide AI towards producing the desired outcome.

Placing instructions at the very beginning of your prompt and then providing context ensures that the AI can distinguish between the two, leading to more precise results.

Utilize separators like “###” or triple quotes (""") to denote where the instructions end and the context begins.

For example:

Translate the text below into German. // instructions

Text: ###
{your text} // context
###
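
In code, the same instructions-first pattern could be assembled like this (a sketch using the openai v1 client as above; the input text is a placeholder):

from openai import OpenAI

client = OpenAI()

user_text = "The weather is lovely today."  # placeholder context

prompt = (
    "Translate the text below into German.\n\n"  # instructions first
    "Text: ###\n"
    f"{user_text}\n"  # context, fenced by the ### separators
    "###"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)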

Specificity is Your Ally: Define Your Expectations

Ambiguity is the nemesis of precision in prompt engineering.

Being explicit about what you need, including the context, outcome, length, format, style, and more, is crucial.

Detailed prompts guide the AI much more reliably than vague ones.

Imagine the difference between asking:

Write a song about {...}
Write a short and inspiring song about {..}, focusing on {...} in the style of {...}

Show, Don’t Just Tell: Use Examples

AI, much like humans, learns better with examples.

Demonstrating the format you expect for the output can dramatically improve the accuracy and relevance of the response.

If you need to extract entities from a text, providing a structured example of how those entities should be presented will make it significantly easier for the AI to follow.

For example:

Extract the important entities mentioned in the text below. First extract all company names, then extract all people names, then extract specific topics which fit the content and finally extract general overarching themes

Desired format:
Company names: <comma_separated_list_of_company_names>
People names: -||-
Specific topics: -||-
General themes: -||-

Text: {text}
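
Programmatically, that prompt might be sent and its structured output parsed like so (a sketch; the text is a placeholder and the parsing naively assumes the model actually followed the desired format):

from openai import OpenAI

client = OpenAI()

text = "Acme Corp announced a partnership with Globex, led by CEO Jane Doe."  # placeholder

prompt = (
    "Extract the important entities mentioned in the text below. "
    "First extract all company names, then extract all people names, "
    "then extract specific topics which fit the content and finally "
    "extract general overarching themes.\n\n"
    "Desired format:\n"
    "Company names: <comma_separated_list_of_company_names>\n"
    "People names: -||-\n"
    "Specific topics: -||-\n"
    "General themes: -||-\n\n"
    f"Text: {text}"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # keep the extraction as deterministic as possible
)

# Naive parse: turn each "Label: a, b, c" line into a list of values.
entities = {}
for line in response.choices[0].message.content.splitlines():
    if ":" in line:
        label, values = line.split(":", 1)
        entities[label.strip()] = [v.strip() for v in values.split(",") if v.strip()]

print(entities)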

Begin Simple: Zero-shot to Few-shot to Fine-tuning

When approaching a new task, starting with zero-shot prompts (no examples), progressing to few-shot prompts (a couple of examples), and then moving to fine-tuning is a strategic way to gauge and guide the AI’s performance.

Each step provides the AI with a clearer understanding of the task at hand, honing its responses with each iteration.

OpenAI doc — [PUBLIC] Best practices for fine-tuning GPT-3 to classify text
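
To make the progression concrete, here is a sketch of a zero-shot and a few-shot version of a simple sentiment-classification task (the reviews and labels are invented for illustration); if neither gets you far enough, fine-tuning on a larger set of such examples is the next step:

from openai import OpenAI

client = OpenAI()

# Zero-shot: the task description alone, no examples.
zero_shot = [
    {"role": "user", "content": "Classify the sentiment of this review as positive, neutral or negative: 'The battery died after two days.'"},
]

# Few-shot: the same task, preceded by a couple of worked examples.
few_shot = [
    {"role": "user", "content": "Classify the sentiment: 'I love this phone, the camera is stunning.'"},
    {"role": "assistant", "content": "positive"},
    {"role": "user", "content": "Classify the sentiment: 'It works, nothing special.'"},
    {"role": "assistant", "content": "neutral"},
    {"role": "user", "content": "Classify the sentiment: 'The battery died after two days.'"},
]

for messages in (zero_shot, few_shot):
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    print(response.choices[0].message.content)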

Eliminate Fluff: Precision Over Prolixity

Avoid “fluffy” language that could confuse the AI.

Opt for succinct, precise descriptions over long-winded ones.

If a product description should be brief, state the exact number of sentences you expect, setting clear boundaries for the AI’s output.

// Not good
Write a short description. Not too long, a few sentences only.

// Better
Use a 5-sentence paragraph to describe {...}

Direct Positively: State What You Want, Not What You Don’t

AI responds better to positive instructions. Instead of stating what you don’t want, focus on what you want the AI to do.

This positive framing helps the AI understand the desired path without the ambiguity of negation.

The following is a conversation between an Agent and a Customer. The agent will attempt to diagnose the problem and suggest a solution, whilst refraining from asking any questions related to PII. Instead of asking for PII, such as username or password, refer the user to the help article www.samplewebsite.com/help/faq

Customer: I can’t log in to my account.
Agent:

Code-Specific Tips: Nudge with Keywords

When generating code, using “leading words” can steer the AI toward the appropriate syntax or pattern.

For example, ending a prompt for a Python function with the word “import” hints to the model that it should begin by importing the libraries it needs, setting the stage for the correct structure.

# Write a simple python function that
# 1. Asks me for a number in miles
# 2. Converts miles to kilometers

import
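
For illustration, the generated function might end up looking something like this (a hand-written sketch, not captured model output; the conversion factor 1 mile = 1.609344 km is exact):

def miles_to_kilometers():
    # 1. Ask for a number of miles
    miles = float(input("Enter a distance in miles: "))
    # 2. Convert miles to kilometers (1 mile = 1.609344 km)
    kilometers = miles * 1.609344
    print(f"{miles} miles = {kilometers:.2f} kilometers")

miles_to_kilometers()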

You can do the same with the word “SELECT” to nudge the model into starting a SQL query, for example.

Fine-Tuning Parameters: The Devil’s in the Details

Parameters like model, temperature, max_tokens, and stop sequences have significant effects on the AI’s output.

A more capable model usually comes with higher cost and latency, but for complex tasks the gain in output quality is often worth the trade-off.

Temperature affects randomness and creativity, whereas max_tokens and stop sequences help control the length and termination of the AI’s response.
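
As a sketch, these parameters sit directly on the chat completions call; the values below are arbitrary examples rather than recommendations:

from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",    # more capable models: better output, higher cost and latency
    temperature=0.2,  # lower = more focused and deterministic, higher = more creative
    max_tokens=150,   # hard cap on the length of the completion
    stop=["###"],     # generation halts as soon as this sequence would be produced
    messages=[{"role": "user", "content": "Use a 5-sentence paragraph to describe prompt engineering."}],
)
print(response.choices[0].message.content)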

Conclusion

Remember, AI is a tool shaped by human hands. Your prompts are the blueprint from which AI builds its responses. Engineer those prompts with care, and the AI will return the favor, crafting responses that carry your projects to new heights.

Enjoyed the read? For more on Web Development, JavaScript, Next.js, Cybersecurity, and Blockchain, check out my other articles.

If you have questions or feedback, don’t hesitate to reach out at caleb.pro@pm.me or in the comments section.

[Disclosure: Every article I pen is a fusion of my ideas and the supportive capabilities of artificial intelligence. While AI assists in refining and elaborating, the core thoughts and concepts stem from my perspective and knowledge. To know more about my creative process, read this article.]
