Promptpack: Beyond the prompt, towards AI conversations
Socrates can teach us how to use AI
Hi,
"AI Prompt Engineering is Dead," declares one recent headline, arguing that LLMs are far better at optimising their own prompts than humans are. While AI can indeed help craft better prompts, we should not be misled by such dramatic claims. Learning how to build a conversation with AI, or what I call "conversation engineering", will remain a crucial skill for many high-quality uses of AI.
To dig deeper into this and explain the theory and practice behind conversation engineering, we partnered with genAI advisor and researcher Matteo Castiello on today's Promptpack.
You will learn…
Why conversations with AI (still) matter
How to strategically design your conversations with AI to get better outcomes
Example prompts showing how you can implement the theory.
Let's get into it!
Human-AI conversations matter more than ever
Some argue that LLMs are increasingly effective at creating and optimising their own prompts, which makes "manual" prompt engineering a skill of the past. In addition, the rise of agentic workflows can be seen as a step away from manual prompting.
Yet leading AI companies like Anthropic are still hiring prompt engineers and paying them handsomely, with salaries above $250,000. One reason for this ongoing demand is that a prompt engineer's job goes beyond designing great starting prompts. It involves thinking about the entire flow of direct interactions between human and machine: a directed conversation with an end goal in mind.
To master the art of using AI, we must think beyond prompt engineering and towards conversation engineering. This skill matters because it is essential for some of the most valuable AI applications: what we call "second brain" tasks. And as AI becomes more and more ubiquitous, learning how to talk to different AIs (10 hours of conversation with each model, as called for in a previous conversation) is crucial, especially as we don't yet know through which modalities it will be present in the future.
Using AI as a second brain through conversation engineering
For certain tasks, using AI or another application to optimise your prompts works incredibly well, as we showcased in a previous Promptpack. In these tasks, the main objective is often to get things done faster: AI acts as a second pair of hands.
However, what is often overlooked is how AI can serve as a second brain: helping us not just to do things faster, but to do them better and to improve our own thinking.
This category contains three broad and often overlapping types of activities: creative exploration, strategic planning, and enhanced communication.
The interaction is an integral part of conducting these "second brain" activities. In other words, going through the process of a conversation (rather than one-shot prompting) is often necessary to get high-quality outputs from an LLM, especially by leveraging some of the techniques we'll showcase below. The activities are multi-step reasoning tasks in and of themselves, but there is something more: research suggests that LLMs generate better output when they have time to "think", the opportunity to iterate, and an understanding of the context of a task.
For example, brainstorming inherently involves some level of exchange; scenario planning requires carefully building up the possible futures to be examined; hiring a new employee involves multiple interlinked steps, each benefiting from the interaction and input of both human and AI.
A bonus: when using AI as a thought partner, we too may reach a better understanding of what we are doing, through additional exploration and by having to refine and define precisely what we need.
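For readers who work with LLMs through an API, here is a minimal sketch of the difference between one-shot prompting and a directed, multi-turn conversation. It uses the OpenAI Python SDK purely as an illustration; the model name, system prompt and helper function are placeholder assumptions, and any chat-style API that accepts a message history would work the same way.

```python
# Minimal sketch: one-shot prompting vs. a directed, multi-turn conversation.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o"  # placeholder; swap in whichever model you actually use

# One-shot prompting: a single, self-contained request.
one_shot = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user",
               "content": "Give me five names for a new coffee brand."}],
)
print(one_shot.choices[0].message.content)

# Conversation engineering: keep the full history and build on it turn by turn,
# adding context, reacting to the model's output, and steering towards a goal.
history = [
    {"role": "system",
     "content": "You are a brainstorming partner. Ask clarifying questions "
                "before proposing final answers."},
]

def converse(user_message: str) -> str:
    """Send one turn and append both sides to the shared history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model=MODEL, messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

# A directed conversation: explore, refine, then converge on an outcome.
print(converse("I'm naming a new coffee brand aimed at remote workers."))
print(converse("I like the playful directions. Push those further."))
print(converse("Now narrow it down to three names and explain the trade-offs."))
```

The point is not the specific API but the pattern: each turn carries the accumulated context, so the model can iterate on earlier ideas instead of answering from scratch.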
Designing an AI conversation
We combined insights from the emerging literature and our own experience with LLMs to identify three tactical decisions to make as you use AI as a second brain. Examples are included below.
Step 1: Outcome dependency. Think about your desired outcome. Are you in exploration or exploitation mode?