Zero-Shot Prompting
Giving an LLM a task instruction without any examples, relying entirely on the model's pre-trained knowledge and instruction-following ability to perform the task.
Why It Matters
Zero-shot prompting is the simplest approach and the default starting point. If it works well enough, there is no need for more complex techniques.
Example
Simply asking: "Classify this review as positive or negative: The food was amazing but the service was terrible." No examples are given, just the instruction.
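The example above can be sketched in code. This is a minimal, illustrative sketch: the helper name `build_zero_shot_prompt` is an assumption for illustration, not any library's API. The point is simply that the prompt contains only the instruction and the input, with no demonstrations.

```python
# Zero-shot prompt construction: instruction + input, no examples.
# The helper name is hypothetical, for illustration only.

def build_zero_shot_prompt(instruction: str, text: str) -> str:
    """Combine a bare task instruction with the input text."""
    return f"{instruction}\n\n{text}"

prompt = build_zero_shot_prompt(
    "Classify this review as positive or negative:",
    "The food was amazing but the service was terrible.",
)
print(prompt)
```

The resulting string would be sent as-is to an LLM; whether the model succeeds depends entirely on its pre-trained knowledge and instruction-following ability.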
Think of it like...
Like asking a knowledgeable person a question without any preamble — if they know the topic well, they can answer correctly without examples.
Related Terms
Zero-Shot Learning
A model's ability to perform a task it was never explicitly trained on or shown examples of. The model applies its general knowledge and reasoning to handle entirely new task types.
Prompt Engineering
The practice of designing and optimizing input prompts to get the best possible output from AI models. It involves crafting instructions, providing examples, and structuring queries to guide the model toward desired responses.
Few-Shot Prompting
A prompt engineering technique where a small number of input-output examples are provided before the actual query, demonstrating the desired format and behavior to the model.
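To make the contrast with zero-shot concrete, here is a hedged sketch of few-shot prompt construction. The function name, the `Input:`/`Output:` formatting, and the example reviews are all assumptions chosen for illustration; real few-shot prompts vary in format.

```python
# Few-shot prompt construction: a few labeled input/output examples
# are prepended before the actual query. Names and format are
# illustrative assumptions, not a specific library's API.

def build_few_shot_prompt(
    instruction: str,
    examples: list[tuple[str, str]],
    query: str,
) -> str:
    """Prepend demonstration pairs, then leave the final Output: blank."""
    demos = "\n\n".join(f"Input: {x}\nOutput: {y}" for x, y in examples)
    return f"{instruction}\n\n{demos}\n\nInput: {query}\nOutput:"

prompt = build_few_shot_prompt(
    "Classify the review as positive or negative.",
    [
        ("Great pizza and friendly staff!", "positive"),
        ("Cold food and a long wait.", "negative"),
    ],
    "The food was amazing but the service was terrible.",
)
print(prompt)
```

Compared with the zero-shot version, the examples demonstrate the desired output format and labels, which the model is expected to imitate when completing the final `Output:`.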
Instruction Following
An LLM's ability to accurately understand and execute user instructions, including complex multi-step directives with specific constraints on format, tone, length, and content.
Large Language Model
A type of AI model trained on massive amounts of text data that can understand and generate human-like text. LLMs use transformer architecture and typically have billions of parameters, enabling them to perform a wide range of language tasks.