Few-Shot Learning
Providing a few examples in your prompt to show the AI the pattern you want — dramatically improving output quality for specific formats or styles.
Few-shot learning is the technique of including examples in your prompt before asking the AI to generate new output. Instead of explaining the pattern abstractly, you show it: 'Here are 3 examples of product descriptions in our brand voice. Now write one for this new product.'
This is one of the simplest and most effective prompt engineering techniques. It works because LLMs are exceptional pattern matchers — given a few examples, they extract the underlying format, tone, length, and style, then replicate it for new inputs.
The 'few' typically means 2-5 examples. One example (one-shot) often works too, and zero-shot means no examples at all. More examples generally improve consistency, but they also use more tokens — which costs more and leaves less of the context window for everything else.
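A minimal sketch of what this looks like in practice: a helper that assembles a few-shot prompt from a list of examples plus a new input. The example data, function name, and prompt wording here are all illustrative assumptions — the actual LLM call is omitted, since any chat API would accept the resulting string.

```python
# Hypothetical brand-voice examples; in practice these would be real,
# hand-picked outputs in the exact format and tone you want replicated.
EXAMPLES = [
    {"product": "TrailBlaze hiking boots",
     "description": "Built for mud, rain, and every mile in between."},
    {"product": "SummitPro trekking poles",
     "description": "Light in your pack, steady on the scree."},
    {"product": "BaseCamp insulated mug",
     "description": "Keeps coffee hot from first light to last switchback."},
]

def build_few_shot_prompt(examples, new_product):
    """Assemble a prompt: instruction, then each example, then the new input."""
    parts = ["Write a product description in our brand voice.", ""]
    for i, ex in enumerate(examples, start=1):
        parts.append(f"Example {i}:")
        parts.append(f"Product: {ex['product']}")
        parts.append(f"Description: {ex['description']}")
        parts.append("")
    # End with the new product and a trailing label, so the model's natural
    # continuation is the description itself.
    parts.append(f"Product: {new_product}")
    parts.append("Description:")
    return "\n".join(parts)

prompt = build_few_shot_prompt(EXAMPLES, "RidgeLine rain shell")
```

Ending the prompt mid-pattern (`Description:` with nothing after it) is the key move: the model completes the pattern it has just seen three times, rather than improvising a format of its own.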
Real-World Example
Including 3 examples of your desired output format in a prompt is few-shot learning — the AI recognizes the pattern and replicates it far more reliably than if you just described what you wanted.
FAQ
What concepts are related to Few-Shot Learning?
Key related concepts include Zero-Shot Learning, Prompt Engineering, Prompt, Token. Understanding these together gives a more complete picture of how Few-Shot Learning fits into the AI landscape.