Zero-Shot Learning
Asking a model to perform a task without providing examples—just an instruction.
Zero-shot learning is asking a model to perform a task without giving it any examples of that task. No training, no fine-tuning, no demonstrations in the prompt: just an instruction. "Translate this to Spanish" and "Write a haiku" are zero-shot requests.
Zero-shot works because pre-trained models learn broad capabilities from their training data. A model trained on diverse text has seen translations and poems before, so it can generalize to new cases.
Zero-shot is convenient but often less accurate than few-shot: adding a few examples to the prompt usually improves performance, especially when the task calls for a specific output format or style. Modern language models handle zero-shot requests much better than earlier ones because they are trained on more diverse data and fine-tuned with human feedback to follow instructions.
Both zero-shot and few-shot prompting are forms of in-context learning: the model's weights are never updated, so no retraining is involved.
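As a concrete sketch, the difference between the two styles is entirely in how the prompt is built (the sentiment-classification task and prompt wording here are illustrative, not from any particular API):

```python
def zero_shot_prompt(review: str) -> str:
    # Zero-shot: instruction only, no demonstrations.
    return (
        "Classify the sentiment of this review as positive or negative.\n"
        f"Review: {review}\nSentiment:"
    )

def few_shot_prompt(review: str, examples: list[tuple[str, str]]) -> str:
    # Few-shot: the same instruction, preceded by labeled demonstrations.
    demos = "\n".join(
        f"Review: {text}\nSentiment: {label}" for text, label in examples
    )
    return (
        "Classify the sentiment of this review as positive or negative.\n"
        f"{demos}\nReview: {review}\nSentiment:"
    )

examples = [
    ("Loved every minute.", "positive"),
    ("A total waste of time.", "negative"),
]
print(zero_shot_prompt("The plot dragged badly."))
print(few_shot_prompt("The plot dragged badly.", examples))
```

Either string would be sent to the model as-is; the few-shot version simply spends extra prompt tokens on demonstrations in exchange for (usually) better accuracy.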
Example
Zero-shot: "Summarize this article in one sentence." A few-shot version of the same request would first show the model a few examples of good summaries.