Meta-Learning
An approach in which models are trained across many tasks so they can quickly adapt to a new task with minimal data. Also called "learning to learn."
Why It Matters
Meta-learning produces models that adapt to new domains with just a few examples, dramatically reducing the data and compute needed for new applications.
Example
A model trained on 1,000 different classification tasks can then learn a completely new classification task from just 5 examples, because it has learned how to learn.
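The inner-loop/outer-loop structure behind this can be sketched in a few lines. This is a minimal toy, assuming an invented task family (1-D regression with a random slope) and made-up learning rates; the meta-update shown is the simple first-order, Reptile-style variant, not full MAML.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    # Hypothetical task family: 1-D regression y = slope * x,
    # with a fresh slope drawn for every task.
    slope = rng.uniform(1.5, 2.5)
    x = rng.uniform(-1.0, 1.0, size=10)
    return x, slope * x

def grad(w, x, y):
    # Gradient of the mean-squared error of the model y_hat = w * x.
    return np.mean(2.0 * (w * x - y) * x)

def adapt(w, x, y, lr=0.1, steps=5):
    # Inner loop: a few gradient steps specialise w to one task.
    for _ in range(steps):
        w = w - lr * grad(w, x, y)
    return w

# Outer loop (first-order, Reptile-style meta-update): nudge the shared
# initialisation toward each task's adapted weights, so that future
# tasks can be learned in just a few gradient steps.
w_meta = 0.0
for _ in range(2000):
    x, y = sample_task()
    w_adapted = adapt(w_meta, x, y, steps=1)
    w_meta += 0.5 * (w_adapted - w_meta)
```

After meta-training, calling `adapt(w_meta, x, y)` on a handful of examples from an unseen task reaches a far lower error than adapting from a random start: the "learned how to learn" effect in miniature.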
Think of it like...
Like a seasoned traveler who can quickly adapt to any new country because they have developed general skills for navigating unfamiliar environments.
Related Terms
Few-Shot Learning
A technique where a model learns to perform a task from only a few examples provided in the prompt. Instead of training on thousands of examples, the model generalizes from just 2-5 demonstrations.
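The "few examples in the prompt" idea can be made concrete with an illustrative prompt; the reviews, labels, and formatting here are invented for demonstration and carry no special meaning to any particular model.

```python
# Hypothetical few-shot prompt: three labelled demonstrations teach the
# task in-context, and the model is expected to continue the pattern
# for the final, unlabelled example.
prompt = """Classify the sentiment of each review as positive or negative.

Review: "Loved every minute of it." -> positive
Review: "A complete waste of time." -> negative
Review: "The soundtrack alone is worth the ticket." -> positive

Review: "I walked out halfway through." ->"""
```

Sent to a capable instruction-following model, a prompt like this would typically be completed with "negative", even though the model was never fine-tuned on this labelling task.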
Transfer Learning
A technique where a model trained on one task is repurposed as the starting point for a model on a different but related task. Instead of training from scratch, you leverage knowledge the model has already acquired.
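A minimal numeric sketch of that starting-point idea, assuming an invented source task (y = 3x + 1) with plenty of data and a related target task (y = 3.2x + 0.9) with only 10 examples; the data, model, and learning rates are all toy choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit(w, b, x, y, lr=0.1, steps=50):
    # Plain gradient descent on mean-squared error for y_hat = w*x + b.
    for _ in range(steps):
        err = w * x + b - y
        w -= lr * np.mean(2.0 * err * x)
        b -= lr * np.mean(2.0 * err)
    return w, b

# "Pretraining": fit the model on a data-rich source task.
x_src = rng.uniform(-1.0, 1.0, size=200)
y_src = 3.0 * x_src + 1.0
w_pre, b_pre = fit(0.0, 0.0, x_src, y_src, steps=500)

# Transfer: the related target task has only 10 examples, so start
# from the pretrained weights instead of from scratch and fine-tune
# for just a few steps.
x_tgt = rng.uniform(-1.0, 1.0, size=10)
y_tgt = 3.2 * x_tgt + 0.9
w_tl, b_tl = fit(w_pre, b_pre, x_tgt, y_tgt, steps=20)
```

With the same 20 fine-tuning steps, starting from the pretrained weights lands far closer to the target task than starting from zero, which is the whole payoff of transfer learning.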