Knowledge Cutoff
The latest date covered by a language model's training data. The model cannot reliably answer questions about events that occurred after its knowledge cutoff.
Why It Matters
Knowledge cutoffs create blind spots in LLMs. RAG and web search tools exist specifically to bridge this gap and keep AI responses current.
Example
Claude's knowledge cutoff means it may not know about events from last month. When asked about recent events, it should acknowledge the limitation or use search tools.
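The behavior above can be sketched as a simple date check. This is a minimal illustration only: the cutoff date and function name are hypothetical, and real applications would route post-cutoff queries to a search or RAG step rather than just flagging them.

```python
from datetime import date

# Hypothetical cutoff date, for illustration only.
KNOWLEDGE_CUTOFF = date(2024, 4, 1)

def needs_fresh_data(event_date, cutoff=KNOWLEDGE_CUTOFF):
    """Return True when an event postdates the model's training data,
    signalling that the app should fetch current information instead
    of relying on the model's internal knowledge."""
    return event_date > cutoff

needs_fresh_data(date(2024, 6, 15))  # event after cutoff -> True
needs_fresh_data(date(2023, 1, 1))   # within training data -> False
```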
Think of it like...
Like an encyclopedia that was printed in 2024 — it contains everything up to that point but nothing that happened afterward.
Related Terms
Retrieval-Augmented Generation
A technique that enhances LLM outputs by first retrieving relevant information from external knowledge sources and then using that information as context for generation. RAG combines the power of search with the fluency of language models.
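The retrieve-then-generate flow can be sketched in a few lines. This is a toy version: the keyword-overlap retriever stands in for real embedding-based similarity search, and the prompt would normally be sent to an actual LLM rather than returned as a string.

```python
def retrieve(query, documents, k=2):
    """Rank documents by keyword overlap with the query (toy stand-in
    for embedding-based similarity search)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Prepend the retrieved passages as context for the generation step."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return (
        f"Context:\n{context}\n\n"
        f"Question: {query}\n"
        "Answer using only the context above."
    )

docs = [
    "The Eiffel Tower is 330 metres tall.",
    "Paris is the capital of France.",
    "Bananas are rich in potassium.",
]
prompt = build_prompt("How tall is the Eiffel Tower?", docs)
```

Because the answer is pulled into the prompt at query time, the model can respond about facts that postdate its knowledge cutoff.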
Training Data
The dataset used to teach a machine learning model. It contains examples (and often labels) that the model learns patterns from during the training process. The quality and quantity of training data directly impact model performance.
Hallucination
When an AI model generates information that sounds plausible and confident but is factually incorrect, fabricated, or not grounded in its training data or provided context. The model essentially 'makes things up'.
Grounding
The practice of connecting AI model outputs to verifiable sources of information, ensuring responses are based on factual data rather than the model's potentially unreliable internal knowledge.
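A minimal sketch of the idea: only return a claim when it can be matched to a retrieved source, and attach the supporting citations. The substring check is a deliberately crude stand-in for real evidence verification, and all names here are illustrative.

```python
def grounded_answer(claim, sources):
    """Return the claim with citations when some source supports it;
    otherwise decline rather than risk a hallucination. The substring
    match is a toy stand-in for real evidence checking."""
    supporting = [s for s in sources if claim.lower() in s.lower()]
    if supporting:
        return {"answer": claim, "citations": supporting}
    return {"answer": "No supporting source found.", "citations": []}

sources = ["The Eiffel Tower is 330 metres tall, per official figures."]
grounded_answer("The Eiffel Tower is 330 metres tall", sources)
```

The key design choice is the fallback branch: a grounded system prefers declining to answer over generating text it cannot tie to a source.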