Grounding
The practice of connecting AI model outputs to verifiable sources of information, ensuring responses are based on factual data rather than the model's potentially unreliable internal knowledge.
Why It Matters
Grounding is essential for enterprise AI — it transforms LLMs from creative writing tools into reliable information systems that cite their sources.
Example
An AI assistant that retrieves specific documents from the company knowledge base and cites page numbers and sections when answering employee questions.
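The example above can be sketched in code. This is a minimal, hypothetical layout for grounded answers: each retrieved chunk carries the source metadata (document, page, section) needed to cite it. The field names and citation format are illustrative assumptions, not any specific product's schema.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """One retrieved passage plus the metadata needed to cite it."""
    text: str
    source: str
    page: int
    section: str

def cite(chunk: Chunk) -> str:
    # Format a citation in the style an assistant might append to an answer.
    return f"({chunk.source}, p. {chunk.page}, '{chunk.section}')"

# Hypothetical chunk retrieved from a company knowledge base.
chunk = Chunk(
    text="Employees accrue 20 vacation days per year.",
    source="HR Handbook",
    page=12,
    section="Leave Policy",
)
answer = f"Employees get 20 vacation days per year {cite(chunk)}"
```

Keeping the citation metadata attached to each chunk, rather than reconstructing it afterwards, is what lets the assistant point back to exact pages and sections.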
Think of it like...
Like an academic paper that backs every claim with citations — it is not enough to be right; you also need to show where the information comes from.
Related Terms
Retrieval-Augmented Generation
A technique that enhances LLM outputs by first retrieving relevant information from external knowledge sources and then using that information as context for generation. RAG combines the power of search with the fluency of language models.
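The retrieve-then-generate flow can be sketched as below. The document store, the naive keyword-overlap scoring, and the prompt template are all illustrative assumptions; a real system would use embedding-based retrieval and an actual LLM call in place of the final prompt string.

```python
def retrieve(query: str, documents: dict[str, str], top_k: int = 2) -> list[tuple[str, str]]:
    """Rank documents by naive keyword overlap with the query (a stand-in
    for real vector search) and return the top matches."""
    query_terms = set(query.lower().split())
    scored = []
    for doc_id, text in documents.items():
        overlap = len(query_terms & set(text.lower().split()))
        scored.append((overlap, doc_id, text))
    scored.sort(reverse=True)
    return [(doc_id, text) for score, doc_id, text in scored[:top_k] if score > 0]

def build_grounded_prompt(query: str, passages: list[tuple[str, str]]) -> str:
    """Assemble a prompt that asks the model to answer only from the
    retrieved passages and cite them by id."""
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in passages)
    return (
        "Answer using ONLY the sources below and cite them by id.\n"
        f"Sources:\n{context}\n\n"
        f"Question: {query}"
    )

# Hypothetical two-document knowledge base.
docs = {
    "hr-policy-p12": "Employees accrue 20 vacation days per year.",
    "it-guide-p3": "Reset your password via the self-service portal.",
}
question = "How many vacation days do employees get?"
passages = retrieve(question, docs)
prompt = build_grounded_prompt(question, passages)  # would be sent to the LLM
```

The key point is the separation of steps: search first, then hand the model only what was found, so its answer is constrained to the retrieved context rather than its internal knowledge.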
Hallucination
When an AI model generates information that sounds plausible and confident but is factually incorrect, fabricated, or not grounded in its training data or the provided context. The model essentially "makes things up".
Knowledge Base
A structured or semi-structured collection of information used by AI systems to retrieve factual data. In the context of RAG, it typically refers to the document collection that the system can search.