Machine Learning

Cross-Encoder

A model that takes two texts as a single, joint input and outputs a relevance or similarity score. Unlike bi-encoders, which encode each text into a vector independently, cross-encoders attend to the full interaction between both texts.

Why It Matters

Cross-encoders produce much more accurate similarity scores than bi-encoders, but every text pair requires its own full forward pass, so scores cannot be precomputed and the model is far slower at scale. This makes them ideal for reranking, where accuracy on a small candidate set matters most.

Example

A cross-encoder takes the query 'best restaurants in NYC' and a passage about dining in New York together as one input, and outputs a relevance score of 0.94.
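The example above can be sketched in code. A real cross-encoder is a fine-tuned transformer (e.g. a BERT-style model) scoring each pair jointly; here a simple word-overlap function stands in for the model so the sketch stays runnable, and the function and variable names are illustrative only.

```python
def cross_encoder_score(query: str, passage: str) -> float:
    """Score a (query, passage) pair jointly.

    Toy stand-in for a real cross-encoder: fraction of query words
    that also appear in the passage. A real model would run both
    texts through one transformer forward pass.
    """
    q_words = set(query.lower().split())
    p_words = set(passage.lower().split())
    if not q_words:
        return 0.0
    return len(q_words & p_words) / len(q_words)

def rerank(query: str, passages: list[str]) -> list[str]:
    """Rerank a small candidate set, highest score first.

    This is the typical cross-encoder use case: a fast retriever
    (e.g. a bi-encoder) selects candidates, then the cross-encoder
    scores each (query, passage) pair individually.
    """
    scored = [(cross_encoder_score(query, p), p) for p in passages]
    return [p for score, p in sorted(scored, key=lambda x: x[0], reverse=True)]

candidates = [
    "A guide to hiking trails near NYC",
    "Best restaurants in NYC for fine dining",
    "Museum hours in Boston",
]
print(rerank("best restaurants in NYC", candidates)[0])
# → Best restaurants in NYC for fine dining
```

Note that the cost grows linearly with the number of candidates, since each pair needs its own scoring call; this is why cross-encoders are applied only after a cheaper retrieval stage has narrowed the set.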

Think of it like...

Like a teacher who reads both the question and the student's answer together before scoring, versus one who scores the answer without seeing the question.

Related Terms