
Contextual Embeddings

A modern embedding approach in which the same word receives different vectors in different contexts.

Contextual embeddings directly encode the idea that word meaning is not fixed but depends on context: the same surface form receives a different vector depending on how it is used in a sentence, so "bank" in "river bank" and "bank account" maps to distinct representations. ELMo introduced this approach using bidirectional LSTMs, and BERT and later Transformer-based models made it standard. It marks NLP's transition from static word representations such as word2vec and GloVe, which assign a single vector to each word type, to much richer context-sensitive semantic modeling.
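
To make this concrete, here is a minimal sketch using the Hugging Face transformers library. The model name bert-base-uncased, the example sentences, and the embed_word helper are illustrative choices, not part of this entry. It extracts the vector BERT assigns to "bank" in three sentences and checks that the two financial uses sit closer to each other than either does to the river use.

```python
# Minimal sketch: the same word gets different vectors in different contexts.
# Assumes the `transformers` and `torch` packages are installed.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector the model assigns to `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_dim)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index(word)  # assumes `word` survives as a single subword token
    return hidden[idx]

river = embed_word("she sat on the bank of the river", "bank")
money = embed_word("she deposited cash at the bank", "bank")
loan = embed_word("the bank approved her loan", "bank")

cos = torch.nn.functional.cosine_similarity
# The two financial uses of "bank" should be more similar to each other
# than either is to the river use.
print(cos(money, loan, dim=0).item())   # higher similarity
print(cos(river, money, dim=0).item())  # lower similarity
```

A static embedding such as word2vec would return the identical vector for "bank" in all three calls; the difference in the two printed similarities is precisely what the contextual approach adds.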