Masked Language Modeling
A pretraining objective based on masking some input tokens and predicting them from context.
Masked Language Modeling (MLM) is the central pretraining objective for encoder-based language models such as BERT. A fraction of the input tokens is hidden, and the model learns to predict the original tokens from the surrounding context. Because the model can attend to context on both sides of a masked position, MLM encourages deep bidirectional contextual understanding rather than left-to-right generation, which made it a historically influential paradigm in modern representation learning.
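The masking step can be sketched in plain Python. This is a minimal illustration, not any library's implementation; the token ids (`MASK_ID`, `VOCAB_SIZE`) are assumed placeholder values, and the 80/10/10 split follows the scheme described in the original BERT paper.

```python
import random

MASK_ID = 103       # assumed id for the [MASK] token (placeholder value)
VOCAB_SIZE = 30522  # assumed vocabulary size (placeholder value)
IGNORE = -100       # label value the loss function skips

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """Return (inputs, labels) for a masked language modeling objective.

    Roughly ``mask_prob`` of positions are selected for prediction.
    Of those, 80% are replaced with [MASK], 10% with a random token,
    and 10% are left unchanged (BERT's 80/10/10 rule). ``labels``
    holds the original token id at selected positions and IGNORE
    elsewhere, so the loss is computed only where tokens were masked.
    """
    rng = rng or random.Random()
    inputs = list(token_ids)
    labels = [IGNORE] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if rng.random() >= mask_prob:
            continue                  # position not selected
        labels[i] = tok               # model must predict the original token
        r = rng.random()
        if r < 0.8:
            inputs[i] = MASK_ID       # 80%: replace with [MASK]
        elif r < 0.9:
            inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
        # remaining 10%: keep the original token unchanged
    return inputs, labels
```

During pretraining, the corrupted `inputs` are fed to the encoder and a cross-entropy loss is computed only at positions where `labels` is not `IGNORE` (the `-100` convention is what common deep learning frameworks use to exclude positions from the loss).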
