Technical Glossary: Generative AI and LLM
Pretraining
The initial training stage in which a model learns broad patterns from large-scale general data.
Pretraining is the basis of the foundation-model paradigm: during this stage the model acquires broad knowledge of language, visual structure, or multimodal patterns. Much of the adaptability needed for downstream tasks is gained here, and the diversity, volume, and quality of the pretraining data directly shape the model's later capabilities.
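The key idea can be illustrated with a toy sketch: pretraining is self-supervised, so the "labels" (the next token) come from the data itself rather than from human annotation. The bigram model below is a deliberately simplified stand-in for the neural networks and web-scale corpora used in real foundation-model pretraining; the function name and corpus are illustrative.

```python
from collections import Counter, defaultdict

def pretrain_bigram(corpus):
    # Self-supervised objective: predict the next token from the current one.
    # The targets come from the raw text itself; no labeling is needed.
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.split()
        for cur, nxt in zip(tokens, tokens[1:]):
            counts[cur][nxt] += 1
    # Normalize raw counts into next-token probability distributions.
    return {
        cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for cur, nxts in counts.items()
    }

corpus = [
    "the model learns patterns",
    "the model learns language",
]
model = pretrain_bigram(corpus)
# After "pretraining", the word "model" is always followed by "learns".
print(max(model["model"], key=model["model"].get))  # → learns
```

The same principle, scaled up to billions of parameters and trillions of tokens, is what lets a pretrained model generalize to downstream tasks.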