
Context Window

The maximum number of tokens that a Transformer model can process in a single pass.

The context window is a critical property that defines the practical limits of Transformer-based models, especially large language models. The model can attend only to tokens within this window and must base its decisions on that visible context. In tasks such as long-document processing, multi-step reasoning, and code analysis, context size directly affects performance. It has become one of the major axes of competition in modern model development.
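The effect described above can be sketched in a few lines: once a prompt exceeds the window, a causal language model only "sees" the most recent tokens. This is a minimal illustration, assuming whitespace splitting as a stand-in for a real subword tokenizer and an arbitrarily small example window size; real models use learned tokenizers and windows of thousands to millions of tokens.

```python
# Minimal sketch of context-window truncation.
# Assumptions for illustration: whitespace tokenization stands in for a
# real subword tokenizer, and CONTEXT_WINDOW is an arbitrary example size.
CONTEXT_WINDOW = 8

def fit_to_window(tokens, window=CONTEXT_WINDOW):
    """Keep only the most recent `window` tokens, as a causal LM would see."""
    return tokens[-window:] if len(tokens) > window else tokens

prompt = "the quick brown fox jumps over the lazy dog near the river bank"
tokens = prompt.split()            # stand-in tokenizer: 13 tokens
visible = fit_to_window(tokens)    # only the last 8 remain visible
print(len(tokens), len(visible))   # 13 8
```

Anything truncated away (here, "the quick brown fox jumps") is simply unavailable to the model, which is why long-document tasks degrade when inputs exceed the window.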