Context Window
The maximum amount of text an AI model can process at one time.
Full Definition
A context window is the maximum amount of text that an AI language model can process and attend to during a single conversation or task. Think of it as the model's "working memory": everything within this window can influence the AI's responses, while anything outside of it is effectively forgotten.

The size of a context window is measured in tokens, which correspond roughly to words or word fragments. For example, GPT-3.5 has a context window of about 4,000 tokens, while newer models such as GPT-4 Turbo can handle 128,000 tokens or more. This limit constrains how much conversation history, document content, or instruction text the AI can consider when generating a response.

When the context window fills up, applications typically drop the oldest information to make room for new input, which can lead to the model "forgetting" earlier parts of a conversation. Understanding context windows is crucial for working effectively with AI tools, especially on tasks involving long documents or extended conversations.
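The truncation behavior described above can be sketched in a few lines. This is a minimal illustration, not any particular tool's implementation: the token count is approximated as about four characters per token (real systems use a proper tokenizer), and the function names are hypothetical.

```python
# Sketch of context-window truncation: drop the oldest messages
# until the estimated token count fits the budget.

def estimate_tokens(text: str) -> int:
    """Crude approximation: about 4 characters per token."""
    return max(1, len(text) // 4)

def fit_to_window(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages that fit within max_tokens."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):   # walk newest-first
        cost = estimate_tokens(msg)
        if total + cost > max_tokens:
            break                    # older messages are "forgotten"
        kept.append(msg)
        total += cost
    return list(reversed(kept))      # restore chronological order

history = ["Hi!", "Explain context windows.", "Sure: " + "details " * 40, "Thanks!"]
print(fit_to_window(history, max_tokens=60))
```

Walking the history newest-first mirrors how chat applications prioritize recent turns: the most recent messages are always kept, and older ones fall out of the window first.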
Examples
A chatbot that can only remember the last 10 messages in a conversation due to its context window limit
An AI writing assistant that can analyze a 50-page document at once because it has a large context window
A code review tool that needs to break large files into smaller chunks to fit within its context window
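The chunking strategy in the last example can be sketched as follows. This is an illustrative simplification that splits on whitespace words as a stand-in for real tokens; the function name and budget are hypothetical.

```python
# Sketch of splitting a long document into chunks that each fit
# within a token budget, using words as a rough proxy for tokens.

def chunk_document(text: str, max_tokens: int) -> list[str]:
    """Split text into consecutive chunks of at most max_tokens words."""
    words = text.split()
    return [
        " ".join(words[i:i + max_tokens])
        for i in range(0, len(words), max_tokens)
    ]

doc = " ".join(f"word{i}" for i in range(10))
print(chunk_document(doc, max_tokens=4))  # 3 chunks: 4 + 4 + 2 words
```

Production tools usually refine this by splitting on natural boundaries (functions, paragraphs, sections) and by overlapping adjacent chunks so context is not lost at the seams.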