Definition

AI Tokens

The chunks of text that AI models process. Roughly 3/4 of a word each. Tokens determine how much you can input and how much responses cost.

Full Definition

In AI, tokens are the basic units that language models use to process text. A token is roughly 3/4 of a word: "hello" is 1 token, "unbelievable" is 3 tokens, and a space or punctuation mark can be its own token. When you interact with an AI like ChatGPT or Claude, your message is broken into tokens, processed, and the response is generated token by token.

Tokens matter for two practical reasons: they determine how much text you can send to the AI in one message (the context window), and they determine how much API usage costs. GPT-4o charges about $2.50 per million input tokens and $10 per million output tokens. Claude Sonnet charges about $3 per million input tokens and $15 per million output tokens.

For most people using AI through a subscription (ChatGPT Plus, Claude Pro), tokens are invisible: you just type and get responses. Tokens become relevant when you hit context limits (the AI "forgetting" earlier parts of long conversations), when you use AI APIs to build applications, or when you're trying to understand pricing for business use.
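The per-million-token pricing above translates directly into a cost formula. A minimal sketch, using the GPT-4o rates quoted in this article (the `api_cost_usd` helper name is hypothetical, and real prices change over time):

```python
def api_cost_usd(input_tokens: int, output_tokens: int,
                 in_rate: float, out_rate: float) -> float:
    """Dollar cost of one API call, given per-million-token rates."""
    return (input_tokens / 1_000_000) * in_rate + (output_tokens / 1_000_000) * out_rate

# GPT-4o rates from above: $2.50 per million input, $10 per million output.
# A call with a 10,000-token prompt and a 2,000-token response:
cost = api_cost_usd(10_000, 2_000, in_rate=2.50, out_rate=10.00)
print(f"${cost:.3f}")  # $0.045
```

Note that output tokens cost several times more than input tokens, which is why long responses dominate the bill for many applications.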

Examples

The sentence "The quick brown fox jumps over the lazy dog" is approximately 10 tokens.

A typical 500-word blog post uses about 650-700 tokens, well within any model's context window.

Claude's 200K-token context window can hold roughly 150,000 words, about 2-3 full novels.
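The examples above all follow the same rule of thumb: one token is roughly 3/4 of a word, so token count ≈ word count × 4/3. A minimal sketch of that estimate (the `estimate_tokens` helper is hypothetical; real tokenizers such as OpenAI's tiktoken will give somewhat different counts):

```python
def estimate_tokens(text: str) -> int:
    # Rule of thumb: 1 token is roughly 3/4 of a word,
    # so tokens ≈ words * 4/3. Actual tokenizer output varies.
    return round(len(text.split()) * 4 / 3)

# A 500-word text estimates to ~667 tokens, matching the
# "about 650-700 tokens" blog-post example above.
print(estimate_tokens("word " * 500))  # 667
```

This estimate is good enough for budgeting context-window space; for exact API billing you would count tokens with the provider's own tokenizer.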

Where You'll See This

ai-tools, coding, productivity

Frequently Asked Questions