Fundamentals
2 articles
Concept
What is a token
A token is the unit of text a model processes. Tokens determine how input is split, how much fits in the context window, and often latency and API cost.
Concept · Featured
What is an LLM
An LLM is a large language model trained on massive amounts of text to predict and generate coherent language. It's the engine behind ChatGPT, Claude, Gemini, and most of the generative AI tools in use today.