1. Text becomes tokens
An AI model does not read whole sentences at once.
Think of tokens as the small text chunks the model uses to learn patterns. It studies which chunks usually come before or after other chunks.
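The simplest version of "studying which chunks come after other chunks" is just counting pairs. Here is a minimal sketch using a tiny made-up corpus (real models train on billions of tokens and learn far richer patterns than raw counts):

```python
from collections import Counter

# Hypothetical mini-corpus; real training data is vastly larger.
corpus = "the cat sat on the mat the cat ran".split()

# Count which token follows which -- a toy version of
# "learning what usually comes next".
pairs = Counter(zip(corpus, corpus[1:]))

# After "the", which token appeared most often?
after_the = {b: n for (a, b), n in pairs.items() if a == "the"}
print(after_the)  # {'cat': 2, 'mat': 1}
```

From these counts, the most likely chunk after "the" is "cat". A real model replaces counting with a neural network, but the goal is the same: predict the next chunk.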
Token example
Rewrite politely: "Your report was bad." becomes chunks like {Rewrite} {politely} {:} {Your} {report} {was} {bad} {.}
A token can be a full word, part of a word, or punctuation. Models learn by predicting tokens one at a time.
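To make the chunking concrete, here is a toy splitter that separates words from punctuation. This is only an illustration: real model tokenizers use learned subword schemes (such as byte-pair encoding), so their chunk boundaries will differ.

```python
import re

def toy_tokenize(text):
    # Grab runs of word characters, or single punctuation marks.
    # A real tokenizer would also split long or rare words into pieces.
    return re.findall(r"\w+|[^\w\s]", text)

print(toy_tokenize('Rewrite politely: "Your report was bad."'))
# ['Rewrite', 'politely', ':', '"', 'Your', 'report', 'was', 'bad', '.', '"']
```

Notice that the colon, quotes, and period each become their own chunk, just as punctuation can be its own token in a real system.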