What tokens actually are
Tokens are the chunks of text, code, punctuation, numbers, or symbols that a model processes. A short, common word is often a single token, while a longer word, a code fragment, or structured text can break into several tokens.
That is why token counts are not the same as word counts or character counts. Prompt estimators use heuristics or provider-specific tokenizers to approximate how a model will split your input.
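To make the difference concrete, here is a toy estimator, not a real tokenizer, that mimics two common behaviors: punctuation tends to become its own token, and long words tend to break into subword pieces (roughly four characters per token is a widely cited rule of thumb). The function name and the splitting rule are illustrative assumptions, not any provider's actual algorithm.

```python
import re

def estimate_tokens(text: str) -> int:
    """Rough heuristic (NOT a real tokenizer): count word and number
    chunks plus individual punctuation marks, splitting long words
    into ~4-character subword pieces."""
    count = 0
    for piece in re.findall(r"[A-Za-z]+|\d+|[^\w\s]", text):
        if piece.isalpha():
            # Long words often break into several subword tokens;
            # ceil(len / 4) approximates that behavior.
            count += max(1, -(-len(piece) // 4))
        else:
            # Numbers and punctuation counted as one piece each.
            count += 1
    return count

print(estimate_tokens("cat"))                  # → 1 (short word, one token)
print(estimate_tokens("internationalization")) # → 5 (one word, several tokens)
print(estimate_tokens("Hello, world!"))        # → 6 (2 words but 6 estimated tokens)
```

Note that "Hello, world!" is two words and thirteen characters, yet the estimate is six tokens: this is exactly why word and character counts are unreliable proxies. Real tokenizers (such as byte-pair-encoding implementations shipped by model providers) learn their splits from data, so actual counts will differ from this sketch.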