Counting Tokens
Model
The name of the Model to use for counting tokens.

Prompt
The content of the current conversation with the model.
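The request fields above can be sketched as a JSON body, assuming the Gemini REST API's usual shape where the Model is named in the request URL and the Prompt is sent as `contents`; the model name and prompt text below are placeholders.

```python
# Minimal sketch of a countTokens request body (field names assumed from
# the Gemini REST API; the prompt text is a placeholder).
request_body = {
    "contents": [  # Prompt: the content of the current conversation
        {"role": "user", "parts": [{"text": "How many tokens is this?"}]}
    ]
}

# The Model is typically named in the request URL rather than the body, e.g.
#   .../v1beta/models/<model-name>:countTokens
print(request_body["contents"][0]["parts"][0]["text"])
```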
totalTokens
The number of tokens that the Model tokenizes the Prompt into. Always non-negative.

cachedContentTokenCount
The number of tokens in the cached part of the prompt (the cached content).
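Reading the response fields above can be sketched as follows; the field names (`totalTokens`, `cachedContentTokenCount`) come from this reference, but the example response and its numeric values are illustrative only.

```python
import json

# Illustrative countTokens response; field names are from the reference
# above, the values are made up for demonstration.
example_response = json.loads("""
{
  "totalTokens": 31,
  "cachedContentTokenCount": 10
}
""")

def read_token_counts(response: dict) -> tuple:
    """Extract the two token counts from a countTokens response.

    cachedContentTokenCount only applies when part of the prompt was
    served from cached content, so it is assumed to default to 0 here.
    """
    total = response["totalTokens"]  # always present, non-negative
    cached = response.get("cachedContentTokenCount", 0)
    return total, cached

total, cached = read_token_counts(example_response)
print(total, cached)  # → 31 10
```

Defaulting the cached count to 0 keeps the helper usable for prompts that involve no cached content at all.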