You may have noticed that Gamma's subscription plans include specific AI token counts and wondered: what exactly is a token?
Gamma uses OpenAI's definition of a token:
Tokens can be thought of as pieces of words. Before the API processes the request, the input is broken down into tokens. These tokens are not cut up exactly where the words start or end - tokens can include trailing spaces and even sub-words.
How words are split into tokens is also language-dependent. For example, ‘Cómo estás’ (‘How are you’ in Spanish) contains 5 tokens (for 10 chars). The higher token-to-char ratio can make it more expensive to implement the API for languages other than English.
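If you want to see this splitting for yourself, OpenAI's open-source tiktoken library exposes its tokenizers programmatically. The sketch below is a minimal illustration, not Gamma's own counting logic: the encoding name ("cl100k_base") is one common choice, and exact splits and counts vary between encodings, so the 5-token figure quoted above (which comes from an older tokenizer) may not be reproduced exactly.

```python
import tiktoken

# One widely used OpenAI encoding; other encodings split text differently.
enc = tiktoken.get_encoding("cl100k_base")

for text in ["How are you", "Cómo estás"]:
    token_ids = enc.encode(text)
    # Decode each token id back to its text piece to see where the cuts fall.
    # Some pieces may split a multi-byte character, hence errors="replace".
    pieces = [
        enc.decode_single_token_bytes(t).decode("utf-8", errors="replace")
        for t in token_ids
    ]
    print(f"{text!r}: {len(token_ids)} tokens -> {pieces}")
```

Running this prints each phrase's token count alongside the individual pieces, which makes the trailing spaces and sub-word fragments described above easy to spot.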
Gamma's Plus subscription plan allows up to 10k AI tokens per generation, and Gamma's Pro subscription plan allows up to 25k AI tokens per generation.
OpenAI offers a free tokenizer that shows how a piece of text is split into tokens by a language model, along with the total token count: https://platform.openai.com/tokenizer
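If you'd rather estimate counts in code than paste text into the web tool, the same tiktoken library can report totals. This is only a rough sketch: it assumes a GPT-style encoding approximates how tokens are counted against the 10k (Plus) and 25k (Pro) per-generation limits above, so treat the result as an estimate rather than an exact measure.

```python
import tiktoken

def estimate_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Rough token count for a piece of text using an OpenAI encoding."""
    enc = tiktoken.get_encoding(encoding_name)
    return len(enc.encode(text))

prompt = "Create a 10-slide deck about renewable energy trends."
count = estimate_tokens(prompt)
print(f"~{count} tokens")
# Compare against a per-generation budget (e.g. 10_000 for Plus, 25_000 for Pro).
print("Fits Plus limit:", count <= 10_000)
```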