DigiCalcs

How to Calculate Token Cost

What is Token Cost?

The Token Cost Calculator converts text into tokens and calculates the exact API cost for any LLM provider. It helps developers understand the relationship between text length, token count, and cost across different models and pricing tiers.

Formula

Cost = (Token Count / 1,000,000) × Price per Million Tokens
W — Word Count (words): number of words in the text
T — Token Count (tokens): number of tokens after tokenization
P — Price per Million ($/1M tokens): model-specific per-million-token rate
C — Cost ($): total cost for the given token count
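The formula above can be sketched as a small Python helper (the function name is illustrative, not part of any provider SDK):

```python
def token_cost(token_count: int, price_per_million: float) -> float:
    """Cost in dollars for token_count tokens at a per-million-token rate."""
    return (token_count / 1_000_000) * price_per_million

# 1,333 tokens at $2.50 per million tokens
print(round(token_cost(1333, 2.50), 4))  # → 0.0033
```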

Step-by-Step Guide

  1. Paste your text or enter a word/character count to estimate tokens
  2. Select the LLM provider and model for pricing
  3. Specify whether the text is input (prompt) or output (completion)
  4. View the token count, cost for this text, and cost per word

Worked Examples

Input: 1,000 words of English text as input to GPT-4o
Result: Approximately 1,333 tokens. Cost: 1,333 / 1,000,000 × $2.50 = $0.0033 (one-third of a cent).

Input: 10,000 words of output from Claude 3.5 Sonnet
Result: Approximately 13,333 tokens. Cost: 13,333 / 1,000,000 × $15.00 = $0.20 (twenty cents).
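Both examples can be reproduced end-to-end with a short sketch. The `cost_for_words` helper is illustrative; the ~0.75 words-per-token ratio is the English rule of thumb used throughout this page, and the prices are the ones quoted in the examples (check your provider's current pricing):

```python
def cost_for_words(words: int, price_per_million: float) -> float:
    """Estimate cost in dollars from an English word count."""
    tokens = round(words / 0.75)  # ~0.75 English words per token
    return (tokens / 1_000_000) * price_per_million

print(round(cost_for_words(1_000, 2.50), 4))    # GPT-4o input → 0.0033
print(round(cost_for_words(10_000, 15.00), 2))  # Claude 3.5 Sonnet output → 0.2
```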

Common Mistakes to Avoid

  • Assuming 1 word = 1 token — in English, 1 token averages about 0.75 words or 4 characters
  • Forgetting that code, non-English text, and special characters often tokenize into more tokens per word
  • Not separating input and output costs when budgeting — they have very different rates
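The last bullet, budgeting input and output separately, can be sketched like this (the rates in the example are placeholders in a GPT-4o-style shape; substitute your model's actual pricing):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Total cost for one API call, with separate input/output rates ($/1M tokens)."""
    return ((input_tokens / 1_000_000) * input_price
            + (output_tokens / 1_000_000) * output_price)

# Illustrative: $2.50/1M input, $10.00/1M output
print(round(request_cost(2_000, 500, 2.50, 10.00), 4))  # → 0.01
```

Note that the smaller output (500 tokens) costs as much here as the 4x-larger input, which is exactly why the two must be budgeted separately.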

Frequently Asked Questions

How many tokens is a typical ChatGPT conversation?

A typical conversation turn is 50-200 tokens for the user message and 100-500 tokens for the response. A 10-turn conversation accumulates approximately 2,000-5,000 total tokens. System prompts add 100-2,000 additional tokens per call depending on complexity.
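The accumulation described above can be sketched as follows; the per-turn defaults are midpoints of the ranges quoted and purely illustrative:

```python
def conversation_tokens(turns: int, user_per_turn: int = 125,
                        reply_per_turn: int = 300, system_prompt: int = 500) -> int:
    """Rough total tokens generated over a multi-turn conversation.

    Note: chat APIs resend the full history (including the system prompt)
    on every call, so *billed* input tokens grow with each turn; this
    counts only the tokens produced, as the ranges above do.
    """
    return turns * (user_per_turn + reply_per_turn) + system_prompt

print(conversation_tokens(10))  # → 4750
```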

Do different languages use different numbers of tokens?

Yes, significantly. English tokenizes most efficiently (~1.3 tokens per word). Chinese, Japanese, and Korean use 2-3 tokens per character. Arabic, Hindi, and other scripts may use 3-5 tokens per word. As a result, non-English API calls can be 2-4x more expensive for the same meaning.
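The rough per-word ratios quoted above can be collected into a small lookup (the multipliers are the approximations from this answer, not tokenizer-exact figures; a real estimate should run the actual tokenizer):

```python
# Approximate tokens per word, from the ranges quoted above (illustrative)
TOKENS_PER_WORD = {
    "english": 1.3,
    "arabic": 4.0,  # midpoint of the 3-5 tokens/word range
    "hindi": 4.0,
}

def estimate_tokens_by_language(words: int, language: str) -> int:
    """Rough token estimate for a word count in the given language."""
    return round(words * TOKENS_PER_WORD[language])

print(estimate_tokens_by_language(1000, "arabic"))  # → 4000
```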

Ready to calculate? Try the free Token Cost Calculator

Try it yourself →
