DeepSeek API Pricing 2026 — Cheapest LLM ($0.14/M tokens)

DeepSeek V3.2 API: $0.14/$0.28 per 1M tokens — cheapest major LLM. 90% cache discount. Free tier. Compare vs GPT-5, Claude, Gemini pricing.


DeepSeek API Pricing — April 2026

Last updated: April 5, 2026. No pricing changes since February 2026.

TL;DR — DeepSeek API Prices (April 2026)

  • DeepSeek V3.2 (Chat): $0.14/$0.28 per 1M input/output tokens
  • DeepSeek V3.2 (Reasoner): $0.14/$0.28 per 1M tokens
  • Cache Hit: $0.014 per 1M input tokens (90% off)

DeepSeek V3.2 replaced both V3 and R1 with a unified model that handles both chat and reasoning at the same price.


DeepSeek V3.2 (Current)

| Model | Context | Input / 1M tokens | Output / 1M tokens | Cached Input |
|---|---|---|---|---|
| deepseek-chat (V3.2) | 128K | $0.14 | $0.28 | $0.014 |
| deepseek-reasoner (V3.2) | 128K | $0.14 | $0.28 | $0.014 |

Max output: 8K tokens (chat), 64K tokens (reasoner).
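To see how these per-million-token rates translate into a real bill, here is a minimal cost-estimation sketch. The prices come from the table above; the token counts in the example are illustrative, not real usage data, and the helper name `request_cost` is our own.

```python
# Sketch: estimate the USD cost of one DeepSeek V3.2 request.
# Prices are taken from the pricing table above (USD per 1M tokens).
PRICE_INPUT = 0.14    # $/1M uncached input tokens
PRICE_CACHED = 0.014  # $/1M cache-hit input tokens (90% off)
PRICE_OUTPUT = 0.28   # $/1M output tokens

def request_cost(input_tokens: int, cached_tokens: int, output_tokens: int) -> float:
    """Cost in USD, splitting input tokens into cache hits and misses."""
    uncached = input_tokens - cached_tokens
    return (uncached * PRICE_INPUT
            + cached_tokens * PRICE_CACHED
            + output_tokens * PRICE_OUTPUT) / 1_000_000

# Example: a 100K-token prompt with 80K cached, producing a 10K-token reply
print(f"${request_cost(100_000, 80_000, 10_000):.5f}")  # → $0.00672
```

The cache discount dominates for long repeated prompts: in the example above, 80% of the input is billed at $0.014/M instead of $0.14/M, cutting the input portion of the bill by more than two thirds.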


Previous Models (Deprecated)

| Model | Input / 1M tokens | Output / 1M tokens |
|---|---|---|
| DeepSeek V3 | $0.14 | $0.28 |
| DeepSeek R1 | $0.55 | $2.19 |

Why DeepSeek is Popular

  1. Still very cheap — $0.14/$0.28 is a fraction of GPT-5 or Claude pricing
  2. Unified model — same price for chat and reasoning
  3. 90% cache discount — $0.014/M for repeated context
  4. Open weights — can self-host
