Anthropic Makes 1 Million Token Context Window Generally Available — No Beta Header Required
Anthropic · Claude · CIO · API · LLM


Joachim Høgby
19 March 2026 · 4 min read

Anthropic has officially made the 1 million token context window generally available (GA) for Claude Opus 4.6 and Claude Sonnet 4.6 – no beta header required, and at standard pricing with no additional cost.

What Changed?

Until now, 1M context required an explicit beta header in the API call and was subject to additional limitations. Now:

  • Beta header removed – no extra configuration needed
  • Limits lifted – full 1M token access in production
  • Media cap raised to 600 – supports much larger multimodal requests
  • Standard pricing – no additional cost for extended context
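In practice, the change means a long-context request now looks like any other Messages API call. The sketch below builds the headers and payload for such a request using only the standard library; the model ID and the prompt are illustrative assumptions, not confirmed values, and the point is simply that no `anthropic-beta` header appears anywhere.

```python
import json

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key: str, long_document: str) -> tuple[dict, dict]:
    """Build headers and payload for a long-context Messages API request.

    Previously, 1M-token requests needed an extra "anthropic-beta" header;
    per the announcement, the standard headers below now suffice.
    """
    headers = {
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
        # No "anthropic-beta": "context-1m-…" entry needed anymore.
    }
    payload = {
        "model": "claude-sonnet-4-6",  # assumed model ID, for illustration only
        "max_tokens": 4096,
        "messages": [
            {"role": "user",
             "content": f"Summarize this document:\n\n{long_document}"},
        ],
    }
    return headers, payload

headers, payload = build_request("sk-ant-…", "Very long contract text …")
body = json.dumps(payload)  # what would be POSTed to API_URL
```

The same simplification applies in the official SDKs: where the extended context previously sat behind a beta flag, a plain `messages.create` call is now enough.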

What Does One Million Tokens Mean in Practice?

1 million tokens equals approximately:

  • 750,000 words – an entire codebase
  • Hundreds of long documents or meeting notes
  • A full novel plus supplementary material

This enables analyzing complete repositories, lengthy legal contracts, or entire annual reports in a single request – without splitting or losing context along the way.
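The figures above follow from a common rule of thumb for English text: roughly 4 characters, or about 0.75 words, per token. This is an approximation, not Anthropic's actual tokenizer, but it is good enough for sizing prompts:

```python
def estimate_tokens(text: str) -> int:
    """Heuristic: ~4 characters per token for English prose."""
    return len(text) // 4

def words_for_tokens(tokens: int, words_per_token: float = 0.75) -> int:
    """Invert the rule of thumb: token budget -> approximate word count."""
    return int(tokens * words_per_token)

print(words_for_tokens(1_000_000))  # -> 750000, matching the estimate above
```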

Practical Use Cases

For enterprises with large data volumes, this opens up:

  • Codebase analysis: Send entire projects to Claude for review or refactoring
  • Document search: Analyze complete regulatory documents without chunking
  • Contract review: Entire agreement packages in one prompt
  • Long-running conversations: Support agents with full history without context loss
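The codebase-analysis case can be sketched as: walk the repository, concatenate the relevant files into one prompt, and check the result against the 1M-token window before sending. The file filters and the 4-characters-per-token estimate are assumptions for illustration:

```python
import pathlib

CONTEXT_LIMIT = 1_000_000  # the 1M-token window described above

def pack_repo(root: str, suffixes=(".py", ".md", ".toml")) -> str:
    """Concatenate matching files under root into a single prompt string."""
    parts = []
    for path in sorted(pathlib.Path(root).rglob("*")):
        if path.is_file() and path.suffix in suffixes:
            parts.append(f"--- {path} ---\n{path.read_text(errors='ignore')}")
    return "\n\n".join(parts)

def fits_in_context(prompt: str, limit: int = CONTEXT_LIMIT) -> bool:
    """Rough guard using the ~4 characters-per-token heuristic."""
    return len(prompt) // 4 <= limit

if __name__ == "__main__":
    repo_prompt = pack_repo(".")
    print(f"~{len(repo_prompt) // 4} tokens; fits: {fits_in_context(repo_prompt)}")
```

If the packed prompt fits, the whole repository goes to the model in a single request, with no chunking or retrieval layer in between.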

The change makes Anthropic more competitive with Google Gemini, which has offered 1M+ context for some time.

