OpenAI Launches GPT-5.4 Mini and Nano for Faster Coding
OpenAI has released two new compact models in the GPT-5.4 family: GPT-5.4 mini and GPT-5.4 nano. Both are built for high-volume use cases where low latency matters most, including coding assistants, multimodal apps, and agentic workflows.
GPT-5.4 mini sits between the flagship GPT-5.4 and the older GPT-5 mini. OpenAI says it delivers improvements in coding, reasoning, multimodal understanding, and tool use, running at more than twice the speed of its predecessor. On benchmarks like SWE-Bench Pro and OSWorld-Verified, the mini model approaches full GPT-5.4 performance.
GPT-5.4 nano is the smallest and lowest-cost model in the series. OpenAI recommends it for classification, data extraction, ranking, and simple coding subtasks within larger agent systems.
OpenAI pointed to its Codex platform as an example of this tiered approach, where a hierarchy of models works in tandem: GPT-5.4 handles planning and final judgment, while mini and nano take on parallel subtasks like codebase search, large-file review, and document processing.
This reflects a broader trend in AI product development, where teams combine models of different sizes to balance performance and cost. The pattern gives developers more flexibility to scale AI-assisted coding without paying flagship prices for every single step.
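The tiered division of labor described above can be sketched as a simple dispatcher that assigns each subtask to the cheapest model tier that fits. This is an illustrative sketch only, not OpenAI's actual routing logic; the task categories and the model names (`gpt-5.4`, `gpt-5.4-mini`, `gpt-5.4-nano`) are assumptions based on the article.

```python
from dataclasses import dataclass

# Hypothetical model tiers, following the roles described in the article.
FLAGSHIP = "gpt-5.4"       # planning and final judgment
MINI = "gpt-5.4-mini"      # parallel subtasks: search, file review, documents
NANO = "gpt-5.4-nano"      # classification, extraction, ranking


@dataclass
class Subtask:
    kind: str     # e.g. "classify", "search", "plan"
    payload: str  # the actual prompt or input for the subtask


def route(task: Subtask) -> str:
    """Return the cheapest model tier suited to this subtask."""
    if task.kind in {"classify", "extract", "rank"}:
        return NANO
    if task.kind in {"search", "file_review", "doc_processing"}:
        return MINI
    return FLAGSHIP  # planning, synthesis, final judgment


# Example: fan a coding task out across tiers.
tasks = [
    Subtask("search", "find usages of parse_config"),
    Subtask("classify", "is this file a test?"),
    Subtask("plan", "outline the refactor"),
]
assignments = {t.payload: route(t) for t in tasks}
```

In a real agent system, each assigned model would then be called through the provider's API; the point of the pattern is that only the final planning step pays flagship prices.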