Anthropic Hiring Explosives Expert to Prevent Catastrophic AI Misuse
Anthropic, the company behind Claude AI models, has posted a job listing for a chemical weapons and high-yield explosives expert. The role is designed to help the company prevent what it describes as "catastrophic misuse" of its AI systems.
According to the job description, the new expert will work closely with Anthropic's safety researchers to "tackle critical problems in preventing catastrophic misuse." This includes identifying ways Claude could potentially assist bad actors in creating dangerous substances or explosives — and then building safeguards against those scenarios.
BBC and Semafor, which both reported the story, note that this is part of a broader industry trend: major AI companies are now building internal teams with expertise drawn from the defense and security sectors.
Context: Anthropic and the Pentagon
The announcement comes after a turbulent stretch for Anthropic. In January 2026, it emerged that Claude had been used in an operation that contributed to the capture of Venezuelan President Nicolás Maduro in Caracas. Following that revelation, the relationship between Anthropic and the Pentagon began to unravel, culminating in a February 27 announcement by the Trump administration that reshaped the partnership.
Anthropic now finds itself in a difficult position: delivering powerful AI systems to governments and military clients while ensuring those same systems cannot be weaponized for mass casualty events.
The Industry's New Normal
Anthropic isn't alone in making this move. According to sources familiar with the industry, there is now a growing market for experts with backgrounds in defense, explosives, and hazardous materials — people who can help AI companies understand precisely what they need to guard against.
This signals a maturation in the AI safety field: moving beyond purely text-based safeguards toward deep domain expertise drawn from the physical world.