AI outsmarts PhD-level virologists, now raising biohazard fears.
In a groundbreaking new study, the Center for AI Safety and its collaborators subjected today's top AI models to a test of 322 intricate virology problems. The lineup ranged from specialized bioinformatics tools such as DeepVariant and ViraMiner to generative systems like OpenAI's o3 and Google's Gemini 2.5 Pro. These weren't generic search-engine queries but nuanced, practical lab challenges, the kinds of problems that routinely stump even experienced professionals. The results were shocking: human virologists scored just 22.1%, while the o3 model hit 43.8% and Gemini 2.5 Pro came in at 37.6%. In other words, on tasks demanding deep technical knowledge and hands-on lab acumen, AI now outperforms the human experts.
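To make those headline numbers concrete, here is a minimal sketch of how a benchmark like this is typically scored: each model answers every question, a grader marks each answer correct or incorrect, and the score is the fraction marked correct. The tallies below are illustrative reconstructions chosen only to land near the reported percentages; they are not the study's actual data or grading code.

```python
def accuracy(graded: list[bool]) -> float:
    """Fraction of answers a grader marked correct."""
    return sum(graded) / len(graded)

# Hypothetical per-question grades over the 322 problems (True = correct),
# constructed solely so the fractions approximate the reported scores.
results = {
    "o3":             [True] * 141 + [False] * (322 - 141),  # 141/322 ~ 43.8%
    "gemini-2.5-pro": [True] * 121 + [False] * (322 - 121),  # 121/322 ~ 37.6%
    "human experts":  [True] * 71  + [False] * (322 - 71),   #  71/322 ~ 22%
}

for name, graded in results.items():
    print(f"{name}: {accuracy(graded):.1%}")
```

Note that 22.1% does not divide evenly into 322, which suggests the human baseline was averaged differently than a single pass over all questions; the closest whole-number tally is shown above.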
But with this leap in capability comes a wave of fear, one that extends far beyond the academic world. Biosecurity experts are sounding the alarm: these AI systems aren't just smart, they're dangerously helpful. As such models become more accessible, the protective barrier that once stood between dangerous bioweapon knowledge and the average person is rapidly disintegrating.

Historically, crafting or weaponizing a virus was a task reserved for those with years of scientific training and access to high-level resources. Now that gatekeeping is at risk. AI can serve as a step-by-step guide, filling in the gaps where human ignorance once acted as a natural safeguard. It's not just that AI has made biotechnology easier to understand; it has made it actionable for anyone, even those with ill intent.
As the implications sink in, calls for control are growing louder. Many experts insist that only verified professionals, such as researchers affiliated with institutions like MIT, should have access to unrestricted AI capabilities. They argue for third-party audits, for mandatory safety evaluations before deployment, and, perhaps most critically, for government regulation to replace the voluntary self-policing that currently governs the industry.
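What might such gating look like in practice? The sketch below shows one possible shape of it: unrestricted capabilities are served only to identity-verified researchers, and only for a model that has cleared a pre-deployment evaluation. Every name, threshold, and check here is hypothetical, invented for illustration; none of it reflects any lab's actual deployment pipeline.

```python
# All names and values below are hypothetical, for illustration only.

VERIFIED_RESEARCHERS = {"alice@institution.edu"}  # stand-in for an institutional registry
SAFETY_EVAL_PASSED = True                         # stand-in for a third-party audit verdict
MAX_FAILURE_RATE = 0.01                           # hypothetical red-team failure-rate ceiling

def can_serve_unrestricted(user_email: str, red_team_failure_rate: float) -> bool:
    """Serve unrestricted capabilities only to verified users of an audited model."""
    if not SAFETY_EVAL_PASSED:
        return False  # the model never cleared its pre-deployment evaluation
    if red_team_failure_rate > MAX_FAILURE_RATE:
        return False  # the model still produces hazardous outputs too often
    return user_email in VERIFIED_RESEARCHERS  # identity check against the registry

print(can_serve_unrestricted("alice@institution.edu", 0.005))   # True
print(can_serve_unrestricted("anonymous@example.com", 0.005))   # False
```

Whatever the exact mechanism, the point the experts are making is structural: the check happens before the capability reaches the user, not after harm is done.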
Here's the chilling truth: AI has effectively made decades of complex biotech expertise Googleable. That's a marvel, until you realize the same intelligence can walk someone through building a bioweapon. The machines haven't just gotten smarter; they've gotten too helpful. And that, more than anything, demands urgent, global attention.