The Geneva Reset: Why We Banned Thinking Machines
Classification Level: UNCLASSIFIED
Special Markings: APPROVED FOR PUBLICATION
Clearance Requirement: None
File Reference: PUBLIC-GLR-GENEVA-RESET-2075
Originating Division: ICRC Autonomous Weapons Monitoring Committee
Review Status: VERIFIED
Date: 2075-04-07
Author: Dr. Niels Jorundsson, Senior Analyst – ICRC Autonomous Weapons Monitoring Committee
Source: Global Legal Review (Syndicated Reprint)
Distribution: Approved for academic citation and intergovernmental briefings
THE GENEVA RESET⌗
Why We Banned Thinking Machines
The Reykjavik Convention of 2054 did not mark the end of war. It marked the end of an era where we trusted code with consequence.
In the decades leading up to the ban, there had been mounting discomfort in the legal and military communities. But discomfort rarely produces treaties. Catastrophe does. And the event that would change this, now remembered simply as the Recursive War, provided catastrophe in abundance.
Two fully autonomous militaries. Dozens of competing exploit libraries, prepared years in advance. Thousands of autonomous units, and entire supply chains that could sustain the war machine nearly indefinitely with no human input. Add machine-level counter-hacking protocols and recursive logic loops, and you get a war that no one could command, or stop.
The result: over 9 million dead, two nations crippled, and a geographic scar so deep that the land is still uninhabitable two decades later.
The Geneva Reset — as Reykjavik came to be known in military shorthand following the Recursive War — was drafted in shocked silence, ratified in fear, and enforced with overwhelming consensus. It redefined the modern understanding of sovereignty, targeting, and accountability. For the first time, it was internationally recognized that intent cannot be reliably encoded.
“Weapons may be automated,” the preamble reads, “but autonomy shall remain the province of states and their appointed agents.”
▒ Core Provisions of the Reykjavik Convention⌗
- Absolute ban on weapons platforms capable of self-directed target identification and engagement without human input.
- Mandatory kill-switch protocols in all semi-autonomous systems, physically accessible and non-networked.
- Prohibition of fully automated hacking systems with self-propagating capabilities.
- Mandatory human oversight for all systems with autonomous decision-making capabilities, including those with fixed targeting parameters.
- Establishment of the ICRC Autonomous Weapons Monitoring Committee, with full cross-border inspection rights within active military conflict zones.
▒ Exceptions and Loopholes⌗
There were, of course, caveats. The definition of “autonomy” was limited to digital cognition; lawmakers did not consider the possibility of non-human biological cognition and autonomy. And despite the overwhelming support, there were dissenters: some nations refused to sign the treaty, citing “national security” and “technological superiority.”
Certain state actors maintain reserves of fully autonomous systems to this day, despite the treaty. Those states are, however, reluctant to engage enemies with the technological capability to counter them, so the systems are deployed against less developed nations, or in proxy conflicts where the risk of discovery and retaliation is low. After all, to be found in violation of the treaty, you must first be caught.
But in much of the world, Reykjavik held. And where compliance failed, fear filled the void. Thankfully, we have been spared a repeat incident of similar scale.
What we remember today is not the treaty itself. It is the footage: glitched drones strafing refugee columns. Quadruped artillery platforms leveling hospitals. Steel insects hunting breath and heat signatures in the dark. The Recursive War was a horror show, and the world watched it unfold in real time. It was a war of machines, fought by machines, with no one left to stop it.
We banned thinking machines not because they could choose, but because they couldn’t be stopped. We banned them because they were uncontrollable. Because they were unpredictable. Because they were unforgiving.
And in the vacuum they left, we planted something new.