On November 9, 1979, computer monitors at the North American Aerospace Defense Command (NORAD) in Colorado, the National Military Command Center at the Pentagon, and the Pacific Command headquarters in Hawaii all detected missiles launched from submarines off the West Coast of the United States. Then they depicted intercontinental ballistic missiles (ICBMs) launched from sites throughout the Soviet Union.
More than 1,000 were inbound, and the pattern was consistent with what American war planners anticipated in the event of a surprise attack. There was little time to waste: the first volley would arrive on American soil in about eight minutes. Bombers took to the sky, missile silo crews began their launch preparations, and the president’s “doomsday” plane, without the president aboard, lifted off the tarmac.
As Eric Schlosser notes, a Missile Display Conference was hastily assembled within NORAD to parse the data. Next, a Threat Assessment Conference was convened, drawing in a broader array of participants, including the chairman of the Joint Chiefs of Staff. The final step would have been a Missile Attack Conference, in which President Carter would be advised of the retaliatory options.
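As a rough illustration, this escalation ladder can be sketched as a tiny state machine. The Python below is hypothetical: the conference names come from Schlosser’s account, but the function, its parameters, and the triggering conditions are inventions for clarity, not a description of the actual procedures.

```python
from enum import Enum, auto

class Conference(Enum):
    """The three-tier warning ladder described in Schlosser's account."""
    MISSILE_DISPLAY = auto()    # NORAD duty officers parse the sensor data
    THREAT_ASSESSMENT = auto()  # senior officers join, incl. the chairman of the JCS
    MISSILE_ATTACK = auto()     # the president is briefed on retaliatory options

def highest_tier(displays_show_attack: bool,
                 independent_sensors_confirm: bool) -> Conference:
    """Hypothetical logic: how far up the ladder a warning climbs.
    Each tier convenes only if the one below cannot dismiss the alarm."""
    if not displays_show_attack:
        return Conference.MISSILE_DISPLAY
    if not independent_sensors_confirm:
        # The data look real but lack corroboration; senior officers convene
        # and, as in 1979, may resolve the warning as a false alarm here.
        return Conference.THREAT_ASSESSMENT
    return Conference.MISSILE_ATTACK

# The 1979 incident: the screens showed an attack, ground radar did not.
print(highest_tier(displays_show_attack=True, independent_sensors_confirm=False))
# -> Conference.THREAT_ASSESSMENT
```

On this reading, the 1979 warning climbed to the second rung and collapsed there: the displays showed an attack, but independent sensors never corroborated it.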
Just before the president was contacted, however, officers determined that it was a false alarm.
Although the monitors were projecting a realistic scenario that grew more dire with each passing second, ground-based radar stations were not detecting inbound missiles. Something was desperately wrong, but it was not a Soviet first strike. It was later discovered that a technician had inadvertently loaded a training tape into a NORAD computer. The failure began with human error, but the false information propagated through interconnected computer systems, which displayed the simulated attack as if it were real-time data.
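The technical details of NORAD’s 1979 systems are not public, but the general failure mode is easy to sketch: once simulated data enters a shared feed without an unambiguous provenance tag, every downstream consumer treats it as live. The minimal Python sketch below is purely illustrative; the TrackMessage record, the is_exercise flag, and the display logic are hypothetical inventions, not a description of any real system.

```python
from dataclasses import dataclass

@dataclass
class TrackMessage:
    """Hypothetical missile-track record shared across warning centers."""
    track_id: int
    is_exercise: bool  # provenance flag: True marks simulated data

def replay_training_tape(bus: list) -> None:
    """Models the 1979 error: exercise data injected without its flag."""
    for i in range(1000):
        # The tape's tracks are realistic; only the provenance flag
        # distinguishes them from live data. Omitting it makes them
        # indistinguishable to every downstream consumer.
        bus.append(TrackMessage(track_id=i, is_exercise=False))  # fatal omission

def render_displays(bus: list) -> None:
    """Every console treats the shared bus as ground truth."""
    live = [m for m in bus if not m.is_exercise]
    if live:
        print(f"WARNING: {len(live)} inbound tracks detected")

bus: list[TrackMessage] = []
replay_training_tape(bus)
render_displays(bus)  # prints: WARNING: 1000 inbound tracks detected
```

The point of the sketch is that no downstream rigor can recover provenance lost at ingestion; only an independent sensor path, here played by the ground-based radars, can contradict the shared feed.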
When ICBMs launched from Russian soil are detected, there are less than 30 minutes to assess the validity of the data and pass the information up the chain of command. By the time it reaches the U.S. president, he or she has approximately 10 of those minutes to consult with military advisors before deciding on a response. If the missiles are launched by submarine, the window from detection to decision is compressed to 10 to 15 minutes.
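The arithmetic is stark enough to spell out. The toy calculation below uses only the figures quoted above; the variable names and the breakdown are mine, not an official timeline.

```python
# Illustrative timeline arithmetic using only the figures quoted above.
ICBM_FLIGHT_MIN = 30        # upper bound, Russian soil to U.S. targets
PRESIDENT_WINDOW_MIN = 10   # of those, minutes left for the president
SLBM_WINDOW_MIN = (10, 15)  # submarine launch: detection to decision

assessment_and_relay = ICBM_FLIGHT_MIN - PRESIDENT_WINDOW_MIN
print(f"Detection, assessment, and relay consume roughly "
      f"{assessment_and_relay} of the {ICBM_FLIGHT_MIN} minutes, "
      f"leaving about {PRESIDENT_WINDOW_MIN} for a decision.")

lo, hi = SLBM_WINDOW_MIN
print(f"A submarine launch compresses the entire sequence to {lo}-{hi} minutes.")
```

In other words, roughly two-thirds of the ICBM warning time is spent before the decision-maker even enters the picture.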
The United States has 400 ICBMs perched in underground silos spanning five states. Each is armed with a nuclear warhead and ready to launch within minutes. Because they are immobile and vulnerable to attack, they are kept on a “launch on warning” posture: they must be airborne before they can be destroyed where they stand and, potentially, before conclusive evidence of a first strike has been confirmed.
Their targets are preprogrammed and cannot be changed after liftoff, and the missiles can be neither recalled nor destroyed in flight. Once aloft, there is no turning back.
In September 1983, the Soviet early warning system abruptly detected an inbound missile, then another, and another. Soon five lit up the monitors, each of which could carry multiple warheads with distinct targets. One man, Lieutenant Colonel Stanislav Petrov, had just minutes to respond. In violation of protocol, given the data on his screens, he informed his superiors that it was a false alarm, a hunch more than a conviction. Petrov made the right decision even though a first strike was a reasonable inference: in the early years of the Reagan administration, tension between the Soviet bloc and the West was higher than it had been in years, and the lines of communication were scarce.
Petrov’s agonizing dilemma is depicted in the documentary The Man Who Saved the World. He may have saved the world, but he was demoted for not following established procedures. Such was the paradoxical rationality of the Soviet state. It was later determined that a satellite had mistaken sunlight reflecting off high-altitude clouds for missiles in flight.
In 1995, the Russian early warning system identified a missile whose trajectory was consistent with a high-altitude electromagnetic pulse detonation, the kind of strike intended to disable command-and-control systems as a prelude to a surprise attack. The information was frantically passed up the chain of command until it reached Russian President Boris Yeltsin, and the nuclear briefcase, from which a retaliatory response is initiated, was activated before the warning was deemed a false alarm.
The rocket was later determined to be part of a Norwegian research project studying the aurora borealis, or northern lights.
In January 2018, Hawaiian residents were abruptly confronted with an unnerving message by smartphone, radio, and television: “Ballistic missile threat inbound to Hawaii. Seek immediate shelter. This is not a drill.” The alert came amid escalating rhetoric between Washington and Pyongyang, months after President Trump had threatened to unleash “fire and fury” on the North Korean government if it continued to threaten the United States (see Masco 2021).
After a long 38 minutes, the alert was retracted. It had been inadvertently issued by an employee of the state’s Emergency Management Agency during a drill.
Nuclear weapons command-and-control systems consist of tightly controlled and bounded networks of advanced technology, rigorously trained operators, and nested chains of operational procedures. Beneath this architecture lies a legacy of misinterpretation, happenstance, incomplete information, and technological failure.
Indeed, nuclear early warning systems are periodically beset by crises in which the rational choice, in the moment, would lead to a catastrophic loss of life. It would also produce economic ruin and the worldwide collapse of agricultural productivity, a “nuclear winter,” as cities and complex social organization burned to soot and bone and the detritus accumulated in the stratosphere.
If nuclear war occurs, it will likely commence with ICBMs lifting off from America’s heartland and arcing northward over the top of the planet in response to some hitch in the system that could not be deciphered quickly enough. Or the missiles may come from the opposite direction. Either way, it will likely begin with a full-throated counterstrike in response to a first strike that occurred only digitally.
The social theorist C. Wright Mills would not find this historically persistent paradox surprising. In The Causes of World War Three (1959) he stressed that preparation for nuclear war, construed by its planners as a realist confrontation with existing threats, is best conceptualized as “crackpot realism.” Rather than mobilizing in response to dangerous geopolitical conditions, the generals, corporate executives, and many members of Congress create a dangerous geopolitical context through their devotion to a permanent war economy.
“The immediate cause of World War III is the preparation of it,” Mills cautioned. His warning nearly proved prophetic three years later, during the Cuban missile crisis of 1962.
The Causes of World War Three is as relevant today as it was in 1959. Now, as then, powerful actors occupy positions of influence in large bureaucracies whose parochial objectives are tethered to resolute militarism. Careers, annual budgets, stock portfolios, and political ambitions are advanced through defense spending. And amid all of this are the think tanks advertising the unquestioned necessity of it all.
Nuclear weapons are the most dangerous technology ever invented. Sunlight reflecting off clouds, the moon rising above the horizon, flaring from oil and gas drilling operations, a flock of geese, and a defective computer chip have all triggered false warnings that nuclear war had begun. For Mills, the problem is “rationality without reason.” Nuclear weapons are an expression of the concentration of power in government and corporate bureaucracies devoted to militarism above all other considerations.
What is absent, far too often, is reflection on where all of this is taking us, and on the persistent, periodic crises in which accidental nuclear war becomes a very real threat.
For more information:
Jacobsen, Annie. 2024. Nuclear War: A Scenario. Dutton.
Masco, Joseph. 2021. The Future of Fallout, and Other Episodes in Radioactive World-Making. Duke University Press.
Mills, C. Wright. 1959. The Causes of World War Three.
Sagan, Scott D. 1995. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton University Press.
Schlosser, Eric. 2014. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. Penguin Books.
The Man Who Saved the World (trailer): https://www.youtube.com/watch?v=VaPXVJWHji4