Washington, D.C., March 1, 2012 –
During the 2008 campaign, Democratic presidential hopefuls Hillary
Clinton and Barack Obama debated the question: who was best suited to be
suddenly awakened at 3 a.m. in the White House to make a tough call in a
crisis. The candidates probably meant news of trouble in the Middle
East or a terrorist attack in the United States or in a major ally, not
an 'end of the world' phone call about a major nuclear strike on the
United States. In fact at least one such phone call occurred during the
Cold War, but it did not go to the President. It went to a national
security adviser, Zbigniew Brzezinski, who was awakened on 9 November
1979, to be told that the North American Aerospace Defense Command
(NORAD), the combined U.S.-Canada military command, was reporting a
Soviet missile attack. Just as Brzezinski was about to call
President Carter, the NORAD warning turned out to be a false alarm. It
was one of those moments in Cold War history when top officials believed
they were facing the ultimate threat. The apparent cause? The routine
testing of an overworked computer system.
Recently declassified documents about this incident and other false
warnings of Soviet missile attacks delivered to the Pentagon and
military commands by computers at NORAD in 1979 and 1980 are published
today for the first time by the National Security Archive. The erroneous
warnings, variously produced by computer tests and worn out computer
chips, led to a number of alert actions by U.S. bomber and missile
forces and the emergency airborne command post. Alarmed by reports of
the incident on 9 November 1979, the Soviet leadership lodged a
complaint with Washington about the "extreme danger" of false warnings.
While Pentagon officials were trying to prevent future incidents,
Secretary of Defense Harold Brown told President Jimmy Carter that
false warnings were virtually inevitable, although he sought to reassure
the President that "human safeguards" would prevent them from getting
out of control.
Among the disclosures in today's posting:
- Reports that the mistaken use of a nuclear exercise tape on a NORAD computer had produced a U.S. false warning and alert actions prompted Soviet Communist Party General Secretary Leonid Brezhnev to write secretly to President Carter that the erroneous alert was "fraught with a tremendous danger." Further, "I think you will agree with me that there should be no errors in such matters."
- Commenting on the November 1979 NORAD incident, senior State Department adviser Marshall Shulman wrote that "false alerts of this kind are not a rare occurrence" and that there is a "complacency about handling them that disturbs me."
- With U.S.-Soviet relations already difficult, the Brezhnev message sparked discussion inside the Carter administration on how best to reply. Hard-liners prevailed, and the draft that was approved included language ("inaccurate and unacceptable") that Marshall Shulman saw as "snotty" and "gratuitously insulting."
- Months later, in May and June 1980, three more false alerts occurred. The dates of two of them, 3 and 6 June 1980, have been in the public record for years, but the existence of a third event, cited in a memorandum from Secretary of Defense Brown to President Carter on 7 June 1980, has hitherto been unknown, although the details are classified.
- False alerts by NORAD computers on 3 and 6 June 1980 triggered routine actions by the Strategic Air Command (SAC) and the National Military Command Center (NMCC) to ensure survivability of strategic forces and command and control systems. The National Emergency Airborne Command Post (NEACP) at Andrews Air Force Base taxied into position for emergency launch, although it remained on the ground. Because missile attack warning systems showed nothing unusual, the alert actions were suspended.
- The incidents in June 1980 were attributed to the failure of a 46¢ integrated circuit ("chip") in a NORAD computer, but Secretary of Defense Brown reported to a surprised President Carter that NORAD "has been unable to get the suspected circuit to fail again under tests."
- In reports to Carter, Secretary Brown cautioned that "we must be prepared for the possibility that another, unrelated malfunction may someday generate another false alert." Nevertheless, Brown argued that "human safeguards" (people reading data produced by warning systems) ensured that there would be "no chance that any irretrievable actions would be taken."
Background
For decades, the possibility of a Soviet missile attack preoccupied
U.S. presidents and their security advisers. Because nuclear hostilities
were more likely to emerge during a political-military confrontation
(such as Cuba in 1962), the likelihood of a bolt from the blue was
remote, but Washington nevertheless planned for the worst case. Under any
circumstances, U.S. presidents and top military commanders wanted
warning systems that could provide them with the earliest possible
notice of missile launches by the Soviet Union or other adversaries. By
the early 1960s, the Pentagon had the Ballistic Missile Early Warning
System (BMEWS), which could provide about 15 minutes of warning time. By
the mid-to-late 1960s, forward-scatter systems (so-called "Over the
Horizon Radar") could detect missile launches within five to seven
minutes of launch, while the 474N system could give three to seven
minutes of warning of launches from submarines off the North American
coast. [1]
By the end of the 1960s, the United States was getting ready to
deploy the Defense Support Program (DSP) satellites, which use infrared
sensors to detect the plumes produced by missile launches. By detecting
the number and trajectory of launches, DSP could be used to tell whether
they were only tests or signified a real attack. It provided 25 to 30
minutes of warning along with information on the trajectory and ultimate
targets of the missiles. As
long as decision-makers were not confronting the danger of a SLBM
launch, the DSP would give them some time to decide how to retaliate.
In 1972, the North American Aerospace Defense Command (NORAD) began to
network warning systems into an "interlinked system" operated at its
headquarters in Cheyenne Mountain, Colorado.[2]
A complex computer-based system always bore the risk of failure,
break-downs, or errors. Even before networking, false warnings occurred:
as early as 1960, a BMEWS radar in Greenland caught "echoes
from the moon," which generated a report of a missile attack that was
quickly understood to be false (see document 1). During the Cuban
Missile Crisis, false warning episodes occurred, some of them involving
NORAD, that remained virtually unknown for many years.[3]
Whether significant incidents occurred during the years that followed
remains to be learned. But once the networked systems were in place, the
possibility that they would periodically produce false warnings became
evident.
The Events of 1979-1980
"As he recounted it to me, Brzezinski was awakened at three in the
morning by [military assistant William] Odom, who told him that some 250
Soviet missiles had been launched against the United States. Brzezinski
knew that the President's decision time to order retaliation was from
three to seven minutes …. Thus he told Odom he would stand by for a
further call to confirm Soviet launch and the intended targets before
calling the President. Brzezinski was convinced we had to hit back and
told Odom to confirm that the Strategic Air Command was launching its
planes. When Odom called back, he reported that … 2,200 missiles had
been launched—it was an all-out attack. One minute before Brzezinski
intended to call the President, Odom called a third time to say that
other warning systems were not reporting Soviet launches. Sitting alone
in the middle of the night, Brzezinski had not awakened his wife,
reckoning that everyone would be dead in half an hour. It had been a
false alarm. Someone had mistakenly put military exercise tapes into the
computer system." -- Robert M. Gates, From the Shadows: The Ultimate Insider's Story of Five Presidents and How They Won the Cold War (New York: Simon & Schuster, 1996), 114.
The series of alarming incidents and telephone calls recounted
by former NSC staffer (and later CIA director and future Secretary of
Defense) Robert Gates took place in the middle of the night on 9
November 1979. Because of the potentially grave implications of the
event, the episode quickly leaked to the media, with the Washington Post and The New York Times
printing stories on what happened. According to press reports, based on
Pentagon briefings, a NORAD staffer had caused the mistake by loading a
training/exercise tape into a computer, which simulated an "attack into
the live warning system." This was a distortion: it was not a matter of
a "wrong tape." Rather, software simulating a Soviet missile attack,
then being used to test NORAD's 427M computers, "was inexplicably
transferred into the regular warning display" at the Command's
headquarters. Indeed, NORAD's Commander-in-Chief later acknowledged that
the "precise mode of failure … could not be replicated."[4]
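The failure mode just described, test traffic bleeding into a live display, can be illustrated with a short hypothetical sketch. This is not NORAD's actual software: the message format, the mode tag, and the function names are invented for illustration only.

```python
# Hypothetical illustration: warning messages carry a mode tag, and a
# display path that ignores the tag cannot distinguish a simulated
# attack (test/exercise data) from a real one.
from dataclasses import dataclass

@dataclass
class WarningMessage:
    mode: str        # "EXERCISE" or "LIVE"
    missiles: int    # number of launches reported

def display(msg: WarningMessage, honor_mode_tag: bool) -> str:
    # A faulty transfer path that drops the mode tag treats simulated
    # test traffic exactly like live warning data.
    if honor_mode_tag and msg.mode == "EXERCISE":
        return "TEST DATA - SUPPRESSED FROM LIVE DISPLAY"
    return f"ATTACK WARNING: {msg.missiles} missiles inbound"

sim = WarningMessage(mode="EXERCISE", missiles=250)
print(display(sim, honor_mode_tag=True))   # suppressed, as intended
print(display(sim, honor_mode_tag=False))  # the 9 November 1979 failure mode
```

The point of the sketch is that the safeguard lives in the transfer path, not in the data itself: once the simulation reached the regular display, nothing in the message distinguished it from a genuine attack report.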
The information on the display simultaneously appeared on screens at
SAC headquarters and the National Military Command Center (NMCC), which
quickly led to defensive actions: NORAD alerted interceptor forces and
10 fighters were immediately launched. Moreover, the National Emergency
Airborne Command Post (NEACP), used so the president could control U.S.
forces during a nuclear war, was launched from Andrews Air Force Base,
although without the president or secretary of defense.
Some of this information did not reach the public for months, but at
least one reporter received misleading information about how high the
alert went. According to the New York Times' sources, the
warning was "deemed insufficiently urgent to warrant notifying top
Government or military officials." Apparently no one wanted to tell
reporters (and further scare the public) that the phone call went to
President Carter's national security adviser, Zbigniew Brzezinski.
The behind-the-scenes story became more complicated because the
Soviet leadership was worried enough to lodge a complaint with
Washington. Cold War tensions had already been exacerbated during
the previous year, and this incident could not help matters (nor could an
impending Kremlin decision to invade Afghanistan). On 14 November, party
leader Leonid Brezhnev sent a message via Ambassador Anatoly Dobrynin
expressing his concern about the incident, which was "fraught with a
tremendous danger." What especially concerned Brezhnev were press
reports that top U.S. leaders had not been informed at the time about
the warning. The Defense Department and Brzezinski took control of the
reply to Brezhnev's message, which senior State Department adviser
Marshall Shulman saw as "gratuitously snotty" (for example, language
about the "inaccurate and unacceptable" Soviet message). The Soviets
were indeed miffed because they later replied that the U.S. message was
not "satisfactory" because it had taken a polemical approach to Moscow's
"profound and natural concern."
About seven months later, U.S. warning systems generated three more
false alerts. One occurred on 28 May 1980; it was a minor harbinger of
false alerts on 3 and 6 June 1980. According to the Pentagon, what
caused the malfunctions in June 1980 was a failed 46¢ micro-electronic
integrated circuit ("chip") and "faulty message design." A computer at
NORAD made what amounted to "typographical errors" in the routine
messages it sent to SAC and the National Military Command Center (NMCC)
about missile launches. While the message normally reported that "000"
ICBMs or SLBMs had been launched, some of the zeroes were erroneously
filled in with a 2, e.g. 002 or 200, so the message indicated that 2,
then 200 SLBMs were on their way. Once the message arrived at SAC, the
command took survivability measures by ordering the crews of alert
bombers and tankers to their stations and to start their engines.
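The zero-filling error can be sketched in a few lines of hypothetical code. The three-digit "000" field is taken from the account above; the checksum is an invented illustration of the kind of integrity check that would catch such corruption, not the fix the Pentagon actually adopted.

```python
# Hypothetical sketch of the June 1980 failure mode: a routine status
# message reporting "000" launches has a digit corrupted by a failing
# chip, and a naive receiver takes the digits at face value. A simple
# CRC-32 checksum (an assumed safeguard, for illustration) would flag
# the corruption instead.
import zlib

def frame(count: int) -> str:
    body = f"{count:03d}"                        # e.g. "000"
    crc = format(zlib.crc32(body.encode()), "08x")
    return body + crc                            # digits + checksum

def corrupt(framed: str, pos: int, digit: str = "2") -> str:
    # Model the failed chip replacing one zero with a 2
    return framed[:pos] + digit + framed[pos + 1:]

def parse(framed: str):
    body, crc = framed[:3], framed[3:]
    ok = format(zlib.crc32(body.encode()), "08x") == crc
    return int(body), ok

msg = frame(0)                  # "000" plus checksum: no launches
count, ok = parse(msg)
assert (count, ok) == (0, True)

bad = corrupt(msg, 0)           # "200" with the stale checksum
count, ok = parse(bad)
assert count == 200 and not ok  # checksum exposes the corruption
```

Without any integrity check on the message body, the receiving commands had no way to tell a hardware-corrupted "200" from a genuine report of 200 launches; they could only cross-check against the warning sensors themselves, which is what ultimately defused the alerts.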
No NORAD interceptors were launched, so something had been learned
from the November episode, but SAC took the same precautionary measures. The
Pacific Command's airborne command post ("Blue Eagle") was launched for
reasons that remain mysterious.[5]
NEACP taxied into position at Andrews Air Force Base, but it was not
launched as in November. Because missile warning sensors (DSP, BMEWS,
etc.) showed nothing amiss, military commanders were able to call
off further action. According to a Senate report, NORAD ran its
computers for the next three days in order to isolate the cause of the
error; the "mistake was reproduced" in the mid-afternoon of 6 June with
similar results, and SAC took defensive measures.[6]
When Harold Brown explained to President Carter what had happened and
what was being done to fix the system, he cautioned that "we must be
prepared for the possibility that another, unrelated malfunction may
someday generate another false alert." This meant that "we must continue
to place our confidence in the human element of our missile attack
warning system." Brown, however, did not address a problem raised by
journalists who asked Pentagon officials, if another false alert
occurred, whether a "chain reaction" could be triggered when "duty
officers in the Soviet Union read data on the American alert coming into
their warning systems." A nameless U.S. defense official would give no
assurances that a "chain reaction" would not occur, noting that "I hope
they have as secure a system as we do, that they have the safeguards we
do."
How good the safeguards actually were remains an open question. While
Secretary of Defense Brown acknowledged the "possibility" of future
false alerts, he insisted on the importance of human safeguards in
preventing catastrophes. Stanford University professor Scott Sagan's
argument about "organizational failure" is critical of that optimism on
several counts. For example, under some circumstances false alerts could
have had more perilous outcomes: if Soviet missile tests had
occurred at the same time, or if there had been serious political tensions
with Moscow, defense officials might have been jumpier and launched
bomber aircraft, or worse. Further, false warnings were symptomatic of
"more serious problems with the way portions of the command system had
been designed." Yet, defense officials have been reluctant to
acknowledge organizational failings, instead blaming mistakes on 46¢
chips or individuals inserting the wrong tape. Treating the events of
1979 and 1980 as "normal accidents" in complex systems, Sagan observes
that defense officials are reluctant to learn from mistakes and have
persuaded themselves that the system is "foolproof."[7]
Bruce Blair also sees systemic problems. Once a
"launch-under-attack" strategic nuclear option became embedded in war
planning policy during the late 1970s, he argues, it weakened the
safeguards that had been in place, e.g., confirmation that a Soviet
nuclear attack was in progress or had already occurred. One of the
arguments for taking Minuteman ICBMs off their current high alert status
(making virtually instantaneous launch possible) has been that a false
warning, combined with an advanced state of readiness, raises the risk
of accidental nuclear war. The risk of false alerts/accidental war is
one of the considerations prompting anti-nuclear activists,
including Daniel Ellsberg, to protest at Vandenberg Air Force Base
against the Minuteman ICBM program and the continued testing of
Minutemen.[8]
The Soviet nuclear command and control system that developed during
the 1980s provides an interesting contrast with the U.S. system. While the
United States emphasized "human safeguards" as a firewall, the
"Perimeter" nuclear warning-nuclear strike system may have minimized
them. In large part, it was a response to Soviet concern that a U.S.
decapitating strike, aimed at the political leadership and central
control systems, could cripple retaliatory capabilities. Reminiscent of
the "doomsday machine" in Stanley Kubrick's Dr. Strangelove or How I Learned to Stop Worrying and Love the Bomb,
Perimeter could launch a semi-automatic nuclear strike under specified
conditions, for example, no contact with political or military leaders,
atomic bombs detonating, etc. If such conditions were fulfilled, a few
military personnel deep in an underground bunker could launch emergency
command and control rockets which in turn would transmit launch orders
to ICBMs in their silos. According to David Hoffman's Pulitzer-prize
winning The Dead Hand, when Bruce Blair learned about
Perimeter, he was "uneasy that it put launch orders in the hands of a
few, with so much automation." While the system may have been
operational as late as the early 1990s, only declassification decisions
by Russian authorities can shed light on Perimeter's evolution.[9]
According to Bruce Blair, writing in the early 1990s, warning system
failures continued after 1980, although they did not trigger alert
measures.[10]
The U.S. nuclear incidents that have received the most attention have
not been false warnings, but events such as the Air Force's accidental
movement of nuclear-tipped cruise missiles from Minot AFB to Barksdale
AFB in 2007 and the mistaken transfer of Minuteman nose-cone assemblies
to Taiwan in 2006. In any event, more needs to be learned about the
problem of false warnings during and after the Cold War and pending
declassification requests and appeals may shed further light on this
issue.