Stuxnet

Risk & Uncertainty in the First Salvo of Global Cyber Warfare

By Andrew Leedom

Abstract

This case study seeks to understand the risks and uncertainties that governments and other actors confront when they deploy offensive cyber weapons as part of military and intelligence operations. Under current circumstances, the implications for national and international infrastructure seem staggering. The roadmap for this case study is threefold: 1) provide a framework for analysis, 2) introduce Stuxnet and survey the history and development of the first documented cyber weapon, and 3) apply the framework in an attempt to explore the risks and uncertainties inherent in the nascent arena of cyber warfare.

State of the Art

The interconnected and complex systems that lie at the heart of modern energy delivery, financial markets, and information storage are shockingly vulnerable to penetration and manipulation by hostile parties. The United States government, specifically the National Security Agency (NSA) and its allegedly associated cyber development teams such as the shadowy consortium of infiltration experts dubbed the “Equation Group” by Kaspersky Labs,1 played with precisely this kind of fire when it designed and deployed the worm that became known to the world as Stuxnet as part of the clandestine project codenamed Operation Olympic Games.2 The saga and consequences of Stuxnet offer the opportunity to see clearly how advanced cyber weaponry disseminates through interconnected information systems and into the far reaches of cyberspace beyond the control of its architects, and how it eventually ends up in the hands of unintended parties. When the malware was – as was to be expected – isolated and its source code made publicly available, security firms like Symantec, Kaspersky, and F-Secure quickly began to see offspring from Stuxnet. Malware such as Duqu 2.0, Flame, and Gauss, containing similar or identical security exploits, showed up on their radar, leading them to conclude that third parties were dissecting the code from the world’s most advanced known cyber weapon and utilizing its capabilities for their own purposes.3 These risks are very real and quite serious as governments rapidly careen toward conflicts fought across fiber optic cables rather than battlefields.

Cyber Warfare’s Unintended Consequences

Perhaps the most famous quote from Donald Rumsfeld’s long and storied career comes from his time as secretary of defense under President George W. Bush. At a Pentagon press conference, he answered a reporter’s question with the following linguistic triumph:

As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns— the ones we don’t know we don’t know.4

Secretary Rumsfeld went on to point out that it is the latter category, that of the “unknown unknowns,” that poses the greatest problem for assessing the consequences of and fallout from one’s actions.5 Despite its façade of incomprehensibility, his comment is actually an excellent expression of the difference between risk and uncertainty as distinguished by Frank H. Knight in his seminal text, Risk, Uncertainty, and Profit. The Knightian distinction between risk – which he used to describe those instances in which the outcome of a given set of circumstances is not known, but may be accurately quantified – and uncertainty – where outcomes are unknown and the information needed to make any kind of quantifiable prediction is not attainable – provides a helpful framework for understanding the difficulties associated with carrying out cyber warfare attacks. 

As with the financial markets that Professor Knight examined, military strategic decision-making is replete with risk and uncertainty with regard to outcomes and consequences. This has always been true, but the nature of cyber weapons makes these considerations all the more pressing. Even with cutting-edge conventional weaponry (improved or more effective ordnance, GPS-guided missiles, stealth technologies, nuclear weapons, etc.), there is the concern that it will fall into the wrong hands for one reason or another. The nuclear age brought with it much hand-wringing about the possibility of nuclear designs being stolen by enemy governments or non-state actors. Such worries, however, are often overblown. The per-unit destructiveness an enemy can achieve by acquiring advanced assault rifle technology, or even by reverse engineering and building better stealth technology, is rather low, and, in the case of stealth technologies, the cost of production is prohibitively expensive for non-state actors and for many governments as well. Additionally, even the great fear of the nuclear age – the proliferation of nuclear weapons designs – is tempered by the fact that, even given the proper information for constructing a warhead, the process is extremely complex and requires tremendous investment and years to implement. Moreover, in the circumstances in which advanced ordnance and warhead delivery systems are utilized against an enemy, the weapon itself is obliterated upon reaching its target, leaving no trace of the weapons systems for hostile actors to use or reverse engineer.

Few of these mitigating factors for conventional arms hold true in the case of cyber weapons. When a payload is delivered to a target via a worm or in the form of a virus, the weapon itself must remain intact in order to be effective against its intended recipient. This is a unique feature of cyber weapons and it has an interesting, if also terrifying, risk associated with it. The very use of malware against an enemy actually gives the enemy access to all of the advanced components that make such an attack successful in the first place. Essentially, the aggressor is delivering the means for an effective retaliatory attack directly into the hands of its enemy. It is in this context and through this lens that the risks associated with cyber warfare must be evaluated. The risks are high, the payouts are not always certain or obvious, and the nature and structure of the venue makes retaliation an uncertainty.

Stuxnet: Origins & Design

At some point in June 2010, strange things were happening at a nuclear enrichment facility located in Natanz, Iran. International Atomic Energy Agency (IAEA) inspectors reviewing footage from surveillance cameras installed at the facility in accordance with UN nuclear enrichment monitoring requirements noticed that the Iranian technicians at the plant had been replacing an exceptional number of centrifuges used to enrich uranium. Failure of these delicate machines is not uncommon over a period of years, and many at the plant had been previously used when installed or were somewhat older than ideal. Nonetheless, the centrifuges were apparently failing at an unexpectedly high rate. Still more alarming to the Iranians, the failure rate was increasing with time.6 It would come to light over the ensuing months that the cause was anything but commonplace. The centrifuges were not physically faulty, nor were the Iranian technicians and nuclear scientists incompetent. Rather, the Natanz nuclear facility had become the first targeted casualty of the most sophisticated offensive cyber weapon the world had seen to date.7 The malware causing the Iranians so much trouble at Natanz would become infamous around the globe under the moniker Stuxnet.8

There was something remarkable about the fact that malware of any kind was present on control systems at the Natanz nuclear facility: the Natanz plant was an air-gapped site.9 In theory, air-gapped systems are not vulnerable to worms and other malware that propagate through the interconnections between computers via the Internet. Without having been connected at any time to the open web, there should have been no opportunity for malicious code or hackers to gain access to the system and infect it. So where did Stuxnet come from and how, given the extensive security protocols that separated Natanz from the outside world, was it moving through the system? The answers to these questions came to light only over time and in a piecemeal fashion. 

According to experts in cyber warfare, the Stuxnet virus is one of the most intricate pieces of malware known to exist.10 Although no group or government has officially claimed responsibility for the design and deployment of Stuxnet, it is widely assumed, based on extensive investigations by reporters, statements from current and former government officials (such as Richard A. Clarke) and in-depth analyses of the source code carried out by security experts around the world, that it was the work of teams at NSA and their counterparts in Israel.11,12 The sophistication of the malware in and of itself suggests that the group behind Stuxnet must have had state backing.13 Regardless of the worm’s origin, it contained a staggering number of zero-day exploits and intricate, high-level code features.14 

While the technical details of the Stuxnet worm are beyond the scope of this case study, a general explanation of its unique design and capabilities is necessary in order to understand how its designers approached the risks and uncertainties associated with cyber weaponry of this kind. Stuxnet is, at its most basic level, a kind of malware that is known as a “worm.”15 Several features of Stuxnet’s architecture, however, set the malware package dramatically apart from even its most formidable predecessors (e.g., Conficker). First, Stuxnet clearly possessed the capability of circumventing one of spy tradecraft’s most effective barriers to entry in digital systems – air gapping. Second, it was a truly massive piece of coding by malware standards, weighing in at 500 KB.16,17 Third, Stuxnet was capable of utilizing a kernel-level rootkit exploit that makes it possible for the malware to avoid detection by even some of the most sophisticated anti-virus scanners.18 Fourth, the package contained a set of four zero-day exploits.19,20 

It would also eventually be determined that the group who deployed Stuxnet had breached the air-gap defense by infecting the computers of five well-known contractors working in Iran on the construction and operation of the Natanz nuclear facility. The worm was designed to hide on the computers of individual employees at these contractors’ offices and then to jump onto any USB thumb drive inserted into an infected terminal. The designers bet that one of those USB drives would eventually end up in a computer within the Natanz facility, at which point the worm would migrate to the internal Natanz system. To that end, yet another unique aspect of Stuxnet was its use of a standard Windows feature (the .LNK file) to initiate infection on a new computer. The brilliance of this exploit lies in the fact that .LNK files in the Windows OS are responsible for rendering the icons for files contained on a USB drive or any other portable media drive inserted into a terminal. The traditional method hackers use for transmission from portable media is to initiate migration via the Windows Autorun command, which automatically opens files present on newly introduced portable media drives; however, Autorun is a commonly known vulnerability, and even the most mediocre of security administrators disable it as a matter of common practice when securing proprietary networks. The .LNK method used by Stuxnet is far subtler and significantly more difficult to defend against.21
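The air-gap bet described above can be pictured with a small simulation. The machine names, the single initial foothold, and the two-rule infection model below are invented for illustration; this is a sketch of the idea, not Stuxnet's actual logic:

```python
# Invented-for-illustration model of bridging an air gap via removable
# media: the worm copies itself to any USB drive plugged into an
# infected machine, and hops off the drive onto any new machine it
# visits. Machine names and the single foothold are assumptions.

def simulate(usb_journey, foothold="contractor_pc"):
    """Return the set of machines infected after the drive's journey."""
    infected = {foothold}
    drive_carries_worm = False
    for machine in usb_journey:
        if machine in infected:
            drive_carries_worm = True   # worm rides along on the drive
        elif drive_carries_worm:
            infected.add(machine)       # worm migrates onto the new host
    return infected

# The designers' bet: the drive eventually reaches the isolated network.
print(sorted(simulate(["contractor_pc", "home_laptop", "natanz_terminal"])))
```

The point of the sketch is that the attacker never touches the isolated network directly; ordinary human movement of the drive does the bridging.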

Any time a user inserts a USB drive and opens it on a desktop display, the Windows OS uses the .LNK protocol to render icons for all the files the drive contains. At the initiation of the .LNK rendering process on a terminal at Natanz, Stuxnet moved onto the computer’s hard drive and began searching for its targets, while its kernel-level rootkit (disguised as a common driver) buried the intrusive code deep within the operating system, making it nearly undetectable by standard-issue security software and virus scanners.22

Incredibly, the descriptions in the preceding paragraph are only the more noteworthy capabilities that Stuxnet possesses, barely scratching the surface of what the worm can do and how it can enter and control specific computers. Once inside, Stuxnet performs a series of linked inquiries summarized by Richard Clarke as follows: 

What does this incredible Stuxnet thing do? As soon as it gets into the network and wakes up, it verifies it’s in the right network by saying, “Am I in a network that’s running a SCADA [Supervisory Control and Data Acquisition] software control system?” “Yes.” Second question: “Is it running Siemens [the German manufacturer of the Iranian plant controls]?” “Yes.” Third question: “Is it running Siemens 7 [a genre of software control package]?” “Yes.” Fourth question: “Is this software contacting an electrical motor made by one of two companies?”23 

If the answer to the last inquiry is “Yes,” said Clarke, then the worm knows that it can be in only one place – Natanz.24 If, at any point in the survey, Stuxnet received a negative answer to any of its queries, it would take no action; it would simply replicate and move within the infected system, limiting itself to three additional infections to reduce its footprint, while still probing whether it was in a system at Natanz and had merely been inserted into one of the terminals that did not control centrifuges.25 If, however, Stuxnet received affirmative answers to all four items on its checklist, it would deliver the payload it was built to convey.
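Clarke's chain of questions amounts to a guarded checklist followed by a capped replication rule. A minimal sketch, in which the host-attribute dictionary, the motor-vendor placeholder names, and the use of "Step 7" for the quote's "Siemens 7" are all illustrative assumptions:

```python
# Illustrative sketch -- NOT actual Stuxnet code -- of the four-step
# target check and the self-limiting spread described above.

MAX_NEW_INFECTIONS = 3  # reported cap on onward infections per host

def is_target(host: dict) -> bool:
    """True only if every link in the checklist holds."""
    return (
        host.get("runs_scada", False)                  # 1) SCADA system?
        and host.get("vendor") == "Siemens"            # 2) Siemens controls?
        and host.get("control_package") == "Step 7"    # 3) Step 7 software?
        and host.get("motor_vendor") in {"vendor_a", "vendor_b"}  # 4) placeholders
    )

def act(host: dict) -> str:
    """Deliver the payload only on a full match; otherwise lie low and spread."""
    if is_target(host):
        return "deliver payload"
    return f"replicate to at most {MAX_NEW_INFECTIONS} hosts and keep checking"

print(act({"runs_scada": True, "vendor": "Siemens",
           "control_package": "Step 7", "motor_vendor": "vendor_a"}))
print(act({"runs_scada": True, "vendor": "Siemens",
           "control_package": "WinCC"}))
```

Any single failed check short-circuits the ladder, which is what kept the worm dormant on the vast majority of machines it infected.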

Centrifuges spin uranium hexafluoride gas at extremely high speeds, driving the heavier uranium-238 toward the rotor wall and leaving a product enriched in the lighter, fissile uranium-235. When enriched sufficiently, the product can be used in a nuclear weapon.26,27 Stuxnet’s payload instructions were created to stymie the Iranians’ progress toward weapons-grade enrichment.28 The centrifuges involved in enrichment are easily damaged if the speed at which they are spinning is unexpectedly changed. Any volatility in the rotational frequency can cause catastrophic damage to the machinery over a relatively short period of time, requiring it to be scrapped and replaced. Taking a man-in-the-middle position,29 Stuxnet set to work destroying the Natanz centrifuges without ever making its presence known.30 The Iranians had to replace nearly 1,000 damaged or destroyed centrifuges over the time that Stuxnet was implanted in the facility’s control systems, setting back Iranian enrichment efforts months, if not years.31
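The man-in-the-middle deception can be illustrated with a toy loop in which the operators only ever see the nominal reading while the actual drive frequency swings to damaging extremes. The frequency values follow the published account of the attack profile; everything else in the sketch is invented for illustration:

```python
# Toy model of the man-in-the-middle loop: the monitoring side is always
# fed the recorded nominal frequency, while the drives are actually
# commanded to damaging extremes. Frequencies follow the published
# account of the attack profile; the code itself is illustrative only.

NOMINAL_HZ = 1064  # normal centrifuge operating frequency

def sabotage_cycle():
    """Yield (actual_hz, reported_hz) pairs for one attack cycle."""
    # overspeed, recover, near-stall, recover -- monitors never see it
    for actual_hz in (1410, NOMINAL_HZ, 2, NOMINAL_HZ):
        yield actual_hz, NOMINAL_HZ

for actual_hz, reported_hz in sabotage_cycle():
    print(f"actual={actual_hz:>4} Hz  reported={reported_hz} Hz")
```

Because the reported value never deviates from 1,064 Hz, neither the operators nor the automated safety systems had any reason to intervene while the hardware was being destroyed.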

So, in the end, that was it. The architects of Stuxnet – NSA and Israel’s cyber warfare groups – had accomplished their goals of demonstrating American and Israeli dominance in offensive cyber capabilities while simultaneously setting back the Iranian nuclear program, seemingly with no body count, no collateral damage, and no negative repercussions. Not quite.

Stuxnet Unbound: The Progeny

From a purely tactical standpoint, Stuxnet was a resounding success. The targeted facility had been hit with precision, and collateral damage from the attack on Iran’s nuclear program was nil. No body count. No external or unrelated systems were hit by the worm. However, within months of the worm’s release into the contractors’ computer systems in Iran and its growing notoriety around the world, a disturbing trend began to flash across the screens of the same experts who had found and isolated Stuxnet in the first place. Variants of the Stuxnet code were showing up in private sector computer infrastructure all over the world and on other governments’ systems.

An interesting aspect of the Stuxnet source code was that it included an end date. That is to say, the program was written to self-arrest its propagation and end continuing infections on June 24, 2012.32 While this is an intriguing safeguard installed by its designers, likely as a hedge against unnecessary distribution into non-targeted systems, the shortfall is that the safety measure may not actually end infection outside of controlled conditions.33 The consequence of this is that any previously affected computers may still harbor Stuxnet source code and it may continue to spread. This means that any person with the technical skills to find and extract the Stuxnet source code on an infected terminal can then utilize the infection architecture to deliver a payload of their own choosing, or they can simply post the source code to publicly accessible web sites for use by just about anyone interested in doing so.
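The June 24, 2012 cutoff amounts to a simple date guard on propagation, which is why it stops new infections but does nothing about copies already resident on infected machines. A minimal sketch, assuming a bare date comparison:

```python
# Minimal sketch of Stuxnet's propagation cutoff: a bare date guard
# stops NEW infections after June 24, 2012, but does nothing to remove
# copies already resident on infected machines -- which is why the
# source code remained recoverable after the kill date.
from datetime import date

KILL_DATE = date(2012, 6, 24)  # reported self-arrest date

def may_propagate(today: date) -> bool:
    """Spread only before the hard-coded kill date."""
    return today < KILL_DATE

print(may_propagate(date(2010, 7, 1)))   # during the campaign
print(may_propagate(date(2013, 1, 1)))   # after the cutoff
```

The design choice matters for the argument above: a guard on spreading is not a guard on persistence, so the code stayed available for extraction long after it stopped propagating.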

As discussed in the previous section, Stuxnet included a multitude of nefariously clever capabilities that allowed it to invade and subvert control systems while remaining undetected. Stuxnet can now be used by hostile state and non-state actors as a cheap, pre-fabricated, and modifiable means for delivering devastating attacks against governments, financial systems, businesses, and critical national infrastructure anywhere in the world. 

In September 2011, Hungarian cyber security researchers found themselves with a concerning program on their hands. Dubbed Duqu after the “~DQ” prefix of the files it created, the virus was designed to steal information from industrial control systems.34 Analysis of Duqu by several firms found that it bore a number of striking similarities to the Stuxnet source code and used several of the same techniques and exploits.35 Duqu was, at the time, one of the more intricate infiltration systems documented, which is not terribly surprising since its authors had used the Stuxnet platform as a source of Duqu’s building blocks.

In May 2012, another major blip appeared on security firms’ maps. A truly massive modular platform, essentially comprising a hacker’s dream toolkit, showed up on the scene. The package was called Flame and, at 20 MB, it was 40 times the size of both Stuxnet and Duqu.36 Flame included several signature exploits that were first observed in the Stuxnet architecture including the .LNK infection tool and a print spooler exploit that had been one of the zero-day manipulations that originally appeared in Stuxnet.37 Researchers who mapped the infection locations and distributions were not able to determine what kinds of information Flame was intended to access or target, but the malware has been found in government computer systems, private industrial control systems, and even in critical infrastructure controls.38 The variety of infected systems reinforces the notion that Flame is designed as a multifaceted collection of tools that can be mixed and matched by preference, depending on the function that a programmer wishes to achieve. 

Most recent in the family tree of Stuxnet offshoots is Duqu 2.0, which was found to have infiltrated – almost unbelievably – the internal network at Kaspersky Labs.39,40 Duqu 2.0 is based on the architecture that underlies both Stuxnet and the original Duqu. The attack was likely intended as a demonstration of advanced capabilities by the Duqu group.41 It was a show of force achieved by stealthily gaining access to the networks of one of the world’s most respected and resourceful cyber security firms.42

While the Duqu branch of the Stuxnet family is widely accepted as having been manufactured by a non-state group of hackers who used Stuxnet code in their malware, many experts believe that Flame likely had the backing of a national government.43,44 These are precisely the risks and uncertainties that are so inherently intertwined within the field of cyber warfare. Once Stuxnet was released into the wild (sent to the computers at the Iranian contractors’ offices), the developers at NSA immediately lost control of the worm’s destiny. All that NSA could know for sure was that the source code would inevitably end up in the hands of hostile third parties at some point in the future, but they could not know who that party would be or how they might use Stuxnet (the known unknown). Clearly the likes of Flame, Duqu, Gauss, and Duqu 2.0 would not have come to fruition in the form that they did without the guidance and helping hand of the source code that they found in Stuxnet.

Stuxnet has also given the world a model for how to successfully complete a cyber attack that combines both virtual infiltration and physical damage. It has focused attention on the shockingly unprotected critical infrastructure in nations such as the United States, where even a slight disruption to the power grid could bring the economy to a screeching halt and deliver a devastating blow to essential services that would take weeks or months to repair. As then-Secretary of Defense Leon Panetta observed in a speech to the Council on Foreign Relations in October 2012: 

The most destructive scenarios involve cyber actors launching several attacks on our critical infrastructure at one time, in combination with a physical attack on our country. Attackers could also seek to disable or degrade critical military systems and communication networks. 

The collective result of these kinds of attacks could be a cyber Pearl Harbor; an attack that would cause physical destruction and the loss of life. In fact, it would paralyze and shock the nation and create a new, profound sense of vulnerability.45 

The American power grid system, in particular, is regarded as poorly protected, exceedingly interconnected, and difficult to repair once damaged, making it a prime target for cyber attacks.46 Stuxnet and its successors have shown a path for those who mean to harm the United States and its allies. The design and modular components of these malware platforms are free to anyone who has the desire to find them. They can be modified, or used as-is in many cases, to carry massively destructive payloads of malicious code to the poorly insulated critical systems in the United States, potentially causing power grid failures, oil and gas pipeline explosions, systemic financial crises, and even direct contamination of major water supply lines.

Just as Stuxnet in itself posed risks for the United States when it was deployed because of the possibility that it would be appropriated, it is also important to look at the uncertainty that comes with launching the first documented salvo in international cyber warfare. The NSA may have been able to quantify the risk that came with others getting ahold of malicious code they had themselves designed and were willing to run with that risk, but it is an entirely different matter when dealing with the uncertainty that comes with initiating the development of a new and untested form of conflict. The United States government must deal with the uncertainty that the decision to unleash Stuxnet has brought. It is difficult to understand what, if any, significance a cyber attack carries under international laws. Has the United States engaged in an attack that could be construed as an act of war by using Stuxnet in Iran? This is not totally obvious and the consequences are uncertain. The doors of the future of cyber war have been flung open by the United States and the next steps are precarious. 

Conclusion

This case study has used Stuxnet as a window into the risks and uncertainties of cyber warfare in the 21st century. The first and most advanced nation-sponsored cyber weapon ever produced provides a perfect example of what the United States and other actors at the forefront of the technological arms race are and will be confronted with in the future. When Stuxnet found its way to Natanz, it left its trail through the open wilderness of the public web. There it was studied and disassembled by experts in the employ of American enemies and non-state hostiles alike. The results have been a series of increasingly evolved platforms for attack on both government and private systems, and the potential for much more severe attacks on critical systems nationwide and abroad. The newest field of battle across the globe is one characterized by new risk and uncertainty that must be confronted head-on when considering whether to design and deploy newer and better weaponry. Stuxnet has shown the success that comes from preparation and planning but, more poignantly, it has given an example of the risks that nations run when they open a new front in global conflict.

Bibliography

Ashford, Warwick. Risk of crippling cyber war yet to be addressed, says former US official. November 9, 2015. http://www.computerweekly.com/news/4500257027/Risk-of-crippling-cyber-war-yet-to-be-addressed-says-former-US-official (accessed December 4, 2015).

Bookstaber, Richard. A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation. Hoboken, New Jersey: John Wiley & Sons, Inc., 2007.

Cisco Systems, Inc. What Is the Difference: Viruses, Worms, Trojans, and Bots? http://www.cisco.com/web/about/security/intelligence/virus-worm-diffs.html#5 (accessed December 6, 2015).

Clarke, Richard A., and Robert K. Knake. Cyber War: The Next Threat to National Security and What To Do About It. New York: HarperCollins, 2010. 

Coll, Steve. The Rewards (and Risks) of Cyber War. June 6, 2012. http://www.newyorker.com/news/daily-comment/the-rewards-and-risks-of-cyber-war (accessed December 4, 2015).

Constantin, Lucian. Researchers Identify Stuxnet-like Cyberespionage Malware Called ‘Flame’. May 28, 2012. http://www.pcworld.com/article/256370/researchers_identify_stuxnetlike_cyberespionage_malware_called_flame.html (accessed December 6, 2015).

Dizikes, Peter. Explained: Knightian uncertainty. The economic crisis has revived an old philosophical idea about risk and uncertainty. But what is it, exactly? June 2, 2010. http://news.mit.edu/2010/explained-knightian-0602 (accessed December 6, 2015).

Falliere, Nicholas. Stuxnet Introduces the First Known Rootkit for Industrial Control Systems. August 6, 2010. http://www.symantec.com/connect/blogs/stuxnet-introduces-first-known-rootkit-industrial-control-systems (accessed December 6, 2015).

Falliere, Nicholas, Liam O’Murchu, and Eric Chien. W32.Stuxnet Dossier. Malware Profile, Symantec Security, Mountain View: Symantec Security Response, 2011. 

Good Harbor Consulting, LLC. Confronting Cyber Risk in Critical Infrastructure: The National and Economic Benefits of Security Development Processes. Commissioned Report, Washington, DC: Good Harbor Consulting, LLC, 2014. 

Graham, David A. Rumsfeld’s Knowns and Unknowns: The Intellectual History of a Quip. March 27, 2014. http://www.theatlantic.com/politics/archive/2014/03/rumsfelds-knowns-and-unknowns-the-intellectual-history-of-a-quip/359719/ (accessed December 6, 2015).

GReAT (Kaspersky Labs). Gauss: Nation-State Cyber-Surveillance Meets Banking Trojan. August 9, 2012. https://securelist.com/blog/incidents/33854/gauss-nation-state-cyber-surveillance-meets-banking-trojan-54/ (accessed December 5, 2015).

GReAT (Kaspersky Labs). The Mystery of Duqu 2.0: a sophisticated cyberespionage actor returns: New zero-day used for effective kernel memory injection and stealth. June 10, 2015. https://securelist.com/blog/research/70504/the-mystery-of-duqu-2-0-a-sophisticated-cyberespionage-actor-returns/ (accessed December 4, 2015).

Kaspersky Labs. Equation Group: The Crown Creator of Cyber Espionage. February 16, 2015. http://www.kaspersky.com/about/news/virus/2015/equation-group-the-crown-creator-of-cyber-espionage (accessed December 5, 2015).

Knight, Frank H. Risk, Uncertainty, and Profit. New York: Houghton Mifflin Co., 1921. 

Kushner, David. “The Real Story of Stuxnet.” IEEE Spectrum. February 26, 2013. http://spectrum.ieee.org/telecom/security/the-real-story-of-stuxnet (accessed December 4, 2015).

Panetta, Leon. “Secretary Panetta’s Speech About Cybersecurity.” Primary Sources. New York: Council on Foreign Relations, October 11, 2012.

Rosenbaum, Ron. Richard Clarke on Who Was Behind the Stuxnet Attack. April 2012. http://www.smithsonianmag.com/history/richard-clarke-on-who-was-behind-the-stuxnet-attack-160630516/?no-ist=&=&c=y&page=5 (accessed December 6, 2015).

Rumsfeld, Donald. “Department of Defense News Briefing.” Washington, DC: Department of Defense, February 12, 2002. 

Sanger, David E. Confront and Conceal: Obama’s Secret Wars and Surprising Use of American Power. New York: Crown Publishing, 2012. 

Symantec Security Response. Duqu 2.0: Reemergence of an aggressive cyberespionage threat. June 10, 2015. http://www.symantec.com/connect/blogs/duqu-20-reemergence-aggressive-cyberespionage-threat (accessed December 6, 2015).

Verizon Communications Inc. Data Breach Investigation Report for 2015. Annual Data Security Analysis, New York: Verizon Enterprise Solutions, 2015. 

Zetter, Kim. An Unprecedented Look at Stuxnet, The World’s First Digital Weapon. November 3, 2014. http://www.wired.com/2014/11/countdown-to-zero-day-stuxnet/ (accessed December 2, 2015).

Zetter, Kim. Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon. New York: Broadway Books, 2014. 

Zetter, Kim. How Digital Detectives Deciphered Stuxnet, the Most Menacing Malware in History. July 11, 2011. http://arstechnica.com/tech-policy/2011/07/how-digital-detectives-deciphered-stuxnet-the-most-menacing-malware-in-history/ (accessed December 6, 2015).

Notes

1 Kaspersky Labs has been tracking the work of a specific set of malware designers whose identities have not been confirmed, but whose work has shown up in infamous malware packages such as Stuxnet, Flame, and Gauss, amongst others. See Equation Group: The Crown Creator of Cyber Espionage, available at http://www.kaspersky.com/about/news/virus/2015/equation-group-the-crown-creator-of-cyber-espionage.

2 See Sanger 2012. 

3 (GReAT (Kaspersky Labs) 2012) 

4 (Rumsfeld 2002) 

5 (Rumsfeld 2002) 

6 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

7 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014). 

8 The malware was “dubbed Stuxnet by Microsoft from a combination of file names (.stub and MrxNet.sys) found in the code.” (Zetter, How Digital Detectives Deciphered Stuxnet, the Most Menacing Malware in History 2011). 

9 An air-gapped computer or facility is one that has never been connected to the open internet at any point during its existence. 

10 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

11 The Israeli equivalent of the NSA and the US Cyber Command is a military specialist group named Unit 8200. 

12 (Sanger 2012) 

13 (Zetter, How Digital Detectives Deciphered Stuxnet, the Most Menacing Malware in History 2011) 

14 (Falliere, O’Murchu and Chien, W32.Stuxnet Dossier 2011) 

15 “[W]orms are similar to viruses in that they replicate functional copies of themselves and can cause the same type of damage. In contrast to viruses, which require the spreading of an infected host file, worms are standalone software and do not require a host program or human help to propagate. To spread, worms either exploit a vulnerability on the target system or use some kind of social engineering to trick users into executing them. A worm enters a computer through a vulnerability in the system and takes advantage of file-transport or information-transport features on the system, allowing it to travel unaided.” (Cisco Systems, Inc. n.d.). 

16 At the time Stuxnet was discovered, 500 KB was a nearly unheard-of size in the malware world, because the general idea of a malware package is to keep a low profile and avoid detection by the user’s operating system or antivirus platforms. Soon after the release of Stuxnet, however, some truly gargantuan malware programs related to the Stuxnet infrastructure were discovered. Notable is Flame, which came in at 20 MB and was isolated and identified in May 2012.

17 (Constantin 2012) 

18 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

19 Zero-day exploits are vulnerabilities in software that have never been previously identified by anti-virus and security firms and are also unknown to the software makers themselves, leaving the system completely defenseless in the face of attacks that utilize those security holes. 

20 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

21 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

22 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

23 (Rosenbaum 2012) (quoting Richard A. Clarke) 

24 (Rosenbaum 2012) 

25 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

26 Enriched product intended for energy-generation facilities generally measures in at about three to five percent enrichment and, while crude nuclear devices can be constructed with uranium enriched to 20 percent, truly weapons-grade material is enriched to roughly 90 percent. (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

27 (Sanger 2012) 

28 Most malware, and worms in particular, are constructed as vehicles for delivery of a “payload” package, much like an ICBM is designed as a vehicle for the transport and delivery of its warhead. 

29 A man-in-the-middle tactic in cyber security circles is one in which a program inserts itself between two systems and creates a false feedback loop that prevents monitoring systems from receiving correct diagnostic information. In the case of Stuxnet, the man-in-the-middle component of the malware prevented the Natanz computers from noticing that the centrifuges were spinning at critically high and low speeds: the monitoring system was always fed data indicating that the centrifuges were spinning at the proper 1,064 Hz level. 

30 “After the initial reconnaissance stage recording data for thirteen days, Stuxnet first increased the frequency of the converters to 1,410 Hz for fifteen minutes, then reduced it to 1,064 Hz, presumably the normal operating frequency, for about twenty-six days. Once Stuxnet recorded all of the data it needed to record during these three weeks, it dropped the frequency drastically to 2 Hz for fifty minutes, before restoring it to 1,064 Hz again. After another twenty-six days, the attack began again. Each time the sabotage commenced, the man-in-the-middle attack fed false frequency readings back to the operators and safety system to keep them blind to what was happening.” (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014). 
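The sabotage cycle Zetter describes can be sketched, in heavily simplified form, as a sequence of frequency changes paired with replayed readings. The `Centrifuge` class, function names, and the replay of exactly three recorded readings are illustrative assumptions; the frequencies (1,410 Hz, 1,064 Hz, 2 Hz) come from the account quoted above.

```python
from dataclasses import dataclass

NORMAL_HZ = 1064  # presumed normal operating frequency, per Zetter's account

@dataclass
class Centrifuge:
    frequency: float = NORMAL_HZ

def sabotage_cycle(centrifuge, recorded_readings):
    """Hypothetical sketch of one attack cycle: drive the hardware to
    damaging frequencies while replaying previously recorded 'normal'
    data to the operators and safety system."""
    reported = []
    # Phase 1: overspeed to 1,410 Hz (about fifteen minutes in the real attack)
    centrifuge.frequency = 1410
    reported.append(recorded_readings[0])  # operators see only replayed data
    # Phase 2: return to normal (about twenty-six days)
    centrifuge.frequency = NORMAL_HZ
    reported.append(recorded_readings[1])
    # Phase 3: drop drastically to 2 Hz (about fifty minutes)
    centrifuge.frequency = 2
    reported.append(recorded_readings[2])
    # Restore normal operation before the next cycle begins
    centrifuge.frequency = NORMAL_HZ
    return reported

c = Centrifuge()
shown = sabotage_cycle(c, [NORMAL_HZ] * 3)
# Every value shown to the operators is the recorded normal frequency,
# even though the hardware was driven to 1,410 Hz and 2 Hz in between.
assert all(v == NORMAL_HZ for v in shown)
```

The essential trick is the decoupling of the two channels: what the hardware is commanded to do and what the monitoring system is told are entirely independent.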

31 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

32 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

33 Stuxnet was designed to erase itself using what is called a TTL, or “Time to Live,” protocol – a command that tells a program to self-terminate at a given date – but not all computers are set to the correct date and time. A machine with an incorrect system clock could therefore fail ever to execute the erase command. (Rosenbaum 2012) 
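A date-based kill switch of this kind reduces to a single comparison against the local system clock, which is exactly why it is fragile. The sketch below is a minimal illustration, not Stuxnet's actual implementation; the June 24, 2012 date is the widely reported deadline built into the worm.

```python
import datetime

KILL_DATE = datetime.date(2012, 6, 24)  # widely reported Stuxnet cutoff date

def should_self_delete(system_clock: datetime.date) -> bool:
    """TTL check: self-terminate only once the local clock, which may be
    wrong, reaches the kill date."""
    return system_clock >= KILL_DATE

# A correctly set clock triggers deletion once the kill date passes...
assert should_self_delete(datetime.date(2013, 1, 1))
# ...but a machine whose clock is years behind never self-deletes.
assert not should_self_delete(datetime.date(2009, 5, 10))
```

Because the check trusts the local clock rather than any external time source, a single misconfigured machine can keep a copy of the malware alive indefinitely.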

34 (Kushner 2013) 

35 (Symantec Security Response 2015) 

36 (Constantin 2012) 

37 (Constantin 2012) 

38 (Constantin 2012) 

39 Gauss, also recently discovered, includes a number of previously unseen exploits and a particularly clever MS Word-based infection method using a fabricated font style. (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014). Gauss also incorporates a great deal of Stuxnet architecture, but much of the re-used code is already discussed in the Duqu, Flame, and Duqu 2.0 sections of this case study. 

40 (Symantec Security Response 2015) 

41 (GReAT (Kaspersky Labs) 2015) 

42 (Symantec Security Response 2015) 

43 Gauss may also have been developed with government support. (GReAT (Kaspersky Labs) 2012). 

44 (Kushner 2013) 

45 (Panetta 2010) 

46 (Zetter, Countdown to Zero Day: Stuxnet and the Launch of the World’s First Digital Weapon 2014) 

Andrew Leedom is a first-year Master’s candidate at the Johns Hopkins University School of Advanced International Studies. His area of concentration is International Political Economy.