
Governance of Dual-Use Technologies: Theory and Practice

Chapter 3: Governance of Information Technology and Cyber Weapons


Herbert Lin


Framing the Problem

In the twenty-first century, information is the key coin of the realm. Nations rely on information and information technology (IT) to ever-increasing degrees. Computers and networks are integral to most business processes, including payroll and accounting, tracking of sales and inventory, and research and development (R&D). The delivery of food, water, energy, transportation, healthcare, and financial services depends on IT, which is itself a major sector of the economy. Modern military forces use weapons that are computer controlled. Coordination of the actions of military forces depends on networks that allow information about the battlefield to be shared. Logistics for both civilian and military activities depend on IT-based scheduling and optimization.

But bad guys also use IT. Criminals use IT to steal intellectual property and commit fraud. Terrorists use IT for recruitment, training, communications, and public outreach, often in highly sophisticated ways, although to date they are not known to have used IT to commit destructive acts. And as the U.S. government is exploring various ways of using cyberspace as an instrument of national policy to create political, military, diplomatic, economic, or business advantages, other nations—some of them with interests that do not align with those of the United States—are doing the same.

One commonly used definition of dual-use technology is “technology intended for beneficial purposes that can also be misused for harmful purposes.”1 This chapter focuses on the governance of specific applications of IT (or research aimed at developing such applications) that are designed and intended to create specific negative effects on a target’s computer or communications system, or on the information stored in it, carried through it, or processed within it, and that can be used for both beneficial and harmful purposes. In the lexicon of this chapter, these specific applications are “cyber weapons.”2 The negative effects of possible concern are effects on integrity (in which data or computer operations are altered with respect to what users expect), effects on availability (in which services provided to users of the system or network are unavailable when expected), and effects on confidentiality (in which information that users expect to keep secret is exposed to others).

Note the distinction between effects and purpose. A gun is designed to have negative effects on objects and people. But in the hands of the good guys (e.g., the police), its use is beneficial to society.3 Guns are misused for harmful purposes primarily when they are put into the hands of the bad guys (e.g., criminals). Similar comments apply to applications of IT with negative effects. For example, a negative effect of a specific program might be to render ineffective the encryption capabilities of a targeted system. In the hands of the good guys, the purpose may be benign or societally beneficial—consider, for example, the properly authorized use of such a program by a law enforcement agency against a computer used by criminals. But if the same computer program performing the same task were used by a terrorist or criminal (e.g., against a government computer containing classified information or a corporate computer holding confidential business plans), that purpose would be regarded as a harmful or nonbenign misuse.

When the use of a cyber weapon affects the integrity or the availability of a service, it is usually classified as a cyberattack. More generally, cyberattack refers to the use of cyber weapons to alter, usurp, deny, disrupt, deceive, degrade, or destroy computer systems or networks used by an adversary or competitor or the information and/or programs resident in or transiting these systems or networks. The activities may also affect artifacts connected to these systems and networks—examples of such artifacts, often called cyber-physical devices, include generators, radar systems, and physical control devices for airplanes, automobiles, and chemical manufacturing plants. A cyberattack might be conducted to prevent authorized users from accessing a computer or information service (a denial of service attack), to destroy computer-controlled machinery, or to destroy or alter critical data (e.g., timetables for the deployment of military logistics).

When the use of a cyber weapon compromises the confidentiality of information that is intended to be kept secret from unauthorized parties, it is usually classified as a “cyber exploitation.” (Press accounts often use the term cyberattack when the activity conducted is actually cyber exploitation.) More generally, cyber exploitation refers to the use of cyber weapons to obtain information resident on or transiting through a system or network. The information sought is information that the target does not wish to have disclosed. For a company, such information may include trade secrets, negotiating positions, R&D information, or other business-sensitive information. For a nation, such information may include intelligence information, the strength and disposition of military forces, military plans, communications with allied nations, and so on. Of particular interest is information that will allow the perpetrator to conduct further penetrations on other systems and networks to gather additional information.
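As a purely illustrative aside, the classification drawn in the two preceding paragraphs can be summarized in a few lines of Python. The names used below (Effect, usual_classification) are inventions of this sketch rather than terms drawn from any source cited in this chapter.

    # Illustrative sketch: mapping the three classes of negative effect to the
    # labels usually applied to them (cyberattack vs. cyber exploitation).
    from enum import Enum, auto

    class Effect(Enum):
        INTEGRITY = auto()        # data or computer operations altered from what users expect
        AVAILABILITY = auto()     # expected services made unavailable to users
        CONFIDENTIALITY = auto()  # information meant to be kept secret is exposed to others

    def usual_classification(effect: Effect) -> str:
        """Return the label usually applied to a cyber weapon producing this effect."""
        if effect in (Effect.INTEGRITY, Effect.AVAILABILITY):
            return "cyberattack"
        return "cyber exploitation"

    for e in Effect:
        print(e.name, "->", usual_classification(e))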

In general, a cyber weapon requires both penetration and payload. (Selecting the targets on which cyber weapons are used is a matter of command and control of those weapons.)

Penetration requires a mechanism for gaining access to the system or network of interest (e.g., through the Internet, by physical intrusion) and taking advantage of a vulnerability in the system or network. Vulnerabilities may be accidentally introduced through a design or implementation flaw (often called a “bug”), or introduced intentionally (e.g., by an untrustworthy insider). Before a vulnerability is known to the supplier of the system or network (and thus before it can be repaired), a system with that vulnerability can be penetrated by an adversary who does know of it. When an adversary uses a vulnerability that is unknown to others to effect penetration, it is termed a “zero-day” penetration or compromise, since the victim will have had zero days to respond to it.

Payload is the term used to describe the mechanism for affecting the victim’s system or network after penetration has occurred. The payload is a program that executes after the cyber weapon has entered the computer system of putative interest;4 payload execution may result in the weapon reproducing and retransmitting itself, destroying files on the system, or altering files. Payloads can be designed to do more than one thing, and these things can happen at different times. If a communications channel is available, payloads can be remotely updated. (And in some cases, the function of the payload is performed by a human being who has gained remote access to the computer in question through use of a penetration mechanism.)

From the standpoint of the victim, one of the most problematic aspects of cyber weapons arises from the fact that the payload—and only the payload—determines whether the weapon is used for damaging or destructive actions (attacks) or nondestructive actions (exploitation/espionage). Even after recognizing an intrusion into a system or network, the victim usually cannot be certain whether the purpose of that intrusion is destructive or nondestructive.5
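The two-part structure described above can also be expressed as a conceptual data model. The sketch below is illustrative only: the class and field names are invented for this sketch, the fields are purely descriptive, and no intrusion capability of any kind is implemented. It simply makes concrete the point that only the payload distinguishes an attack from an exploitation.

    # Conceptual model only: descriptive fields, no operational capability.
    from dataclasses import dataclass

    @dataclass
    class Penetration:
        access_path: str         # e.g., "Internet-facing service" or "physical intrusion"
        vulnerability: str       # description of the flaw being exploited
        zero_day: bool = False   # True if the vulnerability is unknown to the supplier

    @dataclass
    class Payload:
        effect: str                       # "integrity", "availability", or "confidentiality"
        delayed_execution: bool = False   # payload need not run at the moment of penetration
        remotely_updatable: bool = False  # possible if a communications channel is available

    @dataclass
    class CyberWeapon:
        penetration: Penetration
        payload: Payload

        def looks_like_attack(self) -> bool:
            # Only the payload determines attack vs. exploitation, which is why a victim
            # who detects the penetration alone cannot judge the intruder's intent.
            return self.payload.effect in ("integrity", "availability")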

Some of the most important characteristics of cyber weapons are as follows:

  • The use of a cyber weapon can lead to results that vary from the utterly insignificant to destruction over a large scale. Similarly, the duration and spatial scale of a cyber weapon’s impact can span many orders of magnitude. But any given cyber weapon almost certainly is not designed to span such a range.
  • A given cyber weapon can often be used only once because a penetration that takes advantage of a system or network vulnerability usually reveals the vulnerability. If the victim repairs the vulnerability, a later use of the same weapon may not succeed.
  • Obtaining a large-scale and long-lasting impact from the use of a single cyber weapon can be highly challenging. Large-scale impact may well require simultaneous attacks against a large number of heterogeneous targets, and such heterogeneity means that a different attack would have to be crafted against each target type. Long-lasting impact may require repeated strikes against the targets of interest, and any vulnerability whose presence resulted in serious negative effects is likely to be repaired quickly, making that vulnerability unusable in the future.
  • The effects of using a cyber weapon may or may not be significantly delayed in time from the moment of penetration. That is, the payload may not execute immediately once penetration has been effected.
  • The successful use (launch) of a cyber weapon generally depends heavily on accurate, detailed, and timely information about the target (and what is connected to it). Such information may be gathered through the use of a variety of methods, including the use of other cyber weapons. In the absence of such information, the use of any given cyber weapon may have no effect whatsoever.
  • The effects of using a cyber weapon remain unknown until the payload executes (or until all of the payload is available for analysis).
  • The use of a cyber weapon is plausibly deniable under many circumstances—the so-called attribution problem. High-confidence attribution of such use to an entity that can be held responsible is most difficult when the weapon in question has never been used before (which means there is no historical record with which to compare), when the responsible entity has maintained perfect operational security (which means the victim has no other sources of intelligence on which to make a judgment), and when the judgment needs to be made quickly. Conversely, when these conditions are not true, attribution is often much easier.
  • A given cyber weapon may or may not be self-propagating. Self-propagation refers to the ability of software to duplicate itself on one system and then to take advantage of connections to other systems to spread to those systems. Depending on the weapon’s programming, self-propagation may be limited or unlimited. To the extent that the computing environments of affected systems constitute a monoculture, self-propagation is dangerous because the same program can affect all of the systems involved. But if the relevant computing environments differ from one another, the same program is unlikely to produce similar effects on all of them. A cyber weapon that is not self-propagating affects only the system against which it is targeted, except to the extent that failures in that system may affect other systems connected to it.
  • The expertise and infrastructure needed to create certain kinds of cyber weapons extend beyond the usual purview of computer scientists. Cyber weapons that are intended to be used against cyber-physical systems—systems or devices that are controlled by computer but have tangible effects in the physical world—also require expertise specific to those systems or devices and, under some circumstances, test facilities that are high-fidelity replicas of the targets to be attacked. (For example, the Stuxnet worm used to attack Iranian centrifuge facilities was previously tested on facilities located at Dimona, the Israeli nuclear complex in the Negev Desert.6)

Because cyber weapons can be used for beneficial purposes (i.e., by the good guys) and misused for harmful purposes (i.e., by the bad guys), cyber weapons constitute a dual-use technology of concern. But unlike the case for the analogous dual-use technology in biology (for which there is a well-established consensus that the use of a biological weapon would define the user as a bad guy), what makes the use of a cyber weapon harmful is very much in the eyes of the beholder.

For example, consider technologies that make it easier for nations to spy on one another. Most nations conduct espionage operations on other nations, and yet no nation wants other nations to conduct similar operations against it. From Nation A’s perspective regarding Nation B, A’s use of espionage against B serves a beneficial purpose, whereas B’s use of espionage against A serves a harmful purpose. Of course, Nation B believes the opposite.

The nations of the world have not agreed that the use of cyber weapons is ipso facto a harmful use, nor have they agreed that only bad guys use cyber weapons or that the development and acquisition of cyber weapons is necessarily something to be avoided.7 For these reasons, much of the governance discussion in this chapter explores what the world does and does not believe about cyber weapons.

A second example comes from the email people routinely receive. A substantial portion of email traffic consists of “spam”—unsolicited commercial email that is sent in bulk. For the vast majority of recipients, such emails are annoying and in effect constitute a denial of service attack on them. Recipients waste time deleting these emails in search of useful emails in their traffic. But for the senders of such email and a small proportion of those who receive it, the email is beneficial. Senders earn some profit from sending the emails, and some individuals want the products or services offered and respond affirmatively.

So what are the beneficial purposes of cyber weapons? Perhaps the most important purpose is to assist defenders in testing themselves against adversaries. That is, if I want to strengthen my system against a cyber onslaught, I need to take specific measures—and then I need to test my upgraded system to see if indeed it is more robust. Knowledge of possible offensive techniques (using cyber weapons) helps me to design a better defense—and my refraining from developing specific cyber weapons is no assurance that others will do the same.

Who uses cyber weapons for harmful purposes? The range of possible users is large and includes lone hackers acting as individuals; criminals acting on their own for profit; organized crime (e.g., drug cartels); transnational terrorists (perhaps acting with state sponsorship or tolerance); small nation-states; and major nation-states. Moreover, today one can find service providers who will, for a fee, use cyber weapons against targets of the customer’s choosing. The availability of such services enables any party with the appropriate financial resources to cause negative cyber effects, even if that party has no particular technical expertise.

Motivations for using cyber weapons in such operations also span a wide range. One of the most common motivations is financial. Cyber exploitations can yield valuable information, such as credit card numbers or bank log-in credentials; trade secrets; business development plans; or contract negotiation strategies—such information can be sold. Cyberattacks can disrupt the production schedules of competitors, destroy valuable data belonging to a competitor, or be used as a tool to extort money from a victim.

Another possible motivation is political. A perpetrator might use cyber weapons to advance some political purpose. A cyberattack or exploitation may be conducted to send a political message to a nation, to gather intelligence for national purposes, to persuade or influence another party to behave in a certain manner, or to dissuade another party from taking certain actions.

Still another reason for conducting such operations is personal. The perpetrator might conduct the operation to obtain “bragging rights,” to demonstrate mastery of certain technical skills, or to satisfy personal curiosities.

Lastly, the use of cyber weapons could be integrated into military operations in much the same way as kinetic weapons. In such scenarios, cyber weapons become just another weapon that military commanders might use—in this case, to damage either the system or network directly targeted or the devices connected to it. Individuals with no military affiliation may also wish to use cyber weapons for physically destructive purposes for reasons such as maliciousness, extortion, or financial gain.

A focus on the governance of cyber weapons means that other governance measures that promote cyber defenses—applications of IT intended to thwart or respond to the operation of cyber weapons—are not central to this chapter. In the larger picture of efforts to promote and enhance cybersecurity, this is a significant omission, as the vast majority of work on cybersecurity and related governance measures is defensively oriented. But because the vast majority of defensive applications are regarded as benign and few parties feel a need to govern benign activities, they fall outside this chapter’s ambit.8 Therefore, this chapter does not address governance measures focused on defense, such as measures to improve coordination of defensive responses to cyberattacks, to promote and enhance cooperative relationships among law enforcement authorities in different nations in order to enhance their ability to respond to cyberattacks, or to build stronger and more resilient cyber infrastructures. Such measures—and others—are unquestionably important to the governance of security in cyberspace, but the issues associated with the governance of security in cyberspace constitute a vastly larger set than those associated with cyber weapons per se.

Moreover, the technical specifics of cyber defenses are not in general closely related to the specific details of cyber weapons. For example, a cyber defense may look for the known “fingerprint” of a penetration mechanism, but the part of the weapon that does the actual harm is its payload (which may not even be present at the time the penetration mechanism is recognized). In this regard, cyber defenses share a characteristic with nuclear defenses, which are more properly characterized as defenses against the delivery systems that could carry nuclear weapons rather than against the nuclear weapons themselves. But with one notable exception, cyber defenses do not seek to mitigate damage caused by the use of cyber weapons. (The exception is that encryption mitigates harm caused by cyber exploitation. Adversaries may obtain encrypted information, but that information is useless to them without a way to decrypt it.) Cyber defenses are thus dissimilar to biological defenses such as antimicrobial drugs or vaccines, which are developed to mitigate damage caused by biological agents (of either natural or deliberate origin).
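The one exception noted above can be illustrated concretely. The short Python sketch below assumes the third-party cryptography package (and its Fernet interface); the data and variable names are hypothetical. Its only point is that an adversary who exfiltrates encrypted records obtains ciphertext that is useless without the key.

    # Minimal sketch, assuming the third-party "cryptography" package is installed.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # held by the data owner, ideally not on the protected system
    vault = Fernet(key)

    record = b"SSN=123-45-6789; clearance=TS"   # hypothetical sensitive record
    stored_blob = vault.encrypt(record)         # what actually sits on disk

    # An intruder who copies stored_blob has exfiltrated data but learns nothing useful:
    print(stored_blob[:40])                     # opaque ciphertext
    # Only a holder of the key can recover the plaintext:
    assert vault.decrypt(stored_blob) == record

Encryption does not prevent the exfiltration itself, and it offers no protection against effects on integrity or availability; it only limits the harm done by a compromise of confidentiality.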

Techniques and approaches that protect against deliberately induced failures in IT (i.e., that protect against cyber weapons) are also often useful against failures in IT that are not deliberately induced. For example, the Morris worm of 1988 was a self-replicating, self-propagating program that was released onto the Internet. The author, Robert Morris, had intended the program to install no more than one copy of itself on any given system. An error in the program caused it to replicate numerous times on the systems it touched, thereby crashing those systems. The program eventually spread to a large number of systems on the Internet (around 6,000, or about 10 percent of the Internet-connected systems at the time). The program took advantage of vulnerabilities in existing programs on those systems. Had those vulnerabilities been repaired or never been introduced, the program would not have been successful at spreading to even one machine.

An additional question remains: Given its potential for beneficial and harmful uses, should IT itself be regarded as a dual-use technology? The answer is no, at least in the sense that one would not logically regard physics or biology in the same way. Physics, biology, and IT can be used to create a broad range of applications, only some of which raise dual-use concerns. Pencils and walkie-talkies are applications of IT, and in the hands of criminals or terrorists they are often used to facilitate the commission of crimes and other terrorist acts. Yet the public has expressed little concern about the misuse of pencils and walkie-talkies for harmful purposes.

More generally, IT is often regarded as a medium for expressing thoughts. As described in a 1992 National Research Council (NRC) report on the future of computer science, “Computer programs enable the computer scientist and engineer to feel the excitement of seeing something spring to life from the ‘mind’s eye’ and of creating information artifacts that have considerable practical utility for people in all walks of life.”9

Fred Brooks, arguably one of the fathers of modern computing, writes, “The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds castles in the air, creating by the exertion of the imagination. . . . Yet the program construct, unlike the poet’s words, is real in the sense that it moves and works, producing visible outputs separate from the construct itself. . . . The magic of myth and legend has come true in our time. One types the correct incantation on a keyboard, and a display screen comes to life, showing things that never were nor could be.”10

If IT is indeed a general-purpose medium for expression, meaningful “governance” of such a technology is hard to imagine. That said, a question still remains regarding the existence of specific areas of research in IT where progress may help to enable the creation or improvement of cyber weapons.


Past Uses of Cyber Weapons

Past uses of cyber weapons have encompassed a wide range of criminal activities (for example, use of cyber weapons to steal money, commit fraud, or appropriate trade secrets that constitute intellectual property), activities that are aimed at obtaining national security information (for example, use of cyber weapons by one nation to conduct espionage against another), and activities that are destructive in nature (for example, use of cyber weapons to destroy data, render information systems inoperable, or damage machinery controlled by computers).

Some of the more notable instances in which cyber weapons have been used include the following.11

  • A denial of service attack in 2007 against Estonian government websites, media sites, and online banking services prevented citizen access to these sites and services for an extended period of time. The attack is widely believed to have originated in Russia, though whether the attack was launched at the explicit behest of the Russian government is less clear.
  • Stuxnet, a cyberattack conducted in 2009 and 2010, destroyed about one thousand Iranian uranium enrichment centrifuges.12 The United States and possibly Israel are widely believed to have been responsible for the attack.
  • In August 2012, Aramco, the national oil firm of Saudi Arabia, was struck by a cyberattack that wiped out the data and operating systems on thirty thousand computers connected to the Aramco network.13 According to press reports, the United States believes Iran was responsible for the attack.
  • A denial of service attack in the fall of 2012 against U.S. banks caused significant delays for users trying to access online banking sites.14 Some analysts believe that the government of Iran tolerated or encouraged these attacks, though the extent of its responsibility is unclear.
  • In June 2013, the U.S. Department of Defense acknowledged that sensitive unclassified data regarding the F-35 fighter jet had been stolen, significantly reducing the U.S. design and production edge on fifth-generation fighters (e.g., in cost and lead time) relative to other nations seeking to produce such fighters.15
  • In December 2013, Target reported a data breach involving the credit and debit card records of more than 40 million customers, as well as personal information such as email and mailing addresses for some 70 million people. The access path used by the intruders involved one of Target’s HVAC service vendors. The vendor apparently had access to the entire Target network.16
  • In May 2014 the U.S. Justice Department issued indictments against five members of the Chinese People’s Liberation Army for violations of the Computer Fraud and Abuse Act (CFAA) and the Economic Espionage Act, alleging that these individuals engaged in criminal acts of industrial espionage that took place in the 2006–2014 period.17
  • In September 2014, Home Depot reported that about 56 million credit and debit cards had probably been compromised over a six-month period earlier that year through malicious software implanted on point-of-sale terminals.18
  • In November 2014, Sony Pictures Entertainment was the victim of a cyberattack that compromised unreleased films, private email correspondence, and other sensitive information and also destroyed operating systems on Sony computers.19 The United States publicly attributed this attack to North Korea.
  • In January 2015, Wired magazine reported that a cyberattack on a steel mill in Germany had manipulated control systems in such a way that “a blast furnace could not be properly shut down, resulting in ‘massive’—though unspecified—damage.” The German government report on this incident does not specify when the attack occurred.20
  • In February 2015, Anthem, one of the largest health insurers in the United States, announced that it had been the target of an effort to obtain the personal information of tens of millions of its customers and employees. The information in question included names, Social Security numbers, birthdays, addresses, email addresses, and employment information, including income data.21
  • In March 2015, Premera Blue Cross, a health insurer based in Washington State, reported that the personal information of up to 11 million customers could have been exposed in a data breach that occurred in 2014.22
  • In August 2015, the Office of Personnel Management of the U.S. government revealed that approximately 22 million personnel records of U.S. government employees—including those with high-level security clearances—had been compromised. These records contained information that went far beyond basic identifying information and, in the case of those who had applied for security clearances, included fingerprints and lists of foreign contacts.23

Over time, the vast majority of instances in which cyber weapons have been used have involved the exfiltration of information—cyber exploitation—rather than an act of destruction or denial.

As for scale, few good numbers are available for the frequency with which cyber weapons have been used in the past. Part of the problem is that both failed and successful uses may easily go unreported because they have not been noticed. A failed penetration attempt may go unnoticed because it is unsuccessful. A successful penetration attempt may be successful precisely because it was unnoticed. Definitions of what it means to “use” a cyber weapon are also inconsistent. (For example, some analysts define use as a successful use, whereas others define a use as an attempted use, successful or not. Some analysts regard a probe to test for access points as the “use” of a cyber weapon, while others do not because such probes generally do not compromise system operation.)

With these caveats in mind, a survey by PricewaterhouseCoopers of more than 9,700 security, IT, and business executives found that respondents detected 42.8 million cybersecurity “incidents” in 2014, an increase of 48 percent over 2013.24 A spokesman for the National Nuclear Security Administration was quoted in 2012 as saying that the agency experiences up to 10 million “security significant cyber security events” each day, of which “less than one hundredth of a percent can be categorized as successful attacks against the Nuclear Security Enterprise computing infrastructure.”25


Today’s Governance of Cyber Weapons

One source refers to governance as “all processes of governing, whether undertaken by a government, market or network, whether over a family, tribe, formal or informal organization or territory and whether through laws, norms, power or language.”26 With such a broad scope, mechanisms of governance clearly go beyond law, though law is an important aspect of governance. Governance mechanisms also include government policies, norms of behavior (which may or may not be reflected in law), codes of conduct, ethics, markets, and education. They may also involve nongovernment actors.

On what aspects of cyber weapons could governance mechanisms operate or have an effect? In principle, three distinct aspects should be considered.

  • Governance might address the acquisition of some or all cyber weapons, where acquisition should be understood to mean research, development, testing, production, sale, transfer, or some combination thereof.
  • Conceptually separate from restrictions on acquisition, governance might also seek limits on the deployment or use of some or all cyber weapons or limit the circumstances of such use.
  • Lastly, governance might make use of transparency and confidence-building measures, which call for nations to take or refrain from taking certain actions in the hope that such behavior will reassure other parties of the acting nation’s benign intent.

International Law Regarding Cyber Weapons

No treaties or other international agreements address any aspect of the acquisition of cyber weapons. Thus, research, development, testing, or production of cyber weapons is entirely unconstrained by international law.

In addition, no treaties or other international agreements address directly and explicitly the use of cyber weapons. However, some existing bodies of law may in principle be applied to the use of cyber weapons. In a 2012 speech, Harold Koh, then legal adviser to the U.S. secretary of state, explicitly stated the U.S. view that international law principles do apply in cyberspace.27 Thus, from the U.S. perspective, international law provides an important legal framework from which to understand constraints on the use of cyber weapons.

Specifically, international law—under the rubric of the law of armed conflict (LOAC)—addresses the use of armed force by states in two ways. First, when is it legal for a nation to use force against another nation? This body of law is known as jus ad bellum. Second, what are the rules that govern the behavior of combatants who are engaged in armed conflict? Known as jus in bello, this body of law is separate and distinct from jus ad bellum.

The UN Charter and Jus ad bellum. Jus ad bellum is governed by the UN Charter (written in 1945), interpretations of the UN Charter, and some customary international law that has developed in connection with and sometimes prior to the UN Charter. Article 2(4) of the Charter prohibits “the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations.” Article 51 provides for an exception to this prohibition, affirming the inherent right of states to self-defense in the case of an “armed attack.”

The UN Charter does not formally define use of force, threat of force, or armed attack. Based largely on historical precedents, nations appear to agree that a variety of unfriendly actions, including unfavorable trade decisions, space-based surveillance, boycotts, severance of diplomatic relations, denial of communications, espionage, economic competition or sanctions, and economic and political coercion, do not rise to the threshold of a “use of force,” regardless of the scale of their effects. By contrast, “armed attack” is likely to include declared war, occupation of territory, naval blockade, or the use of armed force against territory, military forces, or civilians abroad.

In his 2012 speech, Koh expanded on the use of cyber weapons and jus ad bellum, noting that:

  • cyber activities may in certain circumstances constitute uses of force within the meaning of Article 2(4) of the UN Charter and customary international law;
  • a state’s inherent right of self-defense, recognized in Article 51 of the UN Charter, may be triggered by computer network activities that amount to an armed attack or imminent threat thereof;
  • states conducting activities in cyberspace must take into account the sovereignty of other states, including outside the context of armed conflict; and
  • states are legally responsible for activities undertaken through “proxy actors” who act on the state’s instructions or under its direction or control.

For actions relating to security, international law also recognizes the concept of countermeasures.28 According to Michael Schmitt, countermeasures are “State actions, or omissions, directed at another State that would otherwise violate an obligation owed to that State and that are conducted by the former in order to compel or convince the latter to desist in its own internationally wrongful acts or omissions.”29 That is, countermeasures taken by B against A would themselves be unlawful actions were it not for the wrongful actions of A against B. B’s countermeasures must be taken only for the purpose of persuading A to desist in A’s wrongful actions. Moreover, countermeasures are relevant only when A’s wrongful actions do not rise to the threshold of a “use of force” or “an armed attack” as the latter terms are used in the UN Charter. (If A’s actions do rise to these levels, Article 2(4) and Article 51 of the UN Charter come into play.)

Countermeasures are subject to two constraints. First, they must themselves be below the threshold of a use of force or an armed attack. Second, the provoking action must be attributable to a specific responsible nation (in the example above, A must be known to be the specific nation that is in fact responsible for the action).

The Geneva Conventions and Jus in bello. Jus in bello is governed by the Geneva Conventions of 1949 and their subsequent protocols, interpretations of the conventions, and some customary international law that has developed in connection with and sometimes prior to the conventions. Several fundamental principles underlie the Geneva Conventions, including:

  • Military necessity. The only targets that may be attacked are those that make a direct contribution to the enemy’s war effort or those whose damage or destruction would produce a military advantage because of their nature, location, purpose, or use.
  • Proportionality. Attacks on valid military targets may result in collateral injury to people and damage to civilian assets. The Geneva Conventions allow some degree of collateral damage, but not if the foreseeable collateral damage is disproportionate to the military advantage likely to be gained from the attack. When military and nonmilitary assets are circumstantially commingled (e.g., a common electrical grid powers both military and civilian facilities), the attacker must make a proportionality judgment. If the enemy has deliberately intermingled military and nonmilitary assets or people (e.g., by using human shields), the enemy must also assume some responsibility for the collateral damage that may result. (In the latter case the attacker must still make a proportionality judgment.)
  • Distinction. Distinction requires armed forces to make reasonable efforts to distinguish between military and civilian assets and between military personnel and civilians and to refrain from deliberately attacking civilians or civilian assets. The Geneva Conventions also confer special protected status on civilian facilities such as houses of religious worship and hospitals.
  • Discrimination. Nations have agreed to refrain from using weapons such as biological and chemical weapons at least in part because they are inherently indiscriminate. An inherently indiscriminate weapon is one that cannot be used in a manner that discriminates between combatants and noncombatants. However, because nearly all weapons can be used indiscriminately, harm to noncombatants is minimized through adherence to the requirements of proportionality imposed on the use of weapons.

Regarding cyber weapons and jus in bello, Koh said that:

  • in the context of an armed conflict, the law of armed conflict applies to regulate the use of cyber tools in hostilities, just as it does other tools;
  • the jus in bello principle of distinction (that is, distinguishing between military and nonmilitary objectives) applies to computer network attacks undertaken in the context of an armed conflict; and
  • the jus in bello principle of proportionality applies to computer network attacks undertaken in the context of an armed conflict.

Applying Existing International Law to the Use of Cyber Weapons. As of early 2016 no international legal precedents—no decisions by the International Court of Justice, no resolutions from the UN Security Council or General Assembly—guide interpretation of international law as it pertains to the use of cyber weapons.

A variety of reports and proposals, however, do address the topic. Perhaps the best known of these analyses—the Tallinn Manual on the International Law Applicable to Cyber Warfare of 2013—presents the views of twenty international law scholars and practitioners on how international law applies to cyber warfare and proposes ninety-five “black-letter rules” relevant to cyber conflict that can be derived from international law (including law related to sovereignty, state responsibility, and neutrality, as well as the UN Charter and the Geneva Conventions).30

Two examples of the manual’s black-letter rules will give a flavor of their character:

  • Rule 10 states, “A cyber operation that constitutes a threat or use of force against the territorial integrity or political independence of any State, or that is in any other manner inconsistent with the purposes of the United Nations, is unlawful.”31
  • Rule 37 states, “Civilian objects shall not be made the object of cyberattacks. Computers, computer networks, and cyber infrastructure may be made the object of attack if they are military objectives.”32

The Tallinn Manual was the result of an initiative undertaken by the NATO Cooperative Cyber Defence Centre of Excellence, although the book’s introduction states that the manual “is not an official document but rather the product of a group of independent experts acting solely in their personal capacity.”33 The introduction further states that it does not represent the views of the Centre of Excellence, its sponsoring nations, or NATO. Nevertheless, the document is the only comprehensive source of legal analysis on this topic and is widely regarded as the most authoritative treatment to date.

Rule 9 of the Tallinn Manual briefly discusses countermeasures, stating, “A state injured by an internationally wrongful act may resort to proportionate countermeasures, including cyber countermeasures, against the responsible state.”34 The manual also notes a disagreement among its experts on what actions count as allowable countermeasures, some arguing that countermeasures entailing the use or threat of force are entirely prohibited and others arguing that a limited use of force might be appropriate if that use were below the threshold of an armed attack.

A second noteworthy document is the August 2013 report of the UN Group of Governmental Experts (GGE) on Information, Telecommunications, and International Security. This group, composed of governmental experts on IT from fifteen nations (Argentina, Australia, Belarus, Canada, China, Egypt, Estonia, France, Germany, India, Indonesia, Japan, the Russian Federation, the United Kingdom, and the United States), was established at the request of the UN General Assembly to study “existing and potential threats in the sphere of information security and possible cooperative measures to address them including norms, rules or principles of responsible behavior of States and confidence-building measures with regard to information space, as well as the concepts aimed at strengthening the security of global information and telecommunications systems.”35

The 2013 GGE report concludes that “International law, and in particular the Charter of the United Nations, is applicable and is essential to maintaining peace and stability and promoting an open, secure, peaceful and accessible ICT [information and communications technology] environment.”36 Although this statement is a recommendation of the group of experts rather than an authoritative commitment from the nations they represent, many of the experts in the group have formal affiliations with their national governments, and the statement has frequently been interpreted as indicating a concurrence among the represented nations that international law applies to cyberspace. The report was presented to the UN General Assembly, in accordance with the report’s terms of reference, after which the General Assembly unanimously took note of the report without accepting any of its specific assessments or recommendations.37

In July 2015, a new UN group of governmental experts issued a second report on “Developments in the Field of Information and Telecommunications in the Context of International Security.”38 The new group numbered twenty and included experts from Antigua and Barbuda, Belarus, Brazil, China, Colombia, Egypt, Estonia, France, Germany, Ghana, Israel, Japan, Kenya, Malaysia, Mexico, Pakistan, Russia, Spain, the United Kingdom, and the United States.

Although the 2015 report does explicitly endorse other parts of the 2013 report, such as those related to capacity building, it does not explicitly endorse the conclusion of the 2013 report that “international law, and in particular the Charter of the United Nations, is applicable and is essential to maintaining peace and stability and promoting an open, secure, peaceful and accessible ICT environment.” However, Paragraph 28(c) of the 2015 report says, “Underscoring the aspirations of the international community to the peaceful use of ICTs for the common good of mankind, and recalling that the Charter applies in its entirety, the Group noted the inherent right of States to take measures consistent with international law and as recognized in the Charter.”39

Some commentators have interpreted the inclusion of Paragraph 28(c) as an implicit acknowledgment that the UN Charter applies in its entirety to the use of ICTs. Others—including this author—note that, at the very least, Paragraph 28(c) of the 2015 GGE report is nowhere near as clear as the allegedly comparable statement in the 2013 report and that no nations—except the United States and the United Kingdom—have authoritatively repeated the assertion that international law applies to cyberspace.

Since the issuance of the 2015 GGE report, two significant events have occurred. In a joint statement resulting from the Sino-American summit between Presidents Xi Jinping and Barack Obama in September 2015, the two nations agreed not to “conduct or knowingly support cyber-enabled theft of intellectual property, including trade secrets or other confidential business information, with the intent of providing competitive advantages to companies or commercial sectors.”40 In addition, they “welcomed” the 2015 UN GGE report on cybersecurity, which recommended that states “should not conduct or knowingly support ICT activity contrary to its obligations under international law that intentionally damages critical infrastructure or otherwise impairs the use and operation of critical infrastructure to provide services to the public.”41

Cyber-enabled theft of intellectual property has been the most significant impediment to better cyber relations between China and the United States, and the joint statement is noteworthy because China has never before made such an explicit statement on that topic. (President Xi has also issued similar joint statements with the leaders of the United Kingdom and Germany.) But observers have expressed skepticism about whether the stated commitment from China will be accompanied by an actual reduction in such theft in the future.42 Moreover, the statement is entirely silent on the use of cyber weapons for destructive purposes against critical infrastructure.

Just two months later, the second significant event occurred when the leaders of the G20, which represents the world’s largest advanced and emerging economies, agreed in the communiqué from their summit in Antalya, Turkey, that no country should conduct or support cyber theft “of intellectual property, including trade secrets or other confidential business information, with the intent of providing competitive advantages to companies or commercial sectors” and, further, that “we [the leaders of the G20] affirm that international law, and in particular the UN Charter, is applicable to state conduct in the use of ICTs.”43 This last statement mirrors the statement contained in the 2013 GGE report but, because it is explicitly endorsed by the national leaderships of the signatory nations, can be regarded as authoritative.

In the few months after the 2015 GGE report was released, a number of observers (including this author) were concerned that because the GGE report failed to include statements that strongly endorsed the recommendations of the 2013 report, certain nations, such as China and possibly Russia, were backing away from the 2013 statement regarding international law’s applicability to cyberspace.

The G20 summit communiqué is a significant change that ameliorates some of these concerns. Whether the communiqué will be regarded as the start of an emerging consensus on cyber norms, however, remains to be seen.

The Budapest Convention. The Budapest Convention on Cybercrime of 2001 is an international agreement among forty-seven nations (including most members of the Council of Europe, the United States, Canada, Australia, and Japan).44 Notably, Russia and China are not parties to the convention. The convention has three main purposes: to enact domestic laws that criminalize certain kinds of behavior in cyberspace; to implement certain investigative procedures for law enforcement in the signatory nations; and to enhance international cooperation regarding law enforcement activities against cybercrime.45

  • Criminalized behavior. Some of the behaviors that parties to the convention agree to criminalize include improper access to a computer, improper interception of data, data interference, system interference, and misuse of devices.46 In many cases, the use of a cyber weapon could be included under these rubrics.
  • Investigatory procedures. Parties to the convention agree to enact a variety of procedural mechanisms and procedures to facilitate the investigation of cybercrimes or any crimes committed with a computer or for which evidence may be found in electronic form.
  • International cooperation. Parties to the convention agree to implement mechanisms through which they will assist one another in investigating cybercrimes and other crimes involving electronic evidence. However, cooperation may be limited or delayed by a nation’s domestic laws or by other arrangements. In addition, parties can usually decline to cooperate if such cooperation would compromise their sovereignty, security, law enforcement, public order, or other essential interests.

Of these purposes, only the first category addresses the governance of cyber weapons as such—focusing on uses of cyber weapons that should be discouraged through criminalization.

The Budapest Convention itself does not establish international law that criminalizes specific behaviors. Rather, it harmonizes domestic criminal law across the signatory nations regarding these behaviors.

The Agreement between the Member States of the Shanghai Cooperation Organization. In June 2009, the six member states of the Shanghai Cooperation Organization (Russia, China, Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan) concluded a Russian-drafted agreement defining “information wars” broadly as a “confrontation between two or more states in the information space aimed at damaging information systems, processes and resources, critical and other structures, undermining political, economic and social systems, [and] mass psychologic brainwashing to destabilize society and state.”47 While this definition does include cyberattack within its ambit, the status of cyber exploitation is uncertain.

For the most part, this agreement was a joint statement among the signatories emphasizing their views on the undesirability of information war. However, Article 4(1) states, “The parties shall cooperate and act in the international information space within the framework of this agreement in such a way that the activities contribute to social and economic development and comply with maintaining international security and stability, generally recognized principles and norms of international law, including the principles of peaceful settlement of disputes and conflicts, non-use of force, non-interference in internal affairs, respect for human rights and fundamental freedoms and the principles of regional cooperation and non-interference in the information resources of the States of the Parties.”

In the reading of this author, this statement does not require any signatory to refrain from any particular action in cyberspace. The agreement commits the parties to take actions that contribute to social and economic development in a manner consistent with the principles listed, but it does not explicitly prohibit one signatory from launching cyberattacks against another if, in the judgment of the launching nation, such attacks might help to maintain security and stability.

The Sino-Russian Cyber Security Agreement of 2015. On May 8, 2015, the Russian Federation and the People’s Republic of China signed an agreement to cooperate on information security,48 a term that is discussed in greater detail in a later section of this chapter. Article 4(3) states, “Each Party has an equal right to the protection of the information resources of their state against misuse and unsanctioned interference, including computer attacks against them. Each Party shall not exercise such actions with respect to the other Party and shall assist the other Party in the realization of said right.”49 Dr. Elaine Korzak, at the time of this writing a National Fellow at the Hoover Institution at Stanford University, notes that these two sentences together could be read as prohibiting Russia and China from using “computer attacks” against each other.

A Note on Cyber Espionage. The term cyber weapon encompasses all applications of IT that have an impact on the integrity, availability, or confidentiality of information inside a targeted information system or network, being carried through it, or being processed within it. This definition focuses on the technical dimensions of cyber weapons, but the legal distinction between these kinds of impact is significant.

In particular, compromises of confidentiality are usually regarded as espionage. Most important, a compromise of confidentiality still leaves the targeted computer working exactly as it did before. If I steal a $10 bill from you, I have it and you do not; but if I steal your Social Security number, I have it and you still have it. Espionage—whether committed through cyber or noncyber means—is illegal under the domestic laws of virtually all nations, but it is not forbidden under international law. For example, W. Hays Parks (former Defense Department attorney and Special Assistant to the Army Judge Advocate General) writes,

Each nation endeavors to deny intelligence gathering within its territory through domestic laws . . . . Prosecution under domestic law (or the threat thereof) constitutes a form of denial of information rather than the assertion of a per se violation of international law; domestic laws are promulgated in such a way to deny foreign intelligence collection efforts within a nation’s territory without inhibiting that nation’s efforts to collect intelligence about other nations. No serious proposal has ever been made within the international community to prohibit intelligence collection as a violation of international law because of the tacit acknowledgement by nations that it is important to all, and practiced by each.50

Thus, by this logic, espionage conducted by or through the use of a computer—also known as cyber espionage—is also not forbidden by international law, and nations that engage in cyber espionage do derive significant benefit from it.


U.S. Domestic Law Regarding Cyber Weapons

In the United States, no domestic law addresses any aspect of research, development, testing, or production of cyber weapons. However, the United States criminalizes unauthorized access to computers under the Computer Fraud and Abuse Act (CFAA). Most significant, the CFAA criminalizes unauthorized access originating from any party under U.S. jurisdiction to any computer connected to the Internet, wherever in the world the computer is located. That access may be effected in any number of ways, including through the use of a cyber weapon. The CFAA contains an explicit exception for U.S. law enforcement and intelligence agencies, however, allowing them to engage in activities that are otherwise prohibited under the act.

The United States also criminalizes unauthorized interception of electronic communications under the provisions of the Electronic Communications Privacy Act (ECPA), as amended. (The ECPA also contains a number of exceptions applying to U.S. law enforcement and intelligence agencies.) Under the definitions used in this chapter, an application of IT that enabled the interception of communications would be classified as a cyber weapon.

Many domestic laws criminalize actions without specific regard for the instruments used in those actions. For example, the Economic Espionage Act criminalizes the stealing of economic information but does not specifically mention how one might effect such a theft. Today such theft is often perpetrated through the use of cyber weapons.

As for domestic law regarding the export of cyber weapons, the United States is a party to the Wassenaar Arrangement, in which a number of states have agreed to regulate exports of certain cyber weapons and related dual-use technologies. (In this case, the adopted definition of dual-use technologies—namely, technologies that can be used for either civilian or military purposes—is that of the U.S. government.) This arrangement was established “to contribute to regional and international security and stability, by promoting transparency and greater responsibility in transfers of conventional arms and dual-use goods and technologies, thus preventing destabilising accumulations. Participating States seek, through their national policies, to ensure that transfers of these items do not contribute to the development or enhancement of military capabilities which undermine these goals, and are not diverted to support such capabilities.”51 In the United States, controlled dual-use technologies are enumerated on the Commerce Control List (CCL),52 and the export of items on the CCL is administered by the Department of Commerce.

For many years, certain technologies for cyber defense were controlled in this manner; restricting the availability of these technologies to unfriendly nations made it easier to attack or conduct signals intelligence against them. But in March 2014, certain technologies for cyber weapons were added to the Wassenaar control list.53

  • Under Category 4-A-5: “Systems, equipment, and components therefor, specially designed or modified for the generation, operation, or delivery of, or communication with, ‘intrusion software’”—where “intrusion software” is “‘Software’ specially designed or modified to avoid detection by ‘monitoring tools’, or to defeat ‘protective countermeasures’, of a computer or network-capable device, and performing any of the following: (a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or (b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.”54
  • Under Category 5-A-1-j: “IP network communications surveillance systems or equipment, and specially designed components therefor” with certain technical capabilities.55

A particular twist to the Wassenaar formulation is that it apparently applies controls not to a hostile payload but to the means of delivery and creation of hostile payloads. That is, the payload per se—the part of a cyber weapon that actually causes negative effects on the targeted computer—is unaffected by the Wassenaar Arrangement.56 Thus, a destructive payload could be exported freely as long as it was not packaged with a penetration mechanism. The reason for this exception is unknown.

As an example of the Wassenaar Arrangement’s impact, VUPEN Security, a leading seller of vulnerabilities, changed its sales policy to sell its products only to approved government agencies in approved countries. VUPEN also announced it would automatically exclude countries subject to European Union restrictions and countries subject to embargoes by the United States or the UN.57

The Wassenaar Arrangement is a harmonization regime for the domestic laws of its participating states rather than an international legal agreement. In this regard it is similar to the Budapest Convention.


Export Controls on Munitions

Export controls have long been used to stem the proliferation of certain “dangerous” technologies; that is, technologies that would be dangerous were they to fall into the hands of adversaries. In the United States, the Arms Export Control Act of 1976 (22 USC 39) gives the president authority to control the export of defense articles and defense services. (Defense articles and services are those intended explicitly and primarily for military use and thus do not fall into the “dual-use” category.) The act is implemented by the International Traffic in Arms Regulations (ITAR), and the regulated defense articles and services are found on the United States Munitions List (USML).58 The Department of State administers the ITAR.

Information technologies on the munitions list appear mostly in Category XIII (auxiliary military equipment) and include military information security assurance systems and equipment, cryptographic devices, software, and components specifically designed, developed, modified, adapted, or configured for military applications (including command, control, and intelligence applications). Items 1, 2, and 4 of this list contain defensive cyber technologies that the United States would prefer to keep out of the hands of adversaries; doing so enables the United States to conduct more effective attacks or espionage in cyberspace. Item 3 on the list contains the only mention of technology related to cyber weapons: military cryptanalytic systems, equipment, assemblies, modules, integrated circuits, components, or software that would enhance an adversary’s signals intelligence capabilities.

However, Category XXI (miscellaneous items) of the USML is a catchall category for items not enumerated in the other categories: “Any article not specifically enumerated in the other categories of the U.S. Munitions List which has substantial military applicability and which has been specifically designed, developed, configured, adapted, or modified for military purposes. The decision on whether any article may be included in this category shall be made by the Director, Office of Defense Trade Controls Policy.” Authorities related to Category XXI are not likely to have been used to restrict the transfer of cyber weapons to other nations.59

The original purpose of ITAR and the USML was to regulate the sale of weapons designed and intended for military purposes. But IT is inherently dual-use, and thus a clear definition of when an IT artifact with some destructive or damaging capability is designed or intended for military purposes is elusive. Recognizing a military purpose for an IT artifact that is also used in a civilian context is problematic. Thus, judgments about the permissibility of a U.S. sale of cyber weapons to Nation X will, more likely than not, be based less on the capabilities of the artifacts involved and more on the perceived intentions of Nation X and the U.S. relationship with Nation X.


U.S. Policy Statements Concerning Cyber Weapons

In May 2011 the White House released its International Strategy for Cyberspace. This document was remarkable for its near-total silence regarding the acquisition or use of cyber weapons. The closest the document comes to this topic is its statement that, “consistent with the United Nations Charter, states have an inherent right to self-defense that may be triggered by certain aggressive acts in cyberspace.” It further states, “the United States will respond to hostile acts in cyberspace as we would to any other threat to our country. . . . We reserve the right to use all necessary means—diplomatic, informational, military, and economic—as appropriate and consistent with applicable international law, in order to defend our Nation, our allies, our partners, and our interests.” Nevertheless, the document emphasizes, “we will exhaust all options before military force whenever we can; will carefully weigh the costs and risks of action against the costs of inaction; and will act in a way that reflects our values and strengthens our legitimacy, seeking broad international support whenever possible”60

The document also speaks of norms of behavior that its authors believe could bring “predictability to state conduct, helping prevent the misunderstandings that could lead to conflict.” In addition to the right of self-defense as one such norm, the document argues, “States must identify and prosecute cybercriminals [presumably criminals using cyber weapons], to ensure laws and practices deny criminals safe havens, and cooperate with international criminal investigations in a timely manner”61 (Compliance with such norms during conflict is not—and cannot be—assured.)

Another policy statement was made by Michael Daniel, special assistant to the president and White House cybersecurity coordinator, in an April 2014 blog post. In this post, Daniel discusses the tension between revealing a vulnerability in a system so that it can be repaired (thus improving security for those using that system) and withholding knowledge of that vulnerability (thus enabling those with that knowledge, such as the U.S. government, to use that vulnerability as part of a cyber weapon). He further noted that when the U.S. government does obtain knowledge of such a vulnerability, the administration “takes seriously its commitment to an open and interoperable, secure and reliable Internet, and in the majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest. This has been and continues to be the case”62 To the extent that this is actually the case (there is significant public skepticism on this point), disclosure helps to inhibit the development of cyber weapons.

In April 2015, the DOD released The DOD Cyber Strategy.63 This document went further than any previous official statement in asserting the right of the United States to use cyber weapons. Three passages in the document warrant special attention.

  • “There may be times when the President or the Secretary of Defense may determine that it would be appropriate for the U.S. military to conduct cyber operations to disrupt an adversary’s military-related networks or infrastructure so that the U.S. military can protect U.S. interests in an area of operations. For example, the United States military might use cyber operations to terminate an ongoing conflict on U.S. terms, or to disrupt an adversary’s military systems to prevent the use of force against U.S. interests. United States Cyber Command (USCYBERCOM) may also be directed to conduct cyber operations, in coordination with other U.S. government agencies as appropriate, to deter or defeat strategic threats in other domains”64
  • “[A strategic goal of the DOD cyber strategy is to] build and maintain viable cyber options and plan to use those options to control conflict escalation and to shape the conflict environment at all stages. During heightened tensions or outright hostilities, DOD must be able to provide the President with a wide range of options for managing conflict escalation. If directed, DOD should be able to use cyber operations to disrupt an adversary’s command and control networks, military-related critical infrastructure, and weapons capabilities. . . . To ensure unity of effort, DOD will enable combatant commands to plan and synchronize cyber operations with kinetic operations across all domains of military operations”65
  • “DOD will work with agencies of the U.S. government as well as U.S. allies and partners to integrate cyber options into combatant command planning”66

The United States thus reserves the right to use cyber weapons to militarily protect U.S. interests in an area of operations, to deter or defeat strategic threats in other domains, and to integrate the use of cyber weapons into military planning efforts as an additional tool, albeit with special characteristics, in the U.S. arsenal.

Around the same time, in May 2015 in Seoul, Secretary of State John Kerry delivered a speech about the Internet. After reiterating the U.S. view that the basic rules of international law apply in cyberspace, that acts of aggression are not permissible, and that countries that are hurt by a cyberattack have a right to respond in accordance with the laws of armed conflict, he said that the United States also “support[s] a set of additional principles that, if observed, can contribute substantially to conflict prevention and stability in time of peace”67 He did not use the term norms in this context, but Kerry’s principles are in fact norms by any conventional definition of the term. These principles include the following (with the author’s principle-by-principle commentary indented and in brackets after each quotation from Kerry’s speech):

  • “No country should conduct or knowingly support online activity that intentionally damages or impedes the use of another country’s critical infrastructure.”

    [This principle can be construed as limiting the use of cyber weapons against another nation’s critical infrastructure. (Other analysts have suggested restrictions on targeting critical infrastructure as well.68) Such restrictions would be in some ways analogous to prohibitions on targeting hospitals and places of worship as provided by the Geneva Conventions. However, the Kerry speech does not define critical infrastructure, and in the United States, at least, a large fraction of the U.S. economy (well over 50 percent) is categorized as such. The principle is also silent about the extent of damage or impediment that it forbids. At the upper end (a large-scale cyberattack against critical infrastructure that results in nationwide collapse of that infrastructure), LOAC already rules out such an attack because it would be likely to result in a large amount of damage and harm to civilians and thus to fail the LOAC test of proportionality. At the lower end (e.g., a small cyberattack against a single electrical generator powering a military facility), it is hard to imagine that the principle is intended to prohibit such an action. Indeed, The DOD Cyber Strategy explicitly says that military-related critical infrastructure is not off-limits. Lastly, agreements to refrain from such targeting can remove an overt and openly declared cyber threat against these facilities, but all of the concerns about actual attacks on these facilities will continue to be unaddressed, a point that has two consequences: (1) The cyber defenses of critical infrastructure must be just as strong and robust as they would be in the absence of a targeting agreement. That is, the defense of critical infrastructure is not simplified or made easier in any way by such an agreement. (2) Political costs that might be expected to accrue to a violator of such an agreement could be mitigated to some degree by the inherent plausible deniability of cyber operations. Political costs accrue only if evidence is available that would convince third-party observers of responsibility in the face of outright denial by the perpetrator, and finding convincing evidence is particularly problematic in the cyber world.]

  • “No country should seek either to prevent emergency teams from responding to a cybersecurity incident, or allow its own teams to cause harm.”

    [This principle can be construed as limiting the use of cyber weapons against computer emergency response teams and is somewhat analogous to restrictions on targeting hospitals, medical personnel, or ambulances.]

  • “No country should conduct or support cyber-enabled theft of intellectual property, trade secrets, or other confidential business information for commercial gain.”

    [This principle is limited to prohibiting the obtaining of data for commercial gain but leaves unrestricted the obtaining of data for purposes related to national security. As such, this principle places no restrictions on the actual use of cyber weapons per se, though it does restrict the purposes to which the results of such use may be put.]

  • “Every country should mitigate malicious cyber activity emanating from its soil, and they should do so in a transparent, accountable and cooperative way.”

    [This principle is silent on the use of cyber weapons per se, though it seeks to assign state responsibility for suppression of such use.]

  • “Every country should do what it can to help states that are victimized by a cyberattack.”

    [This principle is silent on the use of cyber weapons.]

Lastly, the U.S. Congress expressed concerns about the proliferation of cyber weapons in Section 940 of the National Defense Authorization Act for Fiscal Year 2014, which required the president to “establish an interagency process to provide for the establishment of an integrated policy to control the proliferation of cyber weapons through unilateral and cooperative law enforcement activities, financial means, diplomatic engagement, and such other means as the President considers appropriate”69

According to the legislation, this policy was to have two purposes:

  • To identify the intelligence, law enforcement, and financial sanctions tools that can and should be used to suppress the trade in cyber tools and infrastructure that are or can be used for criminal, terrorist, or military activities while preserving the ability of governments and the private sector to use such tools for self-defense.
  • To establish a statement of principles to control the proliferation of cyber weapons, including principles for controlling the proliferation of cyber weapons that can lead to expanded cooperation and engagement with international partners.

As of April 2016, the administration had not produced the requested report. Further, whether the concerns of Congress were related to cyber weapons as mostly military artifacts or as dual-use artifacts is not clear. Supporting the former interpretation is the fact that the legislation passed as a part of the DOD authorization bill; supporting the latter interpretation is the idea that government and private sector entities had legitimate interests in cyber weapons for self-defense purposes.


Existing Transparency and Confidence-Building Measures

Confidence-building measures (CBMs) are measures that two or more nations agree to take to reduce the likelihood that a conflict might break out between or among them because of miscalculation or misperception or that a conflict might inadvertently escalate. As far as is known to this author, only one specific and currently extant set of CBMs relates to cyberspace. At talks during the G-8 meeting in June 2013, the United States and the Russian Federation agreed on CBMs for cyberspace to increase transparency and reduce the possibility that a misunderstood cyber incident could create instability or a crisis.70 The measures include:

  • The establishment of a direct secure voice communications line between the U.S. Cybersecurity Coordinator and the Russian Deputy Secretary of the Security Council, should there be a need to directly manage a crisis situation arising from a cybersecurity incident. It is planned that this direct line will be seamlessly integrated into the existing Direct Secure Communication System (“hotline”) that both governments already maintain.
  • The establishment of secure and reliable lines of communication for each nation to make formal inquiries of the other about cybersecurity incidents of national concern so as to reduce the possibility of misperception and escalation from cybersecurity incidents. The existing Nuclear Risk Reduction Center links established in 1987 between the two nations will house these cyber lines of communication.
  • The sharing of threat indicators between the United States Computer Emergency Readiness Team and its counterpart in Russia, including technical information about malicious software or other indicators reflecting malicious activity appearing to originate from each other’s territory. Sharing such information helps in the proactive mitigation of threats.
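
The third measure refers to the sharing of “threat indicators.” As a purely hypothetical illustration of the kind of technical information such an exchange might carry, the following sketch (in Python) assembles a minimal indicator record; the field names and values are illustrative assumptions only and do not describe any format actually used by either government.

    import json
    from datetime import datetime, timezone

    # A single, minimal "threat indicator" record. Field names and values are
    # hypothetical; no actual exchange format is implied.
    indicator = {
        "indicator_type": "file_hash",
        "value": "0123456789abcdef0123456789abcdef",  # placeholder hash of a suspected malware sample
        "observed_activity": "malicious software beaconing to a command-and-control server",
        "apparent_origin": "activity appearing to originate from the partner's territory",
        "first_seen": datetime(2013, 6, 17, tzinfo=timezone.utc).isoformat(),
        "handling": "government-to-government use only",
    }

    # Serializing the record to JSON is one simple way it could be transmitted
    # over an agreed secure channel.
    print(json.dumps(indicator, indent=2))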

Nothing in this arrangement speaks directly to governing either the acquisition or use of cyber weapons (or the underlying technology).


Current State of Governance of Cyber Weapons in Other Countries

Many states other than the United States criminalize unauthorized access to computers (see, for example, the UK’s Computer Misuse Act 1990).71 However, to the best of this author’s knowledge, no state has attempted to regulate the acquisition of cyber weapons within its borders.

A number of states also place export controls on technologies relevant to cyber weapons, thus limiting the acquisition of cyber weapons by certain nations. In some cases, these controls closely mirror the U.S. controls (for example, when the nations in question are parties to the Wassenaar Arrangement); in others they are somewhat different.

Outside the United States, the government of the Netherlands has been comparatively outspoken in its discussion of offensive operations in cyberspace. For example, in December 2009, some members of the Dutch Parliament stated that defensive capabilities alone were insufficient to engage in cyber warfare.72 In December 2011, an advisory committee to the Dutch government freely discussed the value of using cyber weapons (under the rubric of offensive cyber capabilities as a rough synonym for cyber weapons).73 For example, it endorsed the view that such weapons could be used to protect friendly systems and networks. In January 2012, the Dutch government stated that its Ministry of Defence was investing in measures to develop “new (including offensive) [cyber] capabilities”74 And in February 2015, the Dutch minister of defense released a six-page letter that revises the Dutch Defense Cyber Strategy of 2012.75 This letter identifies as priorities the “strengthening [of] the intelligence capability in the digital domain” and “strengthening the use of cyber in military missions,” and the discussion of both priorities includes explicit mention of the role that offensive cyber capabilities play in supporting them.

Also, the government of the United Kingdom said in September 2013 that it is “developing a full spectrum military cyber capability, including a strike capability”76 The UK government has not officially provided any details on this point, although Defence Secretary Philip Hammond said in an interview with the Daily Mail that “clinical ‘cyber strikes’ could disable enemy communications, nuclear and chemical weapons, planes, ships and other hardware”77


Other Proposals for Managing the Risks from Cyber Weapons

State Initiatives at the United Nations78

The vast majority of cybersecurity discussions at the UN have taken place under the auspices of the various committees of the General Assembly, primarily the First Committee (whose mandate is to focus on disarmament and international security). In 1998, the Russian Federation introduced a draft resolution to the First Committee entitled “Developments in the Field of Information and Telecommunications in the Context of International Security.”

The draft resolution invited interested states to submit their views on the topic, and the Russian submission stated that work should begin on the development of international principles that would “subsequently be incorporated into a multilateral international legal instrument” to regulate “information weapons.” In its letter initiating deliberations in the First Committee, the Russian Federation wrote of “the creation of information weapons and the threat of information wars, which we understand as actions taken by one country to damage the information resources and systems of another country”79

The Russian Federation was the sole sponsor of draft resolutions with the same name until 2006, at which time China, Armenia, Belarus, Kazakhstan, Kyrgyzstan, Myanmar, Tajikistan, and Uzbekistan joined Russia as cosponsors.

On a parallel but related track, China, Russia, Tajikistan, and Uzbekistan jointly presented in September 2011 to the UN General Assembly a proposal for an international code of conduct for information security.80 An updated version of this proposal was presented in January 2015. According to the updated version, the purpose of the code is to “ensure that the use of information and communications technologies and information and communications networks facilitates the comprehensive economic and social development and well-being of peoples, and does not run counter to the objective of ensuring international peace and security”81

The part of the proposed code most significant with respect to dual use is the obligation of signatories “not to use information and communications technologies and information and communications networks to carry out activities which run counter to the task of maintaining international peace and security”82


Proposals for Confidence-Building Measures and Norms of Behavior

One set of CBMs was proposed by the Organization for Security and Co-operation in Europe in 2013 with the purpose of reducing “the risks of misperception, escalation, and conflict that may stem from the use of ICTs”83

  • Measure 3 of the proposed set of CBMs calls for participating states to engage in “consultations in order to reduce the risks of misperception, and of possible emergence of political or military tension or conflict that may stem from the use of ICT.” This measure acknowledges the possibility that the use of cyber weapons could under some circumstances lead to tension or conflict. A hypothetical example of such a possibility, not mentioned in the report, is the use of cyber means by Nation A to gather intelligence information during a crisis involving A and Nation B. Such an action by A, taken with the best of intentions (e.g., to understand B’s intentions during the crisis), may well be interpreted by B as a prelude to attack.
  • Measure 8 calls on states to “establish measures to ensure rapid communication at policy levels of authority to permit concerns to be raised at the national security level.” This measure is essentially a mechanism for greater communication during a crisis.

In addition, the GGE that produced the 2013 report was charged with studying CBMs to address existing and potential threats to information security. But a careful examination of the report reveals only three statements that can even remotely be related to the governance of cyber weapons.84

  • Paragraph 22 says, “States should intensify cooperation against criminal or terrorist use of ICTs, harmonize legal approaches as appropriate and strengthen practical collaboration between respective law enforcement and prosecutorial agencies.” Criminal or terrorist use of ICTs would count as use of cyber weapons, but the paragraph presents no specifics about the kind(s) of use that should be criminalized. This paragraph seems to be urging states toward the Budapest Convention or a similar arrangement.
  • Paragraph 23 says, “States must meet their international obligations regarding internationally wrongful acts attributable to them. States must not use proxies to commit internationally wrongful acts. States should seek to ensure that their territories are not used by non-State actors for unlawful use of ICTs.” The language about proxies suggests the undesirability of a nation “outsourcing” the use of cyber weapons to a third party (a proxy), which could in principle be a nonstate actor. The third sentence asks states to assume the responsibility of suppressing the illegal use of cyber weapons from their territories. But since government use of cyber weapons is likely to be legal under the laws of the using nation, the third sentence is silent on such use.
  • Paragraph 26(f) says, “Enhanced mechanisms for law enforcement cooperation to reduce incidents that could otherwise be misinterpreted as hostile State actions would improve international security.” This language acknowledges that certain criminal acts involving the use of cyber weapons could be interpreted as hostile actions and suggests that more law enforcement cooperation between nations could help to reduce the likelihood of misinterpretation.

The GGE report of July 2015 offers recommendations regarding “voluntary, non-binding norms, rules, or principles for the responsible behaviour of States aimed at promoting an open, secure, stable, accessible and peaceful ICT environment”85 These recommendations include the following (with the author’s recommendation-by-recommendation commentary indented and in brackets):

  • Paragraph 13(f) indicates, “A State should not conduct or knowingly support ICT activity . . . that intentionally damages critical infrastructure or otherwise impairs the use and operation of critical infrastructure to provide services to the public”86

    [As with the first principle identified in Secretary of State Kerry’s May 2015 speech, the language does not provide a definition of critical infrastructure; it is also silent on the question of extent of damage that would be involved in a proscribed act, a silence that almost certainly reflects differing interpretations of what this norm would mean in practice.]

  • Paragraph 13(i) indicates, “States should take reasonable steps to ensure the integrity of the supply chain, so end users can have confidence in the security of ICT products”87

    [This recommendation speaks to the possibility that cyberattacks might originate in or depend on compromises in the supply chain of IT products or services, but it is otherwise silent on the matter.]

  • Paragraph 13(k) indicates, “States should not conduct or knowingly support activity to harm the information systems of another State’s authorized emergency response teams . . . [nor] use authorized emergency response teams to engage in malicious international activity”88

    [As with the second principle identified in Secretary of State Kerry’s May 2015 speech, this norm can be construed as limiting the use of cyber weapons against computer emergency response teams, and it is somewhat analogous to restrictions on targeting hospitals, medical personnel, or ambulances.]

Norms and CBM proposals from other analysts focusing on the use of cyber weapons and related issues include the following:

  • Measures to improve crisis management, such as hotlines that enable direct communications between states during a cyber crisis and the sharing of threat information.89 (The United States and Russia established a means for direct communication in 2013.) Greater communication among responsible authorities during a crisis may be helpful to the extent that the content of these communications is believable. But given the fact that the successful use of cyber weapons depends entirely on stealth and deception, participants in these communications may well have cause for skepticism about what the other side is saying. An additional concern is that, as with all CBMs, some degree of political will is necessary for their successful operation. A case in point is the military hotline between the United States and China. Intended to enable direct communication between senior military leaders on both sides during a crisis, it has not always been operational even during routine tests of the system. On several occasions in which the line was tested for operational capability, as well as in the wake of the 2001 EP-3 incident over Hainan Island,90 the Chinese military failed to respond.91
  • Bans on distributed denial of service (DDOS) attacks.92 DDOS attacks are among the most frequent attacks on cyber infrastructure, and while they continue, they can be crippling to the targeted organization. On the other hand, in contrast to permanently destructive actions, their effects are temporary and reversible—after they stop, the targeted organization is as good as new. But the most significant point regarding DDOS attacks is that powerful DDOS attacks can be launched by nonstate actors, and constraining the actions of such parties continues to be problematic. Even worse, if such attacks by nonstate actors do occur, suspicious targeted nation-states might still be inclined to blame other nations for violating the bans in question.

Many CBMs were originally developed to address issues arising in the context of kinetic armed conflict. As such, they presumed the existence of easily observed physical entities (soldiers, tracked and wheeled vehicles, artillery pieces, ships, airplanes, missiles). Movements of these entities from one geographic region to another had bearing on what they might or might not be able to do in a conflict. The number of physical entities was an important contributor to military power and capability.

Cyberspace is very different. For example, physical distance has little meaning in cyberspace. Forces as such do not move from one area to another. The key weaponry in cyber conflict is usually software—digitized information—and as such is intangible. Given these and other fundamental differences, such as the availability of many cyber weapons to nonstate entities, measures that do not account for the unique characteristics of cyber weapons are unlikely to be useful in reducing the likelihood of cyber conflict.


Policy Proposals from the Information Technology Industry

The IT industry is an important stakeholder in the emergence of behavioral norms around the use of cyber weapons. Of particular note is a proposal from Microsoft for six norms intended to guide the behavior of nation-states with respect to the use of cyber weapons and to reduce the risk arising from such use.

  • Norm 1: States should not target ICT companies to insert vulnerabilities (backdoors) or take actions that would otherwise undermine public trust in products and services.
  • Norm 2: States should have a clear principle-based policy for handling product and service vulnerabilities that reflects a strong mandate to report them to vendors rather than to stockpile, buy, sell, or exploit them.
  • Norm 3: States should exercise restraint in developing cyber weapons and should ensure that any which are developed are limited, precise, and not reusable.
  • Norm 4: States should commit to nonproliferation activities related to cyber weapons.
  • Norm 5: States should limit their engagement in cyber offensive operations to avoid creating a mass event.
  • Norm 6: States should assist private sector efforts to detect, contain, respond to, and recover from events in cyberspace.93

The intent behind these norms of behavior is to minimize state actions that compromise the trust of users in the products and services that private sector IT vendors offer. From the standpoint of these vendors, the economic rationale for the norms, particularly norms 1 and 2, is clear: actions that undermine public trust in IT products and services make the public more reluctant to use those products and services, an outcome with foreseeable negative economic consequences. Furthermore, to the extent that national security (e.g., of the United States) is tied to a thriving and vibrant IT industry, national security would benefit as well from widespread adoption of these norms.

Observing these norms of behavior would not prohibit the use of all cyber weapons—only those that are based on taking advantage of design or implementation vulnerabilities existing in deployed products and services. In principle, the use of other cyber weapons—such as those based on taking advantage of flawed configurations (e.g., a port left open when it should have been closed) or taking advantage of features designed into the product or service in an unexpected or novel way—would still be allowable. Denial of service attacks would also be permissible in principle, as long as the computers used to launch such attacks had not been compromised through a design or implementation vulnerability. But these norms would inhibit “zero-day” penetrations or compromises (which are regarded as being enormously powerful) and would unambiguously commit nations to help vendors improve the security of the products and services they offer.

A second step the private sector—specifically vendors of IT products and services—has begun to take is to reduce the number of vulnerabilities through “bug bounty” programs. A bug bounty program is an offer from a vendor to pay and otherwise recognize individuals who report vulnerabilities in the products or services of that vendor. A continuously updated list of such programs is maintained online.94 When these programs work, they provide vendors with information about previously unknown vulnerabilities that they can then repair before adversaries can take advantage of them. In addition, at least one firm has been established with a business model that connects finders of vulnerabilities with the appropriate bug bounty programs.95

By reducing the number of unknown vulnerabilities available to developers of cyber weapons, bug bounty programs are in principle a market-based mechanism that helps to inhibit the development of such weapons. However, the extent to which these programs have been successful in doing so is not yet known.


The Role of Scientists, Scientific Societies, and Community Norms

The major society in the United States associated with computer professionals is ACM (formerly the Association for Computing Machinery). ACM includes about one hundred thousand members, which is only a small fraction of the number of programmers and software developers in the United States.96 Membership is available to anyone for a nominal fee, and the major benefit of membership is access to an array of professional journals.

ACM promulgates a code of ethics for its members, of which two provisions are relevant. Section 1.2 requires members to “avoid harm to others” and prohibits “use of computing technology in ways that result in harm. . . . Harmful actions include intentional destruction or modification of files and programs leading to serious loss of resources.” Section 2.8 requires members to “access computing and communication resources only when authorized to do so” and states that “one must always have appropriate approval before using system resources, including communication ports, file space, other system peripherals, and computer time”97

The code is silent about the responsibility of members to refrain from creating or developing cyber weapons; it speaks only to use.

On the educational side, accreditation of university-level computer science programs is provided by the Computer Science Accreditation Board, a member of the Accreditation Board for Engineering and Technology (ABET), a nonprofit, nongovernmental organization that accredits over 3,400 programs at nearly seven hundred colleges and universities in twenty-eight countries.98 The requirements for accreditation in computing do not include any course or project work that relates to ethical or legal issues in computing, although a more general requirement (that is, one imposed by ABET for accreditation in all relevant disciplines) states that students should graduate with “an understanding of professional, ethical, legal, security and social issues and responsibilities”99

Nevertheless, for much of its history as a formal academic discipline, computer science has had a moderately strong norm against providing formal education intended to teach hacking skills.100 However, in recent years, this norm has started to break down as a number of educational institutions have begun to teach courses explicitly intended to nurture hacking skills.101 Debate on the topic continues. Teachers of such courses note that they include a substantial ethical treatment in their courses and that their graduates are eagerly sought by government agencies and private sector entities who need people with such skills for carrying out offensive operations against adversary computers or approved “white-hat” penetration testing against an organization’s cyber defenses. Detractors dislike the idea of sanctioned or approved cyber intrusions, arguing that all vulnerabilities discovered should be promptly reported to parties responsible for fixing them.

Academic researchers do undertake research on vulnerabilities in products and services with the intent of enhancing cybersecurity from the defensive perspective. The community expects such research to be published, and far from being regarded as an antisocial or hostile activity, work to uncover weaknesses is praised because only when such weaknesses become known can they be addressed.

One example is a series of annual workshops on offensive technologies (WOOT) that began in 2007. According to the current description, WOOT aims “to present a broad picture of offense and its contributions, bringing together researchers and practitioners in all areas of computer security. Offensive security has changed from a hobby to an industry. No longer an exercise for isolated enthusiasts, offensive security is today a large-scale operation managed by organized, capitalized actors”102 But the fundamental rationale for such work is that it informs work on defensive technologies.

Another example is the research community for cryptography. Algorithms for encryption (which scramble and descramble digitally represented data) are designed to be impervious to anything but a “brute-force” attack in which decryption can be reliably accomplished by trying all possible decryption keys. But deep mathematical research can reveal vulnerabilities in an encryption algorithm that allow shortcuts to be taken, thus reducing the effort needed to accomplish a decryption.
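
To make the idea of a brute-force attack concrete, the following sketch (in Python) searches the entire key space of a deliberately weak toy cipher. The cipher, the 16-bit key, and every name in the code are assumptions chosen only to keep the example short and runnable; they do not correspond to any real algorithm discussed here. The point is that each additional key bit doubles the work of an exhaustive search, which is why modern algorithms use keys of 128 bits or more and why analytic shortcuts that shrink the effective key space are treated as serious weaknesses.

    import itertools

    def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
        """Repeating-key XOR: a trivially weak cipher used only to illustrate key search."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

    def brute_force(ciphertext: bytes, known_prefix: bytes):
        """Try every possible 16-bit key; return the first key whose decryption
        begins with a known plaintext prefix (a stand-in for 'recognizable' output)."""
        for k1, k2 in itertools.product(range(256), repeat=2):
            key = bytes([k1, k2])
            # XOR is its own inverse, so "encrypting" the ciphertext decrypts it.
            if toy_encrypt(ciphertext, key).startswith(known_prefix):
                return key
        return None

    secret_key = b"\x5a\xc3"
    ciphertext = toy_encrypt(b"ATTACK AT DAWN", secret_key)
    # At most 2**16 = 65,536 trials suffice against a 16-bit key; the work doubles
    # with every added key bit, which is why 128- or 256-bit keys are considered
    # immune to brute force absent an analytic shortcut.
    print(brute_force(ciphertext, b"ATTACK"))  # recovers b'Z\xc3', i.e., the secret key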

Users of encryption algorithms count on such research to reveal weaknesses. As a specific illustration, in 1997 the National Institute of Standards and Technology (NIST) announced the initiation of a development effort for an Advanced Encryption Standard (AES) that would specify one or more encryption algorithms capable of protecting sensitive government information well into the twenty-first century.103 Some twenty months later, NIST announced a group of fifteen AES candidate algorithms, submitted by members of the cryptographic community from around the world. NIST used an extensive public process to obtain technical commentary on the candidate algorithms, and on the basis of these comments NIST decided on a specific algorithm for the AES.

In general, security researchers also investigate vulnerabilities in deployed products, and many such individuals adhere to an ethic of “responsible disclosure” in which they privately report a discovered vulnerability to the vendor and temporarily withhold public disclosure to give the vendor time to repair it. However, a growing number of such individuals are finding ways to profit from their discoveries, selling them to vendors (which often offer bounties for vulnerabilities) or on the black market to parties for eventual incorporation into cyber weapons.


Government Attitudes toward National and International Governance of Cyber Weapons

All governments are concerned about cyber weapons being used against them and their interests by other nations and nonstate parties—that is, they are concerned about cybersecurity as the term is traditionally defined (i.e., as defense against hostile cyber activities). Furthermore, they are all concerned about the criminal use of cyber weapons—that is, nonstate actors, whether associated with another state or within their own jurisdictional reach, using such weapons for criminal purposes such as fraud, theft, or blackmail.

At the same time, many governments—most often the governments of major world powers—see value in having the ability to use cyber weapons. For example, cyber weapons are useful for espionage operations against other nations, including both government and private sector entities. Cyber weapons also have offensive (and destructive) capabilities that many nations are reluctant to abandon because of their operational advantages. For example, cyber weapons favor the offense in the sense that it is very difficult to erect defenses against them, their use can often be conducted with plausible deniability, the effects of their use can vary broadly, they offer the possibility of asymmetric advantage against nations that are heavily dependent on IT, and they can be relatively inexpensive compared to traditional kinetic weapons. Nonetheless, almost all nations are silent on the question of their national capabilities for conducting offensive operations in cyberspace or even on the possibility that these capabilities might be useful to them. (Apart from the United States, the most significant exception to this general reluctance to discuss matters related to offensive cyber activities is the Dutch Ministry of Defence.)

However, just because governments see value in having access to cyber weapons does not mean they are necessarily sanguine about actually using those weapons. For example, the United States was concerned about the precedent-setting nature of using cyber weapons long before the Stuxnet operation was launched, and even after Stuxnet the United States had similar concerns about the use of such weapons against Libya and in that case chose not to use them.104 Recent news stories also indicate that Western cyber activities against the Iranian nuclear infrastructure drove an Iranian cyber retaliation against U.S. financial institutions.105

Governments do differ significantly in their willingness to outlaw the use of cyber weapons in a domestic legal context. In some cases, they have a robust legal regime that criminalizes the use of cyber weapons and a willingness to enforce that regime. Most of the signatories of the Budapest Convention are in this category. In other cases, an existing legal regime is accompanied by a reluctance to enforce those laws. Russia is widely regarded as an example.106


Commentary and Discussion

Cyber weapons can play a much different role in conflict than nuclear weapons. The use of a nuclear weapon would be a threshold event of enormous strategic and political significance. By contrast, cyber weapons are being used every day by a broad range of adversaries, ranging from individual misguided teenagers to major nation-states—and many of these uses go entirely unnoticed. Thus, the use of a cyber weapon per se does not cross any kind of threshold. Only if such use resulted in a sufficiently large impact would it do so.

Cyber weapons also have clear value in causing damage if that is the goal. From a policy-maker’s perspective, nuclear weapons are highly unusable (despite the fact that military planners can easily contemplate their use), and indeed none have been used as a part of hostilities since 1945.107 Biological weapons have been widely regarded as unpredictable in their effects and of limited value in combat for much of their history, as their use in a conflict might well result in blowback against friendly forces and populations.108

In the most general case for any weapon, acquisition requires both physical infrastructure (e.g., laboratories and appropriate physical devices or materials) and appropriate knowledge. For many cyber weapons, the physical infrastructure is not a limiting factor—the computers on which these applications can be developed are ubiquitous. In other cases (especially those in which the intended target is a specific physical system), adequate testing of adversarial applications may require the development and deployment of physical environments that mimic the intended target.109

IT and cyber weapons also have a different history than other types of weapons, such as nuclear or biological. The underlying IT is ubiquitous (i.e., more than just broadly available) around the world. Those who create cyber weapons take advantage of this technology base, but they are not—and never have been—primarily researchers. Almost from the advent of the first computers, hackers have been curious about how these systems work. What once made the threat from hackers manageable was the small number of computers in the world and the largely prevailing hacker ethos of refraining from damaging the systems they hacked.

The cyber weapons developed in this early era were derived not from science but from engineering and exploration. Even today, with only a few exceptions, cyber weapons are not created as the result of scientific research on IT and do not involve the discovery of new principles. For example, the penetration aspect of a cyber weapon may involve the discovery of a previously unknown weakness or vulnerability in an existing IT artifact such as a computer program. The payload aspect of a cyber weapon may involve the writing of a new computer program that manipulates the control system of a chemical plant. In neither case would one generally say that new principles were discovered.

Nor are scientific experiments involved in creating cyber weapons. To be sure, cyber weapons may be tested against various targets to understand how they might be made more effective, but such tests generally lack the features that characterize most scientific experiments (e.g., hypothesis testing). In fact, high school students have been developing techniques to penetrate computer systems for many years. (The author of this paper was one such high school student several decades ago.)

Governments fund a considerable amount of scientific research on IT, but since the connection between scientific research and cyber weapons is tenuous at best, research funding is mostly irrelevant to the creation of cyber weapons except insofar as such funding contributes to the overall foundations of IT.

As a result of this history, those interested in the governance of cyber weapons face the problem of creating new institutions and mechanisms in a domain that offers far fewer natural choke points. The inevitable result resembles what is seen today—a paucity of such mechanisms and institutions compared to those for biological and nuclear technologies and little prospect for establishing them on a wide scale.

One might imagine that under the rubric of “Internet governance,” discussions occur regarding preventing the use of cyber weapons. But in this sphere there is considerable dispute as to the appropriate participants and as to which subjects that rubric includes, and few if any proposals explicitly address the acquisition or use of cyber weapons.

The dispute in Internet governance regarding participation centers on whether Internet governance is a multilateral endeavor or a multistakeholder endeavor. Those who favor a multilateral approach emphasize the role of national governments as the primary actors in Internet governance. Those who favor a multistakeholder approach identify governments as actors coequal to other stakeholders, such as private sector companies, public interest/civil society groups, and other nongovernment organizations.

The dispute over the purview of Internet governance centers on how, if at all, Internet governance should extend beyond the traditional function of managing IP addresses and domain names. Advocates of extending the scope of “Internet governance” wish to include regulation of various behaviors related to use of the Internet. Certain nations—China and Russia, for example—are strong advocates for the respect of national sovereignty and the right of each nation to define for itself the important aspects of its own history, culture, and social system. From this flows the natural consequence that these governments are concerned not only about cyber weapons that might pass through their borders but also about news stories and other information they find objectionable. (These concerns are generally labeled “information security.”) And they insist on the authority to limit their populations’ access to such information.

Advocates of restricting the scope of Internet governance to its traditional function reject the proposition that nations should have the right to censor the information to which citizens have access. Concerns about “hostile information” that might be detrimental to state sovereignty conflict directly with the Western tradition of free speech and expression. Thus, a stalemate has existed along these lines for many years.

The debate is complicated by the scope of regulation contemplated by advocates of state-based information control. Specifically, centralized technical measures taken to restrict the use of cyber weapons transmitted through the Internet also conflict with the fundamental underlying design philosophy of the Internet. The Internet was designed in such a way that its only function is to do the best job possible in carrying bits from A to B without regard for the meaning of those bits. Whether those bits are the New York Times, a picture of my mother’s cats, malicious software embedded in a PDF file, a program for running statistical regressions, or pornography—the Internet is designed to carry it all.

Blocking specific content at the point of receipt—at B—is a relatively straightforward task, assuming that objectionable content can be specified clearly. But in this case, blocking at point B depends on B controlling the decision to block. Nations that wish to block certain content without B’s involvement inevitably resort to more centralized mechanisms located between A and B—that is, in the Internet infrastructure itself. Such changes to the underlying infrastructure would facilitate the fragmentation of the Internet into disparate and perhaps noninteroperable subnetworks.110
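
As a small illustration of why blocking at the point of receipt is comparatively simple, the following sketch (in Python) drops incoming payloads whose SHA-256 hash appears on a blocklist held by the receiving endpoint; the blocklist and payloads are hypothetical, and the sketch does not describe any deployed system. It works only because B itself holds the blocklist and controls the accept-or-drop decision; performing the same check on traffic in transit would require exactly the kind of centralized inspection infrastructure described above.

    import hashlib

    # Hashes of items the receiving endpoint ("B") has decided to reject.
    # The entry below is an illustrative placeholder, not real blocked content.
    BLOCKLIST = {hashlib.sha256(b"known-objectionable-payload").hexdigest()}

    def accept(payload: bytes) -> bool:
        """Return True if the endpoint should deliver the payload, False to drop it."""
        return hashlib.sha256(payload).hexdigest() not in BLOCKLIST

    print(accept(b"ordinary traffic"))              # True: delivered
    print(accept(b"known-objectionable-payload"))   # False: dropped at the endpoint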


Conclusion

The use of a cyber weapon can have negative effects on data or program integrity (in which data or computer operations are altered with respect to what users expect), on availability (in which services normally provided to users of the system or network are unavailable when expected), and on confidentiality (in which information that users expect to be kept secret is exposed to others).

The agents that might use cyber weapons span a broad range, including lone hackers acting as individuals; criminals acting on their own for profit; organized crime (e.g., drug cartels); transnational terrorists (perhaps acting with state sponsorship or tolerance); small nation-states; and major nation-states. Certain nonstate actors make a business out of using cyber weapons of various kinds against targets of their customers’ choosing.

Today, few nations regulate or have laws concerning the creation or acquisition of cyber weapons. The principal mitigating measures are export control regimes that seek to prevent “bad” nations from obtaining cyber weapons or related knowledge from the nations that have them and private sector bug bounty programs that reduce the supply of vulnerabilities that may be used to create cyber weapons.

International law is silent on the acquisition or use of cyber weapons, though to the extent that nations agree that the laws of armed conflict apply to cyberspace, some uses of cyber weapons are not permitted. Most nations of the world explicitly endorse the idea of a peaceful cyberspace, though no nation has publicly adopted a policy of refraining from using cyber weapons in its international relations for national security purposes. However, a variety of domestic laws in the nations of the world prohibit the criminal use of cyber weapons in various contexts.

Within the IT community, no broadly accepted and observed norms or codes of behavior proscribe, inhibit, or discourage the technical work needed to uncover vulnerabilities that can be used in cyber weapons. Indeed, those who do such work often receive accolades and financial rewards from IT product and service vendors when those vulnerabilities are revealed so that they may be fixed.

Four primary reasons explain why governance measures regarding cyber weapons have not been widely adopted. First, the underlying technology is ubiquitous, and it is too easy to create cyber weapons. Second, cyber weapons are too useful for governments to give up or even to curb. Third, the use of a cyber weapon does not necessarily cross dangerous thresholds—at the lower end, the effect of such use is merely an annoyance or a prank, if that, which means that it is difficult to build cultures to inhibit such behavior per se. At the higher end, the threats posed by the use of cyber weapons are potentially quite serious, even if they are not existential in the same way that the use of nuclear or biological weapons can be. Finally, so many paths lead toward the IT expertise necessary to build cyber weapons that it would be well-nigh impossible for any governance mechanism—or set of governance mechanisms—to intervene effectively to prevent the development of such expertise.

This brief survey of the prospects for governance and oversight for cyber weapons suggests to this author that the future is grim. Cyber weapons have definite utility for national governments (especially in the domain of cyber espionage), and that utility has been demonstrated repeatedly in the last two decades. Accepting negotiated or unilateral constraints on cyber weapons would reduce their utility. Organizations both public and private need to be able to test their systems against cyber weapons that might be used against them (i.e., so-called penetration testing). Add to these points the easy availability of cyber weapons and the lack of meaningful choke points at which governance measures might operate, and one can easily see why few governance measures for cyber weapons exist today.

ENDNOTES

1. See, for example, National Research Council, Biotechnology Research in an Age of Terrorism (Washington, D.C.: National Academies Press, 2004); and Seumas Miller and Michael J. Selgelid, “Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological Sciences,” Science and Engineering Ethics 13 (4) (2007): 523–580. The definition used in the life sciences contrasts with what might be called a “traditional” definition of dual-use technology; namely, technology that has both civilian and military applications. This traditional definition is used by the U.S. government (15 CFR 730.3) and the European Commission (see “Dual-Use Export Controls,” updated January 28, 2016).

2. The term weapon is not entirely satisfactory in this context, since in noncyber contexts a weapon is usually an artifact that is used to destroy or damage human beings or other objects. However, this author knows of no other word that is any better, and many that are worse.

3. Although not all uses of guns by police are societally beneficial, such uses are not the intent of supplying guns to police officers.

4. The qualifier “of putative interest” accounts for the possibility that the payload may find itself in a computer system that the attacker did not intend to attack; in this case, payload execution may have negative effects on the wrong system.

5. In some contexts, certain forms of espionage—for example, involving ships, submarines, or aircraft as the collection platforms—have been seen as military threats, so the mere fact that a given action might count as espionage (among other things) does not mean that the action in question must be regarded as “only” espionage. See, for example, Roger D. Scott, “Territorially Intrusive Intelligence Collection and International Law,” Air Force Law Review 46 (1999): 217–226.

6. William J. Broad, John Markoff, and David E. Sanger, “Israeli Test on Worm Called Crucial in Iran Nuclear Delay,” New York Times, January 15, 2011.

7. At times during the Cold War, both the United States and the Soviet Union advocated peaceful applications of nuclear explosives. Although the idea of such applications has largely fallen out of favor, some nations apparently continue to advance that position. Still, the taboo against nuclear explosions—for whatever purpose—is much stronger and globally widespread than any existing norms of behavior regarding the use of cyber weapons.

8. The allegedly benign nature of defensive applications of IT warrants one important point of clarification. A defensive application will protect a computer operated by a hostile nation just as easily as one operated by or in a friendly nation, yet friendly interests are served if the former computer remains vulnerable. Export controls of various kinds therefore seek to impede the transfer of certain defensive applications to hostile nations.

9. National Research Council, Computing the Future: A Broader Agenda for Computer Science and Engineering (Washington, D.C.: National Academies Press, 1992). See also Computer Science: Reflections on the Field, Reflections from the Field (Washington, D.C.: National Academies Press, 2004); and William J. Mitchell, Alan S. Inouye, and Marjory S. Blumenthal, eds., Beyond Productivity: Information Technology, Innovation, and Creativity (Washington, D.C.: National Academies Press, 2003).

10. Frederick Brooks, The Mythical Man-Month (Reading, Mass.: Addison-Wesley, 1975).

11. A more complete list of notable international cyber events can be found in Catherine A. Theohary and Anne I. Harrington, Cyber Operations in DOD Policy and Plans: Issues for Congress, CRS Report R43848 (Washington, D.C.: Congressional Research Service, January 5, 2015).

12. For a primer on Stuxnet, see “Cyberattacks on Iran—Stuxnet and Flame,” New York Times, n.d.

13. Nicole Perlroth, “In Cyberattack on Saudi Firm, U.S. Sees Iran Firing Back,” New York Times, October 23, 2012.

14. Nicole Perlroth, “American Banks Undamaged by Cyberattacks,” Bits (blog), New York Times, September 26, 2012.

15. David Alexander, “Theft of F-35 Design Data Is Helping U.S. Adversaries—Pentagon,” Reuters, June 19, 2013.

16. Nicole Perlroth, “Heat System Called Door to Target for Hackers,” New York Times, February 5, 2014.

17. Department of Justice, Office of Public Affairs, “U.S. Charges Five Chinese Military Hackers for Cyber Espionage against U.S. Corporations and a Labor Organization for Commercial Advantage,” May 19, 2014.

18. Julie Creswell and Nicole Perlroth, “Ex-Employees Say Home Depot Left Data Vulnerable,” New York Times, September 19, 2014.

19. Lori Grisham, “Timeline: North Korea and the Sony Pictures Hack,” USA Today, January 5, 2015.

20. Kim Zetter, “A Cyberattack Has Caused Confirmed Physical Damage for the Second Time Ever,” Wired, January 8, 2015. For the original German government report, see Bundesamt für Sicherheit in der Informationstechnik, Die Lage der IT-Sicherheit in Deutschland 2014 [The State of IT Security in Germany 2014] (Bonn, 2015).

21. Reed Abelson and Matthew Goldstein, “Millions of Anthem Customers Targeted in Cyberattack,” New York Times, February 5, 2015.

22. Jaikumar Vijayan, “Premera Hack: What Criminals Can Do with Your Healthcare Data,” Christian Science Monitor, March 20, 2015.

23. Mike Levine and Jack Date, “22 Million Affected by OPM Hack, Officials Say,” ABC News, July 9, 2015. On fingerprints, see Andrea Peterson, “OPM Says 5.6 Million Fingerprints Stolen in Cyberattack, Five Times as Many as Previously Thought,” Washington Post, September 23, 2015.

24. “The Global State of Information Security Survey 2015—Managing Cyber Risks in an Interconnected World,” PricewaterhouseCoopers, n.d.

25. Jason Koebler, “U.S. Nukes Face Up to 10 Million Cyber Attacks Daily,” U.S. News and World Report, March 20, 2012.

26. Mark Bevir, Governance: A Very Short Introduction (Oxford: Oxford University Press, 2013).

27. See Harold Hongju Koh, “International Law in Cyberspace” (presentation at the USCYBERCOM Inter-Agency Legal Conference, Ft. Meade, Md., September 18, 2012). Koh was then the Legal Adviser of the Department of State.

28. In this context, countermeasures is a legal term that contrasts with its more technical usage. For example, in the case of biological weapons, the term countermeasures refers to defenses against biological warfare agents. For cyber weapons, technical countermeasures might refer to the use of scanners to detect malicious software or active defense measures using cyber weapons to inflict damage or pain against a cyber intruder.

29. Michael Schmitt, “‘Below the Threshold’ Cyber Operations: The Countermeasures Response Option and International Law,” Virginia Journal of International Law 54 (3) (2014): 697–732.

30. Michael N. Schmitt et al., eds., Tallinn Manual on the International Law Applicable to Cyber Warfare (Cambridge: Cambridge University Press, 2013); hereafter “Tallinn manual.” “Black-letter” rules are legal rules that are so well settled they are no longer subject to serious dispute in the legal community.

31. Ibid., 42.

32. Ibid., 124.

33. Ibid., 11.

34. Ibid., 36.

35. “Statement by the Chair of the United Nations Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security, H.E. Ambassador Deborah Stokes of Australia,” October 25, 2013.

36. “Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security,” A/68/98, United Nations General Assembly, June 24, 2013.

37. “Developments in the Field of Information and Telecommunications in the Context of International Security,” A/RES/68/243, United Nations General Assembly, January 9, 2014.

38. “Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security,” A/70/174, United Nations General Assembly, July 22, 2015.

39. Ibid.

40. White House, Office of the Press Secretary, “FACT SHEET: President Xi Jinping’s State Visit to the United States,” September 25, 2015.

41. Ibid.

42. Even President Obama said immediately after the summit, “The question now is, ‘Are words followed by actions?’ . . . And we will be watching carefully to make an assessment as to whether progress has been made in this area.” See Julie Hirschfeld Davis and David E. Sanger, “Obama and Xi Jinping of China Agree to Steps on Cybertheft,” New York Times, September 25, 2015.

43. For the text of the summit communiqué, see . The G20 members are Argentina, Australia, Brazil, Canada, China, France, Germany, India, Indonesia, Italy, Japan, Republic of Korea, Mexico, Russia, Saudi Arabia, South Africa, Turkey, the United Kingdom, the United States, and the European Union.

44. “Chart of Signatures and Ratifications of Treaty 185: Convention on Cybercrime,” Council of Europe, updated February 11, 2016.

45. The discussion of the Budapest Convention is based largely on Michael Vatis, “The Council of Europe Convention on Cybercrime,” in Proceedings of a Workshop on Deterring Cyberattacks: Informing Strategies and Developing Options for U.S. Policy (Washington, D.C.: National Academies Press, 2010).

46. The text of the convention refers to access and interception “without right,” a term that means “without proper legal authorization.”

47. “Agreement between the Governments of the Member States of the Shanghai Cooperation Organisation on Cooperation in the Field of International Information Security, Yekaterinburg, 16 June 2009,” in S. A. Komov, ed., International Information Security: The Diplomacy of Peace—Compilation of Publications and Documents (Moscow, 2009), 202–213. This is an unofficial translation; the authentic languages of the Agreement are Russian and Chinese.

48. Andrew Roth, “Russia and China Sign Cooperation Pacts,” New York Times, May 8, 2015. For the original, Russian-language version of the Russian-Chinese agreement, see . For an unofficial English-language translation, see James Lewis, “Sino-Russian Cybersecurity Agreement 2015,” CSIS Strategic Technologies Program, May 11, 2015.

49. Elaine Korzak, “The Next Level for Russia-China Cyberspace Cooperation?” Net Politics (blog), Council on Foreign Relations, August 20, 2015.

50. W. Hays Parks, “The International Law of Intelligence Collection,” in National Security Law, ed. John Norton Moore and Robert Turner (Durham, NC: Carolina Academic Press, 1990), 433–434.

51. “About Us,” n.d.

52. .

53. The Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies: List of Dual-Use Goods and Technologies and Munitions List (Vienna: Wassenaar Arrangement, 2014).

54. Ibid., 73, 212.

55. Ibid., 81.

56. For more discussion of the application of export controls to different components of a cyber weapon, see Trey Herr and Paul Rosenzweig, “Cyber Weapons and Export Control: Incorporating Dual Use with the PrEP Model,” Journal of National Security Law and Policy 8 (2) (2015).

57. Jennifer Granick and Mailyn Fidler, “Changes to Export Control Arrangement Apply to Computer Exploits and More,” Just Security, January 15, 2014.

58. For the USML, see 22 CFR 121.

59. Trey Herr, George Washington University, personal communication, September 2015.

60. White House, International Strategy for Cyberspace: Prosperity, Security, and Openness in a Networked World (Washington, D.C.: White House, May 2011), 10, 14.

61. Ibid., 9–10.

62. Michael Daniel, “Heartbleed: Understanding When We Disclose Cyber Vulnerabilities,” White House Blog, April 28, 2014.

63. Department of Defense, The DOD Cyber Strategy (Washington, D.C.: DOD, April 2015).

64. Ibid., 5.

65. Ibid., 14.

66. Ibid., 26.

67. “Text of John Kerry’s Remarks in Seoul on Open and Secure Internet,” Voice of America, May 18, 2015.

68. “Confidence-Building Measures in Cyberspace: A Multistakeholder Approach for Stability and Security,” Atlantic Council, November 5, 2014; and John Steinbruner, “Prospects for Global Restraints on Cyberattack,” Arms Control Today 41 (December 2011).

69. .

70. White House, Office of the Press Secretary, “FACT SHEET: U.S.-Russian Cooperation on Information and Communications Technology Security,” June 17, 2013.

71. “Computer Misuse Act 1990,” n.d.

72. “Vaststelling van de begrotingsstaten van het Ministerie van Defensie (X) voor het jaar 2010” [Adoption of the budget of the Ministry of Defence (X) for the year 2010], Vergaderjaar 2009–2010, Kamerstuk 32123-X nr. 66, December 10, 2009.

73. Advisory Council on International Affairs (AIV) and Advisory Committee on Issues of Public International Law (CAVV), Cyber Warfare, AIV no. 77 / CAVV no. 22 (The Hague: AIV and CAVV, December 2011).

74. “Government Response to the AIV/CAVV Report on Cyber Warfare,” Rijksoverheid, April 26, 2012.

75. “Defensie Cyber Strategie” [Defence Cyber Strategy], Tweede Kamer der Staten-Generaal, Vergaderjaar 2014–2015, 33 321, nr. 5. For an unofficial translation of this letter, see “Dutch Defense Cyber Strategy—Revised February 2015,” Matthijs R. Koot’s Notebook (blog), February 23, 2015.

76. Ministry of Defence, Joint Forces Command, and Philip Hammond, “New Cyber Reserve Unit Created,” GOV.UK, September 29, 2013.

77. Simon Walters, “Hammond’s £500m New Cyber Army,” Daily Mail, September 28, 2013.

78. The discussion of initiatives at the UN borrows freely from Elaine Korzak, “Computer Network Attacks and International Law” (PhD dissertation, King’s College London, 2014).

79. “Letter dated 23 September 1998 from the Permanent Representative of the Russian Federation to the United Nations Addressed to the Secretary-General,” A/C.1/53/3, United Nations General Assembly, September 30, 1998.

80. .

81. .

82. Ibid.

83. .

84. “Statement by the Chair of the United Nations Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security, H.E. Ambassador Deborah Stokes of Australia,” October 25, 2013.

85. “Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security,” A/70/174, 7.

86. Ibid., 8.

87. Ibid.

88. Ibid.

89. Katharina Ziolkowski, Confidence Building Measures for Cyberspace—Legal Implications (Tallinn: NATO Cooperative Cyber Defence Centre of Excellence, 2013), https://ccdcoe.org/publications/CBMs.pdf; “Cyber Security,” North Atlantic Treaty Organization, updated February 10, 2016, http://www.nato.int/cps/en/natohq/topics_78170.htm; ABIresearch and International Telecommunication Union, Global Cybersecurity Index: Conceptual Framework (Geneva: ITU, n.d.), https://www.itu.int/en/ITU-D/Cybersecurity/Documents/GCI_Conceptual_Framework.pdf; and “Cybersecurity Information Exchange Techniques (CYBEX),” International Telecommunication Union, n.d., http://www.itu.int/en/ITU-T/studygroups/2013-2016/17/Pages/cybex.aspx.

90. In the EP-3 incident over Hainan Island in the South China Sea, a U.S. EP-3 reconnaissance plane collided with a Chinese F-8 fighter. The Chinese pilot died in the incident, and the damaged EP-3 made an emergency landing on Hainan Island. Given the involvement of both U.S. and Chinese military forces in this incident, U.S. military leaders tried to contact their counterparts in China to resolve the situation without undue escalation. Congressional Research Service, China-U.S. Aircraft Collision Incident of April 2001: Assessments and Policy Implications, October 10, 2001.

91. Michael D. Swaine, Tuosheng Zhang, and Danielle Cohen, eds., Managing Sino-American Crises: Case Studies and Analysis (Baltimore: Johns Hopkins University Press, 2006); James A. Lewis, CSIS, personal communication, September 2014; and Adam Segal, Council on Foreign Relations, personal communication, July 2015.

92. Robert K. Knake, Internet Governance in an Age of Cyber Insecurity, Council Special Report no. 56 (Washington, D.C.: Council on Foreign Relations, September 2010). A DDOS (distributed denial-of-service) attack is one in which many different compromised computers send bogus service requests to a single target, which is overwhelmed trying to service these (fake) requests and is thus unable to provide service for legitimate users of the targeted system.

93. Microsoft Corporation, International Cybersecurity Norms: Reducing Conflict in an Internet-Dependent World (Redmond, WA: Microsoft Corporation, December 2014), 11–13.

94. “The Bug Bounty List,” Bugcrowd, n.d.

95. Nicole Perlroth, “HackerOne Connects Hackers with Companies, and Hopes for a Win-Win,” New York Times, June 7, 2015.

96. Abel Avram, “IDC Study: How Many Software Developers Are Out There?” InfoQ, January 31, 2014, http://www.infoq.com/news/2014/01/IDC-software-developers; and Bureau of Labor Statistics, “Software Developers,” Occupational Outlook Handbook, n.d.

97. “ACM Code of Ethics and Professional Conduct,” n.d.

98. “About ABET,” n.d., http://www.abet.org/about-abet/.

99. “Criteria for Accrediting Computing Programs, 2016–2017,” n.d.

100. See, for example, Eugene H. Spafford, “Are Computer Hacker Break-Ins Ethical?” Journal of Systems and Software 17 (1) (1992): 41–47.

101. Ellen Nakashima and Ashkan Soltani, “The Ethics of Hacking 101,” Washington Post, October 7, 2014; Jackie Kemp, “The Crack Team,” Guardian, October 20, 2008; and Queena Kim, “Good Hack, Bad Hack: A Cybersecurity Camp Teaches the Ethics of Hacking,” Marketplace, July 2, 2013.

102. “9th USENIX Workshop on Offensive Technologies: WOOT ’15,” USENIX, n.d.

103. National Institute of Standards and Technology, “Advanced Encryption Standard (AES) Development Effort,” updated February 28, 2001.

104. William A. Owens, Kenneth W. Dam, and Herbert S. Lin, eds., Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities (Washington, D.C.: National Academies Press, 2009); and Eric Schmitt and Thom Shanker, “U.S. Debated Cyberwarfare against Libya,” New York Times, October 17, 2011.

105. See David Sanger, “Document Reveals Growth of Cyberwarfare between the U.S. and Iran,” New York Times, February 22, 2015.

106. See, for example, John Blau, “Russia—A Happy Haven for Hackers,” Computer Weekly, May 2004.

107. On the other hand, Paul Bracken points out that exploding a nuclear weapon is not the only way to “use” it. Moving nuclear weapons from one place to another or changing the alert status of nuclear delivery vehicles are actions that do not involve exploding nuclear weapons but that still may send politically significant messages. See Paul Bracken, The Second Nuclear Age (New York: St. Martin’s Griffin Press, 2013).

108. Even real and legitimate concerns about blowback in cyber are not analogous. For biological weapons, blowback is tactical—the same organisms that cause illness in an adversary can cause illness in friendly populations. For cyber weapons, blowback is strategic—launching a cyberattack sets precedents and helps to establish cyberattack as a legitimate means of conflict, but rarely would a cyber weapon be turned back on its user without significant modification. On the use of biological weapons more generally, see Jozef Goldblat, “The Biological Weapons Convention—An Overview,” International Review of the Red Cross, no. 318 (June 30, 1997); and Jonathan B. Tucker and Erin R. Mahan, President Nixon’s Decision to Renounce the U.S. Offensive Biological Weapons Program (Washington, D.C.: National Defense University Press, October 2009).

109. For example, the development of Stuxnet required the construction of a test facility containing centrifuges identical to the ones Stuxnet was intended to attack. See Broad, Markoff, and Sanger, “Israeli Test on Worm Called Crucial in Iran Nuclear Delay.”

110. More discussion of this point can be found in David Clark, Tom Berson, and Herbert Lin, eds., At the Nexus of Cybersecurity and Public Policy: Some Basic Concepts and Issues (Washington, D.C.: National Academies Press, 2014).