
“Good Enough” Governance

Policy Options

Authors
Karl W. Eikenberry and Stephen D. Krasner
Project
Civil Wars, Violence, and International Responses

Standard Treatment

The “standard treatment”—the combination of mediation and UN or regional peacekeeping forces, together with limited foreign aid—is a viable, often successful method for responding to civil wars and internal violence under the right conditions. Where combatants adhere to the existing international system and the major powers are largely in agreement, the use of UN peacekeepers and mediation has quelled violence and brought stability to post–Cold War cases of civil violence. This is especially true in instances in which a great or regional power was able to commit resources, as George Downs and Stephen Stedman, as well as Gowan and Stedman, have found.49 Cases where the UN treatment worked well include El Salvador, Guatemala, Nicaragua, Mozambique, Namibia, and Cambodia.50 However, UN experts and practitioners have noted that there are significant strains on resources and capabilities, as well as an underlying concern about peacekeepers’ ability to leave after intervening without causing internal collapse or a return to chaos. Further, the Academy’s Civil Wars project outreach to the UN in New York, as well as to policy-makers and scholars in Washington, Geneva, and elsewhere, revealed that recent shifts in global power are eroding the effectiveness of these international responses. Zero-sum competitions for regional influence complicate efforts to achieve major power unity of action. In certain conflicts that fit the template, the “standard treatment” regime should be applied, though opportunities for the application of this approach may be more limited going forward.


Endnotes

  • 49George Downs and Stephen John Stedman, “Evaluation Issues in Peace Implementation,” in Ending Civil Wars: The Implementation of Peace Agreements, ed. Stephen John Stedman, Donald Rothchild, and Elizabeth Cousens (Boulder, Colo.: Lynne Rienner, 2002), 43–69.
  • 50Gowan and Stedman, “The International Regime for Treating Civil War,” 178.

Occupation (Iraq)

The United States invaded Iraq in March 2003, with President George W. Bush publicly stating that the Iraqi regime must be “disarmed.” Bush cited not only the terrorist attacks of September 11, 2001, but also the potential for future terrorist attacks as reasons for the invasion.51 The United States promised to bring democracy and stability. The price tag has topped $3 trillion, but returns on the investment—such as peace and stability in the region and planting the seeds of democracy—have been disappointing.52 Having seen how the challenges of an overly ambitious approach were magnified in the Iraqi conflict, we caution against this option in most circumstances.


Endnotes

  • 51“The danger is clear: using chemical, biological or, one day, nuclear weapons, obtained with the help of Iraq, the terrorists could fulfill their stated ambitions and kill thousands or hundreds of thousands of innocent people in our country, or any other.” “,” The New York Times, March 18, 2003.
  • 52Joseph E. Stiglitz and Linda J. Bilmes, “,” The Washington Post, September 5, 2010; and Daniel Trotta, “,” Reuters, March 14, 2013.

Limited Security and Development Assistance (post-surge Afghanistan)

The United States entered Afghanistan in 2001 with similarly ambitious goals. After nineteen years of war in the country, the United States faces difficult choices about how to disengage. Osama bin Laden was killed, but the Taliban have achieved a stalemate on the battlefield. Opium production is strong, illustrating how transnational criminality can find a foothold in weak states racked by conflict, and most Afghans still live in poverty.53 In spite of massive investments in security assistance, the Afghan National Army and Police cannot secure and stabilize their country, an example of Biddle’s argument that small-footprint security-force assistance is rarely effective and can still be costly.54 In countries where building a stronger security force is essential, the United States must sequence actions carefully and accept that, to be effective, the American footprint will be large.55 It must also recognize that effective and accountable security forces can rarely be built on a foundation of political disunity and corruption. Entrenched elites might permit foreign powers to establish and exercise control over a small number of mission-focused counterterrorist forces; they will rarely, however, embrace the formation of larger professional army and police units that can potentially threaten their political influence.


Endnotes

  • 53Sarah Almukhtar and Rod Nordland, “,” The New York Times, December 9, 2019.
  • 54Biddle, “Building Security Forces and Stabilizing Nations,” 126.
  • 55Biddle “finds that effective SFA is much harder to implement in practice than often assumed, and less viable as a substitute for large unilateral troop deployments. He makes a strong case that for the United States, in particular, the achievable upper bound is usually modest, and even this is possible only if policy is intrusive and conditional, which it rarely is.” Karl W. Eikenberry and Stephen D. Krasner, “Introduction,” Dædalus 146 (4) (Fall 2017): 14.

Proxy Wars

With the shift in the distribution of world power, as well as in foreign policy preferences, outsourcing conflict to client states and armed groups is becoming more prevalent but is fraught with risk. Instead of choosing to intervene directly, for example, the United States may provide military assistance to countries willing to intervene in areas of unrest. In the MENA region, this often means supporting countries that oppose U.S. adversaries. We have seen this in Syria, with the complex web of fighting and power-sharing among the United States, Russia, Iran, and Turkey; and in Yemen, with Saudi Arabia (backed by the United States) and Iran.

Yemen is one of several hotspots in the Middle East. The Yemeni civil war, begun in 2014, has developed into a complex proxy war among Saudi Arabia, Iran, and the United States. Criticism has focused on Saudi airstrikes, which have left thousands of civilians dead. Saudi Arabia views the Houthi militia as a proxy for its regional rival, Iran. Nearly one million Yemenis have been displaced as a result of the conflict.56 Some view the Saudi-led campaign as a cost-effective way for the United States to deal with the threats posed by international terrorist groups active in Yemen and growing Iranian influence in the Middle East. Yet such an approach carries considerable risks, including entanglement in a wider war because of unanticipated actions of the belligerents, unforeseen instability in the partner state or coalition, and culpability for violation of the laws of armed conflict by one’s proxy forces.


Endnotes

  • 56Declan Walsh, “,” The New York Times, October 26, 2018.

Ignore

Alternatively, the United States can choose to ignore conflicts in hopes that one side will reach a clear military victory or that a third party will step in to broker the peace process. For years the United States followed this option in the Democratic Republic of the Congo (DRC), a country two-thirds the size of Western Europe with a population of over 80 million. Moral considerations aside, such neglect has not come at any great cost to U.S. national interests. This could change, however, should a lethal pandemic spread across the region, international terrorists establish safe haven there, or the United States determine the DRC to be important in future geopolitical competition.


Prevention (Most Cost-Effective on Paper, Most Difficult in Practice)

The costs of prevention, when successful, are modest compared to robust interventions after the outbreak of violence. However, as Charles Call and Susanna Campbell argue, preventive measures—diplomacy, targeted and limited development, and possibly security assistance—are difficult to sell to constituents when no core security interest, such as preventing nuclear war, is at stake.57 An “uptick in violence in a faraway, non-strategic country” does not provide a case for action that is likely to be compelling at home in the United States, even if the conflict is at a stage when prevention could be effective.58 Further, preventive decisions are hard to make, and, if effective, the outcome draws little to no attention. Successful prevention does not create an obvious case for support for more prevention going forward.

The challenge is demonstrating that any efforts and resources committed will have a positive outcome. The urgent invariably trumps the important. While many political and bureaucratic hurdles make conflict prevention a difficult path to pursue, Call and Campbell argue that “calls for more action and better organization aimed at preventing violent conflict may embolden a few policy-makers and bureaucrats to take on the risk of prevention. The more policy-makers who act preventively, the more credible the commitment that they will act in the future.”59 While policy-makers (and the general public) will see the “failures” of prevention when conflicts manifest, this does not mean the few successful attempts to prevent conflict are not worthwhile.60


Endnotes

  • 57Charles Call and Susanna Campbell, “Is Prevention the Answer?” Dædalus 147 (1) (Winter 2018): 64.
  • 58Ibid.
  • 59Ibid., 74–75.
  • 60In response to a request from the U.S. Congress in 2017, the United States Institute of Peace developed a Task Force for Preventing Extremism in Fragile States. Its “Final Report” offers three main recommendations. First, it recommends developing a shared framework for strategic prevention that also ensures a shared understanding about why extremism spreads; second, it highlights the importance of better coordinating American efforts overseas with a longer-term focus; third, it emphasizes the importance of pooling international resources and building partnerships with leaders, civil society, and private-sector actors to successfully implement a strategy of prevention. United States Institute of Peace, (Washington, D.C.: United States Institute of Peace, 2019).

U.S. Policy Precedents

Americans’ conceptions of the U.S. role in the world both help and hinder U.S. capacity to respond to the threats posed by civil wars, fragile states, and weak governance. Certain long-standing tensions in American policy remain in place even today. In the 1950s, Louis Hartz argued in The Liberal Tradition in America that the United States has always vacillated between two impulses: first, to remake the world in the image of the United States; second, to retreat into isolationism and to act as a city on a hill, providing a model that others just might emulate.61 The first vision is deeply utopian. It assumes that consolidated democracy is an ambition of all humankind. The second impulse is dystopian. It assumes that the rest of the world is fundamentally sinful and that even if the United States acts as a beacon of freedom, most countries will not follow its path.

For most of America’s history, this dualistic vision has served the country moderately well. During the nineteenth century, the United States did not have the material resources to engage in an activist foreign policy. Moreover, two oceans insulated the country. The Monroe Doctrine had little meaning beyond rhetoric until the end of the nineteenth century, when the U.S. Navy could challenge its European rivals. Before the twentieth century, the United States acted mainly as a city on a hill. Given the limitations of its material power, it had few other options.

During the first part of the twentieth century, America’s vacillation between utopian and dystopian visions did not serve the country well. President Woodrow Wilson hoped to make the world safe for democracy when he brought the United States into World War I, even though many countries were not democratic, but he failed to get Senate approval for American participation in the League of Nations. In the 1920s and 1930s, isolationism prevailed in the United States. President Franklin Roosevelt recognized the dangers posed by Nazi Germany, an autocratic country that could dominate Europe, but he could not bring the United States into the war until the Japanese attack on Pearl Harbor in 1941, by which time Germany had conquered most of Europe and the United Kingdom stood alone.

During the second part of the twentieth century, after defeating the Axis powers together with its allies, the United States embraced its role as a global leader. The utopian vision prevailed. The UN was established in New York City, with the United States as one of the five permanent members of the Security Council. The International Monetary Fund and the World Bank were set up in Washington, D.C., just a few blocks from the White House, and the United States has selected every World Bank president since the organization was founded. The General Agreement on Tariffs and Trade and later the World Trade Organization were established with American support. Competitive economic policies, which were seen as one of the causes of World War II, were avoided. Hypernationalism was eschewed, at least by U.S. and European political elites. Instead, the United States encouraged and facilitated European unity. Not all was smooth sailing in the post–World War II period, however. Notable failures included a costly stalemate in Korea and defeat in Vietnam, but, on the whole, the United States did well—arguably, magnificently well—in pursuing its utopian vision. All of the states in Western Europe became consolidated democracies, even Germany. Japan also became a consolidated democracy. South Korea went from being poor and autocratic to being rich and democratic. Even the Soviet Union ultimately collapsed and, along with it, the authority of communist ideology.

When Francis Fukuyama wrote in 1989 about the end of history and the absence of any serious rivals to democracy and the free market, he recognized an important development in human history. America’s utopian vision—that any country could become a consolidated democracy—appeared close to realization; it reached its apotheosis in the first term of George W. Bush’s presidency before gradually losing favor during Bush’s second term.

September 11, 2001, was a shock. Thousands of people died. An administration that had expected to focus on domestic issues suddenly found itself responding to international threats to the security of the United States. Absent serious threats from potential state rivals in the aftermath of the Cold War, many Americans came to believe that international terrorism posed the gravest risks to their nation’s long-term security. The Bush administration concluded that the root cause of this danger—which it identified as harsh autocratic rule in parts of the Islamic world—had to be addressed. Terrorism could be extirpated by bringing democracy to the greater Middle East. Afghanistan, and later Iraq, would become democracies, administration officials believed, as a result of invasions led by the United States; foreign-imposed regime change could work. The spread of democracy was assumed to be unstoppable, bringing peace and prosperity for all. The necessary ways, means, time, and patience to achieve these goals were also assumed to be available. Bush could adopt a Wilsonian perspective because this was a foreign policy vision familiar to the American public.

The administration’s 2002 National Security Strategy reflected President Bush’s thinking that the United States could not be safe unless the world, especially the broader Middle East, moved toward democracy.62 Only the fundamental transformation of despotic political systems that spawned transnational terrorism could eliminate the threat to the United States. However, the administration’s intellectually coherent grand strategy ultimately proved to rest on an empirically unsound foundation. The United States became bogged down in lengthy, hugely expensive wars in Iraq and Afghanistan, and democracy did not take hold. The administration’s utopian goals proved more challenging to achieve than anticipated.

Bush’s successors, Presidents Obama and Trump, retreated to familiar territory. Obama did not want a precipitous withdrawal from Afghanistan and Iraq, but he desperately wanted to limit American exposure. Even as Obama announced a surge in Afghanistan, he declared a time limit after which American troops would begin to withdraw. Skeptical of the efficacy of expensive and protracted state-building enterprises, he sought to dramatically reduce the U.S. military footprint in these two countries and the region beyond, but his efforts were ultimately frustrated by the rise of ISIS and its spread to surrounding areas.

The 2017 National Security Strategy, released under then-National Security Advisor H.R. McMaster, states that the world has returned to an era in which the dominant threat to American security derives from nation-state competitors, not transnational terrorism. China and Russia are characterized as states that have taken advantage of institutions that the United States created and that pose serious global and regional risks to U.S. interests. North Korea is characterized as a threat due to its nuclear weapons and Iran because of its potential to acquire them. Various nonstate actors, such as lethal international terrorist organizations and transnational criminal gangs, are also identified as significant threats but afforded a lower priority than in the administrations of George W. Bush and Barack Obama.

The security strategy released under Trump’s presidency returns, in some ways, to the language of the historic “city on a hill” strand of foreign policy thinking, stating, “America’s commitment to liberty, democracy, and the rule of law serves as an inspiration for those living under tyranny.”63 The Trump administration made clear its rejection of strategies that advocated military intervention and robust development efforts to deal with weaker, nondemocratic states, and it entertained no ambitions to invest American resources in efforts to get Iraq, Afghanistan, or other states on the path to democracy and prosperity. However, in a shift away from the “city on a hill” approach, the strategy states that the first priority for developing countries is to promote American investment in the private sector. In fragile states, the United States should “commit selectively” and try to strengthen the rule of law and support reformers. Trump may be fairly criticized for inconsistent foreign policy, but his instincts about limited possibilities in other countries are not wrong.

The national security strategies of 2002 and 2017 represent the two poles of American foreign policy. The 2002 National Security Strategy aimed at fundamental political transformation, advocating for the United States to bring democracy and, consequently, stability and prosperity to the Middle East and beyond. The 2017 National Security Strategy is closer to the dystopian, “city on a hill” vision of American foreign policy. President Trump’s own views appear to be right in line with much of this dystopian vision: the United States is unique; other countries will not necessarily follow in its footsteps; and other countries are likely to take advantage of the United States, so the most important goal for foreign policy should be to maintain American strength. A more balanced view, and one that would better serve American security, would recognize that the United States can make the world safer, even if it cannot recreate the world in its own image.

Endnotes

  • 61Louis Hartz, The Liberal Tradition in America: An Interpretation of American Political Thought (New York: Harcourt, Brace, and World, 1955).
  • 62The National Security Strategy of the United States of America (Washington, D.C.: The White House, September 2002).
  • 63National Security Strategy of the United States of America (Washington, D.C.: The White House, December 2017), 4.