One advantage that historians have over journalists concerns time: not so much freedom from urgent deadlines as the deeper perspective conferred by the years—or decades—between events and the act of writing about them. Twenty years is not a lot of time in historical terms, of course. But when it comes to understanding the war that the United States launched against Iraq in March 2003, it is all we have.
Not surprisingly, even two decades after the war began, there is no consensus regarding its legacy. This is to be expected, because all wars are fought three times. First comes the political and domestic struggle over the decision to go to war. Then comes the actual war, and all that happens on the battlefield. Finally, a long debate over the war’s significance ensues: weighing the costs and benefits, determining the lessons learned, and issuing forward-looking policy recommendations.
The events and other factors that led to the US decision to go to war in Iraq remain opaque and a matter of considerable controversy. Wars tend to fall into two categories: those of necessity and those of choice. Wars of necessity take place when vital interests are at stake and there are no other viable options available to defend them. Wars of choice, by contrast, are interventions initiated when the interests are less than vital, when there are options other than military force that could be employed to protect or promote those interests, or both. Russia’s invasion of Ukraine was a war of choice; Ukraine’s armed defence of its territory is one of necessity.
The Iraq War was a classic war of choice: the US did not have to fight it. Not everyone agrees with that assessment, however. Some contend that vital interests were indeed at stake because Iraq was believed to possess weapons of mass destruction that it might use or share with terrorists. Proponents of the war had little to no confidence that the US had other reliable options to eliminate the purported Iraqi WMDs.
Moreover, coming in the wake of the 11 September 2001 terrorist attacks, the decision reflected a staunch unwillingness to tolerate any risk to the US whatsoever. The idea that al-Qaeda or another terrorist group could strike the US with a nuclear, chemical or biological device was simply unacceptable. Then–vice president Dick Cheney was the primary exponent of this view.
Others, including President George W. Bush and many of his top advisers, appeared to be motivated by additional calculations, such as the pursuit of what they saw as a new and great foreign-policy opportunity. After 9/11, there was a widespread desire to send a message that the US wasn’t just on the defensive, but would be a proactive force in the world, taking the initiative with great effect.
Whatever progress had been made in Afghanistan after the US invaded and removed the Taliban government that had provided a safe haven to the al-Qaeda terrorists who planned and carried out the 9/11 attacks, it was deemed inadequate. Many in the Bush administration were motivated by a desire to bring democracy to the entire Middle East, and Iraq was viewed as the ideal country to set the transition in motion. Democratisation there would set an example that others across the region would be unable to resist following. And Bush himself wanted to do something big and bold.
I should make clear that I was part of the administration at the time, as the head of the Department of State’s policy planning staff. Like virtually all my colleagues, I thought Saddam Hussein possessed WMDs, namely chemical and biological weapons. Even so, I didn’t favour going to war. I believed there were other acceptable options, above all measures that could slow or stop the flow of Iraqi oil to Jordan and Turkey, as well as the possibility of cutting Iraq’s oil pipeline to Syria. Doing so would have put significant pressure on Saddam to allow inspectors into suspected weapons sites. If those inspections were blocked, the US could have conducted limited attacks against those facilities.
I wasn’t particularly worried about Saddam getting into the terrorism business. He ruled secular Iraq with an iron fist and considered religion-fuelled terrorism (with or without Iranian backing) the greatest threat to his regime. He also wasn’t the sort of person to hand WMDs over to terrorists, since he wanted to maintain tight control of anything that could be linked to Iraq.
I was deeply sceptical that Iraq—or the wider region—was ripe for democracy, given that the economic, political and social prerequisites were largely missing. I also foresaw that establishing democracy would require a large, prolonged military occupation that would likely prove costly on the ground and controversial at home.
The war itself went better, and certainly faster, than expected—at least in its initial phase. After the invasion in mid-March, it took only around six weeks to defeat the Iraqi armed forces. By May, Bush could claim that the mission was accomplished, meaning that Saddam’s government had been eliminated and any organised, armed opposition had disappeared.
But while the US force that had been sent to remove the government was more than capable of winning the war, it couldn’t secure the peace. Core assumptions that had informed the planning of the invasion—namely, that Iraqis would welcome the troops as liberators—might have been true for a few weeks, but not after that.
The Bush administration wanted to reap the benefits of nation-building without putting in the hard work it required. Worse still, those in charge disbanded the former Iraqi regime’s security forces and ruled out political and administrative roles for the many Iraqis who had been members of the ruling Ba’ath (Renaissance) Party, even though membership of the party was often essential to employment under Saddam’s regime.
The situation on the ground deteriorated rapidly. Looting and violence became commonplace. Insurgent movements and a civil war between Sunni and Shia militias destroyed what temporary order had been established. Conditions didn’t begin to improve until 2007, when the US deployed an additional 30,000 troops to Iraq in the famous ‘surge’. But four years later, Bush’s successor, Barack Obama, decided to withdraw US troops in the face of worsening political relations with the Iraqi government.
The results of the war have been overwhelmingly negative. Yes, a horrendous tyrant who had used chemical weapons against his own people and initiated wars against two of his neighbours was ousted. For all its flaws, Iraq today is better off than it was, and its long-persecuted Kurdish minority enjoys a degree of autonomy that it was previously denied.
But the cost side of the ledger is far longer. The Iraq War took the lives of some 200,000 Iraqi civilians and 4,600 US soldiers. The economic costs to the US were in the range of US$2 trillion, and the war upset the balance of power in the region to the benefit of neighbouring Iran, which has increased its sway over Syria, Lebanon and Yemen, in addition to Iraq.
The war also isolated the US, owing to its decision to fight alongside only a few partners and without explicit backing from the United Nations. Millions of Americans became disillusioned with their government and US foreign policy, helping to set the stage for the anti-government populism and foreign-policy isolationism that have dominated US politics in recent years. The war ultimately proved to be a costly distraction. Without it, the US could have been in a much better position to reorient its foreign policy to contend with a more aggressive Russia and a more assertive China.
The war’s lessons are manifold. Wars of choice should be undertaken only with extreme care and consideration of the likely costs and benefits, as well as of the alternatives. That wasn’t done in the case of Iraq. On the contrary, decision-making at the highest levels was often informal and lacking in rigour. The lack of local knowledge was pervasive. It may seem obvious to suggest that it is dangerous or even reckless to invade a country that you don’t understand, but that’s exactly what the US did.
Assumptions can be dangerous traps. The decision to go to war rested on a worst-possible-case assessment that Iraq possessed WMDs and would use them or provide them to those who would. But if foreign policy always operated on this basis, interventions would be required everywhere. What’s needed is a balanced consideration of the most likely scenarios, not just the worst ones.
Ironically, the analysis of what would follow a battlefield victory in Iraq erred in the opposite direction: US officials placed all their chips on a best-case scenario. Iraqis, the thinking went, would roll out the welcome mat for those who had liberated them from Saddam, quickly put aside their sectarian differences and embrace democracy. We know what happened instead. The fall of Saddam became a moment for violently settling scores and jockeying for position. Promoting democracy is a daunting task. It is one thing to oust a leader and a regime, but it is quite another thing to put a better, enduring alternative in its place.
Still, common critiques of the war get it wrong when they conclude that the US government cannot ever be trusted to tell the truth. Yes, the US government maintained that Iraq possessed WMDs, and my boss at the time, Secretary of State Colin Powell, made that case before the United Nations. It turned out not to be true.
But governments can and do get things wrong without lying. More than anything else, the run-up to the Iraq War demonstrated the danger of leaving assumptions unexamined. Saddam’s refusal to cooperate with UN weapons inspectors was seen as proof that he had something to hide. He did, but what he was hiding was not WMDs but the fact that he did not have them. That revelation, he feared, would make him look weak to his neighbours and his own people.
Others have argued that the war was undertaken at Israel’s behest. That, too, is not true. I remember meetings with Israeli officials who suggested that the US was going to war with the wrong country. They saw Iran as the much greater threat. But those officials held back from saying so publicly, because they sensed that Bush was determined to go to war with Iraq and didn’t want to anger him with futile attempts at dissuasion.
Nor did the US go to war for oil, as many on the left have often insisted. Narrow commercial interests are not generally what animate US foreign policy, especially when it comes to using military force. Rather, interventions are predicated on, and motivated by, considerations of strategy, ideology or both. Indeed, former president Donald Trump criticised his predecessors for not demanding a share of Iraq’s oil reserves.
The Iraq War also contains a warning about the limits of bipartisanship, which is frequently touted in US politics as if it were a guarantee of good policy. But it is no such thing. There was overwhelming bipartisanship in advance of not just the Iraq War but also the Vietnam War. The 2002 vote authorising the use of military force against Iraq passed with clear support from both major political parties. But even before that, President Bill Clinton’s administration and Congress had come together, in 1998, to call for regime change in Iraq. More recently, we have seen bipartisanship in opposition to free trade, and in support of leaving Afghanistan and confronting China.
But, just as broad political support is no guarantee that a policy is right or good, narrow support doesn’t necessarily mean that a policy is wrong or bad. The 1990–91 Gulf War—in which the US successfully led a UN-backed international coalition that liberated Kuwait at minimal cost—barely passed Congress, owing to considerable Democratic opposition. Whether or not a policy has bipartisan support tells us nothing about the quality of the policy.
In 2009, I wrote a book arguing that the 2003 Iraq War was an ill-advised war of choice. More than a decade later, and 20 years after the war began, I see no reason to amend that judgement. It was a bad decision, badly executed. The US and the world are still living with the consequences.