Getting Duped: Trolling for Truth in Media
Someone is finally trying to make sense of how and why nearly half of the country came to believe, incorrectly, that the U.S. government had irrefutable proof of a relationship between Iraq and al Qaeda, in other words, between Saddam Hussein and the 9/11 attacks. From Scientific American comes a worthwhile read on how statements made in the media can surreptitiously plant distortions in the minds of millions, and on how you can recognize two commonly used fallacies to help separate fact from fiction.
In 2003 nearly half of all Americans falsely assumed that the U.S. government had found solid evidence for a link between Iraq and al Qaeda. What is more, almost a quarter of us believed that investigators had all but confirmed the existence of weapons of mass destruction in Iraq, according to a 2003 report by the University of Maryland’s Program on International Policy Attitudes and Knowledge Networks, a polling and market research firm. How did the true situation in Iraq become so grossly distorted in American minds?
Many people have attributed such misconceptions to a politically motivated disinformation campaign to engender support for the armed struggle in Iraq. We do not think the deceptions were premeditated, however. Instead they are most likely the result of common types of reasoning errors, which appear frequently in discussions in the news media and which can easily fool an unsuspecting public.
News shows often have an implicit bias that may motivate the portrayal of facts and opinions in misleading ways, even if the information presented is largely accurate. Nevertheless, by becoming familiar with how spokespeople can create false impressions, media consumers can learn to ignore certain claims and thereby avoid getting duped. We have detected two general types of fallacies—one of them well known and the other newly identified—that have permeated discussion of the Iraq War and that are generally ubiquitous in political debates and other discourse.
Spinning Straw into Fool’s Gold
One common method of spinning information is the so-called straw man argument. In this tactic, a person summarizes the opposition’s position inaccurately so as to weaken it and then refutes that inaccurate rendition. In a November 2005 speech, for example, President George W. Bush responded to questions about pulling troops out of Iraq by saying, “We’ve heard some people say, pull them out right now. That’s a huge mistake. It’d be a terrible mistake. It sends a bad message to our troops, and it sends a bad message to our enemy, and it sends a bad message to the Iraqis.” The statement that unnamed “people” are advocating a troop withdrawal from Iraq “right now” is a straw man, because it exaggerates the opposing viewpoint. Not even the most stalwart Bush adversaries backed an immediate troop withdrawal. Most proposed that the soldiers be sent home over several months, a more reasonable and persuasive plan that Bush undercut with his straw man.
The straw man is used in countless other contexts as well. In his acceptance speech at the 1996 Democratic Convention, for instance, Bill Clinton opined: “… with all respect [to Bob Dole], we do not need to build a bridge to the past. We need to build a bridge to the future.” Dole did discuss restoring the values of an earlier America, but Clinton falsely implied that Dole was only looking backward (whereas Clinton was looking forward). People may use a straw man to discredit theories to which they do not subscribe. Characterizing evolution, for example, as “all random chance” is a straw man argument; it misrepresents a complex theory that only partly rests on the randomness of mutations that may lead to better chances of survival.
Recently, in a 2006 paper co-authored with Scott F. Aikin, one of us (Talisse) documented a twist on the straw man tactic. In what Talisse dubs a weak man argument, a person sets up the opposition’s weakest (or one of its weakest) arguments or proponents for attack, as opposed to misstating a rival’s position as the straw man argument does. In a July 2007 edition of Talking Points, Bill O’Reilly took on a claim by the New York Times that we had lost the war in Iraq by saying that “the New York Times declared defeat in Iraq Sunday on its editorial page, and there’s no question the antiwar movement has momentum.” (The editorial actually said that “some opponents of the Iraq war are toying with the idea of American defeat,” but let us assume that O’Reilly’s characterization was correct.)
O’Reilly then offered a weak man explanation for the purported defeat: “The truth is the Iraqi government and many of its citizens are simply not doing enough to defeat the terrorists and corruption. The U.S.A. can’t control that country. No nation could…. Unfortunately, the Iraqi failure to help themselves has come true.” Although Iraq’s failure to aid in fighting terrorism and corruption could be why we are losing the war, the troubles in Iraq could also stem from a host of logistical reasons, some of which may shed a negative light on the current administration. O’Reilly, however, kept any discussion of these reasons offstage, suppressing the various other possible—and possibly more likely—reasons for “defeat” in Iraq. Meanwhile his claims that the “U.S.A. can’t control that country” and that “no nation could” deflected blame from the U.S. government.
Weak man arguments are pervasive. In a 2005 editorial in Denver’s Rocky Mountain News, conservative writer and activist David Horowitz picked on ethnic studies scholar Ward Churchill, formerly at the University of Colorado at Boulder, whose views he described as “hateful and ignorant.” Horowitz then went on to claim that Churchill’s radical “hate America” convictions “represent” those of a “substantial segment of the academic community.” Thus, he used the example of Churchill (the weak man) to argue that “tenured radicals” have made universities into leftist political institutions and subverted the academic enterprise, thereby failing to acknowledge the presence of more highly regarded and politically mainstream scholars in academia.
Trolling for Truth
Weak man tactics are harder to detect than those of the straw man variety. Because straw man arguments are closely related to an opponent’s true position, a clever listener might be able to spot the truth amid the hyperbole, understatement or other corrupted version of that view. A weak man argument, however, is more opaque because it contains a grain of truth and often bears little similarity to the stronger arguments that should also be presented. Therefore, a listener has to know a lot more about the situation to imagine the information that a speaker or writer has cleverly disregarded.
Nevertheless, an astute consumer of the news can catch many straw man and weak man fallacies by knowing how they work. Another strategy is always to consider a speaker’s or writer’s motivation or agenda and to be especially alert for skewed statements of fact in editorials, television opinion shows, and the like. It is also wise to seek out more balanced news sources. An alternative approach is to try to construct, in your own mind, the best argument against what you have heard before accepting it as true. Or simply ask yourself: Why should I not believe this?