Fallacies, Episode II: The Use and Abuse of Evidence

In episode I, you learned about logical fallacies that represent mistakes in moving from our premises to our conclusions, due to bad logical form. But even if we’re careful to support our conclusions with reasoning and evidence, we can often run into problems with the evidence itself. “Facts” are supposed to be objective — but facts need to be interpreted, and this process is often subjective.


For that reason, evidence can often mislead and confuse. Have we interpreted it correctly? Do we have enough context to understand what it really indicates?

As consumers of media, we need to be aware of how frequently the supposed “evidence” that supports a claim can be misinterpreted and misused.

As thinkers and writers, we must make sure that we always consider source and context to ensure that we are using evidence appropriately.

Misuse of Authority

To establish credibility for our claims, we can appeal to an authority who advocates our view—but this “authority” must themselves be credible. The problem is that people sometimes appeal to “authorities” who are biased or incompetent.

Have you ever seen a TV commercial where a football player tries to tell you which is the best car to buy? And did you ask yourself why a football player should know anything about cars? He might well be an authority on something—football, for example, or exercise, or head injuries. But if he’s not an expert on cars, his opinion or testimony is not an appropriate and credible source to support a claim about which car is the best buy.


A doctor or a chemist probably knows more about what makes a good energy drink than Tupac’s ghost.

Even when the authority cited has expertise on the topic, he or she may be biased about the topic or have some hidden agenda. Let’s say you cited the American Medical Association to support the claim that doctors are overworked and underpaid. The AMA certainly has the expertise to speak with authority about doctors and the healthcare industry, but an organization run by and composed of doctors has a vested interest in representing its members as needing shorter hours and more pay. If you use its claims to support your argument, you might be criticized for relying on a biased source.1

Misleading Statistics

One way to counteract that bias would be to make sure that the authority’s claim is based in fact. While the AMA certainly might have their own reasons for making the information known, their claim would seem much more legitimate if they could back it up with some numbers – say, statistics that show that the average yearly income of M.D.’s in this country is only $40,000, or something like that.2


Why didn’t I become a Philosophy professor like my parents wanted?

But wait a minute. M.D.’s? So that could include anyone with a medical degree, regardless of whether s/he is a practicing physician. Sometimes people get medical degrees, but then go into other fields. So this stat doesn’t necessarily tell me how much doctors earn.

In fact, once you have an M.D., you have it for life, even if you stop being a doctor. So, as far as we know, that statistic could include individuals who aren’t earning any income at all, because they’re unemployed or retired. If that average includes people with an income of $0, they could be throwing off the entire calculation.
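To see just how much a few zero incomes can drag down an average, here’s a quick sketch. All of the salaries below are invented purely for illustration:

```python
# Hypothetical M.D. incomes: eight practicing physicians plus
# two degree-holders who currently earn nothing.
incomes = [220_000, 180_000, 250_000, 190_000, 210_000,
           230_000, 175_000, 205_000, 0, 0]

earners = [x for x in incomes if x > 0]

mean_all = sum(incomes) / len(incomes)        # zeros included
mean_earners = sum(earners) / len(earners)    # zeros excluded

print(f"average over everyone: ${mean_all:,.0f}")
print(f"average over earners:  ${mean_earners:,.0f}")
```

Just two non-earners knock the “average M.D. income” down from $207,500 to $166,000 — a fifth of the apparent pay gap comes from people who aren’t doctors at all.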

We frequently tend to look at averages as indicators of overall conditions, but a few outliers can throw off an average, and produce results that are extremely misleading.

As the New York Times reports,

“In 2011, for example, the average income of the 7,878 households in Steubenville, Ohio, was $46,341. But if just two people, Warren Buffett and Oprah Winfrey, relocated to that city, the average household income in Steubenville would rise 62 percent overnight, to $75,263 per household.”3
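You can check the Times’s arithmetic yourself. The household count and the two averages come straight from the quote; the combined income of the two newcomers is derived from those figures, not something the article reports:

```python
households = 7_878
old_mean = 46_341   # average household income, Steubenville, 2011
new_mean = 75_263   # average after two hypothetical newcomers arrive

old_total = households * old_mean
# Combined income the two newcomers would need for the new average to hold:
newcomers = new_mean * (households + 2) - old_total

rise = (new_mean - old_mean) / old_mean * 100
print(f"implied combined income of the two newcomers: ${newcomers:,}")
print(f"rise in the average: {rise:.0f}%")
```

Two incomes totaling roughly $228 million are enough to move nearly 8,000 households’ average by 62 percent — which is exactly why one or two outliers can make an average meaningless.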

While averages are especially prone to producing skewed results, any statistics can mislead if we’re not careful to examine the context.

Consider these statistics, provided by Fox News:4

[Image: Fox News on-screen poll graphic]

We’re meant to believe that these statistics support the following claim:

Americans believe that the science behind reports of global climate change is false.

But what do these statistics really indicate? How many people were actually surveyed—does this represent a significant number? Who was surveyed—could the survey results be biased? Was the survey question – “Did scientists falsify research to support their own theories on Global Warming?” – asked in a general sense, or was it asked in relation to a specific situation?

Furthermore, we must ask whether the statistics being presented are the appropriate ones to support the argument. Even if the majority of Americans believe the science behind theories of global climate change has been falsified, does this demonstrate that it actually has been falsified? That seems to be the inference we are expected to make here, but there’s no reason to believe that it’s true.

And wait another minute. Something’s off here. Look at those numbers again.

[Image: Fox News on-screen poll graphic]

These percentages should represent some fraction of the total number of people surveyed. If they add up to 120%, this means they supposedly have survey data for more people than were actually surveyed; these numbers can’t possibly be accurate.
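A sanity check like this is easy to automate. For a poll question with mutually exclusive answers, the response shares should sum to roughly 100%. The shares below are hypothetical stand-ins, chosen only so that they total 120% like the numbers in the Fox graphic:

```python
# Hypothetical response shares (percent) for one poll question
# with mutually exclusive answer choices.
shares = {"very likely": 59, "somewhat likely": 35, "not very likely": 26}

total = sum(shares.values())
print(f"total: {total}%")

# Allow a point or two of rounding slack; anything beyond that
# means the numbers can't describe a single group of respondents.
if not 98 <= total <= 102:
    print("impossible: shares add up to more than the people surveyed")
```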

The trouble is that because statistics seem to be objectively true – to be just facts – we often don’t look as closely at them as we should.

Anytime we analyze statistics, and try to determine what they indicate, we need to keep a few things in mind:

  • What does the statistic relate to? If it gives a percentage, what is it a percentage of?
  • Can we be sure the statistic is significant? What is the scope of the data it includes (e.g. how many years, how many countries, how many test subjects, etc.)? How does this data compare to other available data on the topic?
  • Who collected the data? For what purpose?
  • When was the data collected? Might it no longer be valid?
  • Was the data collected under controlled conditions that would help eliminate biases and inaccurate results?

Anytime you see statistics, be sure to look for this info, and anytime you use statistics in your writing, provide your readers with this info so that they can see you’re using your stats responsibly.

Good rule of thumb: don’t just give a stat (or any other evidence) and expect your readers to trust that it means what you say it means; give them enough information and context so they can judge for themselves, and agree with you.

Card Stacking

Card stacking is another way of manipulating evidence and people’s perception of it. By selecting only the evidence that supports their own claim, or by overemphasizing that evidence and minimizing any contrary evidence, people stack the deck in their favor.5

Of course, if we want to persuade people, we naturally want to include evidence that will support our claims – and we’ll want to line up as much supporting evidence as possible. But we can’t ignore all evidence to the contrary. It’s unethical to mislead others who don’t realize we’re stacking the deck; and for those who know there’s more to the story, our failure to acknowledge other points of view will make us seem uninformed and clueless at best, or biased and shady at worst.


Suppressing evidence?

For us consumers of media, card stacking is a sneaky little beast to watch out for. It can be tough to spot, because the evidence presented could actually consist of relevant and accurate facts – it’s just that those facts may represent only part of the picture.6

Let’s say you’re skimming the blogosphere and you read this:

The stock market today is in good shape. Some oil company stocks are up 30%. Some chemical companies have the highest profits ever.7

In the above example, only a couple out of the scads of industries that make up the “stock market” are cited in the author’s evidence. Even if these facts are accurate, we would need to know a lot more to determine that the stock market overall is in good shape.
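A toy example makes the trick concrete. The sector returns below are invented for illustration: the two sectors the author mentions look great, while the market as a whole is actually down.

```python
# Invented year-to-date returns (percent) by sector -- illustration only.
returns = {
    "oil": 30.0, "chemicals": 22.0, "retail": -10.0,
    "tech": -20.0, "airlines": -25.0, "banking": -9.0,
}

cherry_picked = ["oil", "chemicals"]
picked_avg = sum(returns[s] for s in cherry_picked) / len(cherry_picked)
overall_avg = sum(returns.values()) / len(returns)

print(f"cherry-picked sectors: {picked_avg:+.1f}%")  # looks healthy
print(f"all sectors:           {overall_avg:+.1f}%") # not so much
```

Every number the author cites can be perfectly accurate, and the overall impression still completely wrong — that’s what makes card stacking hard to catch.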


Invest in Blackberry, they said.
It’s a sure thing, they said.

This author may simply be guilty of jumping to conclusions on the basis of only one or two examples (a hasty generalization). But if he were, say, making this statement because he’s trying to get you to invest your money with him, you should ask yourself whether he could be hiding information about the stocks that aren’t doing so well.

We have a tendency to think that “facts” are incontrovertible evidence. But not all “facts” are equal. Facts in themselves are not evidence to support a claim—they have to be interpreted before we can say what they support or what they indicate. And that’s your job.

For more info on logical form and logical fallacies, check out the other posts in this series.

1. Adapted from Robert B. Donald, et al. Writing Clear Essays. 3rd ed. Prentice Hall, 1996. p. 302.
2. Arbitrary fabricated number, concocted for demonstration purposes only.
3. Stephanie Coontz. “When Numbers Mislead.” New York Times. 25 May 2013. www.nytimes.com/2013/05/26/opinion/sunday/when-numbers-mislead.html?_r=0. Accessed 20 Oct. 2016.
4. Simon Maloy. “Fox News fiddles with climate change polling.” Media Matters for America. 8 Dec. 2009. mediamatters.org/blog/2009/12/08/fox-news-fiddles-with-climate-change-polling/157839. Accessed 20 Oct. 2016.
5. Robert B. Donald, et al. pp. 302–303.
6. Robert B. Donald, et al. p. 303.
7. Robert B. Donald, et al. p. 303.

