Logical Fallacies

In discussions about GMOs, both proponents and opponents commit logical fallacies and accuse the other side of committing them. Sometimes arguments seem compelling even though they are based on faulty logic. In this post, you can find some common and not-so-common logical fallacies, conveniently listed in alphabetical order.

Most of this post was written by Brian Dunning of the excellent podcast Skeptoid, who has generously given us permission to use his work here with some modification. The content can be found in its original form at Skeptoid, in the episodes A Magical Journey through the Land of Logical Fallacies – Part 1, A Magical Journey through the Land of Logical Fallacies – Part 2, and Some New Logical Fallacies.

What is a fallacy and why are they used?

First, what is a logical fallacy? The Wikipedia definition is as good as any:

In logic and rhetoric, a fallacy is a misconception resulting from incorrect reasoning in argumentation. By accident or design, fallacies may exploit emotional triggers in the listener or interlocutor (e.g. appeal to emotion), or take advantage of social relationships between people (e.g. argument from authority). Fallacious arguments are often structured using rhetorical patterns that obscure the logical argument, making fallacies more difficult to diagnose. Also, the components of the fallacy may be spread out over separate arguments.

A shorter definition for logical fallacy is provided by Brian: “the use of rhetoric as a substitute for good evidence.”

Scientific arguments are won or lost by the scientific method. Either the data supports a claim or it does not. Sometimes, people who don’t have data to support their arguments will deliberately employ logical fallacies in an attempt to convince people that their claim is correct. Fallacies can also be employed accidentally, when someone mistakes compelling rhetoric for a sound argument.

Many of the fallacies listed here can be part of a legitimate discussion. The problem comes when we connect an otherwise legitimate statement to an unrelated claim. For example, stating a fact about a person is simply stating a fact. Only when we use that fact in an attempt to support or take down a claim does it become a fallacy.

This list contains many fallacies. Some are “traditional” fallacies and some are new arrivals. Surely you have seen or even used some of them. Are we missing any fallacies? Let us know in the comments, and we’ll add them to the list. Know of any interesting examples of fallacies being used to discuss agriculture or biotechnology? Please share!

The take-home message of this post is that, if you’re going to have a debate, stick with valid arguments. In a scientific debate particularly, stick with the data. Don’t get caught using fallacies. Hopefully, familiarity with these devices will help you identify them in conversation. And when you point them out, you will strip your opponent of the tools on which he depends most.

Ad Hominem

From the Latin for “to the person”, an ad hominem is an attack against the arguer rather than the argument. This doesn’t mean that you simply call the person a jerk; rather, it means that you use some weakness or characteristic of the arguer to imply a weakness of the argument.

Starling: “I think Volvos are fine automobiles.”
Bombo: “Of course you’d say that; you’re from Sweden.”

Starling’s Swedish heritage has nothing to do with the quality of Volvo automobiles, so Bombo’s reply is an attempt to change the subject and avoid the issue at hand. Bombo is trying to imply that Starling’s Swedish heritage biases, and thus invalidates, his statement. In fact, one thing has nothing to do with the other. Ad hominem arguments try to point out fault with the arguer, instead of with the argument.

Now, there are cases where it might be appropriate to consider the source of the information. If the authors of a study on plant biology are all physicists, or the author of a book about agriculture is actually a businessman, we might wonder if the person is familiar enough with the subject to design a valid study or to include all the relevant information in a book. If an article is written by or funded by persons who work for an organization with a known agenda we might have concerns about bias. We can then evaluate the work with a skeptical eye. Considering the source isn’t an ad hominem unless you throw out everything by the person simply because they are who they are.

Anecdotal Evidence

One of the most common ways to support a claim is through the fallacious misuse of anecdotal evidence. Anecdotal evidence is information that cannot be tested scientifically, or that could be tested scientifically but has not been. In practice this usually refers to personal testimonials and verbal reports. Anecdotal evidence often sounds compelling because it can be more personal and captivating than cold, uninteresting factual evidence.

Many people believe that their own experience trumps scientific evidence, and that merely relating that experience is sufficient to prove a given claim.

Starling: “Every scientific test of magical energy bracelets shows that they have no effect whatsoever.”
Bombo: “But they work for me, therefore I know for a fact they’re valid and that science is wrong.”

Is Bombo’s analysis of his own experience wrong? If it disagrees with well-performed controlled testing, then yes, he probably is. Personal experiences are uncontrolled and subject to outside influences, biases, preconceived notions, and random variance. Relating an anecdotal experience proves nothing.

Bombo: “My cousin’s friend took zinc pills and it cured her cold.”
Starling: “Perhaps the cold just went away by itself.”

Perhaps there is something to zinc pills, but without a randomized controlled study, we can’t know for sure. Anecdotal evidence is great for suggesting new directions in research, but by itself it is not evidence.

Anecdotal evidence is not completely useless. You could say “We saw the Bigfoot corpse at this location”, and if that information helps with the recovery of an actual body, then the anecdotal evidence was of tremendous value. But, note that it’s the Bigfoot corpse itself that comprises scientific evidence, not the story of where it was seen.

When anecdotes are presented as evidence or in place of evidence, you have very good reason to be skeptical.

Appeal to Authority

This type of argument refers to a special authoritative source as validation for the claim being made. Every time you see an advertisement featuring someone wearing a white lab coat, or telling you what 4 out of 5 dentists surveyed said, you’re seeing an appeal to authority.

“Acupuncture is based on centuries-old Chinese knowledge.”
“A growing number of scientists say that evolution is too improbable.”

These statements are true. They become a problem only when we use them to make a claim. For example:

“Acupuncture is based on centuries-old Chinese knowledge, therefore we know it works.”
“A growing number of scientists say that evolution is too improbable, therefore we need to question evolution.”

An appeal to authority is the opposite of an ad hominem attack, because here we are referring to some positive characteristic of the source, such as its perceived authority, as support for the argument. But a good authority supports a position because that position has been shown to be otherwise justified or evidenced, not the other way around. If you say that scientists support Theory X, are those scientists claiming that Theory X is true because they believe it?

We often see people appealing to authority when they say things like:

“This article in a peer-reviewed scientific journal says that people are getting fatter.”
“This PhD scientist says that people are getting fatter.”

Being peer-reviewed or having a PhD is not the end-all-be-all. The more important question is whether a particular claim fits within the established body of literature for that subject. If it doesn’t fit, then more research is needed before we can come to any conclusions.

Similarly, if a person has an advanced degree, that does not automatically mean that anything they say is correct. No good scientist attaches significance to their own authority. Theory X needs to stand on its own; an appeal to authority does not provide any useful support.

Appeal to Dead Puppies

Sometimes tugging at the heartstrings with a tragic tale is enough to quash dissent. Who wants to take the side of whatever malevolent force might be associated with death and suffering?

Starling: “Thank you, door-to-door solicitor, but I choose not to purchase your magazine subscription.”
Bombo: “But then I’ll be forced to turn to drugs and gangs.”

The Appeal to Dead Puppies draws a pathetic, poignant picture in order to play on your emotions. Recognize it when you hear it, and keep your emotions separate from the facts.

Appeal to Hitler

This one is inspired by Godwin’s Law, in which Mike Godwin stated “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.” Ever since, such arguments have become known as the reductio ad Hitlerum, or the Appeal to Hitler. It’s a garden variety “guilt by association” charge, saying you’re wrong because Hitler may have thought or done something similar.

Bombo: “You think illegal aliens should be deported? Sounds exactly like how the Nazis got started.”

Starling gives the common reply:

Starling: “The Nazis also owned dogs and played with their children.”

For good measure, Bombo comes back with a “straw man on a slippery slope” argument:

Bombo: “Are you saying everything about the Nazis was perfect?”

Appeal to Ignorance

Argumentum ad ignorantiam considers ignorance of something to be evidence that it does not exist. If I do not understand the mechanism of the Big Bang, that proves that there is no knowledge that supports it as a possibility and it therefore did not happen. Anything that is insufficiently explained or insufficiently understood is thus impossible.

Starling: “It is amazing that life arose through the fortuitous formation of amino acids in the primordial goo.”
Bombo: “A little too amazing. I can’t imagine how such a thing could happen; creationism is the only possibility.”

Using the absence of evidence as evidence of absence is a common appeal to ignorance. People who believe the Phoenix Lights could not have been simple flares generally don’t understand, or won’t listen to, the thorough evidence that they were. Their glib layman’s understanding of what a flare might look like is inconsistent with their interpretation of the photographs, so they use an appeal to ignorance as proof that flares were not the cause.

Appeal to Lack of Authority

Authority has a reputation for being corrupt and inflexible, and this stereotype has been leveraged by some who assert that their own lack of authority somehow makes them a better authority.

Starling might say of the 9/11 attacks: “Every reputable structural engineer understands how fire caused the Twin Towers to collapse.”
Bombo can reply: “I’m not an expert in engineering or anything, I’m just a regular guy asking questions.”
Starling: “We should listen to what the people who know what they’re talking about have to say.”
Bombo: “Someone needs to stand up to these experts.”

The idea that not knowing what you’re talking about somehow makes you heroic or more reliable is incorrect. More likely, your lack of expertise simply makes you wrong.

Appeal to Quantum Physics

This is a form of special pleading, a scientific-sounding way of claiming that the way your magical product or service works is beyond the customer’s understanding; in this case, based on quantum physics. That sounds impressive, and who’s qualified to argue? Certainly not the average layperson.

Bombo: “Quantum physics explains why pressure points on the sole of your foot correspond with other parts of your anatomy.”

Here’s a tip. If you see or hear the phrase “quantum physics” mentioned in a context that is anything other than a scientific discussion of subatomic theory, raise your red flag. Someone is probably trying to hoodwink you by namedropping a science that they probably understand no better than your cat does.

Argument from Anomaly

This one is big with ghost hunters and UFO enthusiasts. Anything that’s anomalous, or otherwise not immediately, absolutely, positively, specifically identifiable, automatically becomes evidence of the paranormal claim.

Starling: “We found a cold spot in the room with no apparent source.”
Bombo: “That must be a ghost.”

Since the anomaly is, well, an anomaly, by definition it can’t be proven to have been anything other than a ghost or a UFO or a leprechaun or whatever they want to say, not without well-designed experiments. Since the skeptic can’t prove otherwise, the Argument from Anomaly is a perfect way to prove the existence of ghosts. Or nearly perfect, I should say, because it’s not.

Of course, argument from anomaly doesn’t just work for UFOs and ghosts.

Starling: “This researcher found hamsters that have strange pouches of hair growing in their mouths.”
Bombo: “It must be due to GMOs.”

Bandwagon Fallacy

Also known as argumentum ad populum (appeal to the masses) or argument by consensus, the bandwagon fallacy states that if everyone else is doing it, so should you. If most people believe something or act a certain way, it must be correct.

“Everyone knows that O.J. Simpson was guilty; so he should be in jail.”
“Over 700 scientists have signed Dissent from Darwin, so you should reconsider your belief in evolution.”

The bandwagon fallacy can also be used in reverse: If very few people believe something, then it can’t be true.

Starling: “Firefly was a really cool show.”
Bombo: “Are you kidding? Almost nobody watched it.”

Consider how many supernatural beliefs are firmly held by a majority of the world’s population, and the lameness of the bandwagon fallacy comes into pretty sharp focus. The majority might sometimes be right, but they’re hardly reliable.

Better Journal Fallacy

It’s common for purveyors of woo to trot out some worthless, credulous magazine that promotes their belief, and refer to it as a peer-reviewed scientific journal:

Starling: “If telekinesis was real, you’d think there would be an article about it in the American Journal of Psychiatry.”
Bombo: “That rag is part of the establishment conspiracy to suppress psi research. You need to turn to a reputable source like the Journal of the American Society for Psychical Research. It’s peer-reviewed.”

And so it is, but its reviewers are people who have failed to establish scientific credibility, as has the journal itself. There are actually metrics for these things.

The productivity and impact of individual researchers can be described by the Hirsch index (or h-index), which attempts to measure both the number and the quality of their publications through citation counts. The number of citations a study receives is important because citations indicate that other researchers in the field consider the study a good source. A journal’s reputation can similarly be gauged by its impact factor, which applies the same citation-based reasoning to the journal as a whole. Although these indexes are not perfect, you need never lose a “my peer-reviewed scientific journal is better than yours” debate. Look up impact factors in the Thomson Reuters Journal Citation Reports through sciencewatch.com.
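
To make the h-index concrete, here is a minimal Python sketch (the citation counts are invented for illustration): a researcher has index h if h of their papers have each been cited at least h times.

    def h_index(citations):
        # Largest h such that at least h papers have >= h citations each.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Five papers cited 10, 8, 5, 4, and 3 times: four of them have at
    # least 4 citations each, but there aren't five with at least 5.
    print(h_index([10, 8, 5, 4, 3]))  # prints 4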

Chemical Fallacy

Want to terrify people and frighten them away from some product or technology that you don’t like? Mention chemicals. Chemical farming, chemical medicines, chemical toxins. As scary as the word is, it’s almost meaningless, because everything is a chemical. Even happy flowers and kittens consist entirely of chemicals. It’s a weasel word, nothing more, and its use often indicates that its user was unable to find a cogent argument.

Cherry Picking Fallacy

This fallacy is related to the appeal to authority. We often read blog posts and articles about a press release, a report, or, less frequently, a peer-reviewed article, claiming that this individual report or study proves some broad point. The problem is that there may be many other reports and studies that disprove that point. Occasionally there is a major change in scientific understanding, but such changes are rare. Focusing on one report or study while ignoring the rest is a fallacy.

Confusion of Correlation and Causation

Closely related to post hoc, but a little bit different, is the confusion of correlation and causation. Post hoc assumptions do not necessarily involve any correlation between the two observations. When there is a correlation, but still no valid causation, we have a more convincing, but equally fallacious, argument.

Starling: “Chinese people eat a lot of rice.”
Bombo: “Therefore the consumption of rice must cause black hair.”

Due to the nature of Chinese agriculture, there is indeed a worldwide correlation between rice consumption and hair color. This is a perfect example of how causation can be invalidly inferred from a simple correlation.
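
As a toy illustration (all numbers invented), here is a short Python simulation in which a hidden confounding variable, region, drives both rice consumption and hair color. The two end up strongly correlated even though neither causes the other:

    import random

    random.seed(0)
    rice, black_hair = [], []
    for _ in range(10_000):
        # The confounder: which region a person lives in.
        region_a = random.random() < 0.5
        # Region drives rice consumption (kg/year, made-up figures)...
        rice.append(random.gauss(150 if region_a else 40, 25))
        # ...and, separately, the prevalence of black hair.
        black_hair.append(1 if random.random() < (0.9 if region_a else 0.1) else 0)

    def corr(xs, ys):
        # Pearson correlation coefficient.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    # Prints a strong correlation (about 0.7), yet eating rice does not
    # change anyone's hair color; the hidden region variable drives both.
    print(round(corr(rice, black_hair), 2))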

Excluded Middle

The excluded middle assumes that only one of two ridiculous extremes is possible, when in fact a much more moderate middle-of-the-road result is more likely and desirable. An example of an excluded middle would be an argument that either every possible creation story should be taught in schools, or none of them. These two possibilities sound frightening, and may persuade people to choose the lesser of two evils and allow religious creation stories to be taught alongside science. In fact, the much more reasonable excluded middle, which is to teach science in science classes and religion in religion classes, is not offered.

The excluded middle is more formally known as a false dilemma or false dichotomy; arguments built on it often take the form of a reductio ad absurdum, a reduction to the absurd. Bertrand Russell famously illustrated how an absurd premise can be fallaciously used to support any conclusion whatsoever:

Starling says: “Given that 1 = 0, prove that you are the Pope.”
Bombo replies: “Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope.”
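
Russell’s joke can be written out as a short derivation (using |S| for the number of members of a set S):

    \begin{align*}
    1 &= 0 && \text{(the absurd premise)} \\
    2 &= 1 && \text{(add 1 to both sides)} \\
    |\{\text{Bombo},\ \text{the Pope}\}| = 2 &= 1 && \text{(the two-member set has only one member)} \\
    \text{Bombo} &= \text{the Pope} && \text{(its one member is both of them)}
    \end{align*}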

Just keep in mind that if your opponent is presuming extremes that are absurd, he is excluding the less absurd middle. Don’t fall for it.

Fallacy of the Consequent

Drawing invalid subset relationships in the wrong direction is called the fallacy of the consequent. Cancers are all considered diseases, but not all diseases are cancers. Stating that if you have a disease it must be cancer is a fallacy of the consequent.

Listen to how Bombo blames Starling’s failure to heal upon his failure to take one particular treatment, without regard for whether that treatment is a valid one for Starling’s particular condition:

Starling: “I am dying of bubonic plague.”
Bombo: “You did not drink enough wheatgrass juice.”

Even assuming that wheatgrass juice were a suitable treatment for anything, it would still not be a suitable treatment for everything, so Bombo’s suggestion that Starling’s illness is the consequence of his failure to drink wheatgrass juice is a fallacy of the consequent.

Loaded Question

A loaded question is also known as the fallacy of multiple questions rolled into one, or plurium interrogationum. If I want to force you to answer one question in a certain way, I can roll that question up with another that offers you two choices, both of which require my desired answer to the first question. For example:

“Is this the first time you’ve killed anyone?”
“Have you always doubted the truth of the Bible?”
“Is it nice to never have to hassle with taking a shower?”

Any answer you give forces you to concede the answer I was looking for: that you have killed someone, that you doubt the truth of the Bible, or that you don’t shower or bathe. Loaded questions should not be tolerated and certainly should never be answered.

Michael Jordan Fallacy

This one can be used to impugn the motives of anyone in the world, in an effort to prove they are driven by greed and don’t care about anyone else’s problems:

Bombo: “Just think if Michael Jordan had used all his talents and wealth to feed third world children, rather than to play a sport.”

Of course, you can say this about anyone, famous or not:

Bombo: “If your doctor really cared about people’s health, he’d sell everything he owned and become a charitable frontier doctor in Africa.”

In fact, for charitable efforts to exist, we need the Michael Jordans of the world playing basketball. Regular non-charitable activities, like your doctor’s business office, are what drives the economic machine that funds charity work. The world’s largest giver, the Bill & Melinda Gates Foundation, would not exist had a certain young man put his talents toward the Peace Corps instead of founding a profitable software giant.

Non-Sequitur

From the Latin for “it does not follow”, a non-sequitur is an obvious and stupid attempt to justify one claim using an irrelevant premise. Non-sequiturs work by starting with a reasonable-sounding premise that it’s hoped you will agree with, and attaching it (like a rider to a bill in Congress) to a conclusion that has nothing to do with it. The sentence is phrased in such a way as to make it sound like you have to accept both or neither:

“Corporations are evil, thus acupuncture is good.”
“The government is evil, thus UFOs are alien spacecraft.”
“Allah is great, thus all Christians should be killed.”

When we do science, it takes more than simply connecting two phrases with the word “thus” to draw a valid relationship. Thus, non-sequiturs are not valid devices to prove a point scientifically.

Observational Selection

Observational selection is the process of keeping the sample of data that agrees with your premise, and ignoring the sample of data that does not. Observational selection is the fallacy behind such phenomena as the Bible Code, psychic readings, the Global Consciousness Project, and faith healing. Observational selection is also a tool used by pollsters to produce desired survey results, by surveying only people who are predisposed to answer the poll the way the pollster wants.

Bombo: “The face of Satan is clearly visible in the smoke billowing from the World Trade Center.”
Starling: “And in one of the other 950,000 frames of film, the smoke looks like J. Edgar Hoover; in another, it looks like a Windows XP icon; and in another it looks like a map of Paris.”

Remember that one out of every million samples of anything is an incredible one-in-a-million rarity. This is a mere inevitability, but if observational selection compels you to ignore the other 999,999 samples, you’re very easily impressed.
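
A back-of-the-envelope calculation shows why such rarities are inevitable in a large enough collection. (The one-in-a-million probability per frame is invented for illustration.)

    # Suppose any single frame has a one-in-a-million chance of
    # resembling some particular face purely by coincidence.
    p_single = 1e-6
    n_frames = 950_000

    # Probability that at least one of the frames "matches":
    p_at_least_one = 1 - (1 - p_single) ** n_frames
    print(f"{p_at_least_one:.0%}")  # about 61%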

Poisoning the Well

When you preface your comments by casually slipping in a derogatory adjective about your opponent or his position, you’re doing what’s called poisoning the well. A familiar example is the way Intelligent Design advocates poison the well by referring to evolution as Darwinism, as if it’s about devotion to one particular researcher. Or:

“And now, let’s hear the same old arguments about why we should believe UFOs come from outer space.”
“Celebrity television psychic Sylvia Browne tells us in her new book.”

If you listen to Skeptoid, you know that Brian poisons the well all the time. It’s one of his favorite devices. But he does it obviously, for the entertainment value, and not as a serious attempt at argument.

Post hoc

Post hoc ergo propter hoc means “after this, therefore because of this”. This fallacy is similar to the confusion of correlation and causation. Post hoc arguments are often the parents of superstition.

“When I wear my lucky shirt, I do much better on tests.”
“The incidence of allergies has risen after the introduction of GMOs into the food supply. Therefore, GMOs have caused the increase in allergies.”

Many things happen all the time. Choosing two practically at random does not make for a strong argument.

Proof by Lack of Evidence

This one is big in the conspiracy theory world: The lack of evidence that would support their conspiracy theory is due to the evil coverup. Thus, the lack of evidence for the conspiracy is, in and of itself, evidence of the conspiracy.

Bombo: “The passengers on Flight 93 were taken off the plane and executed by the government.”
Starling: “But there’s no evidence of that.”
Bombo: “Exactly. That’s how we know it for a fact.”

There are certainly things in the world that are true but for which no evidence exists, but these are in the minority. If you want to be right more often than not, stick with what we can actually learn. If instead your standard is that anything that can’t be disproven must therefore be true, like Russell’s Teapot, you’re one step away from delusional paranoia.

Proof by Mommy Instinct

Made famous by anti-vaccine activist Jenny McCarthy, this one asserts that nobody understands health issues better than a mom. Mothers obviously have experience with childbirth and with raising children, but is there any reason to suspect they understand internal medicine (for example) better than educated doctors, many of whom are also mothers? Not so far as I am able to divine.

Remember that Mommy Instincts are no different than anecdotal experiences. They are driven by perception and presumption, not by science.

Proof by Verbosity

The practice of burying you with so much information and misinformation that you cannot possibly respond to it all is called proof by verbosity, or argumentum verbosium. To win a debate, I need not have any support for my position if I can simply throw so many things at you that you can’t respond to all of them.

This is the favorite device of conspiracy theorists. The sheer volume of random tidbits that they throw out there gives the impression of a position that has been thoroughly researched and is supported by many pillars of evidence. Any given tidbit is probably a red herring, but since there are so many of them, it would be hopeless (and fruitless) to respond intelligently to each and every one. Thus the argument appears to be impregnable and bulletproof. It may not be possible to construct a cogent argument using proof by verbosity, but it is very easy to construct one that is, for all practical purposes, irrefutable.

Proof by Victimization

Beware of claims from those lording their victimization over you. They may well have been victimized by something, be it an illness, a scam, even their own flawed interpretation of an experience. And in many cases, such a tragedy does give the victim insight that others wouldn’t have. But it doesn’t mean that person necessarily understands what happened or why it happened, and should not be taken as proof that they do.

Bombo: “My neighbor’s wifi network gave me chronic fatigue.”
Starling: “But that’s been disproven every time it’s been tested.”
Bombo: “You don’t know what you’re talking about; it didn’t happen to you.”

Victimization does not anoint anyone with unassailable authority on their particular subject.

Red Herring

A red herring is a diversion inserted into an argument to distract attention away from the real point. Supposedly, dragging a smelly herring across the track of a hunted fox would save him from the dogs by diverting their attention away from the real quarry. Red herrings are a favorite device of those who argue conspiracy theories:

Starling: “Man landed on the moon in 1969.”
Bombo: “But don’t you think it’s strange that Wernher von Braun went rock hunting in Antarctica only a few years before?”

Starling: “9/11 was perpetrated by Islamic terrorists.”
Bombo: “But don’t you think it’s strange that Dick Cheney had business contacts in the Middle East?”

Red herrings are fallacious because they do not address the point under discussion, they merely distract from it; but in doing so, they give the impression that the true cause lies elsewhere. The wrongful use of red herrings as a substitute for evidence is rampant, absolutely rampant, in conspiracy theory arguments.

Slippery Slope

A slippery slope argument presumes that some change will inevitably result in extreme exaggerated consequences. If I give you a cookie now, you’ll expect a cookie every five minutes, so I shouldn’t give you a cookie.

Starling: “It should be illegal to sell alternative therapies that don’t work.”
Bombo: “If that happened, any minority group could make it illegal to sell anything they don’t happen to like.”

No matter what Starling suggests, multiplying it by ten or a hundred is probably a poor proposition. Bombo can use a slippery slope argument to exaggerate any suggestion Starling makes into a recipe for disaster.

The slippery slope is probably the most common subset of the larger fallacy, argument from adverse consequences, which is the practice of inventing almost any dire consequences to your opponent’s argument:

Starling: “They should remove ‘Under God’ from the Pledge of Allegiance.”
Bombo: “If that happened, all hell would break loose. Students would have sex in the hallways, school shootings would skyrocket, and we would become a nation of Satan worshippers.”

Special Pleading

An argument by special pleading states that the justification for some claim is on a higher level of knowledge than your opponent can comprehend, and thus he is not qualified to argue against it. The most common case of special pleading refers to God’s will, stating that we are not qualified to understand his reasons for doing whatever he does. Special pleadings grant a sort of get-out-of-jail-free exemption to whatever higher power lies behind a claim:

Starling: “Homeopathy should be tested with clinical trials.”
Bombo: “Clinical trials are not adequate to test the true nature of homeopathy.”

No matter what Starling says, Bombo can claim that there is knowledge outside of Starling’s experience or at a level that Starling cannot comprehend, and the argument is therefore ended. Bombo might also point out that Starling lacks some professional qualification to discuss the topic, thus placing the topic out of Starling’s reach.

Bombo: “You’re not a trained homeopath, so you shouldn’t be expected to understand it.”

A special pleading makes no attempt to address the opponent’s point; it is just another diversionary tactic.

Statistics of Small Numbers

You really have to take a statistics class to understand statistics, and I think the part that would surprise most people is the stuff about sample sizes. Given a population of a certain size, how many people do you have to survey before your results are meaningful? I took half of a statistics class once and learned just enough to realize that practically every online poll you see on the web, or survey you hear on the news or read about in the newspaper, is mathematically worthless.

But it extends much deeper than surveys. Drawing conclusions from data sets that are too small to be meaningful is common in pseudoscience. Listen to Bombo make a couple of bad conclusions from invalid sample sizes (the sketch after these examples shows just how easily small samples mislead):

“I just threw double sixes. These dice are hot.”
“My neighbor’s a Mormon and he drinks wine, so I guess most Mormons don’t really follow the no-alcohol tradition.”
“I went to a chiropractor and I feel better, so chiropractic does work after all.”
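
Here is a minimal Python sketch, under the simplest assumptions (fair dice, and the standard 1/√n rule of thumb for a 95% survey margin of error), of why conclusions drawn from tiny samples are worthless:

    import random

    # If you judge dice "hot" from a single roll, fair dice will look
    # "hot" about 1 time in 36 (~2.8%) by pure chance.
    random.seed(42)
    trials = 100_000
    hot = sum(
        1 for _ in range(trials)
        if random.randint(1, 6) == 6 and random.randint(1, 6) == 6
    )
    print(f"Fair dice declared 'hot' in {hot / trials:.1%} of single rolls")

    # For surveys, a rough 95% margin of error is 1 / sqrt(n):
    for n in (1, 10, 100, 1000):
        print(f"n = {n}: margin of error ~ +/- {100 / n ** 0.5:.0f}%")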

Straw Man Argument

This fallacy is the most common and also one of the easiest to spot. This is where you state your position, and your opponent replies not to what you said, but to an exaggerated and distorted caricature of what you said that’s obviously harder to defend.

Starling says: “People who commit minor offenses should be let out of jail sooner.”
Bombo replies: “Emptying out all the jails would create havoc in society.”

Well, maybe Bombo’s right, but that’s not relevant, because “emptying the jails” is not what Starling advocated. In fact Bombo did not refute Starling’s point at all — he invented a different point that was easier to argue against. He created a straw man — one of those dummies stuffed with straw that soldiers use for bayonet practice. It’s too weak to fight back. And Bombo can then take satisfaction in having made a point that no reasonable person would argue with, and he appears to have successfully defeated Starling’s argument, when in fact he dodged it.

Weasel Words

Giving a controversial concept like creationism a new, more palatable name like Intelligent Design is what’s called the use of weasel words. Calling 9/11 conspiracies “9/11 Truth” is a weasel word; their movement is more interested in unlikely conspiracies than with truth, yet they give it a name that claims that’s what it’s all about.

Weasel words are a favorite of politicians. Witness the names of government programs that mean essentially the opposite of what they’re named: the Patriot Act, No Child Left Behind, Affirmative Action. By the way certain programs are named, it sounds like it would virtually be criminal to disagree with them.

Weasel words can also refer to sneaky wording in a sentence, like “It has been determined”, or “It is obvious that”, suggesting that some claim has support without actually indicating anything about the nature of such support.
