Debate Based on Total Lack of Logic

"We have now sunk to a depth at which re-statement of the obvious is the first duty of intelligent men." -- George Orwell

The above headline excerpt is from an article at LiveScience, but this post (like my earlier one, "Maybe Bill Maher was right...") is not about health care or Obama or Nazis. It is about logic -- critical thinking -- and why our brains just aren't terribly good at it. All of our brains. Not just those inside the skulls of people who "disagree" with us. Because how often are we even able to locate the precise nature of the "disagreement"? Writer Jeanna Bryner reports that sociologists and psychologists are studying why humans are such irrational creatures:

The problem: People on both sides of the political aisle often work backward from a firm conclusion to find supporting facts, rather than letting evidence inform their views.

The result, according to a private national survey, is that the so-called "debate" about health care (though this could apply to any subject) is based largely on misinformation or myths that confirm pre-existing beliefs, rather than on what is actually in proposed legislation.

Why is this? Because we're built to think irrationally. According to Steve Hoffman, a visiting professor of sociology at the University of Buffalo:

"People get deeply attached to their beliefs... We form emotional attachments that get wrapped up in our personal identity and sense of morality, irrespective of the facts of the matter."

And to keep our sense of personal and social identity, Hoffman said, we tend to use a backward type of reasoning in order to justify such beliefs.

Similarly, past research by Dolores Albarracin, a psychology professor at the University of Illinois at Urbana-Champaign, has shown in particular that people who are less confident in their beliefs are more reluctant than others to seek out opposing perspectives. So these people avoid counterevidence altogether....

As I am fond of saying (paraphrasing Michael Shermer), we are pattern-seeking animals. We will find patterns whether they're there or not, and whether they have any meaning or not. Nobody is immune. The best we can do is to make a conscious (and conscientious) effort to get past our preconceptions and home in on the core of the evidence. Ever had the feeling, when reading a movie review, that the writer had already formed an opinion before the movie started? Ever formed that opinion of a review because of the writer's verdict (or star rating) rather than what was actually written in the review? Ever formed an opinion about a movie yourself before you got a ticket? Yeah. We all do that sometimes. We can't help it.

This passage from another piece ("Why Aren't People Smarter?") at LiveScience gets to the core of the "cognitive limitations" we must labor to overcome:

The widespread failure to teach critical thinking (or even recognize its importance) is only part of the problem. Efforts to make our kids smarter will inevitably crash up against a biological barrier: Our brains are actually hardwired to hinder our attempts to think critically.

Critical thinking is often counterintuitive, and our brains are easily fooled.

Superstition and magical thinking come easily to us; we jump to conclusions without evidence; our biases and prejudices influence how we interpret the world. We see faces in clouds and patterns in events where they do not exist. Personal experience and vivid anecdotes are much more easily learned and remembered than facts. Our fears and emotions often override facts and logic (for example, the factual knowledge that air travel is very, very safe does little to calm many people's visceral fear of flying).

In a way, the better question is, should we expect people to be any smarter?

Critical thinking is a skill, and like any skill it can be taught, practiced, and improved upon. Expecting the average person to think logically and critically is like expecting the average person to play the piano or write a book. With study and practice, almost anyone can do it with some level of proficiency, but most people don't learn how to think critically or analytically--nor are they even aware of its value.

We see this all the time -- and indeed someone reading this post is likely to respond with a speculative ad hominem "argument" that ignores the pixels on the page and attributes a "hidden agenda" motive to the writer (me), such as the usual: "You think you're smarter than those who disagree with you!" Of course, there can be no rational response to that (beyond the obvious, "No, that isn't what I said at all"), because the nature of any "disagreement" has been completely ignored. (So don't expect me to publish it if you can't form a legitimate argument.)

As Orwell so famously said: "To see what is in front of one's nose needs a constant struggle."

(tip: Drew Morton)

UPDATE (8/27/09): Jonah Lehrer ("Proust Was a Neuroscientist," "How We Decide") uses the "fear of flying" example in a piece about insurance at The Frontal Cortex:

Why do people buy insurance? On the one hand, the act of purchasing insurance is an utterly rational act, dependent on the uniquely human ability to ponder counterfactuals in the distant future. What if a fire destroyed my house? What if my new car got totaled? What if I get cancer and require expensive medical treatments? We take this cognitive skill for granted, but it's actually profoundly rare.

And yet, the desire to purchase insurance is also influenced by deeply irrational forces, and the peculiar ways in which our emotions help us assess risks. The passionate nature of risk -- and the ways in which our passions lead to misperceptions -- shouldn't be too surprising. Look, for instance, at the very common fear of flying: More than 30 percent of people admit to being "scared or very scared" whenever they board a plane. And yet, according to the National Transportation Safety Board, flying on a commercial jetliner has a fatality rate of 0.04 per one hundred million passenger miles. In contrast, driving has a fatality rate of 0.86. This means that the most dangerous part of traveling on a commercial flight is the drive to the airport.

What does this have to do with insurance? Let's consider health insurance, which is the insurance debate du jour. America is that rare developed nation that 1) doesn't provide universal health insurance and 2) doesn't require individuals to purchase health insurance. (Massachusetts is the rare exception.) In other words, we trust ourselves to make the right decision when it comes to health insurance. The end result is that lots of healthy young people opt out of the system, choosing to accept the risk of illness in exchange for not having to pay some insanely expensive premiums.

[...]

It should be noted, of course, that society requires insurance for drivers precisely because it doesn't trust us to make the right decision. We all think we're above average drivers, so of course we won't get into an accident. But then we do and it's too late. I'm increasingly coming to believe that health insurance is the same way, and that we need society to save us from our own blinkered views of the future.

Again, Lehrer's not talking so much about the fear itself (discussed further in comments above) as about how fear colors our perceptions of risk. Most people underestimate their chances of getting sick (until there's a lot of media coverage about swine flu or SARS), and overestimate their chances of being in a plane crash. He's talking about the kind of emotional thinking that Vegas casinos, state lotteries and financial securities salespeople rely upon. We all know the odds are against us. And yet many of us are willing to put money (at least a relatively small amount) on the line in hopes of an astronomical reward.
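For the arithmetic-minded, here is a quick, purely illustrative sketch (in Python; not part of Lehrer's piece) that turns the two NTSB rates quoted above into the relative risk he's alluding to:

```python
# Back-of-the-envelope comparison of the fatality rates Lehrer cites,
# both expressed as deaths per one hundred million passenger miles.
FLYING_RATE = 0.04   # commercial jetliner (quoted above)
DRIVING_RATE = 0.86  # driving (quoted above)

relative_risk = DRIVING_RATE / FLYING_RATE
print(f"Mile for mile, driving is roughly {relative_risk:.1f} times riskier than flying.")
# Prints: Mile for mile, driving is roughly 21.5 times riskier than flying.
```

That ratio is the whole of the math behind the quip that the most dangerous part of a commercial flight is the drive to the airport.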

* * * *

The Skeptics Society offers a $5 Baloney Detection Kit that includes:

* Carl Sagan's Ten Tools for Baloney Detection and Michael Shermer's Ten Questions For Baloney Detection

* How Thinking Goes Wrong: The 25 Fallacies of Thinking

o Problems in Scientific Thinking
1. Theory Influences Observations
2. The Observer Changes the Observed
3. Equipment Constructs Results

o Problems in Pseudoscientific Thinking
4. Anecdotes Do Not Make a Science
5. Scientific Language Does Not Make a Science
6. Bold Statements Do Not Make True Claims
7. Heresy Does Not Equal Correctness
8. Burden of Proof
9. Rumors Do Not Equal Reality
10. Unexplained is Not Inexplicable
11. Failures are Rationalized
12. After-the-Fact Reasoning
13. Coincidence
14. Representativeness

o Logical Problems in Thinking
15. Emotive Words and False Analogies
16. Ad Ignorantiam
17. Ad Hominem and Tu Quoque
18. Hasty Generalization
19. Over-reliance on Authorities
20. Either-Or
21. Circular Reasoning
22. Reductio ad Absurdum and the Slippery Slope

o Psychological Problems in Thinking
23. Effort Inadequacies
24. Problem-Solving Inadequacies
25. Ideological Immunity, or The Planck Problem

* Eight Sample Syllabi: How to Teach a Course in Science & Pseudoscience

* The Most Recommended Skeptical Books

* Science and Skepticism: Science, Scientific Method and Skepticism -- How They Contribute to Rational and Critical Thinking
