Friday, April 22, 2011

Cognitive Bias and the Availability Heuristic

A cognitive bias is a systematic pattern of deviation from optimal reasoning or judgment caused by the architecture of our brains. Our brains evolved under various selective pressures, including predation and resource/energy constraints. Genes that most closely approximated the optimal trade-off between accuracy and speed/energy consumption survived and multiplied, while the rest produced organisms either too prone to error or too slow and cumbersome.

If we were to take every piece of data and every variable into account before forming a prediction or making a decision, we would be effectively paralyzed. However, we must be able to make relatively accurate predictions in order to survive in a dangerous world. One example of a heuristic our brains use to approximate optimal reasoning is the availability heuristic.

-

Humans have poor intuitions about statistics and probability. When we estimate the likelihood of an occurrence we do not perform a statistical analysis; we recall anecdotes, stories, and examples that we have recently encountered. This is known as the availability heuristic: the more easily we can recall or visualize an event, the greater the likelihood we attach to it. David McRaney of You Are Not So Smart explains:

"School shootings were considered to be a dangerous new phenomenon after Columbine. That event fundamentally changed the way kids are treated in American schools, and hundreds of books, seminars and films have been produced in an attempt to understand the sudden epidemic.
The truth, however, was there hadn’t been an increase of school shootings. During the time when Columbine and other school shootings got major media attention, violence in schools was down over 30 percent. Kids were more likely to get shot in school before Columbine, but the media during that time hadn’t given you many examples.
A typical school kid is three times more likely to get hit by lightning than be shot by a classmate, yet schools continue to guard against it as if it could happen at any second.
When you buy a lottery ticket, you imagine yourself winning like those people on television who get suddenly famous when their numbers are chosen, but you are far more likely to die in a car crash on the way to buy the ticket than you are to win."

-

The availability heuristic does a passable job of approximating statistical analysis; after all, the more frequent an event, the more likely an individual is to have experienced or heard examples of it. The approximation breaks down, however, when the examples we encounter stop tracking actual frequencies. Such is the case with news reporting.

The problem with news reporting is that what gets reported is not a representative sample of what happens. News is reported because it is interesting, and that generally means unusual. Modern news media have an entire world of events to choose from: we are exposed to kidnappings in Europe, corruption in China, and terrorism in the Middle East.
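To make that sampling problem concrete, here is a minimal simulation sketch in Python. The event names, daily rates, and coverage probabilities are made-up illustrative assumptions, not real statistics; the point is only that counting the events we hear about can paint a very different picture than counting the events that actually occur.

```python
import random

# Illustrative, made-up numbers -- not real statistics.
TRUE_DAILY_RATES = {          # chance the event happens on a given day
    "car crash": 0.20,        # common, but rarely makes the news
    "shark attack": 0.0002,   # rare, but almost always makes the news
}
COVERAGE = {                  # chance an occurrence is reported (and remembered)
    "car crash": 0.01,
    "shark attack": 0.95,
}

def simulate(days=100_000, seed=1):
    """Count how often each event occurs vs. how often we hear about it."""
    rng = random.Random(seed)
    occurred = dict.fromkeys(TRUE_DAILY_RATES, 0)
    recalled = dict.fromkeys(TRUE_DAILY_RATES, 0)
    for _ in range(days):
        for event, rate in TRUE_DAILY_RATES.items():
            if rng.random() < rate:                 # the event actually happens
                occurred[event] += 1
                if rng.random() < COVERAGE[event]:  # ...and it gets reported
                    recalled[event] += 1
    return occurred, recalled

def shares(counts):
    """Express counts as a fraction of the total, for easy comparison."""
    total = sum(counts.values())
    return {k: round(v / total, 4) for k, v in counts.items()}

occurred, recalled = simulate()
print("share of events that occurred:  ", shares(occurred))
print("share of events we heard about: ", shares(recalled))
# The rare-but-newsworthy event makes up a far larger share of what we
# hear about (and can later recall) than of what actually happened.
```

Judging frequencies from the "heard about" counts rather than the "occurred" counts is, in effect, what the availability heuristic does when its examples come from the news.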

There are benefits to being informed about important events around the world, and this is not a general argument against such exposure. However, it is undoubtedly the case that our beliefs about the frequency of rare events are drastically skewed relative to their actual frequency, especially compared to events that aren't considered 'newsworthy'. One need only look at the vaccination scares, terrorism paranoia, and other 'epidemics' that flare up regularly for examples of the availability heuristic gone haywire.

 
-

There is no quick and easy solution to the problems caused by systematic cognitive biases. Scholarship is not enough; learning about cognitive bias will not magically remove it. After all, these heuristics are the result of our particular cognitive machinery, not choices that we knowingly make. However, acknowledging our biases is the first step toward improvement. Learning the ways in which our biases affect our judgment, and the situations in which they are most prominent, is the second step. Noticing when our judgment is likely affected, and actually implementing strategies to compensate, is one of the most important skills a rationalist can develop. It is only when we can overcome bias that the goal of epistemic rationality, the map that reflects the territory, can be attained.


For more information on specific biases and the conditions under which they occur, see Kahneman, Slovic, and Tversky's Judgment under Uncertainty: Heuristics and Biases. The relevant fields are cognitive psychology, cognitive science, and behavioral economics. For a basic overview, read through the archives of You Are Not So Smart. A more technical treatment can be found at LessWrong.
For more on the topic of evolution I recommend Richard Dawkins's The Selfish Gene and The Blind Watchmaker, as well as any introductory evolutionary psychology textbook.

5 comments:

  1. I just don't get how this relates to real life for the majority of people. Certain people may have brains that go in the direction yours does, that care to contemplate these scholarly concepts; but most people just don't get this stuff, or even want to understand it... I think you know this. You have a brain that is wired for this type of information, amazingly and uncommonly wired for it... most of us are simpler in our thoughts, and don't have the patience or interest in trying to comprehend what seems to come so naturally to you. I hope you will have patience and tolerance for us, and for our choice to not delve into concepts that we don't feel a need to understand to live our lives. I can hardly comprehend how this comes so easily for you, and how you seem to understand so well what you write about... it is Greek to me! and I hope you can forgive me for that. But to me it isn't ignorance to not understand it, because it adds nothing to make my life meaningful...

  2. Great post, Scott.
    I've thought before about how emotional impact makes us raise the probability of an event but never really thought about it as a visualization issue. Thanks for that.
    Curiously, though, from a group perspective amplifying probabilities could be helpful. Think about how many businesses wouldn't be started if people were more grounded in actual probabilities; and yet for society, it is great to have new businesses. On a similar note, it might help with death rates if mothers were more paranoid about the likelihood of a danger.

  3. interesting scotty :) when you coming over for dinner?

  4. scott, are you keeping up with your blogging?

    Like the post; however, the wording, like Stacy is saying, doesn't really relate, and although you know what you're talking about... if you aren't able to relate to others then what good does it do them?

    If that means relating in the form of stories then so be it, you have to use that to your advantage if you know that the brain is able to get concepts, beliefs, and "life lessons" more easily in the form of story and parable, then use it.

    Merely coming from an intellectual point of view only connects with those on that wavelength, but it cuts out pretty much everyone else from the loop. Why not try and put it like you and I discussing things? Being able to describe something complex in simple, easy-to-grasp terms is a good tool to have.

    -Nick

  5. Nick, thanks for the feedback. Expect more examples and applications to real life situations in the future.

    Ron, thanks for the response, and I hope you continue to read and comment on my blog. You make a very interesting point, and I actually plan on using it to illustrate a concept in my next post. Do you mind if I reference your comment?
