A cognitive bias is a systematic pattern of deviation from optimal reasoning or judgment that arises from the architecture of our brains. Our brains evolved under various selective pressures, including predation and resource and energy constraints. Genes that most closely approximated the optimal trade-off between accuracy and speed or energy consumption survived and multiplied, while the rest produced organisms either too prone to error or too slow and cumbersome.
If we took every piece of data and every variable into account before forming a prediction or making a decision, we would be effectively paralyzed. Yet we must be able to make relatively accurate predictions in order to survive in a dangerous world. One example of a heuristic our brains use to approximate optimal reasoning is the availability heuristic.
Humans have poor intuitions regarding statistics and probability. When we think of the likelihood of an occurrence, we do not perform a statistical analysis; we remember anecdotes, stories and examples that we have recently encountered. This is known as the availability heuristic. The more easily we can visualize an event, the greater the likelihood we attach to it. David McRaney of You Are Not So Smart explains:
"School shootings were considered to be a dangerous new phenomenon after Columbine. That event fundamentally changed the way kids are treated in American schools, and hundreds of books, seminars and films have been produced in an attempt to understand the sudden epidemic.

"The truth, however, was there hadn’t been an increase of school shootings. During the time when Columbine and other school shootings got major media attention, violence in schools was down over 30 percent. Kids were more likely to get shot in school before Columbine, but the media during that time hadn’t given you many examples.

"A typical school kid is three times more likely to get hit by lightning than be shot by a classmate, yet schools continue to guard against it as if it could happen at any second.

"When you buy a lottery ticket, you imagine yourself winning like those people on television who get suddenly famous when their numbers are chosen, but you are far more likely to die in a car crash on the way to buy the ticket than you are to win."
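To get a feel for the scale of the mismatch, McRaney's lottery comparison can be sanity-checked with a back-of-envelope calculation. The jackpot odds, fatality rate, and trip length below are rough order-of-magnitude assumptions chosen for illustration, not precise statistics:

```python
# Back-of-envelope check of the lottery comparison.
# All three figures are rough assumptions for illustration only.
P_JACKPOT = 1 / 292_000_000              # approx. odds of a Powerball-style jackpot
FATALITIES_PER_MILE = 1.1 / 100_000_000  # approx. US traffic deaths per vehicle-mile
TRIP_MILES = 4                           # assumed round trip to the store

p_crash_death = TRIP_MILES * FATALITIES_PER_MILE

print(f"P(win jackpot)      = {P_JACKPOT:.2e}")
print(f"P(die on the drive) = {p_crash_death:.2e}")
print(f"the drive is roughly {p_crash_death / P_JACKPOT:.0f}x more likely to kill you")
```

Even with generous assumptions, the short drive comes out roughly an order of magnitude more dangerous than the ticket is lucky, yet the win is far easier to visualize.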
The availability heuristic does a passable job of approximating statistical analysis; after all, the greater the frequency of an event the more likely an individual is to have experienced or heard examples of it. However, this heuristic sometimes ceases to approximate statistical analysis. Such is the case with news reporting.
The problem with news reporting is that what is reported is not a representative sample. News is reported because it is interesting, and interesting generally means unusual. Modern news media can draw on an entire world of events: we are exposed to kidnappings in Europe, corruption in China, and terrorism in the Middle East.
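This skewed-sampling argument can be illustrated with a toy simulation. All event names, rates, and reporting weights below are made-up assumptions; the point is only the mechanism: when memory is a fair sample of what happens, the availability heuristic estimates frequencies reasonably well, but when recall is filtered through what gets reported, rare dramatic events dominate the estimate.

```python
# Toy model of the availability heuristic as a frequency estimator.
# All event names, rates, and weights are illustrative assumptions.
import random

random.seed(0)

# True daily probabilities of two kinds of events (made-up numbers).
TRUE_RATES = {"car_crash": 0.010, "shark_attack": 0.0001}

def world_events(n_days):
    """Generate the events that actually happen over n_days."""
    events = []
    for _ in range(n_days):
        for event, p in TRUE_RATES.items():
            if random.random() < p:
                events.append(event)
    return events

def estimate_from_memory(remembered):
    """Estimate relative frequencies from whatever examples come to mind."""
    counts = {e: remembered.count(e) for e in TRUE_RATES}
    total = sum(counts.values()) or 1
    return {e: c / total for e, c in counts.items()}

events = world_events(100_000)

# Unbiased recall: memory is a fair sample of what actually happened.
fair_sample = random.sample(events, min(500, len(events)))

# Media-filtered recall: rare, dramatic events are far more likely to be
# reported and therefore remembered (the weight is an assumption).
REPORT_WEIGHT = {"car_crash": 1, "shark_attack": 100}
biased_sample = random.choices(
    events, weights=[REPORT_WEIGHT[e] for e in events], k=500
)

print("fair recall:  ", estimate_from_memory(fair_sample))
print("media recall: ", estimate_from_memory(biased_sample))
```

With fair recall the estimated share of shark attacks stays near their true share of around one percent; with media-weighted recall it balloons, even though nothing about the underlying world changed.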
There are benefits to being informed about important events around the world, and this is not a general argument against such exposure. It is undoubtedly the case, however, that our beliefs about the frequency of rare events are drastically skewed relative to their actual frequency, especially compared to events that aren't considered 'newsworthy'. One need only look at the vaccination scares, terrorism paranoia and other 'epidemics' that flare up regularly for examples of the availability heuristic gone haywire.
There is no quick and easy solution to the problems caused by systematic cognitive biases. Scholarship is not enough; learning about cognitive bias will not magically remove it. After all, these heuristics are the product of our particular cognitive machinery, not choices that we knowingly make. However, acknowledging our bias is the first step toward improvement. Learning the ways in which our biases affect our judgment, and the situations in which they are most prominent, is the second step. Noticing when our judgment is likely affected, and actually implementing strategies to compensate, is one of the most important skills a rationalist can develop. It is only when we can overcome bias that the goal of epistemic rationality, a map that reflects the territory, can be attained.
For more information on specific biases and the conditions under which they occur, see Kahneman, Slovic and Tversky (eds.), "Judgment under Uncertainty: Heuristics and Biases". The relevant fields are cognitive psychology, cognitive science and behavioral economics. For a basic overview, read through the archives of You Are Not So Smart. A more technical treatment can be found at LessWrong.
For more on the topic of evolution, I recommend Richard Dawkins's "The Selfish Gene" and "The Blind Watchmaker", as well as any introductory evolutionary psychology textbook.