When you're asked to make a decision after hearing "weak" evidence (that is, accurate information that only marginally raises the probability of an outcome), you're more likely to get the decision wrong, and to become more pessimistic about the outcome that evidence supports, according to experiments by cognitive scientist Philip Fernbach.
The idea is a little counterintuitive, so let's unpack it. Say, for example, you hear that your local paper is backing a candidate you want elected. The endorsement certainly doesn't decrease the candidate's chances of winning, but it's still relatively weak evidence that she will. Yet if it's the most recent piece of evidence you've heard, you'll probably become less optimistic that she'll win the election, because you're fixating on how weak that particular piece of evidence is rather than weighing everything else you know. And that's obviously not a good way to make decisions.
"It turns out that if you give people some evidence that is positive but weak, then actually they focus too much on that piece of positive evidence" and are less likely to incorporate other facts in their decision, Fernbach said in a telephone interview.
The takeaway: this appears to be a hard-wired decision-making bias, so when you're weighing an important decision, it may help to step back and ask which piece of evidence you're focusing on most, and whether you're overemphasizing a weak one.
Faced With Evidence, We Still Get It Wrong [ABC News]