
Sunday, October 04, 2009

why we believe everything we read

From PsyBlog:

What is the mind's default position: are we naturally critical or naturally gullible? As a species do we have a tendency to behave like Agent Mulder from the X-Files who always wanted to believe in mythical monsters and alien abductions? Or are we like his partner Agent Scully who was the critical scientist, generating alternative explanations, trying to understand and evaluate the strange occurrences they encountered rationally?


This is not a new question. Four centuries ago, René Descartes and Baruch Spinoza disagreed about whether we can understand anything without first believing it. Descartes argued that as we take in information, we evaluate its truthfulness. Spinoza claimed that we believe whatever we hear or see, and can only re-evaluate it afterward.

PsyBlog notes that most people prefer Descartes' model:

Descartes' view is intuitively attractive and seems to accord with the way our minds work, or at least the way we would like our minds to work.

Spinoza's approach is unappealing because it suggests we have to waste our energy rooting out falsities that other people have randomly sprayed in our direction, whether by word of mouth, TV, the internet or any other medium of communication.


But the important question is not which view is more appealing; it's which view is right. An experiment in the early 1990s set out to test them.

Daniel Gilbert and colleagues put these two theories head-to-head in a series of experiments to test whether understanding and belief operate together or whether belief (or disbelief) comes later (Gilbert et al., 1993).

In their classic social psychology experiment, seventy-one participants read statements about two robberies and then recommended a jail sentence for each robber. Some of the statements were designed to make the crime seem worse, for example that the robber had a gun, and others to make it seem less serious, for example that the robber had starving children to feed.

The twist was that only some of the statements were true, while others were false. Participants were told that true statements would be displayed in green type and false ones in red. Here's the clever bit: half the participants were deliberately distracted while reading the false statements; the other half weren't.


The idea was that distracted people don't have as much time to evaluate the statements. If, as Descartes suggested, we automatically evaluate information as we take it in, the distraction shouldn't affect the participants' perceptions of the crime: false statements, written in red, would be automatically discarded. But if Spinoza was right, the distraction would interfere with the processing of the statements. The participants might not have time to evaluate truthfulness based on the color of the text, and would simply believe everything they read.

The results showed that when the false statements made the crime seem much worse rather than less serious, the participants who were interrupted gave the criminals almost twice as long in jail, up from about 6 years to around 11 years.

By contrast, the participants who hadn't been interrupted managed to ignore the false statements. Consequently, there was no significant difference in their jail terms whether the false statements made the crime seem worse or less serious.

This meant that only when given time to think about it did people behave as though the false statements were actually false. On the other hand, without time for reflection, people simply believed what they read.


Spinoza was right. We are all Agent Mulder.

But that's not necessarily bad, claims Gilbert. There are some good reasons for defaulting to belief rather than skepticism:

The problem is that a lot of the information we are exposed to is actually true, and some of it is vital for our survival. If we had to go around checking our beliefs all the time, we'd never get anything done and would miss out on some great opportunities.

Minds that work on a Spinozan model, however, can happily believe as a general rule of thumb, then check out anything that seems dodgy later. Yes, they will often believe things that aren't true, but it's better to believe too much and be caught out once in a while than be too cynical and fail to capitalise on the useful and beneficial information that is actually true.
