"Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment (of a target attribute) that is computationally complex, and instead substitutes a more easily-calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realising that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them."
As Nick Chater, Professor of Behavioural Science at Warwick University, points out: "If you can find a simple attribute, which you can quantify and is easy to bring to mind, then you'll use that as a short-cut—often a reasonably good short-cut—for answering a more tricky, broad, complicated question."
He goes on to explain: "I think we should be more respectful of the miracle that is the human brain. The problems that the world faces us with in understanding the commonsense world around us and the perceptual world around us are vastly complicated and they need to be simplified, fairly dramatically, to make any progress at all. As soon as you try to build a robot which can run about, say, or recognise the things around it, or speak and be understood, you realise how enormously difficult it is to deal with the complexity of the real world. So I think these simplifications—these short-cuts—although they do introduce biases—are absolutely essential; they're a central part of human intelligence. They're what allows us to behave effectively and successfully in a world that we can't possibly analyse in full detail."
When making decisions, would it be possible to put aside all the emotional stuff and focus instead only on the science? Dr Tamsin Edwards from the University of Bristol:
"We're humans, and we have to do what we think is the right thing to do. We have to make decisions somehow. Science can't give the answer for what to do; it can only say, 'Here is the evidence for what would happen if we did this, and here is the evidence for what would happen if we did that.' So that's not an answer: that's just a set of predictions. So you need to have humans and their emotions and their priorities and their values."
Dr Edwards explains that we have these ways of processing the information we get from different people: do we trust them? is it what we expect? are we retrenching into our own views because we've been challenged? Also, what sort of people are we: are we optimistic people or pessimistic people? do we trust authority? All of these things often combine together in our minds and help us to build up a coherent picture of how we view the science—and what we think we should do about it.
The programme's presenter, Michael Blastland, then interviewed Professor Dan Kahan from Yale University, "a lawyer with a keen interest in how different groups convince themselves that the experts are on their side" (whichever side that happens to be). The remarkable thing is, whatever the conflict, in none of them do you ever hear one side saying, "Who cares what the experts know? I don't believe in science." No, those on both sides are convinced that the position that predominates in their group is consistent with the expert view.
The problem is a tendency to search for or interpret information in a way that confirms our preconceptions — which is to say, confirmation bias. It's at this point that we put The Human Zoo to one side, and pick up instead Derren Brown's Tricks of the Mind.
"In one classic experiment," he reports, "two groups of students were arranged, one made of people who believed in the death penalty as an effective crime deterrent, and one opposed to it. Both were given two studies regarding the efficacy of the penalty, one for and one against. They were asked to evaluate the studies. Both groups were predictably more critical of the study that opposed their view, and more interestingly decided that the study with which they agreed was 'better conducted' and 'more convincing'."
He continues: "The all-too-common extreme of this sort of bias, though, is circular reasoning. This is the fallacy of the True Believer. The True Believer is impervious to real-world evidence because he just ignores anything that doesn't fit into his belief system. Instead he notices everything that matches and supports his beliefs, and inevitably comes to hold those beliefs at a very profound level. They can become absolutely part of his identity. It is this that brings together the religious, the psychic, the cynic (as opposed to the open sceptic) and the narrow-minds of all kinds.
"The case is, one can be a True Believer in anything: psychic ability, Christianity or, as Bertrand Russell classically suggested (with irony), in the fact that there is a teapot orbiting the earth. I could believe any of those things with total conviction. But my conviction doesn't make them true. Indeed, it is something of an insult to the very truth you might hold dear to say that something is true just because I really, really believe it."
Christopher Hitchens said: "What can be asserted without evidence can be dismissed without evidence." Put more bluntly, "Fact without theory is trivia; theory without fact is bullshit." Thus, it is the role of the believer to provide the evidence for his claims, and not the role of the non-believer to prove that the believer is wrong. To return to Bertrand Russell, nobody can prove that there isn't a teapot orbiting the earth, and nobody should be expected to either. If people believe that and want me to believe it too, it's up to them to show me the evidence.
Derren Brown again: "Whereas non-scientific thinking starts with a premise and then looks for things that support it, scientific thinking constantly tries to disprove itself. That alone makes all the difference. A scientist comes up with a premise: A causes B. Rather than look for all the cases where A causes B to support his premise, he starts trying to disprove that A causes B. Then, if after rigorous attempts to prove himself wrong it seems to hold up that A does indeed cause B, he'll publish his results. Now it's up to his peers to check and double check his findings. They will probably want to run their own experiments, to see if they replicate the results or disprove for themselves that A causes B. If that scientist has conducted bad experiments, or if his results are shown to be faulty, his reputation will suffer enormously."
That's the idea, at any rate. It's a system that tries to get out of the head of the scientist with his prejudices and values, and to see what seems to happen reliably in the world regardless of personal conviction or ideology.
* * *
Dom Nozzi has written, "For most all bicycling advocates, there is a single-minded tactic for increasing the number of bicyclists."
Image from Drawing Rings blogspot.
This aside, if I had to sum up this "single-minded tactic" in just a sentence, I could do worse, I think, than quote Danny Williams from Cyclists in the City: "If you're going to build a bicycle transport network, then build it properly."
Who could argue with this? It's eminent common sense! And yet—it makes me feel a bit awkward to say this—it's not evidence. It does feel intuitively right, but this does not necessarily mean that it is in fact correct.
Let me give you an example. You are standing on the edge of a small platform which is located in the middle of an impossibly large lake. The surface of the water is like glass: there isn't a puff of wind in the air. In one hand you have a pistol, and in the other hand you have a bullet. You hold out both hands in front of you. At exactly the same moment, you drop the bullet and fire the gun. Which of the bullets is going to hit the water first?
Intuition would tell you that the bullet dropped from the hand would hit the water first, but in fact the same single force acts on both bullets, and that is gravity: the horizontal speed imparted by the gun doesn't affect the vertical motion. So, ignoring air resistance, they would both hit the water at exactly the same time.
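The kinematics can be sanity-checked in a few lines. This is just a sketch: the release height and the value of g are assumed for illustration, and air resistance is ignored, as above. The point is that the fall time depends only on the height, never on horizontal speed.

```python
import math

G = 9.81   # gravitational acceleration in m/s^2 (assumed)
H = 1.5    # assumed release height above the water, in metres

def fall_time(height, g=G):
    """Time to fall `height` metres starting with zero vertical speed.

    From h = (1/2) * g * t^2, so t = sqrt(2h / g). Horizontal velocity
    does not appear anywhere in the vertical equation of motion.
    """
    return math.sqrt(2 * height / g)

t_dropped = fall_time(H)   # the bullet dropped from the hand
t_fired = fall_time(H)     # the fired bullet: same height, same gravity
print(t_dropped == t_fired)   # both hit the water at the same instant
```

The fired bullet's muzzle velocity simply never enters the calculation, which is the whole counter-intuitive point.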
Oh, you knew that one already, did you? Okay, try this one. Now, first answer, please, and as quick as you can. Fish and chips cost £2.90. The fish costs £2.00 more than the chips. How much do the chips cost?
You knew that one as well? My Gosh, you are a bright little thing, aren't you? I got them both wrong, by the way. Anyway, it doesn't much matter. The point is that intuition can sometimes lead us astray.
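For the record, the fish-and-chips puzzle falls to a line of algebra rather than to intuition. The two equations below are stated in the puzzle itself; only the working is added here.

```python
# The puzzle as simultaneous equations:
#   fish + chips = 2.90
#   fish = chips + 2.00
# Substituting the second into the first:
#   (chips + 2.00) + chips = 2.90  =>  chips = (2.90 - 2.00) / 2
total = 2.90
difference = 2.00
chips = (total - difference) / 2   # 0.45 -- not the intuitive 0.90
fish = chips + difference          # 2.45
print(f"chips = £{chips:.2f}, fish = £{fish:.2f}")
```

The intuitive answer, 90p, fails the check: it would make the fish £2.90 and the total £3.80.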
I did a Google search for "If you're going to build it, then build it properly." That was just weird. Adding the word "Quotes" at the start yielded equally bizarre results.
It turns out this is just management-speak, and stems from the desire that all of the processes that make up the just-in-time philosophy be done correctly and efficiently, so that there are no delays in the production process.
* * *
You think I like being the only one who says, "Network first"? I don't.