If there were a verb meaning 'to believe falsely', it would not have any significant first person, present indicative.
In the South Seas there are 'cargo cults'. The most widely known period of cargo cult activity occurred among the Melanesian islanders in the years during and after World War II.
The indigenous societies of Melanesia are typically characterised by a 'Big Man' political system in which individuals gain prestige through gift exchanges. The more wealth a man can distribute, the more people come to be in his debt, and the greater his renown. Those who are unable to reciprocate are identified as 'Rubbish Men'.
During the war, foreign troops had arrived on the islands with enormous quantities of material goods. As suddenly as the foreigners had appeared, however, they disappeared, and the flow of riches with them. Thinking they had fallen out of favour with the gods, the islanders decided to mimic the foreigners in the hope of bringing the cargo back.
[Image from Dr Joe Hanson]
The physicist Richard Feynman continues the story: "So they arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head for headphones, and bars of bamboo sticking out like antennas—he's the controller—and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. [...]
"Now it behooves me, of course, to tell you what they're missing. But it would be just about as difficult to explain to the South Sea islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones."
Cargo cults arise from the apparent belief that various ritualistic acts lead to a bestowing of material wealth ('cargo'). They tend to develop during times of social stress, and often under the leadership of a charismatic figure. It is likely that this leader has had a 'vision' (or 'myth-dream') of the future.
* * *
Feynman makes the point that he meets lots of people who sooner or later get him into a conversation about UFOs, or astrology, or some form of mysticism, expanded consciousness, new types of awareness, ESP, and so forth. And he's concluded that it's not a very scientific world.
"Most people believe so many wonderful things," he explains, "that I decided to investigate why they did. And what has been referred to as my curiosity for investigation has landed me in a difficulty where I found so much junk that I became overwhelmed."
Anyway, he spent a bit of time looking at various ideas of mysticism and mystic experiences, and at extrasensory perception, and PSI phenomena, and so on, and after a while he began to think, what else is there that we believe? And he found things that lots and lots of people believe, such as the best way to treat criminals. Bearing in mind that Feynman was speaking in 1974, he says: "We obviously have made no progress—lots of theory, but no progress—in decreasing the amount of crime by the method that we use to handle criminals."
He continues: "Yet these things are said to be scientific. We study them. And I think ordinary people with commonsense ideas are intimidated by this pseudoscience. A teacher who has some good idea of how to teach her children to read is forced by the school system to do it some other way—or is even fooled by the school system into thinking that her method is not necessarily a good one. Or a parent of bad boys, after disciplining them in one way or another, feels guilty for the rest of her life because she didn't do 'the right thing', according to the experts.
"So we really ought to look into theories that don't work, and science that isn't science."
This pseudoscience—this science that isn't science—is what Richard Feynman called Cargo Cult Science. And there is one feature of genuine science that is generally missing in Cargo Cult Science, he says.
"That is the idea that we all hope you have learned in studying science in school—we never say explicitly what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It's a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you're doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you've eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.
"Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can—if you know anything at all wrong, or possibly wrong—to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. [...]
"In summary, the idea is to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgement in one particular direction or another.
"The easiest way to explain this idea is to contrast it, for example, with advertising. Last night I heard that Wesson oil doesn't soak through food. Well, that's true. It's not dishonest; but the thing I'm talking about is not just a matter of not being dishonest; it's a matter of scientific integrity, which is another level. The fact that should be added to that advertising statement is that no oils soak through food, if operated at a certain temperature. If operated at another temperature, they all will—including Wesson oil. So it's the implication which has been conveyed, not the fact, which is true, and the difference is what we have to deal with.
"We've learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature's phenomena will agree or they'll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven't tried to be very careful in this kind of work. And it's this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in Cargo Cult science. [...]
"We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It's a little bit off because he had the incorrect value for the viscosity of air. It's interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan's, and the next one's a little bit bigger than that, and the next one's a little bit bigger than that, until finally they settle down to a number which is higher.
"Why didn't they discover the new number was higher right away? It's a thing that scientists are ashamed of—this history—because it's apparent that people did things like this: When they got a number that was too high above Millikan's, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number close to Millikan's value they didn't look so hard. And so they eliminated the numbers that were too far off, and did other things like that. We've learned those tricks nowadays, and now we don't have that kind of a disease.
"But this long history of learning how to not fool ourselves—of having utter scientific integrity—is, I'm sorry to say, something that we haven't specifically included in any particular course that I know of. We just hope you've caught on by osmosis.
"The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you've not fooled yourself, it's easy not to fool other scientists. You just have to be honest in a conventional way after that. [...]
"Other kinds of errors are more characteristic of poor science. When I was at Cornell, I often talked to the people in the psychology department. One of the students told me she wanted to do an experiment that went something like this: It had been found by others that under certain circumstances, X, rats did something, A. She was curious as to whether, if she changed the circumstances to Y, they would still do A. So her proposal was to do the experiment under circumstances Y and see if they still did A.
"I explained to her that it was necessary first to repeat in her laboratory the experiment of the other person—to do it under condition X to see if she could also get result A, and then change to Y and see if A changed. Then she would know that the real difference was the thing she thought she had under control.
"She was very delighted with this new idea, and went to her professor. And his reply was, No, you cannot do that, because the experiment has already been done, and you would be wasting your time. [...]