Since I’ve been talking about the “science” of ghost hunting, I thought that now would be a good time to have a brief interlude and discuss how to do good science.
The scientific method works by gathering evidence or data, building a model of it, and then trying out different hypotheses to explain what is going on. A good hypothesis makes testable predictions, and the successes or failures of those tests are used to modify or even discard the hypothesis.
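To make that loop concrete, here is a toy sketch in Python. All of the numbers and the "drafty window" hypothesis are invented for illustration; the point is only that a hypothesis earns its keep by making a prediction the data can confirm or refute.

```python
import statistics

# Toy data, invented for illustration: hourly temperature readings
# (deg C) logged overnight in a "haunted" room.
readings = [18.2, 18.0, 17.9, 18.1, 14.5, 18.0, 17.8, 18.2]

# Hypothesis: "the cold spot is just a draft from a leaky window."
# It predicts the anomaly should appear at a specific hour (index 4,
# when the wind picks up) -- a testable, falsifiable claim.
predicted_cold_hours = {4}

mean = statistics.mean(readings)
spread = statistics.stdev(readings)
observed_cold_hours = {
    hour for hour, temp in enumerate(readings)
    if temp < mean - 2 * spread  # unusually cold reading
}

# If the prediction fails, the hypothesis is modified or discarded;
# if it holds, it survives to face the next night's data.
print(observed_cold_hours == predicted_cold_hours)  # → True
```

Note that the hypothesis could have failed: had the cold readings shown up at a different hour, the "draft" explanation would have to be revised or thrown out.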
The biggest weakness of the scientific method is human bias. Good scientists guard against bias by setting up each experiment very carefully, by submitting to peer review, and by being scrupulously honest in reporting any possible reason why their experiment might be flawed.
In the words of Nobel Prize-winning physicist Dr. Richard Feynman, from his lecture “Cargo Cult Science”:
It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty — a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid — not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked — to make sure the other fellow can tell they have been eliminated.
Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can — if you know anything at all wrong, or possibly wrong — to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.
In summary, the idea is to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.
The easiest way to explain this idea is to contrast it, for example, with advertising. Last night I heard that Wesson oil doesn’t soak through food. Well, that’s true. It’s not dishonest; but the thing I’m talking about is not just a matter of not being dishonest; it’s a matter of scientific integrity, which is another level. The fact that should be added to that advertising statement is that no oils soak through food, if operated at a certain temperature. If operated at another temperature, they all will — including Wesson oil. So it’s the implication which has been conveyed, not the fact, which is true, and the difference is what we have to deal with.
We’ve learned from experience that the truth will come out. Other experimenters will repeat your experiment and find out whether you were wrong or right. Nature’s phenomena will agree or they’ll disagree with your theory. And, although you may gain some temporary fame and excitement, you will not gain a good reputation as a scientist if you haven’t tried to be very careful in this kind of work. And it’s this type of integrity, this kind of care not to fool yourself, that is missing to a large extent in much of the research in cargo cult science.
The first principle is that you must not fool yourself — and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.
“The first principle is that you must not fool yourself…” Here is where I think that ghost hunters fail miserably.
Ghost hunting is inherently a practice with a set goal in mind. Ghost hunters don’t go to a house intending to catalog the phenomena occurring there (the creaks, footsteps, rattles, and cold spots), build several models from that data, and then form a testable hypothesis they can use to predict future observations.
No. Ghost hunters go to hunt ghosts. The data they gather, regardless of its validity, is used to prove or disprove that a ghost exists on the property in question. The premise that “ghosts exist” is never itself in question.
Their entire methodology is biased and therefore suspect.
I can understand the attraction of ghost hunting. It’s exciting and fun. I’ve done it as a young adult, when I was much more credulous.
And there is also something in us that abhors the idea of someday ceasing to exist. Every adult has contemplated their own mortality, and the thought that our consciousness will someday come to an end is deeply disturbing to most of us. How much more comforting to think that our consciousness will continue after death? How much more comforting to think that we have proof – however tenuous – that we will continue?
But again, we must be scrupulously honest with ourselves; we must learn to think critically and to smell out when we are being fooled or conned… even when we are fooling ourselves.
Most scientists learn this ability to detect cons – even comforting cons – during the course of their training. Unfortunately, this knowledge is usually picked up through a sort of ‘osmosis’ and was rarely stated explicitly until Dr. Carl Sagan laid it out in his book, “The Demon-Haunted World”.
Dr. Sagan’s words have a special bearing on ghost hunters:
More than a third of American adults believe that on some level they’ve made contact with the dead. The number seems to have jumped by 15 percent between 1977 and 1988. A quarter of Americans believe in reincarnation.
But that doesn’t mean I’d be willing to accept the pretensions of a “medium,” who claims to channel the spirits of the dear departed, when I’m aware the practice is rife with fraud. I know how much I want to believe that my parents have just abandoned the husks of their bodies, like insects or snakes molting, and gone somewhere else. I understand that those very feelings might make me easy prey even for an unclever con, or for normal people unfamiliar with their unconscious minds, or for those suffering from a dissociative psychiatric disorder. Reluctantly, I rouse some reserves of skepticism.
How is it, I ask myself, that channelers never give us verifiable information otherwise unavailable? Why does Alexander the Great never tell us about the exact location of his tomb, Fermat about his Last Theorem, John Wilkes Booth about the Lincoln assassination conspiracy, Hermann Göring about the Reichstag fire? Why don’t Sophocles, Democritus, and Aristarchus dictate their lost books? Don’t they wish future generations to have access to their masterpieces?
If some good evidence for life after death were announced, I’d be eager to examine it; but it would have to be real scientific data, not mere anecdote. As with the face on Mars and alien abductions, better the hard truth, I say, than the comforting fantasy. And in the final tolling it often turns out that the facts are more comforting than the fantasy.
These are all cases of proved or presumptive baloney. A deception arises, sometimes innocently but collaboratively, sometimes with cynical premeditation. Usually the victim is caught up in a powerful emotion—wonder, fear, greed, grief. Credulous acceptance of baloney can cost you money; that’s what P. T. Barnum meant when he said, “There’s a sucker born every minute.” But it can be much more dangerous than that, and when governments and societies lose the capacity for critical thinking, the results can be catastrophic — however sympathetic we may be to those who have bought the baloney.
In science we may start with experimental results, data, observations, measurements, “facts.” We invent, if we can, a rich array of possible explanations and systematically confront each explanation with the facts. In the course of their training, scientists are equipped with a baloney detection kit. The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.
What’s in the kit? Tools for skeptical thinking.
The tools in the “Baloney Detection Kit” include the following:
- Seek independent confirmation of the facts or data.
- Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
- Remember that arguments from authority carry little weight.
- Use multiple hypotheses to explain the data, then let those hypotheses compete against each other to find the best explanation.
- Don’t get too attached to your own hypothesis.
- Vague or poor-quality data can be explained by many different hypotheses – so quantify: precisely measured data allows a much sharper choice between competing explanations.
- Every link in a chain of argument must work, or else the whole chain of reasoning fails.
- Use Occam’s Razor when choosing between two hypotheses that explain the data equally well.
- Hypotheses that can’t be tested aren’t worth much. Skeptics must be able to follow your reasoning, duplicate your experiments, and get the same result; otherwise the hypothesis in question cannot be verified.
- Experiments must include control experiments, and variables must be separated. “Double-blind” experiments are useful for guarding against biases held by the experimenters.
- Understand and recognize fallacies of logic and rhetoric.
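The “let hypotheses compete” item above can be sketched in a few lines of Python. The readings and the candidate explanations are invented for this example; the idea is simply to score each candidate by how well it predicts the measured data and prefer the one with the smaller error.

```python
# Toy data, invented for illustration: hourly EMF-meter readings.
measured = [3.1, 4.0, 5.2, 5.9, 7.1]

# Two competing hypotheses, each making a concrete prediction
# for the reading at a given hour.
hypotheses = {
    "steady background field": lambda hour: 5.0,
    "nearby appliance cycling": lambda hour: 3.0 + hour,  # rises over time
}

def squared_error(predict):
    """Total squared gap between predictions and measurements."""
    return sum((predict(h) - m) ** 2 for h, m in enumerate(measured))

# The hypothesis that best fits the data wins -- tentatively,
# and only until better data or a better hypothesis comes along.
best = min(hypotheses, key=lambda name: squared_error(hypotheses[name]))
print(best)  # → "nearby appliance cycling"
```

Note that “ghost” never has to appear on the list at all: the competition is between mundane, testable explanations, which is exactly the step ghost hunters skip.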
In my next blog entry, I’ll examine some ghost hunting methodology and compare it to the scientific method. I’ll also discuss how ghost hunting stacks up against the standard of scientific integrity described by Dr. Feynman, and how Dr. Sagan’s “baloney detection kit” can be applied to ghost hunting methods.