
  How Science Changes

  Science is different from pseudoscience, and history is different from pseudohistory, not only in evidence and plausibility but in how they change. Science and history are cumulative and progressive in that they continue to improve and refine knowledge of our world and our past based on new observations and interpretations. Pseudohistory and pseudoscience, if they change at all, change primarily for personal, political, or ideological reasons. But how do science and history change?

  One of the most useful theories of how science changes is Thomas Kuhn's (1962) concept of "paradigm shift." The paradigm defines the "normal science" of an age—as accepted by the majority of the practicing scientists in a field—and a shift (or revolution) may occur when enough renegade and heretical scientists gain enough evidence and enough power to overthrow the existing paradigm. "Power" is made visible in the social and political aspects of science: research and professorial positions at major universities, influence within funding agencies, control of journals and conferences, prestigious books, and so forth. I define a paradigm as a model shared by most but not all members of a scientific community, designed to describe and interpret observed or inferred phenomena, past or present, and aimed at building a testable body of knowledge open to rejection or confirmation. In other words, a paradigm captures the scientific thinking of the majority but most of the time it coexists with competing paradigms—as is necessary if new paradigms are to displace old paradigms.

  Philosopher of science Michael Ruse, in The Darwinian Paradigm (1989), identified at least four usages of the word.

  1. Sociological, focusing on "a group of people who come together, feeling themselves as having a shared outlook (whether they do really, or not), and to an extent separating themselves off from other scientists" (pp. 124-125). Freudian psychoanalysts within psychology are a good example of science guided by a sociological paradigm.

  2. Psychological, where individuals within the paradigm literally see the world differently from those outside the paradigm. We have all seen the reversible figures in perceptual experiments, such as the old woman/young woman shifting figure where the perception of one precludes the perception of the other. In this particular perceptual experiment, presenting subjects with a strong "young woman" image followed by the ambiguous figure always produces the perception of the young woman, while presenting a strong "old woman" image followed by the ambiguous figure produces the perception of the old woman 95 percent of the time (Leeper 1935).

  Similarly, some researchers view aggression in humans primarily as biologically innate and essential, while others view it primarily as culturally induced and dispensable. Those who focus their research on proving one or the other of these views would be doing science guided by a psychological paradigm: both views have support, but the choice of which to believe more is influenced by psychological factors.

  3. Epistemological, where "one's ways of doing science are bound up with the paradigm" because the research techniques, problems, and solutions are determined by the hypotheses, theories, and models. A theory of phrenology that leads to the development of phrenological equipment for measuring bumps on the skull would be an example of science guided by an epistemological paradigm.

  4. Ontological, where in the deepest sense "what there is depends crucially on what paradigm you hold. For Priestley, there literally was no such thing as oxygen.... In the case of Lavoisier, he not only believed in oxygen: oxygen existed" (pp. 125-126). Similarly, for Georges Buffon and Charles Lyell, varieties in a population were merely degenerates from the originally created kind; nature eliminated them to preserve the essence of the species. For Charles Darwin and Alfred Russel Wallace, varieties were the key to evolutionary change. Each view depends on a different ontological paradigm: Buffon and Lyell could not see varieties as evolutionary engines because evolution did not exist for them; Darwin and Wallace did not view varieties as degenerative because degeneration is irrelevant to evolution.

  My definition of a paradigm holds for the sociological, psychological, and epistemological uses. To make it wholly ontological, however, would mean that any paradigm is as good as any other paradigm because there is no outside source for corroboration. Tealeaf reading and economic forecasting, sheep's livers and meteorological maps, astrology and astronomy, all equally determine reality under an ontological paradigm. This is not even wrong. It is ridiculous. As difficult as it is for economists and meteorologists to predict the future, they are still better at it than tealeaf readers and sheep's liver diviners. Astrologers cannot explain the interior workings of a star, predict the outcome of colliding galaxies, or chart the course of a spacecraft to Jupiter. Astronomers can, for the simple reason that they operate within a scientific paradigm that is constantly refined against the harsh realities of nature itself.

  Science is progressive because its paradigms depend upon the cumulative knowledge gained through experimentation, corroboration, and falsification. Pseudoscience, nonscience, superstition, myth, religion, and art are not progressive because they do not have goals or mechanisms that allow the accumulation of knowledge that builds on the past. Their paradigms either do not shift or coexist with other paradigms. Progress, in the cumulative sense, is not their purpose. This is not a criticism, just an observation. Artists do not improve upon the styles of their predecessors; they invent new styles. Priests, rabbis, and ministers do not attempt to improve upon the sayings of their masters; they repeat, interpret, and teach them. Pseudoscientists do not correct the errors of their predecessors; they perpetuate them.

  By cumulative change I mean, then, that when a paradigm shifts, scientists do not abandon the entire science. Rather, what remains useful in the paradigm is retained as new features are added and new interpretations given. Albert Einstein emphasized this point in reflecting upon his own contributions to physics and cosmology: "Creating a new theory is not like destroying an old barn and erecting a skyscraper in its place. It is rather like climbing a mountain, gaining new and wider views, discovering unexpected connections between our starting point and its rich environment. But the point from which we started out still exists and can be seen, although it appears smaller and forms a tiny part of our broad view gained by the mastery of the obstacles on our adventurous way up" (in Weaver 1987, p. 133). Even though Darwin replaced the theory of special creation with that of evolution by natural selection, much of what came before was retained in the new theory—Linnean classification, descriptive geology, comparative anatomy, and so forth. What changed was how these various fields were linked to one another through history—the theory of evolution. There was cumulative growth and paradigmatic change. This is scientific progress, defined as the cumulative growth of a system of knowledge over time, in which useful features are retained and nonuseful features are abandoned, based on the rejection or confirmation of testable knowledge.

  The Triumph of Science

  Though I have defined science as progressive, I admit it is not possible to know whether the knowledge uncovered by the scientific method is absolutely certain because we have no place outside—no Archimedean point—from which to view Reality. There is no question but that science is heavily influenced by the culture in which it is embedded, and that scientists may all share a common bias that leads them to think a certain way about nature. But this does not take anything away from the progressive feature of science, in the cumulative sense.

  In this regard, philosopher Sydney Hook makes an interesting comparison between the arts and sciences: "Raphael's Sistine Madonna without Raphael, Beethoven's sonatas and symphonies without Beethoven, are inconceivable. In science, on the other hand, it is quite probable that most of the achievements of any given scientist would have been attained by other individuals working in the field" (1943, p. 35). The reason for this is that science, with progress as one of its primary goals, seeks understanding through objective methods (even though it rarely attains it). The arts seek provocation of emotion and reflection through subjective means. The more subjective the endeavor, the more individual it becomes, and therefore difficult if not impossible for someone else to produce. The more objective the pursuit, the more likely it is that someone else will duplicate the achievement. Science actually depends upon duplication for verification. Darwin's theory of natural selection would have occurred to another scientist—and, in fact, did occur to Alfred Russel Wallace simultaneously—because the scientific process is empirically verifiable.

  In the Industrial West, the emphasis on scientific and technological progress has affected Western cultures deeply—so much so that we now define a culture as progressive if it encourages the development of science and technology. In science, useful features are retained and nonuseful features are abandoned through the confirmation or rejection of testable knowledge by the community of scientists. The scientific method, in this way, is constructed to be progressive. In technology, useful features are retained and nonuseful features are abandoned based on the rejection or acceptance of the technologies by the consuming public. Technologies, then, are also constructed to be progressive. Cultural traditions (art, myth, religion) may exhibit some of the features found in science and technology, such as being accepted or rejected within their own community or by the public, but none have had as their primary goal cumulative growth through an indebtedness to the past. But in the Industrial West, culture has taken on a new guise: it has as a primary goal the accumulation of cultural traditions and artifacts, and it uses, ignores, and returns to cultural traditions and artifacts as needed to aid the progress of science and technology. We cannot, in any absolute sense, equate happiness with progress, or progress with happiness, but an individual who finds happiness in a variety of knowledge and artifacts, cherishes novelty and change, and esteems the living standards set by the Industrial West will view a culture driven by scientific and technological progress as progressive.

  Lately the word progress has taken on a pejorative meaning, implying superiority over those who "have not progressed as far," namely, they have not adopted the values or the standard of living defined by the Industrial West, because they are either not able or not willing to encourage the development of science and technology. I do not mean progress to have this pejorative sense. Whether or not a culture pursues science and technology does not make one culture better than another or one way of life more moral than another or one people happier than another. Science and technology have plenty of limitations, and they are double-edged swords. Science has made the modern world, but it may also unmake it. Our advances in the physical sciences have given us plastics and plastic explosives, cars and tanks, supersonic transports and B-1 bombers; they have also put men on the moon and missiles in silos. We travel faster and further, but so do our destructive agents. Medical advances allow us to live twice as long as our ancestors did a mere 150 years ago, and now we have a potentially devastating overpopulation problem without a corresponding overproduction solution. Discoveries in anthropology and cosmology have given us insight into the origins of species and the workings of the universe. But for many people, these insights and their corresponding ideologies are an insult to personal and religious beliefs and a provocative threat to the comfortable status quo. Our scientific and technological progress has, for the first time in history, given us many ways of causing the extinction of our own species. This is neither good nor bad. It is simply the outcome of a cumulative system of knowledge. But flawed as it may be, science is at present the best method we have for doing what we want it to do. As Einstein observed, "One thing I have learned in a long life: that all our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have."

  3

  How Thinking Goes Wrong

  Twenty-five Fallacies That Lead Us to Believe Weird Things

  In 1994 NBC began airing a New Age program called The Other Side that explored claims of the paranormal, various mysteries and miracles, and assorted "weird" things. I appeared numerous times as the token skeptic—the "other side" of The Other Side, if you will. On most talk shows, a "balanced" program is a half-dozen to a dozen believers and one lone skeptic as the voice of reason or opposition. The Other Side was no different, even though the executive producer, many of the program producers, and even the host were skeptical of most of the beliefs they were covering. I did one program on werewolves for which they flew in a fellow from England. He actually looked a little like what you see in werewolf movies—big bushy sideburns and rather pointy ears—but when I talked to him, I found that he did not actually remember becoming a werewolf. He recalled the experience under hypnosis. In my opinion, his was a case of false memory, either planted by the hypnotist or fantasized by the man.

  Another program was on astrology. The producers brought in a serious, professional astrologer from India who explained how it worked using charts and maps with all the jargon. But, because he was so serious, they ended up featuring a Hollywood astrologer who made all sorts of predictions about the lives of movie stars. He also did some readings for members of the audience. One young lady was told that she was having problems staying in long-term relationships with men. During the break, she told me that she was fourteen years old and was there with her high-school class to see how television programs were produced.

  In my opinion, most believers in miracles, monsters, and mysteries are not hoaxers, flimflam artists, or lunatics. Most are normal people whose normal thinking has gone wrong in some way. In chapters 4, 5, and 6, I will discuss in detail psychic power, altered states of consciousness, and alien abductions, but I would like to round out part 1 of this book by looking at twenty-five fallacies of thinking that can lead anyone to believe weird things. I have grouped them in four categories, listing specific fallacies and problems in each. But as an affirmation that thinking can go right, I begin with what I call Hume's Maxim and close with what I call Spinoza's Dictum.

  Hume's Maxim

  Skeptics owe a lot to the Scottish philosopher David Hume (1711-1776), whose An Enquiry Concerning Human Understanding is a classic in skeptical analysis. The work was first published anonymously in London in 1739 as A Treatise of Human Nature. In Hume's words, it "fell dead-born from the press, without reaching such distinction as even to excite a murmur among the zealots." Hume blamed his own writing style and reworked the manuscript into An Abstract of a Treatise of Human Nature, published in 1740, and then into Philosophical Essays Concerning the Human Understanding, published in 1748. The work still garnered no recognition, so in 1758 he brought out the final version, under the title An Enquiry Concerning Human Understanding, which today we regard as his greatest philosophical work.

  Hume distinguished between "antecedent skepticism," such as René Descartes' method of doubting everything that has no "antecedent" infallible criterion for belief; and "consequent skepticism," the method Hume employed, which recognizes the "consequences" of our fallible senses but corrects them through reason: "A wise man proportions his belief to the evidence." Better words could not be found for a skeptical motto.

  Even more important is Hume's foolproof, when-all-else-fails analysis of miraculous claims. For when one is confronted by a true believer whose apparently supernatural or paranormal claim has no immediately apparent natural explanation, Hume provides an argument that he thought so important that he placed his own words in quotes and called them a maxim:

  The plain consequence is (and it is a general maxim worthy of our attention), "That no testimony is sufficient to establish a miracle, unless the testimony be of such a kind, that its falsehood would be more miraculous than the fact which it endeavors to establish."

  When anyone tells me that he saw a dead man restored to life, I immediately consider with myself whether it be more probable, that this person should either deceive or be deceived, or that the fact, which he relates, should really have happened. I weigh the one miracle against the other; and according to the superiority, which I discover, I pronounce my decision, and always reject the greater miracle. If the falsehood of his testimony would be more miraculous than the event which he relates; then, and not till then, can he pretend to command my belief or opinion. ([1758] 1952, p. 491)
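
  Hume's maxim can be read as a straightforward comparison of probabilities: accept a report only if the chance that the report is false is smaller than the chance that the reported event occurred. The short sketch below is merely one way to illustrate that comparison, not Hume's own formulation; the function name and the numeric probabilities are hypothetical values chosen only for the example.

    # A minimal sketch of Hume's maxim as a probability comparison.
    # All numbers are illustrative assumptions, not measurements.

    def accept_testimony(p_event, p_testimony_false):
        """Accept testimony only if its falsehood would be the
        'greater miracle', i.e., less probable than the event itself."""
        return p_testimony_false < p_event

    # Hypothetical figures: a resurrection is taken to be far less
    # probable than a witness deceiving or being deceived.
    p_resurrection = 1e-12    # assumed prior probability of the event
    p_witness_error = 1e-3    # assumed probability of false testimony

    if accept_testimony(p_resurrection, p_witness_error):
        print("Accept: the testimony's falsehood is the greater miracle.")
    else:
        print("Reject the greater miracle: false testimony is more likely.")

  On these assumed numbers the testimony is rejected, which is the verdict the maxim prescribes whenever ordinary deception or error is more probable than the miracle itself.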

  Problems in Scientific Thinking

  1. Theory Influences Observations

  About the human quest to understand the physical world, physicist and Nobel laureate Werner Heisenberg concluded, "What we observe is not nature itself but nature exposed to our method of questioning." In quantum mechanics, this notion has been formalized as the "Copenhagen interpretation" of quantum action: "a probability function does not prescribe a certain event but describes a continuum of possible events until a measurement interferes with the isolation of the system and a single event is actualized" (in Weaver 1987, p. 412). The Copenhagen interpretation eliminates the one-to-one correlation between theory and reality. The theory in part constructs the reality. Reality exists independent of the observer, of course, but our perceptions of reality are influenced by the theories framing our examination of it. Thus, philosophers call science theory-laden. That theory shapes perceptions of reality is true not only for quantum physics but also for all observations of the world. When Columbus arrived in the New World, he had a theory that he was in Asia and proceeded to perceive the New World as such. Cinnamon was a valuable Asian spice, and the first New World shrub that smelled like cinnamon was declared to be it. When he encountered the aromatic gumbo-limbo tree of the West Indies, Columbus concluded it was an Asian species similar to the mastic tree of the Mediterranean. A New World nut was matched with Marco Polo's description of a coconut. Columbus's surgeon even declared, based on some Caribbean roots his men uncovered, that he had found Chinese rhubarb. A theory of Asia produced observations of Asia, even though Columbus was half a world away. Such is the power of theory.