Conclusion

WE HAVE ARGUED THROUGHOUT THIS BOOK THAT THERE are complex psychological, social, and neurobiological underpinnings of resistance to scientific evidence. We have also argued that many of these tendencies are in many ways completely adaptive, healthy, and essentially human. The challenge that remains for those of us interested in maximizing population health and at the same time helping individuals to make scientifically informed health choices is to figure out how to address the tendencies that lead to false scientific belief without completely insulting or, worse, attempting to repress these tendencies. Not only does telling people not to be emotional fail, but we also strongly believe that it is in no one’s best interest to suppress this side of the human brain. So we propose a multipronged method to help guide people toward the evidence without dismissing the importance of their humanity. In the end, we don’t want people to scramble for a story when they should be paying attention to statistics. But at the same time, we wouldn’t want a society full of people who see only percentages and probabilities in place of showing empathy for the individuals around them. We just want to help people better tease out when their natural psychological tendencies are protecting them and when they are actually harming their health.

We will begin by laying out the core principles that we believe any reader should take away from this book. Then we will show how these core principles translate into our recommendations for better strategies for communicating with, and responding to, people who do not believe the science. We call this the Gorman-Gorman method.

Principles

Guiding Principle #1: It is not simply uneducated people who make irrational health decisions.

We have seen multiple times throughout this book that making rational health choices is not particularly correlated with intelligence. There are abundant examples of people with illustrious academic training, including in the sciences, who have embraced unrealistic and irrational beliefs about important topics such as the relationship between vaccines and autism and the cause of AIDS. The most prominent examples include people such as Andrew Wakefield and Peter Duesberg, both of whom had a great deal of scientific and medical training and distinguished scientific careers before building second careers espousing and proselytizing nonscientific and often dangerous ideas about health. It cannot be argued that either of these individuals, or the many others like them, lacks the ability to understand the science behind rational and irrational health choices, and this is one of the prime factors that leads us to a more psychological approach to disbelief in scientific evidence. The problem is not ignorance but psychological forces that are in many ways surprisingly adaptive and important to human survival and the formation and endurance of human society.

We therefore strongly discourage any kind of approach to anti-vaccine, AIDS denialist, or any other nonscientific belief that revolves solely around assumptions of ignorance. Ignorance has clearly been shown not to be the prime factor in the formation of these nonscientific beliefs. Harping on ignorance, as many healthcare providers and public health officials do, will result only in further antagonism between the “scientists” and the “nonscientists” and will ultimately be unsuccessful in convincing a group of people who already understand the science to accept it. As we have shown throughout this book, the psychological and neurobiological factors driving these unscientific beliefs must be addressed first and foremost, not the educational level or scientific sophistication of the people who hold onto them. If combating science denial were simply a matter of education, our task would be far easier than it actually turns out to be, which brings us to our second point.

Guiding Principle #2: It isn’t about a simple “lack of information.”

There is a very strong tendency to attribute disbelief in scientific evidence to ignorance and a lack of information. In a Forbes article, Dr. Robert Pearl asserts that it is “important, especially for parents, to understand the potential consequences of preventable, infectious diseases.”1 A similar approach was taken in a 2001 article in Vaccine, as evidenced by the title: “Understanding Those Who Do Not Understand: A Brief Review of the Anti-vaccine Movement.”2 Both of these articles display a strong tendency toward the belief that science denialism has to do with a lack of information. As a result, a response of many public health officials and healthcare workers to health denialist beliefs has been to throw more information at “those who do not understand.” Yet, as we have shown throughout this book, many people with denialist beliefs are highly educated, and informational campaigns have tended to be unsuccessful. So we argue that simply throwing more data at people to prove that vaccines are not dangerous but that infectious diseases are will never be enough to truly address the problem. Part of the reason this approach will not work is that it completely ignores the psychological, emotional, and social instincts accompanying science denialism that we have outlined in this book and that prove to be quite important in fueling these beliefs.

Guiding Principle #3: Empathy and evolutionary benefits may sometimes be at odds with rational thinking.

This book has shown how, in many cases, the psychological forces that help us to be good members of society and even allow us to survive can sometimes be antithetical to the demands of rational thinking. We have shown how social situations, such as the sway of a charismatic leader and a strong desire to form groups that provide important social benefits, can sometimes cause us to set aside our rationality in favor of emotional appeals, desire for community, and strong leadership cues. When the same group that loves the Environmental Protection Agency (EPA) for imposing strict carbon emission standards on industries, which corporate America hates, also accuses the EPA of conspiring with industry when it does not list a pesticide as a carcinogen, we realize that the overriding factor is not lack of information. Such groups are usually awash in information, including scientific information that they refuse to believe. Similarly, when environmental groups insist that global warming is occurring because a consensus of scientists says so but claim that the consensus of scientists who support the safety of GMOs or nuclear energy is misguided, the ability to understand facts is not the driving force. Rather, the need to belong to a group that maintains its identity no matter what facts are presented is the fuel for these contradictory beliefs. This need is characteristic of people from every race, income level, intellectual capacity, and country.

We have also seen how the very basis of a functioning human society and often what makes us human—empathy—can cause us to favor stories over statistics and misjudge scientific fact. What’s more, we’ve seen how even our most basic survival instincts have caused us to formulate ways of thinking that are not always conducive to scientific rationality. For example, our most basic survival can sometimes depend on making quick inferences, instantaneously picking up cues in our environment and using heuristics to interpret them, thereby attributing causality with limited information. Yet, at the same time, all of these instincts can become traps when we are attempting to muddle through and make sense of complex scientific information. It is crucial to recognize that some of our most beneficial impulses and cognitive processes can also be the most damaging when it comes to understanding a complex system such as the scientific process. Without recognizing this, we run the risk of ignoring some of the most fundamental reasons for holding onto nonscientific beliefs: in many ways, the processes that cause them are the very mechanisms that help us survive and thrive in human society.

Guiding Principle #4: Hypothesis testing never allows us to profess absolute certainty, and people are uncomfortable with this.

How do scientists come to decide that X causes Y? In reality, the way in which scientific studies are set up does not allow scientists to ever technically make the claim that X causes Y. This is because of hypothesis testing, the statistical procedure by which scientists decide whether they have found a potentially meaningful result. In hypothesis testing, you begin with a null hypothesis, which is the hypothesis that you have found nothing. As we have pointed out, if you are doing an experiment to figure out if Drug A works better than Drug B, your null hypothesis would be: “There is no statistically significant difference between the efficacy of Drug A and Drug B.” Then you would formulate an alternative hypothesis: “There is a statistically significant difference between the efficacy of Drug A and Drug B.” (Notice that even in your alternative hypothesis, you make no mention of whether Drug A or Drug B is better—you are simply stating the possibility of a difference.) Then you carry out the experiment and use statistics to test the null and alternative hypotheses. If you find a significant difference between Drug A and Drug B, you will then reject the null hypothesis. But you can never “accept” the alternative hypothesis. The only option is to reject or not reject the null hypothesis. When you make the decision to reject or not reject the null hypothesis, you also set a confidence level, usually 95%, which means you are willing to accept up to a 5% chance of rejecting the null hypothesis when it is in fact true. Notice that there is no such thing as 100% in statistics. It would actually be incorrect for a statistician or a scientist to say: “I am 100% certain that Drug A is better than Drug B.”
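
For readers who would like to see what this framework looks like in practice, here is a minimal sketch in the Python programming language. The drug response numbers are invented purely for illustration, and the two-sample t-test (run here with the freely available scipy library) is just one common choice of statistical test, not the only one scientists use.

# A minimal sketch of hypothesis testing: comparing a hypothetical Drug A and Drug B.
# The response values below are invented for illustration only.
from scipy import stats

drug_a = [5.1, 4.8, 5.6, 5.0, 5.3, 4.9, 5.4, 5.2]  # hypothetical responses to Drug A
drug_b = [4.6, 4.4, 4.9, 4.5, 4.7, 4.3, 4.8, 4.6]  # hypothetical responses to Drug B

# Null hypothesis: there is no difference in mean response between Drug A and Drug B.
# Alternative hypothesis: there is a difference (no direction is specified).
t_statistic, p_value = stats.ttest_ind(drug_a, drug_b)

alpha = 0.05  # significance level corresponding to 95% confidence
if p_value < alpha:
    print(f"p = {p_value:.4f}: reject the null hypothesis of no difference.")
else:
    print(f"p = {p_value:.4f}: fail to reject the null hypothesis.")
# Notice that neither branch ever says "accept the alternative hypothesis,"
# and nothing in this procedure licenses a claim of 100% certainty.

Even in this tiny example, the honest output is only “reject” or “fail to reject” the null hypothesis; the phrase “accept the alternative hypothesis” never appears.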

So where does all this leave us? The rules of science and statistics force us to work within a framework of negation and of less than 100% certainty. Students of epidemiology and biostatistics will tell you that it can be very confusing to formulate the correct statements after running statistical tests, and their professors will deduct points from a test or problem set if they proclaim: “Drug A and Drug B show statistically significant differences in efficacy. We can therefore accept the alternative hypothesis.” Such a comment would provoke a well of red ink from the professor, who would likely chastise this student, writing, “We cannot ever accept the alternative hypothesis. We can only reject or not reject the null hypothesis.” By the time these students have graduated and become scientists or statisticians themselves, they will therefore have “We cannot ever simply accept the alternative hypothesis” so drilled into their heads that they will never be willing to make that mistake again and will always be very careful about how they phrase the results of their experiments when discussing them with the media. This is why a scientist will never say something like: “I can tell you, with 100% certainty, that A does not cause B.” Instead, they will say something like: “There is no evidence that higher levels of A are associated with higher levels of B” or “Drug B does not display the same performance as Drug A in clinical trials to date.” This can be extremely frustrating. Our psychological forces prime us to look for certainty and reassurance. We just want to hear: “I can assure you that Drug B is better than Drug A.” But a scientist will most likely not feel comfortable making such a categorical statement for fear of misrepresenting the evidence that the scientific method is actually set up to collect. Indeed, when “experts” like the ever-popular Dr. Oz do seem to go too far with the surety of their statements, they generally get widespread criticism from their peers. Thus there is a fundamental disconnect between the kind of evidence that reassures human beings and the kind of evidence that the laws of statistics and science allow us to collect.

There is a joke among epidemiologists that encapsulates this disconnect: the marriage vow between two epidemiologists goes “I promise to always fail to reject you.” There is a reason why this is funny (to at least some of us). Imagine if we approached wedding vows the way we approach hypothesis testing. We would be forced to make statements such as this one, which are not very reassuring to the emotional needs of two people trying to establish a lifetime of devotion to each other. So instead, in our emotional and social realms, we make much more dramatic, declarative statements, such as “I promise to always accept you, no matter what.” This is the kind of statement that makes epidemiologists squirm. The problem is that we cannot simply turn off our emotional and social proclivities when it comes to scientific evidence. So when a scientist says “There is no evidence to date that guns in the home are protective,” we are much more frustrated than if we heard “I can tell you with 100% certainty that guns in the home are definitely dangerous and never offer any benefits.” The scientist does not make the former statement because he or she really believes there is a chance that guns are safe; the scientist is simply reflecting the statements that can be made from the way in which scientific evidence is collected. Although ultimately the two statements are the same, we feel less reassured by the former than by the latter.

Guiding Principle #5: People respond more to emotion than statistics, but charismatic leaders use emotion and scientists use statistics.

As we have observed many times throughout this book, people respond more to emotional anecdotes than to population-based statistics, and this is in part an adaptation that allows us to have empathy and thus form functional societies. This inclination has nothing to do with professional training: it is as strong in a scientist as in a novelist. The thought of helping our spouse, child, or best friend is easier to grasp than the thought of saving the population of a city, state, or country. The problem arises when we think about how people get their scientific information and what kind of information is more likely to be accessible and persuasive. When a doctor, late in the day, sees a patient who has had acute bronchitis for the last week, her focus is on the patient, not on the population. An antibiotic will probably not help the patient, but it also probably won’t do the patient any harm. The fact that such needless administration of antibiotics causes antibiotic-resistant bacteria to flourish, jeopardizing the health of people in ICUs, is much more easily ignored.

We argued in chapter 2 that charismatic leaders are in part so successful because they appeal to people’s emotions. In fact, this is an important part of being a good leader in general: when trying to boost productivity, appealing to people’s emotional side can be more effective than merely stimulating their rational side, and the two approaches can even work at cross-purposes. Charismatic leaders are emotional leaders par excellence and do an outstanding job of making people feel special, included, and unified. This kind of leadership strengthens beliefs, no matter what they are. And this kind of group formation is such a basic human instinct that we felt we would be remiss to write an entire book about individual belief formation without paying some attention to group belief formation.

Yet when it comes to science, this kind of leadership is rarely helpful, and indeed it has revealed itself to be quite harmful in many instances. One only has to think of Thabo Mbeki, charismatic leader of South Africa, whose insistence that HIV does not cause AIDS fueled a devastating epidemic in South Africa that is still not under control today. Part of the problem is that the scientific community has not responded in kind to the efforts of anti-science charismatic leaders. As a result, we find ourselves caught between appealing, charismatic leaders on one side, telling us that vaccines and nuclear power are dangerous and that guns and unpasteurized milk are safe, and dry, academic scientists on the other, citing p values and saying things like, “There is no good evidence to confirm the hypothesis that vaccines are associated with autism.” Given our natural tendency toward emotion and the way in which emotional appeals motivate us, who are we more likely to listen to in this scenario? You don’t have to be a conspiracy theorist or member of a cult to agree that you’d rather listen to Winston Churchill give a rousing, inspiring speech than sit through a statistician’s appraisal of all of the studies on the link between HIV and AIDS. The former is a charismatic leader; the latter is not. This is nothing to be ashamed of: the way in which traditional science is presented to the public is often very boring. But if we are going to make a dent in the way in which scientific beliefs are developed, we are going to have to find a better way to form groups around affirming the scientific evidence and to select scientifically credible, charismatic leaders to lead them.

Guiding Principle #6: People have trouble changing their minds.

It turns out that we are extremely resistant to changing our minds. We are reluctant to unlearn lessons we’ve learned and integrated. When we are confronted with information that conflicts with what we already believe, we experience cognitive dissonance. Cognitive dissonance is extremely uncomfortable, and we do everything we can to dispel it. This is how people convince themselves that smoking is safe, that they do not need to eat vegetables, and that repeatedly skipping trips to the gym will not make a difference in their weight and health. We set up beliefs that suit us, and then our brains work very hard to make sure we can resist anything that seems to challenge these beliefs. Studies have shown that the power of this fight against cognitive dissonance is so strong that it manifests itself in predictable changes in regional brain activation, even among people who have supposedly changed their minds.

Indeed, some clear patterns from imaging studies emerge that help us understand why cognitive dissonance occurs. For example, if we initially get a feeling of reward from an idea, we will seek to replicate the feeling multiple times. Each time, the reward center in the brain, the ventral striatum and more specifically the nucleus accumbens located within it, is triggered, and eventually other parts of the instinctive brain learn to solidify the idea into a fixed one. If we try to change our minds, a fear center in the brain like the anterior insula warns us that danger is imminent. The powerful dorsolateral prefrontal cortex can override these more primitive brain centers and assert reason and logic, but it is slow to act and requires a great deal of determination and effort to do so. Hence, it is fundamentally unnatural and uncomfortable to change our minds, and this is reflected in the way our brains work.

So what happens when we are confronted with a field that operates largely through the negation of old beliefs? As we have noted many times throughout this book, science operates mainly by disproving old evidence. As Sara’s biostatistics professor used to say, “When you finish a study and you have found a statistically significant result, your next task is to present that finding to other scientists and say, ‘I found something statistically significant; now it’s your turn to try to make it go away.’ ” And this is exactly how scientific research works: we do everything we can to attack our own statistically significant results to see if they falter, and if we cannot make them falter, we pass the results on to others to see if they can. Eventually, either the results are replicated so many times that they become believable, or, more likely, conflicting results are subsequently found and people begin to change their minds. This is how science progresses: through repeated falsification of old beliefs. But what about the fact we just noted, that changing our minds is among the most unnatural actions known to the human species? Obviously this creates a fundamental problem with our appraisal of scientific evidence. Science demands that we be open to changing our minds constantly, but human biology and psychology insist that we hold onto our beliefs with as much conviction as we possibly can. This conflict is fundamental to our reluctance to accept new scientific findings. Once the brain has set up the idea that GMOs cause cancer, it is basically impossible to undo that belief, no matter how many scientific studies provide evidence to the contrary.

Guiding Principle #7: People have trouble understanding probability and risk.

Decisions in medical science are predicated on the notion that we can determine a certain level of statistical risk and then use our judgment to apply that knowledge to specific cases. Medical decisions are therefore intrinsically based on a certain level of uncertainty, since statistical truth does not always translate into 100% reality for every individual. The fact that statistics can’t always predict reality perfectly is extremely troubling for us, especially when it comes to making important decisions about our health.

Often, psychological forces therefore intervene between the statistical probabilities and how we use them to make decisions. A 1% chance of developing cancer means something very different than a 1% chance of developing a bacterial sinus infection. We are much more likely to take the former seriously, even to the point of accepting a series of treatments with serious side effects and high risks, and ignore the latter. The risk level is the same in both scenarios, but the decision we make is very different. Is it rational to pursue aggressive treatment for a 1% chance of something? Possibly not. We might act “rationally” when it comes to the sinus infection, but “rational” means less to us when it comes to a potentially life-threatening disease. In our minds, that 1% suddenly becomes a mental image of dying from cancer and regretting that we didn’t decide to attack it early. In reality, a better way to think about the probability would be to imagine that we were 99% cancer-free and 1% cancerous. But from a human perspective, this kind of exercise is absolutely meaningless and displays a crucial failure of statistical evidence to translate into natural human cognition and decision-making processes. Emotion has a profound effect on how we judge a naked statistic.

This disconnect between natural human thought and the kind of thought required by statistical evidence is a serious problem when it comes to evaluating scientific evidence to make personal decisions. It is almost impossible for a mother or father to accept even a 0.001% chance of a seizure occurring after a vaccination when it comes to his or her own child. In the parent’s mind, a 0.001% chance becomes a picture of the child having uncontrollable seizures. And as we have pointed out throughout this book, this vivid mental picture is not necessarily a bad thing in principle: it is part of what makes us able to empathize and to place great emphasis on the care and safety of our own children. The problem, once again, is that when it comes to medical decision making, statistics are often much more reliable than mental images and emotion.
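
One concrete way to see the gap between a naked percentage and the mental image it produces is to translate probabilities into counts of people, as in the brief Python sketch below. The population size, the simulation, and the natural-frequency framing are our own illustrative choices, not anything prescribed by the studies discussed in this book.

# Translating the risks discussed above into expected counts of people.
# The population size and the simulation are illustrative choices only.
import random

for label, risk in [("1% risk", 0.01), ("0.001% risk", 0.00001)]:
    per_100k = risk * 100_000
    print(f"{label}: about {per_100k:g} affected per 100,000 people (1 in {1 / risk:,.0f})")

# A quick simulation of 100,000 people each facing a 1% risk, showing that the
# population-level count is quite predictable even though no single outcome is.
random.seed(0)
affected = sum(random.random() < 0.01 for _ in range(100_000))
print(f"Simulated: {affected} affected out of 100,000 (expected about 1,000)")

Seen this way, a 0.001% chance amounts to roughly one affected child per 100,000, a framing that some readers may find easier to weigh than a bare percentage, even if it does little to quiet the vivid mental picture we describe above.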

So what should we do about all of these cognitive and emotional challenges that make it difficult for us to take scientific evidence into account when making personal health decisions? We cannot completely eliminate the role of emotion and empathy, nor would we want to do that. We can, however, learn to incorporate these emotional responses into appropriate strategies to handle poor medical decision making.

We will now turn to a series of these strategies. As is the case with this entire book, there are strategies here for many different constituencies, including scientists, nonscientists, journalists, and educators.

Solution #1: Science must deal with increased access to various types of information via the Internet.

When the recent erroneous article by Brian Hooker came out claiming that the CDC had covered up data from a study supposedly demonstrating that vaccines cause autism in African American children, the Internet was immediately filled with blog posts and articles insisting that we cannot trust the government and its lies and cover-ups. One of us (Sara) did a small spontaneous experiment: How many pages of Google results would it take to get to an article that explained the results from a more scientifically informed perspective? It took Sara 10 Google result pages to find an article by an actual trained scientist taking the results of this study into account and carefully delineating why the methods of the Hooker study rendered its conclusions invalid. But how many people are actually patient enough to click through 10 pages of a Google search? We already know that it is difficult for us to change our minds once we have formed an opinion. By the time you get to page 10 of Google results (if you get there), wouldn’t you already have been persuaded by nine pages of strongly worded posts and articles trying to convince you, often with graphic pictures and heart-wrenching personal stories, that vaccines in fact cause autism and that this study by Brian Hooker, who has a PhD, shows nothing other than the government’s shocking and horrifying decade-long cover-up of Andrew Wakefield’s original “breakthrough”? Such messages invariably employ the principles of charismatic persuasion that we have already shown are incredibly effective. In addition, as we have seen with our analysis of risk perception, people are already prone to being skeptical of vaccines, unfamiliar man-made products whose mechanism of action is complex and counterintuitive. The mention of the word vaccine may trigger a response in the brain’s fear centers, like the amygdala, because it carries with it associations of painful injections, infectious diseases, and dire warnings of complications like autism, a devastating illness that has deeply affected the lives of so many American parents and has a largely unknown cause. As we have also shown, people are not comfortable sitting with unknown causalities and tend to “fill in the gap.” The advice of the charismatic leader is rewarding: it tells us there is a solution to our fear (don’t vaccinate), and it carries with it all the privileges of joining a group of passionate people. The nucleus accumbens is activated, and soon the words of the charismatic leader become reified in our minds, a kind of habitual thinking.

All of these factors, all very natural human responses, conspire against the poor scientist who has slaved away to write a thoughtful article showing the scientific invalidity of the paper causing all of this commotion in the first place. That beleaguered scientist, appealing as she is to the sluggish prefrontal cortex, does not stand a chance against the barrage of nine pages of conspiracy-theory-laden articles preceding her analysis. So what can she do to make sure her work is not for naught and that her important analysis can see the light of day and influence these crucial, often life-or-death, health decisions?

We propose not only that scientists become fluent in the kind of information we confront on the Internet but also that they join the conversation in a much more active way. We think that scientific and medical societies in particular have much to gain from formalizing a broad, far-reaching online and social media strategy. Yes, all scientific and medical societies these days have websites and send out e-newsletters and digitize their journals. But how many of them have a truly active Twitter or Facebook account providing up-to-the-minute coverage for the general public about important scientific and medical issues? We would venture to say very few. (See figure 11.) A good model to follow is that of the American Geophysical Union. It has over 20,000 followers on Twitter, and at the recent AGU meeting in San Francisco, staff and scientists live-tweeted from the conference as new information was being presented.3 Incidentally, a large number of these Twitter followers are not scientists. In addition, the AGU hosts 11 individual blogs that incorporate recent groundbreaking science into user-friendly articles.

Just as important, scientific and medical societies and government health agencies like the CDC, FDA, and NIH must learn to anticipate public fears and concerns and take action before incorrect ideas become unchangeable. The CDC should have reacted immediately to reports of Ebola outbreaks in Africa to reassure Americans at the same time as it urged us to devote resources to help Africans. Hooker’s article should never have seen the light of day, but since it did, the CDC should have responded online immediately. While this approach in no way comes close to dealing with the problem of irrational health beliefs on its own, it may be a crucial step in the right direction and a critical part of a wider strategy. Scientific and medical societies might need to hire new staff for this effort, but in the end, it could actually be part of a movement that saves a lot of lives. So the next time Hooker or one of his associates publishes an article on vaccines and autism and people google “Hooker vaccines autism,” one of the first things that should pop up is a blog post from the American Academy of Pediatrics briefly and accurately discussing why the article’s methods produced spurious results and invalidated its conclusions. We know that this method will not prevent people who already believe that vaccines cause autism from continuing to believe that. However, efforts like this could prevent a large number of people who are confused or on the fence from being led down a path to dangerous medical decisions.

FIGURE 11 Tweet Expert

Source: “Tweet Expert,” by Raymond K. Nakamura, 2013. Used with permission of the artist.

Solution #2: Members of the media need to be better trained to understand what a valid scientific debate is and what it is not.

There are obviously many legitimate debates in science, especially in medicine. Should the FDA speed up approvals of drugs for high-needs populations with severe untreatable medical problems? The answer to this is not straightforward: speak with two different doctors and you will get two different answers. On the one hand, it is essential to get people better treatment as quickly as possible. On the other hand, we need to be absolutely sure that the drugs we release are safe. There is no clear-cut answer here. Who really benefits from gluten-free diets? What causes irritable bowel syndrome? What exactly is chronic fatigue syndrome, and who is at risk? All of these are legitimate questions up for debate in the medical and health fields. But others—such as whether vaccines cause autism, whether ECT is unsafe, or whether antibiotics treat viral illnesses—are not. The vaccine question was up for debate in 1998 when Andrew Wakefield first published his paper. But 17 years later, after numerous robust studies finding absolutely no association whatsoever, this is not a legitimate debate among respectable and well-informed medical experts and scientists. How many more scientifically sound studies must be done before we have enough evidence to convince nonscientists of what scientists already know—that GMOs don’t harm people, just weeds and bugs? To an untrained person, however, these “debates” might look just as legitimate as the ones over the cause of irritable bowel syndrome, even though they are not. How should a journalist, who is usually not a trained scientist, be expected to detect the difference between legitimate and illegitimate scientific debate when the stakes seem so high in either case and, as we have shown, both types of debate involve preeminent scientists and medical figures?

We propose that all journalists who undertake scientific reporting, even if they have a PhD in a scientific field, should receive some form of training on telling the difference between legitimate and illegitimate scientific debate. The media must learn that the scientific method is not a variety of political discourse in which equal time must be given to both sides. The amount of time devoted to a side should be proportionate to the strength of its scientific evidence and the proportion of legitimate scientists who endorse it. Given these criteria, newspapers and magazines have to stop presenting the aforementioned non-issues as controversial. Why not institute continuing science education for journalists, as is required for many other types of professionals? A course on this precise skill could be a requirement in any scientific writing or journalism program. Clearly not every science journalist will go through this training, but if we can get a good majority to do it, the journalistic response to scientific debate might be a bit more accurate. We all want our newspapers and other media to spend more time reporting science, but it must be done in a scientifically sound manner. We consumers of scientific journalism should insist on reading stories written only by reporters who have maintained their understanding of the issues by demonstrating ongoing participation in educational activities.

Solution #3: Scientists must be more sensitive to difficulties in communicating causality, people’s discomfort with uncertainty, and their own weaknesses.

Scientists and medical researchers tend to be very intelligent, insightful individuals with a great deal of knowledge about their subject area. There is no doubt that scientific breakthroughs in the health and medical arenas are largely due to the unique insights of a handful of exceptionally skilled and dogged people. But this kind of intelligence does not necessarily translate into an understanding of how information should be communicated to the rest of us. Scientists and medical experts are simply not trained to take human emotion into account when transmitting the results of their research. So much of the focus in scientific training is on writing scientific papers for peer-reviewed journals that are read by other experts. However, little, if any, attention is paid to how to translate research into information for general consumption. There is some talk in science about making the results of scientific studies understandable to nonscientists: not using scientific jargon, explaining what specialized terms mean, and so on. However, as we have shown throughout this book, the issue is not necessarily that people do not understand the information on a basic level; the issue is that they interpret it in the context of a series of psychological processes that can sometimes distort the real nature of the evidence. This is the aspect that scientists need to grasp much better. Medical and PhD training programs for researchers should include lectures and exercises involving this type of general communication of complex scientific results. Discussion should center not only on how to make the material accessible but also on how to present it in a manner that will discourage irrational responses. This is particularly important in areas of great uncertainty or seemingly high risk. Instead of just telling people the percentage risk, scientists need to understand what these percentages actually mean to people and, most important, frame them in the way most likely to be convincing and acceptable to nonscientists. A small amount of training in cognitive psychology and behavioral economics should make scientists more aware of the biases and heuristics people use to interpret scientific information and teach them how to communicate around these psychological processes so that their messages can have maximum impact and be interpreted in the way they intend.

Scientific journal editors must always bear in mind the possibility that articles they accept for publication will emerge on the Internet and be seen—and misinterpreted—by nonscientists who are not the intended audience. An article reporting that a new drug caused tumors in cancer-susceptible rats only at extremely high doses is reassuring to experts in drug toxicology but may hit the Web as scary evidence of cancer-producing medications. From there it is only a short step to conspiracy theories about the government and drug industry colluding to protect toxic drugs. We do not at all mean that the journals should stop publishing such articles—they represent vital steps in getting new medications that we need. Rather, the journals involved must develop ways to explain the content of these articles to nonscientists before things get out of hand.

We have stressed that it is not known how best to communicate scientific evidence in all cases and that some research has yielded surprising results that confound what we thought should work. Hence, there is a critical need for funded research on the best ways to communicate scientific evidence and perceived scientific controversies. Prospective, randomized trials of different methods need to be done until we get a better feel for what works and what doesn’t. Should scientists use scare tactics? Do we need our own coterie of charismatic leaders? How do we transmit facts in a way that appeals to our audience’s emotional needs, not just their intellectual ones?

In addition to understanding the psychological processes that accompany consumption of health and medical information, scientists and medical researchers must be more open about the mistakes science has made in the past. If we defend science too strongly and never admit to any weaknesses, we will only further alienate the public. Scientists and medical researchers should be taught about the value of uncertainty—if people are made aware that there is some uncertainty in the information presented to them, they may be able to more closely evaluate their psychological responses. Openness about past errors shows a certain ability to be self-reflective and will bolster public trust in the scientific community. Whether we like it or not, many people greatly distrust the scientific and medical communities. Rather than writing these people off, scientists and medical experts need to couch their findings with a compassionate understanding of some of these fears and reservations. After all, some of these anxieties and doubts do come from serious missteps in scientific and medical history, such as the ethical breach represented by the infamous Tuskegee syphilis study, the stubborn persistence in advocating against fat intake, or the inadvertent dispensing, to approximately 260 children, of early batches of the new Salk polio vaccine that contained some inadequately inactivated virus particles (that problem was quickly recognized and remedied). Mistakes are going to happen, but they must not be used as weapons against the validity of the scientific method or to bolster conspiracy theories.

Solution #4: We need better childhood education about statistics and probability, in-depth understanding of the scientific method and the places where it can go wrong, development of critical thinking skills, and techniques for recognizing “good evidence.”

One of your authors (Sara) will always remember a project her older sister was assigned for science class in sixth grade. Rachel had to walk around town picking up a variety of leaves, tape them into a notebook, and somehow find out what type of tree each had fallen from. Sara has never forgotten this because the project tortured Rachel. Every time the family went out, Rachel got very nervous about finding “good” leaves. Then when she got home, she proclaimed (rightfully) that she had no idea how to figure out what kind of leaves her dried-out and crushed specimens were. Her parents (including the coauthor of this book) spent hours (fruitlessly) trying to help. We could ask Rachel today whether she remembers that project. She would say yes, she remembers the pain and agony she went through to finish it on time. But if we asked her whether she remembers what any of those leaves were? No way. On the other hand, Rachel ultimately majored in biology in college, and today she is a physician who must use statistical information every day to make important medical decisions that have a deep impact on people’s lives and well-being. What if we asked Rachel when she first learned anything substantial about statistics? It was probably toward the end of her time in college. Almost nothing about it was ever mentioned in medical school.

Physicians are by no means the only people who need a deeper understanding of statistics and risk prediction. We are all bombarded with statistics, many taken inappropriately out of context, every day. With the rise of the Internet, the deluge of data, and a movement toward “patient empowerment,” the need for us to understand statistics in order to make our own crucial medical decisions becomes more urgent by the day. Most Americans, however, are woefully unprepared to do this. As a variety of open access journals proliferate, many patients and their caregivers are now going online to read scientific studies themselves. But are these patients and their families really prepared to evaluate the nuances of a case-control study versus a retrospective cohort study, the true meaning of an odds ratio, and whether the sample size was truly sufficient to draw any conclusions? Not likely. We are not advocating that every person in America become an expert statistician or epidemiologist. However, we do believe that introducing some of these concepts in a general way early and often could make an enormous difference in the way in which average Americans approach data.

Throughout this book we have outlined many of our ideas for how this would work. However, there are a few worth highlighting. Early education needs to do a better job of teaching critical thinking techniques. By that we mean teaching children when to question something and when something seems solid enough to trust. This seems like a simple matter, but we are generally extremely unprepared for the task. Teachers could develop projects that make better use of the Internet, for example. Students could be assigned a somewhat controversial topic with a lot of conflicting data. They could then be told to use the Internet to collect as much information as they can. They could be asked to decide what information seems valid and what does not and outline their process for coming to this conclusion. Then a class discussion could examine how we make these decisions. In fact, as we outline in chapter 5, there are some relatively simple rules of thumb for deciding whether information is valid. One straightforward way is based on the way the website looks. Professional websites, like the website of the American Academy of Pediatrics, usually look polished and refined, while the websites of anti-vaccine activists usually look flashy and frenetic. Teachers could give students some rules of thumb like this that they could add to their strategies for distinguishing good information from bad. Children of the next generation will need to be able to wade through more online information than children of any previous generation, so they will need this kind of education more than ever before.

Teachers, and science teachers in particular, need better education themselves. Right now, teachers are often unsure of themselves when dealing with science, especially if they perceive that a topic is “controversial.” As Melissa McCartney recently advocated, teachers need to be taught how to think like scientists, not merely to learn scientific facts and educational techniques.4

Finally, we believe that there needs to be a change in the way the scientific method is taught. Today, if the scientific method is taught to children at all, it is presented as if it were an enshrined, no-fault process that helps scientists arrive at an unquestionable answer. This makes it hard for most of us to ever understand how it is possible for a scientific finding to be overturned after repeated failures to replicate it. Instead, students should be taught about the basic structure of the scientific method, but they should also have some exposure to the messiness of it all. Where exactly can things go wrong in a scientific study? Students should be actively exposed to examples of this and be asked to evaluate simplified versions of published studies. There is no need to wait until someone is an advanced graduate student to teach them the basics of study design, bias, confounding, and statistical error. At that point, we find ourselves in a situation of simply preaching to the choir, which creates an even greater rift between the scientists and the so-called scientifically illiterate. There is no need for this distinction to be so stark, and we believe that the best way to address it is to do so as early and as often as possible.

Solution #5: We need healthcare professionals who can engage in motivational interviewing with people who have incorrect medical beliefs.

Motivational interviewing (MI) focuses on first finding out what someone knows and cares about rather than trying to convince them of something. It has been used very successfully in areas like addiction treatment. Addicts are often initially unwilling to change their behavior. A traditional approach is to inform them about all the risks and dangers of using illicit drugs and tell them they have to stop. This approach meets with very little long-term success. MI, on the other hand, starts by asking the addict if there are things in his life that aren’t going well, that he would like to change. An addict may not be eager to give up the feeling he gets from snorting cocaine, but he may be unhappy about getting arrested or spending every paycheck on drugs. Through this kind of inquiry, the interviewer finds common ground that becomes the basis for motivating the addict to change his drug use.

We believe this approach may also work when dealing with people with fixed but incorrect scientific ideas. Rather than telling someone straight out that vaccines are safe and effective and she should immunize her children, it may work better to begin by finding out the basis for her beliefs. What does she think, and how did she reach these conclusions? Along with this, the interviewer will find out what is important to this parent, presumably ensuring the safety of her children but also not jeopardizing the health of other children. A Socratic dialogue is developed in which the interviewer and parent share information and values to reach a joint decision. That decision may at first be simply to have another conversation in a week. Part of our research agenda is to test the ability of MI to successfully change health decision behaviors in a more scientific direction and to see if population-based MI techniques might be developed to reach larger numbers of people.

Solution #6: We all must examine our tendency to think uncritically and to place emotion over reason.

Poets, ethicists, and clerics have long lectured to us that the most exalted human state is love for and from another individual. We do not dispute this. But science can rarely afford to be sentimental given its responsibility to learn about the big picture, what will help the greatest number of people. To be a participant in this process, we are asked to set aside some basic biological processes that have been entrenched by millennia of evolution and to place reason over human instinct. This is not easy. Too much reason and we will never drive a car again and perhaps stop eating and breathing altogether since there seems to be a study somewhere, reported to us with glaring headlines, that almost everything we eat and breathe is bad for us. A life full of love, spontaneity, and enjoyment requires obeisance to emotions and heuristics.

As we have tried to make clear, however, the scientific process is rarely spontaneous or straightforward. It is full of fits and starts, rethinking what was once considered proven, and battling over what constitutes enough evidence to prove truth. It is so easy for us to be swayed away from the complexities and uncertainties of science by charged, graphic, and emotional appeals. We are not to be blamed for believing that anything made by a large, for-profit company must be evil, that medical associations say things only because they want to maintain their power over us, and that there are no government agencies that can be trusted. The facts, however, are not so simple. Drug companies do some awful things, but they also produce the life-saving medications that have revolutionized the prospects for human health and survival. Big companies try to sell us dangerous products all the time, but sometimes, as in the case of GMOs and nuclear power, they have hit upon technologies that, while not perfect, are far less risky and far more beneficial than detractors would have us believe. “Farm to table” and “back to nature” are wonderful ideas, often providing us with very delicious things to eat, but the “unnatural” process of pasteurization is critical to ensuring we do not get infected with life-threatening bacteria.

The only way that we can figure out what to believe is to school ourselves in the scientific method, to demand to know where people are getting their data and with whom they are affiliated, and to reserve judgment until we have considered many points of view. Always ask “Compared to what?” and “What is in the rest of the boxes?” when someone makes a claim about the risk or safety of anything regarding our health. Above all, make learning about science an important part of your and your family’s life.

A Final Word

Some people’s views will never change. But if we can reach those people who are unsure and keep them from buying into incorrect scientific ideas, then we can declare a triumph. Undergirding our efforts to reach people should always be understanding and compassion. No one is immune from bias, heuristics, or emotional decision making. As we have hopefully made clear throughout this book, we are ready to admit our own foibles in these arenas, and this readiness certainly does not mean that we will not be prone to them over and over again. Even Daniel Kahneman admitted that he sometimes buys lottery tickets.5 We know we cannot overcome the incredible power of emotional and social forces that sometimes lead us astray in our scientific thinking, nor do we want to do that. It is not our intention to promote a world in which people do not care about stories, do not come together over issues that bother them, and do not feel inspired by charismatic leaders. But we do want to emphasize that until we bring these psychological, emotional, and social forces into the conversation, we will never get anywhere in the struggle against dangerous, unscientific ideas. As we hope we have shown, the answer to the question we set out to solve, why people cling to irrational medical and health views with no scientific basis, is as simple as this: because we are human. We are humans with empathy and a strong drive to build communities. We would never advocate eroding these wonderful features of human nature.

So we need to take them fully into account when we plan strategies to improve our ability to grasp scientific reality. Vaccinate your children, understand the benefits and not just the risks of GMOs and nuclear power, don’t drink unpasteurized milk or take antibiotics for a virus, agree to ECT if depression is overwhelming and refractory to drugs, and don’t buy a gun. Every one of these recommendations will evoke powerful feelings, including rage, in many people, but every one of them is based on the overwhelming weight of scientific evidence. By adding compassion, empathy, and emotion to this equation, we will finally be able to effectively help people make the crucial decisions that will ultimately save their lives.