Denying to the Grave: Why We Ignore the Facts That Will Save Us - Sara E Gorman, Jack M Gorman (2016)

Chapter 2. Charismatic Leaders

IN THIS CHAPTER, AS IN THE LAST, WE EXPLORE THE WAYS in which skeptical science beliefs are formed in social environments. This time, however, we focus specifically on the leaders of anti-science groups, people we call “charismatic leaders.” While behavioral economists and cognitive psychologists have spent much time analyzing the ways in which individual shortcuts and cognitive errors cause irrational beliefs, here we turn to social psychologists to inform us about how being in a group affects the way we process risk, the way we understand complexity, how we decide to whom we give credibility, and how we determine what is true and what is false. We will see that in some cases, people in groups make decisions or hold beliefs that do not resemble decisions or beliefs they would hold on their own. In this way, groups can be said to have transformative capacity. The formation of cooperative groups was foundational in the evolution of the human species. Groups are essential for successful human society. In this chapter, however, we explore the ways in which anti-science groups form around a particular type of leader. Although groups have many important and interesting dynamics, the role of the charismatic leader is a central feature of science denial groups in particular.

We will look at several such charismatic leaders in this chapter: Peter Duesberg, Andrew Wakefield, Jenny McCarthy, Gilles-Éric Séralini, and Wayne LaPierre. We will also look at the ways in which our brains and minds work in social settings and how exactly we are persuaded. Indeed, some of the features that make us most human, such as our ability to empathize using our imaginations and our desire to be accepted and integrated into social groups—indeed, some of the very things that enable our society to function—are also some of the features that render us most irrational and most prone to the wiles of the charismatic leader.

Leadership Is in the Eye of the Beholder

Before we discuss the figure of the charismatic leader, it is important to examine what factors enable leadership success in the first place. In many ways, leadership is actually created by followers. It is, in some ways, in the eye of the beholder. Leadership theory discusses how people’s preconceived notions of how a leader should look and act affect whether certain people rise to the position of “leader.” For example, in the business world, someone who is well dressed and outgoing might be immediately labeled as a leader.1

Once a group of people has formed a sense of “us,” the leader must conform to a leadership prototype. That is, the leader must be seen as the epitome of in-group identity. The more the leader is seen as “one of us,” the more influence he or she will have.2 The leader must also be seen as a group “champion.” The group must perceive that the leader is in it for the group goal rather than primarily for his or her own benefit.3 The leader must be proactive about creating the group’s identity: he or she should not simply wait for group members to formulate a prototypical leadership role but work hard to create that role, substantiate it in the minds of group members, and ultimately fulfill it.4 The leader therefore must walk a careful line between being seen as “one of us” and wielding the authority to create and shape the group’s identity. An effective leader must take cues from group members but also have control over how the group expresses itself, how it develops, and ultimately what it does and does not support.

What is most striking about this model of leadership is that it is dynamic. Leaders do not exist in a vacuum, and the formation of leaders cannot be viewed as something entirely separate from the formation of groups. That is why this chapter examines both the features of charismatic leaders and the theories and science behind group psychology, arguing that the leader and his or her followers go hand in hand. One does not exist without the other, and the psychological features of the group are also key not only to the perception of the leader but also to his or her leadership style. While there has been considerable debate in this field regarding the precise characteristics of the charismatic leader, there has been little debate about the fact that the charismatic leader’s authority lies in his or her relationship with followers. Charisma, therefore, does not reside exclusively in the leader but is more precisely a relational entity, and the interplay between leader and followers is an essential part of the process.5 The charismatic leader is not fundamentally different from a regular or relatively noncharismatic leader but rather capitalizes more successfully on the particular psychological profile of group membership to create a more intense, perhaps occasionally cult-like, following. The charismatic leader therefore functions more specifically as a persuader.

What Makes a Leader “Charismatic”?

It is also very important for us to state emphatically that we do not categorically equate “charismatic leader” with “demagogue,” “snake-oil salesman,” or any other negative characterization. Martin Luther King Jr. was a charismatic leader whom we all recognize as one of the most dramatically effective and morally correct people who ever lived. Charismatic leaders can be forces for good, and even those whom we believe distort science and mislead us are often right about some aspects of the issue at hand, although we obviously disagree with the leaders we choose to profile in this chapter. Individuals such as Martin Luther King Jr. and Gandhi used their charisma to instill people with visions of the right way forward and to achieve great and vital societal changes. We simply wish to establish the fundamentals of charismatic leaders as powerful persuaders so that we may better understand what happens when this leadership style is put to work in service of irrational, scientifically incorrect beliefs.

In general, charismatic leaders tend to share the following common characteristics: verbal eloquence, strong communication skills, a potent sense of an “us” versus “them,” and a remarkable ability to elicit strong emotions. There is a long-standing debate about whether these abilities are “inborn” or developed and thus whether we all have some capacity to become this type of leader. All leaders must create a sense of “us,” but what is perhaps unique about the charismatic leader is that he or she not only creates a much stronger sense of an enemy “them” than any other type of leader but also formulates a sense of “us” that can be so potent that it operates to the exclusion of other identities. That is, the group’s “us” becomes so strong that people in the group feel an allegiance to the group identity above all other aspects of their identities. This is certainly true in cults, and we will examine whether some of this strong “us” branding has occurred in some of the health denialist movements under examination in this book.

The concept of the charismatic leader was first formulated by one of the founders of the field of sociology, Max Weber. Weber, who worked in Germany at the end of the 19th and beginning of the 20th centuries, postulated three forms of authority in any society: the charismatic, the traditional, and the rational-legal. According to Weber, charismatic leaders derive their authority from structures outside of the formal legal system and typically derive their legitimacy from their exemplary character. They are therefore often seen as “superhuman” or exhibiting “exceptional powers and qualities.”6 In addition, charismatic leaders are essentially revolutionaries. They often come from the “margins of society” and represent strong breaks with established societal systems.7 In the 1960s, one scholar identified four characteristics of the charismatic leader: “invocation of important cultural myths by the leader, performance of what are perceived as heroic or extraordinary feats, projection of attributes ‘with an uncanny or a powerful aura,’ and outstanding rhetorical skills.”8 We will see that many of the charismatic leaders involved in health denialist movements are indeed from “fringe” positions, usually on the fringes of mainstream science. We will also see that most of them have mastered the art of persuasion, exhibit excellent rhetorical skill, and hold themselves up as the exemplars of “true science” in the face of what they perceive as corrupt and self-interested scientific practice.

Cults and Charismatic Leaders

Although cults represent a much more extreme version of the kinds of groups we describe in this book (for instance, we are by no means suggesting that anti-vaccine activists constitute a cult), it is still worthwhile to look at the features of cult leadership and cult formation, because the cult leader is basically the charismatic leader taken to the extreme. In addition, cults are almost always a breeding ground for irrational beliefs, which the leader cultivates and develops in order to keep members in line. Seeing the charismatic leader in this most extreme form offers a more thorough understanding of what the figure entails; sometimes, looking at phenomena in their extreme states gives us the clearest sense of what those phenomena truly are.

Studies of cults as a phenomenon are not scarce. One consistent conclusion of these studies is that cult members are often surprisingly educated, seemingly rational people. In fact, they often possess above-average intelligence, and in many cases are recruited from college campuses.9 Indoctrination into cults probably occurs in several stages. First is the softening-up stage, in which recruits are targeted, invited to meetings, and showered with attention from cult members. In the second stage, compliance, recruits begin to experiment with some of the beliefs and behaviors of cult members, even though they may still be skeptical. By the third stage, internalization, recruits begin to buy into some of the beliefs of the cult members and find themselves more and more integrated into the group. In the final stage, consolidation, recruits become loyal to the cult at all costs.10 It is this stage that causes some of the tragic effects of cult membership we sometimes see in the news, such as mass shootings or suicides.

Cult leaders are certainly an extreme case, and many exhibit narcissistic features to an unusual, often pathological, degree.11 Yet some of the characteristics of cult leaders are common among leaders in general and are especially relevant to charismatic leaders. Most strikingly, cult leaders tend to refer to nonbelievers and outsiders explicitly as the “enemy.” Charismatic leaders may not be this explicit, but they do create an intense sense of in-group identity that craftily plays on the ways in which group psychology can alter individual needs, preferences, and opinions. Charismatic leaders thus create more of a sense of “us” and “them” than noncharismatic leaders do. In addition, cult leaders are notorious for cutting people off from society, whether this isolation includes removing them physically to a remote area or restricting their contact with family and friends. While charismatic leaders certainly do not all operate with such extreme measures, they do often create an environment in which people become so identified with the ideas of the group that they are impervious to outside influences, including rational and alternative points of view.

How Do Leaders Persuade Us?

Charismatic leaders are essentially persuaders. Nelson Mandela used his charisma to persuade his followers to work to erode apartheid in South Africa. Fidel Castro leveraged his magnetic personality to persuade Cuban citizens to follow and support him during a long period of economic depression and sanctions from important sources of foreign trade and commerce. Aside from being articulate and emotionally savvy, persuaders often have a good sense of the psychological features of group formation and identity. Persuaders also play on known tenets of individual psychology. For example, persuasive messages are often tailored to increase or decrease dissonance in receivers. Persuaders know that people have a strong, intuitive drive to reduce cognitive dissonance, the coexistence in the mind of conflicting ideas.12 In general, people do not tolerate inconsistency well, and there is also a social drive to appear consistent to others. If a persuader is trying to change an audience’s opinions on a particular issue, he or she might seek to arouse dissonance with respect to an opponent’s point of view. On the other hand, persuaders might want to reduce the possibility of dissonance by constantly reassuring people that they have made the right choice, that they hold the correct opinion, or that there is no viable reasonable alternative to the action they have taken or what they believe.13 If persuaders wish to convince people that vaccines are unsafe through a dissonance framework, they might begin by saying: “You all want to keep your children safe and healthy. Doctors and the government have routinely told us that to keep our children safe we must vaccinate them against a host of illnesses. Yet these vaccines are actually causing our children to be sicker than they would otherwise have been without them.” This statement creates a sense of dissonance in nearly any caring parent who is not well versed in the scientific and immunological bases of vaccination. The parent will immediately say to him- or herself: “How can I say that I am protecting my child when I am actually exposing her to all of these terrible illnesses by getting her vaccinated?” A sense of internal inconsistency, and hence dissonance, thus arises and the parent will likely do everything he or she can to dissolve this uncomfortable feeling.

In addition, people do not have to devote their full attention to a leader’s message in order to be persuaded that it is valid. Psychologists have asserted that there are two routes to persuasion: the central route and the peripheral route. When people scrutinize a message and are persuaded, they are being persuaded via the central route. However, an unscrutinized message may still be persuasive. This is where the peripheral route comes in. In this case, the listener will use various cues, some of which function very much like heuristics, to decide whether to be persuaded. For example, one powerful cue is the number of arguments the persuader uses. In one study, people who knew a lot about a particular topic were completely unpersuaded by a large quantity of weak arguments. People who did not know very much about the topic, however, became more persuaded as the number of arguments grew, no matter how weak those arguments were.14 In the case of whether HIV causes AIDS, vaccines cause autism, or GMOs cause cancer, most of us are not able to carefully parse an argument about these complex topics. As a result, we are more likely to be persuaded by heuristic-like “cues,” such as the quantity of arguments or the demeanor and authority of the speaker. Flooding the field with an abundance of so-called “studies” and individual cases impresses us. It might be that all of the “studies” are published in minor journals by scientists of questionable credentials and that the cases are cherry-picked rarities that do not represent the majority experience. Nevertheless, a charismatic leader can score points simply by providing a long list.

Charismatic Leaders Appeal to the “Emotional Brain”

To further compound the problem, charismatic leaders almost never speak to our rational sides. Part of what makes them such successful, magnetic leaders is their preference for emotional persuasion. Any CEO knows, and any basic leadership course will teach you, that immediately appealing to people’s rational brains will not get you very far. Not only will people not like you but your company will actually not function as well as it would if your employees were encouraged to think emotionally and to see the bigger picture. Rational discourse has an important place in any company, but research shows that leaders are more effective and companies more successful when the emotional brain is prioritized. Followers of charismatic leaders, who operate primarily by speaking to our emotional sides, tend to experience their work as more meaningful, receive higher performance ratings, and work longer hours than those who work for an effective but noncharismatic leader.15

Although it is an overly simplistic dichotomy, there is some justification in talking about a “rational” and an “emotional” brain. Neuroscientists note that basic emotions stem from evolutionarily more primitive parts of the brain, particularly the limbic cortex, which includes the amygdala (critical for fear), the insula (necessary for several emotions including fear and disgust), and the nucleus accumbens (the brain’s reward center). The sight of a snake activates the amygdala, causing an almost instantaneous set of behavioral and physiological reactions like freezing or fleeing, rapid breathing to increase oxygenation of the muscles, increased heart rate, and increased release of stress hormones like cortisol and adrenaline. Little thought is brought to bear. Show someone a picture of a snake several times, however, and another part of the brain recognizes that this is just a harmless picture and there is no actual danger. That part of the brain is the prefrontal cortex (PFC). Regions of the PFC are designated by location (e.g., ventromedial PFC, dorsolateral PFC) and other names, like anterior cingulate cortex (ACC) and orbitofrontal cortex (OFC). Each has slightly different primary roles in cognition and decision making. In general, the PFC and the limbic cortex turn down the activity of each other. A strongly frightening stimulus activates the amygdala, which inhibits the PFC and permits rapid response. A powerful PFC, on the other hand, can exert reason over the emotional brain. A balance of the two results in the ideal human who can experience deep emotions like love, fear, and compassion but can also read, write, understand mathematics, and plan for the future. Psychiatric illnesses like depression and anxiety disorders seem to involve in part a disruption in the normal connectivity between the PFC and the amygdala; sociopaths appear to have deficient amygdala responses and therefore do not experience normal levels of fear. In this chapter and throughout this book we will often refer to the rational and emotional brains with the understanding that the two are totally interconnected and influence each other in complicated and not always predictable ways.

Jan B. Engelmann of Emory University and colleagues performed an experiment that examined how the brain reacts to the advice of an expert. While having their brain activity imaged and quantified using functional magnetic resonance imaging (fMRI), subjects were asked to make a series of financial decisions; half of the subjects were given expert advice about which choices to make, and the other half were given no advice. As the researchers predicted, the group given the expert advice tended to conform to what the experts told them to do. Perhaps more interestingly, the study showed that neural activity in parts of the brain that are involved in judging probabilities, including the dorsolateral PFC and the cingulate cortex, was blunted among the subjects who received expert advice compared to those who did not. Furthermore, subjects who did go against the advice of the experts showed increased activity in the amygdala, which is associated with fear responses. The authors of the study concluded, “Taken together, these results provide significant support for the hypothesis that one effect of expert advice is to ‘offload’ the calculation of expected utility from the individual’s brain.”16 In other words, the decision-making parts of our brain turn off when experts tell us what to do. Even more important, defying an expert makes us anxious.

At the same time, if a charismatic leader can make us sufficiently frightened when he or she first gives us misleading or false information, it may set in motion neural processes that inhibit our ability to correct this initial impression. In a fascinating experiment by Micah G. Edelson of the Weizmann Institute of Science in Israel and colleagues, subjects’ brains were imaged with fMRI when they first received false information and then later were given the correct information. The study first showed that a region in the PFC (anterolateral PFC, alPFC) is activated during what the authors call “recovery from social influence.” However, strong amygdala activity during the initial exposure to false information decreased alPFC activity and the ability to correct the initial wrong impression. The study authors concluded,

It is possible that the amygdala activity leads to strongly encoded false memories that dominate the original representations. This in turn may restrict the possibility of recovery. These findings illuminate the process by which errors are, or fail to be, corrected and highlight how social influence restricts subsequent correction, even when that influence is later discredited.17

What this means is that when we are told that GMO foods cause cancer, vaccines cause autism, and there are criminals waiting to murder us if we are unarmed, we probably have an intense fear response egged on in part by amygdala activation. When facts are subsequently provided—GMO foods are not harmful to human health, vaccines and autism are unrelated, owning a gun increases the likelihood that the owner will get shot—the amygdala response strongly inhibits the ability of the PFC to correct the influence of the charismatic leaders. Those leaders who are most successful in making us feel frightened also make us most recalcitrant to the evidence that scientific inquiry provides.

Unfortunately, engagement of the emotional brain can sometimes mean neglect of reason, and this can be especially problematic when trying to assess cold, hard scientific evidence. This is exactly the effect produced by the charismatic leader—an activated emotional brain and a repressed rational brain. Combining the charismatic style of leadership with topics that require careful scientific investigation and interpretation is precisely the recipe for disaster that fuels health denialist groups. And in fact, there are several reasons to believe that charismatic leadership is particularly suited to anti-science movements. Theorists have suggested that charismatic leadership is critical for organizations that need to induce subordinates to make value judgments and to understand the purpose of the organization’s movement as a whole. This kind of leadership is much less necessary in organizations that are focused on instrumental matters, such as which technology to use for patient records in a hospital. A lot of charisma is also necessary if compliance with the leader depends heavily on moral judgment and less on material reward.18 In other words, groups that depend on moral or value judgments require more charismatic leaders to hold subordinates together. This kind of leadership is therefore particularly appropriate for anti-science groups, which, as we have shown, depend not so much on evidence or mechanical facts but instead rely very heavily on emotional appeals and value judgments.

Case Study: Peter Duesberg

In her book on HIV denialism in South Africa, Nicoli Nattrass discusses some striking similarities between the structure of HIV denialist movements and anti-vaccine movements. One of the most significant similarities, Nattrass notes, is the existence in both movements of people she terms “hero scientists.” In the case of HIV denialism, these “hero scientists” include figures such as Peter Duesberg, a once well-respected scientist with experience researching retroviruses who brought the supposed debate about the cause of AIDS to a more public sphere. In the case of anti-vaxxers, the main hero scientist is Andrew Wakefield, the author of the retracted 1998 Lancet article suggesting a connection between autism and vaccines, who publicized the anti-vaccine position using his position as an authoritative, well-respected scientist. As Nattrass points out, both of these leaders have been skillful users of media, strategically utilizing media channels to create skepticism and to make it seem as though there were a genuine scientific debate going on.19

In their book Merchants of Doubt, Naomi Oreskes and Erik M. Conway profile some of the “hero scientists” and reveal a consistent pattern: a once distinguished scientist with outstanding academic credentials seems at some point to go astray, adopting fringe ideas and portraying himself as a victim of the very establishment in which he was once a major leader. For example, one of the leaders of the tobacco industry cover-up of the carcinogenicity of cigarettes was Frederick Seitz:

Seitz was one of America’s most distinguished scientists. A wunderkind who had helped to build the atomic bomb, Seitz had spent his career at the highest levels of American science: a science advisor to NATO in the 1950s; president of the National Academy of Sciences in the 1960s; president of Rockefeller University … in the 1970s.20

Because of his distinguished past, Seitz was able to garner the attention of both the media and Congress and his defense of nicotine was taken seriously in those circles for decades.

Duesberg was similarly preeminent in the scientific world. A PhD in chemistry, he gained international acclaim in the 1970s for his groundbreaking work in cancer research. In 1986, at the young age of 49, he was elected to the National Academy of Sciences and also received an Outstanding Investigator Award from the U.S. National Institutes of Health (NIH). He was widely considered a scientist of great importance worldwide. Yet starting in the late 1980s, Duesberg began publishing articles and eventually a book, called Inventing the AIDS Virus, in which he claimed that the human immunodeficiency virus (HIV) is not the cause of AIDS and proffered multiple conspiracy theories to explain why the scientific establishment disagreed.

In many ways, Duesberg can be regarded as the father of the AIDS denialist movement. The movement reached its most tragic peak with the public denialist stance of South African president Thabo Mbeki, who served from 1999 to 2008, but Duesberg began publishing his theories as early as 1987. Duesberg’s ideas had their greatest influence when he sat on Mbeki’s Presidential Advisory Panel on HIV and AIDS in 2000. Ever since then, Duesberg has been looked to as the “hero scientist” of the HIV/AIDS denialist movement.

One strategy Duesberg has on his side is simply volume. He has published a 500-page book arguing that the HIV/AIDS epidemic is basically a myth created by greedy virologists; he has also continued to write prolifically on the subject in every venue that will still have him. Most reputable scientific journals now refuse to publish Duesberg’s work, so he has switched to bombarding fringe journals, Internet sources, and popular media with his theories. His most recent 2014 article on the subject was published in the Italian Journal of Anatomy and Embryology.21 As noted earlier in this chapter, the peripheral route of persuasion often relies on the mere volume of arguments proffered. Duesberg’s prolific publishing and long-winded book reveal an interest in appealing to this form of persuasion.

One of Duesberg’s most potent strategies, however, is to present his side of the argument as something that people really value: scarce information. Duesberg repeatedly makes it seem as though he is giving his readers “inside” information that is precious and generally unavailable. For example, he draws a long analogy between the supposed “cover-up” of the “fact” that HIV does not cause AIDS and the SMON incident in 1960s Japan. For a long time, SMON (subacute myelo-optic neuropathy), a condition causing paralysis, blindness, and sometimes death, was thought by scientists to be caused by a virus. Eventually, however, they discovered that SMON was really caused by the drug clioquinol, commonly used as prophylaxis for traveler’s diarrhea. Duesberg uses faulty logic to “demonstrate” that this occurrence must be the same thing that is going on with HIV/AIDS, that AIDS is really caused by toxic antiretroviral drugs, that HIV has nothing to do with it, and that scientists are harping on the viral “hypothesis” in order to keep virology, a dead field in Duesberg’s opinion, alive. He draws inappropriate comparisons between the SMON case and the discovery of HIV as the cause of AIDS in order to suggest that the same “mistake” must be going on with AIDS.

The troubling logical errors embedded in these comparisons aside, Duesberg uses a very powerful rhetorical and persuasive strategy in telling the SMON story. He makes the reader feel as though this information, presumably kept relatively quiet by the U.S. government, is a well-kept secret and access to it is scarce. Duesberg therefore offers his readers an inside glance at a source of information that would otherwise be unavailable to them. Before making a sweeping comparison between SMON and the HIV/AIDS epidemic, Duesberg notes, “Once the truth about SMON could no longer be ignored, the episode dissolved into lawsuits for the thousands of remaining victims. This story has remained untold outside of Japan, ignored as being too embarrassing for the virus hunters. It deserves to be told in full here.”22 By making this statement, Duesberg essentially tells his readers: “Here is some juicy, private information that has been kept from you for a long time, and I am finally going to make it available to you.”

When people feel that a piece of information is scarce or difficult to obtain, they tend to feel more persuaded by it. This phenomenon has been well established in the realm of material goods: scarce items generally sell for higher values in the free market, reflecting the way in which people value scarcity. More recently, psychologists have begun to test whether the same is true about less tangible items, such as information. One experiment illustrates the principle quite well. The owner of a beef-importing company called the company’s customers, buyers for supermarkets and retail food outlets, to ask them to purchase beef under three different conditions. One group simply heard a standard sales presentation. A second group heard the sales presentation and were told that beef would soon be in short supply. A third group heard the sales presentation, were told that beef would soon be in short supply, and were also told that the information that the beef would soon be in short supply was exclusive information that no one else but them knew. The second group bought the same amount of beef as the first group. However, the third group, who were led to believe that they had been privileged with the “exclusive” news that the beef supply would soon run short, purchased six times as much beef as the first group.23 The results of this experiment suggest the strong effects not only of commodity scarcity but also of the illusion of information scarcity on people’s ability to be persuaded.

Like cult leaders, Duesberg also creates an extremely strong sense of “them” to pit the “us” against. This strong formulation of group identity has a significant ability to alter the way we think about things that are as straightforward as the probability of a particular event happening to us. Strong group membership can distort even a number that seems immune to manipulation. Duesberg achieves this primarily by making mainstream science look like a relentless form of consensus culture:

Few scientists are any longer willing to question, even privately, the consensus views in any field whatsoever. The successful researcher—the one who receives the biggest grants, the best career positions, the most prestigious prizes, the greatest number of published papers—is the one who generates the most data and the least controversy. The transition from small to big to mega-science has created an establishment of skilled technicians but mediocre scientists, who have abandoned real scientific interpretation and who even equate their experiments with science itself.24

Duesberg here constructs a new view of the scientific world: a consensus culture in which all the revered and established researchers operate simply as individual replicas of some dogmatically accepted central truth. By minimizing the intense kind of conflict, disagreement, and rigorous debate that actually goes on in the scientific community, Duesberg makes the “enemy” look much more unified than it actually is. Unifying the enemy ultimately functions as a strategy to then unify the opposition. This is in fact a favorite strategy of charismatic leaders, and it has a significant effect on people’s ability to parse the information that is truly coming from the “other side.” Once the “enemy” has been consolidated, followers of the charismatic leader have much less access to the individual arguments of those in the opposing camp and have lost the ability to analyze information from these camps unfiltered by the leader.

Duesberg is often cited as a forefather of the HIV denialist movement, and his idea that HIV does not cause AIDS and that the syndrome is instead really caused by illegal drug abuse and antiretroviral medication, which functions as a central tenet of the HIV denialist movement, has even been termed the “Duesberg hypothesis.” Look at any HIV denialist website, paper, book, or video and all will cite Duesberg as a major figure in the formation of the movement. He therefore earns the title of a movement leader. Beyond that, we are arguing that he is in fact a “charismatic leader,” due to the rhetorical tactics he has used, the popularity of his ideas among HIV denialists, and the way in which he has been worshipped and given undue amounts of authority by this group of denialists. In order to understand HIV denialism, it is essential to understand the charismatic leaders at the front of the movement. In a similar manner, in order to understand the anti-vaccine movement, it is vital to understand the charismatic leaders at its helm. Without this kind of understanding, our view of these movements is by definition incomplete.

Case Study: Andrew Wakefield

Ever since he published a paper claiming a link between vaccines and autism in The Lancet in 1998, Andrew Wakefield has been a major champion of the anti-vaccine movement, frequently speaking at its rallies and conferences. Like Peter Duesberg, Andrew Wakefield had a distinguished background as a well-respected gastroenterologist and medical researcher. When it was uncovered in 2010 that Wakefield had been accepting money from a lawyer preparing litigation against vaccine manufacturers to publish certain findings, The Lancet paper was retracted and Wakefield eventually lost his medical license. In addition to obvious conflicts of interest, the paper was scientifically unsound, with only 12 nonrandomly selected subjects and no comparison group.25 Nevertheless, a strong contingent, including Wakefield himself, has persisted in the belief that his findings are in fact the scientific truth and that his demise is the product of a conspiracy propagated mostly by the medical community’s slavish deference to the pharmaceutical manufacturers of vaccines. This is, of course, the type of conspiracy theory we discussed in chapter 1.

Wakefield has many of the characteristics of a strong charismatic leader, and these characteristics have served him well as the “scientific” spokesman for the anti-vaccine movement. If Jenny McCarthy represents the “human,” parental perspective of the movement, Wakefield is supposedly the science behind it. Wakefield’s method is to characterize himself as a victim while emphasizing the desperate, inquisitional strategy of the opposition, which consists of the public health community, the American Academy of Pediatrics, and the pharmaceutical industry. In an interview with Anderson Cooper on CNN, Wakefield refers to the journalist who exposed him, Brian Deer, as a “hit man” who was being paid by some unidentified “them.” He calls the retraction of his paper by The Lancet and his condemnation by the British General Medical Council a “ruthless pragmatic attempt” to cover up his own sincere investigation of vaccine damage. In an interview with Robert Scott Bell, Wakefield refers to the medical community as a “religion” that no one is allowed to question and calls the allegations against him an “attack.”26 And during the American Rally for Personal Rights, at which Wakefield was the keynote speaker, he proclaimed that the allegations against him were “a futile public relations exercise that will fail.”27 These examples show Wakefield expending quite a bit of energy crafting an identity for his ruthless, calculating enemies. As we have seen, the creation of a strong enemy is a common tactic of charismatic leaders that helps solidify the group identity of his followers.

Wakefield also does an effective job of justifying the need for a charismatic leader in the first place. He frames the situation as a battle between the interests of the patients and the interests of the public health community and pharmaceutical industry. He suggests that the moment is ripe for a “social revolution,” embodying the charismatic leader’s position on the fringes of mainstream society as an instigator of revolution, as originally identified by Weber. Like the head of the National Rifle Association, Wayne LaPierre, Wakefield broadens the playing field from one limited to a specific issue—vaccines or guns—to a much larger issue of personal rights and freedom. He calls autism a “worldwide pandemic” and refers to vaccines as an “environmental catastrophe.” The choice is simple, he asserts: we can either attend to our patients or walk away. The scientific community must choose between “fidelity and collusion,” he proclaims. And parents must demand the right to choose “for the sake of the future of this country and the world.” All of this rhetoric effectively establishes Andrew Wakefield as a charismatic leader. A victim of the status quo, Wakefield has suffered the consequences of choosing the correct path and has lost his medical license, his research position, his reputation, and, he says, even his country. Yet all of this, he claims, is nothing compared to the suffering of parents of children with autism and other developmental disorders. Like the true charismatic leader, he demonstrates that his overwhelming commitment to the cause is so strong that he was willing to lose not only his license but also his home. He further emphasizes his commitment to the cause through an elaborate process of self-effacement, in which he calls himself “irrelevant,” stating: “It doesn’t matter what happens to me. It is a smokescreen.”28 The real issue here is not his career or his reputation but the pure goal of saving the children. Wakefield speaks with the language of a religious zealot. He does not talk science—even he has been unable to replicate his “findings” in a separate experiment. Everything he says rests on 12 children, some of whom did not actually have autism and others who had it before they received the MMR vaccinations.29 In retrospect, there is absolutely no science involved here.

Wakefield is certainly a powerful communicator, with a great deal of verbal eloquence, strategic use of hand gestures and facial expressions for maximum animation, and the ability to create a sense of urgency. Wakefield’s ability to persuade lies largely in his charismatic qualities, especially his ability to unify “us” against “them” and to invent a crisis that requires immediate action and the guidance of a cast-out, self-sacrificing, superman leader. Moreover, Wakefield is the face of empathy standing in opposition to a harsh scientific and medical world that supposedly panders to the whims of the pharmaceutical industry. Wakefield’s goal is to protect the children and to grant parents their freedom to choose, and he makes very clear that he is willing to sacrifice everything to achieve this goal. In his self-sacrifice, his call to revolution, and his existence on the fringes of mainstream science, Wakefield embodies the charismatic leader and all of the psychological consequences that accompany this figure.

Case Study: Jenny McCarthy

Jenny McCarthy may not seem like the most obvious choice here, since she does not have the credentials of a scientist such as Peter Duesberg or Andrew Wakefield. Nonetheless, she is a major leader of the anti-vaccine movement, and there are reasons to believe that her style is indeed charismatic. First of all, she is, to some, physically attractive. This feature helps her get attention. Watching a few of McCarthy’s media appearances and her rallies in places such as Washington, DC, we can see that she embodies many of the features of charismatic leaders. Her tone of voice is engaging and captivating. Her facial expressions are animated. She makes direct, sustained eye contact. She is expressive with her hands, and she even smiles and laughs as she finds appropriate in order to draw her audience toward her.30 She is certainly extraverted, which leadership theorists have maintained is a particularly charismatic feature for female leaders.31

We need not look much further than McCarthy’s speech at the 2008 “Green Our Vaccines” rally in Washington, DC, to see many of these principles at play. For one thing, videos of the rally show an enamored crowd chanting McCarthy’s name as she takes her place at the podium. This kind of devotion to a leader suggests a charismatic leadership style, in which the leader is viewed as someone with special powers and as someone to be admired as a superhero. The sound of “Jenny! Jenny!” being shouted in unison is certainly a striking one, and it already indicates, before McCarthy even starts speaking, that she at least has the kind of following that a charismatic leader must attract.

Once she starts speaking, McCarthy employs a whole host of charismatic leader strategies, many of which are designed to stimulate emotional responses in her followers. As we noted earlier, a movement designed to reinforce a certain point of view—one that is based not in facts but in “values” that supposedly promote caring for our children and taking their health into our own hands—is particularly prone to the skills of charismatic leaders. The leader in this type of situation can make all the difference, as opposed to one involving a group deciding on something more mundane and mechanical. McCarthy begins by insisting that everyone in the crowd come closer to the podium. She takes multiple strategic pauses and speaks in a kind of storytelling voice with extreme animation and constant fluctuations in her tone. She exhorts parents to be empowered and to take the safety of their children into their own hands. She refers to her audience multiple times as “you guys” and encourages everyone to show pictures of their children to the media cameras at the rally. Finally, she calls this a time in history for citizens to change the world and to fight the powers that be.32 Once again, and like many charismatic leaders, she broadens the scope of the appeal beyond the main issue to one that involves universal concepts of justice and freedom, things with which almost no one disagrees. Thus, the audience gets confused—is it really focused on the cause of autism or on freedom and justice for all? Any hint at science is notably absent from the discourse.

In this short speech, McCarthy displays many signs of charismatic leadership. By drawing the crowd closer to her, she creates an even stronger sense of “us,” inviting her followers in to create a more unified group. Similarly, by referring to her followers as “you guys,” she creates a sense of unity among them to fight against the opposition. One can almost visualize the outpouring of oxytocin from the hypothalamus of every person in attendance, making them all feel loved and secure.33 Her tone of voice and strategic, dramatic pauses draw listeners in and resemble the kind of sustained eye contact and animation that are often identified as central features of the charismatic leader’s style. Perhaps most interesting, in closing her speech, McCarthy calls upon her followers to embrace this as a time in history when citizens can rise up, change the world, and fight the powers in charge. As discussed earlier in this chapter, Max Weber defined the charismatic leader as someone who takes charge from the fringes and leads followers through a form of revolution. Here, McCarthy takes advantage of this formulation and specifically crafts the movement as a revolution of sorts. She encourages her followers to view this as an opportunity to challenge and perhaps overthrow those in power and to embrace this admittedly fringe view as a chance to change the world. This kind of revolutionary language is certainly in line with the prerogatives of the charismatic leader.

Although Jenny McCarthy may lack the scientific training of an Andrew Wakefield or Peter Duesberg, she does have many of the features of a charismatic leader that, as we have shown, create a very specific dynamic with her followers. Understanding the anti-vaccine movement is therefore dependent on understanding how precisely these leaders lead and the psychology involved in responses to different types of leadership.

Case Study: Gilles-Éric Séralini

The English-language trailer to the documentary Tous Cobayes? begins with a startling claim: after the atomic bomb was dropped on Hiroshima at the end of the Second World War, the U.S. Department of Energy had a lot of money and staff and did not know what to do with them, so the agency decided to embark on the genome sequencing project, which led directly to nuclear energy and genetically modified organisms (GMOs).34

You can view this trailer on GMOSeralini, a very attractively designed website that represents the ideas of its founder, French scientist Gilles-Éric Séralini. Séralini is the darling of the anti-GMO movement. Like Duesberg and Wakefield, he has a very impressive résumé. Since 1991 he has been professor of molecular biology at the University of Caen. His earlier work was in endocrinology, and his lab produced important findings about an enzyme called aromatase that were published in fairly high-level journals. It may be the connection between this enzyme and an herbicide called glyphosate, discovered and marketed by the Monsanto Corporation under the trade name Roundup, that attracted Séralini’s interest, but for whatever reason he turned his attention away from endocrinology to testing GMOs in rats. In 1997 he began advocating for a moratorium on GMO foods. He also formed the Committee for Research and Independent Information on Genetic Engineering (CRIIGEN), an advocacy organization that supports legislation to regulate GMOs in foods.

To understand Séralini’s brand of charismatic leadership, it is useful to know a bit about the anti-GMO movement. The battle against GMOs, which has been waged for 2 decades, is absolutely fascinating for anyone trying to comprehend the roots of science denial. There is actually no direct evidence that anyone has ever been harmed by eating food that contains genetically modified ingredients, and we have been consuming them for a very long time—centuries, in fact, from one vantage point. Nevertheless, individuals and groups have risen up passionately condemning them, demanding they be banned or at the very least that foods containing them be so labeled.

Vermont was the first state to pass a law mandating the labeling of food that contains GMOs. Somehow, opposing GMOs has become a left-wing cause. Whole Foods refuses to stock them, and Greenpeace demands their abolition. But as a graphic in The Economist noted, there are 3.1 million deaths in the world every year from malnutrition and 0 deaths every year from genetically modified food. The Economist article concludes, “The outlook is unappetising. Food scares are easy to start but hard to stop. GMO opponents, like climate-change deniers, are deaf to evidence. And the world’s hungry people can’t vote in Vermont.”35 In Hawaii, a debate about GMOs brought out advocates on both sides, but as New York Times reporter Amy Harmon observed, “Popular opinion masqueraded convincingly as science, and the science itself was hard to grasp. People who spoke as experts lacked credentials, and G.M.O. critics discounted those with credentials as being pawns of biotechnology companies.”36

Into this fray emerged a charismatic leader. Gilles-Éric Séralini became famous—some would say infamous—in 2012 with the publication of his paper “Long Term Toxicity of a Roundup Herbicide and a Roundup-Tolerant Genetically Modified Maize” in the journal Food and Chemical Toxicology.37

In the study, Séralini and his colleagues at Caen fed rats for 2 years with Monsanto’s Roundup and Roundup Ready corn (formally known as glyphosate-resistant NK603 maize). They reported that compared to control rats not given the herbicide or the GMO corn, those fed Roundup and Roundup Ready developed more tumors, had liver and kidney abnormalities, and died sooner. The conclusion in the abstract reads, “These results can be explained by the non linear endocrine-disrupting effects of roundup, but also by the overexpression of the transgene in the GMO and its metabolic consequences.” Not exactly easy reading for the nonspecialist, but essentially asserting that the two products damaged the rodents’ hormonal and metabolic systems. Séralini introduced the paper at a press conference of select journalists who were required to sign a confidentiality agreement prohibiting them from discussing the study with other scientists. This unusual maneuver prevented consenting journalists from obtaining any comments from independent scientists. Furthermore, at the press conference Séralini introduced his new book, OGM, le vrai débat (GM foods, the real debate), and a film about the study. Séralini appeared to be on an aggressive public relations adventure, not something scientists commonly get into when they publish a paper in a scientific journal.

The paper won immediate praise and dire warnings from anti-GMO organizations. The European Commission ordered the European Food Safety Authority (EFSA) in Parma, Italy, to evaluate it, and both Russia and Kenya put bans on GMOs.

Yet even more robust was the wave of derision the paper received from scientists. Just reading the abstract conclusion quoted in the previous paragraph gives an immediate hint that there might be a problem. What Séralini et al. did was an association study in which they claim to have observed a relationship between consuming GMO corn and getting tumors. But the wording of the abstract implies that they found a biological basis for this supposed association, what they call “metabolic consequences.” In fact, no such biology was discovered in the study or reported in the paper. Indeed, it is hard to understand what plausible biology could link a gene for herbicide resistance to all of these effects. While again the scientific details behind that assertion are complex, what we are pointing out is that the gene put into Roundup Ready corn to make it resistant to the herbicide Roundup should not be able to affect any other system, including a rat’s metabolism. If it does, then a lot more data and explanation would have had to be included in the paper to demonstrate the claim.

But more important, a host of scientists immediately noticed all kinds of problems with the way the study was designed and analyzed. There were clearly too few rats, the strain used is notorious for developing spontaneous tumors at a high rate, and unusual statistics were employed, leading to widespread insistence that the study could not possibly have found any actual link between the GMO and the tumors. In 2013, after Séralini refused to withdraw it, the journal announced it was retracting the paper. The retraction notice reads,

A more in-depth look at the raw data revealed that no definitive conclusions can be reached with this small sample size regarding the role of either NK603 or glyphosate in regards to overall mortality or tumor incidence. Given the known high incidence of tumors in the Sprague-Dawley rat, normal variability cannot be excluded as the cause of the higher mortality and incidence observed in the treated groups… . Ultimately, the results presented … are inconclusive.38
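
The retraction’s point about sample size and strain is easy to see with a little arithmetic. The following minimal simulation is our own illustration, not anything from the paper or the journal; the group size of 10 and the 70 percent spontaneous tumor rate are assumptions chosen only to be roughly in the range discussed for 2-year studies of tumor-prone Sprague-Dawley rats. Under those assumptions, one group “outscoring” the other by two or more tumors happens by chance alone in roughly a quarter of simulated experiments.

```python
# Minimal Monte Carlo sketch (illustrative assumptions, not the study's data or analysis):
# with small groups of a tumor-prone strain, apparent tumor excesses arise by chance.
import random

random.seed(42)

GROUP_SIZE = 10        # rats per group (assumed, chosen to be small)
BASELINE_RATE = 0.70   # assumed spontaneous tumor probability, identical in both groups
TRIALS = 100_000       # number of simulated experiments

def tumor_count(n: int, p: float) -> int:
    """Return how many of n rats develop a tumor spontaneously with probability p."""
    return sum(random.random() < p for _ in range(n))

# Count how often a "treated" group looks at least two tumors worse than its control
# purely by chance, even though both groups share the same underlying tumor rate.
apparent_excess = 0
for _ in range(TRIALS):
    control = tumor_count(GROUP_SIZE, BASELINE_RATE)
    treated = tumor_count(GROUP_SIZE, BASELINE_RATE)
    if treated - control >= 2:
        apparent_excess += 1

print(f"Chance of a spurious excess of 2+ tumors: {apparent_excess / TRIALS:.2f}")
# With these assumed numbers the result is roughly 0.23, i.e., about one simulated
# "experiment" in four shows an apparent effect where none exists.
```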

Séralini and his colleagues of course protested the retraction, accusing the journal of “double standards,” and defended the study.39 They also noted that they had sued Monsanto and won the right in court to see raw data from the company’s studies, which had been claimed to show no such harm from Roundup Ready. On the contrary, Séralini stated, his analysis of the data confirmed that the company had obscured harmful effects.

Now that the battle was on, Séralini became a self-declared victim of Monsanto and the scientific establishment. In video interviews he is an impassioned orator who declares himself in favor of “food democracy” and “transparency.” He decries that pesticides are “inside GMOs” and fighting this “should be a matter of revolution.”40 Note once again the appeal to revolution used also by the other charismatic leaders discussed in this chapter and identified by Weber as integral to charismatic leadership. In an interview on the Internet he said, “It is the same problem as mad cow disease” and “GMOs can be used to make war, and there are two methods—the soft one and the tough one.”41 Séralini now has all the credentials of a charismatic leader of an anti-science group. He has a legitimate scientific résumé, he has crafted himself as the victim of imperious and malevolent forces that threaten our right to choice and safety, and he has broadened the agenda to encompass such widespread ills as nuclear holocaust and war. He delivers a powerful message on the Internet, in books, and in documentary films. His science is highly suspect, but his abilities as a speaker and persuader are nonpareil.

Against this charisma, will anyone heed the words of the following editorial written by scientists in 2013?

New technologies often evoke rumors of hazard. These generally fade with time, when, as in this case, no real hazards emerge. But the anti-GMO fever still burns brightly, fanned by electronic gossip and well-organized fear-mongering that profits some individuals and organizations. We, and the thousands of other scientists who have signed the statement of protest, stand together in staunch opposition to the violent destruction of [work testing] valuable advances such as Golden Rice that have the potential to save millions of impoverished fellow humans from needless suffering and death.42

Case Study: Wayne LaPierre

There are statements that Wayne LaPierre, executive vice-president and CEO of the 5-million-strong National Rifle Association (NRA), makes with which almost no one could disagree.

“Freedom has never needed our defense more than now.”

“We’re worried about the economic crisis choking our budgets and shrinking our retirement. We’re worried about providing decent healthcare and a college education for our children. We fear for the safety of our families—it’s why neighborhood streets that were once filled with bicycles and skateboards, laughter in the air, now sit empty and silent.”

“Political dishonesty and media dishonesty have linked together, joined forces, to misinform and deceive the American public. Let’s be straight about it—the political and media elites are lying to us.”43

We have taken these comments from LaPierre’s 2014 address to the Conservative Political Action Conference out of context to make a point about charismatic leaders: one of their signatures is to take whatever issue with which they are involved and turn it into something that shakes every aspect of our fundamental existence. Notice that both left-wing and right-wing activists would likely agree with all of these statements: everyone is concerned with incursions on our freedom, the economy, healthcare, and declining urban areas.

One can easily imagine an advocate for almost any cause articulating the same sentiments as those of LaPierre. We have already pointed to Séralini’s assertion that GMOs are an assault on our freedom. A corrupt media that refuses to recognize the truth is the mantra of almost every activist who sees him- or herself as a victim. In this case, we can insert the notion that the solution to all of these problems is owning a gun and we have LaPierre’s version. With remarkably little effort, one could replace all references to guns in his speech with mentions of any other cause, say rights for LGBT people—a cause one doubts LaPierre has much enthusiasm for—and the piece would work as well in arousing us to gross injustice and personal danger. Thus, LaPierre, like many charismatic leaders, actually distracts us from the specific issue at hand—in this case, the wisdom of personal gun ownership—and moves us to a world of platitudes and universal concerns that rouse our emotions without taxing our analytical skills.

Born in 1949, Wayne LaPierre received an undergraduate degree from Siena College and a master’s degree from Boston College. He began working for the NRA in 1977, and in 1991 he became its head. According to the Educational Fund to End Gun Violence, he is paid about $1 million a year by the NRA. It is difficult to find accurate biographical information about him on the Internet, which lends him an aura of mystery. Several sites claim that he was deferred from military service during the Vietnam War because of an emotional condition, which would be ironic given his insistence that gun violence is entirely the result of untreated mental illness; however, these are mostly sites antagonistic to the NRA, and the claim is difficult to verify.44

Whether or not LaPierre did have emotional problems that kept him out of the army, he and the NRA have latched onto mental illness as one of their key rallying cries, casting it as the basis of gun violence. The problem, he repeatedly tells us, is not people having guns; it is mentally ill people having guns that we should worry about. His solution is to make sure the mentally ill are prevented from owning guns. LaPierre and the NRA insist that this will solve the gun violence problem and leave the bulk of the population, who do not suffer from mental illness, free to enjoy their weapons in peace.

Many people have the notion that “crazy” people are dangerous. Put a gun in the hands of someone with mental illness and a murder is likely to happen, they think. As Heather Stuart noted in an article in the journal World Psychiatry, “Indeed, the global reach of news ensures that the viewing public will have a steady diet of real-life violence linked to mental illness.”45 In a recent survey, most respondents favored requiring states to report to a national gun control database the names of all people who have either been involuntarily committed to a hospital for psychiatric treatment or been declared mentally incompetent by a court.46 In general, the one type of gun control legislation that most people support is the variety that restricts seriously mentally ill people from possessing firearms.

While we do not think there is anything necessarily wrong with preventing people with serious mental illness from having guns, it is not a solution to gun violence for at least two reasons:

1. Most gun homicides are committed by people who do not have mental illness.

2. If one could stop mentally ill people from owning guns, it might reduce the suicide rate but would have almost no impact on the homicide rate.

The mass of available data concerning the propensity to shoot a human being instead of a hunted animal does not make for easy reading. It involves complicated statistical analyses of data collected from large populations. Because no one can do a randomized clinical trial and control for all of the confounding variables involved in who does or doesn’t own a gun, figuring out who it is that shoots people requires some very messy experimental designs and fancy mathematical tests. This situation is made all the more difficult by a law Congress passed that effectively bars the CDC from funding research on gun violence. These are generally not the kinds of things that charismatic leaders want to delve into; LaPierre prefers basing his arguments on “facts” like the ones he laid out in a 2014 speech:

There are terrorists and home invaders and drug cartels and carjackers and knockout gamers and rapers, haters, campus killers, airport killers, shopping mall killers, road-rage killers, and killers who scheme to destroy our country with massive storms of violence against our power grids, or vicious waves of chemicals or disease that could collapse the society that sustains us all.47

If LaPierre were really interested in the data, he would find that some of it at first seems to support his position: recent data do support the popular belief that people with some forms of mental illness are more likely than the general population to commit violent acts. These tend to be people with psychotic illnesses who suffer from paranoid delusions.48 One recent study showed that delusions of persecution, of being spied on, and of conspiracy were linked to violent acts, but only when the perpetrator was especially angry, such as during an argument.49 These kinds of delusions can make a person feel threatened to the point of believing that a violent defense is necessary. Such delusional individuals tend to have the paranoid form of schizophrenia, to suffer from bipolar disorder and be in the midst of a psychotic manic episode, or to be intoxicated with stimulant drugs like amphetamines and cocaine.

But that is only the first part of the story, which becomes considerably more complicated and, at the same time, more interesting. First, even granting that a segment of the population with mental illness is statistically more likely to commit violent acts than the general population, the fact remains that mass murders by mentally ill people are rare events—too rare, experts agree, to be predictable.50 Newtown was a great tragedy, but it accounted for only 26 of the approximately 10,000 people killed in gun homicides in 2012. It is unlikely, for example, that any mental illness registry would have identified the shooter in Newtown. In fact, only 3-5% of violent acts are committed by people with mental illness, and most of these do not involve guns.51 The American Psychiatric Association has pointed out that the vast majority of people who commit violent acts are not mentally ill.52 When mentally ill people do commit violent acts, they usually do so in the context of an interpersonal conflict, not a well-planned mass murder.53
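A rough back-of-the-envelope calculation, using only the figures just cited, makes the scale of the mismatch plain. (This is our simplification for illustration: the 3-5% estimate refers to violent acts in general, not to gun homicides specifically.)

\[
0.03 \times 10{,}000 \approx 300 \qquad\text{and}\qquad 0.05 \times 10{,}000 \approx 500
\]

Even if every one of those acts had been a gun homicide, a perfectly enforced mental-illness restriction would have touched at most roughly 300 to 500 of the approximately 10,000 gun homicide deaths in 2012, leaving some 95% of them unaffected; and since most violent acts by people with mental illness do not involve guns, the true number would be smaller still.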

There is one form of mental illness that is clearly associated with an increased risk of violence and that should be attracting much more of LaPierre’s and the NRA’s attention: alcohol and drug abuse. Studies have repeatedly shown that drinking significantly increases the chance that a person will commit a violent act.54 Neighborhoods with a high density of liquor stores have higher rates of violence than those with fewer liquor stores.55 Alarmingly, people who own guns are more likely than non-gun owners to binge-drink, to drink and drive, and to drink heavily.56 Heavy alcohol use is particularly common among people who carry guns for protection. Alcohol is also well known to be a significant factor in many cases of suicide by firearm.57 As an editorial in The New York Times pointed out,

Focusing on the mentally ill … overlooks people who are at demonstrably increased risk of committing violent crimes but are not barred by federal law from buying and having guns. These would include people who have been convicted of violent misdemeanors including assaults, and those who are alcohol abusers.58

Given all of these data, why does Wayne LaPierre continue to focus on the issue of mental illness as the root of all gun violence? After all, the Second Amendment, which he holds to be so precious, does not say “The right of the people—except the ones with a psychiatric diagnosis—to keep and bear Arms, shall not be infringed.” Why wouldn’t this kind of restriction also represent a “slippery slope” to LaPierre, the first step toward taking everyone’s guns away?

Clearly, LaPierre’s supposedly inflexible support of the Second Amendment is not, in fact, absolute. Instead, he does something characteristic of charismatic leaders: he designates an “out” group as the enemy. Rightly or wrongly, LaPierre senses that the mentally ill are not a sympathetic group to most Americans, and therefore he sees no risk in branding them as the true source of the problem. It is not guns or people who own guns that we need to worry about, he tells us: it is “crazy” people. Organizations like the National Alliance on Mental Illness (NAMI) struggle to defend the rights of people with psychiatric illness and to point to the data demonstrating that this population is, in fact, a very minor contributor to the problem of gun violence in the United States. But data are not LaPierre’s forte. He has crafted a potent tool of blaming journalists and liberals for attempting to violate a basic American right and thereby dooming us to face, unarmed, criminals, terrorists, and the mentally ill. Having frightened us, he can then drive home his point with statements like “The political elites can’t escape and the darlings of the liberal media can’t change the God-given right of good people to protect themselves.”59

Like all of the charismatic leaders we have profiled, LaPierre does not try to convince us with data or facts of any kind. He could cite some studies that support his point of view, even if they are in the minority among scientific reports and not given credence by the majority. But instead he scares us and then tells us we are on the side of freedom, God, and our children if we do what he says. For many people, these are hard arguments to resist.

What Links Charismatic Leaders?

Can we identify any patterns that link these charismatic leaders? Naomi Oreskes and Erik M. Conway attempted to do this in their book Merchants of Doubt, which discusses the shameful behavior of scientists in defending the tobacco industry and disputing the reality of climate change. They point out that although the charismatic leaders misleading us are often scientists with impressive credentials, like Duesberg and Séralini, these scientists generally have not done any scientific work in the area in which they are declaiming. With respect to scientists who tried to convince us that cigarette smoking is not harmful, they note,

Over the course of more than twenty years, these men did almost no original scientific research on any of the issues on which they weighed in. Once they had been prominent researchers, but by the time they turned to the topics of our story, they were mostly attacking the work and the reputations of others.60

In fact, many of these scientists seem to have fallen off the rails. Although Séralini did publish a study on a GMO product, his credentials were never in this kind of toxicology, and he apparently got almost everything about the study wrong. Duesberg was past his most productive scientific days when he began railing against HIV as the cause of AIDS, and he was trained in chemistry, not virology or medicine. Wakefield, a gastroenterologist, should never have been given credence as a knowledgeable source on a disorder like autism that involves the central nervous system. Neither LaPierre nor McCarthy has any scientific credentials. This pattern will not hold in every case, but for these five leaders, their opinions are not actually “expert,” despite credentials in loosely related fields.

Another tactic of these charismatic leaders is to vociferously accuse the rest of the scientific community of bullying them. Again, Oreskes and Conway give the example of Frederick Seitz, the scientist hired by the tobacco industry who spent years trying to convince regulatory authorities and the public that smoking and lung cancer are unrelated. Seitz of course was roundly criticized and ostracized by mainstream scientists as the connection became all too clear. “Seitz justified his increasing social and intellectual isolation by blaming others. American science had become ‘rigid,’ he insisted, his colleagues dogmatic and close-minded.”61

As we have seen in LaPierre’s case, charismatic leaders actually distract us from the core issues by avoiding the messiness of data analysis and broadening their rhetoric to universal concerns about freedom and justice. They try to scare us, telling us that we are, like them, sorry victims of dastardly forces, and they lay out for us the enemy we can unite against: big companies, liberals, other scientists, and the mentally ill. Against all of this, science has only rigorously designed and conducted experiments, data analyses with equations that can fill a page, and the seemingly endless presentation of new facts. It doesn’t seem like a fair fight.

I Know You’re Wrong, but I’m Going to Copy You Anyway

No discussion of the charismatic leader would be complete without considering the psychology of the group that forms around him or her. As we discussed earlier in this chapter, many theories of leadership focus on the ways in which followers’ perceptions of a leader determine his or her actual authority. Once we think about the intensive ways in which the charismatic leader formulates an “us” as opposed to a “them,” we must then examine what happens to people’s decision making and rational thinking capacities when they join groups.

When discussing groups, it is essential to recognize that, as Gestalt theorists have argued, the group is more than the sum of its parts. Groups have higher-order emergent properties, which are characteristics that can be observed only when the group comes together and not when considering the members of a group individually. Group focus on a particular task or on the message of a speaker is a good example of an emergent property. Engagement on the individual level is not the same as the engagement of the group as a whole. This kind of emergent property is something that defines the group as different from the aggregation of its individual members.62

But how do individuals join groups, and what psychological process occurs as they identify increasingly with the group? Many social psychologists have noted that the simple act of individuals categorizing themselves as group members is enough to cause group behavior and all of the psychological processes that accompany group formation.63 This means that it is relatively easy for a group to form—in fact, group formation represents one of our most natural inclinations as humans. In today’s world, triggers for group psychology are particularly abundant, considering how easy it is to join a “group” with a click of a button on Facebook, LinkedIn, Twitter, and many other forms of social media. The ability to join a “Green Our Vaccines” or “No GMOs” group on Facebook removes some of the barriers that exist in traditional group formation, such as geographic location and time commitments. If psychologists are correct that this simple act of self-categorization as a group member is enough to produce group behavior and group psychology, then we can begin to realize how important it is to conceive of some of these health denialist movements as “group” activities and how appropriate it is to apply principles of group psychology to them.

The first step in the process of becoming identified as a group member has often been called “depersonalization”: the self comes to be seen in terms of membership in a group. This process leads people to behave in terms of a self-image as a psychological representative of a group rather than as an individual.64 Joining the group is an extremely adaptive human behavior. It allows us to make sense of many worldly phenomena that might otherwise be complex or confusing. In particular, group membership allows for social reality testing, in which we can move from an opinion such as “I think global warming is a serious problem” to an affirmation that “Global warming is a serious problem” by looking to the opinions of our fellow group members as a test of the idea.65 To some extent, we suspend the imperative to dive deeply into an issue in order to form an opinion when we join a group because we assume that the group has already done this. The more impressive-sounding the group’s name, the more persuasive this assumption becomes. “National Rifle Association” sounds more official than “Gun Owners’ Club,” for example.

It seems logical that a person’s identification with a group would be shaken when groups with countervailing views crop up, but this is not generally the case. Group identities paradoxically become stronger as people encounter groups that seem opposite to or very different from their own. For example, in one study, two researchers organized people into either same-sex pairs (male-male or female-female) or four-person groups (two males and two females). Subjects were more likely to define themselves in gender-based terms in the four-person group than in the same-sex pairs. Once women were confronted with men, they began to identify more strongly as women than when they were simply paired with another woman.66 An anti-vaccine group will become more strongly identified as anti-vaccine when faced with a pro-vaccine group, even if the latter offers evidence to support its opinions. This is a basic tenet of group psychology. Thus, when charismatic leaders emphasize the presence of the opposition and even try to engage with it, they are effectively stimulating patterns of group psychology that strengthen group identity in the presence of a group perceived as very different.

Once a group identity is formed, people tend to want to conform to the group’s opinions, even when those opinions are obviously wrong. A few classic experiments have demonstrated the power of the conformity effect. In one early experiment, Muzafer Sherif used the autokinetic effect to demonstrate how group conformity can influence individual perceptions and behavior. The autokinetic effect is an optical illusion in which a stationary pinpoint of light seems to move when projected in a dark room. Generally, every individual has a different perception of how much the light moved, and these perceptions can be quite distinct. Sherif found widely differing perceptions of how far the light moved among his subjects until he brought them together and tried the experiment again. Once they were brought together, subjects’ reports of how far the light moved converged on an agreement point somewhere in the middle of all of their differing perceptions.67 This experiment does not make clear whether people’s perceptions actually change or whether they simply report what they think will be most conforming and most socially acceptable to the group. However, many years after Sherif’s experiments, neuroscientists continue to demonstrate that individual cognitive perceptions can actually be altered in the presence of group conformity pressures.

Gregory S. Berns and colleagues at Emory University have shown that when people change their minds about something to conform to peer pressure, even when the change means adopting false beliefs, the parts of the brain involved in perception—the occipital-parietal network—are activated.68 On the other hand, activity in the decision-making part of the brain—the PFC—is actually suppressed when this switch occurs. This means that when we bow to social pressure there is an actual change in perception—how we see the facts—rather than a reasoned decision to change our minds.

Solomon Asch demonstrated a more dramatic example of social conformity in the 1950s. A single subject was seated at a table with a group of experimental confederates. Everyone was shown one piece of paper with a line on it and then another piece of paper with three lines on it. The task was to identify which line on the second piece of paper was the same length as the line on the first. Participants went around the table giving their responses, and the seating was arranged so that the subject always answered last, after all of the confederates. For a number of trials, the confederates and the subject gave the correct answer; after that, however, the confederates began to give a blatantly incorrect response. Asch found that 75% of naïve subjects gave an incorrect answer to at least one question when preceded by the incorrect answers of confederates. Interviewing the subjects afterward, Asch found different categories of “yielders” (those who answered incorrectly after hearing the incorrect responses of confederates). A very small proportion reacted with a distortion of perception: they came to believe that the incorrect responses were actually true. Others displayed a distortion of judgment, becoming convinced that the majority must be right and changing their answers, mostly out of low confidence and extreme doubt. The third group displayed a distortion of action: they knew the correct answer but answered incorrectly in order to conform to the group. Asch’s experiment was instrumental in showing the strong, almost universal drive to conform, even when we realize that the majority view is incorrect.

Hey There, Sports Fans

As we discuss throughout the book, these kinds of irrational or incorrect perceptions can, in certain instances, serve a societal purpose. They often allow us to function in social units and to form groups that offer protection, relief from stress, monetary benefits, and a whole host of other survival advantages. A simple example is found among fans of a particular sports team. Members of such groups believe that their team deserves to win the championship on the basis of some divine right. They buy jerseys, T-shirts, and hats with the team logo and insist that their team is the best. There may be no evidence that their team is indeed superior to others, nor is there any reason to assume that supernatural forces take sides in these things, so sports team affiliations might be characterized as largely irrational. And yet, these affiliations serve to bind together people who might otherwise have nothing in common, offering them the opportunity to relate to a large and otherwise diverse group. For many people, the experience is quite fulfilling and it certainly generates a great deal of income for the team’s owner(s).

In short, conforming to the group, even if that means distorting reality, may serve a highly adaptive purpose. It is part of a human inclination that allows people to set aside their differences, come together, and form units that serve as the foundation of functioning human societies. Forming these units comes naturally to us, and we may benefit from them in ways that enhance our survival, including gaining monetary support for families, forming militaries to fight off enemies, and generally creating a social infrastructure that allows people to come to our aid when we collapse in the street. The problem is that we may find it difficult to turn off this urge to conform in situations in which it is not appropriate. The urge becomes particularly dangerous when we are trying to sort out complex evidence on a scientific issue. In that situation, conformity and group formation do not function productively but instead have the capacity to distract from and distort reality. Until there is a group whose sole rationale is to seriously and objectively evaluate the scientific merit of health claims and come to evidence-based conclusions that are flexible in the face of new data, group membership generally does little to advance our understanding of what science tells us. This is potentially remediable. Why not start groups—from small community associations to national organizations—whose main function is to help people feel warm, involved, and accepted in the joint exploration of science as it affects our decision making? When the claim is made, for example, that eating gluten-free food is better for all of us (and not just for people suffering from celiac disease), joining a “science exploration” group would allow people to work together to figure out what the evidence shows. The group could bring in experts, read articles in popular but respected journals like Scientific American, and figure out together whether there is anything to the claim. When that task is done, the group could move on to another issue, say, whether global warming is real.

In addition to the practical benefits that group membership provides, there are profound psychological benefits, particularly the availability of comfort. Human personalities come in every imaginable variety, of course, but in general we are a social and gregarious species that shuns isolation. Whenever we experience fear, worry, and pain, it is our instinct to seek comfort from others. As Brian G. Southwell puts it, “Much of the discussion we seek with others in moments of elevated emotion … is not necessarily focused on new factual information sharing as much as it is focused on reassurance, coping with stress, and ritualistic bonding.”69 After a charismatic leader like Wayne LaPierre terrifies us with images of armed hooligans lurking around every corner, he then offers us the comfort of joining his organization, the NRA, and finding safety with others in similar danger. Neuroscientists have established that social affiliations in mammals, including humans, involve the hormone oxytocin, sometimes referred to as the “love hormone.”70 Forming group affiliations stimulates the release of oxytocin in the brain, which, as Ernst Fehr of the University of Zurich and Colin Camerer of the California Institute of Technology point out, “seems to limit the fear of betrayal in social interactions.”71 They speculate, based on animal studies, that oxytocin may dampen amygdala activity, thereby reducing the fear of betrayal. Thus, the same hormone that is critical in all mammalian species for pair-bonding and maternal behavior may also work to make people feel “loved” when they are in a group, regardless of whether that group is feeding them misleading information or promoting scientific thinking.

Group membership can change the way you think through two processes: informational influence and normative influence. Both are probably at play in most instances of health science denialism. Informational influence refers to a desire to conform to the group because you think the group may be correct about something. This form of influence is strongest when people are uncertain about what is correct and incorrect, generally takes place in private, and is exerted most powerfully by the first few members of the group you encounter.72 For example, if you are not sure whether GMOs cause cancer, you would be particularly prone to this type of influence. The first person from an anti-GMO group you meet would have the most influence on your opinions; by the time you met the 20th person in the group, you would simply be hearing repetitions of opinions you had already come to accept as true. Informational influence is therefore especially strong upon first encounter with an individual who has an opinion. The mother who is uncertain about whether vaccines are safe for her child is particularly prone to being persuaded that they are not if she suddenly encounters even a single member of an anti-vaccine group. This is why demographics are important in patterns of health beliefs. If you are uncertain about the truth of a medical claim and you are likely to meet someone with a strong anti-establishment opinion because of where you live, then you yourself are also more likely to take on this view as soon as you meet someone who harbors it. Irrational denial of medical fact is thus particularly spurred on by informational influence, since the complicated nature of scientific discovery makes it more likely that people will be uncertain about what is in fact correct.

Normative influence has to do with wanting to be accepted by a group. This kind of influence is what most clearly explains Asch’s findings. Even if you know that members of the group are blatantly incorrect, you will actively agree with them in order to maintain an acceptable social status. This kind of influence is normally observed as groups become larger, and it mainly affects behavior that takes place in front of the entire group rather than in private.73 In other words, you may privately believe that the group is incorrect about something but nevertheless publicly proclaim that it is correct.74 As the group grows, you are more likely to feel a stronger need to be accepted. There is probably an element of this type of influence in many health science denialist groups as well, whether or not members realize it consciously. Some members of health science denialist groups may feel that the group’s assumptions are incorrect but maintain a position of advocacy of the group’s stance in order to avoid being excluded. This kind of influence is especially potent within social networks: if all the mothers on your block believe that vaccines cause autism and hold a rally against vaccines, you are more likely to join them, even if on some level you believe that the premise of their argument could be flawed. Pair this instinct with the fact that the science behind vaccination is incredibly complex and you are more likely to be influenced by this group on a basic informational level as well.

There are a number of theories about why we are so ready to conform to the group, even in situations in which its views are clearly wrong. In the case of followers of charismatic leaders, great uncertainty and psychological distress are key factors that make people prone to these leaders’ influence. In fact, some leadership theorists have hypothesized that charismatic leadership cannot even exist in the absence of a crisis.75 The novelty of a charismatic leader’s message, as well as the uncertainty felt by the listener, will often have a strong effect on how well the listener receives the message and how convincing it seems.76 Uncertainty is most definitely a feature of health science denialism. Unsure about what causes cancer or autism, we may feel comforted in believing a strong leader’s message that GMOs or vaccines, respectively, are to blame. In the case of anti-vaccine sentiment, parents whose children suffer from autism, epilepsy, or other disorders now frequently attributed by these groups to vaccines are clearly under conditions of high stress and psychological pain. The uncertainty surrounding the complexities of science, the mysteries of complex chronic diseases, and the psychological distress surrounding their management all create a perfect environment for the dissemination of strong anti-establishment messages from a charismatic leader. These conditions also create a ripe environment for high levels of social conformity of the sort Asch observed.

A Shot Against Persuasion

What should we do when faced with the persuasive tactics of a highly charismatic leader? Is it possible to resist persuasion? Although there has been far more research on being persuaded than on resisting persuasion, psychologists have found that resistance is possible.77 Resistance to persuasion has been described in the same terms as resistance to certain illnesses after vaccination. A person can be inoculated against persuasion by generating counterarguments, by being forewarned about an upcoming persuasive message, and simply by taking time to think alone after hearing a persuasive message before acting on it. In particular, just as a person is inoculated against a virus by receiving weakened doses of the virus itself and then mounting an immune response against it, we can be inoculated against persuasive arguments by being exposed to weaker versions of those arguments and then being encouraged to generate counterarguments.78

One group of theorists has proposed what they call the “persuasion knowledge model,” which describes the ways in which people process powerful messages and recognize the sources and tactics of persuasion at work. The theory suggests that people learn “coping” techniques for dealing with persuasive messages, and that these techniques can be cultivated and change over time. Coping techniques may help individuals separate emotion from their evaluation of the message, refocus their attention on a part of the message that seems more significant to them rather than on what the persuader is trying to emphasize, trace the chain of events that led to the persuasive message, or make judgments about the persuader’s goals and tactics.79

Other experiments have suggested that an individual’s cognitive resources have a major influence on whether or not he or she is persuaded by a message. Cognitive load refers to the amount of strain on someone’s cognitive resources. In experiments, someone who is forced to memorize a very long string of numbers is in the high cognitive load condition, and someone who is forced to memorize a very short string of numbers is in the low cognitive load condition. The point of these manipulations is to see how the effects of persuasion change when people are more or less distracted. One study assessed subjects’ ability to resist persuasion depending on the extent of their cognitive load. In one condition, subjects were told about a new drug called AspirinForte, which had received bad press because of its unpleasant taste and its damaging effects on the environment when produced in mass quantities. These subjects were then told to memorize a long string of numbers (the high cognitive load condition) and were next exposed to an ad for AspirinForte and told to write down as many arguments against the use of the product as possible. In the other condition, subjects were asked to memorize a short string of numbers (the low cognitive load condition) and then were exposed to the ad and asked to come up with arguments against the drug. The experimenters then collected data on how certain subjects were of their attitudes toward the drug. They found that people under a high cognitive load were less certain about their attitudes toward the drug, even if they rated the source of the advertisement as highly trustworthy. In addition, those in the low cognitive load condition produced higher-quality counterarguments, as judged on persuasiveness by external analysts, even though those in the high cognitive load condition produced a greater absolute number of counterarguments.80 All of this suggests that difficulty resisting persuasion is related to factors other than the source of the argument. The condition of the people receiving the message, how knowledgeable they are about the topic, and how distracted they are will all affect their ability to think of counterarguments that might keep them from being completely persuaded. Perhaps counterintuitively, the more tired, stressed, and busy you are, the more likely you are to be convinced by persuasive messages, simply because you do not have the energy to come up with counterarguments of your own.

There is some evidence that making people aware of the way in which they are processing persuasive messages and the biases with which they are engaging can help them rethink their attitudes. In one experiment, researchers exposed subjects to a message from either a likeable or dislikeable source. Some subjects were specifically told not to let “non-message” factors affect their judgment of the message. When subjects were already being persuaded by a peripheral route that depended on non-message perceptions such as the authority of the speaker, being alerted to the possible existence of bias resulted in more careful scrutiny of the message and less bias in interpreting it.81

The results of these types of experiments suggest that there is room to intervene in the process of persuasion. There are several possibilities for encouraging more careful scrutiny of the particularly persuasive messages of charismatic leaders. For example, nurses and doctors dealing with parents who are afraid of vaccinating their children might try attitude inoculation. Rather than simply arguing with the parent, the healthcare provider might offer weakened versions of arguments the parent may have heard against vaccines. The doctor or nurse might even be trained to present these views in a highly noncharismatic, flat manner. Research shows that when inundated with weakened versions of a persuasive argument, people are able to mount effective counterarguments against it. The healthcare provider would then encourage the parent to come back at a later date to discuss the possibility of vaccination again. In the interim, when the parent is faced with anti-vaccine messages, he or she will be more likely to mount effective counterarguments against them and more likely to return to the doctor with a changed attitude. This strategy would of course not work with everyone. People who are already entrenched in anti-vaccine attitudes or who face significant social pressure not to vaccinate their children might not be swayed by it. However, it would likely work for parents who are simply unsure and are at risk of being persuaded by anti-vaccine activists. These parents represent a larger and more vulnerable group than the people who are already firmly convinced of the danger of vaccines. If we could get the parents on the fence to decide to vaccinate their children, we could probably avert many of the frightening gaps in vaccine coverage we are seeing today.

This type of “inoculation” might proceed as follows:

PARENT: I have heard that vaccines can cause serious brain damage in children, like autism, and I am not sure that I want my child to be vaccinated.

NURSE: Yes, there are definitely some people out there who have claimed that vaccines cause autism. Their evidence for this is that there has been an increase in autism over the last few years; that vaccines used to contain a small amount of a mercury preservative, and large, repeated doses of mercury can harm the brain; and that there are some parents who insist that their children started showing signs of autism after receiving a vaccine. Now that you have heard these arguments, you might want to do a little research and see what the evidence is for them. I would be happy to help you with this and to talk with you again about your concerns. Why don’t we make an appointment to discuss your thoughts and concerns soon?

For individuals who are faced with confusing, conflicting scientific messages, we recommend a few strategies. Take a moment, step back, and think about how you are processing the information in front of you. Do you have Jenny McCarthy on the TV in the background while you clean your house, talk on the phone, and make dinner for your children? Or have you sat down, listened to the arguments with a clear head, and given yourself ample time to think about possible counterarguments? Is it possible that you are exercising any biases? When making an important decision about your health and wellness, such as whether or not to buy a gun or to allow your family member to be treated with ECT, it might be wise to take a moment to list any possible biases you might have when receiving information about these topics. Who told you that ECT caused brain damage? Could they have any possible biases? What non-message factors might have influenced your perception of their message? This kind of reflective thinking can go a long way toward making sure that you have not simply been persuaded by charisma and bias but have truly thought through the facts and remained aware of your potential cognitive pitfalls along the way. It is also important to identify the sources of stress, both those related to the health issue at hand and those in your life more generally, to which you are being subjected. Reasonable decisions, as we have shown above, are best made when cognitive load is minimized.

Noreena Hertz, in her very useful book Eyes Wide Open, discusses how encouraging people to calmly consider alternative points of view can have a very powerful effect:

Studies show that simply by posing such “imagine if” questions, which allow us to consider alternative explanations and different perspectives, we can distance ourselves from the frames, cues, anchors and rhetoric that might be affecting us. Liberated from these tricks and triggers, we can consider information through a more neutral, less emotive, more analytical and nuanced lens.82

From the point of view of public health measures, it is critical that any attempt to limit the influence of charismatic leaders take careful account of the powerful effect those leaders have in making us feel safe, understood, and even loved. We have indicated that the more fear a charismatic leader is able to conjure in a potential acolyte, the more powerful the activation of the brain circuits that make it difficult for countervailing evidence to have an impact. Charismatic leaders induce brain changes that first heighten activity in the fear centers of the brain, like the amygdala, and then suppress activity in the decision-making areas of the PFC. Joining the leader’s group also stimulates the release of oxytocin, which gives us a feeling of belonging and comfort. Dry, pedantic harangues about data will be powerless in the face of these potent effects. Rather, it is critical to make people feel that by using their minds to evaluate scientific claims they are joining a welcoming club of people who trust the scientific method and attempt to get at the truth about what is healthy.

Resisting persuasion requires a kind of self-awareness and critical thinking that is not necessarily intuitive or natural. But learning how to think critically in this way can help anyone in any career or walk of life, not just in resisting anti-science viewpoints. It would therefore make sense for middle school and high school curricula to devote more time to developing these skills. As we advocate throughout this book, rather than focusing so heavily on memorization, schools should teach children techniques for thinking through complex problems. Children should learn how to identify a flawed experimental design or a flawed argument, to become aware of how they process arguments, and to formulate viable counterarguments and test ideas. Spending more time on debate techniques, requiring children to argue against something they may intuitively believe, or even simply teaching children about rhetoric and the psychology behind persuasion would all aid in developing useful critical thinking skills. Cultivating critical thinking and awareness of cognitive traps and biases not only helps us identify faulty scientific arguments but also helps us become better decision makers, better thinkers, and better-informed citizens.