Published in Social Epistemology, vol. 34, no. 5, 2020, pp. 409-422
Academic discussions concerning what to do about conspiracy theories often focus on whether or not to debunk them. Less often discussed are the methods, audiences and effectiveness of debunking efforts. To motivate a closer examination of the ‘how’ of debunking, a slightly different issue is addressed: conspiracy theory attributions (CTAs), which are claims that someone or some group believes in a conspiracy theory. Three cases of discrediting CTAs in the vaccination debate are examined: general assumptions that vaccination critics are conspiracy theorists, a claim that a vaccine-critical group subscribed to a particular conspiracy theory, and a claim that a PhD thesis endorsed a conspiracy theory. Struggles over CTAs can be analysed in terms of the tactics that powerful perpetrators use to reduce outrage over injustice: cover-up, devaluation, reinterpretation, official channels, and intimidation/rewards. Options for responding to CTAs include ignoring, ridiculing, debunking, engaging, counterattacking and accepting. Potential responders, to decide between options, should take their goals into account.
Keywords: conspiracy theories; debunking; vaccination
In some countries, there is a high level of popular interest and belief in conspiracy theories. Typical conspiracy theories, such as those concerning the 9/11 attacks or the assassination of US president John F. Kennedy, attribute responsibility for significant events or processes to powerful groups conspiring behind the scenes. Popular interest in such conspiracy theories has led to increasing scholarly interest in various aspects of conspiracy theories, including the demographics of believers, the psychology of belief and ways to counter or legitimate conspiracy theories (e.g., Coady 2006; Dentith 2018; Oliver and Wood 2014; Uscinski 2019a).
A common question among academics concerned about conspiracy theories is whether to debunk them. Many say or assume this is an important task, because conspiracy theories are wrong and, more importantly, dangerous in some ways (Sunstein and Vermeule 2009). Discussions about debunking often focus on whether to debunk without much attention to how. Neglecting the ‘how’ question obscures several important issues, including the possibilities that debunking may be infeasible, counterproductive, or mistaken in its target. Also obscured are the motivations for debunking, which can be different from explicit rationales.
The argument in this paper is that it is important to study how to respond to conspiracy theories as a crucial input into deciding whether to respond. To address these issues, I offer three case studies of actual or potential debunking interventions in which I was personally involved. The case studies show the importance of how to go about debunking and of the purposes of doing so, in these cases my own purposes. Before introducing these case studies, some further background is useful.
Scholars give different definitions of the term ‘conspiracy theory’. For some, it refers to any plotting behind the scenes by a group or network to achieve some purpose. For example, Dentith (2014, 30) gives a ‘most minimal definition’ of conspiracy theory as ‘any explanation of an event that cites the existence of a conspiracy as a salient cause’. In contrast, others exclude from the definition plotting acknowledged by mainstream sources. For example, Uscinski (2019b, 20) says ‘conspiracy theories conflict with establishment accounts’ and ‘if conspiracy theorists investigate a theory that eventually turns out to be true, that theory stops being labeled conspiracy theory.’
Among conspiracy theory scholars, Bratich (2008) provides a somewhat different perspective: he focuses on ‘conspiracy panics,’ which are the alarms felt by scholars and others about people’s beliefs in conspiracy theories. For Bratich, conspiracy theories are actually constituted by the panics about them. He sees panics as revealing the commitments of those who castigate conspiracy theorising, especially political commitments and linked commitments to a particular sort of rationality. Apparently, Bratich’s perspective does not apply when discussions about conspiracy theories do not involve alarm.
DeHaven-Smith (2013) argues that in the 1960s the CIA sought to undermine challenges to the official explanation of the assassination of President John F. Kennedy by seeding the idea of conspiracy theories as disreputable. He proposes ‘State Crimes Against Democracy’ (SCADs) as a replacement for the term ‘conspiracy theories’. The key point here is that the conspiracy-theory label became discrediting and now serves to divert attention from conspiracies in high places. Thalmann (2019) provides a different explanation for the stigmatising of conspiracy theories, drawing on a study of popular and scholarly discourses, but agrees that in the US this occurred by the end of the 1960s. DeHaven-Smith’s analysis of SCADs and Thalmann’s analysis of stigmatisation are compatible with Bratich’s analysis of conspiracy panics. Common to these approaches is a focus on efforts to delegitimise conspiracy theories and individuals, commonly called ‘conspiracy theorists,’ who advocate for or subscribe to them.
Some sceptics argue that believing in conspiracy theories represents a failure of critical thought, especially when conspiracy theories challenge assessments backed by political or scientific authorities (Goertzel 2019; Sunstein and Vermeule 2009). Such sceptics have been called ‘generalists,’ in contrast to ‘particularists’ who argue that conspiracy theories should be judged on their merits on a case-by-case basis (Buenting and Taylor 2010).
Because conspiracy theories are commonly associated with fringe and delusional ideas, to be called a conspiracy theorist is potentially discrediting. Husting and Orr (2007), in the most detailed study of this process, describe the conspiracy-theory label as ‘a routinized strategy of exclusion.’ Pigden (2007, 219) writes, ‘To call something “a conspiracy theory” is to suggest that it is intellectually suspect; to call someone “a conspiracy theorist” is to suggest that he is irrational, paranoid or perverse.’ DeHaven-Smith (2013, 4) says ‘… the conspiracy-theory label is employed routinely to dismiss a wide range of antigovernment suspicions as symptoms of impaired thinking akin to superstition or mental illness.’ Harambam and Aupers (2015, 467) say ‘those who are labeled “conspiracy theorists” are a priori dismissed by academics and excluded from public debate.’ Bjerg and Presskorn-Thygesen (2017, 138) write that ‘any use of the concept of conspiracy theory always already implies a demarcation between legitimate, rational knowledge and illegitimate, irrational non-sense. Furthermore, the concept not only refers to a given type of proposition but it also invariably calls into question the sanity and credibility of the person making or asserting the proposition, the conspiracy theorist.’ As a result, to call someone a conspiracy theorist can serve to categorise them as pathological.
Many of those who subscribe to beliefs commonly called conspiracy theories say they are not conspiracy theorists themselves, seeing themselves as critical thinkers and distinguishing themselves from those with more irrational beliefs (Wood and Douglas 2013; Harambam and Aupers 2017). Rejecting the conspiracy theory label is compatible with believing a stigma is attached to it.
However, not everyone finds the labels ‘conspiracy theory’ and ‘conspiracy theorist’ stigmatising. Wood (2016) reports that, for members of the public, CTAs may be less discrediting than these academic commentators suggest. Further studies are needed to test Wood’s finding.
Some of those who subscribe to the particularist perspective on conspiracy theories, namely that explanations should be evaluated on their merits on a case-by-case basis, might well think that dismissing an explanation on the basis of a label is irrational. If knowledge is a social product, as assumed in social epistemology, then pathologising certain types of claims and claimants interrupts the opportunity for full exposition and respectful dialogue that should be the basis for knowledge creation and testing.
To say that someone subscribes to a conspiracy theory is called here a conspiracy theory attribution or CTA. Although scholars have said that CTAs are discrediting, there have been few systematic studies of CTAs. Husting and Orr (2007) provide the most detailed analysis of CTAs, but do not address how to respond to them.
The use of CTAs typically involves a pervasive double standard: CTAs are only applied to explanations the labeller wants to denigrate, not to ones they hold themselves (deHaven-Smith 2013, 8; Wood and Douglas 2013). This is apparent in the preponderance of attention given to non-establishment explanations. Those who exclude establishment views from the category of conspiracy theory build a double standard into the definition itself. A prominent example is 9/11. The orthodox explanation involves an al-Qaeda plot that, according to Dentith’s minimal definition, should be labelled a conspiracy, but adherents to this explanation are seldom called conspiracy theorists, least of all by themselves. Only non-orthodox explanations are labelled conspiracy theories. Similarly, the suggestion of mafia involvement in the assassination of President John F. Kennedy is called a conspiracy theory, but police who investigate organised crime are not called conspiracy theorists. When a CTA is assumed to be discrediting, the implication is that the targeted belief is wrong, and absurd besides. The systematic dismissal and denigration of some conspiracy theories while ignoring others, and not labelling the others conspiracy theories and thereby exempting them from scrutiny and potential opprobrium, can be called an inconsistency or a double standard. It can be understood as a type of power play serving to keep control over those who deviate from orthodoxy.
Levy (2007) argues that it is seldom rational to accept conspiracy theories not supported by properly constituted epistemic authorities. However, in the case of CTAs, namely whether to accept that a person or group subscribes to a conspiracy theory as alleged, it is not obvious who these authorities might be. Perhaps scholarly researchers into conspiracy theorising might qualify.
My case studies concern the potential or actual debunking of CTAs rather than conspiracy theories. As will be seen, these case studies illustrate how options for debunking CTAs are much the same as options for debunking conspiracy theories, indeed with a couple of additional possibilities. They also illustrate how to learn from the experience in debunking efforts, and the role of motivations in whether to debunk and how to go about it.
There have been few other studies of CTAs, much less studies of how to go about debunking them or how this task relates to debunking conspiracy theories. The case studies show the potential for additional studies of CTAs and debunking.
The next section outlines the vaccination controversy and presents three case studies of CTAs that stigmatise vaccine critics. The following section analyses struggles over CTAs using a model of outrage management. After this, I describe my own involvement in these case studies and comment on ways to respond. In the discussion, I argue that in deciding how to deal with conspiracy theories and CTAs, overall goals are important.
Vaccination is a medical procedure in which a person or animal is given a vaccine, which is a small dose of an agent designed to stimulate the immune system so it can better resist full-blown disease. A smallpox vaccine was first used in the late 1700s. In the 1900s, vaccines were developed for diphtheria, polio, measles, pertussis and many other infectious diseases.
In most affluent societies, rates of childhood vaccination are high, and there continue to be new vaccines added to government-recommended schedules. Vaccination has dramatically reduced the number of people who contract infectious diseases, and death rates from these diseases are very low compared to a century ago. Nearly all governments endorse vaccination, as do medical authorities and researchers. Most parents ensure that their children receive all vaccinations on the government schedule. Supporters say vaccination is one of medicine’s most important contributions to population health (Ehreth 2003; Plotkin et al. 2017).
However, since the earliest days of vaccination, there have been critics (Colgrove 2006). In the 1990s, opposition re-emerged in a number of countries, stimulated by concern about adverse effects from vaccinations, the expanding number of vaccines in the childhood schedule, and the role of pharmaceutical companies in promoting vaccines (Blume 2017). Critics argue that most of the declines in death rates from infectious diseases preceded mass vaccination and that serious adverse effects from vaccines are both inadequately studied and much more common than normally recognised (Cernic 2018; Habakus and Holland 2011). Most critics argue that vaccination should be a matter for personal or parental choice.
In some countries, proponents of vaccination have been disturbed by continuing public criticism of vaccination. In 2019, the World Health Organization declared ‘vaccine hesitancy’ — ‘the reluctance or refusal to vaccinate despite the availability of vaccines’ — to be one of the top ten threats to public health worldwide (World Health Organization 2019).
Much of the vaccination debate involves partisans making assertions about medical research and its implications for public health. So how do conspiracy theories enter into the debate? A number of campaigners, commentators and researchers have implied or alleged that vaccine critics and vaccine hesitants (parents with reservations about vaccination) believe in conspiracies to promote vaccination. Furthermore, belief in such conspiracies might be linked to vaccine hesitancy (Jolley and Douglas 2014; Kitta 2012). I will look at three case studies of such CTAs: general claims in scholarly analyses, a specific claim about a vaccine-critical group and a specific claim about a PhD thesis.
A number of scholarly treatments contain CTAs about vaccination; two examples are addressed here. Grimes (2016) uses a mathematical model to estimate how long large conspiracies can last before breaking down. Grimes refers to ‘some commonly-held conspiratorial beliefs; these are namely that the moon-landings were faked, climate-change is a hoax, vaccination is dangerous and that a cure for cancer is being suppressed by vested interests.’ (Grimes 2016, 1). Grimes does not elucidate why he treats a belief that ‘vaccination is dangerous’ as belief in a conspiracy. In a paragraph on ‘Vaccination conspiracy,’ he says, ‘It is estimated that roughly 20% of Americans hold the long de-bunked notion that there is a link between autism and the MMR vaccine’ and refers to ‘Ill-founded beliefs over vaccination’ (Grimes 2016, 3). With these statements, Grimes seems to imply that not accepting authorities’ endorsement of vaccination necessarily means believing in a conspiracy theory, though he does not describe the alleged conspiracy nor provide any evidence that people believe in it. Grimes, in saying that ‘vaccination is dangerous’ is a conspiratorial belief, thus makes a CTA without providing evidence that anyone believes in a conspiracy.
Uscinski (2019c, 199), in his introduction to chapters about science in his edited collection about conspiracy theories, writes that ‘Sizable portions of the public reject the science on vaccines, GM foods, fluoride, and climate change due to belief in conspiracy theories.’ This is a straightforward CTA, but one without any accompanying evidence to support it. In referring to ‘the science,’ Uscinski implicitly refers only to science deployed on one side of the debate, namely the side supported by most scientists. On each of the issues he mentions, there is research challenging the dominant view. Another complication is that these debates are not only about ‘the science’ but involve differing views about ethics and decision-making, for example about the relative importance of individual choice and collective benefit. Different vaccines have different cost-benefit profiles, so a parent might have their child vaccinated against measles and pertussis, yet refuse the hepatitis B vaccine given at birth on the grounds that their child is unlikely to ever be exposed to hepatitis B. (On hep B concerns, see Reich (2016, 102–106) and Conis (2015, 179–202).)
In contrast with Grimes, Uscinski and others (e.g. Goertzel 2010) who attribute belief in conspiracy theories to people who have concerns about vaccines, studies of vaccine-hesitant parents, based on interviews (e.g., Leach and Fairhead 2007; Reich 2016) seldom mention anything about conspiracy theories (for a contrary view, see Kitta 2012). This is compatible with the view that criticism of aspects of vaccination or disagreement with the recommendations of authorities does not necessarily imply or rely on beliefs in conspiracy theories.
In the mid 1990s, Meryl Dorey set up a citizens group, the Australian Vaccination Network (AVN), which eventually became Australia’s largest and most prominent vaccine-critical group. The AVN operated much like other vaccine-critical groups (Hobson 2007), hosting a website, organising meetings, making submissions, publishing a magazine, lobbying politicians and seeking media coverage. Then in 2009 a pro-vaccination citizens group, Stop the Australian Vaccination Network (SAVN), was set up, whose explicit purpose was to discredit and destroy the AVN. SAVN was not a formal organisation but rather a network coordinated through a Facebook page. It used a variety of methods, including making dozens of complaints to government regulatory bodies, attempting to stop AVN-organised talks, publicising criticisms of the AVN, and making abusive comments about Dorey. For more information about the AVN and SAVN, see Martin (2018a).
One of the imaginative techniques initially used by SAVN was a CTA, posted prominently on its Facebook page as part of SAVN’s self-description, which included its rationale for attacking the AVN.
Name: Stop the Australian Vaccination Network
Category: Organizations - Advocacy Organizations
Description: The Australian Vaccination Network propagates misinformation, telling parents they should not vaccinate their children against such killer diseases as measles, mumps, rubella, whooping cough and polio.
They believe that vaccines are part of a global conspiracy to implant mind control chips into every man, woman and child and that the ‘illuminati’ plan a mass cull of humans.
They use the line that ‘vaccines cause injury’ as a cover for their conspiracy theory.
They lie to their members and the general public and after the death of a 4 week old child from whooping cough their members allegedly sent a barrage of hate mail to the child's grieving parents.
The dangerous rhetoric and lies of the AVN must be stopped. They must be held responsible for their campaign of misinformation. (quoted in Martin 2011, 31)
The statement that ‘They believe that vaccines are part of a global conspiracy to implant mind control chips into every man, woman and child and that the ‘illuminati’ plan a mass cull of humans’ is a CTA. SAVN gave no evidence that the AVN subscribed to this conspiracy theory, and Dorey denied believing in any such theory. It seems that the main purpose of this CTA was denigration, rather than accurate description, of the AVN.
Judy Wilyman completed her PhD at the University of Wollongong under my supervision. Her topic was a critique of the rationale for the Australian government’s vaccination policy (Wilyman 2015). She was also publicly critical of vaccination and hence came under attack by SAVNers while undertaking her PhD. For example, there were abusive blog posts, complaints to the university, and freedom-of-information requests to the university leading to hostile media stories.
Soon after her graduation, on 11 January 2016, Wilyman’s PhD thesis was posted on the university’s online repository. Two days later, there was a front-page story in the national daily newspaper The Australian with the headline ‘Uni accepts thesis on vaccine ‘conspiracy’.’ (Loussikian 2016). The CTA in the headline was elaborated in the opening sentence:
The University of Wollongong has accepted a PhD thesis from a prominent anti-vaccination activist that warns that global agencies such as the World Health Organisation are colluding with the pharmaceutical industry in a massive conspiracy to spruik immunisation.
In the following year, the same journalist reiterated this CTA in several further articles in The Australian. It was also taken up by other journalists, bloggers, petitioners and commentators (Martin 2019). In my view, the CTA was part of a collective attack on Wilyman’s PhD thesis, Wilyman herself, me as her supervisor and the University of Wollongong (Martin 2017).
When CTAs are false or misleading, they can be seen as an attack on the reputation of the individual or group alleged to believe in the conspiracy theory. From the point of view of the target of the allegation, in such circumstances the question is what, if anything, to do about the CTA. To address this question, I present here a framework of tactics against injustice (Martin 2007).
When a powerful individual or group does something that can be interpreted as unfair or inappropriate, this has the potential to cause outrage among observers. The perpetrator often takes steps to reduce such outrage, including by: (1) covering up the action; (2) devaluing the target; (3) reinterpreting the events, for example by lying, minimising consequences or blaming others; (4) using official channels to give an appearance of justice; and (5) intimidating or rewarding people involved.
Consider torture, widely considered to be a serious injustice, and stigmatised around the world. Perpetrators of torture — this includes many governments — (1) act in secret, (2) devalue targets as terrorists or criminals, (3) justify their actions as not all that harmful, as due to rogue elements or as necessary to prevent a greater harm, (4) deal with adverse publicity by prosecuting minor players, and (5) threaten targets with further harm if they speak out (Brooks 2016).
The same tactics can be observed in numerous other injustices, for example sexual harassment (McDonald et al. 2010), inequality (Engel and Martin 2015) and genocide (Martin 2009). Most relevant to struggles over CTAs is the application of this framework to defamation (Gray and Martin 2006; Martin and Gray 2005). Here, each of the methods to reduce outrage over defamation is examined. Note that the focus here is on tactics that are often used by perpetrators of injustices; this does not imply that the injustices are similar in nature or seriousness.
Cover-up

Defamation involves damage to someone’s reputation. The slurs on reputation may be public, but other aspects of the attack may be hidden. In a workplace or local community, scurrilous gossip about an individual can be harmful, yet the originator or lead promoter of the gossip may operate behind the scenes. Online, damaging allegations and photos can be circulated anonymously.
In the case of Judy Wilyman, those responsible for the CTA used to discredit her thesis were not made public. Given the time frame — one day from posting the thesis to journalist Kylar Loussikian’s queries to me and the university — it is implausible that he carefully analysed her thesis himself. The prime suspect would be members of the group SAVN who had been denigrating Wilyman for years, but no one took public responsibility for the CTA except for immunologist John Dwyer, quoted in Loussikian’s story. A possible scenario is that SAVNers fed information to Loussikian, who then obtained a supportive quote from Dwyer.
When CTAs are vague, such as the implications by Grimes or Uscinski that questioning vaccination amounts to believing in a conspiracy theory, the lack of specificity might be considered to be a type of cover-up. Targets cannot easily respond to vague sweeping allegations.
Devaluation

Lowering the credibility or perceived worth of someone or something is a process of devaluation. For example, if someone is thought to be a paedophile or serial killer, this seriously reduces their perceived worth, so there is less concern if they are treated badly.
Defamation is an attack on someone’s reputation, so devaluation is built in. Furthermore, by harming the target’s reputation, audiences are less likely to be disturbed by the defamatory claims.
When CTAs are used in the vaccination debate, often they are accompanied by other methods of devaluation. In the implications by Grimes and Uscinski, vaccine critics and hesitants are grouped with believers in a variety of conspiracy theories, including ones widely considered absurd, such as that the moon landings were faked. This might be called devaluation by association. In the attack on the Australian Vaccination Network, the CTA was just one technique of devaluation. SAVNers also posted demeaning graphics, used derogatory language and deployed dismissive humour (Martin 2018a).
Reinterpretation

Defamers commonly defend their claims by saying they are telling the truth and exercising free speech. The hurt to someone’s reputation then is just a by-product.
For scholars using CTAs, the primary justification, if needed, is pursuing truth. SAVNers presented several justifications for trying to prevent Dorey from giving a scheduled public talk, including that the AVN provided misinformation, Dorey lacked expertise, the AVN practised censorship, and Dorey could speak somewhere else (Martin 2015). The attack on Wilyman can be interpreted as an attack on academic freedom. SAVNers and other critics of her thesis instead said their concerns were about academic standards.
Official channels

When defamation is seen as unfair, the first instinct of many injured parties is to sue. However, legal action is expensive, slow and highly procedural, so the central issues become secondary. Furthermore, suing can sometimes lead to further damage to reputation, through the defendant bringing up evidence in court, and through media coverage (Gray and Martin 2006; Martin and Gray 2005). Finally, even a court victory may not restore one’s reputation, and defamatory commentary sometimes continues or even intensifies.
For these reasons, challenging a CTA by threatening or pursuing legal action would probably be futile or counterproductive.
Intimidation

Defamers, when they are more powerful than their targets, can make threats or use other means for intimidation. This can include harassment, complaints to employers, and a host of other techniques.
In the attack on the AVN, the CTA was just a small facet of a much wider attack that included complaints to government agencies, abusive commentary and sending pornography to Dorey and others (Martin 2018a). The attack on Judy Wilyman included complaints to the university, intrusive freedom-of-information requests, and a petition, among other methods (Martin 2017).
To challenge defamation, there are five types of counter-methods (Gray and Martin 2006; Martin and Gray 2005): (1) exposing the attack; (2) validating the target; (3) interpreting the attack as unfair; (4) avoiding official channels and instead mobilising support; and (5) resisting intimidation and rewards.
In the next section, I describe responses to the CTAs, noting how they relate to these counter-methods. This will lead to reflections on debunking.
Studies of debunking often pose a simple question: should ‘we’ — in particular, social researchers — debunk conspiracy theories? However, this is far too narrow a perspective: it treats debunking as a black box, not addressing methods of debunking, their effectiveness and whether the effort is worthwhile. To begin addressing these facets, the following are among possible responses to conspiracy theories: ignoring them, ridiculing them, debunking them and engaging with their proponents.
Next consider possible responses to CTAs. This is a somewhat different matter, because CTAs are discrediting, yet the same options can be used, plus at least two others: counterattacking, by turning attention back to the attacker, and accepting the attribution.
It is worth noting that looking at CTAs unearths two additional options. These six options have varying relationships with the five methods, listed earlier, to challenge defamation. Ignoring a CTA does not challenge it at all. Most of the other options — ridiculing, debunking, engaging and accepting — expose the CTA, validate the target implicitly or explicitly, interpret the CTA as unfair, avoid official channels, and resist intimidation. Counterattacking is different: it does not directly challenge the CTA but instead turns attention to the attacker, suggesting hypocrisy in assuming CTAs are discrediting.
With the above palette of six options for responding to CTAs, I next describe actual responses to each of the three examples of CTAs — scholarly treatments, the AVN and Judy Wilyman — and comment on my position and choice of options.
The most common response by vaccine critics to scholarly CTAs has been to ignore them. Indeed, judging by commentary in vaccine-critical forums, it is likely that few vaccine critics ever read scholarly treatments of the vaccination debate or read scholarly discussions of conspiracy theories. Even when critics are aware of these discussions, it is unlikely they would find it productive to spend time responding.
The only people likely to have the time and interest to respond to scholarly CTAs are other scholars. A number of scholars have investigated conspiracy theories, challenging the assumptions that they are inherently misguided or dangerous or both (e.g., Basham 2018; deHaven-Smith 2013; Dentith 2018; Hagen 2018; Harambam 2020; Pigden 2007; Räikkä and Basham 2019). By giving a more positive picture of conspiracy theories, they indirectly undermine the potency of CTAs.
At this point, the discussion becomes reflexive, because to my knowledge I am the only person to try to counter scholars’ vaccination CTAs, and I am doing it in this very paper and one other (Martin 2019, 2020). Beyond this, it can be noted that my chosen response is to engage with scholarly CTAs: I wrote to the authors, Grimes and Uscinski, who, however, did not offer any comments in response. I have not tried to debunk their CTAs, which are so vague that countering them would be difficult; it is possible that these CTAs could be justified or narrowed to be accurate. I have not tried to mock the CTAs or their authors, nor have I counterattacked by accusing them of being conspiracy theorists. It is not possible for me to accept the CTA because I am not a vaccine critic myself — though others have categorised me as one (Martin 2018b) — and, more importantly, I do not believe that vaccination is a conspiracy against the human population. In terms of methods, I have exposed the CTAs, implicitly validated the targets, interpreted the CTAs as attacks on reputation, avoided official channels and resisted intimidation (for example, by not being deterred by online abuse and complaints to my university).
My choice to try to engage with scholarly vaccination CTAs reflects my role as a social scientist. The only reason I decided to follow up with the scholarly authors of these CTAs was due to becoming involved in studying the vaccination debate, where two CTAs played a significant role, as described next.
As described earlier, beginning in 2009 the pro-vaccination group SAVN claimed that the vaccine-critical group the Australian Vaccination Network (AVN) believed in a global conspiracy to implant mind control chips. At the time, the AVN made no public response to this CTA. Meryl Dorey (personal communication, 11 March 2020) did not want to draw attention to it.
My own role was pivotal in responding to this CTA. In 2010, after I learned about SAVN’s attack on the AVN, I wrote a long article analysing the attack in the context of controversy studies, assessing the arguments in a major complaint against the AVN, and presenting tactics for resisting the attack. Part of my article was an examination of SAVN’s CTA about the AVN: I called it an ‘unsupported claim’ (Martin 2011, 31).
Some time after my article appeared, SAVN removed the CTA from its Facebook page. Later, on the blog of a leading SAVNer using the moniker ‘reasonable hank’ (2012), I again challenged the CTA. One of my arguments was that there was no evidence that members of the AVN believed in a global conspiracy to implant mind control chips. ‘Reasonable hank’ said they were just referring to Dorey: for him, apparently, if Dorey believed something, it was legitimate to say the AVN believed it. Dorey denied believing in the conspiracy theory, but SAVNers wouldn’t accept her denial: they assumed she was lying. Their evidence for the CTA was that Dorey had once made a link to an article on the website of David Icke, a well-known conspiracy theorist. For them, this proved she believed in the conspiracy. It should be obvious that linking to an article on a website does not mean you necessarily believe in everything hosted on the website. But the SAVNers reiterated their claim.
Then I had an idea. I wrote, ‘Here’s what we could do. Each of us could collect our writings about the AVN-belief-in-a-global-conspiracy matter. These could then be sent to one or more independent individuals for them to assess our claims.’ I concluded my comment with ‘Are you interested?’ The SAVNers were not interested. Instead, ‘reasonable hank’ (2012) acknowledged that members of SAVN had read my article ‘Debating vaccination,’ agreed that the CTA was inaccurate and replaced it.
What techniques did I use in challenging the CTA? My article (Martin 2011) where I analysed the CTA might be considered debunking. In the blog exchange, my invitation to seek the views of experts fits the category of engaging, in that I was discussing the issue with defenders of the CTA rather than trying to discredit it to a wider audience.
In retrospect, it would have been worth considering two other types of responses. The CTA was so silly that it could have been satirised, making fun of anyone who would make such an allegation. SAVNers presented themselves as the defenders of rationality against deluded opponents of vaccination, and in this context their seemingly absurd CTA made them look foolish. Another option was to counterattack: SAVN itself had adopted a conspiracy theory, namely that Dorey and other members of the AVN were concealing and denying their belief in a global conspiracy.
My analysis of and challenge to SAVN’s CTA was part of my analysis of SAVN’s attack on the AVN, which in turn was triggered by my longstanding interest in suppression of dissent (e.g., Martin 1999), which is harmful both to dissenters and to a fair-minded process of negotiation over knowledge claims. In other words, having studied many cases of suppression of dissent in scientific controversies, I recognised SAVN’s activities as fitting the same mould, yet more drastic and persistent than anything I had seen or heard of before. In the course of writing to defend free speech for vaccine critics, I came across SAVN’s CTA and saw it as part of SAVN’s wider project to discredit and silence vaccine critics.
Judy Wilyman’s PhD thesis is a critique of the Australian government’s vaccination policy. The attack on her thesis prominently featured a CTA — that the World Health Organisation had conspired with pharmaceutical companies in declaring the swine flu pandemic — that was taken up by a wide range of commentators (Martin 2019). Allegations that the thesis contained a conspiracy theory were a crucial component of the attack, though far from the only part.
In response to the attack, university officials defended the processes by which the thesis had been passed and invoked academic freedom (Martin 2017). It was not the role of university officials to endorse or defend the content of the thesis, so they could not engage with the CTA.
I took it upon myself to respond to the attacks, and chose to provide the context as explanation, referring to SAVN’s campaign against public vaccine critics generally and against Wilyman in particular, arguing that the attack had the characteristic features of suppression of dissent. In some of my responses to attackers, I focused on the shortcomings of their claims. One of my responses was a detailed analysis of the original attack piece in The Australian (Loussikian 2016), in which I queried the CTA, saying ‘By applying the conspiracy-theory label, Loussikian used a powerful negative framing device that trivialised the detailed arguments in Judy’s thesis.’ (Martin 2016). I also wrote that Loussikian’s implicit assumption that CTAs are discrediting ‘… ignores actual research on conspiracies and conspiracy-theory beliefs, including the points that conspiracies do exist and that conspiracy theories need to be rebutted: they are not proved wrong simply by calling them conspiracy theories,’ citing several studies of conspiracy theories (Martin 2016). However, at the time I did not undertake a close examination of the CTA.
In 2019, three years after the attack, an article appeared in the scientific journal Vaccine by four Australian academics who had carried out a close analysis of Wilyman’s thesis (Wiley et al. 2019). They found much to criticise and made strong comments; Wilyman (2019) wrote a response. What is important here is what Wiley et al. (2019) did not say: they made no mention of finding any conspiracy theory in her thesis. The implication was that, from the point of view of the only published scholarly analysis of the thesis, the CTA had little or no basis.
The Wiley et al. analysis led me to write two articles, one about the uptake of the CTA (Martin 2019) and the one you are now reading. These are the only two direct responses to the CTA. Of the various possible responses to a CTA, the most common was to ignore it. Debunking had to wait three years, for the article I wrote about the uptake of the CTA (Martin 2019). No one used humour in response.
I tried, to a limited extent, to engage with the chief purveyor of the CTA, journalist Kylar Loussikian, by sending him a draft of my analysis of his initial article (Martin 2016). He responded by suggesting I include a link to his article but otherwise had no comments. Given that the CTA had been repeated by numerous commentators, including journalists, scientists and pro-vaccination campaigners, but that none had made any attempt to provide detailed evidence for the CTA, engagement did not seem a productive avenue.
Another response is counterattack: claiming that the attackers themselves subscribe to a conspiracy theory. I mentioned this in passing in some of my publications (e.g., Martin 2018a, 57n10), but did not pursue this approach in any concerted way.
In terms of methods, challenging the CTA (Martin 2019, 2020) involved exposing it, validating the target (the thesis), interpreting the CTA as an attack on academic freedom, avoiding official channels (such as formal complaint procedures) and resisting intimidation.
In summary, the CTA about Wilyman’s thesis was a key part of an attack intended to discredit her and her work. Responding to the CTA was difficult because the attackers provided only the flimsiest of evidence to support the allegation, with most of them simply repeating the initial claims. Responding to the CTA carried the risk of making it seem to define the thesis; it was safer to point out that the attackers did not address the main arguments in her thesis. The controversy created a huge readership for the thesis, including many readers sympathetic to Wilyman’s arguments. For them, the CTA was largely irrelevant except as a sign of the hostility of the attackers.
The reason why I addressed the CTA about Wilyman’s thesis is that I had been her principal PhD supervisor and felt it was my responsibility to do what I could to counter unfair attacks on her thesis. As well, the attackers attempted to discredit both me and the University of Wollongong through numerous tweets and blog posts, a petition and Wikipedia entries, among other methods. Therefore, I had a personal stake in the controversy over her thesis. The CTA was just one element in the attack. My choice about when and how to respond to it reflected my assessment of the options for responding to various elements of the attack, not just the CTA.
From the point of view of those concerned about conspiracy theorists and popular belief in conspiracy theories, the question is what to do about them, if anything. As stated in the introduction, this paper’s argument is that, as a foundation for deciding whether to respond to conspiracy theories, it is important to examine how to respond to them. The case studies presented here support this argument.
Those pondering whether to debunk assume they are right and that the ideas they want to debunk are wrong, misguided and possibly dangerous. What happens when both sides think they are right and the other side is wrong and dangerous? This is the situation in the vaccination debate, in which proponents believe critics are a hazard to public health and the critics believe public health is better served by raising concerns about vaccination.
There is also the issue of double standards: why are some conspiracy theories considered so wrong and dangerous that they should be discredited, whereas others are ignored or endorsed? Those who advocate debunking seldom address this question, instead assuming they are right and their targets are misinformed, misguided and possibly dangerous.
In this sort of situation, rather than thinking in terms of debunking or not debunking, it is more productive to think of one’s goals and evaluate different methods for achieving them. In the vaccination debate, most campaigners think primarily about how to win, namely how to achieve their preferred outcome. But winning is not the only possible goal. Here are several others, starting with the two commonly espoused by campaigners.
1. Convincing people of correct ideas
2. Promoting good policy
3. Helping people to think critically about evidence and arguments
4. Enabling a free and open debate
5. Fostering dialogue between participants in the debate
Most of the discussion about debunking seems to assume goal 1 or 2. Goal 1 is not as straightforward as it sounds, because ideas are bound up in worldviews, and when worldviews are incompatible, this complicates the process of promoting correct ideas.
From the point of view of vaccination proponents, successful debunking of conspiracy-theory beliefs of vaccine critics would contribute to goal 1, and this in turn would make it easier to achieve goal 2. On the other hand, for vaccine critics, successful debunking of false CTAs promulgated by vaccination supporters would contribute to goal 1. This example highlights that campaigners on the two main sides have different views about what ideas are correct and what constitutes good policy.
Some pro-vaccination researchers favour goal 5: they recommend engaging in dialogue with vaccine-hesitant parents, less to convince them of correct ideas than to encourage them to think for themselves with a better appreciation of the implications of their actions (Leask et al. 2012). The assumption underlying the work of such researchers is that respectful conversations will lead many, though not all, hesitant parents to have their children vaccinated. On the other hand, some vaccine critics also favour goal 5. They direct some of their efforts towards parents who support vaccination but who have not given much thought to problems with vaccines. Those focusing on goal 5 thus diverge in their choice of target audience and in the choice of evidence and arguments they think should be critically interrogated.
My own interventions have been driven by goal 4. Without taking a side in the vaccination debate, I have tried to support free speech and fair debate, under the assumption that informed public participation is valuable for policy making. Because pro-vaccination campaigners have overwhelming superiority in numbers, power and willingness to silence opponents, my interventions (e.g., Martin 2011, 2018a) have been to encourage greater understanding of controversy dynamics, in particular methods of silencing and how to resist them.
Finally, it can be asked, why has there been so little attention to CTAs compared to the attention lavished on conspiracy theories and whether to combat them? One explanation is that the impacts of conspiracy theories are much greater than those of CTAs. Another explanation is that CTAs are primarily targeted against individuals and groups that are stigmatised and have relatively little power to fight back. The suggestion here is that campaigners, and many researchers, side with those having the greatest power. The roles of these explanations, and other factors, are matters for further investigation.
This examination shows a potential advantage of studying CTAs: it provides a window into additional options for responding. Because the targets of CTAs are usually politically weaker than their accusers, they have the extra options of counterattacking and accepting. This is a useful reminder to take into account the relative resources — political, social, epistemological — of conspiracy theorists and those who oppose them. The concept of debunking implicitly assumes that potential debunkers have resources to undertake the task and an audience for their efforts. Discussing whether debunking is a good idea should take into account options, resources and audiences, as well as likely consequences.
Thanks to Lee Basham, Kurtis Hagen and two anonymous referees for valuable comments on drafts of this paper.
Basham, Lee. 2018. ‘Joining the Conspiracy.’ Argumenta 3 (2): 271–290.
Bjerg, Ole and Thomas Presskorn-Thygesen. 2017. ‘Conspiracy Theory: Truth Claim or Language Game?’ Theory, Culture & Society 34 (1): 137–159.
Blume, Stuart. 2017. Immunization: How Vaccines Became Controversial. London: Reaktion Books.
Bratich, Jack Z. 2008. Conspiracy Panics: Political Rationality and Popular Culture. Albany: State University of New York Press.
Brooks, Aloysia. 2016. The Annihilation of Memory and Silent Suffering: Inhibiting Outrage at the Injustice of Torture in the War on Terror in Australia. PhD Thesis, University of Wollongong. http://ro.uow.edu.au/theses/4865/.
Buenting, Joel and Jason Taylor. 2010. ‘Conspiracy Theories and Fortuitous Data.’ Philosophy of the Social Sciences 40 (4): 567–578.
Cernic, Mateja. 2018. Ideological Constructs of Vaccination. Newcastle upon Tyne: Vega Press.
Coady, David. 2006. Conspiracy Theories: The Philosophical Debate. Aldershot, UK: Ashgate.
Colgrove, James. 2006. State of Immunity: The Politics of Vaccination in Twentieth-Century America. Berkeley: University of California Press.
Conis, Elena. 2015. Vaccine Nation: America’s Changing Relationship with Immunization. Chicago: University of Chicago Press.
deHaven-Smith, Lance. 2013. Conspiracy Theory in America. Austin: University of Texas Press.
Dentith, Matthew R. X. 2014. The Philosophy of Conspiracy Theories. New York: Palgrave Macmillan.
Dentith, Matthew R. X., ed. 2018. Taking Conspiracy Theories Seriously. Lanham, MD: Rowman & Littlefield.
Ehreth, Jenifer. 2003. ‘The Value of Vaccination: A Global Perspective.’ Vaccine 21: 4105–4117.
Engel, Susan and Brian Martin. 2015. ‘Challenging Economic Inequality: Tactics and Strategies.’ Economic and Political Weekly 50 (49), 5 December: 42–48.
Goertzel, Ted. 2010. ‘Conspiracy Theories in Science.’ EMBO Reports 11 (7): 493–499.
Goertzel, Ted. 2019. ‘The Conspiracy Theory Pyramid Scheme.’ In Conspiracy Theories and the People Who Believe Them, edited by Joseph Uscinski, 226–242. New York: Oxford University Press.
Gray, Truda and Brian Martin. 2006. ‘Defamation and the Art of Backfire.’ Deakin Law Review 11 (2): 115–136.
Grimes, David Robert. 2016. ‘On the Viability of Conspiratorial Beliefs.’ PLOS ONE 11 (1): e0147905.
Habakus, Louise Kuo and Mary Holland, eds. 2011. Vaccine Epidemic: How Corporate Greed, Biased Science, and Coercive Government Threaten Our Human Rights, Our Health, and Our Children. New York: Skyhorse.
Hagen, Kurtis. 2018. ‘Conspiracy Theories and the Paranoid Style: Do Conspiracy Theories Posit Implausibly Vast and Evil Conspiracies?’ Social Epistemology 32 (1): 24–40.
Harambam, Jaron. 2020. Contemporary Conspiracy Culture: Truth and Knowledge in an Era of Epistemic Instability. London: Routledge.
Harambam, Jaron and Stef Aupers. 2015. ‘Contesting Epistemic Authority: Conspiracy Theories on the Boundaries of Science.’ Public Understanding of Science 24 (4): 466–480.
Harambam, Jaron and Stef Aupers. 2017. ‘“I Am Not a Conspiracy Theorist”: Relational Identifications in the Dutch Conspiracy Milieu.’ Cultural Sociology 11 (1): 113–129.
Hobson-West, Pru. 2007. ‘“Trusting Blindly Can Be the Biggest Risk of All”: Organised Resistance to Childhood Vaccination in the UK.’ Sociology of Health & Illness 29 (2): 198–215.
Husting, Ginna and Martin Orr. 2007. ‘Dangerous Machinery: “Conspiracy Theorist” as a Transpersonal Strategy of Exclusion.’ Symbolic Interaction 30 (2): 127–150.
Jolley, Daniel and Karen M. Douglas. 2014. ‘The Effects of Anti-vaccine Conspiracy Theories on Vaccination Intentions.’ PLOS ONE 9 (2): e89177.
Kitta, Andrea. 2012. Vaccinations and Public Concern in History: Legend, Rumor, and Risk Perception. New York: Routledge.
Leach, Melissa and James Fairhead. 2007. Vaccine Anxieties: Global Science, Child Health & Society. London: Earthscan.
Leask, Julie, Paul Kinnersley, Cath Jackson, Francine Cheater, Helen Bedford and Greg Rowles. 2012. ‘Communicating with Parents about Vaccination: A Framework for Health Professionals.’ BMC Pediatrics 12 (154): 1–11.
Levy, Neil. 2007. ‘Radically Socialized Knowledge and Conspiracy Theories.’ Episteme 4 (2): 181–192.
Loussikian, Kylar. 2016. ‘Uni Accepts Thesis on Vaccine “Conspiracy”.’ The Australian, 13 January, pp. 1, 4.
Martin, Brian. 1999. ‘Suppression of Dissent in Science.’ Research in Social Problems and Public Policy 7: 105–135.
Martin, Brian. 2007. Justice Ignited: The Dynamics of Backfire. Lanham, MD: Rowman & Littlefield.
Martin, Brian. 2009. ‘Managing Outrage over Genocide: Case Study Rwanda.’ Global Change, Peace & Security 21 (3): 275–290.
Martin, Brian. 2011. ‘Debating Vaccination.’ Living Wisdom 8: 14–40.
Martin, Brian. 2015. ‘Censorship and Free Speech in Scientific Controversies.’ Science and Public Policy 42 (3): 377–386.
Martin, Brian. 2016. ‘News with a Negative Frame: A Vaccination Case Study.’ 3 March. http://www.bmartin.cc/pubs/16Loussikian.html.
Martin, Brian. 2017. ‘Defending University Integrity.’ International Journal for Educational Integrity 13: 1–14.
Martin, Brian. 2018a. Vaccination Panic in Australia. Sparsnäs, Sweden: Irene Publishing.
Martin, Brian. 2018b. ‘Persistent Bias on Wikipedia: Methods and Responses.’ Social Science Computer Review 36 (3): 379–388.
Martin, Brian. 2019. ‘Uptake of a Conspiracy Theory Attribution.’ Social Epistemology Review and Reply Collective 8 (6): 16–30.
Martin, Brian. 2020. ‘Dealing with Conspiracy Theory Attributions.’ Social Epistemology 34 (5): 409–422 [this paper].
Martin, Brian and Truda Gray. 2005. ‘How to Make Defamation Threats and Actions Backfire.’ Australian Journalism Review 27 (1): 157–166.
McDonald, Paula, Tina Graham and Brian Martin. 2010. ‘Outrage Management in Cases of Sexual Harassment as Revealed in Judicial Decisions.’ Psychology of Women Quarterly 34: 165–180.
Oliver, J. Eric and Thomas J. Wood. 2014. ‘Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.’ American Journal of Political Science 58 (4): 952–966.
Pigden, Charles. 2007. ‘Conspiracy Theories and the Conventional Wisdom.’ Episteme 4 (2): 219–232.
Plotkin, Stanley A., Walter A. Orenstein, Paul A. Offit and Kathryn M. Edwards. 2017. Plotkin’s Vaccines. 7th ed. Amsterdam: Elsevier.
Räikkä, Juha and Lee Basham. 2019. ‘Conspiracy Theory Phobia.’ In Conspiracy Theories and the People Who Believe Them, edited by Joseph E. Uscinski, 178–186. New York: Oxford University Press.
reasonable hank. 2012. ‘Of Publication, and Sleights of Hand.’ 18 September. https://reasonablehank.com/2012/09/18/of-publication-and-sleights-of-hand/.
Reich, Jennifer A. 2016. Calling the Shots: Why Parents Reject Vaccines. New York: New York University Press.
Sunstein, Cass R. and Adrian Vermeule. 2009. ‘Conspiracy Theories: Causes and Cures.’ Journal of Political Philosophy 17 (2): 202–227.
Thalmann, Katharina. 2019. The Stigmatization of Conspiracy Theory since the 1950s: ‘A Plot to Make Us Look Foolish.’ London: Routledge.
Uscinski, Joseph E., ed. 2019a. Conspiracy Theories and the People Who Believe Them. New York: Oxford University Press.
Uscinski, Joseph E. 2019b. ‘Down the Rabbit Hole We Go!’ In Conspiracy Theories and the People Who Believe Them, edited by Joseph E. Uscinski, 1–32. New York: Oxford University Press.
Uscinski, Joseph E. 2019c. ‘Are Conspiracy Theories “Anti-science”?’ In Conspiracy Theories and the People Who Believe Them, edited by Joseph E. Uscinski, 199–200. New York: Oxford University Press.
Wiley, Kerrie E., Julie Leask, Margaret A. Burgess and Peter B. McIntyre. 2019. ‘PhD Thesis Opposing Immunisation: Failure of Academic Rigour with Real-world Consequences.’ Vaccine 37: 1541–1545.
Wilyman, Judy. 2015. A Critical Analysis of the Australian Government’s Rationale for its Vaccination Policy. PhD thesis, University of Wollongong. http://ro.uow.edu.au/theses/4541/.
Wilyman, Judy. 2019. ‘PhD Thesis on Vaccination Policy: Scholarly and Socially Relevant.’ Vaccination Decisions. https://www.vaccinationdecisions.net/wp-content/uploads/2019/04/Response-final-to-the-Vaccine-Article-by-Wiley-et-al190417.pdf.
Wood, Michael J. 2016. ‘Some Dare Call It Conspiracy: Labeling Something a Conspiracy Theory Does Not Reduce Belief in It.’ Political Psychology 37 (5): 695–705.
Wood, Michael J. and Karen M. Douglas. 2013. ‘“What about Building 7?” A Social Psychological Study of Online Discussion of 9/11 Conspiracy Theories.’ Frontiers in Psychology 4, article 409: 1–9.
World Health Organization. 2019. ‘Ten Threats to Global Health in 2019.’ https://www.who.int/emergencies/ten-threats-to-global-health-in-2019.