Persistent bias on Wikipedia: methods and responses

Published, with minor subediting, in Social Science Computer Review, Vol. 36, No. 3, June 2018, pp. 379-388; doi: 10.1177/0894439317715434


Brian Martin


Abstract

Systematically biased editing, persistently maintained, can occur on Wikipedia while nominally following its guidelines. Techniques for biasing an entry include deleting positive material, adding negative material, using a one-sided selection of sources, and exaggerating the significance of particular topics. To maintain bias in an entry in the face of resistance, key techniques are reverting edits, selectively invoking Wikipedia rules, and overruling resistant editors. Options for dealing with sustained biased editing include making complaints, mobilising counter-editing, and exposing the bias. To illustrate these techniques and responses, the rewriting of my own Wikipedia entry serves as a case study. Becoming aware of persistent bias and developing ways to counter it would help Wikipedia move closer to its goal of providing accurate and balanced information.

 

Wikipedia is perhaps the best-known product of cooperative voluntary work (Jemielniak, 2014; Lih, 2009; Reagle, 2010). Through the efforts of thousands of contributors, it has rapidly outgrown traditional encyclopaedias in size and influence. Anyone who regularly searches the web for information is likely to see numerous links to Wikipedia entries. All sorts of people, from students to people with health problems, rely on Wikipedia for information. Wikipedia’s ascendancy was achieved within a few years of exceptional growth, based on contributions from unpaid and unheralded editors.

Yet Wikipedia has had plenty of teething problems (Lovink & Tkacz, 2011). On some controversial topics, such as abortion and George W. Bush, there have been edit wars, with committed editors seeking to impose their viewpoints (Lih, 2009, pp. 122-131; Yasseri et al., 2012). There are trolls and vandals who, for various reasons, seek to deface well-written entries. There are covert editing efforts to shape the portrayal of individuals, organisations and topics, for example when paid workers edit entries about their employer or client (Craver, 2015; Thompson, 2016). Allegations have been made about systematic bias on certain topics, for example parapsychology (Weiler, 2013, pp. 152-183), and about a range of other problems (Wikipediocracy, 2017).

Wikipedia has instituted various measures to address problems. To interrupt editing wars, versions of some entries are locked down. To fix edits by trolls and vandals, various bots patrol entries, alerting administrators to suspicious changes (Geiger, 2011). Although nominally Wikipedia is an egalitarian enterprise in that anyone can be an editor, in practice admins have a lot of power, and some have more power than others (O’Neil, 2009). 
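
To make the patrolling pattern concrete, here is a toy sketch (in Python) that polls Wikipedia's public recent-changes feed and flags edits deleting large amounts of text, the kind of crude signal that might be escalated to a human. Real anti-vandalism bots are far more sophisticated; the 2,000-byte threshold and the flagging logic are assumptions made purely for illustration.

```python
# Toy illustration only: flag recent edits that remove a lot of text,
# using the public MediaWiki API. Real patrol bots use much richer signals.
import requests

API = "https://en.wikipedia.org/w/api.php"

def suspicious_changes(threshold=-2000):
    """Yield (title, user, byte delta) for edits shrinking a page by more
    than `threshold` bytes, a crude stand-in for a vandalism signal."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|user|sizes",   # "sizes" returns oldlen and newlen
        "rctype": "edit",
        "rclimit": 50,
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=10).json()
    for rc in data["query"]["recentchanges"]:
        delta = rc["newlen"] - rc["oldlen"]
        if delta < threshold:           # large deletion: worth a human look
            yield rc["title"], rc["user"], delta

if __name__ == "__main__":
    for title, user, delta in suspicious_changes():
        print(f"{title}: {delta:+d} bytes by {user}")
```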

Wikipedia has detailed sets of guidelines about various matters, for example neutral point of view, vandalism, disruptive editing, and biographies of living persons (Reagle, 2010). If these guidelines were always followed, Wikipedia would be remarkably free of problems. The real challenge is ensuring they are implemented in practice. All sorts of organisations, including governments, corporations and churches, have noble-sounding aims and rigorous rules, but these sometimes provide a façade for corruption and abuse. Ironically, Wikipedia rules are often used less to resolve disputes than as tools in waging editing struggles (Tkacz, 2015, p. 99).

Because of the possible discrepancy between rules and practice, the test of Wikipedia's performance lies in examining actual practice. There are various ways to go about this, for example looking at the processes by which vandalism is rectified (Geiger & Ribes, 2010) or at bias in entries on politically contested topics as revealed by word use (Greenstein & Zhu, 2012), an approach sketched below. Because Wikipedia is so huge and in multiple languages, any attempt to look broadly at patterns of inaccuracy and bias is overwhelming in scale.
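
The word-use approach can be illustrated with a minimal sketch: count occurrences of phrases previously coded as favouring one side or the other. This is a simplification loosely in the spirit of Greenstein and Zhu (2012), not their actual measure, and the example phrases are placeholders.

```python
# Crude slant score: +1 means only side-A phrases appear, -1 only side-B,
# 0 means a balanced (or empty) mix. A simplification for illustration.
def slant_score(text, side_a_phrases, side_b_phrases):
    t = text.lower()
    a = sum(t.count(p) for p in side_a_phrases)
    b = sum(t.count(p) for p in side_b_phrases)
    total = a + b
    return 0.0 if total == 0 else (a - b) / total

# Placeholder phrase lists, not the coded phrases from the actual study.
print(slant_score(
    "The reform was a tax relief measure, not an estate tax giveaway.",
    side_a_phrases=["tax relief", "death tax"],
    side_b_phrases=["estate tax", "tax giveaway"],
))
```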

My aim here is to indicate some of the techniques of biased editing, ways of maintaining it in the face of resistance, methods for probing it, and options for responding. The aim is not to estimate the prevalence or seriousness of bias but rather to show how it can be instituted and opposed.

To illustrate techniques and responses, I use a particular Wikipedia page as my primary example: the entry about myself in the first half of 2016. My central purpose is to illustrate methods for imposing and maintaining bias; entrenched bias on some other pages is far more extensive and serious than on mine. My page is convenient for analysis because the volume of data is smaller and the trigger for rewriting is obvious.

Given that all claims can be disputed and that there are legitimate differences of perspective, a purely neutral presentation is only an aspiration or guiding principle, not an achievement. It might be said that every Wikipedia editor is biased in one way or another, usually unconsciously. However, by being challenged by other editors, with different views, the more extreme sorts of bias may be overcome, or at least that is the assumption underlying Wikipedia’s operations. The problem I address here is not the bias of individual editors, which is predictable, unavoidable and usually unconscious rather than intentional, but the systematic and sustained imposition of a particular point of view in the face of reasonable objections, using techniques or creating outcomes that are not regularly observed in other Wikipedia entries. This sort of imposed bias might also be called advocacy or partisanship, or even propaganda or disinformation, though these latter terms have connotations of government manipulation of the facts. Commercially inspired biased editing might be considered a form of covert advertising. The essence of the sort of bias addressed here is concerted and sustained promotion of a particular point of view, overruling objections.

In the next section, I list a number of methods that can be used to bias an entry, using examples from my own entry to illustrate them. In the following section, I look at methods to maintain bias in the face of resistance, concluding with a comparison between my Wikipedia entry and several others. Finally, I outline six options for responding to particular cases of persistent bias and present a general suggestion for revising Wikipedia policies.

Methods for biasing a Wikipedia entry

The following methods are among those that can be used to bias a Wikipedia entry. They are couched in terms of turning a Wikipedia article or page that is positive about a person or topic into one that is negative. Most of these methods are obvious enough in principle, though the ways they are implemented can be ingenious. Some of the subtleties are addressed in Martin (2016).

A major constraint on biasing techniques is that they need to appear to conform to Wikipedia policies, at least on the surface. Adherence to guidelines can actually make an entry more convincing, with the bias being disguised through the formalities of style and referencing.

On January 21, 2016, admin JzG (aka Guy) rewrote most of my Wikipedia entry (https://en.wikipedia.org/wiki/Brian_Martin_(social_scientist)), turning it into an attack on my reputation. In the following months, this negative framing was maintained, primarily by JzG and editor Gongwool. See Martin (2016) for a detailed analysis of the content of my entry, showing each of these techniques at work.

Delete positive material. JzG removed text about my achievements, for example a paragraph about my studies of the dynamics of power. JzG also deleted the list of some of my “works” (books and articles). Yet nearly every other Wikipedia entry for a living Australian academic, excluding stubs, lists some of the subject’s publications.

Add negative material. JzG added new material to my entry, initially devoted to criticisms of the thesis of my PhD student Judy Wilyman. In the following months, considerable material was added about my social science analyses of the debate over the origin of AIDS, all oriented toward discrediting the theory that AIDS arose from contaminated 1950s experimental polio vaccines. Other Wikipedia entries about the origin of AIDS were similarly slanted (Dildine, 2016).

Use a one-sided selection of sources. According to Google Scholar, my publications have received thousands of citations. Many of these citations are incidental and do not discuss my work; of those that do, a majority are favourable. By looking through the citations to my academic publications, it would be easy to find positive references to my work (e.g., Delborne, 2008; Mazur, 1992; McDonald & Backstrom, 2008). However, the hostile editors of my Wikipedia page relied primarily on critical citations, especially newspaper articles, even though these are not high-quality sources. What is interesting here is that this form of systematic bias can be implemented while conforming on the surface to Wikipedia’s guidelines on using sources to support claims.

On February 9, 2016, JzG made an edit accompanied by this comment: “(this is, I think, more accurate. Lots of sources criticising Martin for his defence of Wakefraud, but I am still looking for one that will not be challenged as "TEH SKEPTICXS!!!").” Setting aside the misreading of my paper that discusses Andrew Wakefield (Martin, 2015), JzG here articulates his intent to find a source to support his view rather than to provide a neutral point of view.

Expand or exaggerate the significance of negative material; omit or downgrade other contributions. My publications on the origin-of-AIDS debate and my supervision of Judy Wilyman are given extensive treatment even though they form only a small part of my research output and supervision, while my research and advocacy on other issues (controversies, whistleblowing, nonviolent action, democracy, information politics, tactics against injustice, and more) are either not mentioned or given only brief coverage.

Write text so it has negative connotations or conveys incorrect information. Many of the comments about my work are slanted, almost always to emphasise or suggest negatives. False claims are made, for example that I endorse the OPV origin-of-AIDS theory and oppose vaccination, backed up by citations to others who made these misrepresentations. For example, I am included in the category “anti-vaccination activists” even though I do not have strong views about vaccines and have never campaigned or even argued against them, something I repeatedly state in my writings.

One statement (July 19, 2016 version) is “Agence Science Presse reports Martin ‘also defends the idea of a vaccine-autism link’.” However, the report is wrong: I have never defended this idea. The technique here is to find a source that makes false or misleading claims and to quote and cite it without any attempt to present contrary views. There is no secondary source available to counter the Agence Science Presse claim: after all, no one could be expected to write about every belief I do not hold.

Maintaining bias

That editors and edits will be biased is to be expected, and Wikipedia policies take this into account, most notably Neutral Point of View. The idea behind Wikipedia’s collaborative model is that individually biased editors can collectively create a relatively neutral treatment of a topic through the give-and-take of contributors with a variety of points of view, with the bias of any individual being countered by others. As well, some sort of appeal procedure is needed in cases where bias is doggedly imposed. Biasing of entries uses various techniques, as outlined in the previous section. To maintain bias in the face of resistance, several additional techniques are important:

Revert contrary edits. An example, one of several: on January 25, 2016, Johnfos added some text; it was immediately reverted by JzG.

Invoke Wikipedia policies selectively. Wikipedia guidelines are powerful tools, and they can be selectively applied: attackers may violate the guidelines while using them to justify reverting corrections to their edits (O’Neil, 2009, pp. 154, 161). On February 4, 2016, JzG removed a link to the publisher of several of my books, commenting “Sales promotion pages are not a reliable independent secondary source.” JzG thus invoked Wikipedia’s rule on reliable sources to remove a link to one publisher’s website while leaving links to the websites of other publishers untouched. Earlier, JzG had relied heavily on newspaper articles as secondary sources.

Attack resistant editors. SmithBlue made a concerted effort to oppose some of the one-sided material on my entry, pointing out that additions did not conform to Wikipedia’s guidelines. SmithBlue’s changes were mostly reversed. SmithBlue then appealed to Wikipedia admins, leading to a lengthy and bitter dispute on the admins’ noticeboard (Wikipedia, 2016), with the result that SmithBlue was banned from further editing on the topic of AIDS, including my page.

The rewriting of my entry can be understood as being part of a wider campaign. On January 11, 2016, an announcement was made that a doctoral student I had been supervising, Judy Wilyman, had received her PhD. Her thesis (Wilyman, 2015) is a critique of the Australian government’s vaccination policy. Since 2009, there has been a citizens’ group in Australia called Stop the Australian (Anti)Vaccination Network (SAVN) that has denigrated and harassed public critics of vaccination. For years Judy had been one of their targets.

On January 13, 2016, a front-page story appeared in the national newspaper The Australian condemning the thesis, me as supervisor, and the university for allowing Judy to graduate. This story led to an outpouring of hostile commentary in blogs and tweets, including a petition calling for disciplinary action against the university. In my thirty years of studying dissent and scientific controversies, I had never encountered so large and sustained an attack relating to a student’s thesis (Martin, 2017).

On January 14, JzG created a new Wikipedia entry titled “Judith Wilyman PhD controversy,” almost entirely hostile to her thesis. In addition, a version of this new entry was added to the University of Wollongong’s Wikipedia page. JzG’s extensive rewriting of my own entry commenced on January 21. The initial rewriting of my page thus can be seen as part of a coordinated effort to denigrate Judy’s thesis and the individuals and organisations associated with it, including me. Later, my work on the origin-of-AIDS debate was denigrated. The reframing of my page was maintained for months in the face of resistance.

It is revealing to compare my entry, before and after being rewritten, with the Wikipedia entries of several of my peers. For this comparison, I chose four living male Australian academics in the humanities or social sciences, born within five years of me, who have been involved in public issues that might be considered controversial, who have written numerous books, and who have Wikipedia entries. Table 1 reports three features of the entries: lists of works/publications, the fraction of references (counted as endnotes in Wikipedia entries) to media sources, and the fraction of references to works by the subject of the entry. The figures in Table 1, collected on February 10, 2017, could vary slightly depending on how sources are classified as media (mass or social) or non-media; a minimal sketch of the tallying follows the table.

Table 1. Selected comparisons between Wikipedia entries of five Australian academics, February 10, 2017 unless noted otherwise

Name | List of publications | References to mass or social media sources | References to works by subject of entry
Dennis Altman | Separate section listing 13 books with publishers and dates | 2/12 | 5/12
Stuart Macintyre | Separate section listing 6 books with publishers and dates | 6/17 | 8/17
Robert Manne | Separate sections listing 24 books, 12 essays, etc., with publishers and dates | 1/4 | 2/4
Brian Martin (January 1, 2016) | Separate sections listing 17 books and 7 articles with publishers and dates | 3/12 | 7/12
Brian Martin (February 10, 2017) | Separate section mentioning 4 books, without specific dates | 15/25 | 1/25
Keith Windschuttle | Separate section listing 10 books with publishers and dates | 8/87 | 18/87
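
The fractions in Table 1 come from simple hand tallies, sketched below. The classification of each endnote (media or not, by the subject or not) was done by reading it; the sample data here is invented for illustration, not drawn from any actual entry.

```python
# Sketch of the Table 1 tallies. Each endnote of an entry is classified by
# hand; the reported fractions are then simple counts. Sample data invented.
endnotes = [
    {"media": True,  "by_subject": False},
    {"media": False, "by_subject": True},
    {"media": False, "by_subject": True},
    {"media": True,  "by_subject": False},
    {"media": False, "by_subject": False},
]

total = len(endnotes)
media = sum(n["media"] for n in endnotes)
by_subject = sum(n["by_subject"] for n in endnotes)

print(f"References to mass or social media sources: {media}/{total}")
print(f"References to works by subject of entry: {by_subject}/{total}")
```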

This comparison shows that prior to 2016, my entry was fairly similar, in the categories listed, to those of the other four Australian academics. By July 2016, my entry had become atypical in three ways: it no longer had a dot-pointed list of works with publishers and dates; it relied unusually heavily on media sources; and it contained only a single reference (that is, a Wikipedia endnote) to my own works. This suggests that my entry, while being edited in 2016, became in several ways systematically different from the usual form for such entries, a difference maintained through dozens of edits over many months. This is evidence of persistent bias, as judged by comparison with the typical practice of Wikipedia editors in a particular domain.

Responding to persistent bias

Options for response include:

1. Edit the page yourself.

2. Complain to Wikipedia.

3. Threaten legal action.

4. Do nothing.

5. Mobilise counter-editing.

6. Expose the bias.

1. Edit the page yourself. This is an obvious option for many pages, though if it is your own page, Wikipedia guidelines advise against editing it except in special circumstances. Editing entries is Wikipedia’s operating principle, the idea being that the combined efforts of different contributors will lead to a more accurate and balanced text. However, when entries are highly contested, any individual editor can be outgunned by more determined, experienced and influential editors or admins. In such cases, personal edits will be regularly reversed and efforts will be in vain. Another limitation is that your own editing is almost certainly biased, and furthermore you can be accused of imposing bias yourself. A related possibility is to comment on talk pages.

2. Complain to Wikipedia. There are various processes within Wikipedia for registering disagreement with entries and editing. Some of them involve seeking intervention by admins. Complaints are more likely to be successful if admins are fair-minded and willing to intervene against unfair editing. However, sometimes admins may be responsible for biased editing, tolerant of it, or unwilling to enter into a dispute with another admin, in which case this approach will not succeed. It is also possible to make complaints at a higher level, but these are most likely to be referred to lower-level processes.

3. Threaten legal action. If hostile editors are known and they have included false and defamatory material in an entry, they could be sued. However, many editors do not reveal their off-line identity. Furthermore, threatening legal action might trigger even more hostile editing, in what is known as the Streisand effect or defamation backfire (Jansen & Martin, 2015), and the new editing would be more careful to create a negative image while avoiding obviously defamatory statements.

It would also be possible to sue Wikipedia as the publisher of defamatory material, with the identity of editors sought using the discovery process during litigation. Suing is expensive and not guaranteed to succeed, and there is no guarantee that even a successful legal action would lead to a better entry: detractors might set up a new page, or initiate other reputation-damaging actions in social media. Furthermore, suing might well be damaging to Wikipedia as an enterprise. If the aim is to help Wikipedia develop processes to ensure high-quality pages, legal action is not promising.

4. Do nothing. This means just accepting that an entry is systematically biased but that it is not worth the effort to try to fix it. Doing nothing can mean judging that few people pay attention to the page, or that the opportunity costs of trying to change it are too great: there may be other ways to convey more accurate information, for example personal web pages, social media, publicity, advertising or word-of-mouth.

5. Mobilise counter-editing. Although personally trying to redress persistent bias on Wikipedia may be futile, there is a greater prospect of success in recruiting a whole team or network of editors. When the bias was created or reinforced by more than one editor, countering it will probably require recruiting supporters in greater numbers or with greater energy, skills and/or authority. For example, it might be possible to recruit co-workers, members of a social movement, or friends and acquaintances, and help or encourage them to become editors of the pages in question. This might even be a learning opportunity for those involved.

Mobilising a sufficiently large and energetic group of sympathetic editors is one way to overcome persistent bias, but it is worth making an assessment of the time and energy involved, which might be used for other purposes. This is the same issue of opportunity cost mentioned in option 4, except with a collective effort the cost is higher.

There is another possible risk in mobilising editors to intervene: others might see this as creating rather than redressing bias, and seek to counter it by undertaking their own mobilisation efforts. The result might well be an escalation of the struggle, with no great improvement in the objectivity of the entry, as the tactic is basically an attempt to oppose propaganda with counter-propaganda.

Finally, it is important to note that organised editing contravenes the Wikipedia policy on canvassing. Hence, this response is risky; it could lead to measures against those involved.

6. Expose the bias. Rather than, or as well as, trying to counter bias, it is possible to expose it by documenting inaccuracies, misleading statements, omissions and so forth. Beyond writing a convincing account, exposing bias requires getting the account to interested readers. To do this in a way that matches the reach and influence of Wikipedia is a daunting challenge. Most people simply want to read Wikipedia to learn about whatever topic interests them and are unlikely to search for and read an explanation of why a particular Wikipedia entry is allegedly biased. However, this might still be worthwhile if the account is distributed to recipients with a special interest in the topic.

In the case of my Wikipedia entry, I did not pursue option 1, editing it myself. However, a number of others, sympathetic to me but acting independently, edited my entry. (Had I requested their efforts, this would have been option 5.) Their efforts made some difference to the content but did not redress the fundamental reorientation of my page imposed primarily by JzG and Gongwool. One of these independent editors, SmithBlue, attempted to use Wikipedia procedures to question the wholesale revision of my page, but with little success. To my knowledge, there were no threats of legal action (option 3). There is a Wikipedia policy on attack pages that potentially could be invoked (option 2), though whether my entry ever fully conformed to the specifications for an attack page is debatable.

My case is special because I am a social scientist researching tactics of denigration, harassment, and censorship. Therefore, I am more interested in the value of my page in revealing techniques of biasing than in having the page be a more accurate reflection of my life and work. For years, I never paid much attention to my Wikipedia entry because I run my own website, which contains the full text of nearly all my publications plus much additional information, and is thus a far more comprehensive portrayal of my work than Wikipedia.

The highly partisan editing of my page did not upset me; rather, it provided a research opportunity. I was alerted to the editing by an editor previously unknown to me, and subsequently asked several individuals for advice about Wikipedia, in particular about addressing disruptive and biased editing. Several Wikipedia editors told me that trying to redress systematic bias in Wikipedia pages is a thankless task, because great efforts can be applied in making changes only to have them reversed by others who are more determined or have more sway. This persuaded me to dismiss options 1 to 5 and instead to write this article, which is one way to pursue option 6, “Expose the bias” (see also Martin, 2016). However, as noted, my interest is less in the bias itself and more in the techniques used. Therefore, I have not sought to have the entry deleted, which might be considered another option.

None of the six options has the capacity to address the general problem of persistent bias on Wikipedia. At most, some of them might lead to improvements in specific entries, most likely at the expense of considerable effort. Ultimately, reform needs to be driven by changes in Wikipedia policies, practices, and culture. The question is, what sorts of policies are actually likely to limit the impact of persistent bias? One option would be to allow the subject of a “biography of a living person” to write a comment about their own entry, to be displayed on the entry or, less effectively, made available via a link from it. However, this might give only a superficial appearance of fairness and in any case would not address systematic bias on non-biography pages.

My preference would be to institute an experimental process. Based on suggestions submitted, a panel of independent advisers would select several promising proposals and implement them on randomly selected subsets of Wikipedia pages. Then, down the track, a different panel of independent analysts would judge the outcomes. To undertake this sort of experiment, much prior study would be required to evaluate existing patterns and persistence of bias in Wikipedia entries in order to provide a benchmark for assessing any changes due to different policies. This approach to improving Wikipedia relies on experimental testing using control groups (Wilson, 2011) and on incremental learning from shortcomings (Syed, 2016).
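
A hypothetical sketch of the mechanics follows: entries are randomly assigned to a control arm or to one of the candidate policies, and after the trial the arms are compared on bias scores assigned by the independent analysts. The function names and the numeric bias score are assumptions for illustration; the text above proposes only the experimental design.

```python
# Hypothetical sketch of the proposed trial: random assignment of pages to
# policy arms, then a per-arm comparison of analysts' bias scores.
import random
from collections import defaultdict
from statistics import mean

def assign_arms(page_ids, policies, seed=42):
    """Randomly assign each page to 'control' or one of the trial policies."""
    rng = random.Random(seed)
    arms = ["control"] + list(policies)
    return {pid: rng.choice(arms) for pid in page_ids}

def compare_arms(assignment, bias_scores):
    """Average the analysts' bias scores (lower = less biased) per arm."""
    per_arm = defaultdict(list)
    for pid, arm in assignment.items():
        per_arm[arm].append(bias_scores[pid])
    return {arm: mean(scores) for arm, scores in per_arm.items()}

# Example: two candidate policies trialled on a small pool of pages.
pages = [f"page{i}" for i in range(9)]
assignment = assign_arms(pages, ["subject_comment", "source_balance_check"])
scores = {pid: random.random() for pid in pages}  # stand-in for analyst scores
print(compare_arms(assignment, scores))
```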

Conclusion

Wikipedia is an amazing innovation in collective voluntary effort to create a highly valuable resource in the public domain. As such, it is worth defending and improving. However, this does not mean Wikipedia is flawless. As well as the more obvious problems of vandalism and struggles over contentious topics, I have sought here to highlight a less obvious but nevertheless pernicious phenomenon: persistent bias that usually flies under the radar because editors and admins appear to follow formal Wikipedia guidelines, though in a selective way. Whether individual editors are consciously biased is not the issue; no doubt nearly all have good intentions. The test of bias is in the product.

Effective imposition of bias will superficially conform to Wikipedia policies. It provides citations for statements in accordance with the policy on reliable sources, except that the citations chosen are an unrepresentative sample of those available. It excises contrary comment as involving original research while itself passing judgement on contentious matters. It presents information with an appearance of neutrality while imposing a non-neutral point of view.

I have listed a few key techniques for biasing an entry, illustrating them with examples from the editing of my page. It would be a worthwhile comparative exercise to examine biased entries in completely different domains and see whether the techniques used to slant them are similar. A preliminary hypothesis might be that nearly every Wikipedia entry on a contentious topic is likely to be subject to persistent bias and that for the result to be close to neutral depends on the unlikely circumstance that the various biased interventions happen to counteract each other.

Finally, I listed six options for dealing with persistent bias in Wikipedia entries, none of which is particularly promising. A better approach is to reform Wikipedia policies and practices; trialling a variety of options is one way to figure out how best to do this. Much more needs to be learned about how to defend and improve what is worthwhile about Wikipedia.

Acknowledgements

For many valuable comments, I thank Chris Barker, Don Eldridge, Jonathan Mavroudis and eight others who prefer to remain anonymous.

References

Craver, J. (2015). PR firm covertly edits the Wikipedia entries of its celebrity clients. Wiki Strategies, 16 June. Retrieved May 18, 2017, from http://wikistrategies.net/sunshine-sachs/.

Delborne, J. A. (2008). Transgenes and transgressions: scientific dissent as heterogeneous practice. Social Studies of Science, 38, 509-541.

Dildine, R. (2016). Wikipedia’s strange certainty about Edward Hooper, Brian Martin, and the OPV/AIDS hypothesis. Retrieved May 18, 2017, from http://www.aidsorigins.com/sites/all/files/pdfs/Wikipedias_Strange_Certainty.pdf.

Geiger, R. S. (2011). The lives of bots. In G. Lovink & N. Tkacz (Eds.), Critical point of view: A Wikipedia reader (pp. 78-93). Amsterdam: Institute of Network Cultures.

Geiger, R. S., & Ribes, D. (2010). The work of sustaining order in Wikipedia: The banning of a vandal. Proceedings of the 2010 ACM Conference on Computer-Supported Cooperative Work (CSCW 2010) (pp. 117-126). New York: ACM.

Greenstein, S., & Zhu, F. (2012). Is Wikipedia biased? American Economic Review, 102, 343-348.

Jansen, S. C., & Martin, B. (2015). The Streisand effect and censorship backfire. International Journal of Communication, 9, 656-671.

Jemielniak, D. (2014). Common knowledge? An ethnography of Wikipedia. Stanford, CA: Stanford University Press.

Lih, A. (2009). The Wikipedia revolution: How a bunch of nobodies created the world’s greatest encyclopedia. London: Aurum.

Lovink, G., & Tkacz, N. (Eds.) (2011). Critical point of view: A Wikipedia reader. Amsterdam: Institute of Network Cultures.

Martin, B. (2015). On the suppression of vaccination dissent. Science and Engineering Ethics, 21, 143-157.

Martin, B. (2016). “Brian Martin (social scientist)”: a Wikipedia entry annotated by its subject. Retrieved May 18, 2017, from http://www.bmartin.cc/pubs/16wp.pdf.

Martin, B. (2017). Defending university integrity. International Journal for Educational Integrity, 13, 1-14.

Mazur, A. (1992). Determining the truth in a scientific controversy. Minerva, 30, 582-584.

McDonald, P., & Backstrom, S. (2008). Fighting back: Workplace sexual harassment and the case of North Country. Australian Bulletin of Labour, 34, 47-63.

O’Neil, M. (2009). Cyberchiefs: Autonomy and authority in online tribes. London: Pluto.

Reagle, J. M., Jr. (2010). Good faith collaboration: The culture of Wikipedia. Cambridge, MA: MIT Press.

Syed, M. (2016). Black box thinking: Marginal gains and the secrets of high performance. London: John Murray.

Tkacz, N. (2015). Wikipedia and the politics of openness. Chicago: University of Chicago Press.

Thompson, G. (2016). Public relations interactions with Wikipedia. Journal of Communication Management, 20, 4-20.

Weiler, C. (2013). Psi wars: TED, Wikipedia and the battle for the Internet. US: Craig Weiler.

Wilson, T. D. (2011). Redirect: Changing the stories we live by. London: Allen Lane.

Wilyman, J. (2015). A critical analysis of the Australian government’s rationale for its vaccination policy. PhD thesis, University of Wollongong. Retrieved May 18, 2017, from http://ro.uow.edu.au/theses/4541/.

Wikipedia. (2016). Administrators’ noticeboard/IncidentArchive917, item 75. Retrieved May 18, 2017, from https://en.wikipedia.org/wiki/Wikipedia:Administrators%27_noticeboard/IncidentArchive917#WP:Brian_Martin_.28social_scientist.29_:_other_editor_is_feeling_stalked.2Fharassed._And_is_also_attacking_me.

Wikipediocracy. (2017). Wikipediocracy. Retrieved May 17, 2017, from http://wikipediocracy.com.

Yasseri, T., Sumi, R., Rung, A., Kornai, A. & Kertész, J. (2012). Dynamics of conflicts in Wikipedia. PLoS ONE, 7, e38869.

Author biography

Brian Martin is an honorary professorial fellow at the University of Wollongong, Australia. He is the author of 17 books and hundreds of articles on dissent, nonviolent action, scientific controversies, democracy, and other topics. Email: bmartin@uow.edu.au