Correcting error: strategic considerations

Published in Journal of Information Ethics, Volume 24, Number 2, Fall 2015, pp. 31-42

Brian Martin


Abstract

The issue of correcting scholarly errors can be conceived of as a strategic encounter. The first challenge is recognizing that an error has been made, something often made difficult by confirmation bias, paradigm effects, and self-deception. The next challenge is deciding how to respond to error or allegations of error. Five options are covering up the error, attacking the critics who point it out, explaining the error away, acknowledging the error in context, and making an unqualified apology. The appropriate choice depends on circumstances, including the significance of the error and the likely responses of opponents.

Introduction

Imagine someone alleges you’ve made an error in an article or a talk. Often the first question you ask is whether it actually is an error and, if it is, the next questions concern ethics, namely whether to make a correction and what sort of correction to make. Here, a slightly different approach will be taken, namely to consider error correction as part of a strategic encounter. When there are opponents who seek to obtain some advantage from your error, and possibly from your admission of error, their likely responses need to be taken into account.

A common assumption in discussions of scholarly error is that everyone is, or should be, primarily concerned with a search for the truth. My special interest here is in cases where there is a struggle in which truth is treated more as a tool than a goal. In controversies such as those over climate change or genetic modification, scientific truth is only one aspect: there are also social goals, such as reducing the production of greenhouse gases or increasing the food supply, not to mention crass considerations of profit and scientific careers. Focusing narrowly on the ethics of correction may, in some cases, obscure the wider ethics involved in achieving good policy or fostering fair and open debate.

To tackle this topic, I first examine the challenges involved in recognizing and accepting that one has made an error, along with the associated issues concerning allegations of lying. Then I introduce strategy by looking at circumstances in which there are double standards, for example when opponents demand acknowledgement of a trivial error while remaining silent about their own larger error. To address strategy more systematically, I outline several different responses that can be made when allegations of error are used as means for disparaging reputations.

Recognizing error

Before it is possible to acknowledge or correct an error, it is necessary to actually recognize that it is an error. There are several social and psychological processes that can make this difficult. The phenomenon of confirmation bias affects nearly everyone: people tend to pay attention to information that supports their views and ignore information that challenges their views (Heffernan, 2011; Kahneman, 2011; Tavris and Aronson, 2007). As well, supportive information is often accepted uncritically whereas challenging information is subject to stringent critique. This is compatible with studies of the psychology of scientists showing that many of them remain committed to their viewpoints despite new evidence (Mitroff, 1974). The implication is that if one makes an error that supports one’s own view, then the error is likely to receive little scrutiny, whereas claims by an opponent about this error are likely to be carefully analyzed for flaws.

Consider this example: a supporter of the standard view about climate change says “The fact that glaciers are melting supports the theory of global warming.” A climate skeptic might point out an error: not all glaciers are melting; a few are growing. Is it an error to say “glaciers are melting” rather than “most glaciers are melting”? From the point of view of the person who endorses the standard view on climate change, it is not an error, or at least not one of any significance. Another flaw in the claim is that the melting of glaciers might be a natural process, not due to human activities. The climate skeptic has looked for flaws that the standard-view supporter has not (or does not consider important).

Related to this is the idea of paradigms, which are comprehensive frameworks for understanding the world. Thomas Kuhn’s original formulation of paradigms referred to wide-ranging perspectives, such as the Ptolemaic or earth-centered model of the solar system superseded by the sun-centered or Copernican system, or creationism versus evolution (Kuhn, 1970). Within each model of the earth-sun system, new information can be accommodated by modifying the model slightly, or sometimes by simply disregarding non-conforming information, assuming that an explanation for it will be found eventually. For example, when experiments showed evidence of “ether drift,” in apparent contradiction with the theory of special relativity, many scientists simply ignored the results on the assumption that the experiments must be incorrect in some way (Swenson, 1972: 241).

On many controversial issues, one side may refuse to accept evidence that seems, to those on the other side, to be incontrovertible. This may be due to paradigm-like divergences in assumptions, methods, and ways of understanding the world. For example, HIV-AIDS dissidents, on being presented with apparently ironclad evidence that HIV is contagious and is implicated in AIDS, can come up with explanations based on other factors, such as false diagnosis or alternative disease causation (Bauer, 2012; Duesberg, 1996).

In these circumstances, is it reasonable to accuse opponents of refusing to acknowledge error? There is no easy answer. Challenges to orthodoxy do not divide neatly into the credible and the unbelievable, but fall on a spectrum. The position of HIV-AIDS dissidents might be seen as aberrant or simply unlikely, but what about moon-landing skepticism? In any case, when people argue for a position and come up with self-consistent explanations for their views, they are hardly likely to admit error, and it is unreasonable to expect them to.

The implication of these commitments to positions is that correcting error is a reasonable expectation only in situations in which people on different sides of an issue share the same framework. Paradigm disputes are sometimes said to involve incommensurability, namely an inability to even comprehend what the other side is saying due to terms having different meanings embedded in incompatible systems of meaning. In most cases, though, it is possible to communicate between paradigms and there are some facts and processes agreed across them. The challenge is to figure out what can and cannot be agreed. For example, two scientists who disagree about the role of HIV in causing AIDS might agree that a person has died and, more specifically, that a person has died of Kaposi’s sarcoma, but not agree about whether the person had AIDS or that HIV was a causal factor. In such circumstances, it would be an error to say “no one has ever died of Kaposi’s sarcoma” but not necessarily to assert “no one has ever died of AIDS.”

In some controversies, campaigners claim that opponents are lying. Care needs to be taken, because people’s understandings of what counts as a lie differ. Paul Ekman (1985), a prominent researcher on lying, says a lie is a conscious attempt to deceive, and encompasses both telling falsehoods and withholding the truth. In this sort of definition, lying is perhaps better described as intentional deception. (On lying, see also Ford, 1996; Robinson, 1996.)

Contrary to Ekman’s view, many people think that remaining silent about relevant information does not count as lying. Consequently, they will go through contortions to make statements that are factually correct but misleading because they omit important truths. Consider the supporter of nuclear power who says “no member of the public has died from the use of nuclear power.” This omits mentioning workers who have died in reactor accidents, because workers are not categorized as members of the public. It also omits mentioning the possibility that numerous members of the public have died from radiation releases, both routine emissions and releases from major accidents such as Chernobyl and Fukushima, but that these cancer deaths are impossible to distinguish from the much greater numbers of cancers from other causes. The nuclear power supporter can be accused of lying, but can defend themselves by saying no falsehood was involved. Agreement needs to be reached about what counts as lying and whether the supporter was intentionally being deceptive.

To add to the complexity, there is the phenomenon called self-deception, in which a person seems to have two minds, one of which deceives the other (Ariely, 2012; Cohen, 2001; Trivers, 2011). At one time and level of consciousness, a person knows something, but at a different time or level manages to forget or ignore it. A researcher might say, “all the data have been published” while knowing that some data were thrown out as suspect. The suspect data are unconsciously excluded from the category of real data and therefore not included in “all the data.” The researcher, when confronted with the discarded data, may have to admit that not all the data were published, but had previously deceived themselves about this.

So powerful are the processes of confirmation bias, paradigm adherence (called framing in other contexts), and self-deception that it can be very difficult for some people to recognize they have made an error. It often requires someone else to bring the problem to their attention (Haidt, 2012: 47), and even then they may not grasp it.

Error-response tactics

Although there are many barriers to recognizing errors, in some cases they are obvious. For example, you might have written the wrong year of birth for a famous person, have given the wrong figures in a table of results (so they do not even add up properly), or have used someone else’s text without proper acknowledgment. Assume for the moment that these are genuine mistakes rather than intentional efforts to deceive the reader. The mistakes might be due to relying on memory and not checking for accuracy, transcribing the wrong column of figures, or not taking care in providing and checking sources.

Some mistakes are due to not understanding the topic or protocols sufficiently well. You might have the wrong year of birth because you were thinking of a different person and did not know the field well enough to notice. You might have listed the wrong figures because you used the output from a statistical package without sufficient care or understanding. You might have failed to give proper acknowledgement because you didn’t fully understand conventions or because of poor methods of keeping track of sources.

If others notice your mistakes, they can react in different ways. If they are friends or are sympathetic to what you are saying, they may simply ignore the mistakes or let you know privately. If they are personal foes or hostile to your research, they may challenge and publicize your mistakes with the intent of hurting your reputation. They might use words like deception, sloppy scholarship, scientific fraud, and plagiarism, with associated stigma.

A key factor is the way you are informed about the alleged error, and the likely audience for the allegations. When you discover the error yourself, it may be that no one else knows about it. Next, in terms of wider awareness, someone may personally contact you and politely suggest that you have made an error. The person contacting you might even be sympathetic to your work and want to alert you so you can fix things or avoid making the same mistake again. This friendly person might even suggest how you can rectify the problem with the least damage to your reputation. In this circumstance, only one other person may know about the error.

Then there are situations in which a whole group or network knows about the error, and one of them takes the initiative to tell you. Or perhaps they do not tell you directly, but refer to the error obliquely in a talk or publication with limited circulation, so wider audiences probably will not know about the problem or even understand it. Finally there are situations in which a large audience is informed, sometimes through the mass media. In considering the following options, keep in mind that the actual or potential audience for the information about your error may weigh heavily in the choice of how to respond.

To begin a discussion of strategies for dealing with error, I will take the scenario of one person or side — referred to as “you” for convenience, and to make the problem less hypothetical — having made a mistake and being challenged by someone or some group — “they” or “your opponents.” After examining options in this scenario, the ambit can be broadened to others.

Option 1: cover up

If you discover a mistake in your own published work that no one else has noticed, it is both tempting and easy to say nothing about it. Unless you point it out, probably no one will ever discover you made the mistake, and perhaps they won’t care either. For example, your mistake might be in a paper published several years ago that never attracted much attention. Maybe you cited incorrect references, or your data sample was incomplete, or your description of the research methods was faulty, or your results were skewed. No one will know unless they know the field very well, and even then they might not know unless they have access to your lab books or early drafts of your paper or perhaps even redo the entire study. If you are scrupulous, you might send an erratum to the journal, except that it seems pointless now that years have passed and no one cares about the article anyway. If the mistake did not affect your overall findings — your conclusion will stand despite the mistakes — then sending an erratum seems even less necessary.

This sort of cover-up is probably quite common, but it is virtually impossible to detect and therefore no one really knows how much undeclared error exists in the published literature. Studies suggest the level of published statistical error is considerable (Gore et al., 1977; Nieuwenhuis et al., 2011), though only some authors would be aware of their own errors. Given how easy it is to say nothing and how prevalent detected error is, it is reasonable to suppose that the tactic of cover-up is quite common.

Ironically, simple sorts of error, notably word-for-word plagiarism, are far easier to verify later on. If you become well known and an enemy of yours wants to bring you down, they may go to the trouble of running your old papers through text-matching software and trying to detect copied but unacknowledged passages. Prior to discovery by opponents, though, saying nothing may seem an attractive option. Possibly at the time you did not realize that you had copied text inappropriately, or didn’t know it was wrong. To broadcast a mea culpa years later would bring disrepute, especially given the stigma of plagiarism or fraud, for no particular benefit.

Even if someone points out a problem, the strategy of cover-up may seem attractive. If your accuser has little credibility and publishes only in an obscure outlet, such as a low-profile blog, then by saying nothing you ensure that most people will never know the difference. Not dignifying the accusation with a reply is a type of cover-up, with the intention of not further publicizing the matter and hoping it will not become any bigger. If your accuser is neither skilled nor influential, whereas you are both, then remaining silent about the claims can let them die away with little impact.

The same line of argument applies if the accusations are completely wrong, either because they are malicious or simply misguided. Therefore, deciding to say nothing about a potentially damaging claim does not necessarily signal that you are hiding something. Indeed, you may be a generous person who does not want to embarrass your accuser by undertaking a demolition of their ill-judged claims.

Option 2: counter-attack

A common response to allegations of error is to criticize those who make the allegations, often by claiming that they have made much greater errors. Attacking critics can be done in various ways, for example by saying they are biased, lack suitable credentials, are linked to a disreputable organization, have false or disgusting beliefs, are being paid by a group with a vested interest, or have made false allegations previously. These sorts of counter-attacks have the intent or effect of discrediting the critic. Most people assume that a claim by a person thought to be prestigious is more credible than one from a person with no claim to fame.

Counter-attack can be very effective, even though there may be little logical basis for the way it works. For example, if you can successfully label your critic as a pedophile or a Holocaust denier, then the critic’s claims about plagiarism will probably lose credibility in many people’s eyes, though there is no evidence that pedophilia or beliefs about the Holocaust affect the content of plagiarism claims.

Counter-attack is understandable, and even predictable, in cases in which allegations of error are part of a wider struggle. Consider the case of Ward Churchill, well known for his advocacy for the cause of Native Americans. Many disagreed with Churchill’s views and felt threatened by his activism. When Churchill was accused of scholarly fraud, one part of his response was to accuse his critics of political bias (Churchill, 2008–9).

This example raises a wider question. Suppose you are guilty of a particular mistake, for example listing in your bibliography a reference that you haven’t read or checked but instead simply copied from another source. Suppose also that there are one thousand other scholars who have done the same thing, but only you are accused of this transgression of good scholarly practice. You might think it quite legitimate to respond by accusing your critics of running an agenda against you.

Another reason for counter-attacking is double standards. You are guilty of a small error, but your critic is guilty of something much worse. It can seem unfair to be expected to make an apology over a minor transgression when being accused by those whose transgressions are much greater. For example, you are accused of a conflict of interest because you received a free lunch from a corporate sponsor, but your accuser has received tens of thousands of dollars annually from a sponsor. Your counter-attack is an attempt to redirect attention towards the greater problem.

Option 3: reinterpretation

You might decide to relabel the error, dismiss it as trivial, provide some face-saving explanation for why it occurred, blame someone else for it, or say it is standard practice. Your critic has provided an interpretation of your acts, and you have provided a reinterpretation, namely a different interpretation. You admit that something happened, so you are not involved in cover-up, but you try to reduce its apparent significance or your responsibility. Relabeling is often effective. If your critics say you were cheating or corrupt, then you can call it an error or mistake. If your critics say it was an error or mistake, you might say it was an accident or an event. The idea is to use terms that imply lower levels of significance, intention, or responsibility.

A common explanation for plagiarism is that “There was a breakdown in my note-taking system”: making an inadvertent mistake is less blameworthy than knowingly violating rules for giving acknowledgement. Sometimes it is possible to blame a research assistant or editor for errors. Finally, there is the plea that “Everyone does it.”

Imagine you are a co-author of a paper that contains data that are misrepresented, and you are accused of scientific fraud. You have a choice of reinterpretations. You can say it is merely a mistake in presentation, that it is a minor issue, that your co-authors were responsible, or that presenting the data in the way you did is standard in the field.

Option 4: acknowledgment in context

You admit you have made an error, at the same time putting it in the context of the important (and accurate) things you have to say. This option can be effective when the error is hard to deny and when it does not affect the wider case you are making. For example, when challenged about some discrepant data, you can admit that several data points were put in the wrong category, but that having redone the calculation, your conclusion is unaffected.

This option is similar to the option of reinterpretation except that instead of minimizing the error or blaming others for it, you accept full responsibility. The value of this option is to redirect attention to the important issues at stake. It is an especially useful method when your opponents bring up trivial points in an attempt to discredit you. Every time they do this, you can use the opportunity to refocus on what you believe are the central issues.

Some people are highly resistant to admitting making an error, so acknowledgment of any sort, even in context, can be difficult psychologically. It does have some advantages. It positions you as a person willing to acknowledge mistakes and learn from them. If your opponent refuses to ever admit mistakes, it shows them to be inflexible, devious, or dogmatic.

Option 5: unqualified apology

If you have made a serious mistake, you can give a complete admission of guilt, without any attempt to raise objections, counter-attacks, qualifications, or diversionary comments. For this option, being direct, prompt, and open is valuable. If you can do this before being accused, you have the added advantage of being seen as open and honest.

What happens in many cases is that a person realizes they have done something wrong, but says nothing, hoping no one will notice. In other words, they start with option 1. Then, when challenged, they counter-attack or reinterpret. If the error is significant and easy for others to see, these techniques can further damage their reputation. They are seen as culpable for the mistake, and also as deceptive and devious. Making an unqualified apology avoids compounding the impact of the original mistake.

A climate change error

To illustrate these strategies, consider an error in the 2007 report by the Intergovernmental Panel on Climate Change (IPCC). An incorrect figure was given for the rate of melting of Himalayan glaciers (Banerjee and Collins, 2010). No one in the IPCC process noticed the error before publication. If the IPCC had been universally acknowledged as authoritative, the error might never have been detected, or, if detected, it might simply have been ignored. However, climate change science is highly contested, with a small percentage of scientists challenging many of the assumptions, data, and models used to estimate likely impacts of greenhouse gas emissions. Climate skeptics latched onto the error and used it to suggest shortcomings in the IPCC process (Berini, 2010). So ignoring the error was not a possibility.

Some defenders of the IPCC preferred to counter-attack, for example by pointing out weaknesses of the arguments of the skeptics or their conflicts of interest. The strategy of reinterpretation can take many forms. One option — the one adopted by the IPCC (2010) — is to say that the Himalayan error does not affect the overall conclusions of the IPCC report. Another is to blame a particular individual for the error, noting that different scientists prepared other parts of the report. This was hinted at in the IPCC’s statement.

Next, there is the strategy of acknowledgement in context. One option is to say, yes, the IPCC in making this error overestimated the impacts of global warming, but in other parts of its report it underestimated the impacts, for example in assessing the rate of increase in forest blight in British Columbia. Finally, there is the strategy of unqualified apology. This would mean the IPCC would just say, “We got it wrong” without any qualifications or contextualization.

How to judge these strategies, in ethical terms, depends on one’s views about climate change. If you are skeptical about climate science and believe that measures to reduce greenhouse gas emissions are ill-advised, then the Himalayan melt error exemplifies the shortcomings of the IPCC and therefore needs highlighting: anything short of an unqualified apology is unacceptable. If you think anthropogenic climate change is one of the greatest problems facing humanity, and that the IPCC has largely been correct (or is even too cautious in its predictions), then an unqualified apology is absurd. More appropriate would be counter-attack, reinterpretation, or acknowledgment in context.

Conclusion

On the surface, the issue of correcting error seems straightforward: someone who makes a mistake should admit it and correct it. A closer inspection reveals two main shortcomings of the admission-correction model. The first is people’s inability to see their own errors, due to confirmation bias, paradigm effects and self-deception. The second shortcoming is to treat errors in isolation from the wider context, including struggles over theories and policies. For example, admitting an error may weaken the credibility of campaigners on a vital social issue.

To better understand the dynamics of disputes over error, it is useful to examine different methods of responding to claims about error. Five options are considered here: cover-up, counter-attack, reinterpretation, acknowledgement in context, and unqualified apology. Many discussions of error assume that unqualified apology is the most suitable response, namely the ethical approach. However, a key point here is that options need to be assessed in context. In many cases, attributions of error are themselves part of a campaign to dispute a position or discredit an opponent. The errors involved may be irrelevant to the key issues, trivial in context, or less concerning than errors by those on the other side.

This analysis is not a warrant for hiding, disguising, or rationalizing errors. It does, though, point to the need to examine the wider strategic context in looking at claims about errors. Everyone makes errors, large, small, or sometimes simply in the imagination of opponents. How to address these errors and allegations of error depends on the context.

There is a parallel issue concerning lying. Absolutists, following Immanuel Kant, say one should never lie, but others say lying can be justified in certain circumstances, for example when Nazis come to your door asking whether any Jews are in the house: lying is a minor matter compared to saving lives. The revisionist view of lying is that it is sometimes functional or even benevolent (Bailey, 1991; Barnes, 1994; Nyberg, 1993). The same sorts of considerations apply to the issue of correcting error.

In a world in which everyone were open and honest, there would be little risk in admitting errors, because this would be a process of personal and collective learning. Indeed, those who recognized and acknowledged their errors might be especially praiseworthy. However, today’s world is very far from this sort of ideal: errors are frequently used as vulnerabilities to be attacked, whether because of personal vendettas or campaigning on public issues. Acknowledgment of error is easier and safer in a joint search for truth.

References

Ariely, D. (2012). The (Honest) Truth about Dishonesty: How We Lie to Everyone — Especially Ourselves. New York: HarperCollins.

Bailey, F.G. (1991). The Prevalence of Deceit. Ithaca, NY: Cornell University Press.

Banerjee, B., & Collins, G. (2010). Anatomy of IPCC’s mistake on Himalayan glaciers and year 2035. Yale Climate Connections, 4 February. Retrieved from http://www.yaleclimateconnections.org/2010/02/anatomy-of-ipccs-himalayan-glacier-year-2035-mess/.

Barnes, J.A. (1994). A Pack of Lies: Towards a Sociology of Lying. Cambridge: Cambridge University Press.

Bauer, H. H. (2012). Dogmatism in Science and Medicine: How Dominant Theories Monopolize Research and Stifle the Search for Truth. Jefferson, NC: McFarland.

Berini, N. (2010). Himalayan glaciers: how the IPCC erred and what the science says. SkepticalScience. Retrieved from http://www.skepticalscience.com/IPCC-Himalayan-glacier-2035-prediction.htm.

Churchill, W. (2008–9). The myth of academic freedom: experiencing the application of liberal principle in a neoconservative era. Works & Days, 26–27, 139–230.

Cohen, S. (2001). States of Denial: Knowing about Atrocities and Suffering. Cambridge: Polity Press.

Duesberg, P. (1996). Inventing the AIDS Virus. Washington, DC: Regnery.

Ekman, P. (1985). Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. New York: Norton.

Ford, C.V. (1996). Lies! Lies!! Lies!!! The Psychology of Deceit. Washington, DC: American Psychiatric Press.

Gore, S. M., Jones, I. G., & Rytter, E. C. (1977). Misuse of statistical methods: critical assessment of articles in BMJ from January to March 1976. British Medical Journal, 8 January, 85–87.

Haidt, J. (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. New York: Pantheon.

Heffernan, M. (2011). Willful Blindness: Why We Ignore the Obvious at Our Peril. New York: Walker & Company.

IPCC. (2010). IPCC statement on the melting of Himalayan glaciers. Intergovernmental Panel on Climate Change, 20 January.

Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Kuhn, T. S. (1970). The Structure of Scientific Revolutions, 2nd edition. Chicago: University of Chicago Press.

Mitroff, I. I. (1974). The Subjective Side of Science. Amsterdam: Elsevier.

Nieuwenhuis, S., Forstmann, B. U., & Wagenmakers, E.-J. (2011). Erroneous analyses of interactions in neuroscience: a problem of significance. Nature Neuroscience, 14(9), 1105–1107.

Robinson, W.P. (1996). Deceit, Delusion and Detection. London: Sage.

Swenson, Jr., L. S. (1972). The Ethereal Aether: A History of the Michelson-Morley-Miller Aether-Drift Experiments, 1880–1930. Austin, TX: University of Texas Press.

Tavris, C., & Aronson, E. (2007). Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. Orlando, FL: Harcourt.

Trivers, R. (2011). The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life. New York: Basic Books.