Suppression and dissent in science

Published in Tracey Bretag (editor), Handbook of Academic Integrity (Singapore: Springer, 2016), pp. 943-956

Jason A. Delborne

Associate Professor of Science, Policy, and Society, North Carolina State University



Abstract

Academic integrity becomes more challenging during scientific controversies, as scientists and their allies and opponents struggle over the credibility and significance of knowledge claims. Such debates are healthy and necessary, but because science remains embedded in broader institutional, political, cultural, and economic contexts, struggles over truth often reflect dynamics of power. For example, those who challenge dominant ideas may face a landscape that does not welcome contrarian positions, which may result in what this chapter describes as "dissenting" behavior by scientists. In extreme cases, contrarian scientists may face attempts at scientific suppression: discrediting or silencing a scientist or scientific claim in ways that violate accepted standards of scientific conduct. While such actions are unusual, they happen frequently enough to deserve careful consideration as breaches of academic integrity. This chapter offers a scholarly perspective on how to understand scientific dissent and suppression, as well as a list of best practices to avoid suppression, respect dissent, and encourage healthy debates in the production of knowledge.


David Lewis, a scientist working for the U.S. Environmental Protection Agency (EPA), published research that challenged the safety of land application of sewage sludge. Sludge industry representatives pressured the EPA to discontinue Lewis' funding and produced materials attacking the scientist's credibility, which were distributed by an EPA official. The EPA then denied Lewis a promotion, citing ethics rules about the print size of a disclaimer in one of his publications - an action that the U.S. Department of Labor reviewed and found to be discriminatory and unlawful (Kuehn, 2004, pp. 338-339).

Ignacio Chapela, a microbial ecologist at the University of California, Berkeley, published a peer-reviewed letter in the scientific journal Nature that provided evidence suggesting that transgenic DNA had spread from genetically modified corn into landraces of Mexican maize in Oaxaca, Mexico. UC Berkeley colleagues challenged his findings publicly, a public relations firm invented false identities to attack him on professional listservs, and Nature followed with an unprecedented announcement that withdrew support from the published manuscript without a formal retraction. Chapela, who had also spoken out against a strategic alliance between a transnational agricultural biotechnology firm and UC Berkeley's College of Natural Resources, was subsequently denied tenure under suspicious circumstances. Chapela appealed the decision and brought a lawsuit charging the university with discrimination. The university president awarded him tenure retroactive to his initial application, but his scientific reputation remains damaged (Delborne, 2011).

In opposition to the orthodox theory of the origin of AIDS, which assumes the transfer of a simian immunodeficiency virus (SIV) from a monkey to a human through a bite or other physical means (e.g., the "bushmeat hypothesis"), a number of dissenters have argued that early polio vaccines were unintentionally contaminated with SIV by a manufacturing process involving monkey kidneys and then distributed to a million people in central Africa during 1957-1960, launching the AIDS epidemic. Martin (1996) shows how the scientific community has not taken this hypothesis seriously, has demanded much higher standards of evidence from the challenging theory than from the orthodox theory (which itself remains highly speculative with little empirical grounding), and has even failed to conduct tests on archived samples of polio vaccine from that period - tests that might have shown whether the AIDS-polio connection could be sustained.

Introduction

Academic integrity becomes more challenging - and arguably more significant - during scientific controversies. If we accept that science is a social process, then controversy implies conflicts in that social realm and creates opportunities for breaching standards of integrity. An egregious form of such a breach is scientific suppression: discrediting or silencing a scientist or scientific claim in a manner that goes against the norms of scientific practice. We might therefore view the suppression of science as a corrupt practice that both interferes with meaningful scientific debate (debate that could clarify significant knowledge) and undermines the legitimacy of the scientific community.

Scientific dissent, a much broader term, calls attention to the ways in which conflict and controversy are integral to the practice of science. Without dissent, science would become nothing more than orthodoxy, a dogmatic way of knowing, closed to revision or challenge. Yet, science is not a realm devoid of politics, strategic behavior, or power differentials. Science is social and embedded in broader institutional, political, cultural, and economic contexts. As such, struggles over "truth" will reflect dynamics of power, and paying attention to dissent reveals how those challenging dominant ideas face a landscape that does not necessarily welcome contrarian ideas.

This chapter explores the phenomenon of scientific suppression within a conceptual framework of scientific dissent. After a more extended discussion of the role of dissent in science, a conceptual framework is presented for understanding scientific dissent as a practice. Next, scholarship in the field of science, technology, and society (STS) helps to unpack the phenomenon of suppression in science. The chapter concludes with suggestions for best practices for researchers seeking to work effectively and with integrity, especially in the midst of highly politicized scientific controversies.

The role of dissent in science

If science is considered a particular way of knowing about the world, disagreement and conflict are key aspects that distinguish science from other ways of knowing. Unlike religion, for example, science disrupts dogmatism - faith in science rests largely upon the lack of faith ... in science. To illustrate, a student learning a faith tradition would not usually be encouraged to imagine herself developing sets of ideas contrary to religious teachings, which could be tested and potentially overthrow accepted understandings; in contrast, a student learning about the scientific tradition continually encounters a history of competing hypotheses, lively disagreements, and celebrated scientific revolutions. Einstein, for example, is a scientific hero because his ideas overthrew the dominance of Newtonian physics. Likewise, one of the most popular works in the history of science is Thomas Kuhn's (1996) The Structure of Scientific Revolutions. As such, one might argue that dissent - the practice of actively challenging a dominant idea - is central to the function and reputation of science as a way of knowing that reflects both humility (a willingness to be wrong in the face of new evidence) and rigor (a commitment to letting the strongest ideas and evidence win the day, even in the face of popular opinion) (for a more critical view, see Chalmers, 2013).

Robert Merton, a founder of the sociology of science, identified "organized skepticism" as one of the key norms of science that create its ethos (Merton, 1973). As but one example, the common practice of scientists challenging one another's ideas through the process of peer review is foundational to the production of legitimate scientific knowledge. Peer reviewers are asked to be "skeptical" of the claims made in a submitted journal manuscript, with the goal of weeding out poorly constructed research, unsupported inferences, inappropriate methods, and unsubstantiated conclusions (for a thorough treatment, see Weller, 2001). Such a process does not guarantee "truth," per se, but organized skepticism creates a social process of knowledge production that tests new ideas against accumulated expertise. Scholars have debated whether Merton's norms of science accurately describe the practice of science in a general sense (Mitroff, 1974), but the pattern within the scientific community of celebrating dissenters who are later recognized as correct (e.g., Galileo) suggests that dissent retains an important function in marking the credibility of science as a distinct way of knowing and approaching the truth.

Scholars have noted, however, that scientific dissent does not represent a neat, rational, and fair clash of ideas. Thomas Gieryn's study of boundary work and scientific credibility demonstrates the ways in which arguments over what is true often involve struggles to define the "boundaries" of science - in other words, who counts as a scientist and who does not, and what counts as science and what does not. For example, the political struggle over hiring a new chair of logic and metaphysics at the University of Edinburgh in 1836 shows how supporters of phrenology (a discredited theory linking the physical shape of the brain with personality and intellectual abilities) squared off against opponents who did not want to disrupt the status quo. At the time, empirical evidence could not determine with confidence whether the phrenologists were right or wrong, but the political battle was fierce and consequential for the future of science at that university (Gieryn, 1999, pp. 115-182). What this suggests is that dissent in science may look very different depending on one's perspective. From the perspective of a dissenter, the clash of ideas may feel like an unfair attack on one's credibility, while from the perspective of mainstream science, the same clash can appear as unfounded, wild ideas shouted by a non-scientist.

One way to understand this range of phenomena of scientific dissent - from rationalized debates over ideas to politicized efforts to marginalize outside perspectives - is to recognize the spectrum of uncertainty in science. At the frontier of knowledge in a scientific field, uncertainty reigns; the best and most experienced minds do not know exactly what is true. In this realm, dissent may be embraced because the cost of overthrowing or discarding provisionally accepted knowledge is low. Toward the other extreme, where scientific knowledge has solidified and become institutionalized, the cost of upsetting accepted knowledge may be quite high. Here, dissent may be resisted much more forcefully. Bruno Latour offers one way to think about this dynamic in his book, Science in Action (1987), showing how scientific knowledge becomes embedded within networks that become harder and harder to challenge.

Another perspective suggests that understanding scientific dissent requires a more explicit acknowledgement of the politics within and surrounding science. The analogy to political dissent is helpful. A healthy democracy requires that diverse and competing ideas emerge for debate and consideration, but democracies do not categorically embrace political dissent. In fact, political dissent is frequently dismissed, marginalized, or actively silenced because dissenting ideas can help coalesce and strengthen political opposition. Similarly, science has its own power structures - consider the roles of funding agencies, journal editorships, and disciplinary traditions - and also operates within a society with divergent interests that engage scientific knowledge in political struggles (e.g., regulation of toxic pollution, ensuring food safety, creating incentives for desired economic behavior). As such, we should not be surprised to see scientific dissent look a great deal like political dissent.

A conceptual framework of scientific dissent

While some understand scientific dissent as a position, as in believing in a claim that goes against scientific orthodoxy, it is more useful to understand scientific dissent as a practice. This has the advantage of encouraging the analysis of the many ways in which scientists navigate controversies that erupt within and around their technical fields. In particular, the framework below draws attention to the uneven power structures and practices within scientific communities that shape the kind of knowledge that is produced and legitimized (for a more thorough treatment of this framework, see Delborne, 2008).

The framework begins with the recognition that scientific fields contain dominant claims that reflect generally accepted epistemologies, methodologies, and motivations for research. For example, epidemiologists generally employ statistical methods to find correlations between patterns of disease and human behaviors or environmental conditions, with the goal of identifying possible intervention points to improve public health. Other scientific disciplines also address human health, but ask very different questions and use very different research methods - consider pharmacologists who seek new medicines to counter drug-resistant bacteria. Disagreements clearly occur in these and other fields, but disagreements rarely challenge foundational ideas. It would be unusual for an epidemiologist to challenge the validity of statistical methods in a general sense, or for a pharmacologist to suggest that bacteria are the wrong target for fighting infections. While such extreme examples are rare, contrarian science does occur. Contrarian science challenges a dominant set of assumptions, frames, and methodologies. This is not yet dissent, because at this early stage, it may be unclear to the contrarian scientist whether - and to what degree - the contrarian claims truly upset the dominant way of thinking. In other words, contrarian science challenges something that a majority within a scientific community has come to believe, but there is no a priori reason to assume that new evidence could not alter the community's assumptions, frames, or accepted methodologies. Other scientists may simply be convinced by a contrarian argument, in which case dissent - as understood in this framework - would not have the opportunity to emerge.

When a contrarian claim is neither accepted - changing the dominant way of thinking - nor ignored, the contrarian scientist faces resistance, or impedance. The value of this terminology is that it highlights that impedance can originate from within or outside of the traditional scientific community, and that the contrarian claim may be right or wrong. To be clear, some contrarian claims deserve to be impeded - many would agree, for example, that contrarian claims that disavow the utility of condoms to prevent the spread of HIV are a threat to public health initiatives. It is not surprising that mainstream health professionals (scientists and non-scientists) would work to undermine the legitimacy of those contrarian claims. In contrast, early tobacco research that challenged the safety of cigarette smoking also faced impedance - largely from tobacco-funded researchers (Proctor, 1995) - but history has judged that example of impedance as corrupt, misguided, and motivated by special interests (Oreskes & Conway, 2011). Thus, neither the appearance of contrarian science nor impedance necessarily means a breach of scientific integrity, but one may have occurred.

Within this framework, scientific dissent becomes possible in the face of impedance. Here, the contrarian scientist can choose whether to drop the initial claim - either because they realize their error or because they choose not to fight the battle - or to attempt to restore their scientific credibility. The latter option encompasses a wide variety of practices of scientific dissent, ranging from what we might call agonistic engagement (established and accepted behaviors in mainstream science, such as providing additional evidence or debating the criticisms) to dissident science (practices that are explicitly political, mobilizing additional resources to gain credibility but also putting one's scientific identity at risk of "pollution" from political concerns). Like impedance, dissent may be successful or unsuccessful - in restoring scientific credibility, changing dominant epistemologies, or undermining the legitimacy of impedance.

Table 1: Key elements of a conceptual framework of scientific dissent (adapted from Delborne, 2008).

Mainstream science: Accepted ways of thinking, dominant ideas, orthodox perspectives on what is true and how a scientific community produces knowledge.

Contrarian science: Ideas, evidence, or perspectives that challenge mainstream science.

Impedance: Efforts and actions to reduce the legitimacy of contrarian claims or the credibility of contrarian scientists. Suppression is an extreme form of impedance that violates the norms of the scientific community.

Scientific dissent: Practices by contrarian scientists to restore their own credibility or the legitimacy of their claims against impedance. Agonistic engagement includes practices that are customary in scientific debates (e.g., providing additional evidence), while dissident science merges the controversy explicitly with political struggles and actors.

From a methodological perspective of studying scientific controversies, this framework has the advantage of allowing analysis without requiring certainty of who is right and who is wrong. In the long and fruitful tradition of symmetry within the field of science and technology studies (Barnes & Bloor, 1982), scientific dissent in the fields of AIDS causation and health impacts of tobacco can be considered with the same conceptual framework (rather than one framework for understanding when the dissenters are "right," and a different model for when they are "wrong"). Analysis focuses on distinguishing dominant ideas from contrarian ideas and the mixture of credibility challenges and defenses that represent the controversy. Making judgments about the rightness of these practices then becomes an explicitly normative task.

Suppression in science

In light of the discussion above about scientific dissent as a practice in response to impedance, the suppression of science can be understood as a particular form of impedance. To be specific, suppression represents a normative category of impedance that is unfair, unjust, and counter to the standards of scientific behavior. What is difficult, however, is that suppression - from one perspective - looks very much like the justified and necessary policing of the boundaries of legitimate science. For example, a book like The Deniers: The World-Renowned Scientists Who Stood up against Global Warming Hysteria, Political Persecution, and Fraud (Solomon, 2010) claims that science challenging the orthodoxy of global warming has been suppressed, while other books, such as Reality Check: How Science Deniers Threaten Our Future (Prothero, 2013), argue that those who challenge the notion that climate change is real and anthropogenic are not credible scientists and deserve to be marginalized and discredited. Put simply, where Prothero sees the necessary policing of the boundaries of science against irresponsible contrarians who lack integrity, Solomon sees scientific suppression. Are scholars with a tradition of symmetrical analysis left with only a relativist position - simply to acknowledge the different perspectives of opposing participants and make no particular normative judgments?

Brian Martin recognizes this challenge and offers strategies to navigate such controversies: "Ultimately, there is no way to prove that suppression is involved in any particular case, but ... [a] useful tool is the double standard test: is a dissident scientist treated any differently from other scientists with similar records of performance?" (Martin, 1999a, p. 110). The authors of the books on global warming deniers mentioned above would no doubt answer this question differently, but the double standard test nonetheless offers analytic purchase. For example, one could interrogate the evidence of double standards being applied to scientists skeptical of anthropogenic global climate change as one way to sort through the controversy over whether suppression has occurred. Keep in mind, however, that evidence that a scientist's work has been suppressed by powerful forces - as identified by the double standard test - does not guarantee that the suppressed research was accurate. An early study of the policing of scientific boundaries sheds light on this subtle point. Two science studies researchers conducted an analysis of the field of parapsychology, finding that proponents of this field were frequently held to double standards as they were refused publication opportunities, scientific legitimacy, and access to research funds (Collins & Pinch, 1979; Pinch, 1979). That these behaviors were widespread in the field, and not confined to an isolated case, satisfies a secondary criterion for identifying suppression (Martin, 1999a, p. 111). Yet, even if the double standard exists in parapsychology, one need not accept the diverse claims made by its researchers as true. Indeed, it may be quite rational for the scientific community to hold claims that would disrupt accepted ideas to higher standards of evidence.

That suppression can be viewed differently, however, does not mean that it is an empty category. In fact, recognizing that to identify and name suppression in science is a normative process opens the conversation to the reality that any reference to suppression necessarily rests upon accepted norms of scientific practice. In other words, for suppression to occur, standards must have been violated. What is up for debate, then, is both the details of the alleged suppression (e.g., did an editor reject a paper despite positive peer reviews?) and the implied norm of scientific behavior (e.g., should an editor have the authority to overrule a peer review process for any reason?).

Brian Martin's research program represents the most comprehensive treatment of suppression in science by any scholar in the world (e.g., Martin, 1981, 1991, 1999a, 1999b, 2010, 2014b; Martin, Baker, Manwell, & Pugh, 1986). His work has ruffled more than a few feathers, as he himself acknowledges, but the nature of this domain of inquiry guarantees drawing the ire of those engaged in the politics of scientific knowledge. Demonstrating his own reflexivity in his early work on controversies over water fluoridation, he notes that just the act of paying attention to contrarians can be seen as a partisan act, since actors supporting the orthodox position - the safety of fluoridated water, in this case - often seek to silence the debate altogether:

Since proponents generally maintain that there is no credible scientific opposition to fluoridation, my analysis appeared to give the opponents far too much credibility ... [A]s soon as one begins interacting with partisans in a polarized controversy, there is no neutral position (Martin, 1991, p. 165).

Martin's observation raises the uncomfortable possibility that any effort to understand, analyze, or publicize controversies over scientific dissent or suppression will be interpreted - by some - as a political action. Given the cultural tendencies of academics to prefer the veil of neutrality, it is perhaps not surprising that Martin has few colleagues who have focused on such issues (for some exceptions, see Allen, 2004; Delborne, 2011; Epstein, 1996; Gieryn, 1999; Kuehn, 2004; Moran, 1998; Oreskes & Conway, 2011; Simon, 2002).

Because suppression is both contested and often uncertain, typologies can aid in analyzing the diverse behaviors and multiple moments in the production of controversial scientific knowledge. For example, Martin (1999a) describes suppression, noting how "agents or supporters of the powerful interest group make attempts to stop the scientist's activity or to undermine or penalize the scientist - for example, by censorship, denial of access to research facilities, withdrawal of funds, complaints to superiors, reprimands, punitive transfer, demotion, dismissal, and blacklisting, or threats of any of these" (p. 107). This list of behaviors does not provide an easy litmus test for identifying suppression, but instead a reminder of the various pathways of impedance in the scientific community.

The typology offered here calls attention to the multiple targets of scientific suppression (see Table 2). While it is not meant to suggest neat and clear lines between the categories - indeed part of the power of suppression is its spillover effects, described in more detail below - it might be useful to consider that suppression targets different entities at different moments in the production of scientific knowledge.

Table 2: Targets of Scientific Suppression

Ideas
Description: Making the development of a set of research questions less likely or impossible (perhaps the hardest to measure).
Examples: A research funding agency not initiating a new program when it is called for by contrarian interests; an advisor discouraging a student from pursuing a novel and contrarian research project.

Data & Results
Description: Manipulating, confiscating, or silencing data or results.
Examples: A research sponsor refusing to allow access to data that questions the safety of its product; a scientist confiscating an employee's results because they undermine a favored hypothesis; an editor rejecting a paper prior to peer review because of its political implications.

Scientists (credibility)
Description: Undermining scientists' credibility and reputation to reduce the legitimacy of claims associated with them.
Examples: Accusing a scientist of being motivated by activism rather than the pursuit of truth; exposing embarrassing details of a scientist's personal life.

Scientists (position)
Description: Attempting to dislodge scientists from institutional positions that make their research possible.
Example: Threatening to withdraw foundation support from a university unless it fires a particular researcher.

Scientists (practice)
Description: Coercing scientists to censor their own present or future work.
Examples: Offering a scientist rewards not to publish a finding; threatening a scientist with a public relations attack unless they pursue a different line of research.

Scientific field
Description: Undermining the credibility and reputation of a field of inquiry, leading to institutional changes.
Example: Calling for the National Science Foundation to eliminate a program of funding because of its supposed political bias.

A significant shortcoming of this typology is its failure to adequately address what one might call a "chilling effect." Namely, when scientists become aware of attempts at scientific suppression - whether successful or unsuccessful - they may change their own scientific practice in response, despite not being direct targets. Martin (1999a) notes: "[I]t is my observation that quite a number of scientists avoid doing research or making statements on sensitive issues because they are aware, at some level, of the danger of being attacked if they do" (p. 108). In fact, Joanna Kempner interviewed thirty National Institutes of Health scientists who became embroiled in a political attack on federally funded research, and she found that half of them had subsequently removed controversial words from their research proposals and about one-quarter had avoided controversial topics entirely (Kempner, 2008). While personality characteristics such as conviction, courage, and confidence might play a mediating role - as would differences in professional security (e.g., tenure status) - it seems clear that witnessing suppression could change a researcher's calculus about whether to pursue a particular question or how to disseminate controversial results.

In contrast to the chilling effect, a counterintuitive outcome of suppression can also be the flourishing and publicizing of dissenting views. As with political oppression, such as that during the U.S. Civil Rights Movement, suppression may be attempted but result in greater attention and sympathy for the dissenting position. Martin (2007) refers to this as "backfire" or the "boomerang effect." As but one example, a dissenting scientist in the field of agricultural biotechnology, Ignacio Chapela, produced a public event during which he showcased the attempts at suppression that he and some of his colleagues faced as they challenged the safety of genetically engineered crops. Chapela titled the event "The Pulse of Scientific Freedom in the Age of the Biotech Industry"; it drew over five hundred attendees on the University of California, Berkeley campus and was webcast around the world. While one event does not prove "backfire," it demonstrates a key strategic option for dissident scientists: exposing suppression in a public manner to win support (Delborne, 2008, pp. 524-526).

Best practices

It would be naïve to hope for a state of the world in which scientists policed the boundaries of credible knowledge perfectly - never suppressing "good science" and always responsibly and respectfully discrediting "bad science." Science is full of diverse actors, uncertainties, and powerful interests that will simply never vanish. Instead, scientists should aim for high standards of conduct - best practices - that avoid suppression, respect dissent, and encourage healthy debates in the production of knowledge.

Conclusion

Contextualizing scientific suppression as a particular form of impedance illustrates the relationship between academic integrity and scientific controversy. To be clear, the eruption of scientific controversy does not automatically signal a lapse in scientific integrity, even if exchanges appear harsh or dismissive. Instead, this chapter encourages attention to how contrarian science, impedance, and scientific dissent are core practices in a healthy scientific community. At the same time, the suppression of science represents an unhealthy extreme of impedance - one that threatens both knowledge production and the legitimacy of science.

Best practices in light of these complex phenomena include: 1) engaging directly and respectfully with scientific opponents; 2) fostering free speech and free inquiry, even when doing so risks allowing your opponents to make their claims publicly; 3) maintaining awareness of the political economy of science as a means of staying sensitive to the role of power in influencing the dynamics of scientific controversy; 4) recognizing that scientists can play a variety of legitimate and helpful roles in the policymaking process, depending on the degrees of scientific uncertainty and values consensus (Pielke, 2007); and 5) using the boomerang effect as a strategy to counter scientific suppression and anticipating others' use of the same strategy.

While it may be tempting to believe that scientific and academic integrity will only be found when science is objective, apolitical, and non-controversial, such a perspective not only denies the reality of scientific practice but also offers a mirage that distracts from pragmatic efforts we can take to pursue integrity even in the midst of uncertain, politicized, and controversial science. Scientific dissent is necessary, and understanding the role that it plays strengthens our ability to detect and oppose efforts of scientific suppression.

References

Allen, B. (2004). Shifting Boundary Work: Issues and Tensions in Environmental Health Science in the Case of Grand Bois, Louisiana. Science as Culture, 13, 429-448. doi:10.1080/0950543042000311805

Barnes, B., & Bloor, D. (1982). Relativism, Rationalism and the Sociology of Knowledge. In M. Hollis & S. Lukes (Eds.), Rationality and Relativism (pp. 21-47). Oxford: B. Blackwell.

Chalmers, A. F. (2013). What Is This Thing Called Science? (4th ed.). Indianapolis, IN: Hackett Publishing.

Collins, H. M., & Pinch, T. J. (1979). The Construction of the Paranormal: Nothing Unscientific Is Happening. In R. Wallis (Ed.), On the Margins of Science: The Social Construction of Rejected Knowledge (pp. 237-269). Keele, England: University of Keele Press.

Delborne, J. A. (2008). Transgenes and Transgressions: Scientific Dissent as Heterogeneous Practice. Social Studies of Science, 38(4), 509-541. doi:10.1177/0306312708089716

Delborne, J. A. (2011). Constructing Audiences in Scientific Controversy. Social Epistemology: A Journal of Knowledge, Culture and Policy, 25(1), 67-95. doi:10.1080/02691728.2010.534565

Epstein, S. (1996). Impure Science: AIDS, Activism, and the Politics of Knowledge. Berkeley: University of California Press.

Frickel, S., & Moore, K. (Eds.). (2006). The New Political Sociology of Science: Institutions, Networks, and Power. Madison: University of Wisconsin Press.

Gieryn, T. F. (1999). Cultural Boundaries of Science: Credibility on the Line. Chicago: University of Chicago Press.

Kempner, J. (2008). The Chilling Effect: How Do Researchers React to Controversy? PLoS Med, 5(11), e222. doi:10.1371/journal.pmed.0050222

Kinchy, A. J. (2012). Seeds, Science, and Struggle: The Global Politics of Transgenic Crops. Cambridge, MA: MIT Press.

Kleinman, D. L. (2003). Impure Cultures: Biology and the World of Commerce. Madison, WI: University of Wisconsin Press.

Krimsky, S. (2003). Science in the Private Interest: Has the Lure of Profits Corrupted Biomedical Research? Lanham, MD: Rowman and Littlefield Publishers.

Kuehn, R. R. (2004). Suppression of Environmental Science. American Journal of Law & Medicine, 30, 333-369.

Kuhn, T. S. (1996). The Structure of Scientific Revolutions (3rd ed.). Chicago: University of Chicago Press.

Latour, B. (1987). Science in Action: How to Follow Scientists and Engineers through Society. Cambridge, MA: Harvard University Press.

Martin, B. (1981). The Scientific Straightjacket: The Power Structure of Science and the Suppression of Environmental Scholarship. The Ecologist, 11(1), 33-43.

Martin, B. (1991). Scientific Knowledge in Controversy: The Social Dynamics of the Fluoridation Debate. Albany: State University of New York Press.

Martin, B. (1996). Sticking a Needle into Science: The Case of Polio Vaccines and the Origin of AIDS. Social Studies of Science, 26(2), 245-276.

Martin, B. (1999a). Suppression of Dissent in Science. Research in Social Problems and Public Policy, 7, 105-135.

Martin, B. (1999b). Suppressing Research Data: Methods, Context, Accountability, and Responses. Accountability in Research, 6, 333-372.

Martin, B. (2007). Justice Ignited: The Dynamics of Backfire. Lanham, MD: Rowman & Littlefield.

Martin, B. (2010). How to Attack a Scientific Theory and Get Away with It (Usually): The Attempt to Destroy an Origin-of-AIDS Hypothesis. Science as Culture, 19(2), 215-239. doi:10.1080/09505430903186088

Martin, B. (2014a). Censorship and free speech in scientific controversies. Science and Public Policy. doi:10.1093/scipol/scu061

Martin, B. (2014b). On the Suppression of Vaccination Dissent. Science and Engineering Ethics. Advance online publication. doi:10.1007/s11948-014-9530-3

Martin, B., Baker, C. M. A., Manwell, C., & Pugh, C. (1986). Intellectual Suppression: Australian Case Histories, Analysis and Responses. North Ryde, Australia: Angus & Robertson Publishers.

Merton, R. K. (1973). The Normative Structure of Science. In N. W. Storer (Ed.), The Sociology of Science: Theoretical and Empirical Investigations (pp. 267-278). Chicago: University of Chicago Press.

Mitroff, I. I. (1974). Norms and Counter-Norms in a Select Group of the Apollo Moon Scientists: A Case Study of the Ambivalence of Scientists. American Sociological Review, 39(4), 579-595.

Moran, G. (1998). Silencing scientists and scholars in other fields: Power, paradigm controls, peer review, and scholarly communication. Greenwich, CT: Ablex Publishing Corp.

Oreskes, N., & Conway, E. M. (2011). Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (Reprint ed.). New York: Bloomsbury Press.

Pielke, R., Jr. (2007). The Honest Broker: Making Sense of Science in Policy and Politics. Cambridge: Cambridge University Press.

Pinch, T. J. (1979). Normal Explanations of the Paranormal: The Demarcation Problem and Fraud in Parapsychology. Social Studies of Science, 9(3), 329-348.

Proctor, R. N. (1995). Cancer Wars: How Politics Shapes What We Know and Don't Know About Cancer. New York: BasicBooks.

Prothero, D. R. (2013). Reality Check: How Science Deniers Threaten Our Future. Bloomington, IN: Indiana University Press.

Simon, B. (2002). Undead Science: Science Studies and the Afterlife of Cold Fusion. Piscataway, NJ: Rutgers University Press.

Solomon, L. (2010). The Deniers: The World-Renowned Scientists Who Stood Up Against Global Warming Hysteria, Political Persecution, and Fraud (Rev. ed.). Minneapolis, MN: Richard Vigilante Books.

Weller, A. C. (2001). Editorial Peer Review: Its Strengths and Weaknesses. Medford, NJ: Information Today, Inc.