What Is the Social Responsibility of Climate Scientists?
Do scientists have a responsibility to act affirmatively to ensure that our findings are known, understood, and put to use to protect our fellow citizens, even if it means expanding our activities beyond the field and the laboratory? I argue that scientists have a sentinel responsibility to alert society to threats about which ordinary people have no other way of knowing. However, the same expertise that makes a scientist an appropriate sentinel in one or several domains almost necessarily makes them inexpert in other domains. I believe that we should exercise restraint when asked to intercede in areas beyond our proximate expertise.
Many years ago, I read psychiatrist Robert Jay Lifton’s work on Nazi doctors.1 That work has been a touchstone for me in thinking about scientists’ social responsibility and how scientists see their place in the world. Among other things, it taught me at an early age not to assume that educated people can be relied upon to do the right thing.
In hindsight, nearly all right-minded people are appalled by the ways in which large segments of the German medical establishment not only failed to oppose Nazi genocide, but participated in Nazi programs to exterminate Jews, mentally and physically handicapped citizens, and others thought by the Nazis to be undesirable. Would American physicians have behaved differently? Would they behave differently today?
Throughout the late twentieth century, more than a few American doctors collaborated with the tobacco industry, whose products are responsible for eight million preventable deaths each year.2 Historian Robert Proctor has called this an “Auschwitz an annum,” which sounds inflammatory, but is quantitatively an understatement.3 We also know that even doctors who did not work for or with the industry often blithely accepted industry safety reassurances, without making the effort to scrutinize those claims in light of industry intentions and motivations.4 Physicians have also collaborated in dubious ways with Big Pharma: historian Nicolas Rasmussen has argued that physician-pharma collaboration has biased clinical trials in ways that favor the drug companies at the expense of good science and patient health and safety.5 Historians have collaborated with the tobacco industry, as well, leading to distortions in our understanding of this history.6 And during the Cold War, various social scientists and artists collaborated with the CIA in ways that sat in tension, if not overt conflict, with the goals of objectivity and intellectual freedom. Some scholars who claimed to be working to defend intellectual freedom were in fact engaged in projects that undermined and denied it.7
Most scientists, ethicists, and other observers would agree that scientists should not participate in morally dubious activities, nor engage in collaborations that undermine academic freedom and objectivity. These are, as ethicists would say, negative considerations: things we should not do. But what about positive considerations? Do scientists have an obligation to speak out against dubious practices, or to call public attention to threats to public health and well-being? Is it enough to do good science and publish it in reputable peer-reviewed journals, or do scientists also have the obligation to be witnesses, testifying to matters that they as the relevant experts are uniquely positioned to observe, understand, and explain to the rest of us?
A famous example from the earth and environmental sciences involves the ozone hole. In the 1990s, atmospheric chemist Sherwood Rowland shared the Nobel Prize for his work predicting that chlorofluorocarbons could destroy stratospheric ozone, endangering the existence of life on Earth. But Rowland was not just a great scientist; a decade before, he had become a public figure, not only alerting the public and political leaders to the threat but insisting that something needed to be done to address it. As an expert who understood the cause of ozone depletion, he considered it obvious that the solution was to control the chemicals that had caused the problem. Not surprisingly, he was criticized mightily by the chemical industry.8 But he was also criticized by scientific colleagues who took issue with his “activism.” Rowland knew as much about ozone as anyone, yet some colleagues argued that he should be excluded from ozone science assessments, because his activism undermined–or could be viewed as undermining–the objectivity of the process (even though the assessment panels sometimes included industry scientists).9 Rowland’s response to this was to ask: “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”10
What is the point of researching issues that involve public health and safety if we are afraid to warn the public, for fear that we will be viewed as biased? How can politicians or other leaders act on pertinent science if scientists don’t inform them about it? Is the obligation of scientists simply to do the best science possible and leave it to others to explain, publicize, and act upon? Or do scientists have a responsibility, as Rowland believed, to act affirmatively to ensure that our findings are known, understood, and put to use to protect our fellow citizens, even if it means expanding our activities beyond the field and the laboratory?
I come to this issue having faced these questions in my own work, both as a geologist and as a historian. My original training is in Earth science. I earned an undergraduate degree in mining geology, choosing that specialty because I liked its real-life, dirt-under-your-fingernails, shower-after-work quality, and, not incidentally, I wanted to be able to get a job when I graduated. (I also wanted to travel.) I worked for three years as an exploration geologist in the Australian outback, where I helped to evaluate and develop a large polymetallic ore deposit. One of the metals in the deposit was uranium, and my company came under a great deal of scrutiny from Australians opposed to nuclear power. There were protests at our site. Antinuclear activists camped out around the drill rigs that I was supervising.
This was in the early 1980s and the anticipated customer for our uranium was Japan. While I wasn’t entirely convinced of the universal virtues of nuclear power, I did think it was a reasonable option for that country, which had few other obvious energy resources. No one I knew in the mining industry seriously doubted that civilian nuclear power was a reasonable thing to pursue, and therefore that uranium mining for it was likewise reasonable, but I encountered some very negative reactions from people I knew outside the industry. Many people questioned the allegedly sharp distinction between civilian nuclear power and nuclear weaponry, and considered it not unlikely that at least some Australian uranium would end up in bombs. More than a few folks blamed me, personally, for things they didn’t like about nuclear power. Some people I met–at parties, at dances, on vacation–could not believe that I would actually work for a uranium mining company. I remember one party in Melbourne, where a nice young man asked me what I did for a living. When I told him, his reply was: “Really? REALLY?” “Yes, really,” I said, and there the conversation ended.
That was my first personal encounter with the issue of the social responsibility of scientists. I sat at the lowest possible level in my company. I had no executive authority. But many people acted as if I were personally responsible for the ills of nuclear power and nuclear weapons (often combined, rightly or wrongly, in people’s minds). In some ways they were right. While I was a low-level employee in a position of no authority, if I worked in uranium mining, then I did bear some responsibility, however small, for the consequences of nuclear technologies. My job was at the base of the nuclear fuel cycle: doing the basic science that enabled our company to find and mine uranium ores, to be processed into nuclear fuel rods used in nuclear reactors.
I took on board the responsibility to become educated about nuclear power. The more I learned about the history of American nuclear power, including its two central failed promises–of electricity “too cheap to meter” and of easy waste disposal–the less persuaded I became that it made much sense, particularly in the United States where we had other, better options. I didn’t think that nuclear power was evil–and I still don’t. I believed that the distinction between reactor-grade and weapons-grade fuel was pertinent: the uranium ore we were mining could not be easily converted to fuel for a bomb. But I realized that there were many significant unanswered questions, and that people’s discomfort with nuclear technologies was not irrational. In particular, I learned that the U.S. government had a long history of lying and dissembling on matters nuclear, as well as overstating the promise and downplaying the risks of civilian applications. And then, in 1986, the Chernobyl nuclear disaster occurred.
Many American scientists insisted that the Chernobyl disaster wasn’t “relevant” to the safety of American and European reactors, because the accident had happened in the Soviet Union, which was obviously corrupt, and because the reactor was a graphite-moderated one, a dangerous design that was not used in U.S. commercial reactors. Meanwhile, I had moved on to graduate school, where I was in the process of becoming a historian and philosopher of science. Nuclear power generation more or less faded as a pressing issue from my life, although I tracked the progress (or lack thereof) of the proposed nuclear waste repository at Yucca Mountain, where many geologists were then employed.
I also made a surprising discovery, one that revealed to me how closely the nuclear fuel cycle was intertwined with American science, writ large. As part of my Ph.D. research, I undertook geochemical modeling of the ore deposit on which I had worked in Australia, only to discover that there was surprisingly little high-quality thermodynamic data available for common minerals in our ore deposit, including quartz (SiO2) and hematite (Fe2O3), yet astonishingly good data for rare and obscure lanthanide and actinide series minerals. The reason? The latter had been closely studied by the U.S. Department of Energy for their pertinence in nuclear waste disposal. Thus, I developed an early insight into how political considerations shape what we do and don’t know about the world.
Fast-forward twenty years. I am now, in the mid-2000s, a historian working at the University of California on the history of climate science. As I began to write and speak about the scientific consensus on climate change, I was personally attacked. I started to receive hate mail and threatening telephone calls. A group of people filed complaints against me, challenged my work, and tried to get me fired from my job. A senator from Oklahoma, of whom I had at that time never heard, accused me of being part of a “liberal conspiracy to bring down global capitalism.” This was all very odd. All I was doing–in my own mind–was explaining the state of the science. But others did not see it that way.
That was a frightening time, far more troubling than what I encountered in Australia. In Australia, I knew that my company–rightly or wrongly–would be influenced not one iota by bedraggled, antinuclear protesters. I did not know whether the University of California would be influenced by my attackers, in part because I largely did not know who they were, and the one I did know was a U.S. senator! Moreover, in Australia, I considered it possible that the protesters were right. But in California, I knew, for sure, that the attacks on science that I was uncovering were deeply wrong. I knew I had discovered something important. I realized that if someone was trying to shut me down, it meant I had to stand up. But I would be lying if I didn’t admit to more than a few sleepless nights.
I share this personal story to make clear that I understand and sympathize with colleagues who want to lie low. In Australia, it would have been easy simply to say to myself, “that’s above my pay grade.” In California, it would have been safer to retreat. Moreover, it’s not just a matter of safety. Most scientists just want to do science. It is what we trained to do. It is what we are good at. On some level, it is who we are. But the world sometimes forces us to make choices that no one prepared us for.
When I got attacked, I could have been frightened and intimidated. I was frightened. But I also realized that something significant was going on. One thing that made a difference for me (in addition to the fact that the University of California did stand by me) was that I soon learned that I was not alone. Several climate scientists had been attacked, too. It helped that I was a historian as well as a scientist, because I began to think about what was happening to me not in personal terms, but in historical ones: Why am I (and others) being pressured when we speak up about the facts of climate change? Where is this coming from and who are these people? Why would a senator from Oklahoma attack a historian of science over a paper in a peer-reviewed journal? Most scientific papers never even get read; why had mine loosed a torrent of political abuse?
There are different ways that we can respond to outside pressure, and in the past few years I’ve tried to understand why scientists respond in the ways that they do. In particular, I’ve tried to understand why it’s been so difficult for most of my scientific colleagues in the Earth sciences to respond in efficacious ways.
I now think that scientists are different from other professionals in that other professionals have clients. Physicians have patients. Lawyers, psychologists, and engineers have identifiable clients paying for their time. These professionals all recognize some kinds of obligations, often articulated by professional codes of conduct. According to these codes, certain forms of public statements or actions may be disallowed or, alternatively, obligatory. Often these codes of conduct are historically linked to professional licensing arrangements. A physician who egregiously violates medical norms can lose her license, a lawyer can be disbarred, an engineer can be decertified. But in science, although we may have identifiable patrons, we don’t have clearly identifiable clients. And, with some exceptions, we don’t have formal licensing agreements. Perhaps for these reasons, we have few formal codes of conduct that govern our behavior. Scientists are for the most part left to our own devices to figure out how to behave.
Scientists can be discredited, but there’s no formal means to exclude, dishonor, or shame a scientist who has misbehaved (or might be construed to have misbehaved). In most cases, there’s no formal code of conduct that enables us to say that a scientist has transgressed. However, and perhaps for this reason, scientists are very sensitive to their community norms. In my experience, scientists tend to be extremely sensitive to the opinions of their colleagues, more than to any sense of obligation to funders or to society as a whole. Many scientists, for example, have told me that they are cautious in what they say about climate change for fear of damaging their reputations. The harm they fear is not public censure, but collegial disapproval, and they anticipate that disapproval to arise primarily from speaking up, grandstanding, or overstating a threat. The societal harm that may come from understating a threat seems (in most cases) to be of much less concern.11 Perhaps a lack of formal codes of conduct makes scientists more sensitive to community norms than other kinds of professionals, because community norms are all that scientists have.12
These concerns came to the fore in my work with climate scientist Michael Oppenheimer and philosopher Dale Jamieson on scientific assessments for environmental policy. We found that earth and environmental scientists are highly attuned to community norms and fearful of collegial censure. When we asked scientists about speaking up in public, many said things along the lines of: “I’ll lose credibility.” But with whom do they fear losing credibility? Our evidence suggests it is not the public (whoever they conceive that to be), nor political leaders, but their professional colleagues.13
As a cautionary tale, many climate scientists point to climate modeler James Hansen, who first testified in Congress in 1988. They say things such as, “Just look at Jim Hansen.” (I can remember colleagues in the late 1980s and early 1990s criticizing Hansen for being too vocal, too public. Many thought he had gone “out on a limb.”) Hansen himself has criticized his colleagues for reticence, which he has identified as a community norm.14 But I know of no evidence that the public at large considers Hansen to have lost credibility when he became a public figure. On the contrary, to many in the public today, Hansen is a hero.15 He is almost certainly the most well-known of climate scientists. And he has won innumerable prizes, of both the scientific and the public sort. In 2007, for example, he won the Dan David Prize, a sort of Nobel Prize in areas not recognized by the Nobel itself. This hardly suggests a loss of public credibility.
Why should scientists involved in environmental assessments criticize colleagues who speak out on environmental matters? After all, these assessments exist to inform public policy on issues that potentially affect large numbers of people, or even the entire population of the planet. Surely, the very fact of participating in such an assessment implies a sense of larger obligation? In theory, perhaps, but we have found that scientists do not generally express a strong sense of obligation to the entire population. (And sometimes they express no sense of such obligation at all.) They do, however, express a strong sense of obligation to each other, and to their disciplines. I think this explains why Hansen bothers them. Climate scientists see Hansen as someone who stepped outside the fold: he called attention to himself, sounded an alarm, and didn’t wait for the rest of his colleagues to reach the same conclusions that he had reached.
Science is a collective enterprise in which scientists attend with great seriousness to the work and conclusions of their colleagues, for it is through this attention that scientific questions are mooted and resolved.16 This is what makes science reliable, but it can also make scientists behaviorally conservative. They are always metaphorically–and sometimes literally–looking over their shoulders to see what their colleagues think.
Another line of argument relating to scientific responsibility emerges from my work on the history of Cold War Earth science, and the role of U.S. Navy funding of oceanography and marine geophysics during World War II and the Cold War.17
During the twentieth century, there was a major change in how earth scientists interacted with people outside of their discipline. Before World War II, most American earth scientists were poorly funded; what little funding they had came from state governments, private philanthropy, private industry, or from the public through book royalties, payments for magazine and newspaper articles, and public lectures. Scientists who wrote popular books or gave public lectures had to find ways to communicate to nonspecialists. They had to be concerned with public interests and opinions.
During the war, however, this changed, and in the late 1940s and 1950s, the rise of scientific research support through specialized federal government agencies such as the National Science Foundation, Defense Advanced Research Projects Agency (DARPA), and the Office of Naval Research made scientists less dependent on the general public and more dependent on governmental patrons. This shifted their sense of where their obligations lay. Moreover, these postwar agencies often had program directors who were themselves scientists. Increasingly, scientists obtained funding from programs that were designed by scientists, and in quite a few cases, run in part by scientists. Many American scientific communities became what historian Paul Edwards has called “closed worlds,” in which the demands of military secrecy limited their interactions with people outside those worlds, and even with other scientists outside their fields of specialization.18
As the Cold War progressed, scientists increasingly worked in these closed worlds. They had far less interaction with general publics (and even with scientists in other fields) than they did before World War II. The Cold War also created a context in which speaking up about certain kinds of threats could be perceived as disloyal. Many scientists in the Cold War came to feel that if they spoke up against American weapons programs, for example, they would be perceived as disloyal to America, as famously happened to physicist J. Robert Oppenheimer.19
These conditions have left a lasting legacy. One example is documented in my forthcoming book Science on a Mission: How Military Funding Shaped What We Do and Don’t Know about the Ocean. It involves a major controversy that erupted in the 1990s, when physical oceanographers proposed a project to demonstrate global warming by measuring the warming of the oceans. These oceanographers had a long history of collaboration with the U.S. Navy, but no history of engagement with environmental groups and scant engagement even with biologists. Perhaps for this reason, they failed to consider the effects that their project might have on marine life. This led cetacean biologists–along with many others–to oppose the project. The oceanographers also failed to realize that, because it could adversely affect marine life, their proposed project might violate the law (specifically the Marine Mammal Protection Act). A consortium of environmental, community, and animal protection groups filed a lawsuit to stop the project. And they succeeded. Although the project might well have been valuable scientifically, it was stopped.
The physical and intellectual isolation of Cold War oceanographers affected their sense of the scope and character of their responsibilities, and to whom they thought they had obligations. Physical oceanographers working with the U.S. Navy understood that they needed Navy approval–for funding, for the use of instrumentation, for access to infrastructure–but they failed to consider that they also needed the approval of scientists in other fields, of environmentalists, and of the public. They even failed to consider that they needed to obey the law! When they took on the task of measuring the temperature of the ocean, they did so in the name of “society,” which, they insisted, needed a definitive answer to the question of whether the planet was warming up. But their approach failed because it was insensitive to what “society” as a whole really wanted. Some parts of society didn’t want an answer to the question, and many of those who did didn’t want it in the form that scientists were offering.
The available evidence suggests that the group to whom natural scientists feel responsible–and whose censure they fear if things go wrong–is not society, but fellow scientists, and, more specifically, scientists in their own discipline. This accounts for the reticence about which James Hansen has complained and that my colleagues and I found in our own research: scientists are afraid to speak out on policy-sensitive issues lest their colleagues criticize them for it. But it also puts them in an awkward position: the public or policy-makers may want scientists to tell them clearly if something dreadful is about to happen, but scientists are often afraid to do so lest their colleagues disapprove.
What many scientists fail to appreciate, however, is that our views of the appropriate role of science and scientists are historically contingent. During the Cold War, many distinguished physicists, including Hans Bethe, Niels Bohr, Albert Einstein, and Philip Morrison, spoke strongly about the risks of nuclear proliferation, and many argued the need for arms control. These men were highly articulate spokespeople who helped to shape the public conversation over nuclear weapons. They were able to do so, in part, because their expertise qua physicists gave them a particularly acute appreciation of what an uncontrolled arms race would lead to.
Now, a new set of issues has come to the fore, but the basic situation–of an existential threat that scientists are in a position to understand and explain–is comparable. Physicists served as sentinels in the Cold War; climate scientists are serving as sentinels now. And that, in my view, is as it should be, because scientists do have a general obligation to the society they serve, particularly when our research is taxpayer funded. In the United States, that means most basic research, and a good deal of applied research, too. It includes scientists working in national laboratories and federal agencies, and most scientists working in academia. In that sense, we do have clients, and they are the American people. To the extent that we justify our work by its value to humanity, then our clients are all humanity.
This obligation, in my view, includes education and communication, with which most scientists are reasonably comfortable if they get the right institutional support. But there’s a more specific obligation.20 It is what I have called the sentinel obligation. It is, in effect, a duty to warn.
Many areas of scientific research are of interest and significance primarily, or even exclusively, to other scientists. But not all. There are certain kinds of problems in the world that matter profoundly beyond the halls of science, but that we would not know about were it not for scientific expertise. Think again about Sherwood Rowland and the ozone hole. If he and his fellow atmospheric chemists had not spoken up to alert us to the possibility that chlorofluorocarbons could deplete stratospheric ozone, we would not have known that was the case, and we would not have had the Montreal Protocol.
Now imagine the following scenario. Fast-forward fifty years. Physicians have noted that the rate of cataracts and skin cancer is skyrocketing. Horticulturalists have noticed that certain plants are exhibiting strange pathologies. Farmers have noted increased livestock mortality and decreased crop yields. These alarming phenomena are noticed by different experts and lay people, and at first no one realizes that they are part of a single story.
At some point, however, someone suggests that they might be related, or at least the skin cancers and cataracts, since these are known to be caused by excessive exposure to ultraviolet radiation. A commission is empaneled, perhaps at the National ÇďżűĘÓƵ of Sciences. The commissioners dig through the scientific literature and they find that, in the 1970s, Sherwood Rowland, Mario Molina, and Paul Crutzen predicted stratospheric ozone depletion, which can cause exactly the effects now being observed. However, the scientists had only ever published their work in scientific journals, so the public and political leaders never learned of it and therefore nothing was done. Now, fifty years later, it is too late to fix. The world must scramble to build a new form of wholly indoor life, or invent UV protective clothing or some other means to live on a now very dangerous planet.
Fortunately for us, Rowland and his colleagues did speak out. They acted as sentinels–alerting us to an imminent danger–and our political leaders acted successfully to avert the threat and protect life on Earth. Disruptive climate change is bigger and more difficult to solve than the ozone hole, but the ozone example demonstrates the essential role that scientists play as sentinels. Scientists need to be sentinels on emerging problems about which ordinary people have no other way of knowing. They must do this; there is no one else who can.
How far should scientists go in accepting a public role? Once one adopts a sentinel role, one will likely soon face the question: “So what do we do about it?” Then things get more complicated. There is an enormous temptation to answer that question, because there you are. You are being asked and of course you have an opinion. If you’re a scientist, you may think that you are a good deal smarter and better informed than most citizens. And perhaps you are.
But if you are a natural scientist, then the very expertise that enabled you to be a sentinel also makes you unlikely to be an expert about the solutions, which often are largely legal, technological, economic, regulatory, or otherwise social. Solving the problems that natural scientists identify usually means passing the baton to other experts. Thus, my colleagues and I have introduced the concept of proximate expertise. As professionals, we have expertise that makes us the appropriate individuals to speak up on particular challenges, problems, and threats, but that very expertise means that we will typically not be experts on other matters. On those other matters, we should in most cases exercise restraint.
For example, as a geologist/geochemist, I have some degree of expertise to talk about carbon sequestration, because I know quite a bit about how carbon dioxide reacts with water and rocks in the subsurface. I also know something about the problem of overpressuring of the subsurface. In fact, I know more about these matters than many climate modelers. Expressing a view on carbon sequestration could, therefore, be viewed as within my range of proximate expertise. As a person with broad knowledge of the Earth sciences, I might have a well-informed expert opinion on solar radiation management, as well. However, I am not an expert about many other possible questions related to the solutions to climate change. As a historian, I may have insights into how certain proposed solutions are likely to work or fail, or what it might take to generate broad support for them. But I am certainly not an expert, for example, on carbon pricing systems. For that, I need to turn to other people.
An obvious cautionary example of scientists disrespecting the boundaries of expertise appears in my work with historian Erik M. Conway. In Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, we showed how a group of prominent physicists rejected the conclusions of their colleagues in public health and oncology to make common cause with the tobacco industry and cast doubt on the science that demonstrated the harms of tobacco use. From there, they went on to cast doubt on the science that demonstrated a set of other environmental and public health threats: acid rain, the ozone hole, and global warming. In our book, we argue that the range of sciences across which they spread doubt should have been a red flag to any onlooker: no one could be a credible expert on so many different topics. The fact that they cast doubt on scientific findings in radically diverse domains was a “tell” that they were motivated by something other than their own scientific knowledge and expertise.
Expertise is by definition specific, and so the obligation to speak up in our areas of expertise implies a reciprocal obligation to respect the expertise of others. Put another way: we have obligations both to speak and to listen. We need to speak up, to act as sentinels, and to be witnessing professionals in our domain of expertise, but we also need to act with respect for colleagues who are the appropriate witnessing professionals in other domains.
This is not to say that as scientists, we give up our rights as citizens when we earn our Ph.D.s. As citizens, we will all have views on many matters and we are always within our rights to comment, talk, discuss, and vote according to our views. Moreover, sometimes it will be appropriate for us to stand up and be counted as both citizens and scientists, for example on matters that involve defending science, or the environment, or public health generally.
Expertise, moreover, is not an either/or proposition; there are areas about which I know a great deal, areas about which I know more than the average person but less than the experts, and areas about which I know very little. It can be tempting to express opinions, particularly in that middle domain, even when it would be better to refer people to others with greater expertise. It requires humility and mindfulness to exercise appropriate restraint, particularly when others press you for an answer.
What I am proposing is admittedly not always easy. I have had the experience of trying to refer journalists to more appropriate experts, only to have them insist that I was the “name” in their Rolodex, that they did not have time to make another phone call before their 5 p.m. deadline, or even that they needed a quotation from someone in the “Ivy League.” (One reporter once told me that if he quoted someone at Harvard who turned out to be wrong, his editor would be unvexed, but if he quoted someone from the University of Oklahoma who turned out to be wrong, then he’d face a pile of questions about why he had quoted that person.) This is laziness, against which we should push back. Even when journalists resist, I often say, “Look, I’m not an expert on that issue, but my colleague, Irene Doe, is. Please call her. Here is her number.” Besides being the right thing to do, it also reminds my interlocutors that expertise is a complex thing. If we really want to understand and solve any problem, particularly one as multifaceted as climate change, we must employ all the expertise that we have.
Endnotes
- 1. Robert Jay Lifton, The Nazi Doctors: Medical Killing and the Psychology of Genocide (New York: Basic Books, 1986).
- 2. World Health Organization, “Tobacco,” May 27, 2020.
- 3. Robert Proctor, Golden Holocaust: The Origins of the Cigarette Catastrophe and the Case for Abolition (Los Angeles: The University of California Press, 2012).
- 4. Allan Brandt, The Cigarette Century: The Rise, Fall, and Deadly Persistence of the Product that Defined America (New York: Basic Books, 2007); and Proctor, Golden Holocaust.
- 5. Nicolas Rasmussen, “The Drug Industry and Clinical Research in Interwar America: Three Types of Physician Collaborator,” Bulletin of the History of Medicine 79 (1) (2005): 50–80.
- 6. Nicolas Rasmussen and Robert Proctor, “From Maverick to Mole: John C. Burnham, Tobacco Consultant,” Isis 110 (4) (2019): 779–783. See also Jon Wiener, “Big Tobacco and the Historians,” The Nation, February 25, 2010.
- 7. Frances Stonor Saunders, The Cultural Cold War: The CIA and the World of Arts and Letters (London: The New Press, 2000).
- 8. Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (New York: Bloomsbury, 2010); and Michael Oppenheimer, Naomi Oreskes, Dale Jamieson, et al., Discerning Experts: The Practices of Scientific Assessment for Public Policy (Chicago: The University of Chicago Press, 2019). On the science of ozone depletion, see also Erik M. Conway, Atmospheric Science at NASA (Baltimore: The Johns Hopkins University Press, 2008).
- 9. Oppenheimer et al., Discerning Experts.
- 10. Nathan J. Robinson, “,” Current Affairs, February 4, 2019.
- 11. Elisabeth A. Lloyd and Naomi Oreskes, “Climate Change Attribution: When Is It Appropriate to Accept New Methods?” Earth’s Future 6 (3) (2018): 311–325; and Michael E. Mann, Elisabeth A. Lloyd, and Naomi Oreskes, “Assessing Climate Change Impacts on Extreme Weather Events: The Case for an Alternative (Bayesian) Approach,” Climatic Change 144 (2017).
- 12. It seems plausible that scientists would worry that collegial disapproval could lead to loss of funding, or to papers not making it through peer review, but the scientists we interviewed did not discuss this. Their concern seemed to be a more general one about harming their reputations, in an abstract way, rather than specific concrete consequences.
- 13. Oppenheimer et al., Discerning Experts.
- 14. James E. Hansen, “Scientific Reticence and Sea Level Rise,” Environmental Research Letters 2 (2007). See also Keynyn Brysse, Naomi Oreskes, Jessica O’Reilly, and Michael Oppenheimer, “Climate Change Prediction: Erring on the Side of Least Drama?” Global Environmental Change 23 (1) (2013): 327–333.
- 15. The popular press often refers to him as a hero. See, for example, Ben Block, “A Look Back at James Hansen’s Seminal Testimony on Climate,” part one, Grist, June 16, 2008, https://grist.org/article/a-climate-hero-the-early-years/. I have referred to him as a tragic hero; see Oliver Milman, “Ex-NASA Scientist: 30 Years On, World Is Failing ‘Miserably’ to Address Climate Change,” The Guardian, June 19, 2018.
- 16. Naomi Oreskes, Why Trust Science? (Princeton, N.J.: Princeton University Press, 2019).
- 17. Naomi Oreskes, Science on a Mission: How Military Funding Shaped What We Do and Don’t Know about the Ocean (Chicago: The University of Chicago Press, forthcoming 2020).
- 18. Paul Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, Mass.: The MIT Press, 1997).
- 19. Jessica Wang, American Science in an Age of Anxiety: Scientists, Anticommunism, and the Cold War (Chapel Hill: University of North Carolina Press, 1999).
- 20. Naomi Oreskes, “The Scientist as Sentinel,” Limn 3 (2013): 69–71.