Tent of Bad Science 2: Why Trust Science?
START - Start Talking About Research Today - was Trinity's European Researchers' Night event in 2020.
This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 955428.
Jenny Daly: Welcome back to What Do You Want To Know, where we are bringing you a mini-series called the Tent of Bad Science. The tent was part of the programme at START, which stands for Start Talking About Research Today. START was Trinity College Dublin's European Researchers' Night event in November 2020. We hosted four public discussions on topics like fake news, climate change, the importance of how we tell stories, and power and protest, and we're sharing the recordings of those events with you now. Enjoy!
Linda Doyle: Hello everybody. My name is Linda Doyle. I'm Professor of Engineering and the Arts here in Trinity College and I'm delighted to be here to welcome everybody to our event in The Tent of Bad Science this evening. Just to let everyone know, our event, called Why Trust Science, is part of the Marie Curie Actions and funded through the European Researchers' Night. And like many things that I'm sure people are tired of discussing at this stage, European Researchers' Night would initially have taken place back in September, and we would have been in the flesh - meeting each other and doing all sorts of exciting things - and like everything else, it had to move online. And I have to say it has done so brilliantly, not least because of Jenny Daly, Michael Foley, and Sarah Bowman, who are very much behind lots of the events that we're having. So, can I start by introducing Professor Naomi Oreskes? Most people here will know that she's a world-renowned geologist, historian, and public speaker, and she's joining us from Harvard University.
She's a leading voice on the role of science in society. Her works include Merchants of Doubt, The Collapse of Western Civilization, Discerning Experts, and we've taken the title of her book, Why Trust Science, for this event. So, Naomi, we're delighted to have you here and we're delighted to use the title of that book! We also have joining us Dr. Jacob Erickson from here in Trinity. He's a theologian and ethicist and his research focuses on the intersections of religion and ecology, climate crisis, ecological justice, and ethical perspectives of being human in the wake of global warming. And I've heard Jacob speak many times and I never come away disappointed - he always brings such a different perspective on things. It's fantastic that you're here with us, Jacob, and the final person on the panel is Susan Murphy. She's a lecturer in development practice here in the School of Natural Sciences.
Her research interests are in international political theory, global justice, human rights and climate change, and she focuses hugely on social inclusion. And again, every time I hear Susan, it always makes me rethink, I suppose, my own perception and my own biases on things and gives me a very different perspective.
So the way it's going to go, I'm going to start by asking a few questions and then the panel can discuss things among themselves and we’ll come to audience questions and answers. When we were getting together, you know, preparing for this, one of the things I noticed that I am actually not needed here at all, that you have three amazing people who could, I think, happily discuss very fantastic issues with just me saying “Go, off you go.” But I am going to start with is just a small few questions, and I'm going to start with Naomi, and Naomi, maybe we could actually talk about the title of your book and ask why trust science? And maybe you’d unpack that for us, please. Thank you.
Naomi Oreskes: Of course. Thank you. Well, first of all, I just want to say it's really a pleasure to be here with all of you today. And it's a little known fact about me that I did my honors thesis in Ireland, in County Kildare, looking at some rocks there, and I love Ireland and I wish I could be with all of you today in person, but being here on Zoom is the next best thing. So the short answer to the question why trust science is because it works. Science has been around for a long time. Science as we know it has existed for at least 400 years, arguably longer. It has an extremely long, well-documented, and well-studied track record of success in explaining the natural world and enabling us to use that knowledge to do things in the world like cure disease or better understand geological history or send rockets into space and improve weather forecasting.
So we have this very long track record of science functioning and functioning well, and if you think about it, you realize that science is actually one of our oldest continuously existing human institutions. So if you think about, say, the Royal Society or l'Académie Française, these are institutions that have been around for 400, 500 years - much longer than most governments, much longer than any media organization you can think of, and indeed some of our oldest journals are scientific journals like the Proceedings of the Royal Society or Nature. So it has this very, very long track record, and we can look at that track record and say yes, by and large science has proved trustworthy. There's a second reason, which I emphasize strongly in my book - and thank you, it's a big compliment to me that you've named this session after my book.
It's something that people often don't think about, and that sometimes surprises people when I say it: science is a job. Scientists are people whose job it is to understand the natural world. And so if we ask ourselves, well, why should we trust scientists when they tell us, you know, that if we wear masks that will help stop the spread of COVID-19? Well, one way to answer that question is to bring it down to earth. Science is often exalted, sometimes even deified - people often talk about science in these very lofty terms - but I like to talk about science as a job. So if you had a toothache, you would go to the dentist and you would trust your dentist to fix your tooth because that's what he or she is trained to do and has experience in doing, and you wouldn't call an electrician to fix your tooth, but conversely, if you had an electrical problem, you wouldn't call your dentist.
And we all know that, it's sort of obvious when you talk about it that way, because we understand that an electrician understands electricity, hopefully, and a dentist understands teeth, hopefully, and mostly in our experience, that turns out to be true. So if we think of science that way, then we realize that it's actually a very simple argument to make, to say, well, if we have a scientific question, who do we call? We call scientists. And if we're not sure about an issue and we need clarification about something like climate change or, you know, the contagiousness of COVID-19 or where this virus came from, we would do well not to trust, you know, the Murdoch media or really any media - we would do well to trust the people whose job it is to study these things. And that means scientists. But more than that, it means specifically the scientists who have expertise in this area.
So a really important part of my argument, which I discuss in the book, is that when we talk about trusting science, we use science as a kind of grab-bag term for a whole diversity of different fields. But it's actually really important to understand: does this person have expertise in immunology or virology or atmospheric physics? And if they don't, then they're talking out of the side of their mouth and maybe we actually should be a bit skeptical. And we definitely see that in the media. The third argument in the book has to do with something that probably most of us were taught in school, which is the scientific method. So if you ask people the question, well, do you think science is trustworthy, and if so, why? One answer you often get is that scientists use the scientific method. And so the idea is that there's some kind of method that guarantees the reliability and the trustworthiness of scientific findings.
Well, I'm sorry to say that that's untrue, that's a myth - in fact there is no singular scientific method. Historians and philosophers have spent an awful lot of time trying to find it. And what we've found is that we can't find it, because there isn't one. What there is, is a diversity of methods: scientists in different fields and at different times in history have used different methods. But there is one thing that they all have in common, and this is the critical vetting of claims. As a scientist, if I think I've made a discovery, that's not where my job ends - that's where my job begins, because now I have to take that claim to my peers, to the juries of my peers, to my fellow scientists. I have to present it at a conference or a workshop. I have to give seminars. And ultimately, I have to publish it in a peer-reviewed journal. And through that process, what do my colleagues do? They don't just say, thank you, Naomi, this is great. No, they say, wait, hold on. What about this? I'm not sure your sample size is big enough. I'm not sure about the conclusions you drew from that data. So there's this very critical process that all scientific claims go through before they can be published and before they become accepted as scientific knowledge. The feminist philosopher of science Helen Longino has called this transformative interrogation. And I really like that phrase because I think it captures two essential things about this process. The first is the transformative part: through this critical process of questioning and arguing and vetting claims, claims are transformed from just a claim or just a theory, or just an opinion, or just my view, to knowledge.
So if there's anything sort of magical about science - and I actually don't think any of this is magical, but if there is anything that's, let's say, alchemical - it's this transformation of opinions or theories into knowledge and facts. And that happens through this critical process. I also use the term interrogation because it is important to acknowledge that it's not always nice. It's pretty tough. There's tough vetting of claims as part of how scientists weed out claims that are not supported by evidence. So what emerges as knowledge are things that are very, very well supported by evidence, and ideally by different kinds of evidence from different approaches. And that leads to one final thing I'd like to say before passing it on. So there is no singular scientific method. The idea is that what makes knowledge claims reliable is that they've gone through a tough vetting process, but that vetting process is more likely to work well when scientists look at the problem from a lot of different angles and ideally use a diversity of different methods to test claims, and that is more likely to occur when a scientific community is diverse.
And so this is one of the most important arguments for why it's really essential, it really behooves us to strive to have and sustain diversity in science. Because if we have a diverse community looking at a problem from a variety of different angles, it's much more likely that the findings that finally result will not be biased by the views of just, you know, sort of an in-group of people or something, but will actually have been really thoroughly tested and any individual bias has been identified and hopefully accounted for. And so, part of the argument I make in the book is that for science to work well, it does need to be diverse. And if we're not sure about a scientific claim one question we can ask is, well, have scientists been studying this for a while? Have they looked at it from a variety of different angles? And is the community involved a diverse one? If the answer to any one of those is no, then we might want to scrutinize the claim a little bit more thoroughly, but if the answer to those questions is yes, then in most cases, I think we can count on the knowledge as likely to be trustworthy.
LD: Thank you, Naomi. You brought up so many interesting points there. I'm sure there'll be tons of questions. I love the phrase transformative interrogation. I think it's a great phrase, and I suppose it's much more positive than what I often consider academic life to be - a life of rejection of ideas. So transformative interrogation is a great term. But speaking of language, the language of trust is very compelling, Jake, and as an ethicist and theologian, what's your take on the subject?
Jacob Erickson: Yeah. Well, first, thanks for having me. And I have to say it's somewhat intimidating going after Professor Oreskes because, of course, Merchants of Doubt was such an important book for me, thinking about climate ethics, and thinking about power, and a number of issues. And this book, Why Trust Science, is really extraordinary for the ways that it serves as a kind of sequel to that, and I'm really grateful for it. But I find the language of trust really compelling because I think - whether we're simply more aware of it now, or the politics of the last few years have made us more self-aware of it - our values and our human experiences seep into the scientific enterprise and seep into the ways that we engage all of the contemporary crises and scientific inquiries of our time.
And so it's precisely that humanizing point that Naomi mentioned at the beginning that's really interesting to me - science as a job. And I think of ethics as a job too - I'm an ethicist who is interrogating claims about morals and values. But science is a job and a relational practice of communities: people actually engaging in scientific deliberation, asking questions about the world, coming to greater understandings of the world, or not. And I think my job as an ethicist is really to step back and ask, okay, why are we asking these particular scientific questions? What values are going into those scientific discussions, and are folks involved in those scientific discussions aware of those values? We could call it bias - a lot of scientists call that checking one's bias - but I think it's perhaps not quite as negative as simply weeding out bias.
Sometimes values are really important for why we're asking certain questions. I may be a scientist interested in researching cancer because I have family members who've had cancer and want to actually help them. I may be interested in doing scientific deliberation or ethical reflection on climate precisely because I'm concerned about the future of the planet. And so these values become places and touchstones of commonality for us, where we can engage one another, where we can find motivation, and where we can ask whether the questions being asked by scientists in their expertise are in fact worthy of our trust. As a religious ethicist as well, I work with a lot of communities that deal with complexity, with questions of climate denial, and all of these issues.
And so my role becomes really asking: okay, if science is a human practice that has these tried and tested ways of answering questions for us, how might we engage in ethical and values conversations that help us become more aware of what our own values are; what those values do and how they're transformed in the world; and also how they might in fact motivate change, or not? Do we just get back to normal, out of COVID? Or do we want to tell a different kind of story? So I think the trust language becomes important for me in all of those complicated realms, but really, at the end of the day, it's about thinking about the ways that self-reflection happens in science, what values we have, and how to interrogate, really, the power dynamics of those conversations.
Who's sitting at the table when scientific discoveries are being made - the diversity point that Naomi made? Are the problems that folks actually have being addressed? Are the people who are being affected by climate change the most actually at the table with us, helping to reflect on the decisions that we need to make politically? So trust becomes a relational thing for me and not a deified moment. And then just finally: religion and science conversations often revolve around notions of what is true or factual or not. Oftentimes we tend to think of the language of truth as located in this non-fiction category. What is true is factual, in this non-fiction category; science, in its deified way, is true, and everything else is just fiction, which means it's not really real.
But of course, as a humanities scholar and as a person who studies religious stories and poetry and novels and fiction, human beings make and find truth in all kinds of ways - in ways that help them hold contradiction, in ways that help them tell better stories, whether they be scientific ones or religious ones, about their lives and where those stories connect or don't. So I think the language of trust places it back in the human, a human element, and I'm really grateful for that.
LD: Thanks, Jake, that's very, very interesting. And you went back to a point that Naomi made, and I thought about the question you were asking - one of those three questions about, you know, who's included in that discussion: is the community involved a diverse one? And if the answer to that is no, we need to think again. So that leads me to you, Susan, and your research is very, very much focused on who's included and who's not. So when we think about trust and science, what are we getting wrong then?
Susan Murphy: Thanks so much, Linda, and thank you to you and to the organizers for inviting me to participate in this event today. I'm very honored to be here with this fantastic panel and this team of people. I've been teaching climate justice and thinking about climate justice related issues for the past decade here in Trinity College. And so the conversations, the inputs that Naomi has shared, the focus that Jake has brought to climate justice related matters and climate ethics, and indeed thinking relationally, very much speak to my own work. So interestingly, we've lots of agreement here, I think, across the three panelists. I'm sure we're going to find points to debate, but there is quite a degree of consistency, I think, in terms of the perspectives. But I suppose I wanted to share my thoughts around this question, really focusing on three different dimensions.
The first one is linked to the idea of the concept of trust and trustworthiness itself. Secondly, just to briefly focus a little bit on some of the efforts of the scientific community in particular, to bridge that gap in trust, by focusing very heavily on science and science communication in particular as means of improving trust and what works within that and indeed, you know, where there are still some gaps and limitations with this approach. And then thirdly, I just want to think a little bit, or introduce some ideas around what I might call the translational problem. So that's the tricky step between moving from trustworthy evidence and a scientific consensus towards solutions and trustworthy policies, because I think that this is where some of the most pressing challenges are, that we need to consider particularly in relation to climate change on the climate debates.
So I suppose to come to that first point, the idea of trust and trustworthiness: very much to reiterate the point that Jake has so eloquently made, trust is not a fixed entity. It is essentially relational. It's not something that anyone - any single individual or institution - commands indefinitely. And it's very, very distinct from belief and faith. It has to be earned and it has to be maintained; it emerges, and is strengthened or weakened, through processes of engagement and interaction, checking and challenging. And these are welcome elements for scientists, but they can sometimes lead to degrees of confusion, or create opportunities for intentionally building a sense of mistrust. Onora O'Neill has argued that the degree to which an individual or an institution is trusted is very much linked to their integrity, to their honesty, to their competence, and also to the fact that they are reliably honest and reliably competent.
So there's that idea of continued engagement and maintenance. But we do know - and there have been instances and examples, and Naomi has covered them very well in her work - that individuals, and indeed some institutions, can become captured too, either through cognitive bias or indeed through special interests. And this, of course, creates a space for politics and for disruption and for intentional confusion and so on. So trust in scientific findings can be established - there can be very specific understandings rooted in the best available scientific evidence and consensus - but then there is that ongoing process of maintaining trust as the evidence base becomes refracted through the policy-making process, and political ideologies, and varying world views. So if we look at the area of climate change in particular, indications suggest that there is a growing degree of trust in the scientific consensus around climate change, particularly in the high-income and high-emission states.
So I'm going to point to the 2019 Eurobarometer, for example, as a really interesting place where we find a very high degree of concern among citizens across Europe, with all of the states identifying climate change and the need for climate action as a top priority that should be addressed through public institutions and political parties. And for populations living in lower-income and lower-emitting parts of the world, changing climates are part of the everyday lived realities of populations. They are witnessing, they are living with those increasing droughts and floods and the more frequent storms and weather events and so on. So it's not some abstract future that people living in small island developing states are experiencing - it is their everyday experience. But interestingly, in that Eurobarometer report, although there is a high degree of trust, and recognition indeed of the urgency of climate action, we are also seeing declining levels of trust in public authorities and very low levels of trust in political parties. And given the fundamental task of politicians to direct policy priorities, and of public authorities to implement these, this is a really worrying space.
It's a really worrying development, and not one the scientists can necessarily feed into or direct or indeed control, because it is that politics space. On the question of scientific communication: we know that many scientists have engaged in efforts to more effectively communicate the science to communities and to a much wider range of audiences. And there's been an enormous number of events and investments essentially to encourage scientists to communicate in intelligible and understandable ways to much more diverse audiences. But I'm going to suggest, maybe somewhat contentiously, that I don't think it's sufficient simply to communicate better - just to speak to people all the time. I think it's equally important to ensure that we're listening to audiences and gathering understandings and insights into their values, their fears and their concerns, their suggestions, their ideas, and so on.
So the importance again of diversity and voice - of wider publics in the scientific interrogation process - I think is a really critical component. Scientific solutions have to be operable and intelligible not only in the lab, but very much in the real world. And the background conditions into which scientific solutions are embedded are deeply marked, as you know, by deep and enduring power asymmetries - a social and economic landscape that is distinctly bumpy and uneven, a world where voices do not carry equal weight. And credibility is often linked to social status, to who knows who, and to the position that one holds - not necessarily to knowledge and what one knows. So scientific communication, I fully agree, is really important, but active engagement with communities on interests and values and the viability and desirability of proposed actions is also absolutely essential.
And then just very briefly, to comment on that translational problem. There's really interesting research being done at the moment through the OECD Trust Lab which clearly indicates that trust is a fundamental bedrock of social and economic progress, and of functioning democracies, but it's precisely these systems that are most challenged through the climate debates. So our economic system, which is based upon principles of maximum extraction, exploitation of all possible available resources - both human and natural - and continuous growth, has contributed very heavily to improving human development indicators in all countries over the last number of decades. But it has also resulted in very high levels of income inequality, declining levels of social identity, and detrimental impacts on environmental systems and biodiversity. So how do we shift away from this type of system without causing harm and further hardships? The answer to this isn't simple, and the solutions will not be found in single disciplines or single areas of expertise, such as economics or engineering. So it very much points to that need for more diverse voices across the academy, across the disciplines, but also more transdisciplinary activity through engagements with populations outside of the university setting. And I'm going to hold off saying anything further there, 'cause I think I've gone over my five minutes, but I do have lots more to say about this and I welcome any questions or any comments that anybody may wish to share. Thank you.
LD: Thanks, Susan. That was really interesting. I'm going to segue into one of the questions that's come in at the moment, 'cause it kind of takes up from your point, and it is that issue around the political and the scientific. Martin, one of our attendees, has brought this up, but I think a lot of people are struggling with it at the moment, especially because, I suppose with COVID as well, there is such a mix of the political and the scientific presented even in the same fora or sitting - you know, you might watch a chat show where there'll be a journalist debating with a scientist. So maybe I'll just throw that out there, and Naomi, I know you have a lot to say about this as well.
NO: Yeah, and I've been reading the questions in the comments, and I agree completely with what Susan and Jacob just said. You know, trust is a two-way street, or maybe it's better even to think about it as a roundabout, right? Scientists, I think, have sometimes had a hard time accepting that idea. I think that many scientists have been trained to think that because we worked so hard to develop this expertise in a particular area, we then become the expert, the authority, and it's our job to sort of hand over that information. I call that supply-side science, and we know that it doesn't work. I mean, it works in some contexts, but in general it's not effective, because it's not actually how human relationships are developed. I think Susan's point is spot on, on this idea. We develop relationships through interactions. We develop trust through interactions. So it's not just supplying information. It's also listening. It's also understanding the context in which that information might be used. And of course this is hard for scientists because science is already pretty hard. We ask scientists to do a lot just to be scientists and publish their work and get grants. And now to add this additional component of a more complex, multilayered form of engagement and communication - you know, that's asking a lot. And so I think it is important for those of us who are asking that of scientists to acknowledge that we're asking a lot more of them, and often asking for something that they weren't trained to do - in fact, in some cases asking for things that they were trained not to do, like to think about the value aspects of their work. But I agree it's really essential.
And the way I frame that when I speak to scientists is to say, well, look, you know, we're all under pressure to publish so many papers all the time, but wouldn't it be better to publish one or two papers that really have the effect you want - because you've done the extra work to communicate them in a more efficacious way and to listen to the communities who might actually use the work - than to publish 10 papers that no one ever reads? And I think, when you put it that way - well, it depends on the context, right? But at least in some contexts, there are definitely scientists who really would like their work to have traction. And if a better way of thinking about engagement with diverse communities enables them to do that, I think there is a possibility there for some meaningful change.
JE: One of the pieces that Susan mentioned that was so compelling was the listening dimension, and just that language of listening is really interesting. Oftentimes - and I have to tell my students this - when we use ethics in colloquial language, we simply mean, is it good or bad, right? Is this ethical? Is this good or bad? But ethics as a system, and as a complex reflection on engagement, is about reflecting on the ways in which we interact with one another. And the great feminist ethicist and theologian Nelle Morton uses this language, which I use in my classes quite a bit, that one of the demands of our moment is always that we hear one another to speech. And that kind of relational listening happens not when we're simply offering an opinion, not when we're simply advocating for somebody else and then feeling like we can step away - but what does it look like to actually stand and sit in solidarity with people and hear them? And then they might also hear our concerns, and something different might happen out of that. And I think that's a process that is not just a process of ethics, not just a process of science - it's something deeply human that needs to be incorporated throughout.
LD: Thank you, Jake. Maybe – Susan, were you trying to get in there?
SM: I was just going to say, in relation to the COVID experience that we've all just come through - I do think that's just such an interesting space, and we'll be thinking about this, and working around this, for quite some time to come. But I think to me it also shows, certainly in terms of the vaccinations that have been developed, the best of science, of scientific method, of the scientific community coming together, working together, and actually developing and delivering this within such a short period of time. It also shows that enormous gap between the scientific evidence around a disease and disease transmission, and the wider public health considerations that need to feed into the formation of policy. And we have seen some major errors and mistakes, and we have seen some very, very good and very strong leadership in this space.
So it's been a very, very interesting time for looking at some of the pitfalls around that. But if I could point to one example, the work that was done through the University of Oxford and AstraZeneca is, to me, a beautiful example of the way in which a scientific approach sought to be inclusive, and sought to be based in a firm understanding of its ethical obligation to find a solution that was relevant and effective for the global population, rather than necessarily using this as an opportunity to generate income, or indeed to protect a single population. It's very early stages, and I, like many other people, have just been following this through the news, but the way in which it's been presented, it does look like, rather than just seeking a solution to that very narrow problem, what they actually sought to do was ask: what is the context, the very, very non-ideal context, into which this is going to be placed, and how do we ensure maximum access for the widest possible number of communities across whole countries?
And I think that's a really leading example right now, thinking of the scientific approach and all of the rigor that comes with that, but in a very inclusive way where we're not just thinking about the narrow, specific problem of a single solution to that virus, but understanding how that virus is actually embedded in social communities and systems.
LD: That's a fantastic point. Expanding it out a little bit, you know, we are all here with this kind of strong trust, I suppose, in science, but why isn't science alone compelling enough to convince people? I think about this: my partner smokes and there's no convincing him; he's never going to stop that behavior (he'll kill me for saying that!). But one of the people in the audience, Paul, also brought up the fact that maybe part of the reason it doesn't convince is that very often the conversation from the scientist is uni-directional – it's science to the public – and he's looking to understand the kinds of active engagement that could exist to promote it better. So maybe that's part of the answer to why science alone isn't compelling enough?
NO: Well, I think there are a lot of things going on here. I mean, obviously science alone isn't going to answer our problems, because science alone doesn't answer our problems, right? Science can tell us that if we smoke, the odds of getting lung cancer, emphysema, bronchitis, heart disease, blah, blah, blah, are greatly elevated, but a person could decide, they could say, well, I know that, and I like smoking, and so I'm going to continue to smoke. And personally, and this sometimes surprises people when I say this, I actually have no problem with that, so long as they do it in the privacy of their own homes and don't put other people at risk. Now, I'm a little concerned for you, because the scientific evidence on the harms of secondhand smoke is very great. In fact, the most famous study on this question was the Japanese wives study done many years ago, which was a study of wives in Japan who did not smoke, but whose husbands did, and who had very high rates of lung cancer.
So, in many of these cases, this gets back to the ethical problem. These choices don't just affect ourselves; they also affect the people around us. So I think it's not illegitimate – in fact, it's more than not illegitimate, it's essential – to recognize that social decisions, personal decisions, are never just about the facts. They're always about some complex nexus of the facts, our preferences, our desires, our values, our aspirations, the stresses in our lives – you know, if we find smoking helps us relax, we may decide to smoke. I mean, with alcohol, there's a big scientific fight over whether a little drinking is good or not. I know I drink, and there is some evidence that probably I should... well, we don't know. The science is vague, so it's not a good example, but the point is, it's not wrong that our decisions are made on lots of different axes.
What is wrong is when we don't have access to good scientific information because people are deliberately trying to confuse us. And that's what my book Merchants of Doubt was all about. It wasn't to say that everyone in the world necessarily had to stop smoking; it wasn't for me to say to your partner, or anyone else, you know, you must not smoke. That's not the purpose of the work I do. The purpose of the work I do is to say people have a right to good information. Citizens have a right to good information. And if they don't have that information because people, for ideological, economic, or other reasons, are fomenting deliberate disinformation, then that's profoundly unethical, because it denies people the right to make informed choices. And that's why I get angry and hot under the collar at the merchants of doubt – because they're denying all of us our right to make good choices.
Now, if we have good information and then we choose to make bad choices, well, because they're not bad for us, or because we interpret them as being good choices for us, that's a very different matter. And there, I have a lot of respect for people's right to make choices that might appear bad to me, again so long as those choices don't harm other people, but that's a very big “so long as” because part of the reason why issues like climate change, and smoking, and not wearing masks are so fraught is because they do harm other people. And it turns out because the way we live, people are social beings, we live in families, we live in communities, we travel, there are very few things that we do that don't affect other people. And that's, I think, why these issues become so fraught or at least it's one of the reasons they become so fraught.
LD: That’s very interesting. I'm going to have to read the Japanese wives study! Jake, you’re trying to get in there and say something?
JE: No, no. I think there are two things here. One is this nexus of values and all of the stuff that goes into making these complex decisions – being critically self-aware is one place where, I think, ethics is interesting, because when we're more conscious and self-aware of our values, we can hold others to what they say their values are. If you are engaging in the golden rule and saying, I want to love my neighbor as myself, or something like that, what does that actually look like in this context, and what does it mean to hold someone to account, and to interrogate what that means to them, in such a way that they are actually acting in ethical ways around any particular issue? But also, I think one of the things that's really important to name, about why trusting science is hard, is our histories of power and systemic forms of social injustice, right?
Systemic racism, systemic heterosexism, where folks have gotten the wrong end of misuses of the power of scientific discourse, and therefore trust is not, and probably shouldn't be, readily extended immediately without some interrogation of what communities are actually doing – the Tuskegee study and these historical examples of science gone bad, so to speak. Again, human. But part of the values conversation is not just asking what is valuable here, but in fact asking what it means to build up trust across time. And that's precisely about relationships, and precisely about power analysis.
LD: Right, and maybe this is a good point to bring in a question from Eoin that's come in. I think everyone has maybe skirted around this for a bit? He says: “I know of no university anywhere that is not dependent on either funding from major commercial enterprises or funding from politicians or government, or both. And it's clear that, whether it be Monsanto, the fossil fuel industry, or agriculture, much of the official information is biased. How can we non-scientists know who to trust?” I think this is a really, really interesting point, and I know, Naomi, you've been working on that in a wider context with your new book, but I think this is a really, really difficult thing for people to figure out – what they should be paying attention to.
NO: Yeah, this is a really difficult one, and this is an area where I would really like to see university administrators take it on in a serious way, because I've seen in my own lifetime that most university administrators take the view that, you know, there's no such thing as tainted money except t'ain't enough, right! That all money is good. I mean, Derek Bok, a former president of Harvard, even said as much in a speech years ago: once we get the money, we do good things with it, so therefore it's all fine. Well, I think we know it's not all fine. It's not all fine on a lot of levels. And the recent scandal here in the United States over Jeffrey Epstein funding scientific research here at Harvard is just one of many examples of the ways in which it's not fine.
So I would really like to see a big sustained discussion about this, because I think we have lots of evidence for deeply problematic practices in science. So there's a sort of superficial answer, or maybe not superficial, but kind of a stop-gap answer. As you said, my new book, which is called Science on a Mission, looks explicitly at this question of what difference it made that – in this particular case, I'm studying oceanography – virtually all of oceanography in America in the Cold War was funded by the U.S. military. What difference did that make? So for anyone who is seriously interested in that issue, I'll make a shameless plug for a 700-page book – 200 pages are notes, but it's a long, serious book – and if you're really interested in this question, I think it is very, very illuminating, I hope. I think in the short run, this is another area where the issue of diversity comes in.
I don't think it's necessarily problematic that a university gets money from a private-sector funder, but it is deeply problematic if all of the funding for a particular area of research is coming from a single source. So I think it's really important that universities have policies to ensure that there's diversity of funding, that people who have results that might not be favorable to, let's say, a genetic engineering entity have the opportunity to research, and that there's full, 100% disclosure of all funding, so that citizens can judge for themselves whether they think there could be a funding bias at stake. I think there is absolutely no excuse, under any circumstances, for nondisclosure of funding sources, and I think every university in the world should have a policy – some do, but many don't. And the same with journals: increasingly, scientific journals are requiring all authors to identify their funding sources, but not all do.
So that's a simple thing that could be put in place pretty quickly. The deeper issue of how we fund science is harder. I've heard people say, well, if we didn't take money from X we would have to do less science. And my answer to that is: well, that might be the case, but maybe we're better off doing a little less science, or even quite a bit less, and having science that we know is trustworthy and that earns the respect and trust of our fellow citizens, rather than maximizing science but, in the process, actually losing the crucial trust of the communities that we serve.
LD: That's very interesting. And I'm reminded that the open science movement more broadly is not just about open access to journals; it's much more than that. It's about robust processes and exposing everything you do, from the data you use to your techniques and your protocols. And I think the idea of including funding sources in that kind of openness is really, really interesting. Susan, were you trying to get in there?
SM: It was very much to agree with the comments that have been made thus far, but it's also, I suppose, to call out maybe a little bit more explicitly some of the underlying challenges we're facing. And that's the underfunding of the university sector and the research sector. It is states essentially not fulfilling their obligations to ensure that this sector is properly funded.
And we know that this has been a battle in the Irish context, in the European context, and it is absolutely evident in the U.S. context. The response to that is the financialization of the sector in the way that we have seen, and then the risk that that generates of undermining trust and confidence in the work that is produced in that process. And absolutely, there's an awful lot that can be done through the peer review process and through the leading journals in this space. But I do think that the fundamental problem is to do with a lack of value that is attributed, and essentially that worldview that everything has to be delivered in partnership with private organizations. And again, I think the Oxford–AstraZeneca example is really interesting, but they entered into that relationship on the basis that it would not be for profit.
It's a very, very different structure, a very different arrangement. The vast majority of the time, the research is actually used, and used very, very effectively, to fund and progress private interests. And we have to recognize that. So I think, as public universities – and Trinity is a public university – we do have an obligation to ensure that our research has wider social impact and is in the public benefit. And yet at the same time, we're not properly funded to do that. Indeed, we're heavily encouraged to seek partners who have very deep pockets, whose interests may not necessarily align – I'm not saying that they don't, but they may not necessarily align – with the public interest. So I think the radical underfunding of the sector has to be called out here as a major problem. And it's very much linked with the kind of political economic structure that governs global relations, global economies, and essentially the university sector and the research sector at that international scale.
NO: Yeah, to jump back in on that point, because it's such an important point and I appreciate, Susan, you articulating it that crisply, I'll just give one example to underscore it. When it came out in the United States that Jeffrey Epstein, who was a convicted sex offender, had funded research here at Harvard, many people were upset about that because of the fact that he was a sex offender, and that was offensive, especially because he had access to university buildings where students could have been present after hours. But there was, I think, in a way an actually deeper problem that got very little conversation, which was: why was this man, whose ethics and values were clearly deeply distorted, making decisions about what was funded at Harvard University? And it turned out that he wasn't just a sex offender, although that was obviously bad enough, but he was also a eugenicist, and a racist, and a man who had very retrograde and bizarre ideas about the genetic basis of human traits, something that has been largely discredited by, you know, 80 or 90 years of scientific work. And yet this man was making choices about what we should and shouldn't fund, something that the university encouraged because he was a deep-pocketed individual. So it really creates intellectual as well as moral distortions when we think that, oh, private funding is just great because it helps us do this work – no, it can be deeply problematic. And I think this is a kind of really dark underbelly of a lot of contemporary research, this idea, as you said, Susan, that we should be developing public-private partnerships. Well, yes, in some cases public-private partnerships make a lot of sense, but in other cases they're deeply, deeply problematic.
LD: I think those are fantastic points and I could dive into them further, but I'm going to throw a question from the audience at you, Jake, this time. It comes from Paul, and he says: “I'd love to hear if my scientist colleagues can overcome their narrow perspectives on trust language to contribute positively to the civic discourse, rather than pull away from it under the auspices of segregating the scientific enterprise from all other aspects of human enterprise.” That's an interesting point, Jake; I'm going to throw it at you and see what you have to say.
JE: Yeah, throw what the scientists should be doing at me! But it's a great question. I think that the more we're aware of the role that values and ethics play in scientific discourse and deliberation, the clearer and more important it becomes that ethics as a field of study, as a self-awareness, as a professional discipline, needs to be incorporated in the study of science, right? And in the study of the history of science and the philosophy of science – in the ethical ways that the stories and histories of science have been told. So firstly, there's a professional education component, and that's always difficult, because cramming another course into a program is always a complicated endeavor. But it seems to me that if we're going to try to live in a world that we want to live in, having these aspirational, transdisciplinary conversations – recognizing that climate change is, as a lot of experts are calling it, a wicked problem with all of these deep dimensions to it – there has to be some transdisciplinary and interdisciplinary engagement and solidarity.
And it takes the really awkward thing where we have to actually learn a little bit about each other's disciplines, even though we claim we're not professionals in them. So I think there's an institutional obligation there: thinking about ethics is not just about getting ethical approval for a scientific project. It's actually about engaging the theoretical disciplines, and that's a systemic way. It also needs to be engaged in that way in society. So I think one way that research universities and institutions can help is, in these partnerships that are being made, facilitating forms of ethical deliberation and process. Why are we making this partnership? What are the qualities that we want to involve here? Is this simply an economic instrument, a utilization of knowledge to make more money, or are we actually solving real problems and making the human community more sustainable, more livable, engaging in climate mitigation and resilience, all these things that we need to do so urgently? So there's a multi-scale thing here, in my mind.
LD: Yeah. That's very interesting. There's another question in here that I could equally have pushed your way, but I'll just open it to everybody. It's from Tara, and it's interesting. Tara asks: are there ways that belief and trust have become confused and unhelpful? They're saying that it leads to confusing questions like: do you believe in climate change? And I know you all have various things you like to say about this topic.
NO: This is a very tricky one, because I have two competing instincts in response to this question. On the one hand, we know that language really matters, and the choice of words really matters; words convey different meanings to different people, and part of the whole art of communication and engagement is being sensitive to how words are taken up by different communities in different ways. So I do know many scientists who feel very strongly that we should never say we believe in climate change. We should say we accept the scientific evidence, we accept the scientific findings. And I think that's broadly correct, because I do think the word belief, for many people, is tied up with notions of faith and leaps of faith, and that conveys notions of religious faith and non-fact-based knowledge systems. So I think scientists are right to be sensitive about that and, as much as possible, to try to find the language of accepting evidence or accepting the findings of scientific investigations.
That said, though, because we've been talking about trust, there is a way, at the end of the day, that when we accept the findings of other experts, it is a trust relationship, right? I mean, I have done some research in climate science, but most of my research is in the history of climate science. And let's say, you know, we have this vaccination that we hope will be available soon. If I go and get this vaccination, I'm going to be trusting my colleagues here at Harvard and at Oxford and all the different places where people have worked on this. And I'm going to be trusting colleagues at the FDA who have been part of the review panels. And I'm going to be trusting my physician. And there's going to be a moment at which I'm going to believe that what the scientists have said about this is true, and I'm going to believe that there was no fraud or malfeasance, and I'm going to get this vaccination.
So at the end of the day, we want science to be objective. We want it to be about facts and evidence, and it is about facts and evidence, but there's this core – we can peel away the onion skins, and peel and peel and peel, and at the kernel of it there's always going to be this irreducible element of trust. And I think that's okay, because it's a human activity, and as all of us have been saying, humans are involved in trust relationships. And I think that if we deny that, we put ourselves in this sort of weird position where we have to do all these gyrations to avoid using words like belief, as opposed to saying: no, I do believe it, because the people who are telling me this have done a lot of fine work. And I believe it because I think the science of epidemiology is robust, and yes, I believe it, and yes, I'm going to get this vaccination.
LD: So we have a short amount of time, so I'm going to try to get in two more questions, even though I know we could dig even further into any of these. Taking that up further, Joe has asked: how do we feel about the role of scientific advisors to government in the current climate, especially the tension between that duty or responsibility and, I suppose, the personal relationships that come with being the scientific officers? Anyone want to jump into that?
SM: You know, one of the challenges is around ensuring that there is sufficient diversity of voice in that space, and representation across different disciplines within the scientific community, and particularly within the social scientific community. One of the critiques that I would have, for example, in relation to the conversations around climate action in particular, would be the heavy predominance of an economic lens and lead economists.
And I have no problem with economics. I think economics and economists are absolutely essential, and make an essential contribution to the wider discussion, but it tends to be the one lens that you apply and explore, rather than ensuring that there are other voices represented in that space, ensuring that there are ethicists in that space. Just to give one example, I had a conversation with a good friend of mine who was talking about carbon taxes as a really important economic mechanism or response to address climate change. He was explaining this to me, and I was saying, well, one of the challenges with that is the way in which it can perpetuate inequality – flat taxes are very problematic. It's not that anybody wants to support the continued emission of carbon, but there are wider considerations to think about, rather than just a flat tax that stops people accessing what it is they may need. And his response to me at the time was that those types of considerations needed to come from outside the scientific advice that he would give as a social scientist and economist advising on that particular response.
And to me, I kind of thought, well, actually, no, that's a very, very specific worldview of human beings and decision-making, of who would be affected by this, and of who needs to be part of this wider conversation. So I have to say, I think definitely more diversity of voice across the disciplines is absolutely essential, to make sure that we are not creating circumstances, through that kind of siloed approach, that can essentially lead to unintended consequences, deepening harms and different forms of inequality and exclusion. That may not be the intent at all, but may indeed be an oversight due to hiding or ignoring all those other perspectives, reflections, and considerations. And indeed, it's about recognizing that every position, every discipline, is based upon a theoretical framework with its own underlying set of assumptions, and those assumptions are points that we will challenge one another around, in the best possible way, I think.
LD: Thank you. That was very clear. I think we have run out of time, and apologies to the audience that we're not going to get to every question. It's amazing, when you're having such an interesting discussion, how time flies and you don't really know where it went. For me, two words stood out in this entire conversation. The first word, obviously, is trust, because it's not only the title of your book, Naomi, but also the title of the session. And the second word – Susan, you mentioned it again at the end – is diversity. What you've all done is reinforce, I suppose, even more strongly the need for diversity, and I think that was brilliant to hear.
So, can I thank you all so much for your wonderful insights. Thanks, Naomi. Thanks, Jake. And thanks, Susan. I, for one, could have listened to you for many more hours. As I said at the beginning, I could just sit here and say “go”, and whatever you talk about in this area is just fantastic. I'd also like to thank the audience. We got some great questions and, apologies, we didn't get to them all. But we're really, really delighted that the audience were there in The Tent of Bad Science, which obviously is full of great ideas and great science really! I want to make sure I thank Jenny Daly, Sarah Bowman, and Michael Foley again. You haven't seen their faces on this, but my God, the work they're doing behind the scenes. In fact, we wouldn't even have this event without Jenny, who has been working on this for many months, along with many, many other events. Thank you so, so much again to everyone, and I hope everyone enjoys their evening.
Jenny Daly: Thanks for listening! You can find links to a full transcript of the discussion and some other stuff in the show notes. There's also a recording of the event on YouTube if you want to watch it back. Thanks to Conor Reid at Headstuff for production assistance. And as always, you can send us ideas for future episodes via our suggestion box. We would love to hear from you! Bye for now!