The HPS Podcast - Conversations from History, Philosophy and Social Studies of Science

S4 Ep 4 - Darrin Durant on 'Expertise'

HPS@UniMelb Samara Greenwood Season 4 Episode 4

Today Carmelina is joined by Dr Darrin Durant, a Senior Lecturer in HPS at the University of Melbourne specialising in Science and Technology Studies. Darrin's research covers two seemingly distinct areas: nuclear energy and expertise. Yet nuclear energy and other contested public policy issues are informed by experts on both sides of the debate. As Darrin explains in today’s episode, there are different types of expertise, and we must learn to better judge who is, and who isn't, an expert.

Using real-world case studies, Darrin discusses the problems around creating public policy where conflicting scientific evidence or scientific uncertainty exists. By understanding how conflicting positions are treated when differing expert opinions arise and by understanding the different types of expertise at play, Darrin argues that policymakers and the public are better equipped to make active judgements about the experts involved and the contentious issues under discussion. 

Thanks for listening to The HPS Podcast with current producers Samara Greenwood and Carmelina Contarino. You can find more about us on our blog, website, Bluesky, Twitter, Instagram and Facebook feeds. Music by ComaStudio.

This podcast would not be possible without the support of the School of Historical and Philosophical Studies at the University of Melbourne.

HPS Podcast | hpsunimelb.org

Hello and welcome back to The HPS Podcast where we discuss all things History, Philosophy, and Social Studies of Science for a broad audience. I’m Carmelina Contarino, your host, and today we are joined by Dr Darrin Durant, Senior Lecturer in HPS at the University of Melbourne. Darrin is a specialist in Science and Technology Studies, and his research covers two seemingly distinct areas: nuclear energy and expertise.

Yet contested public policy debates, such as those around nuclear energy, are informed by experts on both sides. How, then, do we determine who is and who isn't an expert?  Using real-world case studies, Darrin discusses the problems around creating public policy where conflicting scientific evidence or scientific uncertainty exists. Darrin explains that understanding the different types and roles of expertise allows policymakers and the public to make active judgments about the experts involved and the contentious issues under discussion.

Carmelina Contarino: Hi Darrin, thanks for joining us on the podcast, it's great to have you with us. 

Darrin Durant: Thanks, Carmelina.

[00:01:15]

Carmelina Contarino: Firstly, we'd love to know a little about your background, how did you come to HPS? 

Darrin Durant: I originally went to the University of Wollongong, which is just south of Sydney, to do journalism. There were a number of streams in that degree, and one of those streams was Science and Technology Studies. So, I essentially just shifted from journalism into majoring in Science and Technology Studies, and at that time the University of Wollongong had the largest STS program, I believe. So, I worked under people like Jim Falk and John Schuster. And you know, when I went to graduate school, I went to the University of Toronto. They gave me a large Connaught Fellowship, which was nice. Most Australian students have to hunt around for fellowships and often have to go overseas. So, I went overseas as well. Within the interdisciplinary field of Science and Technology Studies, most come from other kinds of disciplines. They come out of straight history, or they come out of sociology maybe, or they come out of a policy field, or a communications field, some out of the sciences proper, whereas I was always an STS scholar.

There was a key moment, I think, when I was at the University of Toronto, and I was finishing up my doctorate. I was already teaching at York University, which is in Toronto as well, and I realized that, of course, I wanted to have an opportunity to come back to Australia, and so I couldn't simply be a specialist in nuclear waste disposal, which is the doctoral research that I'd done. I realized I needed to branch out from there. Which is fairly instructive, I think, for younger scholars: to be marketable, you know, you have to be portable in some ways. So, I had branched out into looking at theories of expertise, because I thought, okay, that's not going to be dependent on any one particular national context for its case study, and it's going to be something that's portable across national borders. And that was the reason that I was eventually hired at the University of Melbourne.

[00:03:07]

Carmelina Contarino: So today we're talking about expertise, and I was wondering if you could start by giving us a little bit of background - are there different types of expertise?

Darrin Durant: There are different accounts of what expertise is itself. There is a realist account, there's an attributional account, and there's a relational account. So, a recent book of mine, 'Experts and the Will of the People', came out in 2020; it was co-authored with Harry Collins, Rob Evans and Martin Weinel, and there we're setting out a realist account. So, the realist account of expertise focuses on expertise as a property that is possessed by a person, as part of a group. Now, that skill can be various degrees of tacit knowledge, articulated knowledge and demonstrable practice. That skill is going to be embrained or embodied, as we say, and again, it can be tacit as well. But where does the skill come from? It's acquired in the act of socialization. You are socialized into a narrow domain practice, and as you're socialized into that particular practice, you're not just gaining a cognitive skill or a practical skill, you're also acquiring social knowledge about the field. That social knowledge of the field becomes a key component of the situated, contingent judgments that an expert makes under conditions of contingency and uncertainty, and it gives you a clear way of distinguishing between different types of expertise. So, we would say that there is a contributory expert who is able to contribute to a particular domain or practice, and their contributions are directly accepted by their peers within that particular domain of practice.

Then there can be interactional expertise - you may never have gone through that socialization process, but you are able to do a lot of reading and do some kind of immersion or participation in that particular field, and you can converse concretely with contributory experts in that field and ask relevant questions of them. And then of course it gives you a clear criterion of the non-expert. The typical way of thinking is to think that they can't do something and they don't know something, so we're focusing only on the cognitive skill and the practical dimension. We would say that the non-expert maybe can't do something, maybe doesn't know something, but crucially they lack the socialization process. By lacking the socialization process, their judgment under conditions of uncertainty and contingency is qualitatively different than either the interactional or the contributory expert.

So, the realist account gives us a way of thinking through who we might be thinking are relevant experts, and it also gives us a relatively clear pragmatic demarcation of who is not an expert because of the lack of that socialization process. That's consistent with the development of History and Philosophy of Science, and Science and Technology Studies since the late 1960s, developing out of Kuhn's work, because we're thinking about expertise as a social practice. It's a collective social practice, rather than simply an individual practice.

[00:06:16]

Carmelina Contarino: So, we can approach the idea of expertise from different standpoints, but what difference does that make when experts are brought forward on both sides of discussions or policy debates? How can the public discern who is and who is not an expert?

Darrin Durant: I said that there were three accounts of expertise, and the attributional and relational accounts probably focus more on that particular question than the realist account in some ways. So, the attributional account of expertise starts with the problem of recognition. What counts as an expert claim and who counts as the relevant expert is an achieved status in any given social conflict. And then as a case study methodology, the attributional account gives you a set of tools for mapping the contours of social divisions amongst groups and individuals about who they take to be an expert, why they take them to be an expert, and what claims are thought to be expert claims, and why they're taken to be expert claims.

Now, I do think that it ends up running into a philosophical problem. The philosophical problem is idealism. The idealist account says that, for instance, take money: what money is, is what social groups take to be money and treat as money, and when they change how they treat something as money, it changes whether it counts as money. The idealist account, therefore, works very well for social properties and social statuses like money and leadership. But of course, the realist account wants to treat expertises as more than attributed social statuses. It wants to treat them as also containing a socialization process and skills that are possessed. The attributional account starts to run into trouble, because what counts as an expert and what counts as expert claims can devolve down to a whole random set of individuals and what they want to take to be expertise. It makes it very difficult to extract a sorting principle that might be relevant to decision making under time pressure or making normative assessments of who you want to trust and find credible.

Now, the relational account does try and solve some of those issues because it joins parts of the attributional account and the realist account. The relational account treats expertise as a distributed property that's made in the way that social groups build networks and alliances. A good example for the relational account would be Gil Eyal's book, 'The Crisis of Expertise'. Now, I think that the realist account and the relational account can cooperate. But I think at a certain point, if we want to ask how to think concretely about the ways that we might want experts and citizens to engage with each other in policy contexts where decisions are called for, and where we might be moving beyond deliberation and into the intricacies of decision making, we need to move beyond the attributional account, because it's more of a mapping technique of the differences there. And we need the resources of either the realist account or the relational account.

[00:09:12]

Carmelina Contarino: Okay. So, in these difficult situations where we can't reach consensus on policy issues, how does democracy manage tensions between experts and the public? 

Darrin Durant: For thinking through the tensions between experts and citizens, and how we might combine these different, obviously overlapping, groupings and integrate them into decision-making contexts, we probably have to have a look at some canonical case studies within Science and Technology Studies where these issues have been worked out. Sometimes the way that these case studies are mobilized is not always giving us the right insights into solving the integration problem. So, one canonical case study that's referred to here is always Brian Wynne's study of the Cumbria sheep farmers. So, Chernobyl has melted down, there are radioactive plumes floating across northwestern Europe, and they fall onto the Cumbrian fells in the north of England. This is a sheep farming area, and there's also a nuclear fuels reprocessing centre, Sellafield, so the two main occupations are sheep farming and nuclear work. These radioactive plumes deposit caesium into the grasslands, and the sheep farmers cannot sell contaminated meat.

Now in Wynne's version of this story, this is an episode in which nuclear scientists working for the Ministry of Agriculture, Fisheries and Food were tasked to go into the Cumbria area and to inform or advise the sheep farmers about what to do. And as the scientists rolled in, as the story gets told, they were arrogant, they knew everything, and they gave perfect reassurance. They said there's no risk whatsoever. A couple of weeks later, they changed their mind: there is a risk. And then a month or so after that, they changed their mind again: the sheep are continually being re-contaminated. So, we've moved from don't worry, there's nothing to see here, there's no risk, to there is a risk, and we don't really quite know why the risk is persisting, but it is. And as Wynne tells the story, the scientists were confident and certain every single time they changed their mind, and that obvious contradiction is leading to distrust from a public that has been given false reassurances and is seeing through them. Now Wynne's story is mobilized by others to say that the sheep farmers had sophisticated technical knowledge that was being ignored.

I want to suggest that that's not what Wynne was actually showing. Wynne says that there were two mistakes that were made in this particular case. The first mistake was that the scientists couldn't figure out why the sheep were continually re-contaminating. They wanted to do a controlled experiment: to pen the sheep in and spread some bentonite on the ground, which would absorb the caesium. The sheep farmers said to them, this won't work, the sheep will not eat if they're penned in in this way. And of course, the sheep farmers were correct, the scientists were wrong, and so the experiments didn't work.

The second mistake was the nature of the soils in the area. Now, the scientists had models of how caesium was chemically immobilized in soils, and they had these models based upon alkaline clay soils. It turned out that in the area the soils were acidic and peaty, and caesium is chemically immobilized in alkaline clay soils but remains chemically mobile in acidic peaty soils. And so, Wynne says there was an assumption there, we would call it an auxiliary assumption, that was made about the type of soil, and the scientists hadn't stopped to check this.

Now, what's interesting about those two particular errors is that in the first case, what we're looking at is local knowledge possessed by the farmers that was ignored and therefore not integrated into decision making. The problem is the failure to integrate local, specific knowledge. However, did the sheep farmers have that particular knowledge because they were lay experts? No, actually, the sheep farmers had contributory expertise with regard to sheep farming. They were not lay, they were experts. What was happening at that moment was the failure to integrate contributory expertise, but this case is routinely reported as if it represents lay knowledge. It doesn't at all; it actually represents a specific zone of contributory expertise.

If we generalize from the case and say it was lay citizen knowledge not being integrated, we're overlooking the social distribution of contributory expertise itself, and we're not paying attention to the different kinds of social experiences that may generate that contributory knowledge. So, we may be privileging formal training in a university setting or a laboratory setting as opposed to informal training in the field, in practice. Part of thinking through integrating citizens into decision-making contexts is being aware of the different kinds of socialization processes that can produce contributory knowledge, and not rendering informal practical experience invisible in our process.

Now the other example, about the soils, is also interesting, because in the soils case what's going on is that there's an auxiliary assumption that wasn't checked. Now, do we say that we don't want expertise to leave any auxiliary assumption unchecked? If our answer to that is yes, you can kiss goodbye to a lot of our contemporary science that is working on ill-structured or wicked problems that require modelling and projections into the future. All of that work, of course, is going to involve assumptions, and of course, there are going to be assumptions that you can't always check; maybe you don't even know them.

[00:14:39]

Carmelina Contarino: So, part of the problem is that as a public we tend to expect our experts to know everything there is to know about a given area. 

Darrin Durant: We are expecting the experts to be perfect, and we're expecting certainty at that moment. Now that expectation of certainty comes from both the scientific community and the publics. It doesn't come from just one of those sources. Scientists are quite fond of saying we know exactly what we're talking about, and Wynne was talking about that very false reassurance dynamic. Citizens can also expect scientists to be perfect, in the sense that they can look at the soils case and say, how could you have possibly overlooked this auxiliary assumption? Well, that's just the nature of auxiliary assumptions. Sometimes they cannot be checked because of temporal processes. Sometimes they cannot be known because of social processes. So, it is unrealistic to expect to always be able to check every single assumption at that moment, especially in tense decision-making contexts, where the temporality is determined by politics rather than by science.

Under those conditions, Wynne's complaint that there was something drastically wrong in the missing of that assumption, if pushed to its extreme, actually generates the kind of dynamic that we see in merchants of doubt cases. Oreskes and Conway's book of that title, Merchants of Doubt, talked about this. In merchants of doubt cases, usually commercial industrial science claims that unless you can show absolute certainty and that you've checked every single possible auxiliary assumption, there's no reason to regulate their action. So, Wynne's complaint here about the soils, pushed to its extreme, actually places us at a moment where citizens are automatically being disavowed by merchants of doubt practices.

[00:16:22]

Carmelina Contarino: So, when there's a failure to integrate, does the problem lie with the public or with experts?

Darrin Durant: Well, the easy answer is to say both, depending on the case. Let's have a look at another canonical case, which is Steven Epstein's analysis of AIDS activists in the US. These case studies came out in the early 1990s and became a 1996 book called Impure Science. Epstein's case shows us some of the dynamics that we want to look at when there's a failure to integrate, and again, Epstein's case is routinely misreported through rather bland claims that citizens have sophisticated knowledge, because that can also lead to missing the dynamics that Epstein was actually talking about.

So, in this case, Epstein is telling a story of US AIDS activists who managed to successfully change clinical trial methodology for drug testing for AIDS patients. Sometimes it's said that these AIDS activists were young kids thrown out of homes and lacking education because of, say, homophobia in the US. That's not what Epstein shows at all. What Epstein shows instead is that there was an existing gay movement within the US that had a lot of cultural capital. It was populated by a lot of professionals, teachers, scientists, bureaucrats, and that gay movement had the cultural capital to say: take our diversity and the violation of our autonomy, take that seriously as a political project. So, AIDS activists piggybacked on the cultural capital of that existing gay movement. When it came time to start intervening within policy debates about how to run clinical trials for AIDS drugs, what Epstein shows is that those AIDS activists were able to read the biomedical literature.

Now in our terms, they acquired interactional expertise that permitted them to talk sensibly with existing biomedical researchers. And not only were they reading the biomedical literature, they were paying close attention to divisions within the community that already existed in and around what were called pure and messy clinical trials. In a messy clinical trial, you may have drug cocktails. You may have patients who are in multiple trials. And of course, if you get something that works in that context, you don't know exactly why it worked, just that it did. So, you would have to go back and do a pure clinical trial to try and sort out one factor from another. But if you're that AIDS patient and you are dying, you don't really care about the pure trial, you care about efficacy at that moment.

So, there's a powerful ethical argument coming into play to change the clinical trial methodology as well. And biomedicine was also relatively open because, of course, clinical trials are going out to patients and trying to increase their representativeness. So, those AIDS activists were able to make powerful political arguments about what counted as representative. They knew the diversity in their field, and they knew that some clinical trial methodologies were not paying attention to that diversity of their community.

Now, all of this story is very different than the simple catchphrase of "AIDS activists brought sophisticated technical knowledge, and therefore changed science at that moment". What you've actually got is a number of factors that tell us about the degree to which integration may be successful. One factor being, what is the relative degree of openness of the scientific community itself? And the biomedical community in this case was much more open than the story that Wynne tells about nuclear scientists. Those sheep farmers already distrusted those nuclear scientists because of a long social history with the Sellafield Nuclear Fuels Reprocessing Centre. So, the degree of openness of the science is different across the two cases.

The other thing is the existing cultural capital of the social groups themselves. In Wynne's story, the sheep farmers had quite a complicated kind of relationship with the nuclear scientists. There were two main employers in the area, and there were criss-crossing social relations that made both trust and distrust present within a fraught social engagement.

Over in Epstein's case, we're talking about the cultural capital of a relatively powerful social movement that AIDS activists are able to draw upon, so they come in with some cultural capital at that moment. And you take those particular factors - the degree of openness of the scientific community, the cultural capital that the social movement may come in with, and the demonstrated capacity that they may show for developing interactional expertise, for instance - and they're telling us about whether there is integration. So, once you compare across those two cases, the Cumbria sheep farmers and the AIDS activists, you can see what shapes integration. And it's not just that you happen to have raw technical expertise. It's a broader set of factors that we should be focusing upon in order to improve the chances of integration.

[00:21:09]

Carmelina Contarino: We seem to have very different cases where on the one hand, in the Epstein case, there is cultural capital married with interactional expertise and an openness by the scientists to discuss and engage, and on the other, in the Wynne case, contributory expertise and a closed attitude between the scientists and the contributory experts, the sheep farmers. Does acquiring expertise ensure that a member of the public will be heard?

Darrin Durant: Again, we can turn to canonical cases, this one a little bit more recent, by Suryanarayanan and Kleinman, and it's about colony collapse disorder amongst bees, which became a book in 2017 called The Vanishing Bees. Here we have this case of a conflict between toxicologists and commercial beekeepers over the cause of colony collapse disorder. No one is denying that the bees are dying, it's just why are the bees dying? The toxicologists believe that it's probably the practices of commercial beekeepers, whereas commercial beekeepers believe that it's long-lived pesticides.

The story here for colony collapse disorder is therefore about two epistemic forms: toxicologists emphasising formal, lab-based, single-factor explanations, and commercial beekeepers emphasising informal, field-based, multi-factor explanations. One of the things that interested me in the case is the lack of impartiality that is extended towards toxicologists and commercial beekeepers in most of these analyses. The toxicologists, who are the more powerful group, they're the overdog, and the commercial beekeepers, they're the underdog, they're the less powerful group. The epistemic form of the commercial beekeepers is often described as, you know, they've got real knowledge, it's practical, it's based upon real-world experiences, and in Sainath and Dan's analysis, the toxicologists are described as having faulty knowledge that doesn't attend to real-life context. And from an STS perspective, we've got a relatively clear case of violating our impartiality about what we take to be the truth. In that case, it's the possession of some real-world, true factual knowledge that is mobilized in order to say that the commercial beekeepers are being unjustly ignored.

What Sainath and Dan end up saying is that just because you've identified someone's expertise, that doesn't ensure or guarantee that that group will be heard. Now there's an implication there that when you've got conflict among citizen groups, or conflict among citizens and experts, our guiding principle should be equal influence. So, I wrote a paper called 'The Undead Linear Model of Expertise', where I was analysing this case, and I want to say that equal influence is not as democratically ideal as people think. The implication is that every single time you find unequal influence across a plurality, you think there's some kind of social distortion or bias going on. This puts Science and Technology Studies back to what we call wave one, a kind of rationalist philosophy of science, where someone has the truth and other people should listen to it. So, when we're thinking about whether we should ensure publics have influence, we can want it, we can encourage its input, but to guarantee it is to run into problems with a whole bunch of other egalitarian goals.

[00:24:27]

Carmelina Contarino: Okay, if you were to give one final piece of advice to the public about how they can determine who the expert is in a given situation, or how they can interact in a meaningful way with a policy discussion, what would that be? 

Darrin Durant: Well, I think there are two pieces of advice. One is to imagine the active judgments that citizens can make under conditions of complexity and uncertainty and conflict between expert groups. In these situations, we often worry that the citizen is passive and disempowered. So of course, we want to look for cases where the citizens are in fact making active judgments. You could say there are three types. The first is judging the substantive claim itself, and most of us do this about laws and rules all of the time.

An example I often use is speeding laws. So, for anyone who's listening to this and not in Victoria, Victoria has extremely precise rules around speeding. There are lots of fixed cameras, and you can't be more than about three kilometres per hour over the speed limit, otherwise you'll be ticketed by one of these cameras. Now we can judge the validity and the merits of that particular law, and it doesn't necessarily determine our compliance. I can think that's a really stupid way to run a traffic system, but I'll comply anyway.

So, we often find ourselves judging the merit of rules, or laws, or claims, regardless of whether we're an expert and regardless of whether we are or are not going to comply with it. Of course, the more technical something becomes, and especially the more complex that technical problem becomes, at that moment, citizens are starting to struggle to make those judgments of merit. They still make them, but it becomes more and more difficult.

One theorist within Science and Technology Studies, Stephen Turner, called it the inevitable problem of discursive asymmetry. Experts are always able to convince others, whereas non-experts struggle to convince experts. So, the discursive asymmetry problem becomes more and more manifest the more technical it becomes. But you can still be making active judgments in this regard. You can also be making active judgments about the source of the claim - about the values and the commitments, and the social conflicts or conflicts of interest, that might be animated by the source of a particular claim. You can ask how closely or how distantly they are aligned with you, and you're making an active judgment there about credibility and trust. And, of course, you can also be making active judgments about the authority relation itself. Just what kind of social relation does that authority have with you? Is it transparent? Is it open? Is it accountable? You can be making active judgments about the nature of that authority relation. So, we should be looking at the way that citizens can make active judgments across those dimensions to increase their chances of being integrated into deliberation and decision making.

[00:28:19]

Carmelina Contarino: Darrin, thank you so much for coming on the podcast today, it's been an absolute pleasure having you. 

Darrin Durant: Thank you, it's been great to be here. Thanks.

Thank you for listening to season four of The HPS Podcast. The details of today's conversation, including the transcript and show notes, are available on our website at www.hpsunimelb.org

You can join the discussion on our social media pages, including Bluesky, X, Facebook and Instagram, and follow us for bonus material and updates from the world of HPS.

We would like to thank the School of Historical and Philosophical Studies at the University of Melbourne for their ongoing support. And thank you, our listeners, for joining us in the wonderful world of HPS. We look forward to having you back again next time.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

The P-Value Podcast - Rachael Brown
Let's Talk SciComm - Unimelb SciComm
Time to Eat the Dogs - Michael Robinson: historian of science and exploration
Nullius in Verba - Smriti Mehta and Daniël Lakens
Narrative Now - Narrative Now
On Humans - Ilari Mäkelä
Simplifying Complexity - Sean Brady from Brady Heywood
Working Fathers - University of Melbourne