Why the precautionary principle matters and You're sure of a big surprise
1. Why the precautionary principle matters
Andy Stirling
The Guardian, 8 July 2013
http://www.guardian.co.uk/science/political-science/2013/jul/08/precautionary-principle-science-policy
In the first of a series on the precautionary principle, Andy Stirling argues it offers crucial time to think through options
Precaution is arguably one of the most misunderstood and misrepresented issues in the global politics of science and technology. Misunderstood, because precaution is so often wrongly asserted to be unscientific or anti-technology. Misrepresented, because a large part of the resulting stigma can be a systematic – even deliberate – effect of power.
Powerful interests behind a particular innovation can understandably get pretty exercised when challenged by precautionary concerns over their favoured new technology. But these highly partisan commotions need not provoke such existential angst across society as a whole. Precaution does not necessarily mean a ban. It simply urges that time and space be found to get things right.
To see the value of this, we can start by considering history. Take, for example, asbestos, lead, benzene, pesticides, ozone-depleters, or overfishing. In all these areas and many more, early precautionary action was dismissed as irrational by governments, business and scientific establishments alike – claiming there were no alternatives. Yet now, it is agreed on all sides of the debate that levels of risk were initially quite significantly understated. And, in retrospect, there were more viable substitutes than were claimed at the time. Similar questions arise in forward-looking dilemmas of technology choice; around alternatives to nuclear power or GM food, for example.
In a nutshell, precaution reminds us that innovation is not a forced one-track race to the future. Instead – like biological evolution – technological progress entails constantly branching paths. Though often concealed behind science, each involves intrinsically political choices. This requires understanding, rather than denial, of the real nature of uncertainty. Although there exist many versions of precaution, the general gist is that, where there are threats to human health or the environment, scientific uncertainty is not a reason for inaction. This does not compel a particular action. It merely reminds us that lack of evidence of harm is not the same thing as evidence of lack of harm. In other words, the crux of precaution lies in rigour: taking as much care to avoid the scientific error of mistakenly assuming safety as to avoid mistakenly assuming harm.
This in turn hinges on a crucial technical distinction between risk and uncertainty. Risk is a state of knowledge in which we feel confident assigning numerical probabilities. In conventional risk assessment, the onus, burden and levels of proof typically fall most heavily on those concerned about a particular pathway, or who prefer alternatives. The balance of emphasis tends to favour those products with the most powerful backing. Precaution offers to level the playing field by inviting a focus not only on risk, but also on uncertainty. Whether due to incomplete evidence, complexity, divergent values, scientific disagreement, gaps in knowledge or the simple possibility of surprise – uncertainties cannot be reduced to neat numerical probabilities. But they are still crucial to rational consideration – and there are plenty of practical ways to deal with them.
Under uncertainty, then, it is not merely difficult in practice to calculate some single definitive "sound scientific", "evidence based" solution. The point is that it is irrational even to try, let alone claim, this. The notion of exclusively science-based decisions under uncertainty is an oxymoron. How has such confusion come about? Uncertainties, after all, are among the most important driving forces in science. A typical scientist is well aware of the uncertainties in their field, and often strongly motivated by them. Reasoned scepticism and open disagreement about uncertainties are among the most crucial distinguishing qualities of science. Yet when science comes into contact with economic and political power, a strange kind of uncertainty denial develops. This brings us back, at the end, to where this blog began. To understand the rhetorical intensity of so much opposition to precaution, we need to look behind the methodological technicalities and consider the powerful political forces and high economic stakes that often hinge on the outcomes.
It is with some sympathy for beleaguered decision makers in business or regulation that we can understand the often-overwhelming political pressures to justify decisions. This can mean building "policy-based evidence" to assert some pre-decided outcome. Or it can merely mean pressing for an artificially unambiguous "evidence base" to justify any firm decision at all. In myriad ways, this pressure incentivises analysts and independent expert advisers to sidestep precaution and produce more apparently confident and precise "risk-based" prescriptions than their better judgement might suggest. It is not necessary to envisage any conspiracy or bad faith. The effect is more like iron filings lining up in the magnetic field of power. Either way, it is this pressure for justification that explains why the animosity to precaution extends beyond the partisan advocates of particular uncertain technologies to political debates in general.
But, in the end, the picture is quite optimistic. Far from the pessimistic caricature, precaution actually celebrates the full depth and potential for human agency in knowledge and innovation. Blinkered risk assessment ignores both positive and negative implications of uncertainty. Though politically inconvenient for some, precaution simply acknowledges this scope and choice. So, while mistaken rhetorical rejections of precaution add further poison to current political tensions around technology, precaution itself offers an antidote – one that is in the best traditions of rationality. By upholding both scientific rigour and democratic accountability under uncertainty, precaution offers a means to help reconcile these increasingly sundered Enlightenment cultures.
Andy Stirling is professor of science and technology policy at the University of Sussex. This is the first in a series on the precautionary principle. Come back later this week to see pieces by Tracey Brown, Steve Fuller, and Jack Stilgoe. On Friday, we'll pull out readers' comments and give the contributors a chance to respond to one another.
---
2. You're sure of a big surprise
Jack Stilgoe
The Guardian, 10 July 2013
http://www.guardian.co.uk/profile/jack-stilgoe
Our series on the precautionary principle has revealed some important lines of debate
Precaution, like any policy idea, means little until we give it substance. Of course we should take precautions. Of course we should be cautious. The questions, as ever, relate to how, when and who. The three blog posts so far this week on precaution, from Andy Stirling, Tracey Brown and Steve Fuller, have opened up some of the lines of debate.
Andy Stirling began by describing the thinking behind the precautionary principle – the idea that we should emphasise uncertainty rather than risk and shift the burden of proof towards those who propose rather than oppose particular technologies. We know all too well the tendency to ignore uncertainty. We enter into technological experiments while pretending their outcomes are known, treating future generations with contempt, hoping that technology will get them out of the mess we create, when the lessons of history give us little cause for optimism. The paradox of the Teddy Bears' Picnic – "you're sure of a big surprise" – is all too real.
Following Stirling's reasoning, the case for precaution seems incontrovertible. It makes sense to apply scientific scepticism to technological choice. It makes sense to keep options open and appraise them as fully as possible. But we can sympathise too with Tracey Brown, who argues not about the spirit of precaution, but its practice. Brown is right to point out that, in practice, an emphasis on uncertainty can privilege those who oppose. While science maintains a degree of control over scientific evidence, uncertainty is anyone's game. This is why precaution is democratically important, but it is bedevilled by all of the usual troubles of democracy. Advocates of precaution can be as opaque as those whose hackles are raised by the term (although they would argue that the other camp has all the power).
This is not to say that I agree with Brown. I don't. Her classic litany of technological wonders that would have been prevented by precaution, from the Green Revolution to the Internet, is misplaced. How are we to know what might have been different, or what other possibilities might have been realised in the more enlightened policy arrangements to which Stirling points? Perhaps, with a precautionary approach, the Green Revolution could have been more sustainable, or more beneficial to the African countries that didn't in fact share in the cornucopia.
But Brown's emphasis on the reality of precaution is important. Principles don't prevent bad decisions. They are merely a way of structuring decision-making, rebalancing power and redirecting attention. The decisions we face contain difficult trade-offs. In many cases, there are no ideal options. The status quo may stink, so conservatism has consequences. As Brown says, we "have to face the problem". The question, therefore, is whether precaution can become a tool for creating new, sustainable possibilities. Brown suspects that "its only tool is to stop a thing". Precaution aspires to go beyond prohibition, but one can understand the frustration to which Steve Fuller points in his piece.
Fuller turns our attention towards recent discussions of the "proactionary principle". A naïve scan over the proactionary principle suggests little to disagree with. It proposes themes that are warm if not completely vacuous. But we should not extract it from its origins. It comes, as Fuller describes, from a bunch of transhumanists with, shall we say, a very particular idea of what innovation is for – namely the technological improvement of the human condition. To elevate their principle to the level of, and compare it with, the flawed but well-conceptualised precautionary principle is ridiculous. Fuller even appears to admit as much with his reference to Seasteading. Seasteading is a terrifying hint of a real agenda – a contempt for democratic politics – that makes precaution seem not so bad after all.
P.S. Reading the latest Dan Brown novel took numerous hours and brain cells that I will never see again. So that you don't have to do the same, its baddies are transhumanists and (sort of) seasteaders. Enough said.
Jack Stilgoe teaches science policy at University College London and is an affiliated research fellow at the Centre for Science and Policy, University of Cambridge.