A UK parliamentary select committee is leading an onslaught against Europe's GMO regulations, warns Claire Robinson
As part of the UK government's unrelenting push to dismantle Europe's GM food regulations, the House of Commons Science and Technology Select Committee is conducting an inquiry into GM foods and the way they are regulated in Europe under the precautionary principle.
The intended outcome is almost certainly to replace the precautionary principle, which is currently built into European law, with something along the lines of the 'innovation principle', a concept pushed by corporate interests.
This initiative comes conveniently in the midst of the TTIP free trade negotiations, which aim to do away with Europe's process-based GMO regulations – that is, regulation based on the method used to produce the GM plant, which requires that all GMOs be labelled as such.
The objectivity of this inquiry is seriously in doubt. The Terms of Reference on the Committee's website state that "European Commission reports" have found "no scientific evidence associating GM organisms with higher risks for the environment or food and feed safety".
The "reports" referred to almost certainly include a report by two divisions of the Commission, "A Decade of EU-Funded GMO Research". Some of the report's editors claim in the introduction that the research projects under this EU initiative found no evidence that GM was any more risky than non-GM breeding. What they do not say is that these projects didn't even look at the safety of any GMO that's actually in the food and feed supply!
The authors of GMO Myths and Truths – myself included – searched this report for hard evidence of GM food safety and found that it cited only five short animal feeding studies performed under the EU-funded research project and published in peer-reviewed journals.
Two of these studies were carried out with a GM rice expressing a protein known to be toxic to mammals, in order to ascertain that the methodology used was sensitive enough to detect toxicity of a comparable level.
None of the studies tested a commercialized GM food; none tested the GM food for long-term effects beyond the medium-term period of 90 days; all found differences in the GM-fed animals, some of which were statistically significant; and none could draw conclusions about the safety of the GM food tested, let alone about the safety of GM foods in general. The EU research project therefore provides no evidence that could support claims of safety for any single GM food, or for GM crops in general.
It is worrying that the Committee is misleading the public, scientists, and EU regulators by making claims about GMO safety based on this disingenuous report.
Fortunately, Prof Andy Stirling of the University of Sussex brought some much-needed balance to the Select Committee inquiry, standing up for the precautionary principle (item 1 below). In a separate article, he explains why the precautionary principle is important (item 2).
1. Making choices in the face of uncertainty
2. Why the precautionary principle matters
---
1. Making choices in the face of uncertainty
SPRU - Science Policy Research Unit, November 2014
http://www.sussex.ac.uk/spru/newsandevents/reports/innocracy
* SPRU academic Professor Stirling contributes to the Government Chief Scientific Adviser's first ever annual report with a chapter that calls for the strengthening of innovation democracy
On 19 November 2014, the Government Chief Scientific Adviser, Sir Mark Walport, launched his first ever annual report, which includes a chapter by Andy Stirling, Professor of Science & Technology Policy at SPRU and co-Director of the ESRC-funded STEPS Centre.
In his contributory chapter, Professor Stirling criticizes the tendency in conventional debates on new technologies to treat supporters as simply ‘pro-innovation’ and critics as ‘anti-science’. Such language is routinely heard, for instance, in controversies over GM foods, new chemicals or nuclear power.
According to Professor Stirling, “the problem is that this misses the single most important point about innovation. Like other areas of policy – the key issues are about choosing between alternatives. To reduce this to simply being ‘for’ or ‘against’ the particular choice favoured by the most powerful interests is both irrational and anti-democratic”.
Referring to torture, weapons of mass destruction and financial fraud, Professor Stirling points out that not all innovation is necessarily positive. Any particular innovation is typically ambiguous – open to being viewed in different ways. Therefore, Stirling argues, “whether any given innovation is preferable to the alternatives is not just a technical issue, but a fundamentally political question. To pretend that this is simply about ‘science-based’ evidence – with no room for different social values – is also undermining of democracy”.
The chapter, entitled ‘Making choices in the face of uncertainty: strengthening innovation democracy?’ (chapter 4), explores the case for more mature debate and more reasoned decision-making. Across a range of areas, if we are to secure a future for all, there is a need to treat alternative priorities, resource allocations and innovation options in much more balanced and transparent ways.
For Stirling, the issues are not just about how fast to go, or even what the risks or benefits might be, in pursuing some supposedly single option (like GM foods). The real questions are about how privileged innovations can quickly get ‘locked in’ and alternatives ‘crowded out’. In the case of sustainable global food production, the chapter details a wide range of alternatives that even UK government support suggests are often preferable to GM in their potential.
On the same day, Andy Stirling and Paul Nightingale (also SPRU) gave evidence at the fourth session of the Commons Science and Technology Select Committee’s inquiry into genetically modified (GM) foods and the way in which these are regulated at European level under the precautionary principle. See the resource pack ‘GM food and the precautionary principle’.
In both his chapter and evidence to the committee, Stirling highlights the importance of more democratic institutions, practices and debates around innovation. Rather than reducing everything simply to ‘risk’, much more attention needs to be given to unquantifiable uncertainties – highlighting the value of more responsible, participatory and precautionary methods for assessing alternative choices.
Stirling also argues for much greater attention to diversity – both in the portfolios of options that can be supported and in the plurality of perspectives to take into account. There exist a range of different practical methods for more effectively addressing these issues, but these also tend to be neglected in simplistic polarizing ‘pro’ / ‘anti’ debates.
The Government Chief Scientific Adviser’s report, ‘Innovation: Managing risk, not avoiding it’, aims to help improve decision-making in regulation and innovation policy. It is hoped that the report will promote discussion and a regulatory culture surrounding risk in which robust scientific evidence is openly considered alongside political and other non-scientific issues in shaping policy.
A PDF of Prof. Stirling's chapter 'Making choices in the face of uncertainty: strengthening innovation democracy?' is available for download.
A shorter PDF version of Stirling's chapter, 'Democratising Innovation', is also available for download.
---
2. Why the precautionary principle matters
Andy Stirling
The Guardian, 8 July 2013
http://www.theguardian.com/science/political-science/2013/jul/08/precautionary-principle-science-policy
* In the first of a series on the precautionary principle, Andy Stirling argues it offers crucial time to think through options
Precaution is arguably one of the most misunderstood and misrepresented issues in the global politics of science and technology. Misunderstood, because precaution is so often wrongly asserted to be unscientific or anti-technology. Misrepresented, because a large part of the resulting stigma can be a systematic – even deliberate – effect of power.
Powerful interests behind a particular innovation can understandably get pretty exercised when challenged by precautionary concerns over their favoured new technology. But these highly partisan commotions need not provoke such existential angst across society as a whole. Precaution does not necessarily mean a ban. It simply urges that time and space be found to get things right.
To see the value of this, we can start by considering history. Take, for example, asbestos, lead, benzene, pesticides, ozone-depleters or overfishing. In all these areas and many more, early precautionary action was dismissed as irrational by governments, business and scientific establishments alike – claiming there were no alternatives. Yet now, it is agreed on all sides of the debate that levels of risk were initially quite significantly understated. And, in retrospect, there were more viable substitutes than were claimed at the time. Similar questions arise in forward-looking dilemmas of technology choice; around alternatives to nuclear power or GM food, for example.
In a nutshell, precaution reminds us that innovation is not a forced one-track race to the future. Instead – like biological evolution – technological progress entails constantly branching paths. Though often concealed behind science, each involves intrinsically political choices. This requires understanding, rather than denial, of the real nature of uncertainty. Although there exist many versions of precaution, the general gist is that, where there are threats to human health or the environment, scientific uncertainty is not a reason for inaction. This does not compel a particular action. It merely reminds us that lack of evidence of harm is not the same thing as evidence of lack of harm. In other words, the crux of precaution lies in the rigour of taking as much care to avoid the scientific error of mistakenly assuming safety as to avoid mistakenly assuming harm.
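To make that last logical point concrete, here is a minimal, purely hypothetical sketch (not from Stirling's article; the effect size, group sizes and all numbers are invented) of why a small, short study can easily fail to detect a real adverse effect: the absence of a statistically significant finding is not evidence that no harm exists.

```python
# Hypothetical illustration only: a real adverse effect is built into every
# simulated study, yet small studies usually fail to detect it.
import math
import random
import statistics

random.seed(1)

def t_statistic(n_per_group, true_effect, sd=10.0):
    """Simulate one two-group study and return a Welch-style t statistic."""
    control = [random.gauss(100.0, sd) for _ in range(n_per_group)]
    treated = [random.gauss(100.0 + true_effect, sd) for _ in range(n_per_group)]
    se = math.sqrt(statistics.variance(control) / n_per_group +
                   statistics.variance(treated) / n_per_group)
    return (statistics.mean(treated) - statistics.mean(control)) / se

def detection_rate(n_per_group, true_effect, trials=2000, t_crit=2.0):
    """Fraction of simulated studies that cross a rough 5% significance threshold."""
    hits = sum(abs(t_statistic(n_per_group, true_effect)) > t_crit
               for _ in range(trials))
    return hits / trials

# The same genuine effect (5 units on a baseline of 100) in both cases:
print("10 per group :", detection_rate(10, 5.0))   # typically around 0.2
print("100 per group:", detection_rate(100, 5.0))  # typically above 0.9
```

In this toy setup the small study reports "no significant harm" most of the time even though harm is present by construction; that asymmetry is what precaution asks decision makers to take seriously.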
This in turn hinges on a crucial technical distinction between risk and uncertainty. Risk is a state of knowledge where we feel confident in assigning numerical probabilities. In conventional risk assessment, the onus, burden and levels of proof typically fall most heavily on those concerned about a particular pathway, or who prefer alternatives. The balance of emphasis tends to favour those products with the most powerful backing. Precaution offers to level the playing field by inviting a focus not only on risk, but also on uncertainty. Whether due to incomplete evidence, complexity, divergent values, scientific disagreement, gaps in knowledge or the simple possibility of surprise – uncertainties cannot be reduced to neat numerical probabilities. But they are still crucial to rational consideration – and there are plenty of practical ways to deal with them.
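A rough way to see the technical distinction is to compare a decision rule that needs numerical probabilities with one that does not. The sketch below is purely illustrative and not from the article: the option names, pay-offs and "assumed" probabilities are all invented, and maximin is just one of several precautionary decision rules.

```python
# Hypothetical pay-offs for two options under three possible futures.
# Under *risk* we trust numerical probabilities; under *uncertainty* we cannot.
options = {
    "option_A": {"benign": 10, "mixed": 4, "adverse": -20},
    "option_B": {"benign": 6, "mixed": 5, "adverse": 1},
}

def expected_value(payoffs, probabilities):
    """Decision under risk: only meaningful if the probabilities are credible."""
    return sum(payoffs[scenario] * p for scenario, p in probabilities.items())

def worst_case(payoffs):
    """Maximin, one precautionary rule under uncertainty: judge by the worst outcome."""
    return min(payoffs.values())

assumed_probs = {"benign": 0.70, "mixed": 0.25, "adverse": 0.05}

for name, payoffs in options.items():
    print(name,
          "expected value:", round(expected_value(payoffs, assumed_probs), 2),
          "worst case:", worst_case(payoffs))

# With these invented probabilities option_A looks best on expected value
# (7.0 vs 5.5), but option_B is preferred as soon as we concede that the 5%
# weight on the adverse scenario cannot honestly be justified.
```

The point of the comparison is not the particular numbers but the structure: expected-value reasoning collapses as soon as the probabilities are contested or unknowable, whereas a precautionary rule remains usable.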
Under uncertainty, then, it is not merely difficult in practice to calculate some single definitive "sound scientific", "evidence based" solution. The point is, it is irrational even to try, let alone claim, this. The notion of exclusively science-based decisions under uncertainty is an oxymoron. How has such confusion come about? Uncertainties, after all, are among the most important driving forces in science. A typical scientist is well aware of the uncertainties in their field, and often strongly motivated by them. Reasoned scepticism and open disagreement about uncertainties are among the most crucial distinguishing qualities of science. Yet when science comes into contact with economic and political power, a strange kind of uncertainty denial develops. This brings us back, at the end, to where this blog began. In order to understand the rhetorical intensity of so much opposition to precaution, we need to look behind the methodological technicalities and consider the powerful political forces and high economic stakes that often hinge on the outcomes.
It is with some sympathy for beleaguered decision makers in business or regulation that we can understand the often-overwhelming political pressures to justify decisions. This can mean building "policy-based evidence" to assert some pre-decided outcome. Or it can simply mean pressing for an artificially unambiguous "evidence base" to justify any firm decision at all. In myriad ways, this pressure incentivises analysts and independent expert advisers to sidestep precaution and produce more apparently confident and precise "risk-based" prescriptions than their better judgement might suggest. It is not necessary to envisage any conspiracy or bad faith. The effect is more like iron filings lining up in the magnetic field of power. Either way, it is this pressure for justification that explains why the animosity to precaution extends beyond the partisan advocates of particular uncertain technologies to political debates in general.
But, in the end, the picture is quite optimistic. Far from the pessimistic caricature, precaution actually celebrates the full depth and potential for human agency in knowledge and innovation. Blinkered risk assessment ignores both positive and negative implications of uncertainty. Though politically inconvenient for some, precaution simply acknowledges this scope and choice. So, while mistaken rhetorical rejections of precaution add further poison to current political tensions around technology, precaution itself offers an antidote – one that is in the best traditions of rationality. By upholding both scientific rigour and democratic accountability under uncertainty, precaution offers a means to help reconcile these increasingly sundered Enlightenment cultures.
Andy Stirling is professor of science and technology policy at the University of Sussex. This is the first in a series on the precautionary principle. Come back later this week to see pieces by Tracey Brown, Steve Fuller and Jack Stilgoe. On Friday, we'll pull out readers' comments and give the contributors a chance to respond to one another.