Two degrees is too much

To avoid many of the worst consequences of climate change, the increase in global average surface temperature must remain below 2°C compared with the pre-industrial era.  Measures to reduce greenhouse gas emissions commensurate with this overarching ambition must be guided by the best available science.  Here I explain why I continue to advocate the 2°C threshold for avoiding dangerous climate change, why I don’t embrace a “ppm” target, and what 2°C means for policy makers and the business community.  This article first appeared on the website of think tank and strategy consultancy SustainAbility.

Credible climate change policies and business strategies are driven by the primary objective of the United Nations Framework Convention on Climate Change (UNFCCC) – ratified by 192 countries, including the US and China – and informed by the scientific assessments of the Intergovernmental Panel on Climate Change (IPCC), the authoritative voice on the causes, impacts, and mitigation of climate change.

Article 2 of the UNFCCC states as its ultimate objective: “to stabilise GHG concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system.”  The problem is that the UNFCCC never defined what “dangerous” meant, and this omission has been a bone of contention ever since.  However, through different means, several researchers have arrived at the figure of 2°C as the maximum acceptable increase in the Earth’s average surface temperature relative to the pre-industrial era.

The 2°C threshold has since become the de facto limit advocated by most civil society organisations concerned with climate change, as well as by several scientific advisory bodies.  In the mid-1990s it was also adopted as official policy by the European Union and therefore, by extension, by the heads of government of the EU Member States.

The EU’s first mention of the need to stay below 2°C appears in the conclusions of the Environment Council of June 1996.  Referring to the IPCC’s Second Assessment Report (SAR) – the most recent scientific assessment at that time – the Council introduced a link between 2°C and 550 ppm (parts per million) of CO2:

The Council believes that global average temperatures should not exceed 2 degrees above pre-industrial level and that therefore concentration levels lower than 550 ppm CO2 should guide global limitation and reduction efforts.

Back in 1996, based on the contemporary science of the IPCC SAR, it was believed that stabilisation of atmospheric CO2 concentrations below 550 ppm would deliver a below-2°C outcome.  Numerous policy makers and businesses subsequently adopted 550 ppm, and many have since failed to move in step with the advancing science.

In 2003, the German Advisory Council on Global Change (WBGU) issued a report titled “Climate Protection Strategies for the 21st Century: Kyoto and Beyond”.  In it, an explicit link was made between temperature increase and tangible climate-related impacts, such as threats to biodiversity, food security, water scarcity, and ice-sheet collapse.  The WBGU reaffirmed its earlier conviction that in order to avert dangerous climatic changes “it is essential to comply with a ‘climate guard rail’ defined by a maximum warming of 2°C relative to pre-industrial values”.  To meet this requirement, it recommended a stabilisation target of below 450 ppm CO2, accompanied by substantial reductions in other GHGs.

Two years later, in 2005, an International Climate Change Taskforce (ICCT) – comprising leading scientists, public officials, and representatives of business and NGOs from both developed and developing countries – published a report titled “Meeting the Climate Challenge”.  It said essentially the same thing regarding the dangerous 2°C threshold, but went further than the WBGU by assigning a much stricter GHG limit of 400 ppm CO2-equivalent (CO2e).  Accounting for current levels of non-CO2 greenhouse gases, this figure amounted to roughly 350 ppm CO2.

It is now believed that even this target may be too high; the IPCC’s Fourth Assessment Report (AR4) suggested that stabilisation at 400 ppm CO2e could lead to a temperature rise of anywhere up to 2.5°C.
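Why does a single ppm figure map onto a band of possible temperatures rather than a point?  The culprit is climate sensitivity – the equilibrium warming per doubling of CO2e – which AR4 could only constrain to a likely range of roughly 2-4.5°C per doubling.  As a rough illustration (my own sketch using the textbook logarithmic approximation, not AR4’s full model results):

    # Equilibrium warming for a given CO2e level, using the standard
    # logarithmic approximation: dT = S * log2(C / C0), where S is the
    # climate sensitivity per CO2e doubling.  All values are illustrative.

    import math

    C0 = 280.0   # approximate pre-industrial CO2e concentration (ppm)
    C = 400.0    # the stabilisation level under discussion (ppm)

    for S in (2.0, 3.0, 4.5):  # AR4's likely sensitivity range (degC per doubling)
        dT = S * math.log(C / C0, 2)
        print("Sensitivity {:.1f} degC/doubling -> ~{:.1f} degC warming".format(S, dT))

For the same 400 ppm concentration, the answer spans more than a factor of two (roughly 1°C to over 2°C in this crude sketch, before slow feedbacks), which is precisely why a ppm target is a moving proxy while 2°C is the fixed goalpost.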

Alarmed by the lack of practical progress on climate change, civil society organisations have been cranking up the pressure on the road to COP-15 in Copenhagen.  The campaign group 350.org was established following the publication of a study by NASA climatologist James Hansen and colleagues in 2008, titled “Target Atmospheric CO2: Where Should Humanity Aim?”  The paper’s conclusion?  CO2 will need to be reduced from its current 385 ppm to at most 350 ppm – with the greatest uncertainty in the target arising from possible changes in non-CO2 effects.

The initial “350” call was more or less aligned with the recommendation of the ICCT in 2005.  However, the 350.org campaign has morphed during its relatively short lifetime to mean the more challenging 350 ppm CO2e, as the website explains: 

Climate impacts happening more quickly than anticipated have led 350.org to see the 350 ppm target not only in terms of CO2, but CO2e.  On a technical level, this becomes a more ambitious target, incorporating other greenhouse gases.  On a practical level, it signifies the same priorities 350.org has embodied all along.

To complete the picture, in July 2009 – following the welcome return of the United States to constructive international dialogue – leaders of the world’s largest economies, including the EU, US, Japan, China, India, Brazil and Russia, made the following ground-breaking declaration at the Major Economies Forum on Energy and Climate in L’Aquila, Italy:

We recognise the scientific view that the increase in global average temperature above pre-industrial levels ought not to exceed 2°C.

Although the declaration falls short of calling for a long-term stabilisation target or percentage emissions cuts, its political significance is huge.  What is clear from the science is that to stand a reasonable chance of staying below 2°C, we will need to engineer a rapid transition to a completely decarbonised energy system by 2050, as well as reverse deforestation and cut emissions from other land-based sources.

It is for good reason that the 2°C principle is widely accepted by heads of state, civil society organisations, climate change scientists and policy makers.  But the threshold has not been universally accepted.  To date, we do not know of any business with a publicly declared ambition to keep global warming below 2°C.  Then again, we are not aware of any business that has taken a public position on what represents “dangerous” climate change.

The really bad news – and perhaps an insight into why businesses are slow to embrace 2°C – is that every time we look at the science, the greenhouse gas stabilisation target for staying below 2°C seems lower than before: from 550 ppm CO2 in 1996, to 450 ppm CO2 in 2003, to 400 ppm CO2e in 2005, and even as low as 350 ppm CO2e in 2009.

This is precisely the reason why the 2°C concept is so important to understand. “Staying below 2°C” sets the overarching level of ambition, from which we derive stabilisation targets in “parts per million” (based on the best available science), which then inform us about actionable “percentage emissions reductions” (e.g. minus 85% by 2050).  We cannot fall into the trap of conflating 2°C with a particular ppm stabilisation target – recent history has shown this to be folly.
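To make the final step of that chain concrete, here is a back-of-the-envelope sketch – my own illustration with an assumed baseline year, not a figure drawn from any of the reports above – of how a headline target like “minus 85% by 2050” translates into a compound annual rate of reduction:

    # A minimal, illustrative calculation: what compound annual reduction
    # rate is implied by a given total cut over a given period?  The 85%
    # figure comes from the article; the 2010 baseline year is my assumption.

    def annual_reduction_rate(total_cut, years):
        """Compound annual reduction implied by cutting emissions by
        total_cut (a fraction, e.g. 0.85) over the given number of years."""
        remaining = 1.0 - total_cut             # fraction of emissions left at the end
        return 1.0 - remaining ** (1.0 / years)

    rate = annual_reduction_rate(0.85, 2050 - 2010)
    print("Implied cut: {:.1%} per year, every year".format(rate))  # ~4.6% per year

Even on these generous assumptions, the implied effort is roughly 4-5% per year, sustained for four decades – a useful reality check on any headline percentage.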

Our atmosphere is already loaded with greenhouse gases to the tune of around 435 ppm CO2e, rising by 2-3 ppm every year – well beyond what today’s science indicates is the necessary long-term stabilisation level.  So as well as eliminating emissions from fossil fuels, we will likely need to enhance terrestrial carbon sinks – through massive reforestation programmes and advanced soil management practices – and consider other ways to suck CO2 out of the atmosphere if we are to stand any chance of avoiding “dangerous climate change”.
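A crude projection makes the point.  The sketch below uses the article’s figures of roughly 435 ppm CO2e and 2-3 ppm of annual growth; the constant-growth assumption is mine, a deliberate simplification to show how the gap to a 350 ppm CO2e stabilisation level widens for as long as net emissions remain positive:

    # Rough projection of atmospheric CO2e at a constant 2-3 ppm annual
    # growth rate.  Starting level and growth range are from the article;
    # holding the growth rate constant is an assumed simplification.

    START_PPM = 435.0                   # approximate CO2e concentration today
    TARGET_PPM = 350.0                  # long-term stabilisation level discussed above
    GROWTH_LOW, GROWTH_HIGH = 2.0, 3.0  # ppm added per year

    for years_ahead in (0, 10, 20, 30, 40):
        low = START_PPM + GROWTH_LOW * years_ahead
        high = START_PPM + GROWTH_HIGH * years_ahead
        print("Year +{:2d}: {:.0f}-{:.0f} ppm CO2e "
              "(overshoot of 350 ppm: {:.0f}-{:.0f} ppm)".format(
                  years_ahead, low, high, low - TARGET_PPM, high - TARGET_PPM))

On these numbers the overshoot grows from about 85 ppm today to well over 150 ppm within four decades – which is why merely slowing emissions cannot get us back to 350 ppm; only net removal can.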