Prospects for Averting Severe Climate Change at COP21?

For instructors discussing the prospects for “The Road to Paris” at COP21 to help us build a bridge to a safer climatic future, a new study in the journal Nature would be a good student reading. The study draws upon the Intended Nationally Determined Contributions (INDCs) of the more than 150 countries that have made such pledges to date, embodying 90% of the globe’s emissions. The study’s authors seek to assess both the prospects for limiting temperature increases to 2C from pre-industrial levels and the extent to which such pledges reduce the risk of the highest potential temperature increases. The authors emphasize that because temperature changes ultimately depend on cumulative emissions, it’s critical to assess the likely long-term paths of emissions beyond the INDCs, which extend only to 2025 or 2030; these paths were projected using a global integrated assessment model. Moreover, the uncertainties associated with the global carbon cycle and climate system responses necessitate probabilistic assessments. The study utilizes two scenarios: a Paris-Continued ambition scenario, which assumes that countries continue to reduce emissions after 2030 at the same rate (2% annually) required to achieve their INDCs between 2020-2030, and a Paris-Increased ambition scenario, which assumes a 5% annual reduction beyond 2030.

The study’s conclusions include the following:

  • The Paris-Continued scenario reduces the probability of temperatures increasing more than 4C in 2100 by 75% compared to the Reference-Low policy scenario, and by 80% compared to a Reference-No policy scenario;
  • The chance of exceeding 4C is virtually eliminated if mitigation efforts are increased beyond 2030, as in the Paris-Increased ambition scenario;
  • There is an 8% probability of limiting temperature increases to 2C from pre-industrial levels under the Paris-Continued scenario; this increases to about 30% under the Paris-Increased scenario;
  • Scenarios that increase the probability of limiting temperature increases to 2C to 50-66% are plausible, but assume rapid emissions reductions after 2030, and many also include negative global emissions in the second half of the century, effectuated through the deployment of Bio-energy with Carbon Capture and Sequestration (BECCS);
  • Limiting warming to any prescribed level will ultimately necessitate reducing carbon dioxide emissions to zero. If this does not transpire quickly, and emissions continue beyond 2100, the prospects of both extreme temperature changes and exceeding the 2C threshold are substantially increased.

New WMO Greenhouse Gas Bulletin

In its most recent Greenhouse Gas Bulletin, the World Meteorological Organization provides some of the most contemporaneous data on the status of long-lived greenhouse gases in the atmosphere, as well as some excellent charts for lectures and presentations on climate science.

Among the key findings in the publication:

  1. Radiative forcing by long-lived greenhouse gases increased by 36% between 1990 and 2014, with carbon dioxide accounting for approximately 80% of this increase;
  2. Carbon dioxide levels reached 143% of pre-industrial levels in 2014, and the gas is responsible for 83% of the increase in radiative forcing over the past decade. Global atmospheric concentrations reached 397.7ppm in 2014, with an average annual growth rate of 2.06ppm over the past decade; last year’s growth over 2013 was 1.9ppm;
    1. Approximately 44% of anthropogenic carbon dioxide emissions remained in the atmosphere over the past decade, with the remaining 56% removed by the oceans and the terrestrial biosphere;
  3. Methane concentrations in the atmosphere reached 254% of pre-industrial levels in 2014, contributing 17% of the radiative forcing of long-lived greenhouse gases. Atmospheric concentrations were 1833 ppb in 2014;
  4. Nitrous oxide levels reached 327 ppb in 2014, up 21% above pre-industrial levels. Nitrous oxide accounts for 6% of radiative forcing by long-lived greenhouse gases;
  5. Chlorofluorocarbons and minor halogenated gases account for 12% of radiative forcing by long-lived greenhouse gases, though their production is declining due to international treaty regulation. While production of the potent greenhouse gases hydrochlorofluorocarbons and hydrofluorocarbons is increasing at a substantial clip, their atmospheric concentrations remain low, currently in the parts per trillion.
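
A quick back-of-envelope check of two of the Bulletin’s figures can make a nice in-class exercise. The sketch below assumes a pre-industrial concentration of 278 ppm, total anthropogenic emissions of roughly 10 PgC per year, and the standard conversion of ~2.13 PgC per ppm of CO2 — assumptions supplied here for illustration, not figures taken from the Bulletin itself:

```python
# Back-of-envelope check of two Bulletin figures.
# Assumed inputs (not from the Bulletin): pre-industrial CO2 of 278 ppm,
# total anthropogenic emissions of ~10 PgC/yr, and 1 ppm CO2 ~ 2.13 PgC.

PPM_TO_PGC = 2.13            # petagrams of carbon per ppm of atmospheric CO2
PRE_INDUSTRIAL_PPM = 278.0   # assumed pre-industrial concentration

concentration_2014 = 397.7   # ppm, from the Bulletin
growth_rate = 2.06           # ppm/yr, decadal average from the Bulletin
assumed_emissions = 10.0     # PgC/yr, rough global total (assumption)

# 2014 concentration relative to pre-industrial (~143%)
ratio = concentration_2014 / PRE_INDUSTRIAL_PPM
print(f"Relative to pre-industrial: {ratio:.0%}")   # 143%

# Fraction of emissions remaining in the atmosphere (~44%)
retained = growth_rate * PPM_TO_PGC                 # PgC/yr added to the air
airborne_fraction = retained / assumed_emissions
print(f"Airborne fraction: {airborne_fraction:.0%}")  # 44%
```

Both results land close to the Bulletin’s reported 143% and 44%, which can help students see where such headline percentages come from.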

The Bulletin also provides a concise explanation of the anthropogenic greenhouse effect, including an excellent chart explaining radiative forcing.

Mind the (Emissions) Gap

For instructors discussing the likely impacts of the emissions reduction commitments agreed to by the Parties to the UNFCCC under the Durban Platform for Enhanced Action (denominated “Intended Nationally Determined Contributions” or “INDCs”), the just-released eight-page Executive Summary of UNEP’s annual “Emissions Gap Report” would be an excellent reading. Other recent assessments of INDCs include the UNFCCC’s Synthesis Report on the Aggregate Effect of the Intended Nationally Determined Contributions, and studies by Climate Action Tracker and Climate Interactive. The 2015 Report compares projected emission levels in 2030 (based on the INDCs submitted by 114 States as of October 1, 2015) with scientific assessments of emissions pathways consistent with keeping temperature increases below 2C from pre-industrial levels.

Among the study’s findings are:

  1. Based on the IPCC Fifth Assessment Report’s estimate of a remaining cumulative carbon dioxide emissions budget of 1000 GtCO2 (to avoid passing the 2C threshold), net global carbon emissions will have to be reduced to zero between 2060 and 2075;
  2. To have a greater than 66% chance of avoiding temperature increases above 2C by the end of the century, the median level of carbon dioxide equivalent emissions in 2030 should be 42 GtCO2e (range of 31-44); to keep temperature increases to 1.5C, 39 GtCO2e.
    1. While the INDCs made by the Parties to the UNFCCC to date constitute “a real increase in the ambition level compared to a projection of current policies,” the emissions gap between full implementation of unconditional INDCs and the least-cost emission level for a pathway to remain below 2C is estimated at 14 GtCO2e in 2030 and 7 GtCO2e in 2025. Conditional INDCs could reduce the gap to 5 GtCO2e in 2025 and 12 GtCO2e in 2030. This translates into a temperature increase of 3.5C by 2100 (66% chance);
    2. The global emissions level in 2030 consistent with avoiding passing the 2C threshold is 42 GtCO2e, while emissions under unconditional INDCs are projected to be 56 GtCO2e in 2030, or 54 GtCO2e when conditional INDCs are taken into account.
  3. Global greenhouse gas emissions could be reduced by an additional 5-12 GtCO2e below unconditional INDCs through measures such as enhanced energy efficiency and International Cooperative Initiatives, including efforts by cities and regions, sector-specific initiatives (such as reducing cement-related emissions), and forest-related initiatives, e.g. REDD+.
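
The gap arithmetic is simple enough to reproduce with students. The sketch below uses the 2030 figures quoted above and derives the conditional-INDC emission level implied by the 12 GtCO2e gap (a derived value, not one quoted directly here):

```python
# Emissions gap arithmetic using the 2030 figures summarized above.

required = 42         # GtCO2e in 2030, least-cost 2C pathway (report median)
unconditional = 56    # GtCO2e in 2030 projected under unconditional INDCs
gap_conditional = 12  # GtCO2e gap remaining with conditional INDCs

# Gap under unconditional INDCs: projected minus required
gap_unconditional = unconditional - required
print(gap_unconditional)  # 14

# Emission level implied by the conditional gap: required plus gap
implied_conditional = required + gap_conditional
print(implied_conditional)  # 54
```

The derived 54 GtCO2e is what conditional INDCs would have to deliver in 2030 to be consistent with the 12 GtCO2e gap figure.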

The electronic version of the report also includes a number of charts and diagrams that could be used in class lectures, including portrayals of historical GHG emissions and projections to 2050, the emissions gap between INDCs and the requisite reductions in emissions to avoid passing critical temperature thresholds, and a map outlining the INDCs of UNFCCC Parties.

Assessing Contrarian Climate Studies

A number of recent studies indicate that 97-98% of actively publishing climate researchers support the conclusions of the Intergovernmental Panel on Climate Change regarding anthropogenic global warming (AGW). However, a large percentage of the public continues to believe that there’s a close division of opinion in the scientific community. As a new study by Rasmus E. Benestad, et al. in the journal Theoretical and Applied Climatology suggests, this includes many undergraduate students in the United States. This is the reason that I always devote a full class in my climate change courses to discussing the specific arguments of climate skeptics. The Benestad study utilizes an analytical tool to replicate and test the results and methods used in 38 high-profile contrarian papers, grouped into five categories based on analytical setup, statistics, mathematics, physics, and representation of previous results.
The study could be a useful reading for a teaching module on the arguments advanced by those who challenge the tenets of AGW, as well as providing some broader lessons about the nature of scientific research and the interface of scientists and the public.
Among the conclusions from the study were the following:

• Most of the studies across the categories failed to cite or address relevant literature that proffered evidence or conclusions counter to their own;
• Many of the studies failed to compare models against independent values that were not used in developing the study, which is critical to prevent curve-fitting;
• Many studies engaged in false-dichotomy reasoning, such as arguing that warming was attributable to solar activity while not acknowledging that greenhouse gas forcing could be a co-existing cause;
• Many of the studies employed spectral methods, which almost inevitably find cycles or periodicities, even when they may not exist;
• A number of papers were published in journals far afield from climate change research, raising the question of whether the editors were able to identify qualified reviewers.

The study suggests the need for openness and transparency, including access to open-source code and data, to facilitate the replication needed to assess the validity of study findings. The researchers also argue that IPCC assessment reports might be more compelling to the public if the IPCC also made its source code and data available.

The Limits to Fossil Fuel Use?

As a number of recent studies have emphasized, projected global temperatures will largely be determined by cumulative greenhouse gas emissions over a given period of time. In a new study published in the journal Nature, researchers Christophe McGlade and Paul Ekins employ a single integrated assessment model that includes estimates of fossil fuel “resources” (defined as remaining ultimately recoverable resources) and “reserves” (defined as resources recoverable under current economic conditions) to determine what portion of coal, oil and natural gas can ultimately be utilized consistent with the objective of keeping temperature increases to 2C above pre-industrial levels.

Among the study’s conclusions:

  1. To maintain at least a 50% chance of keeping temperature increases to below 2C, the cumulative remaining global “carbon budget” is approximately 1100 gigatons of carbon dioxide;
  2. There is a massive gap between estimated fossil fuel resources (nearly 11,000 Gt CO2) and reserves (nearly 2,900 Gt CO2) on the one hand, and the 1,100 gigaton carbon dioxide budget that may be necessary to avoid passing critical temperature thresholds on the other;
  3. Under a scenario in which carbon capture and sequestration (CCS) is extensively deployed from 2025 onwards, over 430 billion barrels of oil and 95 trillion cubic meters of gas reserves would have to stay in the ground, including half of the oil in the Middle East and a whopping 75% of Canada’s oil reserves, with its large reserves of bitumen;
    1. 82% of coal reserves will need to remain unburned to not exceed the world’s carbon budget by 2050, with the US and former Soviet Union availing themselves of less than 10% of their reserves
  4. If CCS is not widely deployed, utilization of current fossil fuel reserves must be lower, though CCS only increases potential utilization of coal reserves by 6% by 2050;
  5. In terms of unconventional oil, natural bitumen utilization in Canada must become “negligible” by 2020, and without CCS, all bitumen production must cease by 2040. Unconventional natural gas fares better, displacing coal in electric and industrial sectors, permitting 50 trillion cubic meters to be burned. However, by 2050, fully 80% of unconventional gas resources in China, India, Africa and the Middle East must remain unburned.
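
For a classroom illustration of why so much of the fossil fuel base is “unburnable,” one can compare the aggregate totals quoted above. This is crude aggregate arithmetic, not the fuel-by-fuel, region-by-region allocation produced by the study’s model:

```python
# Crude "unburnable" arithmetic from the aggregate totals quoted above.

budget = 1100.0      # GtCO2, remaining budget for ~50% chance of staying below 2C
reserves = 2900.0    # GtCO2, estimated reserves (recoverable today)
resources = 11000.0  # GtCO2, estimated ultimately recoverable resources

# Fraction that cannot be burned if the budget is the hard limit
unburnable_reserves = 1 - budget / reserves
unburnable_resources = 1 - budget / resources

print(f"{unburnable_reserves:.0%} of reserves unburnable")    # 62% of reserves unburnable
print(f"{unburnable_resources:.0%} of resources unburnable")  # 90% of resources unburnable
```

Even at this coarse level, roughly three-fifths of reserves and nine-tenths of resources exceed the budget, which frames the study’s fuel-specific findings.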

Among the class discussion questions this piece might generate:

  1. Given the limited utility of CCS through 2050, does it make sense to press forward with its development? What are the implications of CCS deployment post-2050, and its significance for carbon budgets?
  2. Should the allocation of “unburned” carbon be actively managed under a set of equitable principles? If yes, what principles should control said allocation?
  3. What are the implications of aiming for a better than 50% chance of avoiding exceeding a 2C increase in temperatures?


Resources on 2014 Temperature Record

From Eban Goodstein at Bard College:

Colleagues: Today, NOAA and NASA are likely to officially confirm 2014 as the hottest year in 134 years of record keeping. The big news here is that this will be the first record-breaking year not to coincide with an El Niño; it was an ENSO-neutral year. To help your students gain some context on this event, here is a useful review article (and a short video).

Onward, Eban

The Video: http://bit.ly/2014HottestYearEver

http://climatenexus.org/2014-putting-hottest-year-ever-perspective

Carbon Dioxide Removal Approaches: Long-Term Implications And Requisite Societal Commitments

In recent years, a number of climate change commentators, non-governmental organizations, and intergovernmental organizations have discussed the potential need for so-called “negative greenhouse gas emissions” strategies. It is also anticipated that Working Group III of the Intergovernmental Panel on Climate Change will include a discussion of this approach in its contribution to the upcoming Fifth Assessment Report. The rationales usually advanced for focusing on negative emissions approaches are the threat posed by burgeoning emissions, which could result in exceeding critical climatic thresholds within a few decades, as well as system inertia, which could lock in temperature increases associated with radiative forcing for many centuries. The processes that could effectuate permanent removal of carbon dioxide from Earth’s atmosphere include air capture, bioenergy with carbon capture and storage, ocean iron fertilization and soil mineralization, and are usually classified as carbon dioxide removal (CDR) geoengineering approaches.

In my next few postings on the site, I’d like to highlight some of the excellent peer-reviewed literature on carbon dioxide removal strategies that has been released in the past few years. In 2010, Stanford professors Long Cao and Ken Caldeira published a study in the journal Environmental Research Letters (open access) that sought to assess both the long-term consequences of, and the level of commitment required to effectuate, massive removal of carbon dioxide from the atmosphere. The researchers employed a coupled climate-carbon cycle model, initially integrated under a fixed pre-industrial atmospheric concentration of 278ppm for 5000 years. The model was subsequently integrated under prescribed historical carbon dioxide concentrations from 1800 to 2008, and then forced with carbon dioxide emissions from 2009-2049 following the IPCC’s A2 emissions scenario. The study then simulated cessation of carbon dioxide emissions under two extreme scenarios: one in which carbon dioxide was instantaneously set to its pre-industrial level of 278ppm at the beginning of 2050 by removing all anthropogenic carbon dioxide from the atmosphere, with atmospheric levels permitted to evolve freely thereafter, and another in which carbon dioxide was set at 278ppm in 2050 and then held at that level thereafter. In both scenarios, the simulations were continued until 2500.

Among the study’s conclusions:

1.     Following an extreme one-time removal of all anthropogenic carbon dioxide from the atmosphere, atmospheric concentrations are restored under the simulation to pre-industrial levels of 278ppm.

a.     However, due to efflux of carbon from land associated with responses of net primary production and soil respiration, as well as releases of carbon from the oceans, atmospheric concentrations experience an overshoot, with a peak concentration of 362ppm 30 years after the removal. Overall, 27% of the removed carbon returns to the atmosphere. Thus, if society wished to maintain atmospheric concentrations of carbon dioxide at a specified level, it would have to commit itself to long-term removal of carbon dioxide released from land and ocean sources;

b.     A one-time removal of anthropogenic carbon dioxide also reduces warming by a little less than 50% at the time of removal, and radiative forcing by two-thirds on centennial timescales.

2.     If atmospheric concentrations of carbon dioxide were restored to 350ppm, it would result in surface warming of 1.2°C, which would last for several centuries;

3.     The simulated reduction in temperatures in the study yields a cooling of 0.16°C for every 100 PgC of carbon dioxide removed. The conclusion that temperature change is proportional to cumulative carbon dioxide removals, just as it is to cumulative emissions, has implications for assessing the potential effectiveness of such approaches.

4.     The study also contains a cautionary note: if the effective heat capacity of the climate system should prove to be less than estimated in the study, or the carbon dioxide degassing timescale prove longer, the result could be temperature overshoots in which initial temperature decreases are reversed as carbon dioxide re-accumulates in the atmosphere. However, the researchers did not observe this result in their simulations.
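
The proportionality result in point 3 lends itself to a simple classroom calculation. The 0.5°C cooling target below is purely illustrative, not a figure from the study:

```python
# Sketch of the proportionality result: ~0.16 degrees C of cooling per
# 100 PgC of CO2 removed (study estimate). The 0.5 C target below is an
# arbitrary illustration, not a scenario from the paper.

COOLING_PER_100_PGC = 0.16  # degrees C per 100 PgC removed

def removal_for_cooling(delta_t_c):
    """PgC of carbon dioxide that must be removed for a desired cooling (C)."""
    return delta_t_c / COOLING_PER_100_PGC * 100.0

# How much removal would an illustrative 0.5 C of cooling require?
print(removal_for_cooling(0.5))  # 312.5 (PgC)
```

Students can then compare such removal quantities against annual anthropogenic emissions of roughly 10 PgC to get a sense of the scale of commitment involved.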

Of course, it is not likely that deployment of negative emissions approaches would, or in the case of most prospective technologies, could, follow what the researchers themselves characterize as an “extreme” scenario, i.e. a one-time removal of all carbon dioxide at a discrete point. Moreover, it’s far from clear that society would seek to return to pre-industrial climatic conditions, even if that could be effectuated. And, of course, the study did not address issues associated with feasibility. However, this study is very valuable for a number of reasons. First, it provides a preliminary estimate of anticipated reductions in temperature per 100 PgC of carbon dioxide removed, providing a guide to policymakers who might contemplate more limited uses of negative emissions strategies than contemplated in this study. Second, the study provides a pointed reminder that a negative emissions strategy would likely necessitate a multi-generational societal commitment, with all of the implications that this would hold for governance, ethics and practical logistics. Finally, the study could provide students with an excellent window into the methods, as well as the challenges, of simulating the climatic impacts of geoengineering strategies with climate models.

Interactive Climate Exercises

The folks at the website Simple Climate have posted a really good set of exercises to guide those interested in learning more about climate change science. The exercises include an excellent method to calculate one’s carbon footprint, an interactive map of the Keeling curve, and NASA’s Global Equilibrium Energy Balance Interactive Tinker Toy (GEEBITT). There is also an excellent app for Apple platforms showing how scientists construct and use global circulation models to predict climate change.

Multimillennial Sea-Level Commitment Associated with Global Warming

A recent study in the Proceedings of the National Academy of Sciences (open access article) assesses prospective sea-level rise over the course of the next 2,000 years, combining paleo-reconstructions of sea-level rise with simulations from physical models of the four main components that contribute to sea-level change.

Among the study’s findings:

  1. Thermal expansion yields a global mean sea-level rise of 0.38m with a homogeneous increase of ocean temperature by 1C;
  2. The total contribution of all glaciers (all land ice excluding the ice sheets) to sea-level rise over the next 2000 years is ~0.6m;
  3. The potential contribution of the Greenland Ice Sheet is projected to be 0.18m per °C up to a 1C temperature increase, and 0.34m per °C for temperature increases between 2-4C;
  4. Simulated temperature rise over the next two millennia yields a 1.2m rise in sea level associated with melting of the Antarctic Ice Sheet;
  5. On a 2000-year time scale, the contribution of the sources outlined above will be largely independent of the projected warming path during the first century;
  6. The total sea-level commitment from all sources is 2.3m per °C over 2000 years. However, melting of the Greenland Ice Sheet ultimately results in 6m of sea-level rise over the course of tens of thousands of years.
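
The headline figure of 2.3m per °C can be turned into a quick classroom calculation. The warming levels below are illustrative round numbers, not scenarios from the paper:

```python
# Sketch of the headline result: ~2.3 m of committed sea-level rise per
# degree C of warming on a 2000-year horizon. The warming levels below
# are illustrative, not scenarios from the study.

COMMITMENT_M_PER_C = 2.3  # m of committed sea-level rise per degree C

for warming in (1.0, 2.0, 4.0):
    committed = warming * COMMITMENT_M_PER_C
    print(f"{warming} C -> {committed:.1f} m committed over 2000 yr")
```

Running the loop shows how quickly the multimillennial commitment grows: about 4.6m for 2C of warming and over 9m for 4C.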

This study could stimulate some good classroom discussion. Some potential questions:

  • In the context of inter-generational equity, do these potential impacts substantially expand the scope of generations whose interests must be acknowledged and protected?
  • What are the implications of projected long-term rises in sea-levels for adaptation initiatives?
  • What are the most significant sources of uncertainty associated with paleo-climatic sea level rise and temperature records?

Ocean Acidification & Mussel Byssus Attachment

To date, most research on the possible impacts of ocean acidification on marine organisms has focused on potential adverse impacts on the secretion of calcium carbonate in species such as corals and echinoderms. However, a study published in the latest issue of the journal Nature Climate Change demonstrates potential impacts on other biomaterials critical for bivalve molluscs, a group of species that provides more than $1.5 billion in revenue to the global aquaculture industry.

Mytilid mussels are competitively dominant species in many rocky shore ecosystems throughout the world. This is largely attributable to their ability to attach themselves to bare rocks with byssal threads, which are formed from collagen-like liquid precursors that polymerize into a stiff and extensible thread. Byssal threads contain high concentrations of a modified amino acid and histidine-metal crosslinkages that appear critical to facilitating surface adhesion and self-healing following deformation.

In the study, carbon dioxide was increased from 300 to 1,500 μatm (a pH decline from 8.0 to 7.5). Under high carbon dioxide concentrations (1,200 μatm), the study found substantial diminution of byssal thread performance, including a decline in tenacity on the order of 10-35%. The authors concluded that this could adversely affect community and ecosystem dynamics, given the importance of mussels’ ability to securely attach themselves to rocks.

Of course, ocean acidification is not a manifestation of climate change, but rather a parallel impact of rising carbon dioxide concentrations. The potential impacts of carbon dioxide emissions on an array of bio-structures may provide an additional rationale for focusing climate policymaking on reducing this particular greenhouse gas. The study thus provides a good gateway for discussing the interface of efforts to reduce greenhouse gas emissions and efforts to confront ocean acidification, as well as the appropriate regimes to address these issues, including coordination of initiatives.