Global average temperature data?

Global average surface temperatures

Global average temperature data show 1.2-1.3C of warming since pre-industrial times, with temperatures continuing to rise at 0.02-0.03C per year, according to data-sets from NASA, NOAA, the UK Met Office and academic institutions. This note assesses their methodologies and controversies. Uncertainty in the data is likely higher than officially acknowledged. But the strong upward warming trend is robust.


2020 is said to be the joint-hottest year on record, tied with 2016, which was boosted by a particularly sharp El Nino. 2020 temperatures were around 1.2C warmer than 1880-1900 on data reported by NASA’s Goddard Institute for Space Studies (GISS), and 1.3C warmer on data reported by the UK Met Office’s Hadley Centre and the University of East Anglia’s Climatic Research Unit (HadCRUT).

2020’s hot temperatures were partly influenced by COVID-19, as “global shutdowns related to the ongoing coronavirus (COVID-19) pandemic reduced particulate air pollution in many areas, allowing more sunlight to reach the surface and producing a small but potentially significant warming effect”, per NASA. But the largest component of the warming is attributed to rising CO2 levels in the Earth’s atmosphere, which reached 414 ppm.

Overall, NASA’s data show temperatures have warmed by 0.02C per year over the past 50 years, 0.023C per year over the past 25 years and 0.03C per year over the past 10 years (chart above). Likewise, HadCRUT shows 0.02C per year over the past 50 years, 0.022C/year over the past 25 years and 0.024C/year over the past 10 years (below). Both data-sets suggest the rate of warming is accelerating.
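As a rough illustration of how these per-year trends can be derived, the sketch below fits a least-squares line to an annual anomaly series over each window. The anomaly values in it are synthetic placeholders, not the NASA or HadCRUT data.

```python
# Illustrative only: fitting a least-squares warming trend (C/year) to an
# annual temperature-anomaly series, over 50/25/10-year windows. The anomaly
# series here is synthetic, not the actual NASA or HadCRUT data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1971, 2021)
anomalies = 0.02 * (years - 1971) + rng.normal(0, 0.08, years.size)

for window in (50, 25, 10):
    slope = np.polyfit(years[-window:], anomalies[-window:], 1)[0]
    print(f"Trend over past {window} years: {slope:+.3f}C per year")
```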

Global average surface temperatures

How accurate are the data-sets? To answer this question, this note delves into global average surface temperature records (GASTs). It is not as simple as shoving a thermometer under the world’s armpit and waiting three minutes…

How are global average surface temperatures measured?

Surface air temperatures (SATs) are measured at weather stations. Around 32,000 weather stations are currently in operation and feed into various GAST indices.

Surface sea temperatures (SSTs) are measured at the surface of the sea as a proxy for immediately overlying air temperatures. Up until the 1930s, SSTs were most commonly taken by lowering a bucket overboard, then measuring the temperature of the water in the bucket. This most likely under-estimated past water temperatures, as the water cooled evaporatively while being hauled aboard. From 1930-1990, SSTs were mostly measured at ships’ engine intakes. From 1990 onwards, SSTs were most commonly measured by specialized buoys and supplemented by satellite imagery.

Sea ice is particularly complicated. It takes 80% as much heat to melt 1kg of ice as it takes to heat 1kg of water from 0C to 100C, so ice absorbs enormous amounts of heat while remaining at 0C, and surface air temperatures above sea ice can be higher than 0C. But it is difficult to install permanent weather stations in locations that are iced over. So temperatures over sea ice are often modelled, not measured.

Global average surface temperatures
Data here: https://thundersaidenergy.com/downloads/refrigeration-and-phase-change-materials-energy-economics/

Temperatures are measured at each of the sites noted above. However, GAST indices do not take raw temperature data as their inputs. First, absolute temperatures vary markedly between different weather stations scattered over short distances (e.g., due to shade, aspect, elevation, wind exposure), making the readings too site-specific. Moreover, the global average temperature is actually 3.6C higher in July-August than in December-January, because land masses experience greater seasonal temperature fluctuation than oceans, and two-thirds of the world’s land is in the Northern hemisphere, which experiences summer conditions in July-August. Raw readings would therefore introduce too much noise into the data.

Temperature anomalies are the input to GAST indices instead. These are calculated by comparing average temperatures throughout each day with a baseline temperature for that site at that particular time of year. These anomalies are highly correlated, site-by-site, across hundreds of kilometers. By convention, NASA uses the 30-year period from 1951-1980 as its baseline.

Averaging is used to aggregate the temperature anomaly data from different temperature stations across regions. Regional anomalies are then averaged into a global anomaly. Each region is weighted by its proportionate share of the Earth’s surface area. Thus a GAST index is derived.
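To make the mechanics concrete, here is a minimal sketch of the pipeline just described: station readings are converted into anomalies against a site-specific baseline, then averaged up by region, weighted by each region’s share of the Earth’s surface. All station values, regions and weights below are hypothetical.

```python
# Minimal sketch of a GAST calculation: anomalies vs. a site baseline, then
# an area-weighted global average. All inputs are hypothetical illustrations.
import numpy as np

def station_anomaly(daily_temps_c: np.ndarray, baseline_c: float) -> float:
    """Average anomaly vs. the site's baseline for that time of year."""
    return float(np.mean(daily_temps_c) - baseline_c)

# Hypothetical regional average anomalies (C) and shares of Earth's surface
regional_anomalies = {"tropics": 0.9, "mid_latitudes": 1.2, "arctic": 2.5}
area_weights = {"tropics": 0.5, "mid_latitudes": 0.4, "arctic": 0.1}

gast = sum(regional_anomalies[r] * area_weights[r] for r in regional_anomalies)
print(f"Global average temperature anomaly: {gast:.2f}C")
```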

Controversies: could there be systematic biases in the data?

Very large data-sets over very long timeframes are complicated beasts. They are prone to being revised and adjusted. Some commentators have worried that there could be systematic biases in the revisions and adjustments.

More data. As an example, NOAA digitized and added more observations from the early 20th century into its methodology in 2016. This caused prior data to be re-stated. But this adjustment seems fair and relatively uncontroversial.

New weather stations are slightly more controversial. How do you know what the baseline temperature would have been at a site in 1951-1980, if the first weather station was only added there in 2000? Some of the baselines must therefore be derived from models, rather than hard data. Some commentators have argued that the models used to set these baselines themselves presuppose anthropogenic climate change, assuming past temperatures were cooler, thereby placing the cart before the horse. This fear may be counter-balanced by looking at weather stations with longer records. For example, of the 12,000 weather stations surveyed by NASA’s 2020 data, as many as 5,000 may have records going back beyond 1930.

Urban heat islands are somewhat more controversial again. Imagine a weather station situated in the countryside outside of a city. Over the past century, the city has grown, and the weather station has been engulfed by it. Cities tend to be 1-3C warmer than rural lands, due to the urban heat island effect. So for the data to remain comparable, past data must be adjusted upwards. GISS notes that the largest change in its calculation methodology over time has been to adjust for urban heat islands, although some commentators have questioned whether the adjustments are extensive enough. This fear may be counter-balanced by the relatively small portion of weather stations experiencing this engulfing effect.

The adjustment of anomalous-looking data is most controversial. Algorithms are used to sift through millions of historical data-points and filter away outliers that run counter to expectations. The algorithms are opaque. One set of algorithms ‘homogenizes’ the data of stations that diverge from their neighbours, by replacing those stations’ data with their neighbours’. As the general trend has been a warming climate, some stations showing cooling could be at risk of getting homogenized out of the data-set, causing the overall data to overstate the degree of warming.
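The schematic sketch below illustrates the kind of neighbour-based replacement being described; it is our own toy construction, far simpler than the pairwise algorithms actually used by the agencies.

```python
# Toy illustration of 'homogenization': if a station's series diverges too far
# from the average of its neighbours, it is replaced by that average. The
# tolerance and data are invented; real algorithms are far more elaborate.
import numpy as np

def homogenize(station: np.ndarray, neighbours: np.ndarray,
               tolerance_c: float = 0.3) -> np.ndarray:
    neighbour_mean = neighbours.mean(axis=0)
    mean_divergence = np.abs(station - neighbour_mean).mean()
    return neighbour_mean if mean_divergence > tolerance_c else station

cooling_station = np.array([0.2, 0.1, 0.0, -0.1, -0.2])   # diverging series
warming_neighbours = np.array([[0.1, 0.2, 0.3, 0.4, 0.5],
                               [0.0, 0.1, 0.3, 0.5, 0.6]])
print(homogenize(cooling_station, warming_neighbours))    # replaced by mean
```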

The data also do not correlate perfectly with rising CO2 levels: especially in 1880-1920, when temperatures appear to cool; and around the Second World War, when they appear to spike and then normalize. On the other hand, no one is arguing that CO2 is the sole modulator of global temperature. El Nino, solar cycles and ‘weather’ also play a role. And despite the annual volatility, the recent and most accurate data, from 1970 onwards, rise in lockstep with CO2.

Global average surface temperatures

The most vehement critics of GAST indices have therefore argued that past temperature adjustments could be seen to contribute over half of the warming shown in the data. There is most distrust over the revisions to NASA’s early temperature records. One paper states: “Each new version of GAST has nearly always exhibited a steeper warming linear trend over its entire history. And, it was nearly always accomplished by systematically removing the previously existing cyclical temperature pattern. This was true for all three entities providing GAST data measurement, NOAA, NASA and Hadley CRU”.

Uncertainties should not detract from the big picture

Our own impression from reviewing the evidence is that the controversies above should not be blown out of proportion. The Earth has most likely experienced a 0.02-0.03C/year warming trend over the past 10-50 years.

Multiple independent bodies are constructing GAST indices in parallel, and all show a similar warming trend. Pages could be written on the subtle differences in methodologies. For example, NOAA and the Berkeley Earth project use different, more complex methodologies than GISS, but produce similar end results. NOAA, for example, does not infer temperatures in polar regions that lack observations, and thus reports somewhat lower warming, of just 1.0C. This is because Arctic warming exceeds the global average, as minimum sea ice has declined by 13% each decade, allowing more sunlight to be absorbed and, in turn, causing more warming.

No doubt the construction of a global average temperature index, covering the whole planet back to 1880, is fraught with enormous data-challenges that could in principle be subject to uncertainties and biases. But it is nothing short of a conspiracy theory to suggest that multiple independent agencies are wilfully introducing those biases. And then lying about it. The Q&A section of NASA’s website states: “Q: Does NASA/GISS skew the global temperature trends to better match climate models? A: No”.

You can also review all of the adjusted and unadjusted data for individual weather stations side-by-side, here. If we take the example below, in New York’s Central Park, the adjusted/homogenized and unadjusted data are not materially different, especially in recent years. Although the uncertainty is visibly higher for the 1880-1940 data.

Global average surface temperatures
Source: https://data.giss.nasa.gov/cgi-bin/gistemp/stdata_show_v4.cgi?id=USW00094728&dt=1&ds=14

Criticisms of the NASA data adjustments, cited in the skeptical technical paper above, do not appear particularly well founded either. It is true that NASA’s estimates of 1920-1980 temperatures were progressively lowered by 0.1-0.3C between its 1981 and 2017 data-sets, which could be seen to exaggerate the warming that has occurred since that time-period. However, this is mostly because of bad data back in the pre-digital world of the 1980s. In fact, NASA’s 1981 data-set did not include any sea surface temperature data and only included data from 1,219 land stations, all in the Northern hemisphere.

Revisions in more recent data-sets are minimal. For example, HadCRUT’s data are shown below.

Global average surface temperatures

Another review of the data concludes that the net effect of revisions has been to under-state global temperature increases, by adjusting the temperatures in 1880-1930 upwards, which would under-state warming relative to this baseline (here). This is actually a blend of effects. Temperatures on land have generally been adjusted downwards by 0.1-0.2C in the 1880-1950 timeframe. Temperatures at sea have been adjusted upwards by 0.2-0.3C over 1880-1935. The original article also contains some helpful and transparent charts.

Finally, recent data are increasingly reliable. In total, 32,000 land stations and 1.2M sea surface observations are now taken every year, across multiple data-sets. Hence GISS estimates the uncertainty range in its global annual temperature data is +/- 0.05C, rising to 0.1-0.2C prior to 1960, at a 95% confidence level. From our review above, we think the uncertainty is likely higher than this. But the strong upwards trend is nevertheless robust.

Conclusions: 30 years to get to net zero?

Global temperatures are most likely rising at 0.023-0.03C per year, and 1.2-1.3C of warming has most likely occurred since pre-industrial times. This suggests 30 years is an appropriate time-frame to get to Net Zero while limiting total warming to 2C, per our recent research note below.

A paradox in our research is that the $3trn per year economic cost of reaching net zero by 2050 seems to outweigh the $1.5trn per year economic cost of unmitigated climate change. One of the most popular solutions to this paradox, per the recent survey on our website below, was to consider re-optimizing and potentially softening climate targets. There may be economic justifications for this position. But the temperature data above show the result could be materially more warming.

Our own view is that the world should also decarbonize for moral reasons and as an insurance policy against tail-risks (arguments 3 and 6, above). And it should favor decarbonization pathways that are most economical and also restore nature (note below).

Our climate model, including all of the temperature data cited in this report, is tabulated in the data-file below.

Britain’s industrial revolution: what happened to energy demand?

britain's industrial revolution

Britain’s remarkable industrialization in the 18th and 19th centuries was part of the world’s first great energy transition. In this short note, we have aggregated data, estimated the end uses of different energy sources in the Industrial Revolution, and drawn five key conclusions for the current Energy Transition.


In this short note, we have sourced and interpolated long-run data on energy supplies in England and Wales, by decade, from 1560-1860. The graph is a hockey stick, with Britain’s total energy supplies ramping up roughly 30x, from 18TWH to 515TWH per year. Part of this can be attributed to England’s population rising 6x, from around 3M people to 18M people over the same timeframe. The remainder of the chart is dominated by a vast increase in coal from the 1750s onwards.

britain's industrial revolution

A more comparable way to present the data is shown below (and tabulated here). We have divided through by population to present the data on a per-capita basis. But we have also adjusted each decade’s data by estimated efficiency factors, to yield a measure of the total useful energy consumed per person. For example, coal supplies rose 40x from 1660 to 1860, but per-capita end use of coal energy only rose c6.5x, because the prime movers of the early industrial revolution were inefficient. This note presents our top five conclusions from evaluating the data.
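As a simplified illustration of the adjustment, the snippet below converts a decade’s primary energy into useful energy per person, by applying an assumed conversion efficiency. The input figures are hypothetical, not the tabulated data-set.

```python
# Simplified illustration of the 'useful energy' adjustment: primary energy is
# scaled by an assumed prime-mover efficiency, then divided by population.
# Input figures are hypothetical, not the underlying data-set.
def useful_energy_mwh_pp_pa(primary_twh: float, efficiency: float,
                            population_m: float) -> float:
    return primary_twh * 1e6 * efficiency / (population_m * 1e6)

# e.g. an early coal decade: plentiful primary energy, inefficient engines
print(useful_energy_mwh_pp_pa(primary_twh=200, efficiency=0.10, population_m=12))
```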

britain's industrial revolution

Five Conclusions into Energy Demand from the Industrial Revolution

(1) Context. Useful energy demand per capita trebled from 1MWH pp pa in the 1600s to over 3MWH pp pa in the mid-19th century, an unprecedented increase.

For comparison, today’s useful energy consumption per capita in the developed world is 6x higher again, as compared with the 1850s. A key challenge for energy transition in the developed world is that people want to keep consuming 20MWH pp pa of energy, rather than reverting to pre-industrial or early-industrial energy levels. As a rough indicator, 20MWH is the annual energy output of c$120-150k of solar panels spread across 600 m2 (model here).

Furthermore, today’s useful energy consumption in the emerging world is only c2x higher than Britain’s in the 1860s. I.e., large parts of the emerging world are in very early stages of industrialization, comparable to where Britain was 150 years ago. Models of global decarbonization must therefore allow energy access to continue rising in the emerging world (charts below), and woe betide any attempt to stop this train.

britain's industrial revolution

(2) Shortages as a driver of transition? One of the great cliches among energy analysts is that we “didn’t emerge from the stone age because we ran out of stone”. In Britain’s case, in fact, the data suggest we did shift from wood to coal combustion as we began to run out of wood.

Wood use and total energy use both declined in the 16th Century, and coal first began ramping up as an alternative heating fuel (charts above). In 1560, Britain’s heating fuel was 70% wood and 30% coal. By 1660, it was 70% coal and 30% wood. This was long before the first coal-fired pumps, machines or locomotives.

This is another reminder that energy transitions tend to occur when incumbent energy sources are under-supplied and highly priced, per our research below. Peak supply tends to precede peak demand, not the other way around.

(3) Energy transition and abolitionism? Amazingly, human labor was the joint-largest source of useful energy around 1600, at c25% of total final energy consumption. But reliance upon human muscle power as a prime mover was bound up in one of the greatest atrocities of human history: the coercion of millions of Africans, slaves and serfs to row in galleys, transport bulk materials and work land.

By the time Britain banned the slave trade in 1807, human muscle power was supplying just 10% of usable energy. By the time of the Abolition Act in 1833, it was closer to 5%.

Some people today feel that unmitigated CO2 emissions are an equally great modern-day evil. On this model, it could be the vast ramp-up of renewable energy that eventually helps to phase out conventional energy. But our current models below do not suggest that renewables can reach sufficient size or scale for this feat until around 2100.

What is also different today is that policy-makers seem intent on banning incumbent energy sources before we have transitioned to alternatives. We have never found a good precedent for bans working in past energy systems, although US Prohibition, from 1920-1933, makes an interesting case study.

britain's industrial revolution

(4) Jevons Paradox states that more efficient energy technologies cause demand to rise (not fall) as better ways of consuming energy simply lead to more consumption.

Hence no major energy source in history has ‘peaked’ in absolute terms. Even biomass and animal traction remain higher in absolute terms than before the industrial revolution, both globally and in our UK data from 1560-1860.

Jevons Paradox is epitomized by the continued emergence of new coal-consuming technologies in the chart below, which in turn stoked the ascent of coal-powered demand, while wood demand was not totally displaced.

The fascinating modern-day parallel is that the increasing supply of renewable electricity technologies may create new demand for electric vehicles, drones, flying cars, smart energy and digitization; rather than simply substituting out fossil fuels.

britain's industrial revolution

(5) Long timeframes. Any analysis of long-term energy markets inevitably concludes that transitions take decades, even centuries. This is visible in the 300-year evolution plotted above, and in the full data-set linked below. Attempts to speed up the transition create the paradox of very high costs or potential bubbles. We have also compiled a helpful guide into transition timings by mapping twenty prior technology transitions. Our recent research, summarized below, covers all of these topics, for further information.


Source: Wrigley, E. A. (2011). Energy and the English Industrial Revolution, Cambridge University Press; TSE estimates. With thanks to the Renewable Energy Foundation for sharing the data-set.

Costs of climate change: solving the paradox?

costs of climate change

Our lowest cost route to an energy transition was spelled out in December-2020 (note below), looking across 90 prior research reports and 270 data-files. It is fully possible to reach ‘net zero’ by 2050. The economic costs ratchet up to $3trn per annum.

However, based on the latest disclosures from the IPCC, we estimate that the unmitigated costs of climate change are only $1.5trn per year. So paradoxically, the energy transition appears to cost 2x more than climate change itself (note below).

What implications? The note above spells out our own solutions to the paradox. However, we also undertook a survey, to understand broader perspectives among decision makers. The results are tabulated below…

(1) Does energy transition matter? 68% of those who participated in our survey agreed or strongly agreed that the energy transition matters. Conversely, 23% disagreed or strongly disagreed. It is interesting to note that support for targeting an energy transition is strong but not universal.

(2) Do economics matter in the energy transition? 90% agreed that economics should be a material consideration in targeting an energy transition. 77% strongly agreed, the most strongly held view in the entire survey. This would suggest there is great importance in understanding which transition technologies will ultimately be economic. We expected a more polarized view here, although there could be some sampling bias at play.

(3) The paradox identified in our recent research is that the $3trn per year costs of achieving net zero appear to exceed the $1.5trn per year costs of unmitigated climate change. 65% agreed that this paradox might seem to challenge the rationale for targeting an energy transition. 23% disagreed. This is a similar split as on question (1).

(4) Could energy transition therefore be becoming a bubble? 74% agreed with this statement. 48% strongly agreed. Only 9% disagreed or strongly disagreed. This is interesting, albeit frightening, support for our own thesis below.

(5) A higher cost of climate change was one solution that we proposed to resolve our paradox above. It is possible, the argument goes, that the costs of climate change are much higher than the $1.5trn per year that we estimated. 42% agreed or strongly agreed that the costs of climate change would likely be higher than we had estimated. 32% disagreed. There is no clear consensus on the unmitigated costs of climate change.

(6) A lower cost of achieving a transition was another solution that we proposed to the paradox. 35% agreed or strongly agreed that the costs of achieving the transition would likely be lower than we had estimated. Again, there is no clear consensus on the cost of delivering the energy transition. Questions (5) and (6) had the highest portion of ‘neutral’ responses of any categories in our survey.

(7) The insurance argument is that it may still be rational to target an energy transition, even if the most likely costs of the transition exceed the most likely costs of unmitigated climate change, because climate change creates tail-risks of very large, catastrophic problems. 48% agreed or strongly agreed with this argument, making it the second most popular resolution to the paradox.

(8) The affordability argument is that it may still be rational to target an energy transition, even if the costs exceed the costs of climate change, because the costs of the former will be borne by wealthier populations in the developed world, while the costs of the latter would be incurred by poorer populations in the emerging world. Only 29% agreed with this argument, making it the least popular argument in the survey.

(9) The optimization argument is that targeting 2C of warming may not be the optimal balance, as the costs appear to outweigh the benefits. But there could be alternative timings, or alternative ceilings on global warming (e.g., 2.5C, 3C, 3.5C ?) where the benefits do outweigh the costs. 58% agreed or strongly agreed with this argument, making it the most popular argument in the survey. It may be interesting to consider how markets would react if a re-optimization of global climate targets ever came onto the political agenda.

(10) The moral argument is that it is morally ‘wrong’ to change the world’s climate, so it does not matter whether the energy transition is economically rational. 35% agreed or strongly agreed with this statement, making it the second least popular argument in the survey.


Notable perspectives and feedback

Notable perspectives were also shared by those taking part in the survey. In order to help understand others’ perspectives, some of the most cogent examples are noted below.

“The climate has always changed and humans have adapted in the past. Low cost adaption plus nature based solutions could go a long way to address the issue. Additionally, we really do not know if the 1.5-2.0 degrees is the tipping point. It was just a number conveniently picked. Reduce carbon and emissions yes, but not to the extent it wrecks the global economy.” — investor

“Difficult to see where rationality will come from, as what began as a worthy goal has become an uber-ideology that will cut any politician who stands in the way to shreds. I don’t think humanity has seen such a pervasive injection of zeal since the Christianization of the Roman Empires.” — investor

“I think the missing piece is the much higher likelihood of war(s) in an adverse climate change scenario. Judging purely by the costs incurred in a (relatively) small conflict like Iraq ($1tn per year), such costs would likely quickly exceed the $1.5tn total estimate. There are further risks to democracies and government stability if climate change results in mass migrations, epidemics, additional wealth inequality, etc. Such risks would also easily “cost” meaningfully more than the $1.5tn estimate implies.” — investor

“Ideally, the energy transition would be guided by the optimal low cost solution. this is unlikely to occur because the politicians will not support it. Furthermore, current politicians who set goals for decades in the future will not be around to enforce the painful restrictions required to achieve the targets”. — energy company strategist

“Paradoxically, we must be as economically rational as possible in execution of something that may be economically irrational! Thus, I find your comparisons of various costs of ton of captured to be extremely powerful” — investor

“Climate change will cost lives while energy transition will save it. Covid has shown that cost should not be the only factor driving policy decisions. So even if the transition is expensive (though I don’t agree with the thesis), it is worth pursuing.” — individual

“There should be a Pareto optimal solution analysis of the costs of mitigation and adaption (the latter of which tends to be ignored) relative to the costs of climate change itself.” — strategy consultant


Energy Transition Paradox Survey

The survey is still open, below, and we welcome additional perspectives…


Transitioning the world to 'net zero' is a crucial objective that should be pursued by decision-makers.
Economics matter to the energy transition. Lower-cost transition pathways are preferable to higher-cost transition pathways.
The costs of transitioning the energy system to 'net zero' by 2050 will likely reach $3trn per year. However, the ultimate costs of unmitigated climate change are likely only $1.5trn per year. If these numbers are correct, then they would challenge the rationale for targeting 'net zero' by 2050?
If achieving 'net zero' by 2050 is economically irrational, then this is another reason to fear many new energy technologies may be becoming bubbles?
Targeting an energy transition is still economically rational, because the costs of climate change will most likely be materially higher than the $1.5trn per annum that Thunder Said Energy has estimated.
Targeting an energy transition is still economically rational, because new technologies are likely to deflate the costs of achieving 'net zero' far below the $3trn per annum that Thunder Said Energy has estimated.
Targeting an energy transition is still economically rational, if this lowers the chances of catastrophic tail risks. Even though these tail risks are unlikely, they could have materially larger impacts than Thunder Said Energy's base case cost estimate of $1.5trn per annum.
Targeting an energy transition is still economically rational, because the $3trn per year costs of energy transition will mostly be borne in wealthier countries (which can more readily afford them), while the $1.5trn per year costs of climate change would mostly be borne in less wealthy countries (which can less readily afford them).
If the costs of achieving the Paris Climate goals, limiting warming to 2C, and reaching 'net zero' by 2050 outweigh the benefits, then this suggests the need to re-optimize climate targets. There could be alternative timings, or alternative ceilings on global warming (e.g., 2.5C, 3C, 3.5C ?) where the benefits do outweigh the costs. Better-optimized climate targets should therefore be explored.
It is morally 'wrong' to change the world's climate. So it does not matter whether the energy transition is economically rational.
We will share the results of the survey with you, when we have gathered enough responses. We will not disclose your email address to anyone else.

Our Top Ten Research Notes of 2020

top research notes of 2020

We published 250 new research notes and data-files on our website in 2020. The purpose of this review is to highlight the ‘top ten’ reports. This includes our economic roadmap to reaching ‘Net Zero’, the greatest risks and opportunities that we have found in the transition, and the analysis that has most shaped our views.


(1) The single most powerful decarbonization option in all of our work is reforestation. Costs are as low as $3-10/ton. There is 15GTpa of carbon-offsetting potential (note here). But this is not an investment. It is an act of charity. It matters because increasing numbers of decision-makers are choosing to restore nature and offset their CO2 at low cost, rather than purchasing higher-cost new energies, which could make them uncompetitive.

(2) Restoring soil carbon is equally powerful, and surprisingly fascinating. Agricultural soil has lost three-quarters of its carbon since pre-industrial times. Restoring it could offset another 3-15GTpa of CO2. With a $30/ton CO2 price, mid-Western farmers could make more money farming carbon than corn. The theme would also disrupt the global fertilizer industry.

(3) Is energy transition becoming a bubble? If you read a single piece of research on energy transition this year, I would recommend this one. We fear an “investment bubble” is forming in the energy transition space. Half of all transition technologies we have evaluated are on the “wrong” side of the cost curve and may be displaced by the nature-based solutions described above. Deflation and profitability are often antagonistic. And some spaces have seen incredible run-ups despite challenging economics and overlooked technical challenges. The purpose of this note is to suggest pragmatic responses.

(4) The green hydrogen economy may be the largest bubble. Our work this year has assessed the theme in detail, both in power markets and as a transportation fuel. Costs are immutably high, due to the laws of physics and thermodynamics. Transporting green hydrogen will also be more challenging than transporting any other commodity in history (note here). The note below is the best overview of our work. Many expect c80% deflation in the total costs of electrolysers. Our data suggest this is impossible. We welcome challenges to these numbers, but so far, have not received any from our contacts in the hydrogen industry.

(5) The green hydrogen bubble will give way to blue. Blue hydrogen is not just a low-carbon fuel. More importantly, it is the most economic and practical route to establishing large-scale carbon capture and storage. Economics are 80-90% superior to green hydrogen. Risks are materially lower. Our research note ends by identifying projects that should reach FID in 2021, and a public company with a clear ‘moat’ in the space.

(6) Non-obvious opportunities. Other novel technologies have a vast role in the route to Net Zero. The non-obvious opportunities are best, and are not at risk of becoming bubbles. Our research has covered many examples in 2020: deep geothermal, supercapacitors eclipsing batteries, industrial efficiency initiatives such as fully subsea offshore or next-generation refining; and backing up renewables with CHPs, PCMs and smart energy systems. If you read one research note into a non-obvious opportunity, we recommend this deep-dive below into additive manufacturing, which will re-shape every industry globally.

(7) Patent analysis can give you an edge identifying opportunities in the energy transition, and avoiding hidden risks, particularly as bubbles build. Our note below lays out six themes, including worked examples, based on reviewing over 1M patents.

(8) Our most economic roadmap to Net Zero ties together all of our work. We find it is possible to decarbonize global energy by 2050, even as global energy demand rises by 65%. The total cost of decarbonization is $50trn, which is almost halved versus last year’s estimate in December-2019. The fully decarbonized energy system still contains 85Mbpd of oil and 375TCF per year of natural gas.

(9) Oil and gas are heading for devastating under-supply if our analysis is correct. This is historically precedented during technology transitions. Below we have evaluated supply-demand and pricing in the whale oil industry from 1805-1905, as it was disrupted by rock oil and later by electric lighting. Whale oil prices outperformed over this timeframe, as supply peaked before demand. Our latest oil market outlook is here, and our gas market outlook is here.

(10) The optimal strategy for incumbent energy companies is thus suggested in our research note below. We argue that Energy Majors embracing these principles can uplift their valuations by c50% (assuming flat commodity prices).

Wet sand: what impacts on shale breakevens and CO2?

wet sand for hydraulic fracturing in shale play

A fully decarbonized energy system may still require 85Mbpd of oil and 375TCF of gas. Hence a focus of our research is to find technologies that can improve the efficiency and lower the CO2 intensity of oil and gas production. This note profiles the exciting new prospect of ‘wet sand’ for hydraulic fracturing in shale plays. It can reduce breakeven costs by up to $1/bbl and CO2 intensity by up to 0.6kg/bbl.


Wet sand is defined as having a moisture content between 1% and 10% by weight. This contrasts with the 43MTpa of dry sand supplied in the Permian in 2018, where all the moisture has been driven off in a kiln, shortly after the washing process.

Wet sand has reached technical maturity. In a December-2020 technical paper, PropX described a patent-pending process to “screen, transport, deliver and meter wet sand from local mines to the frac blender, bypassing the drying process altogether”.

A full trial of wet sand has also been undertaken at a 10-well pad in Oklahoma, implied to have been operated by Ovintiv. Over 4Mlbs of sand per day was delivered and pumped downhole. The system was reliable and consistently flowed wet sand with 4-8% moisture content.

The advantages: cost and CO2 savings?

Cost savings from pumping wet sand are estimated at $2-10/ton. The largest component is potential capex savings, as the kiln usually comprises $20-50M (including associated drying, storage and conveyance) out of a $180M total budget at a 2.5MTpa sand mine. A second saving is in opex, as labor costs average $5/ton across 15 surveyed sand mines (chart below), and the drying unit requires one-third of the labor force. Finally, there are fuel savings, likely around $1/ton.

After modelling the economics below, our base case estimate is that a greenfield sand mine can lower its total production costs by $5/ton, with a shift from dry to wet sand.

To test the economic impacts of a $5/ton reduction in sand costs, we turn to our economic model below. At a large, sand-intensive well, we estimate $0.1M of potential savings. This flows through to a $0.5/bbl reduction in the well’s NPV10 breakeven. However, the savings will be lower at industry average wells, which only consume 10-20M lbs of proppant.

CO2 savings are realized by cutting out fuel demand in the drying kilns at sand mines, typically at 0.3-0.4mmbtu per ton of dried sand, requiring 0.4mcf of gas, whose combustion would release 20kg of CO2. Again, we can run the savings through our models (below) and estimate CO2 could be reduced by up to 0.6kg/boe, which is not bad against a baseline of 26kg/boe of total upstream Scope 1 and Scope 2.
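The arithmetic can be sanity-checked as below, using the figures above plus our own assumed well parameters; the proppant load and recovery per well are illustrative assumptions, not data from the technical paper.

```python
# Back-of-envelope check on the CO2 saving: kiln fuel of c0.35mmbtu per ton of
# sand, at c53kg CO2 per mmbtu of gas burned, then spread over an assumed
# well. Proppant load and recovery below are our own illustrative assumptions.
KILN_MMBTU_PER_TON = 0.35        # midpoint of the 0.3-0.4mmbtu/ton above
CO2_KG_PER_MMBTU_GAS = 53.0      # approximate natural gas combustion factor

proppant_lbs = 20e6              # assumed sand-intensive well
recovery_boe = 400e3             # assumed recovery per well (hypothetical)

sand_tons = proppant_lbs / 2204.6
co2_saved_kg = sand_tons * KILN_MMBTU_PER_TON * CO2_KG_PER_MMBTU_GAS
print(f"{co2_saved_kg / sand_tons:.0f}kg CO2 saved per ton of sand")  # c19kg
print(f"{co2_saved_kg / recovery_boe:.2f}kg CO2 saved per boe")       # c0.4kg
```

Under these assumptions the saving lands around 0.4kg/boe, consistent with the up-to-0.6kg/boe figure quoted above.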

HSE advantages are also noted in the technical paper. Fugitive silica dust is materially reduced, as wet sand particles adhere to one another. This helps meet OSHA’s 2016 silica exposure limits, below 50µg/m3 of air, averaged over an 8-hour shift.

The challenges: is wet sand more difficult to pump?

The challenge of wet sand is that wet sand grains cohere to one another, which impedes their smooth flow and can cause sand to “clump together”. In cold climates (but probably not Texas!) the moisture can also freeze. Hence, PropX notes three areas where it has needed to innovate the sand supply chain.

Last-mile. It is recommended to use containers (rather than trailers) to transport sand to well-sites. These can be lifted from trucks onto the wellsite with “the fewest touch points and the least modification”. A typical container system carries 23,000-27,000lbs of sand, with a capacity of 28,500 lbs. PropX has matched these volumes for wet sand, by enlarging the container opening (from 20” diameter to 6’x6’) and covering it with a tarp, as is widely used in the transportation of agricultural products.

Emptying containers is easy with dry sand, which flows naturally and empties in c60 seconds. Wet sand containers need to be emptied into a blender hopper, which likely takes 120 seconds, via a 100bpm slurry rate carrying 2.7ppg of sand. PropX has undertaken successful trials placing the sand containers on a vibration table, with a sloped discharge cone, silicone inserts to lower friction and a larger discharge exit gate.

Sand delivery to the well must occur at a rate of up to 16,800 lbs of sand per minute, comprising 80-100bpm of slurry carrying 0.5-4.0ppg of sand. A unique belt has been designed which can carry up to 7.9ppg at up to 100bpm. It includes a metering system and screwless surge hopper, also on vibrating tables, to enable accurate, reliable and continuous pumping of wet sand without the risk of bridging.
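These delivery rates reconcile arithmetically: slurry rate in barrels per minute, times 42 gallons per barrel, times the sand concentration in pounds per gallon (ppg), as in the quick check below.

```python
# Quick check on the quoted delivery rate: 100bpm of slurry at 4.0ppg of sand
# equals the 16,800 lbs of sand per minute cited above.
GALLONS_PER_BARREL = 42

def sand_rate_lbs_per_min(slurry_bpm: float, sand_ppg: float) -> float:
    return slurry_bpm * GALLONS_PER_BARREL * sand_ppg

print(sand_rate_lbs_per_min(100, 4.0))  # 16800.0
```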

Sand has undergone huge changes in the past, which suggests this supply chain is not ossified. For example, back in 2017, 90% of sand was shipped by rail to the Permian from Minnesota, Illinois and Wisconsin. Today there are 50 “local” sand mines in the Permian basin, with 107MTpa of capacity. This has reduced transload costs, and allowed sand pricing to run as low as $20/ton recently. Further deflation may lie ahead.

Why it matters: deflation and CO2 reduction?

A fully decarbonized energy system may still require 85Mbpd of oil and 375TCF of gas, as per the conclusion of our research to date. Hence a focus of our research is to find technologies that can improve the efficiency and lower the CO2 intensity of oil and gas production.

We still see great productivity enhancements ahead for the shale industry, after reviewing over 1,000 technical papers. A 5% CAGR is possible from 2019’s baseline.

There is also great potential for shale to lower its CO2 intensity, potentially towards zero (Scope 1 and 2 basis), as argued in our recent research (below). The potential is further enhanced by using waste water to cultivate nature based solutions (also below).

Shale thus sets the marginal cost in oil markets, as our numbers require 2.5Mbpd of shale growth each year from 2022-2025 (models below).

https://thundersaidenergy.com/downloads/2020-oil-markets-bounding-the-uncertainty/

But nearer-term we see risks that sentiment will sour around shale capex, while productivity could temporarily disappoint during the COVID recovery.

Bio-engineer plants to absorb more CO2?

Bio-engineer plants to absorb more CO2?

Our roadmap towards ‘net zero’ requires 20-30GTpa of carbon offsets using nature based solutions, including reforestation and soil carbon. This short note considers whether the task could be facilitated by bio-engineering plants to sequester more CO2. We find exciting ambitions, and promising pilots, but the space is not yet investable.


What is bio-engineering? In 2016, scientists at DuPont gene-edited maize to grow more effectively in dry conditions. In 2017, researchers at the University of Oxford introduced a maize gene into rice plants, to increase the number of photosynthetic chloroplasts surrounding leaf veins. In 2019, scientists at Huazhong Agricultural University gene-edited rice to tolerate higher soil salinity. These are examples of bio-engineering: modifying the genetic code of plants for practical purposes.

How could it help? The world’s land plants absorb 123GTpa of carbon each year through photosynthesis. 120GTpa is re-released through respiration and decomposition. The result is a net sink of 3GTpa. For contrast, total anthropogenic carbon emissions are 12GTpa. It follows that small changes in the natural carbon cycle could materially shift carbon balances, per our climate model below.
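A simple worked example shows the leverage in these numbers: a hypothetical 5% uplift in photosynthesis, with respiration unchanged, would triple the natural land sink.

```python
# Worked example of the leverage in the natural carbon cycle, using the flux
# figures above. The 5% photosynthesis uplift is purely hypothetical.
PHOTOSYNTHESIS_GTPA = 123.0   # carbon absorbed by land plants each year
RESPIRATION_GTPA = 120.0      # carbon re-released each year
ANTHROPOGENIC_GTPA = 12.0     # total anthropogenic carbon emissions

base_sink = PHOTOSYNTHESIS_GTPA - RESPIRATION_GTPA            # 3 GTpa
boosted_sink = PHOTOSYNTHESIS_GTPA * 1.05 - RESPIRATION_GTPA  # 9.15 GTpa
print(f"Sink rises from {base_sink:.1f} to {boosted_sink:.2f} GTpa, "
      f"vs {ANTHROPOGENIC_GTPA:.0f} GTpa of anthropogenic emissions")
```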

The limitations of photosynthesis. Photosynthesis uses sunlight to convert CO2 into plant-sugars. It is only 1-5% efficient, suggesting great potential for improvement. It is also vastly complex, comprising over 170 separate sub-stages. Amidst the complexity, RuBisCO is the most crucial limitation.

The limitations of RuBisCO. RuBisCO is an enzyme that catalyzes the reaction between CO2 and RuBP during photosynthesis. However, the RuBisCO enzyme is imprecise. It evolved at a time when the world’s atmosphere contained much lower oxygen concentrations. Unfortunately, under present atmospheric conditions, 20-35% of RuBisCO’s catalytic activity reacts O2 with RuBP, instead of CO2. The resultant products cannot continue their biochemical journey into becoming sugars. Instead, they are broken down in the process of photorespiration. Photorespiration uses up c30% of the total energy fixed by photosynthesis, and re-releases CO2 into the atmosphere. Photorespiration lowers agricultural yields by 20-40%.

What if RuBisCO could be helped to fix more CO2 and less oxygen? One way to do this is to increase the atmospheric concentration of CO2 in greenhouses, which can increase crop yields by c30%, per our note below. Another way is through bio-engineering.

Realizing Increased Photosynthetic Efficiency (RIPE) is a research institute funded by the Bill and Melinda Gates Foundation, UK foreign aid, the USDA and academic institutions. It aims to generate higher crop yields per unit of land, using bioscience. After ten years of research, RIPE has recently modified tobacco plants with genes from green algae and pumpkin plants to reduce the energy penalties from photorespiration. The result is that these modified tobacco plants grew 40% larger. A follow-up study may achieve plants that are 60% larger. Similar modifications are also being tested on soybeans and cowpea plants.

Researchers at the University of Wurzburg have also modelled metabolic pathways that may increase the photosynthetic efficiency of plants, potentially by as much as 5x, with results published in 2020. The work uses synthetic CO2-fixating carboxylases, RuBisCO from cyanobacteria, and additional methods of preventing fixed CO2 from being re-released. Experiments are planned to test the work in tobacco plants and thale cress.

Increasing photosynthetic efficiency and crop yields could be a crucial help, lowering the land intensity of crop production, which covers 1.7bn hectares of the globe today (data below). For comparison, our target of 15GTpa of reforestation will require 1.2bn hectares of land, hence any material reductions in cropland requirements will be helpful.

Sequestering more of the CO2. 50-95% of the carbon that is stored in natural eco-systems is not stored in biomass above ground, but in the soil. An emerging set of agricultural practices that restore soil carbon are explored in our research note below. But another option is to ‘program’ plants to grow deeper, larger roots, which push more carbon into soils.

The Land Institute in Salina, Kansas has developed a grain called Kernza. It is derived from an ancestor of wheat. It is perennial, rather than requiring yearly replanting. Its roots reach 3-6x further into the soil than conventional wheat, which connotes 3-6x more carbon storage, and also promotes drought resistance. It is being grown across 2,000 acres today.

The US Department of Energy also has a Laboratory of Environmental Molecular Sciences, aiming to increase carbon transfer into the soil. One team has developed a strain of rice that emits less methane, as it contains a gene from barley, reducing the carbon that the plant moves underground, which in turn reduces the carbon that can be metabolized by anaerobic bacteria. Studies are underway to reverse the process and increase the carbon that crops move underground.

The Salk Institute for Biological Studies is based in La Jolla, California. It is undertaking the most elaborate program to bioengineer crops and other plants, to sequester up to 20x more CO2 than conventional crops. Deploying these plants across 6% of the world’s agricultural lands is said to potentially offset 50% of global CO2 emissions.

Salk’s Harnessing Plants Initiative started in 2017 and aims to grow โ€œideal plantsโ€ with greater efficiency at pulling CO2 from the air, deeper roots that store more carbon underground, and other superior agricultural properties. One pathway is to promote production of suberin, the carbon-rich polymer in cork (but also found in melon rinds, avocado skins and plant roots). This is a waxy, water-resistant compound that degrades very slowly, thus remaining in the soil for centuries.

In 2019, Salk’s team discovered a gene that determines whether roots will grow shallow or deep. It is called EXOCYST70A3, and affects the distribution of the PIN4 protein. PIN4 modulates the transport of auxin, a hormone that regulates root architecture. Different alleles of EXOCYST70A3 can increase root depth and plant resistance.

Technical readiness is the challenge for all of the bio-engineering methods discussed above. We generally begin integrating technologies into our models (first with high risking, later with lower risking) once they have surpassed TRL7. No bio-engineering method is there yet. Salk received a $35M grant in 2019 to accelerate its work, but prototype crop variants (corn, soybean, rice) are still not foreseen for another five years. More pessimistically, scientists at RIPE have said it could take 15 years to deploy enhanced crops in the field. So while we will track this technology, it is not yet moving our models.

The Amazon tipping point theory?

The Amazon tipping point theory

The Amazon tipping point theory postulates that another 2-10% deforestation could make the world’s largest tropical rainforest too dry to sustain itself. Thus the Amazon would turn into a savanna, releasing 80GT of carbon into the atmosphere, single-handedly inflating atmospheric CO2 by 40ppm (to well above the 450ppm limit for 2C warming). This matters as Amazon deforestation rates have already doubled under Jair Bolsonaro’s presidency. This note explores implications, including international tensions, divestments, prioritization in a Biden presidency, and consequences for other transition technologies.


Global deforestation remains the single largest contributor to CO2e-emissions induced by man’s activities, more than the emissions from all passenger cars; and destruction of nature remains the largest overall contributor, more than all of China (chart below). This note is about a particularly worrying feedback loop in the Amazon rainforest, which could single-handedly wipe out the world’s remaining CO2 budget, effectively negating the impact of all other climate policies globally.

What is the Amazon tipping point theory?

The Amazon rainforest currently covers 5.5M square kilometers, comprising the largest, contiguous tropical forest in the world. 50% is in Brazil, and the remainder is spread around Peru, Colombia and half-a-dozen other South American countries. It contains 20% of all the planet’s plant and animal species, including 40,000 plant species alone.

Deforestation of the Amazon has reached 15-17% of its original area overall, and around 19% in Brazil. 800,000 square kilometers have been lost to-date (a land area equivalent to 2x California, or France plus Germany). Brazil’s annual deforestation rate averaged 20,000 square kilometers per year from 1990-2004 (the land area of New Jersey or Slovenia). But the rate slowed to a trough of 5,000 square kilometers per year in 2014, due to improving environmental policies.

Unfortunately, more recently, Brazil’s deforestation rate has re-doubled (chart below). Jair Bolsonaro’s Presidency began in January-2019, following campaign pledges to ease environmental and land use regulations (which require 80% of legal Amazon land holdings to remain uncleared). Violations of these regulations are now said to be going unpunished. Bans on planting sugarcane in the Amazon have been lifted. Bolsonaro has even repudiated data published by Brazil’s own government agencies showing deforestation rates rising, and accused the actor and environmentalist Leonardo DiCaprio of starting wildfires!

This matters because of the hydrology of the Amazon. Water in the basin tends to move from East to West. Each molecule typically falls as rainfall six times. It is repeatedly taken up by trees, transpired back into the atmosphere, and precipitated back down to Earth. Over half of the rain falling in the Amazon has originated from trees in the Amazon. It is a self-sustaining feedback loop.

The Amazon Tipping Point theory predicts that below some critical level of forest cover, this self-sustaining feedback loop will break. Less rainforest means less transpiration. Less transpiration means less rainfall. Less rainfall means less rainforest. Specifically, converting each hectare of forest to cropland reduces regional precipitation by 0.5M liters/year.

After the tipping point it is feared that the basin will transition into a savanna or scrubland. 50-100% of the forest cover would die back.

Unfortunately, this is not a ‘fringe’ theory. Many different technical papers acknowledge and model the risk, although specific climate models are imprecise, and do not always agree on timings and magnitudes. For example, the Western Amazon, closer to the Andes, might retain more forests than the East and Central parts of the basin. Another uncertainty is the moderating impacts of fire, as dryer forests will be more flammable, and thus more susceptible to slash-and-burn clearances, while raging fires will also reach further.

When is the tipping point? Various technical papers have estimated that the Amazon tipping point occurs when 20-25% of the forest has been cleared. This is an additional 2-10% from today’s levels, equivalent to deforesting another 100-600k square kilometers, which could happen within 2-30 years.

What carbon stock is at risk of being released?

A typical forest contains around 300T of carbon per hectare (chart below). Thus 5.5M square kilometers of the Amazon is expected to contain 165GT of carbon. About 40% of the carbon is usually stored in trees (estimated at 60-80GT in the Amazon) and 60% is stored in roots and soils, which degrades more slowly. Hence, if just half of the remaining Amazon disappears, this would slowly release c80GT of carbon into the atmosphere.

Each billion tons (GT) of carbon released into the atmosphere is equivalent to raising atmospheric CO2 by around 0.5ppm. Hence an 80GT carbon release from the Amazon would by itself raise atmospheric CO2 from 415ppm today to around 455ppm. This single change (notwithstanding the continued and unmitigated burning of fossil fuels) would tip the world above the 450ppm threshold needed to keep global warming to an estimated 2-degrees (climate model below).
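The arithmetic is easy to reproduce:

```python
# Reproducing the arithmetic above: 80GT of carbon at c0.5ppm per GT lifts
# atmospheric CO2 from c415ppm to c455ppm, above the 450ppm ceiling for 2C.
PPM_PER_GT_CARBON = 0.5
release_gt = 80                     # c.half the Amazon's 165GT carbon stock
print(415 + release_gt * PPM_PER_GT_CARBON)   # 455.0 ppm
```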

Can the tipping point be averted?

The solution to Amazon tipping points is technically simple: stop burning down forests and start re-planting them. This does not require electrolysing water molecules into hydrogen, smoothing volatility in renewable-heavy grids, or developing next-generation batteries. It requires something much harder: international diplomacy.

Inflammatory statements? In September-2019, Bolsonaro defended his environmental policies in a speech at the UN General Assembly. International critics were accused of assaulting Brazil’s sovereignty. Brazil considers itself free to prioritize economic development over environment.

Forest for ransom? In the past, Western countries have actually paid Brazil to safeguard its rainforests, although this arrangement has now fallen apart. Specifically, the ‘Amazon Fund’ was created in 2008. It is managed by Brazil’s state-owned development bank, BNDES. $1.3bn has been donated to the fund, from Norway (94%), Germany (5%) and Petrobras (1%). But after taking office, Bolsonaro packed the fund’s steering committee with members of his inner circle, and in May-2019, he started using the Fund to compensate land developers whose lands were confiscated for environmental violations. Hence Norway and Germany suspended fund payments.

Divestment and trade tensions? As Brazil’s stance on the Amazon has grown more confrontational, it is possible that decision-makers may distance themselves from the country. Global investment funds have threatened to divest. (Could Brazil even surpass the coal industry as the divestment movement’s whipping boy?). Multi-national corporations may also be more cautious around investing in the country (but probably at the margin). Finally, Amazon deforestation is said to endanger future trade deals.

The Biden Factor? President-elect Biden may also seek to influence the Amazon issue. Biden stated the world should collectively offer Brazil $20bn to stop Amazon deforestation and threaten economic consequences for refusing. An executive order re-entering the Paris Climate Agreement would also help the situation (Brazil had actually committed to restoring 12M hectares of native vegetation under the accord). It will be interesting to see how Biden balances climate-focused priorities in the US with this arguably more urgent issue abroad.

Crucial Conclusions? If the Amazon surpasses its tipping point, there would be no chance of limiting atmospheric CO2 to 450ppm or preventing a catastrophic loss of biodiversity. Diplomacy is difficult. But fortunately, decision-makers can take measures into their own hands. Our note below profiles tree-planting charities. This is the lowest-cost decarbonization option we have found in all of our research. It restores nature, including the Amazon. Ultimately, we have argued that restoring nature may be the most practical route to achieving climate objectives, while ‘bursting the bubble’ of other transition technologies.

Shale productivity: snakes and ladders?

shale productivity data

Unprecedented high-grading is now occurring in the US shale industry, amidst challenging industry conditions. This means 2020-21 production is surprising to the upside, and we raise our forecasts by +0.7 and +0.9Mbpd respectively. Conversely, when shale activity recovers, productivity may disappoint, and we lower our 2022+ forecasts by 0.2-0.9Mbpd. This 7-page note explores shale productivity data, and the causes and consequences of this whipsaw effect.

Biden presidency: our top ten research reports?

energy transition during Joe Biden's presidency

Joe Biden’s presidency will prioritize energy transition among its top four focus areas. Below we present our top ten pieces of research that gain increasing importance as the new landscape unfolds. We are cautious that aggressive subsidies may stoke bubbles and supply shortages in the mid-2020s. Decision-makers will become more discerning of CO2. As usual, we focus on non-obvious opportunities.


(1) Kingmaker? There are two policy routes to accelerate the energy transition. An escalating CO2 tax could decarbonize the entire US by 2050, for a total abatement cost of $75/ton, while unlocking $3.5trn of investment. The other approach is with subsidies. This is likely to be Biden’s preferred approach. However, giving subsidies to a select few technologies tends to crowd out progress elsewhere. Who gets the subsidies is arbitrary, and a snake-pit of lobbying thus ensues. It is also more expensive, with some subsidies today costing $300-600/ton. Finally, subsidies will only achieve limited decarbonization on our models. Our 14-page note outlines these ideas and backs them up with data, to help you understand the policy landscape we are entering.

(2) Bubbles? The most direct risk of aggressive subsidies is that we fear they will stoke bubbles in the energy transition. Specifically, we have argued a frightening resemblance is emerging between notorious historical investment bubbles (from Dutch tulips to DotCom stocks) and many of the best-known decarbonization themes today. The resemblance is driven by an expectation that government policies will grow ever more favorable, so technical and economic challenges are being overlooked. Our 19-page note evaluates the warning signs, theme by theme, to help you understand where bubbles may be likely to build and later burst.

(3) Overbuilding renewables is a potential bubble. Our sense is that Biden’s policy team prefers to subsidize renewables today and defer the resultant volatility issues until later. But eventually, we model that this will make power grids more expensive and more volatile, with negative consequences for both consumers and industrial competitiveness. More interestingly, we find that expensive and volatile grids have historically motivated installations of combined heat and power (CHP) systems behind the meter, which can also cut CO2 emissions by 6-30% compared with buying power from the grid, at 20-30% IRRs. The reason is that CHPs capture and use waste heat, so they achieve c70-80% thermal efficiencies, where simple-cycle gas turbines only achieve c40%. The theme and opportunity are explored in our 17-page note below.
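As a rough illustration of the CHP arithmetic, the sketch below compares the CO2 of buying grid power and raising heat in a standalone boiler against a gas CHP meeting the same loads. The emission factors, efficiencies and site loads are illustrative assumptions, not outputs of our models.

```python
# Sketch: why CHP can cut CO2 vs buying grid power and raising heat separately.
# All inputs are illustrative assumptions.

GAS_CO2_KG_PER_KWH_TH = 0.20   # kg CO2 per kWh of gas burned (assumption, c0.18-0.20)
GRID_CO2_KG_PER_KWH = 0.40     # kg CO2 per kWh of grid power (assumption; varies by grid)
BOILER_EFF = 0.85              # efficiency of a standalone gas boiler (assumption)

def separate_supply(elec_kwh, heat_kwh):
    """CO2 from buying grid power and raising heat in a gas boiler."""
    return elec_kwh * GRID_CO2_KG_PER_KWH + (heat_kwh / BOILER_EFF) * GAS_CO2_KG_PER_KWH_TH

def chp_supply(elec_kwh, heat_kwh, elec_eff=0.35, total_eff=0.75):
    """CO2 from a gas CHP: c35% of fuel to power, c40% recovered as heat (c75% total)."""
    fuel_for_power = elec_kwh / elec_eff               # kWh of gas burned for the electricity
    heat_recovered = fuel_for_power * (total_eff - elec_eff)
    top_up_heat = max(0.0, heat_kwh - heat_recovered)  # boiler covers any heat shortfall
    return (fuel_for_power + top_up_heat / BOILER_EFF) * GAS_CO2_KG_PER_KWH_TH

elec, heat = 100.0, 110.0  # illustrative site loads, kWh
base, chp = separate_supply(elec, heat), chp_supply(elec, heat)
print(f"separate: {base:.0f} kg CO2, CHP: {chp:.0f} kg CO2, saving: {100*(base-chp)/base:.0f}%")
```

On these assumed inputs the saving comes out at c13%, within the 6-30% range above; the saving widens on dirtier grids and for sites with large, steady heat loads.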

(4) Over-building electric vehicles? Subsidies for EVs are also more likely under a Biden presidency. This is widely expected to destroy fossil fuel demand. Indeed, a vast scale-up of EVs is embedded in our oil demand forecasts, helping global oil demand to peak in 2023. However, our 13-page note finds this electric vehicle ramp-up will actually increase net fossil fuel demand by +0.7Mboed from 2020-35, with gains in gas exceeding losses of oil. The reason is that manufacturing each EV battery consumes 3.7x more energy than the EV displaces each year, creating an energy deficit in the early years. And because EV sales are growing exponentially, the energy cost of manufacturing ever more EVs each year outweighs the energy savings from running previous years’ EVs, until the EV sales rate plateaus.
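This deficit logic can be shown with a toy model. The 3.7x manufacturing multiple is from the note above; the 30% sales growth rate and the year-10 plateau are illustrative assumptions, not our forecasts.

```python
# Toy model of the EV energy-deficit logic: building each EV battery consumes
# c3.7x the energy that the EV then displaces per year (per the note above).
# Growth rate and plateau year are illustrative assumptions.

MFG_MULTIPLE = 3.7    # battery manufacturing energy, in years of one EV's annual savings
GROWTH = 0.30         # assumed annual sales growth while the ramp lasts
PLATEAU_YEAR = 10     # assumed year in which sales stop growing

sales, fleet = 1.0, 0.0   # normalized units of EVs
for year in range(1, 16):
    mfg_energy = sales * MFG_MULTIPLE   # energy to build this year's batteries
    savings = fleet * 1.0               # energy displaced by EVs already on the road
    print(f"year {year:2d}: fleet savings {savings:6.1f} vs manufacturing {mfg_energy:6.1f}"
          f" -> net {savings - mfg_energy:+6.1f}")
    fleet += sales                      # this year's cohort joins the fleet
    if year < PLATEAU_YEAR:
        sales *= 1 + GROWTH             # exponential sales growth until the plateau
```

While sales compound at 30%, the fleet never exceeds c3.3 years’ worth of sales, which is below the 3.7x manufacturing multiple, so the net balance stays negative; it flips positive the year after sales plateau.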

(5) Under-investment in fossil fuels? A sticking point in the presidential debates was whether President Biden would ban fracking. His response, that instead “we need a transition”, showed an impressive understanding of the energy industry. However, some commentators have continued fearmongering, which we think is overdone. Nevertheless, at the margin, Biden’s presidency may reduce investment appetite for oil and gas. In turn, this would exacerbate the shortages we are modelling in the 2020s. A historical analogy is explored in our 8-page note, which looks back at whale oil, a barbaric lighting fuel from the 19th century. Amidst the transition to kerosene and electric lighting, whale oil supply peaked long before whale oil demand, causing strong price performance for whale oil itself, and very strong price performance for by-products such as whale bone.

(6) Under-investment in oil? Our oil market outlook for 2021-25 is published below. New changes include downward revisions to US shale supplies (particularly from 2022), increased chances of production returning in Iran, and increased production from Saudi Arabia and Russia to compensate for lower output in the US. Steep under-supply is seen in 2022, of over 1Mbpd, even after OPEC has exited all of its production cuts. Restoring market balance in 2024-25 requires incentivizing an 8Mbpd shale scale-up. We do not believe Biden’s policies will block this shale ramp, but they may cause its incentive costs to re-inflate by c$5-15/bbl, particularly if Trump-era tax breaks are reversed.

https://thundersaidenergy.com/downloads/2020-oil-markets-bounding-the-uncertainty/

(7) Under-investment in gas? Where US shale growth slows, there is clearly going to be less associated gas available to feed US LNG facilities. But there may also be a lower investment appetite to construct US LNG facilities. This matters because our 12-page note below finds gas shortages are likely to be a bottleneck on decarbonization in Europe, which compounds our fears that Europe’s own decarbonization objectives could need to be walked back. Specifically, Europe must attract another 85MTpa of global LNG supplies before 2030 to meet the targets shown on the chart. This is one-third of the 240MTpa risked LNG supply growth due to occur in the 2020s, of which 100MTpa is slated to come from the United States. There is no change to our numbers yet.

(8) Lower carbon beats higher carbon? We are not fearmongering that oil and gas investment will stall under a Biden presidency. But we do believe that investment in all carbon-intensive sectors will proceed somewhat more discerningly than it would have under Trump. Low-carbon producers will be more advantaged in attracting capital, while higher-carbon producers will be penalized with higher capital costs and lower multiples. To help you rank different operators, we have assembled a data-file covering 13Mboed of production from major US basins, operator-by-operator (below and here), alongside our broader screens of CO2 intensity, which span 30 different sectors, such as LNG plants, refineries, chemical facilities, cement and biofuels (here).

(9) Mitigating methane? Biden’s presidency will likely re-strengthen the EPA. Our hope is that this will accelerate the industry’s assault on leaking methane, which is a 25-120x more powerful greenhouse gas than CO2. Methane accounts for 25-30% of all man-made warming, of which c25% derives from the oil and gas industry. If 3.5% of gas is leaked across the value chain, then gas is debatably no greener than coal (the US leakage rate is below 1%, but can still be greatly improved). Our 23-page note evaluates the best emerging technology options to mitigate methane. We are excited by replacing high-bleed pneumatics, as profiled in our short follow-up note (also below). We also see shale operators accelerating their quest for ‘CO2-neutral’ production (note below).
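As a rough sketch of that breakeven, the snippet below scans for the leak rate at which gas’s CO2e per unit of thermal energy matches coal’s. The emission factors and heating value are rounded public figures, and the two GWP endpoints sit within the 25-120x range cited above; this is an illustration of the arithmetic, not our full model.

```python
# Sketch of the gas-vs-coal breakeven: at what methane leak rate does gas lose
# its CO2 advantage over coal? Inputs are rounded public figures / assumptions.

COAL_CO2 = 0.34        # kg CO2 per kWh of thermal energy from coal (assumption)
GAS_CO2 = 0.20         # kg CO2 per kWh of thermal energy from gas (assumption)
CH4_KWH_PER_KG = 13.9  # lower heating value of methane, kWh per kg

def gas_co2e(leak_rate, gwp):
    """CO2e per kWh of delivered gas, counting combustion plus upstream leaks."""
    leaked_kg = (leak_rate / (1 - leak_rate)) / CH4_KWH_PER_KG  # kg CH4 leaked per kWh delivered
    return GAS_CO2 + leaked_kg * gwp

# Note: this compares per kWh of heat; gas power plants' higher efficiency
# than coal plants would push the breakeven leak rate higher still.
for gwp in (34, 86):   # c100-year and c20-year global warming potentials of methane
    leak = next(x / 1000 for x in range(1, 200) if gas_co2e(x / 1000, gwp) >= COAL_CO2)
    print(f"GWP {gwp:3d}: gas emissions match coal at ~{leak:.1%} methane leakage")
```

On these assumptions, the breakeven falls at c2% leakage on a 20-year GWP and c5% on a 100-year GWP, bracketing the 3.5% figure cited above.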

(10) The weatherization of 2M homes is a central part of Joe Biden’s proposed energy policy. Hence we have created a data-file assessing the costs and benefits of different options. The most cost-effective way to lower home heating bills is smart thermostats, which can cut energy use by c18%. Leading providers include Nest (Google), Honeywell, Emerson and Ecobee. The second most cost-effective option is sealing air leaks; GE Sealants is #1 by market share in silicone sealants. Advanced plastics would also see a modest boost in demand. More questionable are large and expensive construction projects, which appear to have high up-front costs and high abatement costs per ton of CO2.

Paulownia tomentosa: the miracle tree?


The ‘Empress Tree’ has been highlighted as a miracle solution to climate change, with potential to absorb 10x more CO2 than other tree species, while its strong, light-weight timber is prized as the “aluminium of woods”. This note investigates Empress Tree CO2 uptake. There is clear room to optimise nature-based solutions. But there may be risks for the Empress.


Nature-based solutions to climate change represent the largest and lowest-cost opportunity in the energy transition. Those who follow our research will know that we see potential to offset 15-30bn tons of CO2 emissions per year via this route (summary below).

The costs are incredibly low, at $3-10/ton, when reforestation efforts are well structured through reputable tree-planting charities (note below). Hence we argue that restoring nature will push higher-cost energy technologies off the cost curve.

Broadly, our reforestation numbers assume that 3bn acres could be re-planted, absorbing 5T of CO2 per acre per year (hence c15bn tons per year in aggregate), which is the average across dozens of technical papers for typical deciduous forests in the Northern hemisphere (data-file below).

There are further optimisation opportunities to capture around 10T of CO2 per acre per year using faster-growing tree species, such as poplar, eucalyptus and mangrove. However, some commentators claim that another tree genus, known as Paulownia, can achieve an incredible 103T of CO2 offsets per acre per year.

If 100T/acre/year were possible, it would be a game-changer for the potential of reforestation. In principle, it would require only 0.2 acres of Paulownia to offset the 20Tpa CO2 emissions of the average American. For comparison, the population density of the Lower 48 equates to around 6 acres per American.
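For what it is worth, the acreage arithmetic is a few lines, using only the figures already quoted in this note:

```python
# Acres of forest needed per American, per the figures cited in this note.
US_PER_CAPITA_CO2_T = 20     # tons of CO2 per American per year
PAULOWNIA_T_PER_ACRE = 103   # the claimed Paulownia uptake, T/acre/yr
TYPICAL_T_PER_ACRE = 5       # typical deciduous forest, T/acre/yr
LOWER_48_ACRES_PER_CAPITA = 6

print(f"claimed Paulownia: {US_PER_CAPITA_CO2_T / PAULOWNIA_T_PER_ACRE:.1f} acres per American")
print(f"typical deciduous: {US_PER_CAPITA_CO2_T / TYPICAL_T_PER_ACRE:.1f} acres per American")
print(f"land available: ~{LOWER_48_ACRES_PER_CAPITA} acres per American in the Lower 48")
```

A typical forest would consume two-thirds of the available 6 acres per American; the claimed Paulownia rate would consume just 3% of it, which is why the 103T figure matters so much.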

Paulownia: the miracle tree?

What is Paulownia? Paulownia is a tree genus, named after Princess Anna Pavlovna, daughter of Tsar Paul I of Russia (1754-1801). It has at least 6 species, of which Paulownia tomentosa is the fastest-growing “miracle” variety. This species also goes by the names: Empress Tree, Princess Tree and Kiri (Japanese).

Paulownia tomentosa can grow by a remarkable 6 meters in one year and reach 27m in height. It then adds 3-4cm of diameter to its trunk each year. It is shown below towering over the other plants in a garden (here, at about 1.5 years old).

One reason for these remarkable growth rates is that Paulownia is a C4 plant. This photosynthetic pathway produces more leaf sugar, especially in warm conditions. By contrast, most other trees are C3 plants, fixing CO2 using the Rubisco enzyme, which is not saturated (creating inefficiency) and not specific (so it also wastes energy fixing oxygen). Paulownia’s leaves are also very large, helping it absorb more light. It also simply appears to have a faster metabolism than other species. And finally, its wood is 30-40% less dense than that of other species, allowing it to accumulate a large size quickly.

Other Advantages?

Paulownia’s timber is highly prized and sometimes termed the “aluminium of woods”. It is light, at 300kg/m3 (oak is 540kg/m3), yet 30% stronger than pine. It does not warp, crack or twist. It is naturally water- and fire-resistant. When used in flooring, it is also less slippery and softer than other woods (which is noted as advantageous for those prone to falling over). The wood is also suited to making furniture and musical instruments.

Pollutants are well absorbed by Paulownia’s large leaves, which can be 40-60cm long. Hence one study that crossed our screens examined planting Paulownia in a Northern Italian city, to reduce particulate concentrations toward recommended limits.

Other advantages include ornamental qualities: shade, “wonderful purple scented flowers” (below), which support honey bees, and an ability to restore degraded soils.

A final remarkable feature of Paulownia is that you can cut it down and it will re-grow, up to seven times, rapidly springing back from its stump.


Costs of CO2 offsets using Paulownia?

Our usual model for reforestation economics is shown below, assuming a typical planting cost of $360/acre. Paulownia may be modestly more expensive to grow: our reading suggests a broad range of $2-7/tree, multiplied by c250 trees per acre in commercial plantations. The largest costs are cuttings and the cultivation of saplings. Thereafter, Paulownia requires “minimal management and little investment”. Hence if growth rates are 10x faster than for traditional trees, all else equal, we would expect CO2 offset costs to be c10x lower, at $2-5/ton (including land acquisition costs at developed-world prices).
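A minimal sketch of this arithmetic is below, spreading up-front planting costs over cumulative uptake. The $4.5/tree figure is the midpoint of the $2-7 range above; the uptake rates bracket the claimed and measured extremes discussed later in this note; and land acquisition, maintenance and discounting, which lift the all-in figures to the ranges quoted above, are excluded.

```python
# Sketch of reforestation offset costs: planting cost only, so a rough lower
# bound (the full model also carries land, maintenance and discounting).

def offset_cost_per_ton(cost_per_acre, co2_t_per_acre_yr, years=30):
    """Up-front planting cost spread over cumulative CO2 uptake, in $/ton."""
    return cost_per_acre / (co2_t_per_acre_yr * years)

cases = [
    ("typical deciduous forest", 360, 5),         # $360/acre planting, 5T/acre/yr
    ("paulownia at 10x uptake", 4.5 * 250, 50),   # assumed $4.5/tree x 250 trees/acre
    ("paulownia, low-end field result", 4.5 * 250, 3),  # Eurasian studies, discussed later
]
for label, cost, uptake in cases:
    print(f"{label}: ${offset_cost_per_ton(cost, uptake):.2f}/ton (planting only)")
```

The spread between the second and third cases, c$0.75/ton versus c$12.50/ton, shows why the uptake rate assumption dominates the economics, and why the disputed per-acre figures below matter.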

Examples of Paulownia?

Over 2M hectares of Empress Trees are cultivated in China, often inter-cropped with wheat. But Paulownia cultivation in the Western world is more niche. As some examples: Jimmy Carter famously grows 15 acres of Paulownia trees on his farm in Georgia. As a commercial venture, WorldTree is an Arizona-based company that manages 2,600 acres of Empress Trees and plans to plant 30,000 acres more; it claims to be the largest grower of non-invasive Paulownia in the world. Furthermore, ECO2 is a privately owned Australian company, headquartered in Queensland, which claims to have cultivated a variety of Paulownia that can reach 20m after 3-5 years and sequester 5-10x more CO2 than other trees, or around 2.5T of CO2 per tree. Finally, oil companies are exploring reforestation initiatives: for example, YPF noted in its 2018 sustainability report plans to test-plant 40 species of Empress Trees in 2019.

Problems with Paulownia?

Invasiveness? One of the largest pushbacks on reforestation is that large-scale planting of single forest varieties may impair biodiversity (a chart of all the pushbacks is below; there is some irony that environmentalists call for drastic action to avert the perils of climate change, then often say, no, “not that drastic action”). In the US, Paulownia is categorized as an invasive plant. A single plant can produce 20M seeds in a year. In some states, such as Connecticut, sales of the plant are even banned. Paulownia did in fact exist in North America prior to the last Ice Age. It was re-introduced from China in 1834, when seeds were accidentally released from dinnerware packaging materials. Whatever intuitions one might have, some factions are going to protest against Western cultivation of Paulownia.

But the greatest question mark over Paulownia’s CO2 offset credentials is in the numbers. Different studies are tabulated below.


103T of Empress Tree CO2 uptake per acre per year is the most widely cited number online. But this figure derives from a single study, conducted in 2005, whose methodology is woefully rough. The study simply assumes a 12'x12' planting of Paulownia (750 trees per hectare, with 99.5% survival), then uses a formula to estimate CO2 uptake from the trees’ target height and width.
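We have not reproduced the study’s actual formula, but the generic shape of such dimension-based estimates is sketched below: an allometric equation converts trunk diameter to biomass, then to carbon, then to CO2. The coefficients are placeholders for illustration only, yet they show how assuming every tree hits its target growth compounds into per-acre figures of the same order as the 103T claim.

```python
# Generic shape of dimension-based uptake estimates (NOT the 2005 study's
# actual formula): diameter -> biomass via an assumed allometric equation
# -> carbon -> CO2. Coefficients a and b are placeholders for illustration.

def co2_per_tree_kg(dbh_cm, a=0.25, b=2.4):
    """Rough allometric sketch: dry biomass_kg = a * DBH^b (coefficients assumed)."""
    biomass_kg = a * dbh_cm ** b        # above-ground dry biomass, kg
    carbon_kg = biomass_kg * 0.47       # c47% of dry biomass is carbon (IPCC default)
    return carbon_kg * 44.0 / 12.0      # convert carbon to CO2 by molecular weight

trees_per_acre = 750 / 2.471            # the study's 750 trees/ha, converted to per acre
uptake_t = trees_per_acre * (co2_per_tree_kg(28) - co2_per_tree_kg(25)) / 1000
print(f"~{uptake_t:.0f}T CO2/acre/yr if every tree adds 3cm of trunk diameter that year")
```

On these placeholder coefficients, a dense planting in which every tree adds 3cm of diameter per year (the upper end of the growth rates cited above) yields c90T/acre/yr; the estimate is only as good as the assumed survival and growth rates.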

A follow-up study was published in 2019, estimating 38-90T of Empress Tree CO2 uptake per acre per year. But upon review, the upper bound extrapolates the “maximum growth rate”, which is known to be 2-3x faster than the average growth rate (charts below). The study is also vague about its modelling assumptions, and it was funded by a company that commercializes Paulownia plantations. Finally, the study itself notes that “additional research is needed in order to quantify the carbon sequestration rates of Paulownia trees under the specific management regime employed by World Tree’s Eco-Tree Program, by continuing to collect DBH values over the 10 to 12 year harvest cycle.”


Achieving monster growth rates will also depend on growing conditions. Ideal conditions are warmer climates (the tolerable range is -24C to 45C), flattish, well-drained soil with pH 5-9, <25% clay, <1% salinity, <2,000m altitude, >800mm of rainfall and <28kmph winds. Past studies planting the Empress Tree in Eurasia have yielded just 3-15 tons of CO2 per acre per year, which is not so remarkable versus other tree varieties.

Diseases? Finally, dense clusters of trees may fall short of growth targets due to disease. Paulownia, in particular, is susceptible to an affliction known as ‘Witches’ Broom’, which causes the tips of infected branches to die, leaving clusters of dead branches. The wood of infected trees is of poor quality, and their growth rate is diminished.

We conclude that there is great potential for nature-based solutions, especially optimisations that boost CO2 uptake rates. Paulownia may be among the options. But more data may be needed in the West before it can be heralded as a miracle plant.

Our key points on Empress Tree CO2 uptake are highlighted in the research note sent out to our distribution list.

Copyright: Thunder Said Energy, 2019-2024.