The Confusion & Deceit of Climate Change

With all the dissent following the announcement by US President Donald Trump that the USA will withdraw from the Paris Climate Agreement, it’s timely to make some salient points for unbiased followers of the climate change issue.

Pronouncements by the world’s leading authority, the IPCC – the Intergovernmental Panel on Climate Change – may not be as accurate or as diligently honest as they profess to be.

There can be no doubt that the Earth is currently on a warming trend, but it’s far from being scientifically accepted across the board that it will continue unabated, or that human activity is to blame, or even that greenhouse gases including CO2 are the prime threat to human life as we know it.

Limiting the Scope of Temperature Predictions

Meteorology today is by no means a perfected science. That is not a critical reflection, because the subject is so profoundly complex. But even with all the technology and information available today, meteorologists cannot always get it right. There is a reason why you will not usually see weather forecasts further than (say) a week ahead in the newspapers or broadcasts: there are just too many vagaries in the planet’s weather processes. Determination of near-future weather relies to some degree on what meteorological events can be observed today in real time; beyond that it comes down to skilled deduction – read educated guesswork.

Fig. SPM.4 from AR5 – the IPCC Fifth Assessment Report of 2014, Summary for Policymakers. It shows temperature rises of around 2ºC to 3ºC by 2100 under a low-emission scenario, and temperatures of up to 11.7ºC under a high-emission scenario out to 2100.

So … if they can’t consistently get it right for a month or even two weeks ahead, then how are we to believe the IPCC when it gives us such forbidding global temperature projections for (say) two, 12 or even 83 years from now?

Certainly they can put up what blinkered, ardent followers and others who can’t think outside the box might consider a good case. But leaving aside the question of mankind’s activities for a moment, there are also the planet’s natural climate trends to consider – something the IPCC doesn’t seem to give a lot of thought to.

They do at least recognise it in AR5, their most recent report, stating: “Climate change may be due to natural internal processes or external forcings such as modulations of the solar cycles, volcanic eruptions, and persistent anthropogenic changes in the composition of the atmosphere or in land use.” They also say there is “high confidence” in the (existence of) uncertainties of interlinked human and natural systems. But then they go on to emphasise just the human aspects.
IPCC Summary for Policy Makers 2014

Dr Judith Curry is an eminent American climatologist and author who challenges the IPCC over its failure to address the “Uncertainty Monster” when projecting future climate trends. In an interview on 6 February 2017 she talked about how IPCC processes have robbed scientists (those not aligned with the IPCC) of opportunities to explore natural climate change. Among other points of interest, she noted the failure of climate models to address pre-1950 natural climate variation, saying, “If science can’t explain climate shifts pre-1950, how can we trust today’s climate models?”
Read more:
WUWT – Dr. Judith Curry on climate science’s fatal flaw – the failure to explore and understand uncertainty.

Pros & Cons of Atmospheric CO2  Concentrations

It’s highly likely that one of the officially unstated aims of the IPCC (and many of its advocates) is to frighten us, the people of the world, into agitating for our respective governments to take action to reduce greenhouse emissions. That may sound like a true sceptic’s view, but there’s plenty of evidence to suggest that “cause noblesse” – i.e. delivering untruths for what the teller believes to be the greater good – continues to happen. A classic case was former US Vice President Al Gore and his 2006 documentary, “An Inconvenient Truth”.

Yet even if that isn’t so, the wording in the Assessment Reports is getting more and more alarmist. Among many other claims, they say CO2 levels are rising at a rapid rate.

Currently the CO2 content of the atmosphere is 406 ppm – parts per million. According to the IPCC, an excess of greenhouse gases created by mankind, including CO2, has tripped a natural climate warming trend into a higher gear, making the planet approximately 1ºC warmer since about 1850. If the IPCC is correct, then CO2 levels are projected to reach around 500 ppm by 2050, which would probably make the Earth an extra 1ºC to 2ºC warmer – albeit unevenly in particular places, and especially at the poles.
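As a rough sanity check on that projection, here is a minimal sketch of the arithmetic. The growth rate of 2.9 ppm per year is an illustrative assumption chosen for this example, not an IPCC figure:

```python
# Hypothetical linear extrapolation of atmospheric CO2 concentration.
# The 2.9 ppm/year growth rate is an illustrative assumption only.
def project_co2(start_ppm: float, start_year: int, target_year: int,
                growth_ppm_per_year: float = 2.9) -> float:
    """Project CO2 concentration assuming a constant annual growth rate."""
    return start_ppm + (target_year - start_year) * growth_ppm_per_year

# 406 ppm in 2017, projected forward to 2050:
print(project_co2(406, 2017, 2050))  # ~501.7 ppm, near the 500 ppm figure
```

A constant-rate extrapolation like this ignores feedbacks and emission changes in either direction, which is exactly why such projections carry wide uncertainty ranges.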

One global warming supporter is Nicola Jones, a freelance journalist with a background in chemistry and oceanography. In a refreshing argument for the global warmers, Jones explains why the future carbon content of the atmosphere should be kept below 400 ppm, in an article published on 26 January 2017. At face value, and assuming what she says is factually correct, she makes some very good points – particularly in relation to ancient levels of CO2 and their relationship to temperatures at the time – that have been overlooked or ignored by climate deniers.
Read more:
How the World Passed a Carbon Threshold and Why It Matters

On the other hand there’s Malcolm Roberts, a Senator elected in 2016 to represent the State of Queensland in Australia. The Senator was annoyed that, because of poorly researched climate policies, people had lost jobs and businesses, paid higher taxes, wasted opportunities and frittered away scarce resources, and that billions of dollars had been wasted on mothballed white elephants such as unused desalination plants. In September 2016 the Senator challenged Australia’s leading scientific organisation, the CSIRO, to present its case on climate change.

Australia’s CSIRO is highly respected, and it supports the global warming theory. The Senator’s findings, made with the assistance of two well-known climate sceptics, were that “the CSIRO had no empirical evidence proving carbon dioxide from human activity affects climate, and that their presentation contradicted the empirical climate evidence”. Basically, what they are saying is that the CSIRO is simply “rubber stamping” everything it is told about climate change without checking for itself, relying on theory and logic rather than proven facts.
Read more:
Senator Malcolm Roberts – On Climate, CSIRO Lacks Empirical Proof

Cherry Picking Temperature Records

 Instrument Records

The earliest temperature-measuring instruments appeared in the 16th century, but it wasn’t until 1714 that the first reliable thermometer, using the Fahrenheit scale, appeared. Not until 1860 was it thought there were enough observation sites around the world to begin measuring global temperatures.

Unfortunately the IPCC only uses instrumental records back to 1850. This gives it a mere 167 years of meteorological data, out of a climate history spanning tens of thousands of years, with which to prove its theory of AGW – Anthropogenic Global Warming.

By over-emphasising the trivially short instrument record, and greatly under-emphasising the varied changes that exist in geological records … the IPCC signals its failure to comprehend that climate change is as much a geological phenomenon as it is a meteorological one.
Prof. Robert M Carter – Climate: The Counter Consensus, 2010

Geological Proxy Data Ignored

Scientists have been able to study the ancient history of the Earth’s climate using geological data, aka proxy data – e.g. tree rings, ice cores, lake and ocean sediments, and tree and fossil pollens.

Written human records are also used. Paleoclimatologists are skilled scientists who reconstruct the climate of past ages using proxy data such as historical records, journals and diaries, newspaper reports, ships’ logs, farm yields and so on.
Read more:
IEDRO – Paleo Proxy Data: What Is It?

The grey shaded areas indicate the range of uncertainty above and below the solid black line representing the annual calculated temperatures back to about 1760. Note how the range of uncertainty gets narrower with the gradual introduction of new technologies such as thermometers, weather balloons and satellites over time.

Proxy data, however, is not entirely accurate. It comes with large “error bars” or “ranges of uncertainty”, which basically come down to skilled but highly educated guesses. At the very least, such proxy data provides a starting point towards establishing what the climate was like at a given time – e.g. warm or cold, warming or cooling, the rate of the warming or cooling and so on.

Weather and climate are both driven by the same processes, and there is no real point in time where one can separate them. Both are driven basically by the movement of heat between the land, oceans and atmosphere, and it happens in time frames that can run from seconds to millions of years. As well, there are many other physical, chemical and biological processes happening which affect the planet to a greater or lesser degree.

So at what point can we measure climate as opposed to weather?

Misuse of Climate Measurements

People generally accept the word “normal” to mean what is usual. Therefore the term Climate Normal would ordinarily be taken to mean what might be expected or is usual. But in meteorology it means an average of weather conditions that actually occurred over a particular period.

In 1935 the International Meteorological Organization (predecessor of today’s WMO – World Meteorological Organization), at its conference in Warsaw, agreed on a “Standard Normal” (aka Climate Normal) system by which climate could be measured over time. The basic idea was to have a benchmark against which past, recent, current or future meteorological conditions could be measured, and to provide historic context for them – e.g. for a recent extreme weather event.

Climate Normals are produced at local, national and global levels, and each represents a 30-year average of meteorological data. This period was chosen because statisticians believe 30 values give a reliable mean or average, though it is not compulsory. Each Climate Normal yields one data point, which might be used (say) for plotting temperatures on a graph, and is calculated as the arithmetic average of the 30-year period being analysed. The first Climate Normal was set for 1901–30, followed by 1931–60 and then 1961–90. The WMO will not analyse the currently running Climate Normal until the end of 2020. Records prior to 1900 are generally not considered reliable.
Read more:
WMO – The Role of Climatological Normals in a Changing Climate.
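The arithmetic behind a Climate Normal is straightforward: one 30-year block of annual values reduces to a single mean. A minimal sketch, using invented temperatures rather than real station data:

```python
import random

# Sketch of how a WMO-style 30-year Climate Normal reduces 30 annual
# values to one data point: their arithmetic mean. All temperatures
# below are invented for illustration.
def climate_normal(annual_temps: list[float]) -> float:
    """Return the arithmetic mean of one 30-year block of annual data."""
    if len(annual_temps) != 30:
        raise ValueError("a standard Climate Normal needs exactly 30 years")
    return sum(annual_temps) / 30

# 90 invented years, e.g. standing in for 1901-1990:
random.seed(0)
series = [14.0 + random.uniform(-0.5, 0.5) for _ in range(90)]

# Three consecutive non-overlapping normals (1901-30, 1931-60, 1961-90):
normals = [climate_normal(series[i:i + 30]) for i in (0, 30, 60)]
print(normals)  # three data points, one per 30-year period
```

A second-tier “rolling” variant would simply slide the same 30-year window forward in 10-year steps (i.e. `series[i:i + 30]` for i = 0, 10, 20, …), producing more data points from the same record.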

In 2011 the WMO introduced a second tier of Climate Normals, ostensibly to account for the “rapid pace of climate change”, which provides for measurement of current temperatures. The new tier retains the 30-year period but is updated every 10 years instead of every 30 – e.g. 1961–90, 1971–2000 and 1981–2010 (the current baseline period used by the WMO).
Read more: WMO – New Two-Tier approach on “climate normals”.

Changing the Message

Some scientists have calculated 30-year and 20-year climate normals going back to 1850, which gives them even more data points. However, there are those who believe the early temperature records should not be used. Among other concerns: there is no real guarantee that temperature readings were always observed under similar conditions; some temperature extremes recorded pre-1910 may have used non-standard equipment and could be location-specific; and other warmer or cooler data may never have been entered into the database.

To confuse the issue even further, other systems of measuring temperature are in use. “Period Averages” allow analysis of a minimum period of 10 years, and 20-year graphs are fairly common. “Normals” can be used for any period as long as it spans three consecutive 10-year periods. Another is the “Hinge Fit” used by NOAA’s National Centers for Environmental Information. On top of everything, the terms “climate normal” and “normal” are often misused by people who don’t fully understand them.

Using the Tier 2 Climate Normal and other systems can no doubt be justified, but they can have unfortunate side effects: confusion for non-scientists, and scope for bias or flat-out deception. The following highlights one common type of deception used by the IPCC.

Below: a typical Climate Normal graph showing “anomalies”, i.e. departures in degrees above or below the mean temperature average for a selected period.

Graph from around 1880 to 2007 showing anomalies up to 0.7ºC below the average before about 1980, and up to 0.8ºC above the average by about 2010. This is typical of an IPCC graph with no historic context provided. Graphs like these are commonly used to demonstrate an “unprecedented” temperature spike in the late 20th century.

2,000-year graph providing historic context to the late 20th-century temperature spike. It shows a warmer period during the MWP – Medieval Warm Period – at almost 0.6ºC above the mean average, followed by the LIA – Little Ice Age – during the 1600s, and then temperatures rising to about 0.4ºC by the year 2000. At that time the temperature was believed by the researchers to have been cooler than during the MWP. Graphs showing temperature rises in historic context do not look quite so alarming.

Note that the difference in the size of the anomalies between the first and second graphs is due to the selection of different Climate Normals as baselines. It’s a common enough practice by those with lesser integrity.

Of course there are other ways to misrepresent Climate Normal (anomaly) data on graphs. For instance, one might select a baseline period that has a cooler or hotter mean average temperature, thus making the anomalies appear higher (warmer) or lower (cooler) on the graph.
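To see how much the baseline choice alone matters, here is a small sketch with invented numbers – the same temperature series charted against two different baseline periods:

```python
# Demonstration (with invented numbers) of how the choice of baseline
# period shifts anomaly values without changing the underlying data.
temps = [13.8, 13.9, 14.0, 14.1, 14.3, 14.5, 14.6, 14.8]  # annual means, °C

def anomalies(series, baseline):
    """Return each value's departure from the baseline-period mean."""
    base = sum(baseline) / len(baseline)
    return [round(t - base, 2) for t in series]

cool_baseline = temps[:4]   # early (cooler) reference period
warm_baseline = temps[4:]   # late (warmer) reference period

print(anomalies(temps, cool_baseline))  # anomalies skew positive (warmer)
print(anomalies(temps, warm_baseline))  # same data, anomalies skew negative
```

Neither set of anomalies is wrong; the visual impression of the resulting graph simply depends on which reference period the presenter picked.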

Jo Nova is an Australian Bachelor of Science, author and blogger on the science, funding and politics of climate change. For three years she was an Associate Lecturer in Science Communication at the Australian National University. In a light-hearted manner she discusses some of the methods actually used to misrepresent climate change.
Read more: Jo Nova – How to make climate graphs look scary — a reply to XKCD

In recent years there has been much brouhaha in the USA about a perceived global warming pause and even a possible cooling trend. It’s not surprising, then, that some presenters have probably been cherry-picking data and building anomaly graphs to prove their case.

What it all boils down to is that deception abounds and non-scientists should be careful of any presenter today who shows a temperature graph purporting to prove excessive warming or cooling.

As we’ve seen, climate changes occur naturally over time intervals of thousands of years. And despite claims by climate warming advocates that the late 20th-century warming spike is unprecedented, scientists not aligned with the IPCC have shown, using geological proxy data, that temperatures similar to those recorded at the end of the 20th century have occurred since the emergence of mankind, or at least very near to it.

One example is the CET – Central England Temperature record. This is considered a reliable source of regional data for Central England, and many believe it is also a reliable proxy dataset for analysing past climate in Europe and the North Atlantic. The CET shows at least two warming spikes over just a few years since around 1820, both of shorter duration than the one which occurred at the end of the 1900s. Yet the people of that region have flourished, apparently without the calamitous climatic events the IPCC predicts are about to happen to us.
Read more:
Met Office Hadley Centre – CET (HadCET) Dataset

Bill Whittle is an American conservative blogger who, among other things, discusses climate change issues. In this short video he covers several of the issues mentioned here about how we are being deceived by only getting part of the story, whether the science is real, plus some other issues not previously known to this writer:
Watch video: Bill Nye – The Science Lie

Do We Change Our Economic Policies?

Using the WMO Climate Normal system provides only three full climate data points on which to plot a hypothetical graph of global temperatures. The Tier 2 system provides up to 11 data points based on 10-year intervals, but as mentioned these tend to be misused. Yet no matter what system is used to represent alleged dangerous global warming, there are still only 167 years of recorded temperature data available, some of it not considered reliable.

That’s not much on which to base changing a country’s entire economic policies. And yet we have one State in Australia, soon to be followed by another, currently implementing a policy of replacing its baseline energy system with renewable energies and decommissioning its coal-fired power plants. It has already experienced huge blackouts when storms damaged the renewable infrastructure and there was no backup system, other than asking another State for help.

“Oh but it’s okay” they say, “we’ve fixed the problems and it won’t happen again” they say.

Yeah right ….

Ultimately, basing policies on just 160 years or so of climate records must at the least be considered short-sighted. In fact it’s more about idealism than practicality. It certainly cannot help anyone judge accurately whether the climate will continue trending up or down, rather than just assuming it will. And it’s pointless to keep throwing so much money at something with no real scientific resolution after almost 30 years, and which might eventually prove to be a non-problem anyway.

Are We Prepared for a Catastrophic Event?

On the face of it the outlook for the world’s populations is bleak, but not necessarily because of climate change. The real issue is the actions now being taken collectively by countries to solve what may turn out to be a non-event, led by an ideal-driven organisation that bases its opinions on theory and logic rather than practical science and procedure. So much money is being wasted trying to prevent something that may very well happen anyway – or not – with or without additional greenhouse gases.

But unfortunately something even worse could happen. The climate warming trend could potentially reverse course into an even more disastrous cooling trend – don’t laugh just yet.

At the moment the Earth is in an interglacial period, i.e. between alternating ice ages. Based on previous Milankovitch cycles, the onset of a thousands-of-years decline into another ice age is overdue. One of the triggers is when summer temperatures in the northern hemisphere fail to rise above freezing for years, so snowfall doesn’t melt but compresses and turns into ice sheets over time.

Some scientists are even suggesting the current AGW greenhouse effect is preventing that onset. As silly as that may seem now, consider that the last real ice age finished about 12,000 years ago, when mankind was in its infancy, and that interglacial periods have historically lasted about 10,000 years. Is it as far-fetched as the Earth being round, or mankind walking on the moon?

Right now there’s probably no one looking into how the massive loss of grain-growing farms in the Northern Hemisphere could be substituted for to prevent famine – just in case the world needs it because of some catastrophic disaster (say a meteor strike or a super-volcano?).

In any case, farms are being bought out now anyway due to continuing urban sprawl, so hopefully somebody is starting to look into it. Or maybe we should have more scientists looking into ways of reducing famine in places like Somalia, instead of meeting obligations to fund research into global warming.


Billions of dollars are spent annually on the false god of the AGW hypothesis. If the people of the Earth are to make any headway on the issue of climate change, whether warming or cooling, thought might be given to the following:

1. The IPCC should be disbanded or reconstituted into a purely scientific organisation because:

  • Its staff is entirely bureaucratic. As such it is political by nature and subject to political manoeuvring. Its member countries have too much input into the final Assessment Reports.
  • There’s every indication that it has been corrupted as a result of “cause noblesse”, i.e. not being truthful in pursuit of what it believes is the greater good.
  • Its scope of research was limited to just the instrumental meteorological records right from the start. No real regard is given to natural climate processes.
  • It does not consult geological proxy data to establish historic context for what is happening now, or to test whether alarming the world as it has is justified.
  • It assumed right from the start that mankind was responsible for global warming, without proper scientific proof procedures.
  • Its primary focus continues to be mankind’s activities and their effect on climate.

2. Ideally, a purely scientific organisation needs to be raised to monitor and try to determine the possible future of the Earth’s climate and if possible:

  • it should remain under the charter of the UN and the World Meteorological Organization;
  • it should be autonomous, to the extent that it is not influenced by the desires of individual countries, i.e. no member countries;
  • it should issue a scientific paper every 10 years on the state of climate research over the previous decade, based on 30-year Climate Normals;
  • all scientific theories and papers should undergo proper scientific testing and approbation prior to public release; and
  • it should research better ways of providing early warning and response systems for climatic disasters.

In the meantime, let’s take the pressure off the world’s scientists and let them do what they know best, without political meddling, hindrance, misinformation or manipulation. Let’s be really sure that when they do say something, we can actually trust it – unlike now, with so much disparagement going on between the two sides of the issue, i.e. alarmists and sceptics.


1. Climate: The Counter Consensus, Prof. Robert M. Carter 2010

Alternative Greenhouse Gas Theory

The 33°C Greenhouse “Envelope”

This subject was previously included in the post “Errors in CO2 Emissions?”. It attracted sufficient interest to warrant its own post.

So … there’s all this talk about CO2 emissions being at the heart of the problem and causing so much angst between alarmists and sceptics – but what if CO2 isn’t even relevant?

Scientists mostly agree that without greenhouse gases (GHG) the global average temperature of the Earth would be about -19°C, and that a natural 33°C “envelope” of GHG warming brings it up to around 14°C or so. What they don’t agree on is the cause of this natural warming. There are at least two separate theories: one based on GHG radiation, the other on a gravito-thermal greenhouse effect – or to put it more simply, greenhouse gases vs natural processes.

The 33°C Arrhenius Radiative Greenhouse Effect
(Greenhouse Gases)

Don’t let the title put you off. It just means the greenhouse theory used by the IPCC and in its computer climate models. Most scientists probably agree that various radiative gases, including CO2, are present in the atmosphere and re-radiate heat back to Earth, thus causing a cycle of continual warming of the planet.

An example of the 33°C Arrhenius Radiative Greenhouse Effect theory. NB: Natural “feedback” cooling effects not shown here.

But it’s important to understand that the 33°C envelope is not all warming, as is commonly expressed and as shown in the example. It is really the end result of roughly 50/50 warming AND cooling effects, i.e. a combination of natural climate forcing (heating) and feedback (cooling) systems. This has been known for a long time, and is supported, for example, in the 1964 paper by Manabe and Strickler and in Richard Lindzen’s 1990 paper.

In any case,  if the Arrhenius theory is correct, then mankind obviously must be responsible to some extent – although arguably not to the levels we are being led to believe. That viewpoint also applies to the possibly exaggerated future consequences of increased global warming.

But given that the planet is subject to both warming and cooling influences, shouldn’t the warming of less than 1°C over the last 150 years or so, alleged to have been caused by mankind, also be cut roughly in half, to about 0.5°C?

The 33°C Gravito-Thermal Greenhouse Effect
(Natural Processes)

Other scientists follow the gravito-thermal theory, which began in 1738 when Daniel Bernoulli worked out how to understand air pressure at a molecular level. The problem was further developed in the 1850s by Maxwell, who found it wasn’t necessary to track every molecule, just their statistical distribution – i.e. how the microscopic connects to the macroscopic. Albert Einstein did related work on kinetic theory in 1907. In 1976 the final version of a 241-page supporting document, the US Standard Atmosphere, was published.

One adherent of the gravito-thermal theory was the leading physicist Richard Feynman (d. 1988). He is said to have held that the greenhouse effect that warms the Earth is due solely to the effects of gravity, atmospheric mass, pressure, density and heat capacities, and not to any “trapped” radiating elements of greenhouse gases – and not just the 33°C “envelope”, but constantly.
Read More:
Principia Scientific International: Physicist Richard Feynman Discredits Greenhouse Gas Theory

Was he a crackpot? Not likely. He won the Nobel Prize in Physics in 1965, along with several other awards. He – and allegedly hundreds of rocket and atmospheric scientists, physicists and aeronautical engineers – was involved in formulating the US Standard Atmosphere. It provides the means to determine the temperature, pressure and density at any altitude, and is used, for example, in aviation applications and in support of this theory.
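For what it’s worth, the troposphere portion of the US Standard Atmosphere (1976) model can be sketched in a few lines. The constants below are the published sea-level values; this is an illustration of the idea, not an aviation-grade implementation:

```python
# Sketch of the US Standard Atmosphere (1976) troposphere model:
# temperature falls linearly with altitude, pressure follows the
# barometric formula, density follows from the ideal gas law.
# Valid only below ~11 km (the troposphere).
T0, P0 = 288.15, 101325.0              # sea-level temperature (K), pressure (Pa)
L = 0.0065                             # temperature lapse rate (K/m)
g, M, R = 9.80665, 0.0289644, 8.31447  # gravity, molar mass of air, gas constant

def standard_atmosphere(h_m: float):
    """Return (temperature K, pressure Pa, density kg/m^3) at altitude h_m."""
    T = T0 - L * h_m                       # linear lapse
    P = P0 * (T / T0) ** (g * M / (R * L)) # barometric formula
    rho = P * M / (R * T)                  # ideal gas law
    return T, P, rho

T, P, rho = standard_atmosphere(11000)     # the tropopause
print(T, P, rho)  # ~216.65 K, ~22.6 kPa, ~0.36 kg/m^3
```

At sea level the same formula returns the familiar 288.15 K (15°C), 101325 Pa and about 1.225 kg/m³, which is why the model is convenient for aviation calculations.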

Nowadays the “33°C Maxwell/Clausius/Carnot/Feynman gravito-thermal effect”, aka the gravitational forcing theory, maintains that the generally accepted radiative greenhouse gas version confuses cause with effect in the Earth’s warming processes. In other words, its proponents say the gravito-thermal processes come before any radiation from greenhouse gases.
Read More: The Hockey Schtick Dec 2014: Why Atmospheric Temperature is a Linear Function of Mass & Gravity, and Not Influenced by Greenhouse Gas Concentrations  

The world’s scientists don’t even seem able to agree on how the Earth is warmed naturally, let alone whether mankind’s activities are responsible for additional warming. Only one 33°C “envelope” warming theory can be correct. Surely it’s impossible to determine human effects if you don’t know the natural patterns or causes. And apparently we don’t …


1. Climate: The Counter Consensus, Professor Robert M. Carter 2010.
2. IPCC Fifth Assessment Report 2014
3. Other Links as indicated in the text.

Errors in CO2 Emissions?

Is the Climate Change CO2 Science Right?

A section of this post dealing with an alternative theory of how the Earth warms and cools naturally via re-radiating greenhouse gases was originally included here. It attracted sufficient interest to warrant its own post: Alternative Greenhouse Theory.

Science has historically not always got it right despite any “overwhelming proof” in their day.

For almost three decades the IPCC and its advocates have been saying that their scientists are right in declaring humankind responsible for causing dangerous global warming, aka AGW – Anthropogenic Global Warming. Not proven, but so near that they agree it IS indeed right – so right, in fact, that literally trillions of dollars are being squandered around the world still trying to prove it. They believe implicitly that excess CO2 greenhouse gases (GHG) produced by mankind are warming the planet beyond natural processes.

Climate alarm sceptics, who include thousands of eminent scientists, believe there is sufficient evidence to at least cast some doubt on the accuracy of many IPCC pronouncements, if not to throw out the case for AGW completely.

The question is: could the IPCC actually be right? This article looks at just a few of the disputed issues relating to CO2 emissions.

The Current Warming Trend

The mainstay of the IPCC argument is that mankind is responsible for the late 20th century warming trend.  But it’s important to understand that there is a natural warming phase going on right now anyway.

As we know, the planet goes through cyclical periods of warming and cooling. The Earth is currently on a natural warming trend following the Little Ice Age. Hypothetically, how long this warming cycle would have lasted without mankind’s contributions is anybody’s guess.

A Caution About Global Averages

The term “global average” is often used in public discussions of climate change to demonstrate negative trends, such as rising global temperatures or the amount of CO2 in the atmosphere. But at the end of the day it is merely a statistical figure, usually based on different samples of different data from several different sources. Data can be collected and manipulated by any number of legitimate methods, by different people with different statistical skills, to produce different outcomes – just select the one that suits your argument best.
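As a toy illustration of that point, the same three invented station readings yield noticeably different “global averages” depending on whether stations are counted equally or weighted by the area they represent – both are legitimate methods:

```python
# Illustration (with invented station data) of how different legitimate
# averaging choices give different "global average" temperatures.
stations = {
    "tropics": {"temp": 26.0, "area_weight": 0.40},
    "mid_lat": {"temp": 12.0, "area_weight": 0.45},
    "polar":   {"temp": -18.0, "area_weight": 0.15},
}

temps = [s["temp"] for s in stations.values()]
simple_mean = sum(temps) / len(temps)  # every station counted equally
area_weighted = sum(s["temp"] * s["area_weight"] for s in stations.values())

print(simple_mean)    # ~6.67 °C
print(area_weighted)  # ~13.1 °C: same data, different method
```

Real global temperature products use far more sophisticated gridding and weighting, but the underlying lesson holds: the headline number depends on methodological choices as well as on the data.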

Images from the IPCC Summary for Policymakers, AR5, showing a 0.8°C global surface temperature rise since 1850 and a roughly 0.2 m global rise in sea levels since 1900.

The IPCC, in its Fifth Assessment Report of 2014 (AR5), continually mentions global averages in respect of temperatures, CO2 emissions, sea level rises, precipitation and so on.

To be fair, they do acknowledge that temperatures etc. in any given region may experience more, less or none of the effects of increased global warming in the future. But it’s a passing sentence, and you need to actually read the document rather than just skim it. It really ought to be flagged more prominently.

But of course temperatures vary widely around the Earth depending on time of year, latitude, and ocean and wind currents. For example, the coldest inhabited place on Earth is arguably the village of Oymyakon in Russia, where it can reach -45ºC, and the hottest is probably Death Valley in California, USA, where it can get up to 56.7ºC.

Whether people could actually live independently of outside sources in places like these is another story. But in general, any given place will usually have a hotter or colder climate than the stated global average.

But let’s get back to the overuse of global averages. Professional writers know that the written word (and diagrams etc.) is often interpreted differently depending on the reader and their level of focus at the time of reading.

It’s probably not stretching things too far to say that the constant use of global average figures by alarmists can leave some less educated or inattentive people with the misconception that it is going to get hotter where they actually live, or that extreme weather events are going to happen in their own region.

The bottom line is that if a media presentation keeps blathering about global averages and how negative things are going to be, without relating it to your geographical region, then let me suggest you turn it off. Such stuff is neither scientific nor even sensible, and is more about devotion to a quasi-religious eco-alarmism … or headline-seeking.

Uncertainty Errors in CO2 Emission Calculations

Scientists generally refer to an “error bar” or “uncertainty range” where an exact figure is not known.

So let's take a common method of estimating an unknown distance: ask a group of people each to give a visual estimate. Remove the highest and lowest estimates, and what remains is your error bar or range. Somewhere within that range the real distance should lie – hopefully. Then take the centre, or else the average, of the remaining highest and lowest values to arrive at what you hope is close to the real distance, or at least a baseline point.
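The trim-and-centre procedure just described can be sketched in a few lines of Python; the sample estimates below are hypothetical:

```python
# A minimal sketch of the trimmed-range estimate described above.
def trimmed_range(estimates):
    """Drop the single highest and lowest estimates, then return
    the remaining range and its midpoint as a baseline figure."""
    trimmed = sorted(estimates)[1:-1]   # discard the extremes
    low, high = trimmed[0], trimmed[-1]
    midpoint = (low + high) / 2         # centre of the error bar
    return low, high, midpoint

# Hypothetical distance estimates in metres from six observers
low, high, baseline = trimmed_range([80, 95, 100, 105, 110, 140])
print(low, high, baseline)  # range 95..110, baseline 102.5
```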

Average atmospheric CO2 concentrations measured in the 19th and 20th century. Encircled are the values used by Callendar (1958). Redrawn after Fonselius et al. (1956).

In a very basic sense this is how scientists originally estimated the pre-industrial level of CO2 in the atmosphere as 280 ppm – and that is ignoring the many scientifically recorded measurements taken during the 19th and 20th centuries which indicated higher readings. That figure has since been used as the pre-industrial baseline and was accepted by the IPCC when it first formed.

Obviously the actual methodology was more sophisticated than that, but the principles were most likely much the same. Let's stick with 280 ppm, though, because it at least provides a kind of baseline.

If we assume the GHG theory is correct, there can be little argument that humans have contributed to the current estimate of about 400 ppm of CO2 in our atmosphere. Nor do scientists generally dispute that CO2 is at the very least a mild GHG – though of course there is diligent argument over whether it is more than that.

Yet doubts have been cast on the previously accepted levels. Glacier data has often been used to determine the levels of CO2 concentration in the atmosphere during the pre-industrial era, and those levels are also used for important calculations in climate change research. For example, Jaworowski, Segalstad and Hisdal discussed this in their paper "Atmospheric CO2 and Global Warming – A Critical Review, 2nd Revised Edition, 1992".

The report is believed to be the first critical review of CO2 trapped in air bubbles in glaciers. It reveals several errors in methodology and incorrect scientific assumptions which call into question the very validity of the AGW hypothesis. Some of the issues discussed include:

  • the subjective manner in which the value of 290 (sic) ppm was originally decided;
  • the siting of some of the observatories near volcanic activity and the methods used to edit the results;
  • the instrumentation and methods used to record historic thermometer temperatures; and
  • a new discovery of liquid still trapped in air bubbles in ice below -73ºC that can significantly enrich or deplete CO2 compared to the original atmosphere.
The projections of man-made climate change through burning of fossil carbon fuels (coal, gas, oil) to CO2 gas are based mainly on interpretations of measured CO2 concentrations in the atmosphere and in glacier ice. These measurements and interpretations are subject to serious uncertainties…
Jaworowski 1992

The Uncertainty Range of Volcano Effects

There have been some very big volcanic eruptions in recent decades, causing all sorts of problems by spewing out volcanic dust and CO2. Major fractures, hot springs and geysers also vent CO2. Over the last 10,000 years or so, around 1,500 land volcanoes have been active.

The Kilauea Volcano in Hawaii

Let's take just one example. The Kilauea Volcano in Hawaii has been active for a long time, erupting on average about once every three years, and is among the most watched in the world. Until recently it was thought to be emitting around 2,800 tons of CO2 per day. In 2001 that was amended to a supposedly more accurate 8,800 tons/day. In 2008 the USGS (the US Geological Survey) changed it again to 4,000 tons/day. That makes for an uncertainty error bar of roughly 100% to 300%.
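As a quick worked check on the spread between the three published estimates quoted above:

```python
# The three published Kilauea CO2 estimates quoted above, in tons per day
estimates = [2800, 8800, 4000]
low, high = min(estimates), max(estimates)

spread_pct = (high - low) / low * 100  # spread relative to the lowest figure
print(round(spread_pct))  # 214 – i.e. roughly a threefold range
```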

But compared to land volcanoes, not so much is known about sub-sea volcanoes, which make up the majority on the planet. There are literally thousands of them. CO2 is the most common gas found in their volcanic hydrothermals, but only rarely is it also found in liquid form.

White smoking vents at the Champagne sub-sea volcano

In 2006 the Champagne volcanic site in the Mariana Trench was found to be discharging a 103°C gas-rich fluid, and droplets of mainly liquid CO2 at less than 4°C were also discovered. The hot fluid's CO2 content of 2.7 moles/kg was the highest ever reported, and the droplets contained 98% CO2. All of this CO2 was being absorbed into the ocean before rising 200m. This site alone is estimated to contribute 0.1% of the "global carbon flux", i.e. the flow into the atmosphere from all natural sources – and that's a lot.
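As a rough sanity check, the reported 2.7 moles/kg can be converted to a mass fraction. The molar mass of CO2 (44.01 g/mol) is a standard value, not a figure from the source:

```python
MOLAR_MASS_CO2 = 44.01  # g/mol, standard value
molality = 2.7          # mol of CO2 per kg of fluid, as reported above

grams_per_kg = molality * MOLAR_MASS_CO2  # about 119 g of CO2 per kg
mass_percent = grams_per_kg / 1000 * 100
print(round(mass_percent, 1))  # about 11.9% CO2 by mass
```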

An example of a “global carbon flux”

See: Submarine venting of liquid carbon dioxide on a Mariana Arc volcano

Following the Champagne discovery there have been suggestions that sub-sea volcanoes may be contributing more to the global carbon flux than previously realized. With so much uncertainty about volcanoes generally, and about other forcing (CO2-adding) agents, how can the IPCC be so certain about the extent of mankind's contribution of CO2 compared to natural sources?

The bottom line is that they can't really know. Very little of it is yet known. They are forced to make calculated, educated guesses and produce results that include error bars of uncertainty. And the ranges of those error bars are also under attack by sceptics.

CO2 “Residence” Time in the Atmosphere

As of 2010 there was an estimated 780 Gt of CO2 in the atmosphere, of which about 210 Gt (roughly a quarter) was believed to be exchanged with the ocean and land "sinks", e.g. plants. So how long does the remainder stay up there?

The IPCC estimates the "residence time", i.e. the time CO2 remains in the atmosphere before being reabsorbed or emitted to space, at anywhere between 5 and 200 years or more. That's quite an error bar of uncertainty. I have read one alarmist advocate state that the rate of absorption of CO2 into the Earth varies widely depending on how it is absorbed, e.g. by the oceans, land or sea biota. Maybe that is possible.
See also: Working Group I: The Scientific Basis.

IPCC AR5 2014 CO2 residence time chart

Other, non-IPCC-aligned scientists generally estimate a CO2 residence time of between 5 and 10 years.

And the observed decrease in radioactive carbon (14C) in the atmosphere following the cessation of atmospheric nuclear testing in 1963 has confirmed the half-life of CO2 in the atmosphere at less than 10 years. Incidentally, the radioactive 14C isotope can also occur naturally.
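The decay arithmetic behind this can be sketched as follows. Note that a half-life and an e-folding "residence time" are related but not identical (tau = half-life / ln 2), which is one reason quoted residence times differ:

```python
import math

def residence_time(half_life_years):
    """e-folding residence time implied by a given half-life."""
    return half_life_years / math.log(2)

def fraction_remaining(years, half_life_years):
    """Fraction of an initial CO2 pulse still airborne after `years`."""
    return 0.5 ** (years / half_life_years)

# Using the sub-10-year half-life quoted above:
print(round(residence_time(10), 1))  # about 14.4 years
print(fraction_remaining(50, 10))    # 0.03125 – about 3% left after 50 years
```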
Source: Environmental Effects of Increased Atmospheric Carbon Dioxide

Unfortunately the IPCC tends to rely on a longer residence time in its computer models, which consistently produces a higher global average temperature result by a given date, e.g. 2020.
Also read:
Don Aitkin – How Long Does Carbon Dioxide Remain in the Atmosphere.

So if the non-IPCC-aligned scientists and educated others are right, then future temperature projections that follow the currently observed trend are the ones more likely to be correct.


Both the sceptic and alarmist sides of the climate change debate are prone to making exaggerated and implausible claims. So much so that it’s sometimes difficult to find the real truth about the alleged dangerous global warming being caused by humans aka AGW.

This site is about trying to find that truth.  However, these pages may at first appear to be on the sceptic side – but that is not entirely true. Information in support of AGW that can be proven from sources outside the IPCC will be accepted.

The information here is believed to be correct at time of writing. Comments to the contrary which can prove otherwise are welcome.   Only comments from rational people who can discuss AGW issues dispassionately and with common courtesy will be considered.


1. Climate: The Counter Consensus, Professor Robert M. Carter 2010.
2. IPCC Fifth Assessment Report 2014
3. Other Links as indicated in the text.

Greenhouse Gases – A Perspective

We’re always hearing about greenhouse gases and carbon dioxide (CO2) and how they are causing warming of the Earth that will be dangerous to life on the planet.

But is it all rhetoric?

This post tries to put some perspective on the issue of carbon dioxide, and on whether or not it is the big bad demon that alarmists say it is.

A “Normal” Climate

Our atmosphere is made up of molecules of which about 99% is radiatively inactive. Greenhouse gases make up the remaining 1%.  Under ideal “normal” circumstances suitable for life, the combined greenhouse gases increase the Earth’s temperature from a theoretical -19⁰C to an average of around +15⁰C i.e. 34⁰C of warming.
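The theoretical -19⁰C "no greenhouse" figure quoted above comes from the standard Stefan-Boltzmann energy-balance calculation. The solar constant and albedo used below are common textbook values, not figures from the source:

```python
# A minimal sketch of the effective-temperature calculation.
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361  # W m^-2 at the top of the atmosphere (textbook value)
ALBEDO = 0.3           # fraction of sunlight reflected back to space

# Effective radiating temperature: absorbed solar = emitted thermal
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4  # averaged over the sphere
t_effective = (absorbed / SIGMA) ** 0.25      # in kelvin

print(round(t_effective - 273.15))  # about -19 C, matching the figure quoted
```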

Example of a “balanced” greenhouse system.

Typical estimates of the proportions of the greenhouse warming gases are around 78% water vapour, 20% carbon dioxide and around 2% from methane, nitrous oxide and other minor compounds. However, estimates of the water vapour share vary significantly, from as low as 60% to as high as 88%.

Source: “Climate: The Counter Consensus” by Prof. Robert M. Carter 2010

 What exactly are Greenhouse Gases?

Greenhouse gases absorb infra-red thermal radiation and hold the heat in the atmosphere for a period of time before it is eventually re-absorbed by the Earth or escapes into space. They act like a blanket that effectively keeps heat trapped near the Earth.

Low vapour = higher CO2 levels = higher temperatures. High vapour = lower CO2 levels = lower temperatures.

The more important ones are water vapour, carbon dioxide, methane, nitrous oxide and ozone, in decreasing order of effectiveness mainly because of their concentrations.

Source: “Energy & Environment Vol 16 No 6 2005” by Jack Barrett

The Kyoto Protocol recognizes six greenhouse gases. The numbers in brackets are the IPCC's estimates of the warming potential of each compound compared to carbon dioxide, e.g. methane has 21 times the warming effect of the same amount of carbon dioxide:

  • carbon dioxide (1),
  • methane (21),
  • nitrous oxide (310),
  • hydrofluorocarbons (140 to 11,700),
  • perfluorocarbons (6,500 to 9,200), and
  • sulphur hexafluoride (23,900).
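The warming potentials above feed into a simple CO2-equivalent calculation. The sketch below uses the listed values; the emission amounts are hypothetical:

```python
# Global warming potentials relative to CO2, per the Kyoto list above
GWP = {
    "co2": 1,
    "methane": 21,
    "nitrous_oxide": 310,
    "sulphur_hexafluoride": 23900,
}

def co2_equivalent(emissions_tonnes):
    """Total warming effect expressed as tonnes of CO2-equivalent."""
    return sum(GWP[gas] * tonnes for gas, tonnes in emissions_tonnes.items())

# Hypothetical emissions: 100 t of CO2 plus 1 t of methane
print(co2_equivalent({"co2": 100, "methane": 1}))  # 121 t CO2-equivalent
```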

On the surface this makes CO2 look positively benign – the sort of thing a sceptic might say – except that the non-CO2 gases are present in such small quantities as to be almost negligible by comparison.

But you might notice that the most prominent agent, water vapour, is missing from the IPCC list. They say it is a "feedback", not a "forcing" agent, and that it only lasts in the atmosphere for a few days, as opposed to CO2, which they say lasts for more than 100 years. Therefore they do not use it for the purpose of arguing the AGW (human-caused warming) case.

Feedback and Forcing Agents

Estimates taken from ice cores in Antarctica and Greenland up to 1 million years ago.

It is a given scientific fact that the Earth's global temperature has ranged widely throughout its history from hot to cold. At any given time when the temperature was fairly steady, the atmosphere was "balanced", i.e. the incoming energy from the Sun was equalled by outgoing energy in some form from the Earth.

Then along come forcing or feedback agents to tip the balance and warm or cool the Earth respectively. These cycles of warming and cooling have continued from the beginning, driven mostly by variations in the Earth's orbit around the Sun.

Forcing agents created by mankind according to the IPCC

Forcing Agents: These can be external, e.g. the Sun, or internal, e.g. material which absorbs heat, such as aerosol dust from natural sources like volcanoes, or from human sources such as aerosol spray cans, CO2 emissions and so on. According to the IPCC, CO2 is believed to account for the majority of the radiative forcing that keeps the greenhouse effect going.


Feedback Agents: These can change the effects of Forcing either positively and/or negatively. For example an increase in temperature causes more evaporation which causes higher temperatures, but at the same time creates more low clouds which reflect solar radiation back into space and rains to cause cooling.

An agreed scientific fact is that water vapour is not a forcing agent in its own right: it requires a rise in temperature first, to create evaporation and more clouds. This causes an initial and temporary increase in temperature, but as the warm air rises, cooler air is drawn in from the surrounding regions.

It then becomes a feedback agent. The tops of clouds reflect incoming energy from the Sun, the warm vaporous air cools and condenses at higher altitudes, and rainfall then cools the earth. Cooler air under these clouds is drawn away to other, warmer areas.

Water Vapour

Even though water vapour is the most dominant in its ability to absorb heat, the IPCC does not include it as a greenhouse gas as previously mentioned. So let’s look at why they should include it in their calculations.

There is considerable overlap in absorption qualities between water vapour and other greenhouse gases.

The CO2 absorption rate is about 1.5% in the absence of other greenhouse gases, which causes temperature to rise. But when put together with the other greenhouse gases, its absorption rate drops to about 0.5%. In other words, it is much less effective at taking up heat.

Water vapour has about five times the ability of CO2 to absorb heat; however, its cooling effect happens mainly in the first 100m of the atmosphere and becomes less and less effective in comparison to CO2 as altitude increases.

Trade winds are classic examples of air moving from colder to warmer areas.

Yet even if water vapour only lasts in the atmosphere for a few days, rain is falling somewhere on the planet every day. Cumulatively it must have at least some contributing cooling effect on a regional, if not a global, scale.

Further, with the water vapour in the higher altitudes being almost ineffective in absorbing heat,  it would be expected that more CO2 would have a greater effect on atmospheric warming. But this does not seem to be occurring in spite of the predictions of most GCMs.

So the real question is whether water vapour is a more powerful forcing or cooling agent?  And if so, by how much?

The question of climate feedback systems is a highly complex subject. Scientists don’t yet know all the feedback systems that affect climate and maybe never will. It’s one of the reasons why computer modelling might be considered unreliable by some.

With all the uncertainty about feedback effects, not to mention outright scientific disagreements, one has to wonder, for example, whether this overlap of absorption rates has been built into the computer modelling on which the IPCC relies – which must give cause for some scepticism.


Health Effects of CO2

The level of CO2 in our atmosphere is about 0.04%, which today equates to about 400 ppm (parts per million). The level considered safe for indoor air is 600 ppm.
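The percent-to-ppm conversion used throughout this section is straightforward:

```python
def percent_to_ppm(pct):
    """1% of the atmosphere equals 10,000 parts per million."""
    return pct * 10_000

print(round(percent_to_ppm(0.04)))  # 400 ppm – today's approximate CO2 level
print(round(percent_to_ppm(1)))     # 10,000 ppm, i.e. a 1% concentration
```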

In low concentrations, e.g. less than 1%, there is little noticeable effect. Inside a building without a fresh air supply, at 1% (10,000 ppm) of CO2 some occupants might begin to feel drowsy.

It usually needs to be over 2% or so before most people become aware of it without something to alert them, e.g. some form of smell. An acid condition of the blood ("acidosis") may develop after several hours at that level. At 5% the breathing rate doubles, and above that level it becomes toxic. Prolonged exposure creates various noticeable symptoms which, if left untreated, can cause unconsciousness and even death.

However, it should be understood that adding a given amount of CO2 to a contained space correspondingly reduces the amount of oxygen available. Any health issues resulting from exposure to CO2 could be as much due to the lack of oxygen as to the CO2 itself.

Source: InspectAPedia article, “Toxicity of Carbon Dioxide Gas Exposure, CO2 Poisoning Symptoms, Carbon Dioxide Exposure Limits”.

How Much CO2 Is Up There?

Estimates put the total carbon in the atmosphere at up to 780 Gt. About 7 Gt is estimated to be added each year from all sources, of which the IPCC argues about 3.5 Gt (half) is man-made. Since pre-industrial times the concentration has risen from about 280 ppm (parts per million) to 400 ppm.

You may remember the IPCC believes CO2 stays up there for about 100 years, but there are other scientific calculations, based on carbon isotope evidence, giving a much shorter residence time of five to ten years. They suggest that only about four to five percent – roughly 30 to 40 Gt of the total 780 Gt – is derived from fossil fuel burning.

Note: The whole paper is an interesting read, but page 13 lists the scientists who have published shorter "residence" times for CO2 in the atmosphere.

But even if that is not true, or even near it, if we use the IPCC figure of 3.5 Gt being added each year because of humans, that is still only 0.45% of the total carbon count (3.5 Gt divided by 780 Gt). In other words, 99.55% of the carbon in the atmosphere has nothing to do with annual human CO2 emissions, and the remaining 0.45% just happens to be equivalent to the 0.1⁰C of warming which non-IPCC-aligned scientists have come up with.
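The paragraph's arithmetic can be verified directly from the figures it quotes:

```python
# The figures quoted in the paragraph above
total_carbon_gt = 780  # estimated total atmospheric carbon
human_added_gt = 3.5   # IPCC-attributed annual man-made addition

human_share_pct = human_added_gt / total_carbon_gt * 100
print(round(human_share_pct, 2))        # about 0.45%
print(round(100 - human_share_pct, 2))  # about 99.55%
```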


Another relationship between water vapour and carbon dioxide

Despite the degree to which water vapour dominates the greenhouse gases as the primary factor, and despite the IPCC's own estimates of how little CO2 is actually added each year to the total atmospheric content, carbon dioxide still remains the centre of attention.

One of the reasons would have to be the IPCC's continued focus on promoting AGW (Anthropogenic Global Warming) instead of looking at the climatic systems as a whole. Until they are removed as the leading authority on climate change and the matter is placed back into the hands of actual scientists, it will continue to be political.