The temperature forecasting record of the IPCC

In geology we use computer models to simulate complex processes. A good example is 4D simulation of fluid flow in oil and gas reservoirs. These reservoir models are likely every bit as complex as computer simulations of Earth’s atmosphere. An important part of the modelling process is to compare model realisations with what actually comes to pass after oil or gas production has begun. This is called history matching. At the outset the models are always wrong, but as more data are gathered they are updated and refined to the point that they have skill in hindcasting what just happened and forecasting what the future holds. This informs the commercial decision-making process.
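The history-matching loop can be sketched in miniature. Everything below is invented for illustration: a one-parameter exponential-decline model stands in for a real reservoir simulator, and the "observed" production figures are made up.

```python
# Toy history matching: calibrate a one-parameter decline model against
# five years of observed "production history", then use it to forecast.

def model(rate0, decline, years):
    """Hypothetical exponential-decline production model."""
    return [rate0 * (1 - decline) ** t for t in range(years)]

observed = [100, 92, 85, 78, 72]  # five years of "history" (invented)

# History match: grid-search the decline parameter that minimises the
# squared mismatch between the model realisation and what came to pass.
best = min(
    (d / 1000 for d in range(1, 300)),
    key=lambda d: sum((m - o) ** 2 for m, o in zip(model(100, d, 5), observed)),
)

# The updated (history-matched) model is then used to forecast the future.
forecast = model(100, best, 10)
```

The point of the exercise is exactly the one made above: the model starts wrong, is corrected against reality, and only then earns the right to forecast.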

The IPCC (Intergovernmental Panel on Climate Change) has now published five major reports, beginning with the First Assessment Report (FAR) in 1990. This provides an opportunity to compare what was forecast with what has come to pass. Examining past reports is quite enlightening, since it reveals what the IPCC has learned in the last 24 years. I conclude that nothing has been learned other than how to obfuscate, mislead and deceive.

Figure 1 Temperature forecasts from the FAR (1990). Is this the best forecast the IPCC has ever made? It is clearly stated in the caption that each model uses the same emissions scenario. Hence the differences between Low, Best and High estimates are down to different physical assumptions such as climate sensitivity to CO2. Holding the key variable constant (CO2 emissions trajectory) allows the reader to see how different scientific judgements play out. This is the correct way to do this. All models are initiated in 1850 and by the year 2000 already display significant divergence. This is what should happen. So how does this compare to what came to pass and with subsequent IPCC practice?

I am aware that many others will have carried out this exercise before, and in a much more sophisticated way than I do here. The best example I am aware of is by Roy Spencer [1], who produced this splendid chart that also drew some criticism.

Figure 2 Comparison of multiple IPCC models with reality, compiled by Roy Spencer. The point that reality tracks along the low boundary of the models has been made many times by IPCC sceptics. The only scientists this reality appears to have escaped are those attached to the IPCC.

My approach is much simpler and cruder. I have simply pasted IPCC graphics into Excel charts, where I compare the IPCC forecasts with the HadCRUT4 temperature reconstructions. As we shall see, the IPCC have an extraordinarily lax approach to temperature datums, and in each example a different adjustment has to be made to HadCRUT4 to make it comparable with the IPCC framework.

Figure 3 Comparison of the FAR (1990) temperature forecasts with HadCRUT4. HadCRUT4 data was downloaded from WoodForTrees [2] and annual averages calculated.

Figure 3 shows how the temperature forecasts from the FAR (1990) [3] compare with reality. I cannot easily find the parameters used to define the Low, Best and High models, but the report states that a range of climate sensitivities from 1.5 to 4.5˚C is used. It should be abundantly clear that the Low model is the one that lies closest to the reality of HadCRUT4. The High model is already running about 1.2˚C too warm in 2013.
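The annual-averaging step applied to the HadCRUT4 download is trivial but worth making explicit. The rows below are invented stand-ins, assuming a simple "YYYY/MM value" layout; the real WoodForTrees export format may differ slightly.

```python
# Sketch of turning monthly HadCRUT4 anomalies into annual averages.
# The data rows here are invented placeholders for the real download.
from collections import defaultdict

raw = """1990/01 0.30
1990/02 0.34
1991/01 0.25
1991/02 0.29"""

monthly = defaultdict(list)
for line in raw.splitlines():
    date, value = line.split()
    year = date.split("/")[0]
    monthly[year].append(float(value))

# Annual average = mean of that year's monthly anomalies.
annual = {year: sum(v) / len(v) for year, v in monthly.items()}
```

With the placeholder rows, `annual["1990"]` works out to 0.32 and `annual["1991"]` to 0.27.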

Figure 4 The TAR (2001) introduced the hockey stick. The observed temperature record is spliced onto the proxy record, the model record is spliced onto the observed record, and no opportunity to examine the veracity of the models is offered. But 13 years have since passed, and we can see how reality compares with the models in that very short time period.

I could not find a summary of the Second Assessment Report (SAR, 1995) and so jump to the Third Assessment Report (TAR) from 2001 [4]. This was the year (I believe) that the hockey stick was born (Figure 4). In the imaginary world of the IPCC, Northern Hemisphere temperatures were constant from 1000 to 1900 AD, with not the faintest trace of the Medieval Warm Period or the Little Ice Age, when real people either prospered or died by the million. The actual temperature record is spliced onto the proxy record, and the model world is spliced onto that to create a picture of future temperature catastrophe. So how does this compare with reality?

Figure 5 From 1850 to 2001 the IPCC background image is plotting observations (not model output) that agree with the HadCRUT4 observations. Well done IPCC! The detail of what has happened since 2001 is shown in Figure 6. To have any value or meaning all of the models should have been initiated in 1850. We would then see that the majority are running far too hot by 2001.

Figure 5 shows how HadCRUT4 compares with the model world. The fit from 1850 to 2001 is excellent. That is because the background image is simply plotting observations in this period. I have nevertheless had to subtract 0.6˚C from HadCRUT4 to get it to match those observations, while a decade earlier I had to add 0.5˚C. The 250-year x-axis scale makes it difficult to see how models initiated in 2001 now compare with the 13 years of observations since. Figure 6 shows a blow-up of the detail.

Figure 6 The single vertical grid line is the year 2000. The blue line is HadCRUT4 (reality) moving sideways while all of the models are moving up.

The detailed excerpt illustrates the nature of the problem in evaluating IPCC models. While real-world temperatures have moved sideways since about 1997 and all the model trends are clearly going up, there is really not enough time to evaluate the models properly. To be scientifically valid the models should have been run from 1850, as before (Figure 1), but they have not. Had they been, by 2001 they would have been widely divergent (as in 1990) and it would be easy to pick the winners. But they are brought together conveniently by initiating the models at around the year 2000. Scientifically this is bad practice.

Figure 7 IPCC future temperature scenarios from AR4, published in 2007. It seems that the IPCC has taken on board the need to initiate models in the past; in this case the initiation date stays at 2000, offering the same 14 years to compare models with what came to pass.

For the Fourth Assessment Report (AR4) [5] we move on to 2007 and the summary shown in Figure 7. By this stage I’m unsure what the B1 to A1FI scenarios mean. The caption to this figure in the report says this:

Figure SPM.5. Solid lines are multi-model global averages of surface warming (relative to 1980–1999) for the scenarios A2, A1B and B1, shown as continuations of the 20th century simulations. Shading denotes the ±1 standard deviation range of individual model annual averages. The orange line is for the experiment where concentrations were held constant at year 2000 values. The grey bars at right indicate the best estimate (solid line within each bar) and the likely range assessed for the six SRES marker scenarios. The assessment of the best estimate and likely ranges in the grey bars includes the AOGCMs in the left part of the figure, as well as results from a hierarchy of independent models and observational constraints. {Figures 10.4 and 10.29}

Implicit in this caption is the assertion that the pre-year-2000 black line is a simulation produced by the post-2000 models (my bold). The orange line denotes constant CO2, and the fact that this is a virtually flat line shows that the IPCC at that time believed that variance in CO2 was the only process capable of producing temperature change on Earth. I don’t know if the B1 to A1FI scenarios all use the same or different CO2 increase trajectories. What I do know for sure is that it is physically impossible for models that incorporate a range of physical input variables, initiated in the year 1900, to be closely aligned and to converge on the year 2000 as shown here, as demonstrated by the IPCC models published in 1990 (Figure 1).

So how do the 2007 simulations stack up against reality?

Figure 7 Comparison of AR4 models with reality. Since 2000, reality is tracking along the lower bound of the models as observed by Roy Spencer and many others. If anything, reality is aligned with the zero anthropogenic forcing model shown in orange.

Last time out I had to subtract 0.6˚C to align reality with the IPCC models. Now I have to add 0.6˚C to HadCRUT4 to achieve alignment. And the luxury of tracking history from 1850 has now been curtailed to 1900. The pre-2000 simulations align pretty well with observed temperatures from 1940, even though we already know it is impossible for them to have been produced by a large number of different computer models programmed to do different things – how can this be? Post-2000, reality seems to be aligned best with the orange no-CO2-rise / no-anthropogenic-forcing model.

From 1900 to 1950 the alleged simulations do not in fact reproduce reality at all well (Figure 8). The actual temperature record rises at a steeper gradient than the model record. And reality has much greater variability due to natural processes that the IPCC by and large ignore.

Figure 8 From 1900 to 1950 the alleged AR4 simulations actually do a very poor job of simulating reality, HadCRUT4 in blue.

Figure 9 The IPCC view from AR5 (2014). The inconvenient mismatch 1900 to 1950 observed in AR4 is dealt with by simply chopping the chart to 1950. The flat blue line is essentially equivalent to the flat orange line shown in AR4.

The Fifth Assessment Report (AR5) was published this year, and the IPCC’s current view on future temperatures is shown in Figure 9 [6]. The inconvenient mismatch of alleged model data with reality in the period 1900 to 1950 is dealt with by chopping that time interval off the chart. A very simple simulation picture is presented. Future temperature trajectories are shown for a range of Representative Concentration Pathways (RCPs). This is completely the wrong approach, since the IPCC is no longer modelling climate but different human, societal and political choices that result in different CO2 trajectories. Skepticalscience provides these descriptions [7]:

RCP2.6 was developed by the IMAGE modeling team of the PBL Netherlands Environmental Assessment Agency. The emission pathway is representative of scenarios in the literature that lead to very low greenhouse gas concentration levels. It is a “peak-and-decline” scenario; its radiative forcing level first reaches a value of around 3.1 W/m2 by mid-century, and returns to 2.6 W/m2 by 2100. In order to reach such radiative forcing levels, greenhouse gas emissions (and indirectly emissions of air pollutants) are reduced substantially over time (Van Vuuren et al. 2007a). (Characteristics quoted from van Vuuren 2011)


RCP 8.5 was developed using the MESSAGE model and the IIASA Integrated Assessment Framework by the International Institute for Applied Systems Analysis (IIASA), Austria. This RCP is characterized by increasing greenhouse gas emissions over time, representative of scenarios in the literature that lead to high greenhouse gas concentration levels (Riahi et al. 2007).
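The RCP labels are radiative-forcing targets rather than emissions scenarios as such. A standard approximation not taken from this post (the Myhre et al. 1998 expression, dF = 5.35 ln(C/C0) W/m2) gives a feel for what 2.6 and 8.5 W/m2 would mean in CO2-only equivalent terms:

```python
# Invert the standard logarithmic forcing approximation to find the
# CO2 concentration (treating all forcing as CO2) for each RCP target.
import math

C0 = 278.0  # approximate pre-industrial CO2, ppm

def co2_for_forcing(dF):
    """CO2 concentration (ppm) giving forcing dF, CO2-only equivalent."""
    return C0 * math.exp(dF / 5.35)

rcp26 = co2_for_forcing(2.6)  # roughly 450 ppm CO2-equivalent
rcp85 = co2_for_forcing(8.5)  # roughly 1360 ppm CO2-equivalent
```

In reality each RCP bundles many greenhouse gases and aerosols, so these figures are only indicative of the pathway targets.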

This is Mickey Mouse science speak. In essence they show that 32 models programmed with a low future emissions scenario have lower temperature trajectories than 39 models programmed with high future emissions trajectories.

The models are initiated in 2005 (the better practice of using a year-2000 datum, as employed in AR4, is ditched) and from 1950 to 2005 it is alleged that 42 models provide a reasonable version of reality (see below). We do not know which, if any, of the 71 post-2005 models are included in the pre-2005 group. We do know that pre-2005 each of the models should be using actual CO2 and other greenhouse gas concentrations, and since they are all closely aligned we must assume they all use similar climate sensitivities. What the reader really wants to see is how varying climate sensitivity influences different models using fixed CO2 trajectories, and this is clearly not done. The modelling work shown in Figure 9 is effectively worthless. Nevertheless, let us see how it compares with reality.

Figure 10 Comparison of reality with the AR5 model scenarios.

With models initiated in 2005 we have only 8 years to compare models with reality. This time I have to subtract 0.3˚C from HadCRUT4 to get alignment with the models. Pre-2005 the models allegedly reproduce reality from 1950. Pre-1950 we are denied a view of how the models performed. Post-2005 it is clear that reality is tracking along the lower limit of the two uncertainty envelopes that are plotted. This is an observation made by many others [e.g. 1].

Concluding comments

  • To achieve alignment of the HadCRUT4 reality with the IPCC models, the following temperature corrections need to be applied: 1990 +0.5˚C; 2001 -0.6˚C; 2007 +0.6˚C; 2014 -0.3˚C. I cannot think of any good reason to continuously change the temperature datum other than to create a barrier to auditing the model results.
  • Comparing models with reality is severely hampered by the poor practice adopted by the IPCC in data presentation. Back in 1990 it was done the correct way: all models were initiated in 1850 and used the same CO2 emissions trajectories. The variations in model output are consequently controlled by physical parameters like climate sensitivity, and with the 164 years that have passed since 1850 it is straightforward to select the models that provide the best match with reality. In 1990 it was quite clear that the “Low Model” was best, almost certainly pointing to a low climate sensitivity.
  • There is no good scientific reason for the IPCC not to adopt today the correct approach taken in 1990, other than to obscure the fact that the sensitivity of the climate to CO2 is likely much less than 1.5˚C, based on my and others’ assertion that a component of the twentieth-century warming is natural.
  • Back in 1990, the IPCC view on climate sensitivity was a range from 1.5 to 4.5˚C. In 2014 the IPCC view on climate sensitivity is a range from 1.5 to 4.5˚C. Twenty-four years have passed and billions of dollars have been spent, and absolutely nothing has been learned! The wool has been pulled over the eyes of policy makers, governments and the public to the extent of total brainwashing. Trillions of dollars have been misallocated on energy infrastructure that will ultimately lead to widespread misery among millions.
  • In the UK, if a commercial research organisation were found cooking research results in order to make money, with no regard for public safety, it would find the authorities knocking at its door.
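The shifting datums in the first bullet can be made concrete in a few lines. Only the offsets come from this post; the 2013 anomaly value is illustrative.

```python
# Apply each report's datum correction to one and the same HadCRUT4
# anomaly, showing how the observation lands at a different height in
# each IPCC framework. Offsets are those read off in the post above.
offsets = {"FAR (1990)": +0.5, "TAR (2001)": -0.6,
           "AR4 (2007)": +0.6, "AR5 (2014)": -0.3}

hadcrut4_2013 = 0.49  # illustrative anomaly, degrees C

aligned = {report: round(hadcrut4_2013 + d, 2) for report, d in offsets.items()}
# One observation, four different heights: 0.99, -0.11, 1.09 and 0.19
```

The spread of more than a degree between the adjusted values is the barrier to auditing referred to above.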


[1] Roy Spencer: 95% of Climate Models Agree: The Observations Must be Wrong
[2] Wood For Trees
[3] IPCC: First Assessment Report – FAR
[4] IPCC: Third Assessment Report – TAR
[5] IPCC: Fourth Assessment Report – AR4
[6] IPCC: Fifth Assessment Report – AR5
[7] Skepticalscience: The Beginner’s Guide to Representative Concentration Pathways


49 Responses to The temperature forecasting record of the IPCC

  1. Joe Public says:

    The significant difference between producing climatic models, and models for oil and gas reservoir fluid flow, is that hind cast errors from the latter cost the owners money, and their creators their reputation.

    Errors from the former cost everyone else money because of erroneous political decisions made on their predictions, and are bizarrely used as justification for yet further research – sometimes by the same incompetents.

    • Euan Mearns says:

      Agree entirely. Going back to 1990, I had already bought my first Mac and had charting software called Cricket Graph and Delta Graph. The IPCC were still using Rotring pens. It might not have been straightforward for them to compare their models with HadCRUT – maybe it wasn’t even published then? But by the time the IPCC bought themselves their first supercomputer and compared their 1990 models with reality, it should have been blindingly obvious that climate sensitivity lay at the low end of expectations. From then on, research funds should have been directed at trying to work out how much of the very little global warming was down to CO2 and other human influences and how much was down to natural cyclic change. The conclusion normally reached is that climate sensitivity is in the vicinity of 1˚C or less, and at that level there is precious little to worry about.

  2. A C Osborn says:

    The UN (which includes all the national Governments) asked for the data to show CO2 affecting the temperatures in the future, and that is exactly what they got. It has absolutely nothing to do with SCIENCE and everything to do with POLITICS.
    Even with all the massaging of the temperature data they still can’t get the so-called globull temperature up to the required amount to match even their lowest CO2 increases. If the raw data were plotted it would show temperatures dropping, not even staying stable.

    • Euan Mearns says:

      The UN asked for the data to show CO2 affecting the temperatures in the future and that is exactly what they got.

      Do you have a link to support this? I can recall reading this many years ago but can no longer find the source. I can understand the UN wanting to support a thorough review of all factors affecting climate change on Earth, including emissions. But where is the political gain in showing a false and disproportionate impact of Man and the ensuing wrecking of our energy infrastructure, natural environment and quite soon, prosperity and security too?

      • A C Osborn says:

        Euan, from Wikipedia,
        “The IPCC produces reports that support the United Nations Framework Convention on Climate Change (UNFCCC), which is the main international treaty on climate change.[5][6] The ultimate objective of the UNFCCC is to “stabilize greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic [i.e., human-induced] interference with the climate system”.[5] IPCC reports cover “the scientific, technical and socio-economic information relevant to understanding the scientific basis of risk of human-induced climate change, its potential impacts and options for adaptation and mitigation.”[6]”


        “The aims of the IPCC are to assess scientific information relevant to:[6]

        Human-induced climate change,
        The impacts of human-induced climate change,
        Options for adaptation and mitigation.”

        From the IPCC website, their role is:
        The role of the IPCC is to assess on a comprehensive, objective, open and transparent basis the scientific, technical and socio-economic information relevant to understanding the scientific basis of risk of human-induced climate change, its potential impacts and options for adaptation and mitigation.

        IPCC reports should be neutral with respect to policy, although they may need to deal objectively with scientific, technical and socio-economic factors relevant to the application of particular policies.

        Note the part that I have separated out: the first part of that sentence is what they are not, i.e. policy neutral.

        Their original remit had Nothing to do with actually understanding “Climate” at all and certainly nothing to do with anything that opposes those original aims. Also note that it was always called Climate Change and there is a note in AR1 that says “popularly known as Global Warming”.

  3. clivebest says:


    Here is an old comparison I made between HadCRUT3/UAH data and the 1990 model predictions from the FAR.

    HadCRUT3/UAH data are shown as 5-year averages.

    One of the iconic graphs from AR5 was the one plotting temperature versus cumulative emissions. This was used by Prof. Walport in his presentation to the cabinet to argue that mankind had already burned half the fossil fuels needed to exceed 2˚C warming, and that therefore we had to curb emissions in order to avoid burning the other half. However, if you put the H4 data on the same graph, using a logarithmic CO2 fit to extrapolate into the future, then you get this graph, which shows we have over twice the leeway.

    • Euan Mearns says:

      Clive, I think these are interesting exercises, since it seems you reach a similar conclusion to mine independently. It’s not that hard to do unless your livelihood depends on you not managing to do it ;-) Does Low 1.3 mean that a climate sensitivity of 1.3 is used? It’s amazing that we keep coming back to that number or less.

  4. Roger Andrews says:


    An excellent compilation. It leaves anyone but the most committed alarmist in no doubt as to what the IPCC’s models are telling us, which isn’t what the “consensus” claims.

    And now I’ve said that I can say what I really came to say ;-)

    Your post brings up an important question that gets very little attention. We say that there’s been X degrees of surface warming since 18** and Y degrees of surface warming since 19** and that there’s going to be Z more degrees of surface warming by 21**, but how, exactly, do we measure “surface warming”?

    Presently we measure it from series like HadCRUT4, which is an area-weighted average of surface air and ocean surface temperatures, and since climate models don’t output area-weighted averages we must area-weight the model surface air and ocean surface temperatures in the same way to compare them with HadCRUT4. As illustrated in the graphic below, however, the models show that ocean and air temperature trends aren’t the same, with the air warming by about 1.3C but the ocean by only about 0.8C since 1861. (It also shows modeled air and ocean temperatures continuing to diverge during the 21st century, but it turns out that this doesn’t affect the IPCC’s warming projections because the IPCC has pulled another fast one. I’ll have a separate comment on this coming up shortly):

    But we go ahead and area-weight the air and ocean model temperatures anyway, coming up with ~0.95C of model-hindcast “surface warming” since 1861. Then we compare this warming with the HadCRUT4 warming and find that they’re in the same ballpark, and everyone is happy.

    But does this comparison mean anything? No, for three reasons:

    * Area-weighting air and ocean temperatures that show different trends, particularly when the heat content of sea water vastly exceeds that of air, is not a valid procedure.

    * The surface at which HadCRUT4 measures “surface temperature” doesn’t exist as an identifiable physical feature. (The average HadCRUT4 measurement is taken either slightly below the sea surface or slightly above the ground in a medium consisting of ~70% sea water and ~30% air bubbles.)

    * Combining air and ocean temperatures covers up a “bust” in the ocean temperature model-observed comparisons.

    So how, exactly, do we measure global “surface warming” while getting results that are meaningful enough to be compared with climate models? We don’t. There’s no single metric that defines it. The only valid approach is to compare modeled and observed air temperatures and modeled and observed ocean temperatures separately. So out with HadCRUT4 and in with …. hmm. I’ll have to think about that.

  5. Roger Andrews says:

    Comment no. 2. IPCC caught with its hand in the cookie jar again.

    Euan’s Figure 10 compares HadCRUT4 observations against the CMIP5 climate models and projects model temperatures out to 2100.

    The plot below gives a closer view of the model-observation comparison. One of the observational series is HadCRUT4:

    The model temperatures shown in the above two plots are weighted averages of model air temperatures and ocean surface temperatures, and because the model air temperatures show more warming than the ocean temperatures these weighted averages show less warming than the model air temperatures.

    The model projections beyond the black line on Euan’s graph, however, are model air temperatures. I’ve overlain my earlier air temperature plots on top of them (link below) and they fit exactly.

    So the IPCC compares observations to weighted model temperatures, which is the only way of getting a half-way decent fit, but projects 21st century warming using model air temperatures, which show more warming than either ocean or weighted temperatures and therefore paint as black a picture as possible.

    If the post-black-line model temperatures were weighted like the pre-black-line temperatures, and assuming a 30/70 air/ocean weighting, model-projected warming between now and 2100 would decrease by 0.16˚C (from 0.51˚C to 0.35˚C) for the RCP26 case and by 0.84˚C (from 3.70˚C to 2.86˚C) for the RCP85 case. A significant difference, I would say.

    • Roger,

      “So the IPCC compares observations to weighted model temperatures, which is the only way of getting a half-way decent fit, but projects 21st century warming using model air temperatures, which show more warming than either ocean or weighted temperatures and therefore paint as black a picture as possible.”

      It does appear that that is what they have done. A great catch.



      • Roger Andrews says:

        Hi Dave:

        Thanks for your kind words, but it actually wasn’t a great catch at all. Fact of the matter is that I screwed up. On review I find that the IPCC in fact compares observations to model air temperatures, not weighted model temperatures, so I have to apologize for accusing them of doing something they didn’t do and to anyone else I might have misled. My earlier comments about the use of HadCRUT4 to simulate “surface temperatures” still stand, however.

        • Euan Mearns says:

          Does that mean I don’t have to spend a couple of hours trying to understand this?

          • Roger Andrews says:

            Hi Euan. Yes, I know I keep harping on about this but it really is important, so let me try another approach.

            Here’s a plot of the two Hadley/CRU temperature time series that are derived from observations – CRUTEM4 and HadSST3, both zeroed to 1950-1975 means. CRUTEM4, the land air temperature series, shows twice as much warming after 1975 as the HadSST3 ocean series (1.0 vs. 0.5C).


            With this large a difference in temperature trends we clearly have to perform separate comparisons – CRUTEM4 vs. modeled surface air temperatures over land and HadSST3 vs. modeled SSTs in the ocean – if we want an objective evaluation of climate model performance. But instead we take an area-weighted average of CRUTEM4 and HadSST3, plot it down the middle and call it HadCRUT4.


            Then we take HadCRUT4, which a) gives temperatures at a “surface” that the models cannot reproduce because it doesn’t exist as a physically definable entity, b) shows an amount of warming since 1975 (0.65˚C) that’s representative of neither the land nor the oceans, and c) is ~70% derived from SSTs, and compare it with modeled global surface air temperatures.
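The blend Roger objects to is simple arithmetic. Using the post-1975 numbers quoted above (land 1.0˚C, ocean 0.5˚C) and the ~30/70 land/ocean area weighting mentioned elsewhere in the thread:

```python
# Area-weight land-air and sea-surface warming into one "surface" figure,
# as HadCRUT4 effectively does. Numbers are those quoted in the comment.
crutem4 = 1.0   # land air warming since 1975, degrees C
hadsst3 = 0.5   # sea surface warming since 1975, degrees C

hadcrut4_like = 0.3 * crutem4 + 0.7 * hadsst3
# Gives 0.65 C -- representative of neither the land nor the ocean trend
```

The single blended number reproduces the 0.65˚C quoted above, which is exactly why Roger argues the land and ocean comparisons should be done separately.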

            Is a comparison like this going to tell us how the climate models are doing? I don’t think so.

  6. Euan,

    A great post.

    “This is the completely wrong approach since the IPCC is no longer modelling climate but different human, societal and political choices, that result in different CO2 trajectories.”

    I think the actual RCP process works in the opposite direction. The RCP forcings were chosen first at 6W/m^2, 4.5W/m^2 and 3W/m^2 (peak). Note the constant spacing of 1.5W/m^2, selected to be big enough to give climate differences in the model outcomes. The business-as-usual scenario RCP8.5 was allowed a higher jump, and it should be noted that this will make the maximum temperature and sea level rise that are usually quoted larger. It is my understanding that the models use specified concentrations of the greenhouse gases directly, rather than the emissions that would come out of a social model. So the RCPs do not relate directly to any particular social scenario, and in principle many social scenarios could be associated with a single RCP.

    These models have a large enough number of fitted parameters that I would argue that the hindcasts have little meaning. I am thinking of Von Neumann wagging the elephant’s tail here. I would be inclined to use only projections for the comparison, like Clive’s graph, and the parts of your graphs where the temperatures were in the future at the time the model was published.


    • Roger Andrews says:

      “These models have a large enough number of fitted parameters that I would argue that the hindcasts have little meaning. I am thinking of Von Neumann wagging the elephant’s tail here.”

      The remarkable thing about climate models isn’t how many parameters they have but how few. They are driven pretty much entirely by changes in radiative forcings at the top of the atmosphere, and according to the IPCC almost all of the change in TOA radiative forcing since 1750 has been anthropogenic (+1.60 W/sq m anthropogenic and a paltry +0.12 W/sq m solar).

      And this of course gives us a predetermined result. When effectively all the radiative forcings are anthropogenic and when the models respond only to radiative forcings they are bound to show that effectively all the warming since 1750 was anthropogenic. They can’t show anything else.

      The same applies to projected future warming. Model temperatures are bound to increase as CO2 increases. The relationship between the two is in fact so strong that we can replicate projected CMIP5 model temperatures through 2100 almost exactly from the IPCC’s predicted CO2 concentrations with no input from anything else. Here’s the RCP85 case (note the 2.2C best-fit climate sensitivity):
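The one-parameter reconstruction Roger describes can be sketched as follows. The 2.2˚C sensitivity is his stated best fit; the CO2 values are round illustrative numbers, not actual RCP85 concentrations.

```python
# Generate warming from a CO2 trajectory alone via the logarithmic
# relationship dT = S * log2(C / C0), with S the climate sensitivity.
import math

S = 2.2     # best-fit climate sensitivity, degrees C per CO2 doubling
C0 = 280.0  # baseline CO2, ppm (round illustrative value)

def warming(c):
    return S * math.log2(c / C0)

# A doubling of CO2 gives exactly the sensitivity; a quadrupling twice it.
dT_560 = warming(560.0)    # 2.2 C
dT_1120 = warming(1120.0)  # 4.4 C
```

This is the sense in which the model temperatures are "bound to increase as CO2 increases": with forcing logarithmic in concentration, the trajectory is fixed once the sensitivity and the CO2 pathway are chosen.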

      On the question of Neumann’s elephant, we could indeed make it wiggle its trunk in the models by adding more input variables, but the variables we would have to add are natural climatic cycles like the PDO and AMO, and a lot of the anthropogenic warming goes away when we do this.

      • Hi Roger,

        It is true that the model results for a single parameter like global average temperature can often be described in a simple way, but internally the models are complex, even without the PDO and the AMO. From an engineering perspective, this makes the climate models virtually impossible to validate.


    • Euan Mearns says:

      Dave, I disagree with you here. If the simulation models have any value at all they should be able to reproduce past climate change. A great song and dance is in fact made about their ability to do so. While a single model tuned to reproduce the past might manage a reasonable job, it is I believe a physical impossibility for a model that forecasts 4 to 6˚C warming by 2100 to reproduce past climate at all. Auditors need to go in here and require that all modellers initiate their models in 1850 and see what they produce. I assure you it will in the vast majority of cases be total dross. They should, for example, have stochastic volcanoes in there. It is easy to do, but they don’t do it. As things stand we will have to wait another 10 to 20 years to have time series long enough to compare what comes to pass with what was forecast. And we will have a bunch of overpaid Green climate scientists hoping that the world begins to fry, because it is more important for them “to be proven correct” than caring for the welfare of humanity and ecosystems.

    • Bernard Durand says:

      Dave, as you have shown yourself, there is no SRES in the IPCC approach which fits the emission scenarios that you can deduce from fossil fuel production scenarios based on Hubbert-type modelling. All of them are too high! And this is already a good reason for the temperature increase to be lower than deduced from the SRES.
      But the RCPs don’t fit either! What they are concretely based on is a mystery to me.
      But using the inadequacy of the SRES predictions to calculate the sensitivity of climate to CO2 emissions is also a methodological error.
      A good project would be to set up a loyal cooperation between fossil fuel specialists and the IPCC.

  7. Ed says:

    So basically, you’re saying climate change isn’t worth doing anything about. Well, we are certainly doing that. You must be pleased. Let’s hope climate change doesn’t exhibit any non-linear characteristics to the upside or our children and grandchildren are toast.

    • Euan Mearns says:

      Ed, you don’t seem to be able to grasp the simple message of this post, which is that the scientific evidence that exists suggests that man-made climate change is nothing like as serious a problem as the IPCC has led us all to believe. It is true that I see very little merit in squandering trillions of $ and failing to impact a largely over-inflated problem, causing misery for millions and undermining prosperity for all. I have two grown-up sons. I lose no sleep worrying about them being consumed by global warming. I am concerned that they live in a world where being right no longer matters and science has been corrupted. And I am concerned that in the next 30 to 50 years they may have to experience some bitterly cold winters without electricity or gas.

      What evidence do you have for non-linear climatic catastrophes? My view is that Earth’s climate is in general amazingly robust, kept within very tight parameters, suggesting that it is dominated by negative feedbacks. If it were any other way we would not be here. The first and probably most important negative feedback is convection, closely followed by clouds.

      • Ed says:

        I think I know you all too well. You don’t need to worry in any case because we’re not doing anything or planning to do anything to halt climate change (or should I say imaginary climate change).

        • Ed says:

          My view is that we are destined to burn through all our fossil fuels come what may. Even if we were following the worst IPCC set of predictions it wouldn’t make an iota of difference to the path we’re taking. Ideally an intelligent species would plan for the worst and hope for the best. You seem to be advocating: plan for the best and hope the worst doesn’t come about. The world is following your wish.

          As regards non-linearity, you can’t predict it. That’s the nature of non-linearity. By going over 400ppm CO2 we are playing with fire.

          • Euan Mearns says:

            Ed, you may be surprised but our positions are not as far apart as you may believe. An intelligent species should be able to prepare reports on climate change that have a semblance of credibility. Producing a report that has a range of climate sensitivity from 1.5 to 4.5˚C and to then claim 97% consensus is totally without credibility and merit – IMO. ALL of the data tends to point to climate sensitivity below 1.5˚C. If the IPCC said that and then recommended a course of action to transition away from FF that involved workable alternatives then we would be getting somewhere. Cooking the books and suggesting we save the planet with wind, wave and solar power is doing humanity a great disservice. A 50 year plan to electrify everything based on nuclear power might actually work. What you astutely recognise is that what we are doing at the moment simply is not working. It is time for plan B. If you are a Green activist who is also against nuclear power then you are either condemning the planet to 1000 ppm + CO2 – or humanity to a chaotic energy starved decline.

            At the moment I feel unsure how to call the impact of 1000 ppm – 2 doublings may take us to + 3˚C. Or there again it may take us nowhere if, say, 50% of twentieth century warming was natural, which seems likely to me.

            Speculating and scare mongering about non-linear climate effects is not a scientific basis for decision or policy making.
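            The back-of-envelope arithmetic behind the “2 doublings” remark can be sketched as follows. This is my own illustration of the standard logarithmic scaling, and the 280 ppm baseline and 1.5 ˚C-per-doubling sensitivity are assumptions chosen for the illustration, not figures from the post:

```python
import math

def equilibrium_warming(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=1.5):
    """Warming for CO2 rising from c0_ppm to c_ppm, assuming warming
    scales with the number of doublings (logarithmic forcing)."""
    doublings = math.log2(c_ppm / c0_ppm)
    return sensitivity_per_doubling * doublings

# 1000 ppm is ~1.84 doublings above a 280 ppm pre-industrial baseline,
# so a 1.5 C-per-doubling sensitivity gives roughly +2.75 C:
print(round(math.log2(1000 / 280), 2))      # 1.84
print(round(equilibrium_warming(1000), 2))  # 2.75
```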

          • Bernard Durand says:

            Ed, in general, people saying that we will burn all the fossil fuel that can be extracted are unable to give any precise idea of these quantities. Maybe that is not your case. If so, please give us your views on the evolution of fossil fuel production over time, and tell us the methods you used to find the truth.
            Fiction is easy and gives us many degrees of freedom. But for the world’s sake it is safer to rely on quantitative knowledge and physical laws.

  8. Pingback: The Temperature Forecasting Record Of The IPCC | The Global Warming Policy Foundation (GWPF)

  9. Craig W. Crosby, Sr. says:

    I am very happy that we are on the low range of IPCC predicted temps. I hope we stay there; I am waiting for the results of mid-oceanic temps to see whether or not the increases are being absorbed there along with much of the CO2 being released. Unfortunately I can find little hard data on either present or historic ranges there.

    The only scary part of the IPCC and MSM information to me would be speculation about a tipping point for widespread release of methane from clathrates. And whilst it is speculation, that, it seems to me, is where any research should be directed.

    Meanwhile, it seems that a prudent person would consider the dangers, and take action to prevent a possible debacle. For instance, if someone told you that there was a 10% chance that the bridge ahead would collapse if it was not repaired, would you argue against repair? And how would you feel several years later about the safety of that bridge? Would you say, “Hey! It hasn’t collapsed yet. Those engineers are just alarmists. It would be far too costly to repair that bridge” ?

    But then, bridge collapse has happened before; climate collapse hasn’t.


    • Craig W. Crosby, Sr. says:

      Sorry. Unclear at the end. I know climate collapse has in the distant past taken place. I meant in living memory.

    • Euan Mearns says:

      Craig, to be happy is an honest position to hold. The ocean temperature thing is important but I’m afraid I mistrust the harbingers of that news. The oceans are vast and complex. The surface is heated by direct sunlight (not CO2). Evaporation causes salinity to increase, increasing density. Warm water is less dense, salty water more dense. Cold salty water at the poles sinks, setting up the global thermohaline circulation. What sinks somewhere has to rise somewhere else, often known as cold nutrient-rich upwellings. The oceans are stratified with regard to chemistry, salinity and temperature, with locally very strong currents.

      To generate an accurate picture of all this complexity would require thousands of measuring points at hundreds of depths over centuries. The data simply does not exist.

      To return to an earlier comment I made: ocean warming will be controlled by direct sunlight. The ocean is close to a black body. The amount of direct sunlight will be controlled by cloud cover (not CO2).

      The stability of methane clathrates on the ocean floor is controlled by two variables – pressure and temperature. Rising sea levels increase pressure and stabilise clathrates. The temperature rise in the oceans is minuscule. And water has maximum density at 4˚C, meaning that bottom waters tend to gravitate towards 4˚C and stay there.
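      The pressure side of that argument is simple hydrostatics and easy to put numbers on; a minimal sketch, with the seawater density an assumed typical value of my own, not a figure from the comment:

```python
RHO_SEAWATER = 1025.0  # kg/m^3, typical seawater density (assumed)
G = 9.81               # m/s^2, gravitational acceleration

def seafloor_pressure_increase_kpa(sea_level_rise_m):
    """Extra hydrostatic pressure on the sea floor, in kPa, from a
    given rise in sea level (delta-P = rho * g * delta-h)."""
    return RHO_SEAWATER * G * sea_level_rise_m / 1000.0

# A 1 m sea level rise adds ~10 kPa (~0.1 atm) of clathrate-stabilising pressure:
print(round(seafloor_pressure_increase_kpa(1.0), 1))  # 10.1
```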

  10. Ed says:

    Thanks Euan for replying to my comments in a very civil manner. I’m not sure there is going to be a happy solution to our energy/population/climate change problems. This might surprise you, and it is a bit of common ground between us, but out of the three issues I think the energy/population issue is the most important. The nuclear/renewables discussion goes on, I’m afraid.

  11. Anthony Watts says:

    Hello Euan,
    It has been suggested to me by several people that I should repost this on WUWT so that it gains a wider audience. Per your policy, I’m contacting you to ask for your decision on the matter. Thank you for your consideration.

  12. manicbeancounter says:

    You mentioned not being able to find the SAR.
    Copies of all the UNIPCC climate assessment reports are available from

    For SAR, the three SPMs were combined into one. The file is about 0.6MB and has very few diagrams.

    The detail you want is probably in the WG1 report. The 50MB+ file takes an age to download though.

    • Euan Mearns says:

      Thanks Manic. Very few pictures in the SAR. So maybe they realised that the charts in the FAR could be shot down and didn’t know what to do. By the time of the TAR they had developed skills in obfuscation.

  13. Quick, get it peer reviewed and journal published! I have no time or expertise to pick through your assessment, but I’m sure if it’s sound analysis then you could well make your name, Euan. Or it could just rattle around the denier echo-chambers that are WUWT, the GWPF and (hesitantly) your own blog and the scientific community will carry on with their fraud (sic).

    • Euan Mearns says:

      Kit, I note that you have neither the time nor the expertise to scientifically judge what I have written, but from this enviable position of ignorance you have decided that I am a “denier”, even though my views fit in at the bottom end of the so-called IPCC consensus. And you want me to respect academia?

      Interesting wee blog! Hope you enjoyed the Global Energy Systems conference that I conceived, originated and co-organised. With hindsight you perhaps want to decide it was rubbish since I was so deeply involved.

      Interesting map of N Sea oil and gas fields on your blog. I’ve probably worked on the inorganic geochemistry of the rocks, minerals and formation fluids in about half of these fields and likely sit on one of the biggest proprietary databases on the inorganic geochemistry of N Sea reservoirs. North Sea formation waters are NaCl brines because their origin is seawater – maybe you could discover that and get it published.

      Since you have chosen to be so critical of this post and my blog, I think you should now be obliged to read the post and return with a credible critique of what I have written, together with a defence of the IPCC practice that you seem to admire.

      Finally I would like to conclude with one of my favourite quotes:

      It is difficult to get a man to understand something, when his salary depends on his not understanding it.

      Upton Sinclair

  14. Charles Hall says:

    What is the most important greenhouse gas?? Write down your answer now…

    Most people would say CO2, a few (per molecule) methane…
    In fact it’s water vapor, four times as potent as CO2 in total impact… technically not a gas, I suppose.

    This area is not my expertise (EROI, energy and the economy etc. is, also modeling in the past) but I understand that for the models to get the large sensitivity to increased CO2 they must also INCREASE WATER VAPOR. They do this with an empirical relation between temperature and water vapor. So more CO2 increases temperature, which increases water vapor, which brings up the total temperature (in the model) four-fold… which does NOT lead to further water vapor in the atmosphere….. hmmmm.

    Does anyone really understand how this works? Is my summary correct?
    The best summary I have seen is found by googling “Ian Clark Canadian Parliament” . Is his assessment correct??
    His expertise lies elsewhere but he gives a pretty compelling argument.

    So I remain …. confused.

    Charlie Hall,

    • Euan Mearns says:

      Charlie, I’m really happy that you have come to post a comment here, though I thought it might have been on net energy. You are an ecologist and so are naturally concerned about what the “fire monkeys” may be doing to the planet. And so am I. I’m just concerned that after all the money that has been spent, we still don’t know the answer.

      At the root of your query is the Clausius–Clapeyron relation.

      Anyone who knows that 1+1 = 2 should be able to understand the math. In reality, the outcome depends upon system feedbacks. The main feedback is convection: warmer means more heat convected from surface to tropopause and then radiated to space. This I believe (but do not know) is the main mechanism that keeps Earth’s climate amazingly stable.

      So this is speculative musing. But my understanding is that GCMs hold convection constant, and you as a university professor need to explain to me why one of the key VARIABLES in the climate system should be held CONSTANT, and why this seems to be accepted universally in the peer-reviewed literature.
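      For readers who want to put numbers on the Clausius–Clapeyron relation mentioned above, here is a minimal sketch using the Magnus approximation; the coefficients are the standard empirical ones and the whole illustration is my addition, not part of the original exchange:

```python
import math

def saturation_vapour_pressure_hpa(t_celsius):
    """Saturation vapour pressure of water (hPa) via the Magnus
    approximation to the Clausius-Clapeyron relation."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# The air's capacity to hold water vapour grows roughly 6-7% per deg C near 15 C,
# which is the scaling behind the water vapour feedback discussed in this thread:
e15 = saturation_vapour_pressure_hpa(15.0)
e16 = saturation_vapour_pressure_hpa(16.0)
print(round(100.0 * (e16 / e15 - 1.0), 1))  # 6.6
```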

    • Bernard Durand says:

      Charlie, you are basically right, but the question is: did the water vapour content really increase as predicted by the models?

    • kap55 says:

      A few misconceptions here. First, water vapor is responsible for 60% of the total greenhouse effect, and CO2 for 26% (Kiehl & Trenberth 1997 BAMS 78.2). But CO2 is still the most important for driving climate, because water vapor is too short-lived in the air. [Do the following thought experiment: wave your magic wand and make the entire atmosphere 100% relative humidity. What happens? Two weeks of rain, then we're right back where we started. Now wave your magic wand and double CO2 in the air. What happens? Centuries of warmer climate until the deep ocean and lithosphere absorb the excess.]

      You’re basically right about water vapor feedback. Higher temperatures cause more evaporation, which means more water vapor in the air and more greenhouse effect. But warmer air holds more water too, which means that even though the specific humidity increases, relative humidity does not. Relative humidity is a measure of how much water vapor is in the air compared to the maximum the air can hold *at a given temperature*. So if temperature increases at the same rate as evaporation, relative humidity stays the same even though specific humidity increases.

      That’s why climatologists aren’t terribly worried about water vapor: it follows temperature, and causes feedback, but it’s not a forcing agent for long-term climate effects.

  15. Pingback: Kindred Spirits and all that | Louis Hissink's Crazy World

  16. John B. says:

    I am merely leaving this in case some are interested…I am not qualified to attack this subject:

  17. kap55 says:

    Let me also repeat a comment I made at WUWT.

    The problem here is that the FAR 1990 “business as usual” scenario assumes that we would have gotten more than 1.2 W/m² of additional forcing since 1990 (see FAR Annex, Figure A.6). But the actual emissions we have seen since 1990 have been lower than FAR’s BAU scenario for CO2, lower for methane, and lower for CFCs (see FAR Annex, Figure A.3, and compare to NOAA’s Aggregate Greenhouse Gas Index). Which means the actual forcing we have seen since 1990 is about half that of FAR’s BAU scenario, and is in fact very close to FAR’s Scenario D.
    And the predicted temperature rise under Scenario D is pretty close to what we have actually experienced (see FAR Chapter 6, Figure 6.11). Which means that the FAR models pretty much get it right, if we give them the right inputs.

    • Bernard Durand says:

      Kap55, the basic problem is that not a single scenario of anthropogenic CO2 emissions used by the IPCC is based on a serious prospective study of fossil fuel production. Dave Rutledge and many others have demonstrated that. The relatively good coincidence of RCP 2.6 with observations is because this is the scenario which, by happy luck rather than better reasoning, has fossil fuel emissions closest to the real ones.
      Even IEA predictions are much lower than those of the high RCPs. So officials are in a strange schizophrenia. On the one hand they support IPCC emission scenarios, on the other hand they also support IEA emission scenarios, which are completely different.
      Moreover, the IEA is known to be over-optimistic. Since the beginning, they have predicted oil production much greater than reality, as demonstrated by experience.

    • Euan Mearns says:

      kap55 – I’ve been meaning to get back to you on this for days. It’s a very good point you make and I (or someone else) need to check out the IPCC CO2 forecast models – maybe DaveR has already done this. If the IPCC were as wildly wrong as you suggest, this begs the question why? Since near-term population growth and per capita CO2 are fairly straightforward to forecast, a 20 to 50 year emissions forecast should be a fairly simple exercise – I’ve added it to my very long list of things to look into.
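      The “fairly simple exercise” of a near-term emissions forecast could be sketched as a naive extrapolation of population times per-capita emissions; the starting values and growth rates below are hypothetical placeholders of my own, chosen only to illustrate the shape of such a forecast:

```python
def project_emissions(pop_billions, per_capita_t, years=20,
                      pop_growth=0.011, pc_growth=0.005):
    """Naive CO2 emissions projection (Gt/yr): population times per-capita
    emissions, each extrapolated at a constant annual growth rate."""
    return [pop_billions * (1 + pop_growth) ** y *
            per_capita_t * (1 + pc_growth) ** y
            for y in range(years + 1)]

# Illustrative starting point: 7.2 bn people at 5 t CO2 per head:
proj = project_emissions(7.2, 5.0)
print(round(proj[0], 1), round(proj[-1], 1))  # 36.0 49.5
```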

      • Beranrd Durand says:

        Euan, the exercise is not so simple, but it has to be done. It needs a modelling of fossil fuel production, and therefore a patient collection of good data and also a modelling method. ASPO has already done a lot for oil and gas, Dave for coal.
        The IPCC, for some reason, never considered this approach.

        • Euan Mearns says:

          Bernard, your comment went into the approval Q for some reason – capital R on your email?

          I’d agree it’s very complex to do this to the end of the century, but a couple of decades out an extrapolation of the existing gross trend should work pretty well – so long as the finance system holds together. The real question is: what did the IPCC do back in 1990?

  18. Pingback: Recent Energy And Environmental News – June 16th 2014 | PA Pundits - International

Comments are closed.