What’s up with the Bern Model?

In modelling the growth of atmospheric CO2 from emissions data, it is standard practice to model what remains in the atmosphere, since it is after all the residual CO2 that is of concern in climate studies. In this post I turn that approach on its head and look at what is sequestered. This gives a very different picture, showing that the Bern T1.2 and T18.5 time constants account for virtually all of the sequestration of CO2 from the atmosphere on human timescales (see chart below). The much longer T173 and T∞ processes are doing virtually nothing. Their principal action is to remove CO2 from the fast sinks, not from the atmosphere, a two-stage process that should not be modelled as a single stage. Given time, the slow sinks will eventually sequester 100% of human emissions, not the 48% the Bern model implies.

If emissions were switched off today, the fast sinks would continue to pump down CO2 quickly (assuming they are not saturated) until a new equilibrium between the fast sinks is reached. At that point the atmosphere may still contain 19% of total emissions over and above the pre-industrial baseline, and that residual will remain until the slow sinks have had time to pump it away.

The chart shows the amount of annual emissions removed by the various components of the Bern model. Unsurprisingly, the T∞ component, with a decline rate of 0%, removes zero emissions, and the T173 slow sink is not much better. Arguably, these components should not be in the model at all. The fast T1.2 and T18.5 sinks are doing all the work. The model does not handle the pre-1965 emissions decline perfectly (shown as “underlying”), but these too will be removed by the fast sinks and should also be coloured yellow and blue. Note that year on year the amount of CO2 removed has risen as the partial pressure of CO2 has gone up. The gap between the coloured slices and the black line is the portion of emissions that remained in the atmosphere.

This is Part 2 of a mini series on CO2 in the atmosphere. “What’s up with the Bomb Model” is coming next.

The Bern Model

The Bern Model for sequestration of CO2 from Earth’s atmosphere imagines the participation of a multitude of processes summarised by four time constants: 1.2, 18.5 and 173 years, plus one set to infinity (Figure 1). I described it at length in the earlier post The Half Life of CO2 in Earth’s Atmosphere. The T173 and T∞ constants are supposed to account for 48% of the CO2 that is sequestered. In fact, on the human time scale these slow processes sequester virtually zero manmade emissions. It is traditional when modelling the atmosphere to model what is left rather than what has been sequestered. Doing this gives the perception that the T173 and T∞ time slices dominate the non-sequestered CO2, and the impression that these components may linger in the atmosphere for a very long time (Figure 1). This is a false impression.
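As a concrete sketch, a Bern-style decline curve is a sum of exponentials. The coefficients below are my reading of the AR4 fit (an assumption, not figures taken from this post), but they reproduce the four time slices and the roughly 48% slow-sink share discussed here:

```python
import math

# Shares and e-folding times for the four slices (assumed AR4-style values):
# T-infinity, T173, T18.5 and T1.2 respectively.
A = [0.217, 0.259, 0.338, 0.186]
TAU = [float("inf"), 172.9, 18.51, 1.186]

def airborne_fraction(t):
    """Fraction of an emissions pulse still in the atmosphere after t years."""
    return sum(a * math.exp(-t / tau) for a, tau in zip(A, TAU))

def sequestered_fraction(t):
    """The complementary view taken in this post: what the sinks have removed."""
    return 1.0 - airborne_fraction(t)

print(round(airborne_fraction(0), 3))    # 1.0: the whole pulse at t = 0
print(round(airborne_fraction(100), 3))  # after a century the fast slices are gone
```

After 100 years the T1.2 and T18.5 terms have essentially vanished, leaving only the T∞ and T173 shares airborne, which is the point the post turns on its head by plotting the sequestered side instead.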

Figure 1 The Bern model has 4 components representing sequestration at different rates by different geological and biological processes. The wedge labelled “underlying” represents the pre-1965 emissions, which in the model go back to 1910 and are simply declined at 3% per annum, an imperfect approximation. The line labelled “atmosphere” is the actual atmospheric trend based upon Mauna Loa CO2 and the IPCC Grid Arendal carbon cycle. The model does not easily convey the fact that the T1.2 process is actually one of the most important for sequestering CO2 (see Figure 2), nor that the T173 and T∞ slices remove virtually zero CO2 on this time scale.

Modelling what is actually sequestered gives a totally different and truer picture (Figure 2). The vast bulk of CO2 is sequestered by the fast T1.2 and T18.5 sinks. The T173 and T∞ sinks do nothing on this human time scale and should not be in the model at all. The gap between the sequestered CO2 and the emissions in Figure 1 represents those emissions that the fast sinks have not yet had time to sequester. If CO2 emissions are turned off, the fast sinks will continue to operate, for so long as they are not saturated, and CO2 in the atmosphere will drop like a stone, but not to pre-industrial levels (see below).
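The bookkeeping behind a chart like Figure 2 amounts to asking, for each component, how much of every past year's pulse it removes in a given year. A minimal illustration with made-up emissions numbers and one component (the share and time constant are assumed T18.5-style values, not the post's data):

```python
import math

# Hypothetical annual emissions in Gt CO2 (illustrative, not the post's data)
emissions = [20.0, 21.0, 22.0, 23.0, 24.0]

# One component of the model: its share of emissions and e-folding time.
share, tau = 0.338, 18.51

def removed_in_year(year):
    """Gt CO2 this component removes during `year` from all prior pulses."""
    total = 0.0
    for t0, e in enumerate(emissions[:year + 1]):
        age = year - t0
        # slice of this pulse still airborne at the start vs end of the year
        start = share * e * math.exp(-age / tau)
        end = share * e * math.exp(-(age + 1) / tau)
        total += start - end
    return total

print(round(removed_in_year(0), 2), round(removed_in_year(4), 2))
```

The removal grows year on year as pulses accumulate, which is the rising wedge pattern in the charts; a T∞ component computed the same way would return zero every year.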

Figure 2 Plotting the annual emissions that are sequestered shows which processes are doing all the work. See caption for the figure up top for further clarification. This chart opens the possibility of examining the % of emissions sequestered each year (Figure 3).

We can now look at the percentage of annual emissions removed each year (Figure 3), which according to the Bern model has fallen from 70% in 1966 to 40% today. This looks alarming, but it is just a model and not reality.

Figure 3 A derivative of Figure 2 showing the percentage annual amount of emissions sequestered by the Bern model. The gradient in the data may look worrying but is an artefact of the model that does not necessarily represent reality.

From an earlier post:

In 1965 the mass of CO2 in the atmosphere was roughly 2400 Gt (billion tonnes) and today (2010) it is roughly 2924 Gt. That is an increase of 524 Gt. And we also know that we have added roughly 1126 Gt of CO2 to the atmosphere through burning FF and deforestation (emissions model from Roger Andrews). And so while Man’s activities may have led to a rise in CO2 the rise is only 46% of that expected from our emissions. Earth systems have already removed at least 54%.
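The percentages in that quote can be checked in a couple of lines, using the figures given in the passage itself:

```python
# Figures quoted above: atmospheric CO2 mass and cumulative emissions, Gt CO2
atm_1965, atm_2010 = 2400, 2924
emitted = 1126                       # added 1965-2010 by FF burning and deforestation

rise = atm_2010 - atm_1965           # growth that stayed in the atmosphere
airborne_pct = 100 * rise / emitted
print(rise, round(airborne_pct, 1))  # 524 Gt, roughly 46.5% airborne
print(round(100 - airborne_pct, 1))  # roughly 53.5% already removed
```

This reproduces the post's rounded 46% / 54% split.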

In the period 1966 to 2010 the Bern model on average removed 51% of emissions, which is not a good match to observations.

The fast sink equilibrium theory

In a comment on an earlier post, retired NASA astronaut Phil Chapman proposed that there should be an equilibrium distribution of CO2 between the fast sinks. If that equilibrium is disturbed by the addition of CO2 to the atmosphere (emissions), the system adjusts until the CO2 is redistributed between the reservoirs and a new equilibrium is reached. We see this as sequestration of CO2 from the atmosphere into the fast sinks (biomass and ocean water). Since there is a net addition of CO2, at equilibrium each sink, including the atmosphere, will contain more CO2 than at the outset.

Therefore, if we stopped emissions today, fast sink sequestration would continue until equilibrium is reached, and atmospheric CO2 would decline quite rapidly, but not to pre-industrial levels, since the overall amount of CO2 in the fast sinks has increased. Phil suggested that, since the atmosphere currently holds 19% of the CO2 in the fast sinks, at the new equilibrium point it may still contain 19% of the total additions, with the caveat that this will only occur if the redistribution processes between the sinks are linear. Furthermore, the slow sinks are continually removing CO2 from the fast sinks, although on human time scales these slow processes are trivial to the outcome (Figure 2). The distribution of CO2 between the fast sinks is shown below. Note that Phil originally used a NASA carbon cycle model while I am using data from the IPCC Grid Arendal model, which are slightly different.

          Atmosphere   Land biomass   Biodetritus   Shallow ocean
GtCO2        750           550           1580           1020
%             19.23         14.10         40.51          26.15
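The percentages follow directly from the reservoir masses; a quick check using the Grid Arendal figures above:

```python
# Reservoir masses from the table above (Grid Arendal carbon cycle figures)
reservoirs = {"Atmosphere": 750, "Land biomass": 550,
              "Biodetritus": 1580, "Shallow ocean": 1020}

total = sum(reservoirs.values())               # total CO2 in the fast sinks
shares = {k: 100 * v / total for k, v in reservoirs.items()}
print(total, round(shares["Atmosphere"], 1))   # 3900 Gt, and the ~19% share
```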

I have taken Phil’s idea and built a simple two-Tau model in which 19% of emissions remain in the atmosphere (T∞) and the model is balanced by varying the exponential decline rate of the second time constant. Applying a decline rate of 4% to 81% of emissions balanced the model, and it was surprisingly easy to achieve this balance (Figure 4). One weakness of this model is that it assumes equilibrium is continually reached, but I still think it is conceptually useful. It might also appear that I am following the same procedure adopted by Bern in mixing fast and slow sinks in a single model. I do not believe that I am, since my “T∞” is not modelling a slow sequestration process but the residual CO2 in the atmosphere left behind by the fast process, and so it should accumulate at the same rate as removal.
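A minimal sketch of this two-Tau bookkeeping (the emissions series and function name are illustrative assumptions, not the actual model): 19% of each year's pulse is retained as the residual, and the remaining 81% declines at 4% per year.

```python
# 19% of each pulse retained ("T-infinity"), 81% declining at 4% per year
RESIDUAL, FAST, DECLINE = 0.19, 0.81, 0.04

def atmosphere_excess(emissions, years_after=0):
    """Gt CO2 above baseline, `years_after` years after the last pulse."""
    n = len(emissions)
    total = 0.0
    for t0, e in enumerate(emissions):
        age = (n - 1 - t0) + years_after
        total += e * (RESIDUAL + FAST * (1 - DECLINE) ** age)
    return total

pulse = [100.0]                                  # a single 100 Gt pulse
print(round(atmosphere_excess(pulse, 0), 1))     # all 100 Gt just after emission
print(round(atmosphere_excess(pulse, 200), 1))   # only the ~19 Gt residual left
```

The fast component decays away and the pulse converges on its 19% residual, which is exactly the behaviour described in the switch-off experiment below.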

Figure 4 This two component model has one time constant set to ∞ for 19% of emissions and the other set to 17 years, which represents an average figure for the fast sinks applied to 81% of annual emissions. In 2010 the model removed 40% of emissions added, which is close to observations.

With this model we can do fun things like switch off emissions to see what would happen (Figure 5).

Figure 5 In this example, emissions are switched off in 1990 and it can be seen that atmospheric CO2 drops like a stone (top of red band). The black line shows how the atmosphere actually was with emissions on.

The fast sinks continue to pump, seeking that new equilibrium between the fast sinks. Projecting this into the future, it can be seen that according to the equilibrium distribution theory 19% of CO2 emissions will remain (red band), and the atmosphere will not return to pre-industrial levels until the slow processes have had time to achieve that. Emissions from 1965 to 2010 are 1126 Gt; 19% of that is 214 Gt, equivalent to about 29 ppm CO2 in the atmosphere for this time span.
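The 214 Gt and ~29 ppm figures follow from the post's own numbers (2924 Gt of atmospheric CO2 at roughly 390 ppm in 2010):

```python
emitted = 1126                       # Gt CO2 emitted 1965-2010
residual = 0.19 * emitted            # the share that lingers at equilibrium

gt_per_ppm = 2924 / 390              # Gt CO2 per ppm implied by the 2010 figures
print(round(residual))               # ~214 Gt
print(round(residual / gt_per_ppm))  # ~29 ppm
```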

Note that in this model 19% of emissions linger because the overall amount of CO2 in the system has increased and a new equilibrium is reached. That equilibrium can then only be moved by the slow sinks pumping down the fast sinks.

To complete the story, Figure 6 shows the sequestered emissions for the 2Tau model, and derived from that, the % of emissions sequestered each year is shown in Figure 7. In the 2Tau model virtually all of the emissions that are sequestered are removed by fast sinks with a mean half-life of 17 years. The underlying (pre-1965) emissions will be sequestered by these same processes. The T∞ portion with 0% decline sequesters nothing in this time frame. Interestingly, the sequestered profile is similar to Bern’s, but the two are not the same even though both models match the actual atmosphere.

Figure 6 The sequestered emissions for the 2Tau model. All the work is done by the T17 fast sinks. The T∞ sink does no work at all.

Subtracting the sequestered from the emissions provides a picture of the annual removal with time. Interestingly, the 2Tau model is flat (Figure 7) and does not have the gradient evident in Bern, and its mean sequestration rate of 54.9% of annual emissions matches observations almost exactly.

Figure 7 Derivative of Figure 6 showing the percentage of emissions sequestered each year.


  • Modelling the CO2 evolution of the atmosphere by looking at what is sequestered provides insight into what is actually going on. Models that rely on unsequestered residuals left behind in the atmosphere are unlikely to have any skill at describing the processes that left those residuals behind.
  • Examining CO2 sequestration in the Bern model shows clearly that virtually all the work is done by the fast sinks (T1.2 and T18.5). The slow T173 sink does hardly anything on human time scales and the T∞ process removes no CO2 at all (decline = 0%).
  • CO2 is removed from the atmosphere by fast sinks, and slow sinks remove CO2 from fast sinks over longer time scales. The slow sinks in fact remove virtually no CO2 from the atmosphere. This two-stage process should not be modelled as a single stage in the way the Bern model does. Given sufficient time, slow sinks will remove 100% of human emissions, not the 48% implicit in Bern.
  • A fast sink equilibrium model is presented that assumes the total amount of CO2 in the fast sinks rises with increasing emissions, but that the distribution of CO2 between the four fast reservoirs is the same once equilibrium is restored. This leads to the conclusion that if emissions were halted today, the fast sinks would continue to sequester CO2, which would fall to a baseline above pre-industrial levels representing 19% of total emissions.
  • A two component model is developed in which 81% of emissions are removed by a fast T17 sink (17-year half-life, 4% per annum decline). This may be the average of a number of sinks of various speeds. The T17 sink reduces CO2 to a new baseline that retains 19% of the added emissions.
  • The fit of the model to observations is excellent, and the model removes 54.9% of annual emissions, which is also a close match to observations.
  • Just because a model seems to give the correct result does not mean it is valid.

Earlier posts
The residence time of CO2 in the atmosphere is …. 33 years?
The Half Life of CO2 in Earth’s Atmosphere – Part 1


51 Responses to What’s up with the Bern Model?

  1. Euan Mearns says:

    This post has not been reviewed by anyone (Roger had a read) and its implications are potentially far reaching. Please recall that I use blogging as an iterative means of trying to get at the truth. If there are flaws or major flaws here, then please point these out politely and back this up with better data and theories.

  2. Euan Mearns says:

    I sense reader fatigue on atmospheric CO2.

    I was trying to think of an analogy to describe the flaw in the logic of studying the atmosphere and not the sinks. In the 19th century, thousands of Scots left for the New World. Some ended up in California. If you wanted to study the migration pattern, and particularly how many ended up in California, would you conduct a census of Scotland (the atmosphere) or California (the sink)?

  3. Gerard Wroe says:

    I am interested in the Bern model and how well it reproduces the sequestration of CO2 and the residual atmospheric mix. With Lewis & Curry’s paper on sensitivity, my next question is how confident we are in the CO2 projections for RCP4.5 and 6 used in IPCC AR5.

    • Euan Mearns says:

      Figure 3 leaves the impression that a smaller and smaller % of CO2 is being sequestered each year. Combine that with rising emissions and you would quickly arrive at a catastrophic outcome. If you check the chart I posted in comments on the observed % of sequestration you’ll see reality is flat, or even rising slightly.

      • dennis coyne says:

        Hi Euan,

        The reality is that the percentage sequestered fluctuates a lot over time, especially year to year. Our data on land use change are not very good, so long term averages are what matter: since 1900 the average has been about 45%, and the land use change component has become less important (on a % basis) as fossil fuel and cement emissions have increased.
        From 1990 to 2010 about 48% of emissions were sequestered. From 1960 to 2010 the average was 44%.

        At some point destruction of the oceanic ecosystem due to acidification may result in lower amounts of sequestered CO2 emissions, so if there is an increase in sequestration on a percentage basis it may not continue for long.

  4. dennis coyne says:

    Hi Euan,

    I did some more work on this and my criticisms from yesterday were incorrect, my apologies.

    The Bern Model also assumes an equilibrium, and the model applies to emissions only, so we essentially ignore the CO2 at the pre-industrial equilibrium (about 2200 Gt of CO2 at 280 ppm) and look at atmospheric CO2 above this amount: how this level is affected by anthropogenic emissions and by the natural processes that remove the excess CO2 from the atmosphere.

    Somehow I missed this crucial point and thought that your 2.5% (or 4%) decline in CO2 due to fast processes applied to all CO2, rather than only the roughly 850 Gt of CO2 above pre-industrial levels. As Roger attempted to explain, your model and his will only fall back to 2200 Gt if there are no more emissions of CO2.

    Another mistaken assumption by me was that the Bern model works back to 1751, this is false.

    I ran the model from 1751 to the present and it was immediately apparent that the model fit is for 1900 to 2010 where the fit is excellent.

    Using your two tier model, it is apparent that many different assumptions could be used and there are an infinite number of models that can be fit to the data (the same would be true of the Bern model.)

    Consider the following:

    Cumulative anthropogenic CO2 emissions were 1294 Gt from 1900 to 2010 and the atmospheric CO2 level rose by 735 Gt over the 1900 to 2010 period (from 2312 Gt to 3047 Gt). So 735/1294 or 56.8% of CO2 emissions remained in the atmosphere, only 43.2% of CO2 emissions were sequestered.

    So I tried a two tier model with 50% of CO2 emissions remaining forever and 50% of emissions declining at 12.53% per year. If emissions end in 2014 then CO2 stabilizes at about 3000 Gt (380 ppm), though that scenario is not very realistic.

    The Bern model falls to 2800 Gt under the same conditions (no emissions after 2013).

    A model with 35% of CO2 emissions remaining forever and 65% declining at 4.92% per year yields a model which stabilizes at 2800 Gt by 2100 (assuming no emissions after 2013).

    Note that all models were run from 1900 to 2100 with no emissions after 2013, Atmospheric CO2 assumed at 2300 Gt in 1900 (data suggests 2312 Gt in 1900).
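    The decay arithmetic behind these two-tier variants can be sketched side by side (the parameter pairs are the ones quoted above; the function name is illustrative):

```python
# Two-tier variants described above: (share remaining forever, annual decline)
variants = [(0.50, 0.1253), (0.35, 0.0492)]

def remaining(share_forever, decline, years=100):
    """Fraction of a 1 Gt pulse still airborne after `years`, no new emissions."""
    return share_forever + (1 - share_forever) * (1 - decline) ** years

for sf, d in variants:
    # after a century each variant has collapsed to near its permanent share
    print(sf, round(remaining(sf, d), 3))
```

    Both fit the historical atmosphere yet converge to very different long-run levels, which is the point about an infinite number of models fitting the data.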

    Link to chart for 35%/65% model below:


    • Dennis Coyne says:

      Link to chart for Bern Model for comparison (model in blue data in red)


      Also for Bern model the CO2 sequestered by the three Bern processes (T173, T18.5 and T1.2) from 1900 to 2010. Link below



      Feel free to edit these so they will show up in the comment; you can do that, but the tags don’t work for commenters. There is a utility that you can add easily to allow commenters to put up charts; e-mail me if you’re interested.

      I mention this because most people will not click on the link to see the chart.

    • Euan Mearns says:

      Dennis, thanks for your efforts, and I am glad that we are now on the same page. This began with Roger’s post where he said that emissions and atmosphere can be modelled with a single exponential. I think in fact they can be modelled with a huge number of multi-component models. And so while model fit is a prerequisite of validity, model fit does not prove validity.

      My 2 Tau model is I hope rooted in physical science. I don’t think that the Bern model is.

      • dennis coyne says:

        Hi Euan,

        I think the Bern Model is just fine. All of the processes are happening. Nobody can track individual carbon dioxide molecules to know specifically which processes are sequestering which carbon molecules.

        The point of my chart is to show that the T173 process does sequester some of the carbon dioxide over time, about 60 Gt since 1900.

        As far as I can tell, very little physical science has been discussed to this point.

        You seem to think a two stage model would be better.

        I think it is better to roll it into a single model. The question answered is: where does the sequestered carbon go? If it were not removed from the surface layers of the ocean, then that process would not proceed in the same way, due to saturation and a slowing of the chemistry that removes CO2 from the atmosphere into the surface waters of the ocean.

        • Euan Mearns says:

          Dennis Coyne 26 sep 2014

          I think the Bern Model is just fine. All of the processes are happening.

          Dennis, would you care to explain the physical / geological process that results in 22% of man-made emissions staying in the atmosphere forever?

          For the Bern model to be valid it MUST fit sequestration to observations, which it does not do.

          I think you may find that mixing of surface ocean water with deeper water is more rapid than you may think. There is a massive sink hole in the N Atlantic removing surface waters into the deep ocean, and a massive global conveyor system. The ocean mixing time is of the order of 1000 years – that is the time to homogenise all of the world’s ocean water.

          Google: “87sr/86sr ocean mixing time”, you’ll find a company there called “Isotopic” – these guys really do know what they’re talking about! Scroll down to Strontium Isotope Stratigraphy.

          • Yvan Dutil says:

            Euan, the Bern model is calibrated using CFCs, 14C and other tracers. But their utilization is not trivial. There are plenty of references to this work in the literature I have sent to you.

          • Euan Mearns says:

            Yvan, I have a post on 14C coming on Monday. I don’t think bomb 14C can be used for anything. It certainly can’t be used to falsify Bern, and by the same token I very much doubt it can be used to calibrate Bern. To do so would require intimate knowledge of the amount of C in the upper oceans that the inhaled atmospheric C mixed with, together with the 14C composition of that C. So as you say, any use here would have to be non-trivial; I suspect so complex that no one understands what they do. The Bern model is falsified by 1) having a time slice that sees 22% of C emissions staying in the atmosphere forever and 2) a distribution of sequestration with time that does not match observations.

            The sceptic blogs will hate my post on bomb 14C. The model I present here is actually just a modified Bern, it could be improved, but it ticks all boxes that Bern does not. Note I modelled 19% with T∞ for convenience. It is actually closer to T135 years. I’ll send you an email.

          • Dennis Coyne says:

            Hi Euan,

            The ocean mixing time is indeed around 1000 years, but the ocean does not mix uniformly. The Atlantic turns over more quickly than the Pacific and Indian Oceans.

            For the residence time of nutrient-like elements (which would include carbon):

            “The residence times of nutrient-like elements typically range from 1000 years (1 ocean mixing cycle) to several hundred thousand years. They have shorter residence times than conservative elements because some of the biogenic particles, with their elemental loads, inevitably reach the sediments and are removed from the ocean.”

            From link below:


          • dennis coyne says:

            Hi Euan,

            We do not really have observations of sequestration, we have observations of atmospheric CO2, the Bern model fits those well over the 1910 to 2010 time frame.

            The Tau-infinity I have adjusted to a Tau1000 with all other coefficients at their previous levels. I start the model in 1751 with equilibrium set at 2310 Gt CO2 for the new model (2290 previously). The model does not work well over the 1751 to 1900 period; the two-Tau model has a similar shortcoming (no doubt natural variation from ENSO, solar variation, volcanoes or other causes may be affecting CO2 more strongly than man-made emissions).

            Chart at link below:

  5. Euan:

    Let me throw in my ten cents’ worth. 😉

    You are basically arguing that the decline of atmospheric CO2 can be simulated with two time constants – a short one for a “fast” sink and a longer one for a “slow” sink. No problems there.

    Then you go on to say that the 173 and ∞ Bern time constants don’t sequester any significant amount of CO2 in the short-term and therefore “arguably … should not be in the model at all”. But if you take them out the model no longer fits observations and therefore has no predictive skill:

    What you need is a model that a) fits observations and b) uses two time constants that are physically realistic relative to the characteristics of the two sinks. a) is easily done, b) isn’t. But the time constants would have to be a lot longer than 1.2 and 18.5 years or the model won’t fit observations.

    Another complicating factor is that according to my estimates sequestration in the terrestrial biosphere sink is increasing rapidly as rising CO2 stimulates more vegetation growth (up from ~zero in 1960 to maybe 10 Gt CO2/year now, roughly twice the ocean uptake). Time-constant-based models don’t take expanding carbon sinks into account.

    • dennis coyne says:

      Hi Roger,

      If your estimate is correct, it is surprising that the Bern Model works so well. Maybe the amount being taken up by the ocean is decreasing at a similar rate. If your estimate is correct we would also see a big difference in the seasonal peaks and troughs of atmospheric CO2, due to the land area difference between the northern and southern hemispheres.
      Note that 10 Gt is a relatively small part of the yearly 400 Gt flux from plant growth. Note also that as plants die, much of the previous addition to the sinks flows back to the atmosphere. Is there peer-reviewed literature supporting your estimates?

    • Euan Mearns says:


      My 2Tau model is really looking at CO2 sequestered from the atmosphere by fast sinks (T17y) and a component not sequestered by fast sinks each year. I’m quite happy for the T17 component to be split into as many sub-parts as physical processes demand. The T∞ component is the removal of CO2 from fast sinks by slow sinks. It is of course not T∞; I calculated it has a half-life of about 135 years, which tallies well with the mixing of shallow with deeper ocean water. I left that out of the post because I was a bit unsure about the calculation; I will post it on demand.

      But if you take them out the model no longer fits observations and therefore has no predictive skill:

      You are still plotting atmosphere and not sequestration. In the terms of your chart the 18.5 + 1.2 is really just 18.5 since in this kind of representation (which is wrong) the 1.2 barely exists. The 1.2 time slice I believe is the big con in the Bern model. Do you really believe that 50% of 18% of emissions disappears in 1.2 years? Having that really fast component creates a void in your chart that is replaced by one of the two slow Taus.

      I’d be at pains to point out that my 2 Tau model is a variant of Bern and is therefore likely to be hated by sceptics as well. As already stated, the T17 can be replaced with as many Taus as you want, so long as they are rooted in physical science. But I believe the atmosphere sees only one speed: the average at which CO2 is sequestered. And the T∞ component is rooted in the physical science of the mass of CO2 circulating in the fast sinks increasing with time. The Bern concept of CO2 waiting to be sequestered by slow processes has no basis in physical science.

      Non-linearity of sinks with time is of course an interesting issue, and where the money should be spent. Saturation of sinks is another issue worthy of study.

      • dennis coyne says:

        Hi Euan,

        On a yearly basis there is a lot of flux of carbon to and from the atmosphere on the order of 200 Gt (earlier I had stated 400 Gt which upon further reading was incorrect).

        It is very easy for me to imagine that some portion of the anthropogenic emissions is removed very quickly from the atmosphere, but it is limited by how much capacity the upper layer of the ocean has to remove excess CO2, in 2002 this was estimated at 18.6% of emissions.

        On sequestration not matching observations, here is Bern model CO2 sequestration from 1905 to 2010 with a 10 year centered moving average, it varies between 40 and 50% of CO2 annual emissions sequestered when considering 11 year averages.

        When the Tau1000 replaces the Tau infinity, little changes, an extra 9 Gt of CO2 is sequestered which adds about 1.5% to the total CO2 sequestered.

        Chart at link below:


        • Euan Mearns says:

          Dennis, here’s your chart.

          Here’s what I said in an earlier post:

          What we kind of know for sure is that in 1965 the mass of CO2 in the atmosphere was roughly 2400 Gt (billion tonnes) and today (2010) it is roughly 2924 Gt. That is an increase of 524 Gt. And we also know that we have added roughly 1126 Gt of CO2 to the atmosphere through burning FF and deforestation (emissions model from Roger Andrews). And so while Man’s activities may have led to a rise in CO2 the rise is only 46% of that expected from our emissions. Earth systems have already removed at least 54%. How is this reconciled with the warnings of climatic meltdown?

          Clearly at odds with your chart / Bern. My figure 3 / Bern is also at odds with the quote above.

          If you look at my Figure 2 you can see that the % of emissions being sequestered is in decline, confirmed by Figure 3. Also at odds with your chart. There is something wrong somewhere. Why does your chart have cycles? This is an important point to nail and I was meaning to ask Roger to see if he could replicate my Figure 3. Are you able to provide your version of my Figure 2?

          • dennis coyne says:

            Hi Euan,

            I did that above, using a model with Tau1000 replacing Tau infinity.

            Link is


            Note that you would not see the cycles due to the short time frame of your analysis.

            Chart below for 1965 to 2010 with Gt of CO2 sequestered per year on right axis.


          • dennis coyne says:

            Hi Euan,

            Emissions from 1965 to 2010 were 991 Gt and atmospheric CO2 increased from 2500 to 3050 Gt, or about 550 Gt, so I get 55.5% of emissions remaining in the atmosphere and 44.5% of CO2 emissions sequestered.

            The Bern Model (modified to Tau 1000 for first term) has an average sequestration of 45% over the 1965 to 2010 period, in pretty close agreement with observations.

            Some of the figures in your quote are different from what I use, I have CO2 at 320 ppm in 1965 which is 2500 Gt CO2 rather than 2400 Gt. I use CDIAC emissions which are 991 Gt from 1965 to 2010, which is very different from your 1126 Gt (I have 1296 Gt from 1900 to 2010, and 1164 Gt from 1950 to 2010.)
            Also I have CO2 at 389.6 ppm (3047 Gt CO2).

            Link to data below:


          • dennis coyne says:

            I should have said “I have CO2 at 389.6 in 2010”

          • Euan Mearns says:

            BP and CDIAC are the same. I know Roger added some for deforestation. Need Roger to explain background to his model. I started using it so that we were working off same data.

          • dennis Coyne says:

            Hi Euan,

            I tried running the Bern model from 1965 and the early part of the curve is very different from a run from 1900 (I actually start in 1751, but I do not think the 1751 to 1899 part of the model has much effect). In order to get a decent fit I start in 1964, and the 290 Gt (underlying change above pre-industrial) needs to decline at 0.45% per year.
            Over the 1975 to 2010 period the average CO2 sequestered is 48%, and the % sequestered is increasing rather than decreasing over the 1965 to 1980 period (from 26% in 1965 to 44% in 1980).

            Charts at links

            The following model assumes CO2 emissions decline at 1% per year from 2014 to 2100; CO2 stabilizes at about 485 ppm.


            Chart with % CO2 sequestered for Bern model starting in 1964


          • Euan Mearns says:


            There are two sources of difference between us. The first is the imperfect way I manage the pre-1965 emissions. It seemed to work OK for my Figure 1. Your data and mine are similar in 2010 (~16 Gt CO2) but very different in 1965.

            To solve this, I need to go back and rebuild my model. My problem was starting with BP data (1965), not appreciating the significance of the pre-1965 stack and then getting lazy and building in an exponential fix to the pre-1965 stack. That is maybe part of the problem – but it worked OK for Figure 1!

            The other problem is that we are working off different emissions data sets. My 2 Tau model was simply tuned to provide a fit of emissions to observations. The Bern model has been tuned to fit a different set of inputs to observations. Roger built in a deforestation component to his emissions model.

            A third problem is possibly our atmosphere evolution model. I'm working off the Grid Arendal Carbon Cycle model (IPCC) with 750 Gt C in the atmosphere in 1998 and scale that forwards and backwards, calibrated to Mauna Loa as downloaded from woodfortrees.

            We need some input from Roger, who I imagine right now is sitting in a sunny restaurant, sipping tequila with his laptop.

            Your sequestration chart shows that most of the work is done by the fast sinks. Do you agree that if emissions were switched off these fast sinks would pump down atmospheric CO2 quickly?

          • Euan:

            My emissions are higher than the CDIAC numbers because CDIAC gives only FF emissions while I add an allowance for forest burning based on your data. And according to this graph from KNMI my numbers are still probably low:

            from http://edgar.jrc.ec.europa.eu/part_CO2.php

            Emissions from forest burning and other land use changes have to be included if you want to calculate sequestered CO2 tonnages correctly, but as I understand it you’ve been using my numbers and not the CDIAC numbers.

            My model is put together exactly the same as yours except that a) I use e-folding decay and you use an annual percentage decline, b) I usually output the results as ppm CO2 rather than tons CO2 (although I can do both) and c) your rows are my columns and my columns are your rows.
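            For anyone following along at home, the two parameterizations mentioned here are nearly, though not exactly, interchangeable: an e-folding time tau corresponds to an annual decline rate r = 1 - exp(-1/tau). A quick sanity check against Tau values quoted elsewhere in this thread:

```python
# Conversion between e-folding time and annual percentage decline,
# checked against numbers quoted elsewhere in this thread.
import math

def efold_to_annual(tau):
    """Annual fractional decline equivalent to e-folding time tau."""
    return 1.0 - math.exp(-1.0 / tau)

def annual_to_efold(r):
    """E-folding time equivalent to annual fractional decline r."""
    return -1.0 / math.log(1.0 - r)

print(efold_to_annual(1000))    # ~0.001/yr, the figure quoted for Tau=1000
print(annual_to_efold(0.0656))  # ~14.7 yr, close to the fast Tau in this thread
```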

  6. Euan Mearns says:

    Yvan, thanks for your effort. The point is that all this diverse observation and theory is distilled into one equation which I hope Clive has copied correctly from AR4. You are correct, I should have gone to AR4 to check this out and see what they say and I will do it.

    Interesting to note that if you Google Bern Model you get 2 returns for WUWT articles on P1 and 2 returns for Energy Matters on page 2 (this blog is still pretty small). I find increasingly that when I want to find out about something interesting on the internet I get referred to my own work. The late John Daly is there too.

    It's a bit strange that the model used to predict global CO2 has so little prominence on Google. Is no one interested in this?

  7. michael hart says:

    “Is no one interested in this?”
    Well I am, Euan. But what is prominent on Google, and for what reasons, is something Google wouldn’t tell you!

    Something very interesting (IMO) in the carbon-cycle is actually happening right now (or the next 2-3 years). And I’m not referring to the new OCO-2 satellite.

    The atmospheric 14C isotope experiment is reaching a critical point, where the measured 14C atmospheric level is approaching the same level as it was before the atmospheric weapons tests. This means the “natural” removal rate should become zero (or rather the exponential decay approaches zero and is lost in the noise). Ongoing human contributions of cold carbon should then drive relative 14C levels below the previous baseline by dilution, which several authors say is now the biggest influence on 14C levels.

    As far as I know the best/latest data come from the Jungfraujoch and Schauinsland measuring stations as reported by Levin, Kromer and Hammer, 2013.

    Graphs from the paper here:
    To my eye, the data still looks (predominantly) like a single exponential decay when it should be accelerating downwards due to the rapid recent increases in human emissions. I assume the same authors will be publishing more in future. Let's watch and see if the models have got it right!
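    The competing effects described here can be sketched numerically as a decaying bomb excess plus a steady dilution from 14C-free fossil carbon. The time constant and dilution rate below are illustrative guesses, not values fitted to the Jungfraujoch or Schauinsland data.

```python
# A rough numerical sketch: the bomb 14C excess decays toward the old
# baseline while 14C-free fossil carbon dilutes the ratio below it.
# TAU_BOMB and DILUTION are illustrative guesses, not fitted values.
import math

TAU_BOMB = 16.0   # e-folding time of the bomb excess, years (assumed)
DILUTION = 3.0    # per-mil per year of Suess-type dilution (assumed)

def delta14C(years_after_peak, excess0=700.0):
    """Excess 14C (per-mil above the pre-bomb baseline) at a given time."""
    bomb = excess0 * math.exp(-years_after_peak / TAU_BOMB)
    return bomb - DILUTION * years_after_peak

# Decay alone never crosses the baseline; the dilution term is what
# eventually drives the level below zero.
trajectory = [delta14C(t) for t in (0, 20, 40, 60)]
```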

    • Euan Mearns says:

      Michael, unfortunately the bomb 14C data cannot be used in the way it has been used. It's the subject of my next post. It looks like I'm carving out a position in the no man's land between the IPCC and the sceptics.

  8. michael hart says:

    I think the y-axes on those graphs I posted are incorrectly labelled as % when they should be “per-mil”.

  9. Euan Mearns says:

    Chart from Dennis Coyne, showing the same thing as my Figure 2 in a superior way.

    The fast Tau1.2 and Tau18.5 processes are removing CO2 from the atmosphere based on partial pressure of CO2 in the atmosphere. The slow Tau173 and Tau1000 processes are predominantly removing CO2 from the fast sinks (not the atmosphere) – unless I am mistaken.

    A valid model needs to be constructed with two stages: one where the fast sinks eventually remove 100% of emissions from the atmosphere, and a second where the slow sinks eventually remove 100% of emissions from the fast sinks, based upon the partial pressure of CO2, bicarbonate and organic acids in those sinks.

    A valid C cycle model also needs to have sequestration of bio detritus in estuaries, deltas and marine sediments. This is absent in the Grid Arendal Model that I use.
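    A minimal numerical sketch of the two-stage structure proposed here: the atmosphere drains only into the fast sinks, and the fast sinks drain into the slow sinks. Both rate constants are assumptions for illustration, loosely based on the e-folding times discussed in this thread.

```python
# Two-stage cascade: atmosphere -> fast sinks -> slow sinks.
# Rate constants are illustrative assumptions, not fitted values.
K_FAST = 1.0 / 14.7    # atmosphere -> fast sinks, per year (assumed)
K_SLOW = 1.0 / 1000.0  # fast sinks -> slow sinks, per year (assumed)

def two_stage(emissions):
    """Annual time-stepping of the cascade; returns (atmosphere, fast, slow)."""
    atm = fast = slow = 0.0
    for e in emissions:
        atm += e
        to_fast = K_FAST * atm   # fast sinks pump down the atmosphere
        to_slow = K_SLOW * fast  # slow sinks pump down the fast sinks
        atm -= to_fast
        fast += to_fast - to_slow
        slow += to_slow
    return atm, fast, slow

# 100 years of emissions, then 5000 years with emissions switched off
atm, fast, slow = two_stage([10.0] * 100 + [0.0] * 5000)
```

    Mass is conserved at every step, the atmospheric residual is pumped down quickly once emissions stop, and given enough time the slow pool ends up holding essentially everything, which is the sense in which the slow sinks eventually sequester 100% of emissions.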

  10. Euan Mearns says:

    Dennis Coyne posted this link to historic emissions data from Oak Ridge:


    It shows 515 million tonnes C emitted from burning coal in 1900. Britain alone emitted 129 million tonnes C from coal in that year. Was Britain alone really responsible for 25% of global emissions?

    Last week in a comment Roger said he didn’t need to look for the source of atmospheric CO2 rise since emissions and atmosphere match exactly.

    • dennis coyne says:

      Hi Euan,

      Note that Roger mentions that CDIAC emissions data only covers fossil fuel emissions; that is incorrect. It covers fossil fuels, cement, natural gas flaring and land use change, and all are included (the total C in the right-most column) in my models.

      To check the coal data we could look at Dave Rutledge’s data.

      For those interested, my modified Bern model (I changed the fixed term to a declining exponential with Tau=1000, which is a decline of 0.1% per year) can be downloaded at the link below (5 MB file so it takes a while to load; click on the download link near the top center of the page to download the Excel spreadsheet):


      • Euan Mearns says:

        Dennis, I already sent an email to Dave R asking for data, though I imagine I could look on his web site. I don't see that the CDIAC data covers land use change. Seems to be FF and cement only.

        I have no issue with replacing T∞ with T1000, apart from the fact that from my geochemistry days I recall ocean mixing time is of the order of 1000 years. Post coming up Tuesday perhaps.

        I don't really have an issue with Bern tuning their model to fit emissions. But the emissions database needs to be grounded in observation. It's very odd to exclude deforestation.

        I just checked Dave's page and downloaded his spreadsheet. I think he has 800 million tonnes coal in 1900, so 500 million tonnes carbon is probably on the money.

        Thanks for your spreadsheet.

        • dennis coyne says:


          I also looked at Rutledge's data and the CDIAC estimates may be low in 1900. If I did the maths correctly the difference is a factor of 2, and we could easily substitute the Rutledge data for coal.

          My mistake on the CDIAC data, I had thought land use change was included in that file. Land use data can be found at the link below (historical sheet):


          The model coefficients work best on data without land use change included.

          The coefficients need to be tweaked to get a good fit with land use change included.

          Below is a two-tier model with land use change included: 21.7% of emissions go to slow processes with Tau=1000, and the rest are sequestered by fast processes at Tau=14.4 (a 6.56% annual rate of decline).
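          As a quick check on these coefficients, the implied pulse response can be written down directly. This is a sketch of the partition as I read this comment, not the actual spreadsheet.

```python
# Two-tier partition of a pulse of emissions: 21.7% to a slow pool
# (Tau = 1000) and 78.3% to a fast pool (Tau = 14.4), as described
# in this comment. A sketch, not the actual spreadsheet model.
import math

F_SLOW, TAU_SLOW = 0.217, 1000.0
F_FAST, TAU_FAST = 0.783, 14.4

def airborne(t):
    """Fraction of an emissions pulse still airborne t years later."""
    return F_SLOW * math.exp(-t / TAU_SLOW) + F_FAST * math.exp(-t / TAU_FAST)
```

          A century after a pulse the fast term has all but vanished and roughly the slow fraction (about 20%) is still airborne, consistent with the residual discussed in the post.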


        • dennis coyne says:


          You're welcome.

          The Bern model is pretty old, perhaps they did not have good land use change data when it was developed.

          It is strange that in AR4 they did not seem to include land use change when estimating the coefficients.

          Note that the land use change estimates are quite rough, with large error bars; it is difficult to estimate this well, especially for the past. I have done some estimates including land use change; the coefficients of the Bern model need to be adjusted for a decent fit, and the fit is best from 1850 to 1950 (we do not have land use change estimates prior to 1850).

          The Bern model overestimates atmospheric CO2 from 1950 to 1990 and then underestimates it from 2000 to 2013. The same is true of a two-tier model from 1900 to 2013.

  11. Euan Mearns says:

    Here are your charts, Dennis. Many thanks for your efforts. I think the two are virtually indistinguishable, except that I think I understand the physical science grounding of the two-tier model.

    The Tau 14.7 in the two-tier model can, I believe, be substituted with a multitude of fast Taus, but the atmosphere only sees one. Non-linearity with time / pCO2 is a complicating issue not yet addressed.

  12. Pingback: What’s up with the Bomb Model? | Energy Matters

  13. dennis coyne says:

    Hi Euan,

    At first glance the charts look the same, but the two-tier model above underestimates CO2 in the atmosphere by 75 Gt, whereas the modified Bern model is only low by 50 Gt. I did a second two-tier model with 52.4% of emissions going to fast processes and the remainder to slow processes, with Tau136 and Tau4.48. I also created a CO2 emissions profile for 2014 to 2100, roughly based on Steve Mohr's fossil fuel estimates and future land use change estimates, to see how the models compare. The three models, all run on this same emissions profile, are the modified Bern and two-tier models shown in the charts above, plus the third two-tier model (52.4% to fast sequestering processes).

    Note, for those who might say “aha, the CO2 stabilizes at 450 ppm”: this model does not account for other greenhouse gases and positive climate feedbacks, and it is also based on a fairly conservative estimate of coal reserves, which will rise as oil and natural gas deplete and coal prices inevitably rise. We may see World coal use follow the Chinese model as other fossil fuels deplete and coal prices go up.

    Note that a Hubbert type analysis invariably underestimates the long run URR as resources are developed into reserves with rising prices.

    Chart at link below


  14. Jim says:

    Hi Euan,
    If there has been a slight warming, at least since 1650, say, should there not be a component of ocean degassing in the contribution to CO2 in the atmosphere?

    • Euan Mearns says:

      Hi Jim, good to hear from you. Of course there should be a contribution from ocean warming. How to estimate that? Perhaps a cross plot of CO2 and T from Vostok, plus an estimate for temperature from the depths of the LIA? It's quite simple to build that into a model, and it would result in higher sequestration rates if we stick a further natural increment onto emissions.

      I have another question that you may be able to help with. In my last post on the Gulf Stream there is a lot written on the formation of Atlantic Deep Water off N Norway. Cooling saline water eventually sinks. But what happens at the other end, where the flow is supposed to upwell in the N Pacific? I really cannot find much written on that, apart from the California Current, where “acid” nutrient-rich waters upwell owing to wind shear and other processes. Is that the far end of the Gulf Stream?

Comments are closed.