The Horrors of Homogenization

There has recently been a lot of discussion about the homogeneity adjustments that GISS and others have applied to surface air temperature records, and since this is a subject I’ve done some work on I thought it would be appropriate to say something about it.

The problem, however, is how to say it, because the subject defies exhaustive treatment in a single blog post, and I don’t think presenting yet another set of before-and-after examples of what homogenization does to raw records would greatly advance the state of knowledge. So what I will do here is touch on the basics and then work my way through a few examples of what homogenization actually does in practice, one of them in detail.

First a note on data sources. My main data source is the GHCN v3 mean temperature data set available at KNMI Climate Explorer, which includes the raw records (“GHCN all”) and the NOAA/NCDC homogeneity-adjusted records (“GHCN adjusted”). The GHCN v3 data are expressed as anomalies relative to 1981-2010 means. (Note that the GHCN v3 adjusted data set is not the same as the GISS “homogeneity adjusted” data set.) Supplementing the GHCN v3 data are raw GHCN v2 records that I downloaded from the GISTEMP GHCN v2 data set a few years ago. I am unable to update them because GISS has not updated them since October 2011. Other data sources are specified in the text.

Homogenization is a process whereby raw temperature records in the same general area are adjusted by computer algorithms to match each other to within acceptable limits. The adjustments are applied because the raw records are assumed to be distorted to a greater or lesser extent by station moves, equipment replacement, time-of-observation changes and/or physical changes in the vicinity of the station, and it is further assumed that these distortions must be removed before the records can be considered suitable for use. There are, however, two problems with these assumptions.
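Before getting to those problems it may help to have a concrete picture of what “adjusted by computer algorithms to match each other” means in practice. Below is a minimal Python sketch – emphatically not the NCDC code, which is far more elaborate – of the general pairwise idea: look for a step in the difference series between a candidate station and a neighbour, then shift the earlier segment of the candidate to remove it. Real algorithms use many neighbours, significance tests and station metadata; the synthetic data here are purely illustrative.

```python
import numpy as np

def detect_step(candidate, neighbour):
    """Return (break_index, step_size) for the largest mean shift in the
    candidate-minus-neighbour difference series."""
    diff = np.asarray(candidate) - np.asarray(neighbour)
    best_i, best_step = None, 0.0
    for i in range(5, len(diff) - 5):                 # ignore very short end segments
        step = diff[i:].mean() - diff[:i].mean()
        if abs(step) > abs(best_step):
            best_i, best_step = i, step
    return best_i, best_step

def remove_step(candidate, break_index, step):
    """Shift the pre-break segment so the series is continuous across the break
    (adjusting the past to match the present, which is the usual convention)."""
    adjusted = np.asarray(candidate, dtype=float).copy()
    adjusted[:break_index] += step
    return adjusted

# Illustrative synthetic example: a 1 C artificial jump inserted at year 30 of a
# 60-year record that otherwise tracks its neighbour.
rng = np.random.default_rng(0)
neighbour = rng.normal(0.0, 0.3, 60)
candidate = neighbour + rng.normal(0.0, 0.2, 60)
candidate[30:] += 1.0                                 # the artificial discontinuity
i, step = detect_step(candidate, neighbour)
homogenized = remove_step(candidate, i, step)
print(i, round(step, 2))                              # expect a break near 30, step near +1
```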

The first is that many raw temperature records show no sign of serious distortion. It’s common when comparing raw records from stations in the same area to find that they match quite well, meaning either that all of them are distorted by the same amount in the same direction at the same time, which is improbable, or that none of them is significantly distorted. Figure 1 shows an example from the Nova Scotia-Newfoundland area:

Figure 1: Five Nova Scotia-Newfoundland raw records (Charlottetown, Sydney, St. John’s, Gander, Shearwater). Data GISS
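A simple way to quantify how well neighbouring raw records “match”, beyond eyeballing a plot like Figure 1, is to correlate their annual anomalies pairwise. The sketch below assumes the records have already been assembled into a table with one column per station and one row per year; the file name is a placeholder, not a real download.

```python
import pandas as pd

def pairwise_correlations(annual_anomalies: pd.DataFrame) -> pd.DataFrame:
    """One column per station, one row per year (NaN where data are missing).
    Requires at least 20 common years before reporting a correlation."""
    return annual_anomalies.corr(min_periods=20)

# Usage sketch (file name hypothetical):
# df = pd.read_csv("ns_nfld_annual_anomalies.csv", index_col="year")
# print(pairwise_correlations(df).round(2))
```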

And often when a raw record does show what appears to be a large artificial discontinuity, such as the abrupt ~2C upward shift in the Nome, Alaska record after 1976 ….

Figure 2: Nome, Alaska, raw record. Data GISS

…. it turns out not to be. Other records in Alaska show the same feature (Figure 3). The upward shift was natural. It was caused by the 1976 “phase change” in the Pacific Decadal Oscillation:

Figure 3: Nome plus five other Alaska raw records (St. Paul, Bethel, Tanana, Bettles, Kodiak). Data GISS

The second is the problem of identifying any artificial distortions that may be present. Usually only the really large ones are visible in the raw records; small and medium-sized ones are difficult, and sometimes impossible, to detect either visually or statistically. The Paraguayan records, some of which were recently featured on Paul Homewood’s blog and elsewhere, are an example. Figure 4 plots eleven unadjusted GHCN v3 temperature records from the area. They are fairly typical of what raw temperature records over much of the world look like:

Figure 4: Eleven raw records in Paraguay and surrounding area (Puerto Casado, Concepción, Ponta Pora, Bahia Negra, Pedro Juan Caballero, Mariscal, Asunción, Corumba, Las Lomitas, San Juan Bautista, Formosa). Data GHCN v3

Unless historic records that accurately document the history of each of these stations are available – and here I’m sure they aren’t – identifying artificial discontinuities in records like this is a hopeless task. Yet the records get homogenized anyway, with the results shown in Figure 5:

Figure 5: Paraguayan records after homogeneity adjustment. Data GHCN v3

The adjustments that achieve this result are shown in Figure 6. (The internal workings of the NOAA/NCDC computer algorithm that generated them – hereafter the NCDC algorithm – aren’t strictly relevant to this post but details are here should anyone want more information.)

Figure 6: Homogeneity adjustments applied to Paraguayan records. Data GHCN v3

It’s hard to see how this hodgepodge of adjustments could reflect actual artificial discontinuities in the records. The NCDC algorithm seems to have applied adjustments simply to make the records track each other. But at the same time it adds about a degree of warming that isn’t seen in the raw records – where did that come from? Clearly there are questions as to whether the homogenization process is working the way it’s supposed to here. (A case can also be made that if adjustments this large are needed to make the records “correct” then the records were too distorted to have been used in the first place, but we’ll let that pass.)

To obtain further insights into how homogenization works we now turn to a detailed example – Alice Springs in Australia, which, being the only continuous long-term record for many miles in any direction, is one of the more important records in the global surface air temperature data base. Alice is a good example of what homogenization does in practice and it also gives an idea of the degree of detail we have to go into before we can decide whether a record needs adjusting.

Figure 7 presents the Alice record in its raw state. It shows little or no warming since 1880 and no obvious evidence of artificial discontinuities:

Figure 7: Alice Springs raw record. Data GHCN v3

First we will review the history of the Alice station, which I reconstructed from historic records and Australian Bureau of Meteorology metadata (example here.) Temperature measurements at Alice began in 1879 at the Telegraph Office north of town, shown in the old sepia photo below.

Figure 8: The Telegraph Office

A key factor in evaluating temperature records is station quality, and this looks like a good place for a station (at least urban warming wouldn’t have been a problem). But before we can confirm that it was we need to know that the thermometer was out in the open and properly screened. Thanks to a painstaking job of restoration by the local authorities, plus a tourist photo posted on Google Earth, I was able to find the location. The structure inside the black circle is a Stevenson screen, and given how persnickety the restorers of historic sites are we can reasonably assume that it was part of the original installation:

Figure 9: Tourist photo revealing screen location

Figure 10 pinpoints the screen on a Google Earth overhead view. The location probably would not have made WMO class 1 but it’s a high-quality site nonetheless (the red lines on this and following overhead views are provided for scale and are 100 m long unless otherwise specified).

Figure 10: Google Earth view of Telegraph Office thermometer location

In 1932 the Telegraph Office station was decommissioned and replaced by a station 3km south at the Alice Springs Post Office. Figure 11 shows what the site looks like now (the Post Office itself has moved to the suburbs). It would not of course have looked like this in 1932, but station quality would probably still have suffered and we might expect a discontinuity in the temperature record as a result.

Figure 11: Current Google Earth view of Post Office thermometer location (approximate)

The Post Office station operated until 1989, but in 1942 the station at what is now known as the Old Airport, located 11 km south, supplanted it as the official recording site. The Old Airport was a wartime base, so there probably would have been changes in equipment and observational procedures as well as a station move. The location of the Old Airport station, as best I can fix it, is shown along with the current airport station in Figure 12. The exact location is uncertain but it was probably on or close to the asphalt hardstanding to the right of the buildings – hardly an ideal place for a thermometer – so we might expect another discontinuity in the Alice record in 1942.

Figure 12: Google Earth view of Old Airport and Current Airport thermometer locations

In 1974 the station was relocated for the third time, from the Old Airport to the current airport station a little less than a kilometer to the northeast (Figure 13). We might expect another discontinuity in the temperature record here because the current airport station is a high-quality station (I rate it WMO class 2) while the Old Airport station probably was not:

Figure 13: Google Earth close-up of current airport station thermometer location

Figure 14 shows the four station locations relative to each other. The current airport station is 13km south of the original Telegraph Office station and 42m lower.

So here we have three documented station moves, each of which might be expected to have generated an artificial discontinuity in the Alice temperature record. Did they?

Figure 14: Google Earth view of the four Alice Springs thermometer locations

One way to find out is to compare the individual records from the four stations. First we compare the Telegraph Office and Post Office records. There are no reliable overlap values but visually there’s no obvious sign of a discontinuity:

Figure 15: Telegraph Office and Post Office temperature records. Data GISS

Next we superimpose the Old Airport record. It matches the Post Office record very closely. There’s definitely no discontinuity here:

Figure 16: Telegraph Office, Post Office and Old Airport temperature records. Data GISS

Finally we superimpose the current airport station. It also overlays the Post Office record almost exactly. No discontinuity here either.

Figure 17: Telegraph Office, Post Office, Old Airport and current airport temperature records. Data GISS

Matches this close are, however, suspicious. Temperature records from different stations rarely line up this well. Is it possible that the four Alice records were at some point adjusted to match each other? Indeed it is. But if this is what was done the raw Alice record is already homogenized. There’s no need to re-homogenize it. And if the numbers weren’t adjusted there’s also no need to homogenize it. It was homogeneous to begin with.

I ran a final check by comparing the raw Alice record with the raw records from stations around it, which as noted earlier is a good way of confirming that a record isn’t seriously distorted. Unfortunately there are no records close to Alice that are long enough to tell us anything, so I had to settle for the records from the six stations shown in Figure 18, which are up to 900km away.

Figure 18: Stations around Alice Springs

Figure 19 plots the records from these six stations against the Alice record. The match isn’t perfect, but with the stations covering an area of at least a million sq km we wouldn’t expect it to be. Yet the peaks and troughs generally line up and the overall trends are substantially the same, giving us confidence that all of them are recording mostly real and not spurious temperature changes. (The plot excludes three 2-sigma outliers: Boulia in 1893 and 1894 and Halls Creek in 1949. The 1957-84 period is used as the baseline because all seven stations were operating in those years):

Figure 19: Alice Springs raw temperature record vs. raw records from surrounding stations. Data GISS
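For anyone wanting to reproduce this kind of comparison, the preparation amounts to two steps: re-baseline each record to its 1957-84 mean and screen out values more than two standard deviations from the station mean. The sketch below shows one way to do it in Python; the input file name and layout are assumptions, not the actual data files behind Figure 19.

```python
import pandas as pd

def rebaseline(df: pd.DataFrame, start: int = 1957, end: int = 1984) -> pd.DataFrame:
    """Subtract each station's 1957-84 mean so all records share a common baseline."""
    return df - df.loc[start:end].mean()

def drop_outliers(df: pd.DataFrame, n_sigma: float = 2.0) -> pd.DataFrame:
    """Mask annual values more than n_sigma standard deviations from the station mean."""
    z = (df - df.mean()) / df.std()
    return df.where(z.abs() <= n_sigma)

# Usage sketch (file name hypothetical):
# df = pd.read_csv("alice_and_neighbours_annual.csv", index_col="year")
# plot_ready = drop_outliers(rebaseline(df))
```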

That concludes the detailed analysis of the raw Alice Springs temperature record (although it still isn’t as detailed as it should be; a full analysis would among other things require a review of the original paper records to confirm that the individual records shown in Figures 15 through 17 are what I think they are). Does the analysis prove beyond doubt that the raw record is correct? No, it doesn’t. But neither has it revealed any obvious flaws, and one of the things I learned in the years I spent analyzing assay data bases in the mining industry is that if your raw data are not obviously flawed you don’t mess with them.

But few records in the Southern Hemisphere escape the gentle ministrations of NCDC’s homogenization algorithm, and the Alice record is no exception. Figure 20 shows what it looks like after the algorithm has done its work:

Figure 20:  Homogeneity adjustments applied to Alice Springs raw record. Data GHCN v3

How did the algorithm obtain these results? The adjustments tell the story. The algorithm identified most of the peaks and troughs in the raw record after 1918 as artificial discontinuities and smoothed them out with stepwise adjustments. But only one of these adjustments (1930-32) coincides with a known station move. The others coincide with fluctuations that are all visible to a greater or lesser extent in the records from surrounding stations (Figure 19), indicating that they are real climatic features and not spurious shifts. So instead of replacing distorted data with valid data the algorithm has replaced valid data with distorted data. (The gaps in the adjusted record, incidentally, show where the algorithm decided the raw data failed quality control.)

But again the adjustments add warming – in this case 1.5°C, even more than they added in Paraguay. How did the NCDC algorithm achieve this? I’m at a loss to explain it. Warming can be added if a record is homogenized with surrounding records that show more warming, but there’s no record anywhere near Alice that shows as much as the ~2°C of warming the adjusted record shows.
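One straightforward way to quantify the warming an adjustment adds is to compare the linear trends of the raw and adjusted series; this may not be exactly how the figures quoted here were derived, but it gives the same order of magnitude. A sketch, assuming the two records are annual series indexed by year (the file names are placeholders):

```python
import numpy as np
import pandas as pd

def trend_per_century(series: pd.Series) -> float:
    """Least-squares trend in degrees C per 100 years, ignoring missing years."""
    s = series.dropna()
    slope = np.polyfit(s.index.values.astype(float), s.values, 1)[0]
    return slope * 100.0

# raw = pd.read_csv("alice_raw_annual.csv", index_col="year").squeeze("columns")
# adj = pd.read_csv("alice_adjusted_annual.csv", index_col="year").squeeze("columns")
# print(f"Warming added by adjustment: {trend_per_century(adj) - trend_per_century(raw):.2f} C/century")
```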

Now homogeneity adjustments do not of course always add warming. In some cases they add cooling. But the impact over large areas is to bias the raw records towards warming. A few years ago I did a before-and-after analysis of 52 records in South America using the GISS raw records and an older version of NCDC’s adjusted GHCN v3 data set that has now been superseded, so the results are no longer current. Nevertheless they still illustrate the impact of homogeneity adjustments on the sub-continental scale. The results are summarized in Figure 21, which plots the warming/cooling trends measured from the raw GHCN v2 records against the adjustments applied by the homogenization algorithm:

Figure 21: Raw record warming trend vs. homogeneity adjustment, 52 records in South America

The implications of these results are not immediately obvious so I will summarize them verbally. The trend line slopes up to the left, signifying that raw records that show cooling have received larger warming adjustments than those that show warming. This is the way the trend line has to go if the raw records are to be homogenized. But the warming adjustments applied to the records that show cooling are not offset by cooling adjustments to those that show warming, and as a result the adjustments add about a degree of overall warming that isn’t present in the raw records (the average of all the adjustments is plus 1.05C). Again I am unable to explain where the extra warming came from, but it’s clearly been manufactured somehow by the algorithm, like the warming at Alice Springs and in Paraguay. (I have a comparable plot for Africa south of the Equator which I won’t bother to show because it looks very much like the one for South America.)

And if you are still confused, just look at where the trend line crosses the vertical axis. If the homogenization algorithm is unbiased the trend line will pass through (0,0). Here it crosses more than a degree C above it.
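For completeness, here is a sketch of the bias check just described: regress each station’s adjustment against its raw trend and look at where the fitted line sits at zero raw trend. An unbiased algorithm should give an intercept near zero; an intercept of around +1°C means the adjustments add about a degree of warming on average. The variable names are illustrative and the lists are not the actual data behind Figure 21.

```python
import numpy as np

def adjustment_bias(raw_trends, adjustments):
    """Fit adjustment = slope * raw_trend + intercept and return the fit
    plus the mean adjustment across all stations (degrees C per century)."""
    slope, intercept = np.polyfit(np.asarray(raw_trends, float),
                                  np.asarray(adjustments, float), 1)
    return slope, intercept, float(np.mean(adjustments))

# raw_trends  = [...]  # trend of each raw record, C/century, one entry per station
# adjustments = [...]  # trend change introduced by homogenization, C/century
# slope, intercept, mean_adj = adjustment_bias(raw_trends, adjustments)
```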

So homogeneity adjustment adds warming in Central Australia, Southern Africa and South America, and similar adjustments by the Australian Bureau of Meteorology and NIWA add warming over the rest of Australia and over New Zealand too. Pretty much the entire Southern Hemisphere is adjusted. How does one justify adding warming to raw records over the entire Southern Hemisphere? One doesn’t. The warming is clearly manufactured, spurious, non-existent.

Curiously, however, the raw surface air temperature records in the Northern Hemisphere are rarely subjected to warming-biased homogeneity adjustments. I will not speculate as to why. I will just observe that they show approximately twice as much warming as the raw records in the Southern Hemisphere and leave it at that.

Two questions remain. First, how much difference does the manufactured warming in the Southern Hemisphere make? As a practical matter, not very much. The impact on global land air temperature series like CRUTEM4 is muted by the fact that less than a third of the Earth’s land area is in the Southern Hemisphere, so the impact on the global land surface temperature record would be only in the 0.1X degrees C range even if the amount of warming over the Southern Hemisphere landmasses had been artificially doubled. And the impact on “surface temperature” series like HadCRUT4, which are about 70% based on SSTs, would be down in the 0.0X degrees C range. So removing the homogeneity adjustments doesn’t make global warming go away.
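The 0.1X/0.0X arithmetic is easy to reproduce. The sketch below uses the figures stated above – roughly a third of the land area in the Southern Hemisphere and a ~70% SST weighting for a blended series – while the 0.5°C of hypothetically added Southern Hemisphere warming is purely illustrative.

```python
sh_land_fraction = 1.0 / 3.0   # share of global land area in the Southern Hemisphere (approx.)
ocean_weight = 0.70            # approximate SST weighting in a blended series like HadCRUT4
added_sh_warming = 0.5         # hypothetical spurious warming over SH land, degrees C

land_only_impact = added_sh_warming * sh_land_fraction    # ~0.17 C on a global land series
blended_impact = land_only_impact * (1.0 - ocean_weight)  # ~0.05 C on a land+ocean series
print(round(land_only_impact, 2), round(blended_impact, 2))
```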

As to the second and potentially more troublesome question of why such obviously flawed adjustments are being applied, I will leave that up to the judgment of the reader.


38 Responses to The Horrors of Homogenization

  1. Craig Crosby says:

    At what point in time were the adjustments made? That could seem to make a statement as to intention. That is to say, carelessly applied differs from intentional and timing would be important in discerning fraudulent activity.

    In any event interesting data. Thank you.

    • I don’t know when the last adjustments were made but the Alice Springs adjustments haven’t changed much in over two years. The first graph is the current one from the post and the second is a version I put together in October 2012:

      You’re very welcome to the data, but I wasn’t trying to discern fraudulent activity. “Carelessly applied” probably fits better.

  2. Dave Rutledge says:

    Hi Roger,

    Thank you for a thoughtful post.

    I spent some time trying to understand the temperature records for my father’s home of Detroit Lakes, Minnesota. This station got some attention in the blogs. I found that the adjustments were quite large, up to 2C. They were different for different months and for max and min readings. So the trends for different months and max and min readings changed in completely different ways. In addition, an obvious jump was missed by the algorithm when an air conditioner was added near the thermometer in 1999. The July max jumped 3.5 degrees the next year, while the average increase for the ten nearest stations was 0.5 degrees. However, no adjustment was made, and this biased the trend upward.

    I spent thirty years in experimental electronics research, and I just could not imagine making the kinds of adjustments that were being made. Better to have a simple quality test for the raw data, and then just live with the results. This is how state maximum temperature records are done. And it turns out that 23 state records were set during the 30s, but only one has been set so far this decade.


  3. Dave Rutledge says:

    Hi Roger,

    Thank you for the update. We used to talk to my grandmother on the phone at Christmas. She would say, “It’s a beautiful clear day, 10 below (Fahrenheit).”


  4. Graeme No.3 says:

    I assume that you are aware that 2 blogs in Australia keep an eye on the BoM and its adjustments. They are and

    The latter leads to comment on adjustments at Giles Station – established as a purpose built meteorological station in 1956 – The site has never moved and has always been staffed by BoM personnel yet the BoM … added warming to Giles minimum data with a -0.48° adjustment prior to 1 Jan 1998.

    And an oldie but goodie …if you go to the bottom of the main page and find ‘what the stations say’ you can view .gif graphs of old records (John Daly died in 2004) from around the World before adjustment mania took hold.

    I wish I could share your view that the adjustments would make little difference to the final figures, but why are they bothering to adjust them if that was the case? I think it far more likely that the recent (200-2014) “warming” results from adjustments south of the equator.

    • Giles doesn’t escape the NCDC algorithm either:

      “I wish I could share your view that the adjustments would make little difference to the final figures, but why are they bothering to adjust them if that was the case? I think it far more likely that the recent (200-2014) ‘warming’ results from adjustments south of the equator.” This is an important question that I may do another post on shortly.

  5. edhoskins says:

    As support for your findings please have a look at the example of the Dale Enterprise station in Virginia. It is typical of these sorts of one-way adjustments being made to the land based temperature record. It is a single correctly sited and continuously well-maintained, rural US weather station. Its records are instructive. The un-adulterated record even shows modest cooling of 0.29°C per century, if all other adjustments made by “climate scientists” are ignored.

    However, NASA GISS publishes “value added” temperatures for this same location, as shown above. These show a massive downward adjustment of past temperatures before 1965 to give the impression of very substantial (+0.78ºC / century) warming at this station. Of particular interest is the apparent step-wise adjustment of the homogenised data, which would seem to be truly spurious. It is graphed along with other articles on the site below:

    Cumulatively the result has been to emphasise warming from the US rural data sets by some 0.47ºC / century. These results are always a one-way street to emphasise the apparent amount of warming. The following table clearly shows the scale and impact of the overall adjustments in the USA

  6. Joe Public says:

    A fascinating supplement to Paul Homewood’s & others’ exposure of temperature ‘records’ data manipulation.

    Clear explanations, too.

    Thanks, Roger.

  7. Euan Mearns says:

    Fascinating insight Roger. I really like the Alice Springs case study. A completely flat temperature record gets cooked into warming. It seems likely that the guys on the ground already applied adjustments for site changes etc and that GHCN have adjusted again. It’s really quite shocking that this is going on.

    You claim that all S hemisphere records are modified in this way and, like Graeme, I don’t understand why this evidently has so little impact on the global outcome. Perhaps another post to explain how the global record is constructed and weighted?

    • “You claim that all S hemisphere records are cooked in this way and, like Graeme, I don’t understand why this evidently has so little impact on the global outcome. Perhaps another post to explain how the global record is constructed and weighted?”

      Coming up 🙂

  8. Yvan Dutil says:

    Just because you don’t understand something does not make it wrong. Homogenization is carried out by intercomparison between stations; this is how jumps are detected. There are various algorithms for doing this and the global results are rather similar. It is still an active research field to find the most appropriate approach.

    For those interested you can gain more information there.

    By the way, Victor Venema is very critical of the current homogenization procedure. His blog is worth reading:

    • A C Osborn says:

      Your statement about inter comparison between stations does not make sense in the light of Roger’s careful analysis of Alice Springs.

      • Yvan Dutil says:

        Sorry, Roger did not plot the difference between stations. He just put them all together. If you want to understand what is going on with the algorithm, you must somehow replicate its inner logic.

        • A C Osborn says:

          It doesn’t matter what its inner logic is, it is either wrong or not working properly.

          • Yvan Dutil says:

            Sorry, but you have not demonstrated that.

            What you need to do is to compare each station pairwise to find the jumps.

            Also, a check should be done to see whether the time of observation has changed over time. This creates strong non-climatic drifts.

  9. Mikky says:

    Alice Springs is one of the case studies given by Blair Trewin in this ACORN-SAT document (section 10.6):

    The hypothesis is that temperature shifts were caused by rapid changes in vegetation, which raises a tricky point of principle: should one remove temperature changes caused by changes in vegetation? I’d say yes if the vegetation change is only very local to the station, no if the change applies to a large region.

  10. Mikky says:

    You can read what Blair Trewin of the Australian BoM has to say about Alice Springs in section 10.6 of this document:

    He says temperature steps were caused by sudden changes in vegetation. Maybe so, but should the resulting climate variations be removed, if they apply across a large region?

    • A C Osborn says:

      If this were actually true then why did the other stations also show similar spikes and movements over 1 million sq km?

    • Correct, AC. There was no artificial temperature shift in the Alice Springs record during the 1974 station move. The record shows a downward shift in 1974, but so do all the other stations for miles around. It’s yet another example of how homogeneity adjustments are unable to distinguish between spurious shifts and real climatic events.

      (Alice is in black)

  11. A C Osborn says:

    Roger, the Northern Hemisphere data shows exactly the same kind of adjustments; just take a look at the work of Steven Goddard for the USA.
    The other thing that is not picked up by using GISS is that they use NCDC data as their data source.
    If you look at NCDC data it uses “Estimated” data for many of the USA stations when actual real live Raw Data is available. The historic data is also chock full of Estimated data, sometimes where there is Raw data and sometimes where there was no data because there was no Station.
    Add to that they do not adjust anywhere near enough for UHI effects.
    The whole thing is a complete corruption which is justified by “homogenisation” and TOBS.
    The peer reviewed papers that they use to justify these changes do not show changes in the ball park as they supposedly only add about 0.7 degrees warming.
    Note that is 0.7 out of 0.8 degrees per Century.
    Perhaps you understand why I get so angry about the worl wating so much money based on this kind of “Science”, which is a damned insult to real Scientists.

  12. A C Osborn says:

    Sorry that last sentence should say
    Perhaps you understand why I get so angry about the world wasting so much money based on this kind of “Science”, which is a damned insult to real Scientists.

  13. A C Osborn says:

    And this is really what it is all about.
    “This is probably the most difficult task we have ever given ourselves, which is to intentionally transform the economic development model, for the first time in human history,” Christiana Figueres, who heads up the U.N.’s Framework Convention on Climate Change, told reporters.

    • JerryC says:

      Ah, I believe that intentionally transforming the economic development model has been attempted at least once previously.

      Glorious Five Year Plans, anyone?

  14. Retired Dave says:

    Roger, An excellent article thank you. Much work and greatly appreciated.

    Should fig 21. be headed Southern Africa or did you substitute the similar diagram for South America?

    • Retired Dave: Thanks for picking up on that. The plot is for South America and I have modified the text to eliminate further confusion.

      Here’s the plot for Africa south of the Equator:

  15. A C Osborn says:

    Paul Homewood has 2 more posts on Arctic Homogenisation horrors. The one that is the most telling is the one that shows the slightly older “Quality Controlled” data not having the same trends as the later “Quality Controlled” data.

  16. Javier says:

    Thanks for the article, Roger.

    I am troubled by all this. As I understand it, climate in the Northern Hemisphere is quite different from the Southern Hemisphere, which is almost all sea, appears to show less warming now (while Antarctica seems to be cooling), and also appears to have been less cold during glacial periods, having very little ice cover. Surely all this shows as a disparity between the GISP and Vostok ice cores.

    Probably the NH is warming naturally as we come out of the LIA, and oceanic currents redistribute the ENSO heat into the North Atlantic according to the multidecadal North Atlantic currents.

    To prove that the Earth is warming you probably need more warming in the SH, then. Otherwise you can only prove that the NH is warming, not the Earth.

    However I have a problem with Berkeley Earth temperatures. They show quite a similar warming to GISS and NOAA, yet they are independent, educationally funded and publish all the data and the rationale of their normalisations, as far as I know. Correct me if I am wrong.

    So either Berkeley Earth is tinkering with the data in a similar fashion to GISS and NOAA, which is hard to believe, or the tinkering with the data of GISS and NOAA does not produce too much of an impact. That is the nature of my problem.

    • Hi Javier:

      I mentioned in the post that the “tinkering” doesn’t in fact produce much of an impact (“how much difference does the manufactured warming in the Southern Hemisphere make? As a practical matter, not very much”). So you can set your mind at rest on that one.

      I haven’t looked very much at BEST but may do so in the post I’m currently putting together. But I don’t think the source of funding has much to do with the results.

      • Yvan Dutil says:

        If you care only about the global temperature, homogenization doesn’t make much of a difference. This has been demonstrated by various amateurs who simply averaged raw data. However, homogenization is important if you want local trends; this is why it is done.

      • Javier says:

        I am still confused, Roger. Does the global temperature dataset show a 20th century warming or not?

        In this blog the author (Eugene Zeien) uses only the GHCN raw data from the 613 sectors with continuous data between 1900 and 2009, and the data indicates no warming at all. Just a periodic oscillation.

        Yet the entire world, including most people that are skeptical about the warming being of human origin, is convinced that the NH has been warming since at least 1850. In fact glaciers have been retreating for about 175 years and sea level has been rising modestly for that much time.

        On what data are we basing this general belief? Is the global temperature dataset not supporting it unless the warming is artificially introduced in the normalisation? Being able to trust the data is essential to reach the right conclusions.

        • Javier: One of the few aspects of global warming I’m not skeptical about is that the Earth has warmed. Here’s a global surface air temperature series I put together from scratch a few years ago using ~800 unadjusted temperature records that I selected one-by-one from the 8000+ record GHCN v2 data base. The funny thing was that I put it together in an attempt to prove there was no such thing as global warming.

          • Javier says:

            Thank you for the information, Roger. It surely was a lot of work. I trust then that the raw data of the global temperature dataset does show warming and therefore agrees with all the other evidence that the Northern Hemisphere has been warming since at least 1850.

            I think that if the raw data shows the warming that everybody agrees on, about 0.8°C, we should be using raw data and living with the noise and the error bars.

    • A C Osborn says:

      Javier, BEST “tinkering” is even worse than GISS & NCDC and they openly admit it.
      Their Final Product is formulated based on Model Predictions of what the Temperatures “should look like”.
      They combine many stations and also Splice & Dice the data.

      • Javier says:

        If true that is awful, A C Osborn. As I understand it, the only reason to establish Berkeley Earth (BEST) was to have an independent source of a global temperature dataset, open to scrutiny and therefore able to be trusted.

  17. Pingback: How Hemispheric Homogenization Hikes Global Warming | Energy Matters
