San José State University
The validity of climate models must be established by their use in explaining past climate data, the so-called backcasting of the model. However, this backcasting must be honest; i.e., the independent variables must be independently known and not surmised from the past climate data. For example, global temperatures, after rising for decades, took a downturn from 1940 to about 1955. Climatologists speculated that the downturn was due to increased levels of sulfate aerosols. If measurements of average global sulfate levels were available, and higher levels did in fact coincide with the downturn in global temperatures, then that would be a validation of the model. However, if the investigators had no independent measurements of sulfate levels, or of anything correlated with sulfate levels, with which to test the speculation, then that is not a validation of the model. The term used for this process is tweaking. Worse yet than tweaking would be if the investigators asked what the sulfate levels would have had to be to produce that downturn, found those values by trial and error, and then presented that information as though it were an actual measurement of sulfate levels. That would be simple dishonesty.
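The circularity of tweaking can be made concrete with a small sketch. Everything below is invented for illustration: the toy model form, the sensitivity, and all the numbers are assumptions, not climate data. The point is that when an unmeasured forcing is chosen by trial and error to reproduce the record, the resulting fit is guaranteed and therefore tests nothing.

```python
# Hypothetical sketch of "tweaking": choosing an unmeasured forcing level
# by trial and error so a toy model reproduces an observed downturn.
# The model form and every number here are invented for illustration.

def toy_model_temp(baseline, sulfate_level, sensitivity=0.5):
    """Toy model: sulfate aerosols cool the surface linearly."""
    return baseline - sensitivity * sulfate_level

observed_anomaly = -0.1    # stand-in for the 1940-1955 downturn (illustrative)
baseline_anomaly = 0.2     # what the toy model gives with no sulfates

# Trial-and-error search for the sulfate level that reproduces the data.
best_level, best_err = None, float("inf")
for level in [x / 100 for x in range(0, 201)]:   # try levels 0.00 .. 2.00
    err = abs(toy_model_temp(baseline_anomaly, level) - observed_anomaly)
    if err < best_err:
        best_level, best_err = level, err

# The fit succeeds by construction, because the free parameter was chosen
# to match the very data the model is supposed to explain -- so the
# agreement is not evidence that the model is valid.
print(best_level)  # 0.6, since 0.2 - 0.5 * 0.6 = -0.1
```

A search like this will always find some level that matches the record; only an independent measurement of the forcing could turn the match into a validation.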
Furthermore, the backcasting that is relevant for validating models used to project future climate is backcasting based upon no more information than is available for the future projections. This means that volcanic eruptions and the like should not be used, because they are not available for the future projections.
Here is a visual depiction of the issue. Below is a representation of a typical forecast.
If the forecasting model is run backwards then the backcasts would be as shown below.
However if the backcasts involve the incorporation of past data the results could look like this.
Thus it is very easy to distinguish honest backcasts from the dishonest ones.
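The distinction can also be expressed numerically. The sketch below uses purely synthetic series (nothing here is measured): an "honest" backcast whose uncertainty compounds with each step away from the present, and a "tweaked" series that is just the record with a little jitter. The tweaked series fits the record far more closely than honest error accumulation would allow, which is exactly the signature to look for.

```python
# Hedged illustration of the visual point above: an honest backcast's error
# widens as it reaches further into the past, while a "backcast" tuned to
# the data hugs it everywhere. All series are synthetic; nothing is measured.
import random

random.seed(0)
years = list(range(1900, 2001))
# Synthetic "historical record": a linear trend plus noise.
history = [0.005 * (y - 1900) + random.gauss(0, 0.05) for y in years]

# Honest backcast: run a simple trend model backwards from the endpoint,
# with an error term that accumulates at each step away from the present.
honest = []
err = 0.0
for i in range(len(years) - 1, -1, -1):
    honest.append(history[-1] - 0.005 * (len(years) - 1 - i) + err)
    err += random.gauss(0, 0.02)     # uncertainty compounds step by step
honest.reverse()

# "Tweaked" backcast: just the record itself with a little jitter added.
tweaked = [h + random.gauss(0, 0.01) for h in history]

def rmse(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

# The tweaked series fits the record far better than honest error
# accumulation would allow -- which is what should arouse suspicion.
print(rmse(honest, history), rmse(tweaked, history))
```

In this toy setup the tweaked series' error stays near its jitter level over the whole century, while the honest backcast's error grows with distance from the present.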
Below is shown what is purported to be a backcast of historical data using one of the climate models of the National Center for Atmospheric Research (NCAR).
According to the accompanying caption, the red line is the values computed from the model and the blue line is the historical data.
The text accompanying the graph is:
This simulation of 20th century climate incorporates variability from solar output, volcanoes, sulfates, and greenhouse gases. The modeled global average in surface temperature (red) captures most of the major rises and falls in the observed temperature (blue). Earth's surface warmed more than 0.6°C (1.0°F) in the 20th century. Much of the warming occurred from 1910 to 1940 and after 1970. Other periods showed little or no temperature increase. In an effort to explain this uneven warming sequence, Caspar Ammann (NCAR/University of Massachusetts) and Jeffrey Kiehl and Bette Otto-Bliesner (NCAR), together with Charles Zender (University of California, Irvine), have examined the last century's climate using the CSM. These are among the first global simulations to include each of four major elements: long-term solar changes, volcanic eruptions, anthropogenic sulfate aerosol, and observed greenhouse-gas concentrations. With these factors in play, the model successfully reproduces most of the peaks and valleys in the last century's global temperature record. According to the model, a rise in incoming solar energy since 1970 was offset by the effects of several large volcanic eruptions. This leaves human-produced greenhouse gases as the most likely cause of warming over the past 30 years.
The result looks good, perhaps too good to be true. There is information on solar output from the series on sunspot numbers. However, where did the data on global levels of greenhouse gases and sulfates come from? The CO2 levels from the Mauna Loa station only go back to 1958. The years in which there were major volcanic eruptions are known, but did the modelers have data on the levels of volcanic dust year by year? Or did they choose the levels of volcanic dust that made the model output most closely fit the historical data? The modelers might have had some information for the more recent volcanic eruptions, but what about the earlier ones? The strong indication that the backcasts were tweaked to fit the data is that the correspondence between the supposed results of the model and the historical data is just as good in the early years as in the later years, when more accurate information was available. It is also notable that the supposed backcast starts from a different point in the last year. This probably means that the line for the computed values was shifted upward to make the fit look even better.
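The suspicion about uniform fit quality can be turned into a rough check: compare the model's error in the early decades, when the forcing inputs were poorly known, with its error in the later decades, when they were well measured. The series below are toy stand-ins, not the NCAR results; the threshold and window split are arbitrary assumptions.

```python
# A rough way to formalize the suspicion above: compare the fit in the
# early decades (when forcing data were poor) with the fit in the late
# decades (when data were good). Toy series; illustrative only.

def window_rmse(model, observed, start, end):
    """Root-mean-square error of model vs. observed over a window of years."""
    pairs = list(zip(model, observed))[start:end]
    return (sum((m - o) ** 2 for m, o in pairs) / len(pairs)) ** 0.5

# Toy series standing in for a century of annual temperature anomalies.
observed = [0.01 * t for t in range(100)]
suspect_model = [o + 0.02 for o in observed]      # uniformly tight fit

early = window_rmse(suspect_model, observed, 0, 50)
late = window_rmse(suspect_model, observed, 50, 100)

# An honest backcast should fit worse where its inputs were less certain;
# near-equal early and late errors are a red flag for tuning.
print(early, late)   # nearly identical: the fit is suspiciously uniform
```

On a real record one would also have to allow for the data themselves being noisier in the early years, which makes a uniformly tight fit there even harder to achieve honestly.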
Backcasting is essential for the validation of climate models, but it must be honest backcasting! No information for the past should be included that is not also available for the future projections.
HOME PAGE OF Thayer Watkins