GREENHOUSE BULLETIN NO 123, APRIL 1999
Validation Of Computer-based Climate Models
by Dr Vincent Gray
Abstract
Validation of climate models based on the greenhouse effect requires that they be compared with a past temperature sequence assumed to be free of natural variability. The annual global surface temperature anomaly series before 1940 does not comply with this requirement, since greenhouse gas emissions were unimportant over that period and the warming then must have been largely natural. If the data since 1940 are considered to be relatively free from natural variability, then the "best fit" computer-based climate model is one with a climate sensitivity (temperature rise at equilibrium for a doubling of carbon dioxide) of 0.8°C, with an uncertainty due to natural variability of ±0.6°C.
This value of climate sensitivity is capable of being simulated by climate models based on the greenhouse effect, provided a suitable set of climate parameters is chosen. However, the uncertainties associated with the model predictions are very great, and must be added to those due to natural variability to obtain a true measure of model reliability. When this is done, the models can be considered compatible, within modest statistical limits, with almost any conceivable past or future climate sequence, including a zero or negative temperature change. Their value as predictors of global warming has therefore not so far been established, and their value for predicting future climate change is dubious.
On the other hand, global temperature changes for the past century continue to be capable of being explained by natural variability of the climate.
1. Validation
Scientific investigation attempts to provide a theoretical explanation of natural phenomena which is, if possible, capable of quantitative representation by mathematical means. Such a mathematical representation is called a mathematical model. Before such a model can be accepted as a possible representation of reality, it must be compared with the phenomena in question and its success, together with its level of accuracy, determined.
Computer-based mathematical models of the earth's climate have not been fully subjected to this procedure, which is called validation. This paper attempts to remedy this deficiency.
It should be noted that a correlation between a model and observations, even to a high level of accuracy, does not prove a cause and effect relationship. It merely provides an indication that the model may be correct, particularly if the level of accuracy is high and other models are markedly less successful.
2. Validation in the First Report
The first report of the Intergovernmental Panel on Climate Change (Houghton et al. 1990) contained, as Chapter 4, "Validation of Climate Models" (Gates et al. 1990), which might have been expected to carry out this essential exercise.
Comparisons were made between the results of climate models and a number of climate observations, both global and regional. Periods covered were usually less than ten years, so although there were limited successes, no such model could be expected to make acceptable predictions beyond this short period. No comparisons were made with the past global temperature record, an essential for models which claim to represent global temperature change.
A more serious deficiency was the assumption that the enhanced greenhouse effect was the only influence on the climate. It is possible that this could be approximately true for some of the short periods considered, but generally it is quite wrong to assume that without the enhanced greenhouse effect the climate would be constant and unvarying. There is ample evidence of considerable variation in most aspects of the earth's climate before human influence was possible, and this "natural" variability must surely be continuing. A climate model based solely on the enhanced greenhouse effect could thus not hope to predict climate properties, or expect to be successfully validated, unless there were also an understanding of what those properties would be in its absence. Although ignored in this chapter, comparison with the past temperature record and a consideration of natural variability were taken up in the later Chapter 8.
Chapter 8, "Detection of the Greenhouse Effect in the Observations", by T.M.L. Wigley and T.P. Barnett, carried out the essential steps needed for validation of climate models: a comparison with the past temperature record, and an estimate of the influence of natural variability. However, the authors failed to use this information to proceed to a proper validation of the models.
They stated (page 245):
"To claim detection……. we must not only identify a climatic change, but we must attribute at least part of such a change to the enhanced greenhouse effect".
The terms "detection", "identify" and "attribute" are not helpful. As stated above, a "validated" model is one that shows an acceptable level of correspondence with reality. But it can never detect let alone prove a cause and effect relationship. If attribute is intended as a synonym to prove, then attribution is just as impossible as proof.
The authors admit, with regard to the temperature changes over the past century (also page 245):
"We have strong evidence that changes of a similar magnitude and rate have occurred prior to this century.. Since these changes were certainly not due to the enhanced greenhouse effect, it might be argued that the most recent changes merely represent a natural long-term fluctuation"
If there is this plausible, alternative explanation, then there is no need for the greenhouse hypothesis.
Also on page 245:
"Detection requires that the observed signal is large relative to the noise"
The word "Detection" should be replaced by "Validation"
A "signal" is defined as "the predicted time-dependent climate response to the enhanced greenhouse effect" (page 245), but the authors obviously also intend it to mean an observed response which agrees with the predictions.
"Noise" is defined in a footnote, as including " variations due to other anthropogenic effects (which seems to refer mainly to sulphate aerosols}, and natural variability. "Natural Variability" refers to all natural climatic variations that are unrelated to Man's activities, including solar and volcanic effects, and internally-generated variability. Uncertainties in the observations also constitute a form of noise"
This definition is confusing, since it includes "other anthropogenic effects", which should properly be included in any computer model of the greenhouse effect, together with the anthropogenic effects already dealt with.
On the same page (245):
"Is the warming consistent with these (carbon dioxide) increases? To answer this question we must model the effects of these concentration changes on global-mean temperature and compare the results with the observations".
In other words, validation. But the possible effects of natural variability also have to be considered.
And then:
"Because of computing constraints and because of the relative inflexibility of coupled ocean-atmosphere GCMs (General Circulation Models) we cannot use such models for this purpose".
So, most of the computer-based climate models simply cannot be validated! Then:
"Instead, we must use an upwelling-diffusion climate model to account for the damping or lag effect of the oceans The response of such a model is determined mainly by the climate sensitivity (DT2x), the magnitude of ocean mixing (specified by a diffusion coefficient K), and the ratio of the temperature change in the regions of sinking water relative to the global-mean change (p). Uncertainties in these parameters can be accounted for by using a range of values."
(The climate sensitivity, ΔT2x, is the equilibrium temperature rise for a doubling of the carbon dioxide concentration.)
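For readers unfamiliar with this class of model, the following is a minimal Python sketch of an upwelling-diffusion-style energy balance: a mixed layer exchanging heat with a diffusive deep-ocean column, with the response controlled by the climate sensitivity ΔT2x and the diffusivity K. The sinking-water ratio parameter is omitted for brevity, and the numerical values here (the 3.7 W/m² forcing for doubled CO2, layer thicknesses, time step) are conventional illustrative assumptions, not figures from the report.

    import numpy as np

    SECONDS_PER_YEAR = 3.156e7
    RHO_CW = 4.1e6   # volumetric heat capacity of sea water, J m-3 K-1
    F2X = 3.7        # assumed radiative forcing for doubled CO2, W m-2

    def run_model(forcing_wm2, dt2x=1.5, k_cm2_s=1.0, dz=100.0, nlayers=40,
                  mixed_layer=70.0, steps_per_year=10):
        """Mixed layer over a diffusive ocean column; returns annual surface
        temperature anomalies (K) for a series of annual forcings (W m-2)."""
        lam = F2X / dt2x                 # climate feedback parameter, W m-2 K-1
        k = k_cm2_s * 1e-4               # diffusivity converted to m2 s-1
        dt = SECONDS_PER_YEAR / steps_per_year
        T = np.zeros(nlayers)            # layer temperature anomalies
        surface = []
        for f in forcing_wm2:
            for _ in range(steps_per_year):
                # energy balance of the mixed layer: forcing, radiative
                # damping, and diffusive loss to the layer below
                loss = k * RHO_CW * (T[0] - T[1]) / dz
                Tn = T.copy()
                Tn[0] += dt * (f - lam * T[0] - loss) / (RHO_CW * mixed_layer)
                # explicit diffusion through the interior, insulated bottom
                Tn[1:-1] += dt * k * (T[:-2] - 2 * T[1:-1] + T[2:]) / dz**2
                Tn[-1] += dt * k * (T[-2] - T[-1]) / dz**2
                T = Tn
            surface.append(T[0])
        return np.array(surface)

Driving such a sketch with a historical greenhouse-forcing series, and varying dt2x, produces response curves of the kind that were compared with observations in the report's Figure 8.1.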
The authors then carried out the essential exercises required for a validation of their model; a comparison with the past temperature record, and an estimate of natural variability.
They showed (Figure 8.1) that the temperature rise since 1860 (though not its details) could be simulated by their model by assuming a climate sensitivity of 1.5°C.
But this fails to consider the possible influence of natural variability.
The previous chapter, Chapter 7, "Observed Climate Variations and Change" (Folland et al. 1990), stated:
"some of the global warming since 1850 could be a recovery from the Little Ice Age rather than a direct result of human activities" ( page 203 )
"The rather rapid changes in global temperature seen around 1920-1940 are very likely to have had a mainly natural origin".( page 233):
Since greenhouse gas emissions were very small before 1940, it is probable that much of the warming before then was due to natural causes.
The period after 1940 was possibly also subject to natural warming, rather than cooling, since volcanic activity was unusually low and solar activity unusually high during this period.
The period before 1940 should be ignored for validation purposes, since at least part of the temperature rise is attributable to natural factors. If we assume for the moment that the period after 1940 is without net natural variability, then validation can proceed by comparing the models with the temperature record since 1940. When this is done, the "best fit" occurs with a climate sensitivity of 0.8°C (Figure 1).
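The "best fit" step can be made concrete with a small grid search. In the sketch below, simulate, years and observed are placeholders for a forward model (such as the upwelling-diffusion sketch above) and for the observational record, neither of which is supplied here.

    import numpy as np

    def best_fit_sensitivity(simulate, years, observed, candidates):
        """Return the candidate climate sensitivity whose simulated anomalies
        best match (lowest RMS error) the observed record from 1940 onwards."""
        mask = years >= 1940
        rms = []
        for dt2x in candidates:
            sim = simulate(dt2x)
            # compare anomalies about their post-1940 means, so the choice
            # of baseline period does not bias the fit
            resid = (sim[mask] - sim[mask].mean()) \
                    - (observed[mask] - observed[mask].mean())
            rms.append(np.sqrt(np.mean(resid ** 2)))
        best = int(np.argmin(rms))
        return candidates[best], rms[best]

    # e.g. best_fit_sensitivity(lambda s: run_model(forcing, dt2x=s),
    #                           years, observed, np.arange(0.2, 4.05, 0.1))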
3. Assessment of Natural Variability
Wigley and Barnett then attempted to consider the possible contribution of natural variability to the measured temperature record (page 247):
".Internal variability arising from the modulation of random atmospheric disturbances by the ocean may produce warming or cooling trends of up to 0.3°C per century, while ocean circulation changes and the effects of other external forcing factors such as volcanic eruptions and solar irradiance changes and/or other anthropogenic factors could produce changes of a similar magnitude. On time scales of the order of a decade, some of these (volcanic eruptions, sulphate aerosol derived cloud albedo changes) clearly have a negative forcing effect, while others have an uncertain sign. If the net century time scale-effect of all these non-greenhouse factors were close to zero, the climate sensitivity implied by Figure 8.1 would be in the range 1°C to 2°C. If their combined effect were a warming, then the implied sensitivity would be less than 1°C, while if it were a cooling, the implied sensitivity could be larger than 4°C".
This discussion is somewhat confused, since "other anthropogenic factors" and the "anthropogenic" sulphate aerosol effects, which are hardly "non-greenhouse factors", are lumped in with the components of "natural variability". If we separate out the components of "natural variability" we get the following:
* Modulation of random atmospheric disturbances by the ocean ± 0.3°C per century
* Ocean circulation changes ± 0.3°C per century
* Volcanic eruptions ± 0.3°C per century
* Solar irradiance changes ± 0.3°C per century
If we take the square root of the sum of squares, this gives a total "natural variability" of ±0.6°C per century, which is compatible with the observed temperature record. However, the authors do not include amongst their components of natural variability "recovery from the Little Ice Age", which is considered to have influenced the temperature increase of about 0.4°C between 1860 and 1940. If one half of that increase, +0.2°C, is attributed to "recovery from the Little Ice Age", then the uncertainty range due to natural variability during the past century becomes -0.4°C to +0.8°C, making it even more plausible that the observed increase of +0.4°C had a natural cause.
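The root-sum-of-squares combination used above, which assumes the four components are independent, can be verified in a line or two of Python:

    import math

    # four independent natural-variability components, each +/- 0.3 degC/century
    components = [0.3, 0.3, 0.3, 0.3]
    total = math.sqrt(sum(c ** 2 for c in components))
    print(total)   # 0.6 degC per century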
The effects of "natural variability" on climate sensitivity involves an assessment of what would happen to these effects over the period required to double the carbon dioxide in the atmosphere. In the absence of evidence to the contrary, let us assume for the moment that the figure of ±0.6°C would continue to apply over this period.
If the combined effects of the various components of natural variability produced a warming, then the implied climate sensitivity (without natural variability) for the authors' "best fit" of 1.5°C should be 1.5°C minus 0.6°C, or 0.9°C; in their words, "less than 1°C". If the combined effect produced a cooling, then the "implied sensitivity" would be 1.5°C plus 0.6°C, or 2.1°C, not "larger than 4°C".
If the temperature sequence since 1940 and the effects of natural variability are both considered, the "best fit" value of the climate sensitivity is 0.8°C ± 0.6°C.
4. Results of Computer Climate Models
The IPCC (Houghton et al. 1990) summarised the results of computer climate models by choosing a "Best Estimate" of 2.5°C for the climate sensitivity, with 4.5°C as the "High Estimate" and 1.5°C as the "Low Estimate".
They have continued to express this opinion in the more recent report (Houghton et al. 1996), but have, in effect, modified it, since they have identified effects of sulphate aerosols in addition to the factors considered in the original models, but have not incorporated them in the models.
Table 6.1 of Chapter 6 in Houghton et al. 1996 (Kattenberg et al., "Climate Models - Projections of Future Climate") gives a range of -0.8°C to -1.6°C as the calculated temperature reduction during the last century due to sulphate aerosols. Since this represented 29% of the warming expected for a doubling of carbon dioxide, the corresponding adjustment to the climate sensitivity (100% warming), if the effects of aerosols increase at the same rate, is -2.8°C to -5.5°C. The adjusted IPCC climate sensitivity range now becomes -4.0°C to +1.7°C, with the "Best Estimate" in the range -3.0°C to -0.3°C. This range covers the established "best fit" value of 0.8°C ± 0.6°C, but, this time, at the upper end of the calculated range, and it is weighted towards negative predicted values of climate sensitivity.
The IPCC, in Chapter 6 of Climate Change 1995 (Kattenberg et al.), make two alternative assumptions about the future behaviour of sulphate aerosols for their projections to 2100: one assumes a moderate continued increase in aerosols, the other that aerosol values will remain constant at 1990 levels. If it is assumed that aerosols remain constant up to the doubling of carbon dioxide, then the modifications to the range of climate sensitivity are -0.8°C to -1.6°C, giving a revised IPCC range of -0.1°C to +3.7°C, with a "Best Estimate" of +0.9°C to +1.7°C. This time the "Best Estimate", at its lower end, almost equals the "best fit" from the temperature data. The IPCC avoids admitting that the models can predict a zero temperature change, or a temperature drop, by selecting for the future predictions a figure for the sulphate aerosol effect which is above the extreme high figure.
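The arithmetic behind these adjusted ranges can be checked as follows; the figures are those quoted above, and the scaling step simply divides the century-scale aerosol cooling by 0.29 when it is assumed to grow in step with the greenhouse warming.

    # aerosol cooling over the last century, degC (Table 6.1 as quoted above)
    aerosol_cooling = (-0.8, -1.6)
    scaled = tuple(round(x / 0.29, 1) for x in aerosol_cooling)   # (-2.8, -5.5)

    ipcc_low, ipcc_best, ipcc_high = 1.5, 2.5, 4.5   # IPCC sensitivity range, degC

    # Case 1: aerosol effect grows at the same rate as the greenhouse effect
    print(round(ipcc_low + scaled[1], 1), round(ipcc_high + scaled[0], 1))   # -4.0 1.7
    print(round(ipcc_best + scaled[1], 1), round(ipcc_best + scaled[0], 1))  # -3.0 -0.3

    # Case 2: aerosols held constant at 1990 levels
    print(round(ipcc_low - 1.6, 1), round(ipcc_high - 0.8, 1))    # -0.1 3.7
    print(round(ipcc_best - 1.6, 1), round(ipcc_best - 0.8, 1))   # 0.9 1.7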
The range of computer predictions of climate sensitivity, when modified to take account of the influence of sulphate aerosols, can be made consistent with the measured temperature rise since 1940 if parameters are selected to give this result. However, the models are also compatible with a wide range of alternatives, including a temperature fall. Such a wide range of possible model parameters which could fit the observed temperature rise, and the wide range of other alternatives implicit in the uncertainty of the model predictions, including zero and negative temperature change, cast considerable doubt on the value of the models for predicting future climate.
The range of climate sensitivity as calculated by the models does not represent their true level of uncertainty. The authors of each model would have selected what was, in their opinion, the most appropriate value for each of the parameters making up the model. The range of model results represents only the range of "Best Estimates" of model parameters, but it does not represent the full range of uncertainty which should be attached to the models.
As stated by Wigley and Barnett (1990, page 247):
"The range of sensitivity becomes even larger if uncertainties in the observed data are accounted for".
The authors did not suggest figures for this additional uncertainty range, but these uncertainties are evidently quite large.
The IPCC Second Assessment Report (Houghton et al. 1996) devotes a very useful chapter (Dickinson et al. 1996) to a discussion of "Climate Processes", but no actual figures are given for the statistical variability that should be attached to the figures characterising each of these processes. Full consideration of many of these processes is constrained by the practice of confining the less well-characterised components of the climate to the category of "feedback". A feedback is an effect which is directly dependent on another effect, and thus can be described by a mathematical relationship with that other effect. Such a mathematical relationship is called a parametrization.
The following contributors to climate change are categorised as feedbacks. Some of these processes enhance the greenhouse forcing (positive feedback), and some reduce it (negative feedback). They are not treated directly, but are parametrized as functions of the carbon dioxide increase or its effects (such as radiative forcing), and of the other minor greenhouse gases. In most cases there is no reliable record of their past variability, or of their compliance with the assumed parametrization.
· Water vapour amounts and distribution
· Cloud amounts (High clouds, Middle clouds,
Low clouds, Arctic clouds)
· Cloud water content
· Cloud particle size
· Wind-driven and thermohaline ocean circulation
· Ocean convection
· Interior ocean mixing
· Sea ice
· El Niño Southern Oscillation
· Land surface processes
· Soil-vegetation-atmosphere transfer schemes
The major example is the treatment of the main greenhouse gas, water vapour; but the quantitative description of almost all of these parameters is subject to high levels of uncertainty, which is not considered when the models that incorporate them are used for predictive purposes.
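To illustrate why these parametrized feedbacks dominate the result, the standard textbook feedback algebra (a generic sketch, not any particular model's code) reduces to a one-line gain calculation. The no-feedback response of about 3.2 W m-2 K-1 and the example gain values below are conventional illustrative numbers, not figures from the reports.

    def equilibrium_warming(forcing_wm2, feedback_gains, lam0=3.2):
        """Equilibrium warming for a given forcing, amplified by feedback
        gains f_i: dT = (forcing / lam0) / (1 - sum(f_i)), where lam0 is
        the no-feedback (Planck) response in W m-2 K-1."""
        return (forcing_wm2 / lam0) / (1.0 - sum(feedback_gains))

    # doubled CO2 with assumed water-vapour and cloud gains of 0.4 and 0.1:
    print(round(equilibrium_warming(3.7, [0.4, 0.1]), 1))   # ~2.3 degC
    # the same forcing with no feedbacks at all:
    print(round(equilibrium_warming(3.7, []), 1))           # ~1.2 degC

Small shifts in the assumed gains move the answer across the whole range of model results, which is precisely the point being made here about unquantified feedback uncertainty.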
As shown above, the uncertainties associated with the effects of sulphate aerosols alone are enough to greatly widen the uncertainty of the models when based only on the range of model results. Although actual figures for the uncertainties associated with the above feedback parameters are not available, it is evident that even conservative estimates of these uncertainties would give an overall very high model uncertainty when all the uncertainties are amalgamated. This would allow most computer climate models to be compatible with global temperature change since 1940, but also to be compatible, within reasonable probability levels, with almost any conceivable past or future climate sequence, including a global cooling.
On the other hand, as stated on page 253 of Climate Change (1990):
" the magnitude of natural variability is such that all the warming of the past century could be attributed to this "cause""
Applying Occam's razor ("it is vain to do with more what can be done with fewer") to this sentence would surely select natural variability as the cause of the warming over the past century, rather than the highly uncertain greenhouse effect.
5. Conclusions
A validation exercise (Figure 1), comparing climate models of the enhanced greenhouse effect with the mean global temperature anomalies since 1940, shows that the "best fit" equilibrium temperature change for doubling atmospheric carbon dioxide (climate sensitivity) is +0.8°C, and that the uncertainty in this figure due to natural variability is ±0.6°C. This result is capable of being represented by the computer climate models, provided a suitable range of climate parameters is chosen; but the very large uncertainties associated with the model parameters and predictions, which are also compatible with a zero or negative temperature change, mean that the greenhouse theory of global warming cannot at present be confirmed by the models. They are so inaccurate that they are compatible with almost any past or future climate change that can be envisaged. They are therefore, at present, of little value for climate prediction.
Global temperature changes over the past century continue to be compatible with natural variability, which is still the most likely explanation for them.
Figure 1 - Annual global temperature anomalies 1859-1997, w.r.t. the 1961-1990 average, showing the "Best Fit" Climate Sensitivity of 0.8°C for a computer climate model to the data since 1940.
Literature Cited
Dickinson, R.E., V. Meleshko, D. Randall, E. Sarachik, P. Silva-Dias and A. Slingo. 1996. "Climate Processes". Chapter 4, pp 193-227 in Climate Change 1995: The Science of Climate Change (J.T. Houghton et al., Eds). Cambridge University Press.
Folland, C.K., T.R. Karl and K.Ya. Vinnikov. 1990. "Observed Climate Variations and Change". Chapter 7, pp 195-238 in Climate Change: The IPCC Scientific Assessment (J.T. Houghton, G.J. Jenkins and J.J. Ephraums, Eds). Cambridge University Press.
Gates, W.L., P.R. Rowntree and Q-C. Zeng. 1990. "Validation of Climate Models". Chapter 4, pp 93-130 in Climate Change: The IPCC Scientific Assessment (J.T. Houghton, G.J. Jenkins and J.J. Ephraums, Eds). Cambridge University Press.
Houghton, J.T., G.J. Jenkins and J.J. Ephraums (Eds). 1990. Climate Change: The IPCC Scientific Assessment. Cambridge University Press.
Houghton, J.T., L.G. Meira Filho, B.A. Callander, N. Harris, A. Kattenberg and K. Maskell (Eds). 1996. Climate Change 1995: The Science of Climate Change (The Second Assessment Report). Cambridge University Press.
Kattenberg, A., et al. 1996. "Climate Models - Projections of Future Climate". Chapter 6, pp 285-358 in Climate Change 1995: The Science of Climate Change (J.T. Houghton et al., Eds). Cambridge University Press.
Wigley, T.M.L. and T.P. Barnett. 1990. "Detection of the Greenhouse Effect in the Observations". Chapter 8, pp 239-255 in Climate Change: The IPCC Scientific Assessment (J.T. Houghton, G.J. Jenkins and J.J. Ephraums, Eds). Cambridge University Press.
Vincent R. Gray, M.A., Ph.D., F.N.Z.I.C.
Climate Consultant
75 Silverstream Road
Crofton Downs
Wellington 6004,
New Zealand
Phone (FAX) (064) (04) 4795939
Email VINCEGRAY@xtra.co.nz
April 28th 1999
Return to "Climate Change Guest Papers" page
Return to "Still Waiting For Greenhouse" main page