The Greenhouse Delusion - Chapter 6 by Dr Vincent Gray
Computer Climate Models
D’Arcy Thompson, in his “On Growth and Form”, remarked, “Numerical precision is the very soul of science”. The main point of scientific laws is that one can calculate from them. When
there is a project to send a man to the moon, the trajectory of the
rocket has to be calculated using a complex set of
mathematical equations with substituted parameters. The equations
are based on scientific laws which have been established and tested, to
known standards of accuracy. The parameters have been measured to known
standards of accuracy. In this way it is possible to predict exactly
where the lunar module will land, with a known measure of its accuracy. The
complex series of mathematical equations is a computer-based
mathematical model. Before it can be used there must be
statistically based studies on the accuracy of each part of the
system, and comprehensive tests to prove that its predictions actually
work within known limits.
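By way of illustration, a computer-based mathematical model can be as small as the following Python sketch, which integrates a drag-free ballistic trajectory from Newton’s laws and then shows how the effect of a known measurement error in one parameter can itself be calculated. The numbers are invented for the example, and the model is of course vastly simpler than anything used for a real lunar mission.

    # A minimal sketch (not the author's, and far simpler than any real guidance code)
    # of the kind of computer-based mathematical model described above: Newton's laws,
    # integrated numerically, with measured parameters substituted in. All numbers here
    # are illustrative assumptions, not mission data.
    import math

    G = 9.81          # m/s^2, gravity assumed constant (an approximation)
    DT = 0.01         # s, integration time step

    def impact_range(speed_m_s: float, angle_deg: float) -> float:
        """Integrate a drag-free ballistic trajectory and return the horizontal
        distance travelled when the projectile returns to launch height."""
        vx = speed_m_s * math.cos(math.radians(angle_deg))
        vy = speed_m_s * math.sin(math.radians(angle_deg))
        x, y = 0.0, 0.0
        while True:
            x += vx * DT
            vy -= G * DT
            y += vy * DT
            if y <= 0.0 and vy < 0.0:
                return x

    if __name__ == "__main__":
        nominal = impact_range(100.0, 45.0)
        # Because every equation and parameter has a known accuracy, the effect of a
        # measurement error can itself be calculated, e.g. a 1% error in launch speed:
        perturbed = impact_range(101.0, 45.0)
        print(f"nominal range: {nominal:7.1f} m")
        print(f"with +1% speed error: {perturbed:7.1f} m (shift {perturbed - nominal:+.1f} m)")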
Computer-based mathematical models have many applications. It is possible to simulate
an entire industrial process and use the model to predict the effect of
changes in the process. The
whole point is that a computer-based mathematical model of any process
or system is useless unless it has been validated.
Validation of such a model involves the testing of each equation and the study of each parameter, to discover its statistically based accuracy using numerically based probability distributions, standard deviations, correlation coefficients and confidence limits. The final stage is a thorough test of the model’s ability to predict the result of changes in the model parameters over the entire desired range.
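A minimal sketch of what such a statistically based validation looks like in practice, assuming nothing more than a set of model predictions and a matching set of independent observations (the numbers below are placeholders, not climate data):

    # A minimal sketch, under assumptions of my own, of the kind of statistically based
    # validation described above: model predictions are compared with independent
    # observations, and the agreement is summarised with a standard deviation, a
    # correlation coefficient and confidence limits. The arrays below are placeholders.
    import math

    def validate(predicted: list[float], observed: list[float]) -> dict[str, float]:
        n = len(predicted)
        errors = [p - o for p, o in zip(predicted, observed)]
        bias = sum(errors) / n
        sd = math.sqrt(sum((e - bias) ** 2 for e in errors) / (n - 1))
        rmse = math.sqrt(sum(e * e for e in errors) / n)
        # Pearson correlation between prediction and observation
        mp, mo = sum(predicted) / n, sum(observed) / n
        cov = sum((p - mp) * (o - mo) for p, o in zip(predicted, observed))
        corr = cov / math.sqrt(sum((p - mp) ** 2 for p in predicted) *
                               sum((o - mo) ** 2 for o in observed))
        # Approximate 95% confidence limits on the bias (normal approximation)
        half_width = 1.96 * sd / math.sqrt(n)
        return {"bias": bias, "rmse": rmse, "corr": corr,
                "bias_95_low": bias - half_width, "bias_95_high": bias + half_width}

    # Placeholder numbers purely to make the sketch runnable:
    print(validate([14.1, 14.3, 14.2, 14.6, 14.8],
                   [14.0, 14.4, 14.1, 14.5, 14.9]))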
No computer climate model has ever been validated. An early draft of Climate Change 95 had a chapter titled “Climate Models - Validation”. In response to my comment that no model has ever been validated, they changed the title to “Climate Models - Evaluation” and changed the word “validation” in the text to “evaluation” no less than fifty times. There is not even a procedure
in any IPCC publication describing what might need to be done in order
to validate a model. Without
a successful validation procedure, no model should be considered capable
of providing a plausible prediction of the future behaviour of the climate. The same point, phrased more politely so as to survive the hazards of peer review, is made in a recent paper by Soon et al. (1). Instead
of validation, and the traditional
use of mathematical statistics, the models are “evaluated” purely
from the opinion of those who have devised them. Such opinions are
partisan and biased. They are also nothing more than guesses. Attempts
have been made to attach spurious measures of precision to these
guesses. The following footnote appears on page 2 of the “Summary for
Policymakers” of Climate Change
01 (2): “In this Summary for Policymakers and in the Technical Summary, the following words have been used where appropriate to indicate judgmental estimates of confidence: virtually certain (greater than 99% chance that a result is true); very likely (90-99% chance); likely (66-90% chance); medium likelihood (33-66% chance); unlikely (10-33% chance); very unlikely (1-10% chance); exceptionally unlikely (less than 1% chance).”
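The whole of this “judgmental” scale can be written down as a few lines of data. The sketch below simply encodes the quoted probability bands, which underlines that the scale is a verbal labelling of subjective probabilities rather than the output of any statistical calculation:

    # The bands are taken from the quotation above; nothing else is assumed.
    IPCC_CONFIDENCE_SCALE = [
        ("virtually certain",       0.99, 1.00),
        ("very likely",             0.90, 0.99),
        ("likely",                  0.66, 0.90),
        ("medium likelihood",       0.33, 0.66),
        ("unlikely",                0.10, 0.33),
        ("very unlikely",           0.01, 0.10),
        ("exceptionally unlikely",  0.00, 0.01),
    ]

    def label(probability: float) -> str:
        """Return the IPCC verbal label for a subjective probability."""
        for name, low, high in IPCC_CONFIDENCE_SCALE:
            if low <= probability <= high:
                return name
        raise ValueError("probability must lie between 0 and 1")

    print(label(0.95))   # -> "very likely"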
As might be expected, there are no models or correlations falling into the medium likelihood, unlikely or very unlikely categories. Chapter 8 of Climate Change 01, “Model Evaluation” (3), evades the
problem. A paragraph headed “What is meant by evaluation?” (4)
never answers the question. They talk about “an approach” to
evaluation. They confess “We fully recognise
that many of the evaluation statements we make contain a degree of
subjective scientific perception and may contain much “community” or
“personal” knowledge. For example, the very choice of model
variables and model processes that are investigated are often based upon
the subjective judgement and experience of the modelling community.” In truth, all of their evaluation is subjective and, since it is made by the modelling community itself, suspect. The
Executive Summary of the Chapter (5)
consists entirely of vague subjective opinions:
· “Coupled models can provide credible simulations”
· “Confidence in model projections is increased by the improved performance…”
· “There is no systematic difference…”
· “Some modelling studies suggest that…”
· “The performance of coupled models … has improved…”
· “Other phenomena previously not well simulated in coupled models are now handled reasonably well”
· “Analysis of, and confidence in, extreme events … is emerging”
· “Coupled models have evolved and improved significantly…”
· “Confidence in the ability of models to project future climates is increased by the ability of several models to reproduce the warming trend in the 20th century surface air temperature when driven by radiative forcing due to increasing greenhouse gases and sulphate aerosols”
The last statement illustrates the imperfect character of the IPCC “confidence”. Firstly,
as explained in our Chapter 3, the warming trend of the combined weather station temperature measurements is most plausibly explained by their biased proximity to human habitation, and by such phenomena as volcanic eruptions and oceanic and solar variability, none of which are incorporated in the models. Secondly,
model parameters, particularly those for sulphate aerosols, are so uncertain that it is possible to simulate almost any climate sequence, including a temperature fall, by suitable choices of parameters. Thirdly,
a correlation, however successful, does not necessarily imply a cause
and effect relationship. Fourthly,
the models are never applied to the more reliable temperature record in
the lower atmosphere, which shows no warming over the past 23 years. Despite
these entirely qualitative, inevitably prejudiced “assessments”, the conclusion is that “We consider coupled models, as a class, to be suitable tools to provide useful projections of future climates” (5). All
this, despite the fact that no model has ever provided a successful
prediction of a future climate. The
Chapter continues with similar qualitative opinions which are too
numerous to mention. A
number of spurious statistical procedures are used for “evaluation”. For example, it is common to provide a “range” of results and to treat this as somehow equivalent to an uncertainty figure. Of course this is nonsense. Each modeller will choose what he thinks are the best
parameters and equations, but the “range” of results is not a fair
measure of the probability distribution that would result if a proper
statistical study were made. An
example is the treatment of “Climate Sensitivity”, the predicted
global mean temperature rise for a doubling of atmospheric carbon
dioxide concentration, derived from many models. The “range” of
results for the global temperature rise is quoted as between 1.5°C and
4.5°C. This figure was, apparently, originally derived by a “show of
hands”, a typically unscientific procedure. But this “range” does
not begin to characterise the true uncertainties of model results.
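The distinction can be made concrete with a small sketch using invented numbers (they are not the model results behind the quoted 1.5°C to 4.5°C figure): the minimum-to-maximum “range” of an ensemble is reported directly, whereas a proper statistical treatment would at least state a mean, a standard deviation and confidence limits, and even that would rest on the unverified assumption that the models are independent, unbiased samples of the truth:

    # A minimal sketch, using invented numbers, of the distinction drawn in the text:
    # the minimum-to-maximum "range" of an ensemble of model results is not the same
    # thing as a statistically derived uncertainty. The sensitivities below are
    # placeholders, not the model results behind the quoted 1.5-4.5 degC figure.
    import statistics

    model_sensitivities_degC = [2.1, 2.8, 3.4, 1.9, 4.2, 3.1, 2.6, 3.8]  # hypothetical

    # The "range" quoted in IPCC reports is simply the spread of whatever models exist:
    print("range:", min(model_sensitivities_degC), "to", max(model_sensitivities_degC), "degC")

    # A proper statistical summary would at least state a mean, a standard deviation and
    # confidence limits -- and even that assumes the models are independent, unbiased
    # samples of the truth, which is exactly what has never been demonstrated.
    mean = statistics.mean(model_sensitivities_degC)
    sd = statistics.stdev(model_sensitivities_degC)
    half_width = 1.96 * sd / len(model_sensitivities_degC) ** 0.5
    print(f"mean {mean:.2f} degC, std dev {sd:.2f} degC, "
          f"95% confidence limits on the mean {mean - half_width:.2f} to {mean + half_width:.2f} degC")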
Another dubious statistical procedure is “pattern analysis”, in which a pattern of climate data is compared with one predicted by a model. Invariably, no account is taken of the uncertainties in either component. In a recent paper (6), Reilly et al. put it this way: “it
is preferable to derive parameter uncertainty from observations, but the
needed data often do not exist. Distributions of input parameters then
must be selected by expert elicitation… Care must be taken in applying expert elicitation for well-known biases in human judgement”. Surely, when the “experts” are the modellers themselves, the “well-known bias” immediately applies. This
article was followed by another by Allen et al (7)
which concluded “results
are only of practical value when the factors responsible for the
uncertainty are reasonably well documented and understood, which is
certainly not the case for climate change in the late 21st century”. Perhaps
the best illustration of the huge uncertainties associated with all the
climate models is Figure 5.1 (8,9,10)
which shows the global and annual mean radiative forcing for some of the
model parameters. This
diagram appears three times in Climate
Change 01, each with a
different caption. The caption in Chapter 6 is slightly more honest
about the uncertainties. First
it should be noted that several of the most important contributors to
radiative forcing are not even included. The most important greenhouse
gas, water vapour, and the clouds that result from it, have been
relegated to the status of a “feedback”, where the large
uncertainties in their estimation can be concealed. The
caption (10) says "The
forcing associated with stratospheric aerosols from volcanic eruptions
is highly variable over the period and is not considered for this
plot”. Then,
indirect effects of tropospheric aerosols are left out, because they are
“poorly understood”, despite a statement in Chapter 5 (11)
that shows that they are well enough understood to say: “The
largest estimates of negative forcing due to the warm-cloud indirect
effect may approach or exceed the positive forcing due to long-lived
greenhouse gases”. The
caption to Figure 6.6 (10)
states “The
uncertainty range specified here has no statistical basis…”, meaning that the uncertainties are larger than indicated. The use of the
term “Level of Scientific Understanding” also implies much larger
uncertainties. The caption warns that an overall figure for radiative forcing cannot be obtained by merely adding and subtracting the figures in the diagram. But it is surely obvious that, with the admitted uncertainties, a very large range of net radiative forcings is possible, including zero and negative values. Figure 5.1 surely shows that the parameters commonly incorporated into climate models are so uncertain that the results of the models are completely worthless.
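The arithmetic behind this point is easy to demonstrate. The sketch below uses invented (low, high) bands for a handful of forcing terms, not the values in Figure 6.6, and simply samples each term within its band before adding them; even with these made-up numbers the net forcing spans a wide interval that includes zero and negative values:

    # A minimal sketch, using invented numbers, of why the admitted uncertainties matter:
    # if each forcing term is only known to within a wide interval, the net forcing
    # obtained by adding them can span a wide interval too, including zero and negative
    # values. The (low, high) bands below are placeholders, not the values in Figure 6.6.
    import random

    # Hypothetical forcing terms in W/m^2 as (lowest plausible, highest plausible):
    forcing_bands = {
        "well-mixed greenhouse gases": (2.2, 2.7),
        "tropospheric ozone":          (0.2, 0.5),
        "direct aerosol effects":      (-1.0, -0.1),
        "indirect aerosol effects":    (-2.0, 0.0),
        "solar variability":           (0.1, 0.5),
    }

    random.seed(0)
    samples = []
    for _ in range(100_000):
        # Draw each term uniformly from its band (one crude assumption among many)
        samples.append(sum(random.uniform(lo, hi) for lo, hi in forcing_bands.values()))

    samples.sort()
    print(f"net forcing, 5th-95th percentile: "
          f"{samples[5_000]:+.2f} to {samples[95_000]:+.2f} W/m^2")
    print(f"fraction of samples with net forcing <= 0: "
          f"{sum(s <= 0 for s in samples) / len(samples):.2%}")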
This conclusion is reinforced by reading Chapter 7, “Physical
Climate Processes and Feedbacks” (12)
of Climate Change 01 which
gives a detailed discussion of each of the processes, invariably
concluding that the uncertainties are greater than is usually
assumed by the models. But, of course, they decline to quantify
any conclusion.

Figure 6.1: Global annual mean radiative forcings (in watts per square metre)

So
far, the discussion has been mainly concerned with the general climate
models; but exactly the same considerations apply to carbon cycle models
(13). We have some actual
measured values for the carbon emitted by combustion of fossil fuels,
and for the carbon in the atmosphere (if you
accept that the Mauna Loa and other measurements can be considered
representative). The other components of the cycle are, however,
without known numerical value. There is a theoretical treatment of
carbon dioxide absorption by the ocean, but no reliable measurements,
apart from very rough “estimates” from isotope studies.
The missing link is the carbon absorbed by the land surface, for
which there are no reliable measurements, and also no reliable theory.
Indeed, it used to be thought that there was a net outflow from the land, due to “deforestation”. The carbon cycle projections thus come with no numerical value for their uncertainties.
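The bookkeeping implicit in this argument can be set out in a few lines. In the sketch below the numbers are purely illustrative, not measurements; the point is only that the land term has to be inferred as a residual, and therefore inherits the uncertainty of every other term:

    # A minimal sketch, with invented numbers, of the carbon budget bookkeeping implied
    # above: fossil emissions and the atmospheric increase are (more or less) measured,
    # ocean uptake is a theoretical estimate, and the land term can only be obtained as
    # the residual -- so it inherits every other term's uncertainty. Values in GtC/year.
    fossil_emissions   = (6.3, 0.4)   # (central value, +/- uncertainty), illustrative
    atmospheric_growth = (3.2, 0.1)   # illustrative
    ocean_uptake       = (1.7, 0.5)   # theoretical estimate, illustrative

    def residual_land_sink(emissions, atmosphere, ocean):
        """Land uptake inferred as emissions - atmospheric growth - ocean uptake,
        with uncertainties simply added (a deliberately crude worst-case treatment)."""
        central = emissions[0] - atmosphere[0] - ocean[0]
        spread = emissions[1] + atmosphere[1] + ocean[1]
        return central, spread

    land, land_err = residual_land_sink(fossil_emissions, atmospheric_growth, ocean_uptake)
    print(f"inferred land uptake: {land:.1f} +/- {land_err:.1f} GtC/year (never measured directly)")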
scenarios" as far ahead as the year 2300 (14)
with no indication of uncertainty. This is science fiction, not science. Although
models showed some success in predicting the temperature effects of the
eruption of Mount Pinatubo in June 1991, the models have failed to
successfully predict any other climate change. On the contrary:
· All the models predict that the Arctic should warm much faster than the rest of the earth. This is just not happening.
· The models predict a temperature increase in the lower atmosphere. Measurements for the last 23 years show that this is not happening.
· The models predict a steady increase in global surface temperature. The combined weather station record has changed in a fashion which is far from steady: between 1940 and 1975 it showed a fall in temperature. Models can only cope with this by the addition of arbitrary quantities of aerosols.
· Models predict that the Northern Hemisphere should warm at a slower rate than the Southern Hemisphere, because most aerosols are produced in the North. The combined weather station record shows greater warming in the North than in the South, and so does the satellite record of the lower troposphere.
· Models are unable to explain why most of the warming of the combined weather station record took place at night, or in the winter.
Despite the very great emphasis placed on models by the IPCC, they have yet to show that their use, either to simulate climate or to predict future climate, can be justified.

References
1. Soon, W., S. Baliunas, S. B. Idso, K. Y. Kodratyev and E. S. Posmentier, 2001. “Modelling climatic effects of anthropogenic carbon dioxide emissions: unknowns and uncertainties.” Climate Research 18, 250-275.
2. Climate Change 01, “Summary for Policymakers”, page 2, footnote 7.
3. Climate Change 01, Chapter 8, “Model Evaluation”.
4. Climate Change 01, Chapter 8, page 474.
5. Climate Change 01, Chapter 8, Executive Summary, page 473.
6. Reilly, J., P. H. Stone, C. E. Forest, M. D. Webster, H. D. Jacobs and R. G. Prinn, 2001. “Uncertainty and Climate Change Assessments.” Science 293, 430-433.
7. Allen, M., S. Raper and J. Mitchell, 2001. “Uncertainty in the IPCC’s Third Assessment Report.” Science 293, 430-433.
8. Climate Change 01, “Summary for Policymakers”, Figure 3, page 8.
9. Climate Change 01, “Technical Summary”, Figure 9, page 37.
10. Climate Change 01, Chapter 6, “Radiative Forcing of Climate Change”, Figure 6.6, page 392.
11. Climate Change 01, Chapter 5, “Aerosols, Their Direct and Indirect Effects”, page 334.
12. Climate Change 01, Chapter 7, “Physical Climate Processes and Feedbacks”, pages 417-470.
13. Climate Change 01, Chapter 3, “The Carbon Cycle and Atmospheric Carbon Dioxide”, Section 3.6, Carbon Cycle Model Evaluation, pages 213-218.
14. Climate Change 01, Chapter 3, page 223.