Third Assessment Report
(TAR) - 2000
Comments and Reviews - Part 1
The Intergovernmental Panel on Climate Change (IPCC) has recently drafted a third assessment report (TAR-2000) intending it for restricted circulation only to `experts' (as determined by the IPCC) for review. The wider community who stand to be most affected by the policies flowing from this report were excluded from this process. The restricted URL containing this report has also since been withdrawn.
This forum is open to expert and non-expert alike to submit comments.
Comments should be addressed to daly@microtech.com.au
with `TAR-2000 Review' in the subject line.
- John L. Daly
S. Fred Singer (USA) | 15 Dec 1999 | Detailed chapter-by-chapter review of TAR-2000
Vincent R. Gray (New Zealand) | 13 Dec 1999 | General Comment on the entire TAR-2000 report
John L. Daly (Australia) | 19 Dec 1999 | Comment on TAR-2000 `experts only' policy
Peter Dietze (Germany) | 21 Nov 1999 | Technical critique of some key assumptions in TAR-2000
Dr Jarl Ahlbeck (Finland) | 25 Nov 1999 | Detailed comment and critique of TAR-2000
Dr Jarl Ahlbeck (Finland) | 21 Jan 2000 | Whatever happened to the Medieval Warm Epoch?
Dr Hugh Ellsaesser (USA) | 15 Dec 1999 | Technical critique of TAR-2000
Dr Sonja A. Boehmer-Christiansen (UK) | 31 Jan 2000 | IPCC politics, funding, and peer review
Jarl Ahlbeck (Finland) | 2 Feb 2000 | Comments re corruption of the science
`Wolfgang' (Germany) | 2 Feb 2000 | Query re spectral characteristics of GHGs
Jarl Ahlbeck (Finland) | 2 Feb 2000 | Response to Wolfgang re atmospheric heat transfer
Dr Sonja A. Boehmer-Christiansen (UK) | 2 Feb 2000 | Response to Jarl Ahlbeck re latent heat & convection
Richard Courtney (UK) | 2 Feb 2000 | Is there a greenhouse effect when it's raining?
Jack Barrett (UK) | 2 Feb 2000 | Tracing the origin of 4 W/m^2
Chick Keller (USA) | 2 Feb 2000 | Remarks on the value of computer modelling
Vincent Gray (New Zealand) | 3 Feb 2000 | Discusses problems of model validation & sensitivity
Chick Keller (USA) | 3 Feb 2000 | Discusses problems of tuning models
Hugh Ellsaesser (USA) | 2 Feb 2000 | Further discussion on atmospheric radiation physics
Vincent Gray (New Zealand) | 4 Feb 2000 | Discusses surface-satellite differences
Chick Keller (USA) | 3 Feb 2000 | Response to Vincent Gray re surface-satellite issues
Jarl Ahlbeck (Finland) | 3 Feb 2000 | Comments on the uses and misuses of GCMs
Richard Courtney (UK) | 3 Feb 2000 | Response to Jarl Ahlbeck re the GCMs
Dr Hartwig Volz (Germany) | 3 Feb 2000 | Response to Jack Barrett re radiation equations
Steve Hemphill (USA) | 3 Feb 2000 | Response to Vincent Gray, incl. heat transfer physics
Jarl Ahlbeck (Finland) | 4 Feb 2000 | Response to Steve Hemphill re heat transfer physics
Jarl Ahlbeck (Finland) | 4 Feb 2000 | Proposal for a consensus
Subject: Review of IPCC TAR
Date: Wed, 15 Dec 1999 17:41:49 -0500
From: "S. Fred Singer" <ssinger1@gmu.edu>
To: daly@vision.net.au
IPCC TAR REVIEW by S. Fred Singer
General
1. Please provide an index, at least for the topics if not authors, giving chapter, section, and page.
2. Address Article 2 of the FCCC. What is the scientific meaning of “dangerous interference with the climate system”? How can one pinpoint a critical level of greenhouse gases that will “prevent dangerous interference”?
3. Show the reduction in future temperatures for different emission control scenarios: i.e. for 5.2% [Kyoto Protocol], all the way to an 80% general reduction from 1990 levels mentioned in the IPCC-FAR [1990].
4. Provide one or two “standard” aerosol scenarios (i.e., geographic and temporal distribution, past and present) for modelers to allow better intercomparison. It will also prevent a modeler from matching the observed temperature record by adjusting the many different aerosol parameters.
5. Often, a paper is cited but its result is not stated if it disagrees with the main conclusion. For example, Fischer [1999].[Ch. 2, page 5, line 48]
Chapter 1
Page 1-line 42: The evidence for human activities influencing the climate is vague. Spell out exactly what this means. For example: stratospheric temperature decreased, reduced diurnal range, etc.
2-8: What is meant by “large scale aspects”?
3-2: Water vapor is the strongest GHG.
3-26: Winds also affect the absorption or release of CO2 at the atmosphere-ocean interface.
4-52: Replace “net radiation” by “net energy flux”.
5-38: WV can also produce a negative feedback depending on its vertical distribution.
6-18: Until the work of Mann [1999] is confirmed, I would not assume that the medieval warm period was local. Chapter 2 does not show any contrary evidence.
7-34: Is CO really increasing? Not in the United States, according to EPA.
8-24: The optical properties of aerosols are also affected by ambient humidity.
10-21: It is significant that none of the four SRES scenarios agree with IS92 for the year 2000.
11-1: The predictability of El Nino is based on having accurate starting data. It does not validate the models as far as trends of GHG are concerned.
12-lines 8 to14: This paragraph is hard to understand.
12-23: Do not link the sea level rise to the increase of temperature. It is most likely the result of the slow melting of the West Antarctic ice sheet. It has been ongoing for some 15,000 years.
12-33: There also appears to be evidence for a decreasing frequency of El Nino events between 1900 and 1940 (Hadley data).
13-8: The temporal and spatial patterns do not support the model calculations according to the publications of Michaels, Webber, and Singer. Thus, there is no support for the IPCC conclusion quoted on lines 11 and 12.
Figure 1.3: The troposphere temperature information is misleading. It should read 0.3 C increase between 1975 and 1980. Please check the data on spring snow cover and on soil moisture and on cloud cover trends. Give references for all of the assertions in Figure 1.3a and 1.3b.
Chapter 2
4-11: The temperature increase between 1979 and 1988 measured at the surface shows a strong geographic variation. It is certainly much larger than the temperature increase measured in balloons and satellites. Until the causes are well understood, one should not extrapolate the mean trend over the next century.
4-21: It may be agreed that a warming did occur between 1976 and 1980. The subsequent temperature record is under dispute until the differences between surface and satellite observations are fully explained.
4-20: The 1910 to 1945 warming may well have been global. There are huge data gaps, especially in the SH.
5-5: Using this method, Pielke has verified the MSU satellite data, but not the surface data.
5-19: The balloon record of warming occurred during a short span in the 1970s.
5-20: We believe that there is a real difference between temperature trends in MSU/balloon data and surface data.
5-48: The paper by Fischer [1999] shows CO2 increases lagging the temperature increases of all three deglaciations. This paper is cited in the references, but its results are not.
7-9: Can you link the NAO to the sudden warming around 1976?
9-6: This finding differs markedly from an earlier paper by Karl, who made substantial corrections for urban heat island effects. Also not quoted is the work of Goodrich, Balling, and Hughes, who find strong urban warming effects.
10-3: Table 2.1 should also show zonal temperature trends; five to seven zones would be optimal. In column 5 [data set of 1976-1998] add also a column showing 1976-1997, in order to eliminate the influence of the strong El Nino of 1998.
11-17: The difference between rural and urban stations furnishes strong evidence for the existence of an urban heat island effect.
11-48: Give reference to Dai.
14-6: Table 2.2: The meaning of “optimally average” is not clear. Also, add a sixth column for 1976-1997 to eliminate the influence of the 1998 El Nino.
18-11: The suggestion that the response of climate forcing can be different for the surface data than for troposphere data contradicts the earlier claim that the two data sets are consistent. Throughout this discussion, one detects a bias against the tropospheric data from satellites and balloons in favor of the surface data.
18-47: Perhaps one should explain why the data from ANGELL do not meet the standard necessary for analysis of trends.
19-40: In other words, the overall change to the satellite data is close to zero. Version D increases the global trend by about 0.05 C per decade compared to Version B, presented in IPCC-SAR. So this improved version has a slightly less negative trend.
20-31: To judge the consistency of tropospheric vs. surface trends, we need to show the zonal trends.
21-1: Trends are near zero for levels from 850 to 300 hPa. [1.5-8 km] [where GCM results show a maximum warming.]
26-23: Add: but an advance in many maritime glaciers.
26-46: I don’t believe that one can explain the substantial difference between surface and satellite trends by blaming it on uncertainties. This is contradicted also by the very firm statement on line 30 that the surface data show an increase of 0.2 C per decade. The balloon and satellite data cannot be stretched to match this. In any case, in order to be consistent with the GCM results, the balloon and satellite data should show a trend of about 0.4 C per decade. [which they don’t]
27-27: The claim that the 20th century “was warmer than any other in several thousand years” cannot be maintained. It may not even be the warmest century in the last 1000 years. [Dahl-Jensen, 1998]
28-4: There’s also evidence for a Medieval Warm Epoch from subtropical ocean sediment data [Keigwin] and tropical glaciers and Antarctic ice cores.
28-32: But tree ring data [Jacoby] also show no warming after 1940.
29-6: What are the results of temperatures and variability?
30-17: This is also confirmed by the ice core borehole data [Dahl-Jensen 1998] which show the 20th century as the warmest of the past five, but colder than the Medieval Climate Optimum around 1100 A.D.
31-10: These glaciers may be responding, with a delay, to the warming of 1900 to 1940.
31-50: This discussion downplays the real discrepancies of different investigators who use proxies. [see Figure 2.25]
32-11: The section tries to do away with a global or even hemispheric Little Ice Age and Medieval Warm Period. In view of the strong discrepancies between different proxy methods, particularly the uncertainties of the tree ring method, one should not accept such a claim without further proof. I find the ice core borehole data most convincing since they involve actual temperature measurement with thermometers. I would suggest that the report reprint the temperature graph, the main result of the Dahl-Jensen paper.
34-13: Now that we have agreed that solar influences have a role in the 20th century, we have to decide whether the number is closer to zero or to 100%. I don’t see a reference here to the work of Baliunas, one of the most important researchers in this field who has been a proponent of solar influence from the very beginning. Nor do I see a reference to the seminal work of Friis-Christensen and Lassen [1991] or of Hoyt. This leads me to think that this group has still not accepted the reality of a solar influence on climate variability. Also missing is a reference to Van Loon [1999].
35-18: The 20th century is here claimed to be the warmest in this millennium: earlier it said several thousand years.
35-22: The negative statement that the rate of warming appears to be too large to be explained by natural influences alone appears to be the main conclusion. But the greenhouse explanation is contradicted by the cooling of 1940-1975 and the absence of warming in the satellite data of the past two decades. This needs to be mentioned.
37-6: A Holocene variability of 5-8 C on a time scale of 1500 years is indeed very large and difficult to accept. It makes any man-made temperature change look insignificant.
37-47: Quotes Dahl-Jensen, but doesn’t develop her results nor show her graph.
38-37: I note that here, the earth experienced (almost) synchronous changes [as opposed to the claim for local variability by Mann].
40-7: Evidence for coupling between northern and southern polar regions, i.e. for world wide changes.
41-8: The same for the warming and cooling of 11,000 years ago.
41-24: IPCC-SAR showed precipitation increasing up to about 1950 and then declining. Figure 2.30 may in fact be taken from IPCC-SAR.
48-42: There’s no mention of the eleven year cycle of cloudiness discovered by Svensmark.
49-29: Stratospheric WV should be increasing because of the increasing emissions of methane and the contributions from increasing air traffic.
50-29: The temperature record [Singer 1999] shows El Ninos decreasing while temperatures rise between 1900-1940.
51-11: The question is whether these oscillations are predictable by coupled ocean-atmosphere climate models. If not, then climate models cannot be used to derive the natural variability of climate.
65-17: The rest of this paragraph is difficult to read. Please clarify the wording.
65-51: The prior discussion does not support the claim that increases in lower tropospheric temperature of satellite and balloon measurements “have been confirmed by a variety of different analyses.”
66-44: Decreases in mountain glaciers are consistent with the earlier warming, before 1940, taking account of the delays of decades required to make an impact.
66-48: Satellite and balloon measurements do agree, but they do not show an increase in tropospheric temperatures.
Page 116, Figure 2.14: Which version of MSU is used here? Is it MSU version B?
Page 117, Figure 2.15: By taking the endpoint in 1998, a bias is introduced. Please show also the data for 1979-1997.
Page 121, Figure 2.19: It is strange that the large warming between 1900-1940 showed no effect on NH sea ice. Shrinking occurred in summer; little trend in winter. No trend in the Antarctic (Fig 2.18, p. 120). Please explain, or at least comment.
Page 126, Figure 2.24: This graph essentially denies the existence of the Little Ice Age and Medieval Climate Optimum. This contradicts other proxy data we know of. [Jacoby, Dahl-Jensen, Keigwin] These other data show no warming since 1940.
Page 128, Figure 2.26: Give reference.
Page 129, Figure 2.27: Give reference.
Page 132, Figure 2.30: Is this the same figure as Figure 3.11 on page 155 of IPCC-SAR?
Page 135, Figure 2.32: Give reference.
Page 152, Figure 2.49: Shows decreasing trend of strongest tornadoes since about 1970.
Chapter 3
The chapter nowhere deals with the residence time of CO2. We may, therefore, accept (with Sarmiento et al.) the 30-year period for half of an injected pulse to disappear. There is no evidence here for a claimed persistence of 15 to 30% out to several thousand years (Ledley et al., Eos 1999).
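The 30-year half-removal figure cited here can be turned into concrete numbers. The sketch below assumes a simple single-exponential decay with a 30-year half-life (a deliberate simplification: multi-thousand-year persistence claims rest on multi-exponential carbon-cycle impulse responses, which this toy model omits) and shows what fraction of an injected pulse would remain after various times.

```python
HALF_LIFE = 30.0   # years for half of an injected CO2 pulse to disappear
                   # (the Sarmiento-style figure cited in the comment)

def fraction_remaining(t):
    """Fraction of an injected pulse left after t years, assuming
    a single-exponential decay with the stated half-life."""
    return 0.5 ** (t / HALF_LIFE)

for t in (30, 100, 300, 1000):
    print(f"after {t:>4} yr: {fraction_remaining(t):.1%} remains")
```

Under this assumption less than 0.1% of a pulse survives 300 years, which is why the single-decay and impulse-response pictures give such different long-term answers.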
4-15: The chapter contains a strong ideological bias against the existence of a major sink in North America during the period 1988-1992, as has been found by Fan et al.
5-33: There is much discussion about the negative and positive feedbacks that can affect the CO2 concentration. But in the final analysis, it is the uncertainty about future emissions that dominates.
9-17: The earlier onset of spring growth at northern high latitudes may not be the result of a warming trend, it may be caused by increased CO2.
10-4: Increased tree growth may be due to anthropogenic N deposition, i.e. pollution.
11-7: The discussion of soil management should be expanded, since soil uptake of CO2 may be an important sequestration mechanism.
11-16: Here for the first time, we get some inkling about the lifetime of CO2. What is the evidence that it will take “thousands of years” for the ocean to take up 90% of emitted CO2? How does this square with volcanic emissions?
11-32: What about the effect of the temperature and temperature history of these water parcels which determine the solubility of CO2?
11-52: As CO2 is increasing less rapidly now, does this mean that the ocean sink will be more effective? If so, spell it out.
12-6: Is there any experimental evidence to back this up? If so, please quote it.
12-49: Iron dust is also supplied by volcanoes. Please provide documentation.
13-39: The data on atmospheric concentration of CO2 from ice cores have been criticized by Jaworowski. These objections should be documented and discussed more fully.
14-41: Rising CO2 did not just parallel rising temperatures, but lagged them by about 600 years, as shown in the ice core data of Fischer et al. This paper is quoted elsewhere, but its main result is not mentioned.
15-3: Here again, a paper is quoted [Indermuehle et al. 1999] but there is no mention of its major result, the absence of a steady-state for CO2.
17-1 through 4: This discussion is somewhat obscure.
18-18: Tropical Pacific sea surface temperature did not increase in the eastern Pacific.[Singer, Climate Research submitted]
18-18: Should not a temperature increase produce outgassing and reduce rather than increase surface water pCO2?
18-27: There is something wrong with the sentence.
18-28: Why “reduced” outgassing during El Nino?
19-53: The results by Fan et al. are judged to be “implausible”. Please quote Fan’s response to his critics.
20-21ff: This paragraph seems to be biased against the results of Fan et al. [1998] [Not 1990] How does Fan respond?
26-25: Future scenarios of CO2 emissions clearly dominate all other uncertainties.
Page 39: On the ordinate, define “meg”.
Page 41: On the figure caption, is flux “to” atmosphere correct?
Page 42: This figure requires more explanation.
Chapter 4
4-13: Discuss whether increases in stratospheric WV can lead to cirrus.
5-33ff: The next two paragraphs deal with very complicated issues. Can these be accommodated in climate models?
11-9: It is ironic that emissions from automobile catalytic converters can be a significant source of N2O, which in turn is the major source of NOX in the stratosphere, and thereby can play an important role in depleting ozone in the lower and mid-stratosphere.
12-48: According to the WMO assessment of 1999, the depletion of stratospheric ozone at mid-latitudes has been about 4%. I don’t consider that substantial since day-to-day changes can be of the order of 100% or more.
16-54: I’m not sure that we have a quantitative handle yet on how much WV enters the stratosphere through the tropical tropopause. Methane may still be the major source.
24-1ff: The chain of events in this feedback loop are certainly plausible. Can one judge from paleoevidence whether such a feedback is negative and self-damping?
Page 35: The table caption of the last column on the right is not quite clear. Which of the two possibilities is correct: forcing per ppb or forcing since 1760?
Chapter 11 Changes in Sea Level
The chapter should show the sea level rise data of Fairbanks and Shackleton. The chapter should also show the corrected global sea level graph of Trupin and Wahr of 1900 to 1980 [which has also been adopted by the American Meteorological Society].
The chapter should show graphs giving best estimates of rate of sea level rise over the last 20,000 years.
3-13: Please give the references to back up the statement that sea level rose at 10-20 cm/cy over the last 3,000 years. I could not find them in the chapter.
3-21: The claimed acceleration “in the early 20th century” is not substantiated in the chapter. The best judgement by Douglas and others is that there has been no significant acceleration during the past 100 years, not just “during the latter part of the 20th century”.
3-43 and 44: Please specify the sign and amount of the mass imbalance of Greenland and Antarctica separately.
3-47 and 48: How do these numbers compare to the estimate on line 44? How do they compare to the estimates in IPCC SAR?
4-lines 1,2,3: The best central value from IPCC SAR is more like 3-6 cm rather than the 8-11 cm given here.
4-12: The thermal expansion given here is high compared to IPCC SAR. Why is that?
4-16: These projections ignore the publications of Bindschadler and of Conway et al. [Science 1999, and not quoted under references]
5-4: This Greenland melting rate seems very high, 7 cm/cy of SL rise. Since Greenland has experienced temperature increases of 3 C, one can check such numbers from ice core data.
7-22: But if the circulation slows down, as climate models have tried to calculate, then there will be less heat carried to the Northern Atlantic, less warming of the deeper ocean, and more evaporation from the tropics.
7-49: A thermal expansion from El Nino has been directly observed by Cazenave.
10-14: This thermal expansion, of the order of 1 mm/yr is local not global.
11-47: To avoid confusion, give the rise in sea level in mm rather than m. Note that the acceleration is calculated but not observed [Bruce Douglas, Chris Harrison]. Note also that the rate of SL rise varies by a factor of three among the different models.
17-17: The Greenland and Antarctic ice sheet should be discussed separately. Do not give joint numbers such as on line 21.
20-11: Do these values refer to equivalent sea level change or to accumulation? Confusing.
20-23: Please give a reference to the statement.
21-45: For Greenland, a 1 C warming corresponds to 3 cm/cy of global SL equivalent. Question: is that a global warming or a local warming of Greenland only?
21-53: Check the numbers 220 mm/yr and 63 mm/yr.
22-1: Check the number 250 mm/yr and note that Thompson and Pollard give a much higher number in the abstract of their paper delivered at the AGU meeting.
22-5: 1 C warming: local or global?
27-35: No evidence for any acceleration of SL rise in the 20th century, i.e. when climate warmed rapidly. These data would seem to disagree with the table.
30-12,14: The calculated rise during the century is 5-8 cm versus 18 cm observed. Could this be due to decreasing storage [line 37] or to a background contribution of the ice sheets [line 46].
31-8,18: No acceleration during this century. [1905-1985]
33-7: The table uses a very high value of temperature rise corresponding to a CO2 increase following the IS92a scenario, at 1 % per year. It considers only thermal expansion and ignores the compensating effects from Antarctic accumulation.
37-21: The "background ice sheet" refers to what? Is that 1 cm to +7 cm?
37-23: What about ground water mining, as contemplated in Egypt and Libya?
42-36: "Local annual warming over Greenland about double the global mean." Please give reference. "About five times as strong in winter as in summer." This means that there will be little loss of ice by melting. Please calculate it or at least point it out. Compare this with the earlier estimates, which assume an average increase in temperature.
43-45: Current recession is given by Bindschadler.
43-47: The contribution to rate of SL rise of up to 1 mm/yr is not at all “modest”.
44-18: WAIS demise in a few 100 years? [but we have seen historic warmings which did not remove it] This value is also supported by Oppenheimer [line 26] but is not supported by Bentley [line 31]. But see line 42.
Page 66: Explain the Figure caption more fully, including the abbreviations inside the box.
Page 71: Please quote the most recent work of the same author which looks at local sea level rise and local temperatures in the tropical Pacific.
Page 73: Check the accuracy of the caption for Figure 11.9d.
Chapter 12 “Detection of Climate Change and Attribution of Causes”
P.5-L8 Explain how model variability can be larger than the actual climate variability.
9-53 Is “reduced” correct?
9-54 Do you mean Fig. 12-5?
P.39 Fig 12.1 Aside from different drifts, can you explain to the readers why the 3 models show such different internal variability? Which one is closest to the “natural” variability?
S. Fred Singer,
President Science & Environmental Policy Project
4084 University Drive, Suite 206a
Fairfax, VA 22030
From: Dr Vincent Gray
Date: December 13th 1999
IPCC THIRD ASSESSMENT REPORT (TAR) EXPERT REVIEW
GENERAL COMMENTS ON THE ENTIRE REPORT
Please provide an INDEX. It was very difficult to track down any subject in previous reports.
The Subject of the Project, and its Title, should be "Climate Science"
The constant emphasis, amounting to an obsession, on "Climate Change" has distorted the whole Report. The impression is given that "change" is somehow abnormal, unnatural, undesirable or unnecessary, whereas, in fact, change is inevitable, necessary, natural and universal. It is impossible to study climate (or most other topics) without studying change. There is no need to include "change" in the title, or to continually harp on its existence. Headlines such as "Is the climate really changing?" (Chapter 2) are simply stupid.
An important objective of this report is to examine the evidence for a human influence on the climate. Despite continual claims of "confidence" and "progress" a human influence on the climate has not been established. The following statements from the Report summarise this fact and serve as a conclusion to the Report.
"The fact that global mean temperature has increased since the late nineteenth century and that other trends have been observed does not mean that we have identified an anthropogenic effect on the climate system" (Chapter 1, Page 12, Lines 48, 49).
"The net forcing of the climate over the last 100 years (and since pre-industrial times) may be close to zero or even negative" (Chapter 5, Page 50, Lines 8, 9).
There are two major pieces of evidence of human influence on the climate.
The first is summarised by the surface temperature record of the past century, obtained by amalgamation of meteorological temperature records, which shows a temperature rise of around 0.7°C over the century. There seems, in Chapter 2 of this report, to be a recognition that the evidence for human influence is largely confined to the measurements since 1975. The period before 1945 is likely to have been influenced by a recovery from cold conditions in the last century. It could not have been influenced by greenhouse gas emissions, which were low over this period. The Report fails to examine the possibility that the figures might also have been influenced by inadequate data, particularly by the many gaps created by two world wars.
The period from 1945 to 1975 showed a slight fall in surface temperature. This Report makes no attempt to explain this. It is only interested in temperature increases.
According to Tables 2.1 and 2.2, the global rate of increase from 1976 to 1998 was 0.20°C per decade, mainly in the Northern Hemisphere land readings (0.32°C per decade). The question is, could this be due to human influence, and if so, why should it be so much greater in the Northern Hemisphere?
From 1979, with results inadequately displayed in Chapter 2, MSU satellites have measured global temperature in the lower troposphere and found a global trend 1979-1999 of 0.056°C per decade, again predominantly in the Northern Hemisphere (0.113°C per decade), with the trend in the Southern Hemisphere negative (-0.002°C per decade). This detailed information is omitted from the Report.
Both the surface and lower troposphere trends have been biased upwards by an unusually large El Niño event in 1998. If allowance is made for this event, the satellite measurements have not identified any continuing global upwards trend in lower troposphere temperature. As the measurements have taken place in the region where radiative forcing is considered likely, this means that a continually increasing forcing such as that attributed to greenhouse gases has been negligible over the period of measurement. This conclusion has been corroborated by weather balloon measurements, also not adequately presented in this Report.
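The sensitivity of a short-record trend to a single warm endpoint, as described above, is easy to demonstrate numerically. The sketch below uses an invented, deliberately simple series (not the actual MSU record): twenty years of zero anomaly with a single +0.5 C spike in the final year, fitted by ordinary least squares with and without that last point.

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Synthetic 20-year anomaly record (deg C): flat at 0.0,
# with a +0.5 C spike in the final year (an El Nino-like event).
years = list(range(1979, 1999))                 # 1979..1998
anoms = [0.0] * 19 + [0.5]

trend_full = ols_slope(years, anoms) * 10       # deg C per decade
trend_trim = ols_slope(years[:-1], anoms[:-1]) * 10

print(f"trend incl. 1998 spike: {trend_full:+.3f} C/decade")   # +0.071
print(f"trend excl. 1998:       {trend_trim:+.3f} C/decade")   # +0.000
```

A single anomalous endpoint turns an exactly flat record into an apparent warming of about 0.07 C per decade, which is why the reviewer asks for trend columns ending in 1997 as well as 1998.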
We therefore have to explain why surface temperature measurements have risen over the period when troposphere temperatures have not, and what human influence, excluding radiative forcing from greenhouse gas emissions, could have been involved.
Evidence provided in this Report shows that the explanation is local heating around surface measurement stations.
This evidence is as follows:
The study by Peterson et al. (1999) showed that the average temperature rise in all stations was similar to the average rise in rural stations. This does not mean that both are unaffected by local heating; it means that both are equally affected by local heating. Previous studies showing a small increase of urban over rural stations made the same false assumption that rural stations could be regarded as unaffected by local heating.
Figures 2.10a, 2.10b, 2.10d, and 12.6a show that temperature rises from 1976 to 1996 were greatest in high latitude cold climates, presumably in "rural" measurement stations. These stations are actually more affected by local heating than urban stations in temperate climates. My paper on "Regional Temperature Change" to Geophysical Research Letters further explores this evidence.
The temperature rise in the cold climates was greatest in the winter months. See Figures 2.11a and 2.11b.
The temperature rise was greatest in the Northern Hemisphere, where both the urban and the cold rural sites are concentrated.
The reduction in the Diurnal Temperature Range (DTR) was also predominantly in cold climates for rural stations (see Figure 2.2), and was usually due to an increase in minimum temperature.
The belief that "rural" stations do not undergo upwards temperature drift is proved wrong by the large temperature rises in "rural" stations in Russia/Siberia and the Arctic shown in Figures 2.10, 2.11 and 12.6a.
The difference between land measurements and sea surface measurements (Figure 2.7) shows that there was an upwards temperature drift on land.
The evidence that increases in greenhouse gases in the atmosphere are causing changes in the climate continues to be unconvincing. That increases in greenhouse gases are not causing a global temperature change is shown by the following facts:
Temperature measurements by MSU satellites in the lower troposphere, where greenhouse warming is supposed to take place, have found no evidence for such a warming for the past twenty years, although these measurements do respond to volcanic, ocean (El Niño) and solar changes.
Computer based climate models based on the theory of greenhouse warming do not fit the surface temperature record.
The fall in the surface temperature record between 1946 and 1975, at a time when greenhouse gases increased steadily, is incompatible with a greenhouse warming theory.
According to Chapter 5, Page 50. "the total net forcing of climate over the last 100 years (and since pre-industrial time) may be close to zero, or even negative."
Since the increase of greenhouse gases in the atmosphere is not responsible for the recent rise in global surface temperatures, and has not been shown to have had any other harmful effects on the climate, then there can be no justification for measures to control or reduce greenhouse gases, such as the Framework Convention on Climate Change, or the Kyoto Agreement.
The second important piece of evidence that human activity may be influencing the climate is the undoubted rise in concentration of greenhouse gases, particularly carbon dioxide, in the atmosphere.
The recent increases of carbon dioxide in the atmosphere are inadequately displayed in this Report. There is no Figure showing these increases. Although the variability in the rate of increase is shown in Figure 3.5, there is no statement of the obvious conclusion from this figure: that carbon dioxide in the atmosphere has been increasing at the approximately linear rate of 1.5 ppmv/yr (0.4%) for the past 33 years. There is suspicion that the failure to emphasize this fact is related to the ridiculous assumption of many climate models that carbon dioxide is increasing at 2½ times this rate: 1% a year.
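The gap between the two growth assumptions criticised here can be checked with back-of-envelope arithmetic. The sketch below is illustrative: a starting concentration of 368 ppmv is an assumed round late-1990s value, not a figure taken from the Report.

```python
import math

C0 = 368.0      # assumed late-1990s CO2 concentration, ppmv (round figure)
LINEAR = 1.5    # observed approximately linear growth, ppmv/yr
RATE = 0.01     # the 1%/yr compounding rate used in many model runs

# First-year increment implied by each assumption
inc_compound = C0 * RATE
print(f"1%/yr implies {inc_compound:.2f} ppmv in year one, "
      f"about {inc_compound / LINEAR:.1f}x the observed {LINEAR} ppmv/yr")

# Years to double the concentration under each assumption
years_linear = C0 / LINEAR                        # 368 / 1.5
years_compound = math.log(2) / math.log(1 + RATE)
print(f"doubling takes ~{years_linear:.0f} yr (linear) "
      f"vs ~{years_compound:.0f} yr (1%/yr compounding)")
```

Under these assumed numbers the 1%/yr rule implies about 3.7 ppmv of growth in the first year, roughly 2.5 times the observed rate, and a CO2 doubling in about 70 years instead of about 245, which is the point of the "2½ times" criticism above.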
The uncertainty in knowledge of land and ocean involvement in the carbon cycle, plus negligible information on trends, has meant a complete failure to provide a plausible carbon cycle model. Models have attempted to simulate variability, but no model has even attempted to match the overall record since 1958. As a result, future projections of carbon dioxide (Table 4.10a) and methane (Table 4.10b) are absurdly incompatible with recent trends. Table 4.10a in Chapter 4 projects that the rate of carbon dioxide increase in the atmosphere, after having been constant for 33 years, will suddenly jump by about 50% in the next few years. The rate of increase of atmospheric methane, which has been falling for 16 years, is likewise projected in Table 4.10b to suddenly start rising in the next few years. These projections should be replaced by some that are compatible with current and past trends.
A relationship between carbon emissions and atmospheric concentrations has not been found. Recent falls in emissions have not been noted.
The uncertainties that should be attached to the calculations of radiative forcing have not been supplied. They are partly obscured by the vague concept of "Level of Scientific Understanding".
The truthful statement (Chapter 5, Page 50, Lines 8,9) "the total net forcing of climate over the last 100 years (and since pre-industrial times) may be close to zero, or even negative" has not been absorbed, particularly by the modellists.
The oft-quoted conclusion of the last Report, "The balance of the evidence suggests a discernible human influence on the global climate", should be clarified to explain that the emission of greenhouse gases as a result of human activity is not, so far, one of the discernible influences on the global climate.
The level of mathematical statistics in the report is everywhere inadequate, verging on the appalling.
Only a few individual results give 95% confidence levels, and there is frequently confusion as to whether the figures quoted are one, or two standard deviations.
There is no realisation that both the climate observations and the model results are probability distributions, not single-valued functions. When climate observations are being compared with the models, statistical procedures which take this into account need to be used. "Fingerprint" and "pattern" correlations are defective in this respect.
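A minimal sketch (not from the Report) of the statistical point being made: when both the observed trend and the modelled trend carry uncertainty, a proper comparison combines both error terms rather than treating the model output as exact. All numbers below are invented for illustration.

```python
# Treat observation and model each as a normal estimate (mean +/- 1 sd)
# and test the difference between them with combined uncertainty.
import math

def trend_difference_z(obs_mean, obs_sd, mod_mean, mod_sd):
    """z-score of the difference between two independent trend estimates."""
    combined_sd = math.sqrt(obs_sd**2 + mod_sd**2)
    return (mod_mean - obs_mean) / combined_sd

# Invented example: observed 0.10 +/- 0.05 deg/decade, model 0.20 +/- 0.08
z = trend_difference_z(0.10, 0.05, 0.20, 0.08)
print(f"z = {z:.2f}")  # well below 1.96, so not significant at the 95% level
```

The same invented figures, compared while pretending the model output is a single-valued constant, would give a much larger apparent discrepancy; that is the defect attributed above to "fingerprint" and "pattern" correlations.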
There is no discussion of the procedures that need to be followed in order to "validate" or to "evaluate" a model, and no application of such a procedure. There should be an appreciation that this must involve the use of appropriate mathematical statistics.
Comparisons between climate observations and model outputs are everywhere inadequate, ranging from qualitative, subjective judgements to partial correlations which assume model outputs to be constant.
There should be an appreciation that a correlation, however impressive, does not prove a cause and effect relationship.
This statistical illiteracy is so widespread that there has to be a general explanation. It is, surely, that climate scientists and modellists dare not face the consequences of numerical probability figures for the value of model predictions. They may find that:
The chances that current temperatures are "unprecedented" are small.
The chances that net forcing of the climate from greenhouse gas emissions is zero or even negative are high.
The uncertainties in the future projections are so great as to render the projections meaningless.
The chances that we have identified a human contribution to the climate are near to zero.
The Report greatly exaggerates the importance of models.
There are no calculations of the uncertainty of model outputs, in the form of probability distributions, or 95% confidence limits for individual predictions.
If correlation procedures for the two sets of probability distributions were to be carried out, the models would be adjudged successful, but with such large uncertainties they would be of little practical value.
The uncertainties for future projection, which would have to incorporate uncertainties in future human activities would provide future projections for the year 2100 with such a wide possible range that they would be meaningless.
No model has been validated, in the sense of showing a sufficiently impressive agreement with climate observations, and there is not even a procedure laid down as to how this should be done.
Most "evaluations" or comparisons of models with climate observations provide only arbitrary qualitative, subjective estimates of level of agreement, which have no scientific significance.
Most computer climate models make outrageous assumptions about the climate. Many assume that atmospheric carbon dioxide increases at 1% a year, 2½ times the measured value of the past 33 years.
Models place far too much emphasis on the supposed influence of radiative forcing. The fact that this quantity might actually be zero, or even negative (as stated in Chapter 5, Page 50) seems never to be realised. Other forcing mechanisms, such as the sun, volcanoes, and ocean variability, are not incorporated into models, yet the models are expected to simulate real climates.
Models assume that the surface temperature record is determined by atmospheric greenhouse gas concentrations. Arguments given above indicate that this may not be true. There is then a question as to which climate observations might be simulated by the models.
Carbon cycle models do not provide quantitative estimates of their large uncertainties, mostly related to the absence of information on land and ocean absorption.
Carbon cycle models have never attempted to match recent annual trends in atmospheric carbon dioxide concentration even by the inadequate methods used with the climate models. As a result, their future projections are incompatible with these recent trends.
Many Chapters put model results before climate observations. There is often an implication that model results are more believable than climate observations despite their obvious limitations. One Chapter even regards climate observations as "anomalous" when they do not agree with the models.
The "state of the art" is truly primitive.
Chapters 7, 8, 9 and 12 should be amalgamated into one Chapter. At present they are repetitive and overlapping. There would be an opportunity to combine the purely theoretical material in Chapter 7 with the attempts to relate models to real climates in Chapters 9 and 12.
Chapter 13, on scenarios, fails to provide information about IPCC scenarios and their problems.
Vincent R. Gray
From: John L. Daly
Date: 19 Dec 99
Will They Never Learn?
The 1995 IPCC report on climate change was thoroughly discredited by subsequent revelations about ex-post-facto changes to Chapter 8 of that report, changes which diluted the science to make it consistent with the politically motivated statements in the executive summary. As many critics have since pointed out, the policy-makers' summary should have been made consistent with the science, not the other way around.
The discrediting of the 1995 Report explains in large measure why governments have done little or nothing about climate change in spite of ringing rhetoric at a succession of climate conferences, most notably that of Kyoto. But then, talk is cheap.
A Third Assessment Report by the IPCC is now in preparation, the so-called `TAR 2000', and it seems they are setting themselves up for exactly the same debacle as happened in 1995. A draft of TAR 2000 is now being circulated to `experts' and the IPCC have already made it clear that not only do they not want public input, they don't even want the public to know what new climate scenarios they have in preparation.
Here was their response to a leading skeptic who innocently thought the TAR 2000 draft report was a public document.
"This review is for experts only and while we want as many legitimate experts as possible to review the document we do not want the document to be openly available as it is an early draft and must not be cited or quoted or used for any other purpose other than providing a review.
I am very concerned that the wide distribution of the address outside the expert community will lead to wide abuse of the embargo on the report for any purpose other than expert review ..."
It was signed by Dr Dave Griggs of the Hadley Centre at Bracknell U.K., the organisation co-ordinating the report.
`Experts only' is their catchcry. However, since this is a matter of public importance and interest, my own view is that `He who pays the piper should at least hear the tune'. The public have every right to know what these people are doing with our money, and the IPCC owe it to the public to adopt an `open review' procedure for all future reports.
This dedication to secrecy and avoidance of public scrutiny or accountability is against the public interest and a direct interference with the democratic process, given that this science (if that's the correct term) is being used and abused to foist policies on the people of the world which are not only unnecessary, but would also lead to widespread poverty.
The secrecy ends here. Here is the URL address for the `TAR 2000' draft that they don't want the public to see.
http://www.meto.gov.uk/sec5/CR_div/ipcc/wg1/drafts/chapts.html
18 Dec 1999: Individual TAR Review by Peter Dietze <091335371@t-online.de> ==> Anne Murrill <amurrill@meto.gov.uk>
Comments to IPCC TAR
Subject: Re IPCC WGI TAR
Date: Sun, 21 Nov 1999 19:24:27 +0100
From: "P. Dietze" <091335371@t-online.de>
To: djgriggs@meto.gov.uk
CC: jthoughton@ipccwg1.demon.co.uk, mmaccrac@usgcrp.gov
Dear Dr. Griggs,
sorry about posting around the TAR Draft URL - I suppose I got onto some mailing list by mistake - and there was no indication like 'confidential' or 'for reviewers only'. So I thought IPCC might intentionally be running an open policy now to avoid critique after issuing the TAR, and I was very glad and curious to read the draft and to let others do the same. After all, science should not be a secret, right?
Thank you for inviting me for a review. I think I ought to do it, but as I doubt that, as a Contrarian, I have even a bit of a chance to move anything, I haven't made up my mind yet. But after a rough read-through, let me tell you my main points:
~~~~~~ Review points ~~~~~~~~~~
1. For solar (see Fig. 6.12) the TAR still only takes the primary forcing and ignores the amplification (Svensmark effects) by about a (transient) factor of four - which is based on best-fit (nonlinear regression) analysis, see http://www.john-daly.com/fraction/fraction.htm.
2. This leads to an overestimation of the GHG forcing by a factor of about *three* and thus to essential modelling errors, in particular for future higher concentrations. The forcing for CO2 doubling has been calculated from a tropopause energy balance model (Myhre et al.), leaving out the transmissive absorption of the lower atmosphere and in particular the vapour overlapping, which cancels a considerable part of the left-side slope (wavenumbers < 660/cm). Whereas IPCC now takes 3.7 W/m² as best estimate (Ch 6.3.1, p.15), it may rather be only 1.5 W/m²; see the HITRAN-based transmission spectra in the figure of Prof. H. Fischer, IMK Karlsruhe (including emission, vapour and other GHGs) at http://mepc03.met.fu-berlin.de/~dmg/Treibhaus_Statement_lang.html. Not only is the forcing less, the vapour feedback has to be reduced as well. Some modellers who assume a doubling sensitivity of ~3.5°C, caused by a far too high aerosol cooling and a far too small solar forcing, are far out of reality. My first impression is that the TAR again does not disclose the scientific principles of the calculation of IPCC's most important core parameter for CO2 forcing.
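For context, the IPCC's 3.7 W/m² figure comes from the simplified logarithmic expression of Myhre et al. (1998), dF = 5.35 ln(C/C0) W/m². A short sketch putting the two disputed numbers side by side; the 1.5/ln(2) coefficient is simply back-calculated here from Dietze's 1.5 W/m² claim for comparison, not a published value:

```python
# IPCC simplified forcing law (Myhre et al. 1998): dF = 5.35 * ln(C/C0) W/m^2.
# The alternative coefficient is back-calculated from the 1.5 W/m^2 doubling
# figure argued for in the text above (an assumption for illustration).
import math

def co2_forcing(c, c0, alpha=5.35):
    """Radiative forcing (W/m^2) from a CO2 change, simplified log law."""
    return alpha * math.log(c / c0)

print(f"IPCC doubling forcing:    {co2_forcing(560, 280):.2f} W/m^2")
print(f"Dietze-scaled doubling:   {co2_forcing(560, 280, alpha=1.5 / math.log(2)):.2f} W/m^2")
```

With alpha = 5.35 the doubling forcing is ~3.71 W/m², matching the TAR's best estimate; scaling alpha down reproduces the smaller figure, which is the roughly factor-of-2.5 disagreement at issue.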
3. Though I feel the Carbon Cycle knowledge is now much better presented than in the SAR, I missed the details of the dynamic Carbon Model. According to my findings, the half-life of any atmospheric CO2 excess is ~38 years (or ~55 yr e-fold), in accordance with the observed sink flows. Using this realistic parameter only, a proper recalculation of the Mauna Loa curve is possible (see eq. 3 and Table 1 in my German model paper http://www.wuerzburg.de/mm-physik/klima/cmodel.htm and Dr. Ahlbeck's very similar model at http://www.john-daly.com/co2-conc/ahl-co2.htm +reviews.htm). Btw, the same parameters have been specified in a paper by O. Tahvonen, H. von Storch and J. von Storch: Economic efficiency of CO2 reduction programs [Clim. Res. 4, 127-141 (1994)]. With my model and IS92a (burning 1,500 GtC until 2100) I get only 571 ppm. And you should be aware that burning anything more than 1,300 GtC is rather unlikely - so ~548 ppm in 2090 should be the final level.
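The kind of first-order sink model described above can be sketched in a few lines. This is a generic illustration of the decay law with Dietze's ~55-year e-folding time, not his actual model; the constant emission rate and the GtC-to-ppm conversion factor are assumptions chosen for illustration.

```python
# Minimal first-order sink model: the atmospheric CO2 excess X (GtC above
# a pre-industrial baseline) decays with e-folding time TAU while
# emissions E add to it:  dX/dt = E - X/TAU.
TAU = 55.0           # e-folding time in years (the figure cited above)
GTC_PER_PPM = 2.123  # approximate atmosphere mass conversion (assumption)

def simulate_excess(emissions_gtc_per_yr, years, x0=0.0, dt=1.0):
    """Euler integration of dX/dt = E - X/TAU; returns final excess in GtC."""
    x = x0
    for _ in range(int(years / dt)):
        x += (emissions_gtc_per_yr - x / TAU) * dt
    return x

# Under constant emissions the excess saturates near E * TAU:
e = 6.5  # GtC/yr, illustrative constant emission rate
x_final = simulate_excess(e, 1000)
print(f"equilibrium excess: {x_final:.0f} GtC ~= {x_final / GTC_PER_PPM:.0f} ppm")
```

The key property of such a model is that concentration stabilises (at E·TAU above the baseline) once emissions stop growing, rather than rising without bound, which is why it yields far lower 2100 projections than the scenarios criticised in the review.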
~~~~~~~~~~~~~~~~~
The impacts of these three essential points on all modelling results would be tremendous. I wonder how long IPCC can take the risk of using flawed core parameters.
As I am a known contrarian (from John Daly's Website), I suppose IPCC will not and cannot cope with my arguments. So as I do not consent with the TAR, do not use my name within the listing of reviewers.
Please take note of the recent Climate Research Status Report of ten leading climate centres, published in the Bulletin of the American Meteorological Society [Vol. 80, p. 2631 (as reported by Joachim Müller-Jung in the Frankfurt newspaper FAZ on Dec 15th)], which says there is no high significance for an anthropogenic greenhouse effect alone, and thus basically supports my findings and contradicts the previous assertions of the mainstream.
With best regards, Peter Dietze
Dipl.-Ing.
P. Dietze
D-91094 Langensendelbach,
Germany
Phone&Fax +49/9133-5371
Email 091335371@t-online.de
From: Dr Jarl Ahlbeck
Date: 25 Nov 1999
Review report for IPCC WG1 Third Assessment Report (TAR)
This has been posted as a separate paper. Click above title to view
Subject: TAR Chapter 12, comment
Date: Fri, 21 Jan 2000 11:22:30 +0200
From: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
To: <gsharp@montereybay.com>, "George Taylor" <taylor@OCE.ORST.EDU>
CC: (very large CC list)
Dear all,
A hockey stick curve for the recent 1000 years, even more radical than the curve by Mann, was published in Finnish newspapers last week. The WMO was mentioned as the source. In this curve there is no Medieval Warm Period and almost no "Little Ice Age", but a millennium temperature record as early as 1950. The newspaper report claimed that there is now consensus that this century has been the warmest in 1000 years because of the enhanced greenhouse effect, and that the forthcoming IPCC report (TAR) will contain that statement.
According to Prof. Wibjorn Karlen in Sweden, who is an expert on historical climate, the medieval heat occurred in North America too, not only in Europe. Of course, nobody knows whether the heat was fully global or not.
After some research, I found exactly the same curve in the TAR draft, Chapter 12, Figure 12.1, the middle curve. So somebody gives out popularized versions of the material that should "not be quoted" to the media. Well, there is nothing wrong with that; the curve will be published anyway.
But how were these historical temperatures (1000-1900) produced, temperatures that do not seem to be in any agreement at all with the history books? Yes, by SIMULATION. The program used is called GFDL, the authors are Stouffer, R.J., Hegerl, G.C., and Tett, S.F.B., and the reference given in the TAR is "A comparison of Surface Air Temperature Variability in three 1000-year Coupled Ocean-Atmosphere Model Integrations" (J. Climate, in press). The period 1900-2000 is the usual instrumental surface curve.
In TAR chapter 12, the two curves (the simulated hockey stick shaft and the instrumental hockey stick blade) are separate curves. To the media, these two have been combined to a single curve with no mention of how the curve is constructed.
I think this is one of the worst examples of how climate research is misused. Ordinary people believe that the published historical curve is real. In fact, it is not. And it is certainly wrong too, for a simple reason. Most of the 1900-1950 warming is not due to the greenhouse effect (there is not enough CO2 increase), but the curve shows that no such great warming has ever occurred before. So there are three possible explanations:
1. the (natural) warming of 1900-1950 really did set a millennium record for 1950
2. the instrument record for 1900-1950 shows too much warming
3. the computer simulation gives wrong results, as it contains too much greenhouse forcing and too little solar/volcanic forcing.
I would bet on both statements 2 and 3.
Your comments ?
Is there anybody who can send some good undergraduate history books to Mr. Stouffer and to the WMO? I think some people should lift their (--) sometimes from the computer chair and start to civilize themselves a bit.
regards, Jarl (Abo Akademi University, Finland)
Subject: Comments on IPCC TAR
Date: Wed, 15 Dec 1999 14:13:42 -0800
From: Hugh Ellsaesser <hughel@home.com>
Organization: @Home Network
To: daly@vision.net.au
"Specific comments.
CHAPTER 1
Page 4, Line 15 - Incoming solar flux given as 342 W/m2 here but as 343 W/m2 in Fig 1.1.
CHAPTER 2
Page 4, Lines 16/18 - This consistency is not remarkable; it was forced by one of the methods used to generate corrections for the ship observations.
Page 5, Lines 12/13 - The last phrase is redundant.
Page 5, Lines 23/25 - Rates of warming of opposite tendency could equally be rates of cooling of opposite tendency; suggest rates of temperature change of opposite tendency.
Page 5, Lines 40/41 - The early warming of the 20th century is due to two causes; acceptance of unreliable ship data circa 1902/12 and 1942/45 and the Arctic Warming of 1917/38. The latter was undoubtedly due to a temporary acceleration of Broecker's thermohaline conveyor belt into the N. Atlantic. It's difficult to see how either of these could be attributed to solar effects.
Page 6, Lines 24/26 - The available data tell us nothing about the trends in water vapor in the 300/100 mb layer where it is most critical for ejection of IR to space.
Page 7, Lines 1/2 - The decreasing precipitation in subtropical land areas is essentially global and thus is unlikely to be related to ENSO, although ENSO may aggravate certain areas such as the western Pacific and NE Brazil.
Page 9, Lines 8/10 - These lines give trends for rural and all stations which reverse relative magnitudes between 1880/98 and 1951/89. If this is indeed correct, suggest it be made clear that the authors are aware of this switch in relative magnitudes, i.e. that it is not a typo.
Page 9, Lines 17/35 & Fig 2.1 - It would be interesting to know what new data led to the significant cold anomalies in 1917/18. The influenza people claim to have checked this period and found no unusual cold weather. The people still trying to explain the 20 million odd world wide deaths due to influenza during this period would surely like to know what data these anomalies are based on.
Page 12, Lines 6/34 & Figs 2.4 & 2.5 - From the beginning, the rapid temperature drop in ship data 1902/04 with slow recovery has presented an insoluble problem. The rapid cooling may have been due to the transition from sailing to steam ships, which changed both the selectivity of observing times and ship routing. In foul (usually cold) weather all hands on sailing ships were busy and few observations could be taken; in warm (usually calm) weather there was little to do, so observations could be taken on schedule. Steam ships both had more latitude in choice of routes (no tacking) and permitted the taking of observations on a regular schedule. This transition was made more rapid by the very rapid increase in total shipping at this time. While this might explain the abrupt relative cooling in the ship data circa 1903, it gives no basis for correcting the (obviously erroneous) greater warming of ship (than land) temperatures for 1904/45 (see Table 2.1). Quite obviously both the cold of 1903/12 and the warmth of 1942/45 were incorrect, but no one has come up with an explanation that permits a credible correction for these periods (see Ellsaesser et al., 1986, Sec. 3.3).
Page 13/14, Lines 12/54 & 1/4 - The use of colder NH continental interiors to explain the coldness of land relative to ship temperatures 1885/95 requires further justification to be credible - why does it not occur during other periods? why does it not occur during the simulations of Fig 2.4? Incorporating the unexplained cold ship temperatures of 1903/12 into the global record pushes the Little Ice Age into the 20th century. It also exaggerates the rate of global warming 1910/45, particularly for ship temperatures (See Table 2.1). It is misleading to combine land and ship data just because it makes the two hemispheres warm the same -- the ship data are far less credible and while corrections are clearly required, no one has corrections that remove all doubts and the corrected curve hides a lot of uncertainties. The ship and land data should be plotted side by side and separately to remind readers of their questionable nature through their lack of agreement. Figs 2.8 & 2.9 give the impression that these uncertainties no longer exist and that all parts of the record are as well known as the last 50 or 20 years. This is definitely not true. Table 2.1 gives a far more credible and probably more accurate picture of the warming of the past 138 years than does Table 2.2. And this results primarily from the combination of land and ship data in a single curve for each hemisphere. Is there some reason why ship data were not presented by individual hemispheres as were the land data? Do the anomalies near 1900 and 1945 on Fig 2.23 look like perturbations that would not have been noticed and remarked upon at the time they occurred?
Page 14, Lines 10/21 & Fig 2.9 -- While Fig 2.9 shows an increasing uncertainty going back in time it does not show the much greater uncertainty of 1904/12 and 1942/45 due to the inclusion of ship data which forces the oceans to warm substantially faster than the land for 1910/45, even in Table 2.2. I seriously doubt you can find any student of climate who is willing to defend this circumstance.
Page 14, Lines 35/37 - As pointed out at Page 4, Lines 11/12, the consistency in SST and land air data across land-ocean boundaries is not accidental; it was forced by using this comparison to develop one set of corrections for the ship data.
Page 15, Lines 48/50 - The global ocean for 10 deg S-60 deg N is primarily tropical. Net anthropogenic forcing AT THE SURFACE in the tropics is substantially less than 0.5 W/m2 due to water vapor overlap.
Page 26, Lines 46/48 - Why is "differential rates of warming of comparable magnitude but opposite tendency" used again? Suggest "rates of temperature change."
References
Ellsaesser et al., 1986, "Global Climatic Trends as Revealed by the Recorded Data," Reviews of Geophysics 24(4), 745-792.
Regards, Hugh W. Ellsaesser.
Subject: Re: Jaworowski's GW article
Date: Mon, 31 Jan 2000 14:52:56 +0000 (GMT Standard Time)
From: Dr Sonja A. Boehmer-Christiansen <S.A.Boehmer-Christiansen@geo.hull.ac.uk>
To: Peter Dietze <091335371@t-online.de>
CC: <long list>, daly@vision.net.au
Thanks, Peter, for pointing out the discussion page about solar influences.
I quote one piece from Jarl Ahlbeck because it raises a question I have an answer to, and more questions.
Jarl:
"There are no general experts on "climate change" either, and that's one reason for the existence of the IPCC. But for some reason that I don't understand, IPCC is a failure. The TAR report is a sad story for a critically thinking scientist. The solar forcing research (for example Friis-Christensen) is not given a fair chance. The reliability of the balloon-satellite temperature records is heavily questioned, probably because they do not show the same tropospheric warming as obtained by the holy computer models. The surface records are criticized too, but not as heavily. The TAR text gives the false impression that the global uptake mechanisms of carbon dioxide are today fairly well known and correctly modeled. These carbon dioxide people portion anthropogenic carbon dioxide here and there around the globe and do not seem to understand much of diffusional mass transfer theory."
Sonja comments:
My answer to the IPCC failure is that it stems from, and is even caused by, its method of funding. Most IPCC-related climate research takes place in big governmental research institutions, or bodies funded by government on a contract basis. This makes for selection and self-censorship, especially if government has decided its policies. A large range of official policies is now 'greenwashed' with reference to the need for combating global warming.
Usually I am in favour of government regulation (so my objection is not ideological, as it seems to be in some quarters), but not when it comes to funding policy-relevant uncertain science.
As far as I know from the USA, UK, and Australia - please disagree - climate research is very directly governmental and usually linked to Met Offices. Even worse, such research may be funded by single departments or ministries, e.g. environment ministries, once these have fixed a policy. Climate protection policy has so many win-wins attached to it by now (even the World Bank is now earning from emissions trading), and the number of R&D people benefiting from carbon sequestration, geological injection and ocean disposal is increasing, not to mention carbon cycle research. So the scientific community in general now loves man-made climate change because of the research opportunities, and cannot afford to contradict policy and policy commitments. Most politicians are now believers in dangerous man-made climate change. Who persuaded them if not people like Sir John Houghton? It is my understanding - purely anecdotal - that if you are a scientist working in a government lab (Los Alamos included), you do not contradict public policy in public. The IPCC Science Working Group is funded by the UK Environment Ministry, and climate policy is the flagship of that Department in the EU and UN. Without pleasing the scientific bureaucracy, science research would not be funded. Isn't that an answer?
Peer review tends to lead to 'cronyism' - unless the peer group membership changes a lot and peer review approval does not have immediate funding implications. Who selects the peers in the IPCC, and has this group changed a lot? I doubt it. It was one of my 'findings' some years ago (see below) that the IPCC selects its reviewers as it chooses (and solar physics just wasn't relevant, not part of Meteorology's interest). The IPCC at the top (its Bureau) consists more of government people than active scientists. If such scientists are present, they represent official research bodies which kow-tow 'to policy' in order to get the next budget allocation. In private, I heard this during many interviews.
I add a few of my refs from the social science literature on the IPCC.
'Climate Change and the World Bank: Opportunity for Global Governance?', Energy & Environment, Vol. 10, No. 1, January 1999.
'A winning coalition of advocacy: climate research, bureaucracy and 'alternative' fuels', Energy Policy, Vol. 25, No. 4, pp. 439-44, 1997.
'Political Pressures in the Formation of Scientific Consensus', Energy & Environment, Vol. 7, No. 4, 1996, pp. 365-375; also in John Emsley (ed.), The Global Warming Debate, London 1996.
'Britain and the Intergovernmental Panel on Climate Change: The impacts of scientific advice on global warming: Integrated policy analysis and the global dimension', Environmental Politics, Vol. 4, No. 1, Spring 1995, pp. 1-18.
'Britain and the Intergovernmental Panel on Climate Change: The impacts of scientific advice on global warming, Part II: The domestic story of the British response to climate change', Environmental Politics, Vol. 4, No. 2, Summer 1995, pp. 175-196.
Sonja
----------------------
Dr. Sonja A. Boehmer-Christiansen
Reader, Department of Geography
University of Hull,
Hull, HU5 6RX, UK
Editor, Energy&Environment, Multi-Science
sonja.b-c@geo.hull.ac.uk
00 44 (0)1482 465349/6341/5384
Fax: 01482 466340
Subject: Re: Jaworowski's GW article
Date: Wed, 2 Feb 2000 11:13:42 +0200
From: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
To: "Sonja A. Boehmer-Christiansen" <S.A.Boehmer-Christiansen@geo.hull.ac.uk>,
<091335371@t-online.de>
Dear Sonja,
Thank you for your mail. I am afraid that you are right (at least to a great extent) about the corruption of science. It is, however, nothing new; remember the numerous academic dissertations on "Rassenhygiene" and "Marxismus-Leninismus", the sponsored university studies showing that poisonous industrial emissions are no problem, and so on. Fortunately, the payer does not always have that much influence on the results, but we must be aware of the unconscious mechanisms involved.
Now the climate change propaganda war has concentrated on showing that there was no "global" Medieval Warm Period and no Little Ice Age. Nobody knows for sure to what extent these historical events were global or not. But on the other hand, the 1900-1998 "instrumental" warming has probably not been global either.
Greeting from the ice and snow, Jarl
Subject: Re: Jaworowski's GW article
Date: Wed, 2 Feb 2000 05:34:28 EST
From: `Wolfgang' Thuene@aol.com
To: jarl.ahlbeck@abo.fi, S.A.Boehmer-Christiansen@geo.hull.ac.uk, 091335371@t-online.de
CC:
Dear all, especially Sonja and Jarl!
Thank you very much for your fruitful thoughts. I have only one question: You all know the theory of atomic spectra. Our present interpretation of spectral lines accounts for the fact that all spectra of gaseuos substances, that is, of free molecules, appear to consist of lines rather than of a continuous sequence of frequencies. If the Earth is nearly a "black body" (cavity radiation), how is it physically possible, that it can be warmed from -18 to +15 (natural greenhouse effect) by itself, that means, by emittet, partially absorbed und than b greenhouse gases re-emittet own invisible infrared radiation. All of you know the "cooling law" of Sir Isaac Newton.
Can you help me please to resolve this discrepancy or paradox???
With the best wishes Wolfgang
Subject: Re: Jaworowski's GW article
Date: Wed, 2 Feb 2000 14:36:03 +0200
From: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
To: Wolfgang <Thuene@aol.com>, <S.A.Boehmer-Christiansen@geo.hull.ac.uk>,
<091335371@t-online.de>
Dear Wolfgang
Thank you for the feedback, and here is a very simple attempt to answer your question. My answer is based on some textbooks in atmospheric science and a couple of scientific reports.
In fact, a black-body temperature of -18 degrees is not correct, because it involves the influence of the reflecting albedo of the polar ice, which, of course, would not be present if the earth had no water on its surface. The "real" black-body temperature is therefore much higher. As long as we have water, the climate is warm. The main heat transfer mechanism is evaporation of water (75% of the surface is water, isn't it?) driven by the incoming short-wave (SW) radiation from the sun. As the water condenses, the same amount of energy is released at a higher level, from which long-wave (LW) radiation can escape back to space. The second most important mechanism is convective heat transfer by means of rapidly moving air masses. The directly outgoing LW radiative heat transfer from the surface is negligible except in situations with extremely cold and dry conditions.
The net emission of LW radiation that is needed to balance the incoming SW radiation therefore takes place from the upper layers of the atmosphere. If we increase the amount of LW-active gases (mainly water vapour, but also carbon dioxide and methane), we change the radiative properties, or the "optical depth", of the upper troposphere. Carbon dioxide alone cannot do much because of the dominating influence of the water molecules. Thus all predictions of GW are based on a hypothetical positive feedback, whereby a little more warming by carbon dioxide creates a little more water vapour at the upper layer due to a shift to another point on the saturation curve. According to the equations presented (unfortunately I have no competence to judge their reliability), this would create a cooling of the stratosphere and a warming of the upper troposphere.
How such a hypothetical warming of the upper troposphere (which in fact has not been measured at all) should be calculated back into a warming at the surface is a difficult (unsolved) problem. The profile can in principle change in many different ways, leading to a warming at the surface (parallel shift), no change at the surface (increased bending), or a cooling at the surface (change of slope). The influence of clouds is still an unsolved problem; a very small increase in cloud cover (due to hypothetically increased water vapour) may work as a proportional regulator (P-regulator), reducing the final GW at the surface to only a small bias, due to the lack of an integrating effect in the "controller" (a negative feedback). In that case, the result would be more rain, but not much warming. The often-heard claim that "dry regions would become even drier" is as false as most of the popular media claims concerning GW.
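The two cases contrasted above can be sketched with the standard linear-feedback relation dT = dT0 / (1 - f): a positive water-vapour feedback (f > 0) amplifies the no-feedback warming, while a negative cloud feedback (f < 0) damps it. This is a generic textbook relation, not taken from Ahlbeck's mail, and the gain values below are invented for illustration.

```python
# Linear feedback arithmetic: the equilibrium response dT to a no-feedback
# warming dT0 under a total feedback gain f (valid only for f < 1).

def equilibrium_warming(dt0, feedback_gain):
    """Equilibrium response dT = dT0 / (1 - f); raises for runaway gains."""
    if feedback_gain >= 1.0:
        raise ValueError("runaway: feedback gain must be below 1")
    return dt0 / (1.0 - feedback_gain)

dt0 = 1.2  # deg C, illustrative no-feedback doubling response
print(f"positive feedback f=+0.5: {equilibrium_warming(dt0, 0.5):.1f} C")
print(f"no feedback       f= 0.0: {equilibrium_warming(dt0, 0.0):.1f} C")
print(f"negative feedback f=-0.5: {equilibrium_warming(dt0, -0.5):.1f} C")
```

With these invented numbers, f = +0.5 doubles the response while f = -0.5 cuts it by a third, which is why the sign of the cloud term matters so much to the final surface figure.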
So it is really a very complex system that cannot be explained by simple radiation equations. The over-simplified explanation of the greenhouse effect is based on a stable atmosphere with continuously decreasing temperature with height. But such an atmosphere would be unstable, because warm air almost always flows upwards! In fact, computer modelling of the atmosphere contains a lot of heavy adaptations and hundreds of (unknown) parameters. The people who play this beautiful simulation game are certainly good scientists, but don't try to force them to explain what they are really doing to people outside the inner circle! They never will, or they have lost control over the computer, which has started to live a life of its own.
For weather forecasts, different types of models are adapted to reality (parameter values are calculated continuously from measurements) by millions of on-line measurements, giving a reliable forecast for a couple of days. For estimating climate change, the computer cannot help much if the basic mechanisms of the climate system are still unknown.
regards, Jarl Ahlbeck (D.Sc., env. sci.), Abo Akademi University (The Swedish University of Finland)
Subject: Re: Jaworowski's GW article
Date: Wed, 2 Feb 2000 13:16:27 +0000 (GMT)
From: "Sonja A. Boehmer-Christiansen" <S.A.Boehmer-Christiansen@geo.hull.ac.uk>
To: Jarl Ahlbeck <jarl.ahlbeck@abo.fi>
I could hug you, Jarl! What you explain below is exactly what I learnt as an undergraduate doing climatology in the geography department at Adelaide University (have 'done' no more since), though I had forgotten it. Latent heat and convection, not radiation, warm us from below... I think this was the simple formula taught to Arts students. Basing world climate predictions on black-body physics is not wrong, but it is only a small part of the truth. Am I right to assert that the GCMs are based on this inadequate truth?
Sonja
Subject: Re: Jaworowski's GW article
Date: Wed, 2 Feb 2000 14:21:01 GMT
From: richard@courtney01.cix.co.uk (Richard Courtney)
To: "Sonja A. Boehmer-Christiansen" <S.A.Boehmer-Christiansen@geo.hull.ac.uk>
Dear Sonja:
Yes. You are right. And Jarl's explanation/summary of the primary atmospheric heat transfer mechanisms is excellent.
I have often asked the simple questions, "What is the magnitude of the local greenhouse effect when it is raining ? And what do the GCMs say it is ?"
All the best Richard
Subject: sensitivity history
Date: Wed, 2 Feb 2000 11:56:02 -0500
From: Jack Barrett <100436.3604@compuserve.com>
To: <big CC list>
Dear Friends,
Jarl has re-opened the can of worms regarding sensitivity (4 W/m2 on doubling the carbon dioxide content of the atmosphere). The 1994 IPCC publication (page 174) gives the reference for the calculation as Shine (1991) in the Journal of the Atmospheric Sciences, Vol 48, page 1513. This does not contain the standard narrow-band code referred to. All it contains are references to the Malkmus narrow-band model (Kiehl and Ramanathan, J. Geophys. Res., 88, 5191 (1985)) and the water vapour continuum code (Edwards, Proc. Soc. Photo. Instrum. Engr., Modelling of the Atmosphere, 298 (1988)) on which the calculations are based. I'll keep digging in the hope of getting something real.
With regard to the point about radiative transfer versus convection (and there's a lot of that about), the accepted energy balance diagrams show quite clearly how the energy transfer from the Earth's surface outwards is composed: 15% sails out to space through the IR window, 10% goes by sensible heat transfer to the lower atmosphere, 56% by convection, and only 19% by net radiative transfer. If it is cloudy and/or raining, the radiative transfer proportion must be minimal, so not much global warming enhancement in Manchester!
Best wishes Jack
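Jack's quoted partition can be sanity-checked, and his cloudy-sky point illustrated, with toy arithmetic. The percentages are his figures as given above; the "cloudy" rebalancing (shutting the two radiative channels and renormalising) is an invented illustration, not a measurement.

```python
# Jack's partition of energy leaving the surface (percent, as quoted).
partition = {
    "IR straight out through the atmospheric window": 15,
    "sensible heat to the lower atmosphere": 10,
    "convection (incl. latent heat)": 56,
    "net radiative transfer to the atmosphere": 19,
}
assert sum(partition.values()) == 100  # the quoted shares do sum to 100%

def rebalance(partition, closed):
    """Zero the named channels and renormalise the rest to 100%."""
    kept = {k: v for k, v in partition.items() if k not in closed}
    total = sum(kept.values())
    return {k: round(100.0 * v / total, 1) for k, v in kept.items()}

# If cloud/rain shuts the IR window and the net radiative channel,
# the remaining channels must carry the whole load:
cloudy = rebalance(partition, {
    "IR straight out through the atmospheric window",
    "net radiative transfer to the atmosphere",
})
print(cloudy)
```

Under this toy rebalancing convection ends up carrying about 85% of the surface loss, which is the sense of "not much global warming enhancement in Manchester".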
Subject: Jarl's good explanation--some additional thoughts
Date: Wed, 2 Feb 2000 10:15:18 -0700
From: Chick Keller <cfk@lanl.gov>
To: "Sonja A. Boehmer-Christiansen" <S.A.Boehmer-Christiansen@geo.hull.ac.uk>
CC: Jarl Ahlbeck <jarl.ahlbeck@abo.fi>, etc.
Sonja,
Thank you for eliciting this tutorial. Jarl does a nice job of introducing the water vapor/latent heat/cloud/precipitation elements of the global heating dynamics. In fact, however, what he has done is make an excellent case for the necessity of large-scale computer models. As imperfect as they are, we have no choice but to turn to them. There are so many competing processes, rates, forcings, feedbacks, etc., that one needs a computer to keep track--to get a quantitative answer to our qualitative questions. One can, of course, throw up one's hands and say it's all too hard, and the computer models will never do the job. But from my 35 years of computer modeling in fluid dynamics and radiation transfer, I am a bit more optimistic. This comes from looking as closely at what the models can do as at what they can't do. The cup, I suggest, is more than half full, although it has a ways to go before it's full enough for us to unreservedly endorse it.
Right now, Jarl correctly points out that the main problem with the models is the treatment of vertical (as well as horizontal) distributions of water vapor and the relative importance of clouds and their complex radiative interactions. Much more needs to be done to increase our understanding of the formation of cloud droplets and ice crystals. But some things are pretty well in hand. Numerical modeling of fluid dynamics (the science of how materials move) is pretty sophisticated. Thus the convective motions, Rossby wave motions, etc. are well enough in hand to inspire confidence. Despite what some critics are saying, I think we can pretty well simulate clear-sky conditions (the radiative transfer part). But clouds continue to be a big uncertainty, with many inconsistencies arising from the models (and the modelers will readily admit their problems here).
However, modeling, and how modelers determine whether what they're getting is right or not, is something most others don't have a feel for, and it's hard to explain in a short note. If you were able to follow a modeler around for a month, you'd feel much better about what is being done, but you might feel much worse about parts of it.
Right now, global models can reproduce most of the gross features of the climate, and the most recent runs with the latest improvements are looking very promising. But the largest problem to me is the very coarse resolution of these simulations. We're usually forced to run with grid cell sizes of 100-400 km in the horizontal. Such large zones are just too large to capture the processes we need to have in the codes. And they can't make use of the better physics we already know.
Why don't we run at finer resolution--say 30 km or with variable resolution to capture topography, etc? The simple answer is that here in the US, people don't care enough to give the modelers what they need. The computer that could do this could be in place now. It would cost less than a single fighter aircraft!! Why don't we provide our scientists with such a machine? You tell me. I think it's a scandal.
So, yes the models and codes need improvement, but when you run them with all their imperfections over a variety of scenarios with a variety of assumptions you get that greenhouse warming can't be ignored as a significant forcing compared with the observed warming.
One last comment--people have been asking how long you need to observe a warming trend before it is significant--20 years? 30 years? I submit that no time period is long enough if we don't understand what's going on. Without better computer models, we never will. There are just too many things going on and we can all discuss qualitatively each of the forcings and responses, but the quantitative answer that we must have will elude us.
Here at Los Alamos we're hoping to contribute to the water vapor problem in the near future. Wish us luck and perspective.
Regards,
Charles "Chick" F. Keller,
Institute of Geophysics and Planetary Physics/University of California
Mail Stop MS C-305
Los Alamos National Laboratory
Los Alamos, New Mexico, 87545
cfk@lanl.gov
Phone: (505) 667-0920 FAX: (505) 665-3107
http://www.igpp.lanl.gov/climate.html
Subject: Re: sensitivity history
Date: Thu, 3 Feb 2000 11:50:24 +1300
From: "VINCENT GRAY" <vinmary.gray@paradise.net.nz>
To: <big list>
Dear Folks
I have been reluctant to enter this particular chat show, but my hairs bristle when I hear the word "sensitivity".
I have had no response whatsoever to my major criticism of the IPCC and of all modellists (including Jarl Ahlbeck and Chick Keller): that they are scared of statistics. They blithely go on about whether a particular model is "better" or "worse" than another, but they never try to put these utterly unscientific qualitative opinions to a scientific test.
It is usual in all other scientific disciplines except climate modelling to use statistical tests to find out whether models represent real data.
There is no model that is accompanied by a scientifically derived measure of its uncertainty. Each parameter and each "parameterization" has an uncertainty distribution function which is difficult to define, but which is usually broad. These have to be combined into an overall probability distribution to give the statistical uncertainty that should be associated with the model predictions, but it is never done. I suspect that the arguments between the different modellists all lie within the gross uncertainty associated with all the models.
Instead of a proper scientific study of the uncertainties of the models we get a poor relation, "sensitivity analysis". This takes only one aspect of uncertainty, such as the differences between different modellists or the effect of altering one parameter, with the object of pretending that the problem of uncertainties has been dealt with.
It is assumed by the IPCC and all modellists that models have a single-value output. The IPCC "range" of 1.5 to 4.5 for climate sensitivity is based on this delusion. The "fingerprint" and "pattern recognition" studies perpetuate it.
They are right to be scared. A proper scientific study of uncertainty in models would probably show that all of them can model almost any climate feature with an equally low level of probability.
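The combination of parameter distributions Vincent asks for is, in principle, a Monte Carlo exercise. The sketch below propagates invented parameter ranges through a toy feedback-gain formula; neither the "model" nor the ranges come from any GCM, but it shows how broad parameter uncertainty maps into a broad output distribution.

```python
# Toy Monte Carlo propagation of parameter uncertainty through a model.
# The 'model' (a standard feedback-gain formula) and the parameter
# ranges are invented for illustration only.
import random

random.seed(1)

def toy_sensitivity(f_wv, f_cloud):
    """1.2 K no-feedback warming amplified by two feedback factors."""
    return 1.2 / (1.0 - f_wv - f_cloud)

samples = sorted(
    toy_sensitivity(random.uniform(0.2, 0.5),    # assumed water-vapour feedback
                    random.uniform(-0.3, 0.3))   # assumed cloud feedback (sign unknown)
    for _ in range(10_000)
)
lo, med, hi = samples[500], samples[5000], samples[9500]
print(f"5th pct {lo:.1f} K, median {med:.1f} K, 95th pct {hi:.1f} K")
```

Even these modest, invented parameter ranges produce a 90% interval spanning roughly a factor of three, which is the kind of overall uncertainty statement Vincent says is never published alongside model predictions.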
Vincent Gray
75 Silverstream Road
Crofton Downs
Wellington 6004 New Zealand
Phone/Fax 064 4 4795939
Email vinmary.gray@paradise.net.nz vincegray@xtra.co.nz
Subject: Re: sensitivity history
Date: Wed, 2 Feb 2000 17:12:54 -0700
From: Chick Keller <cfk@lanl.gov>
To: "VINCENT GRAY" <vinmary.gray@paradise.net.nz>,
<et al>
Dear Folks,
I feel like a tennis player with 6 or 7 people on the other side of the net hitting balls over.
Perhaps somebody else could field this one. The short answer is that when you are doing simulations with very large codes with many parameters, forcings, interactions, etc., evaluating sensitivities (if I understand what Jack means by sensitivity) is about the best you can do. (One note: climate sensitivity experiments where the amount of feedback can be dialed in are done with energy balance models. The big "first principles" coupled ocean/atmosphere codes don't do that, although they can "tune" different processes to be more or less effective. So it is important to keep in mind which of these two approaches the person is discussing.)
In actuality the codes are so sensitive to variations in some of their parametrizations that they must be tuned carefully, else they won't reproduce the observed climate. Now, if they have to be tuned to values of a parameter that are clearly incorrect, it's back to the development board. But in general the tuning is within the uncertainties allowed by the observations. This situation is rapidly changing: the farther back you go in time (1990, 1985), the more heavy-handed the tuning and corrections were. The most recent models are less and less dependent on such stuff. Still, determining sensitivities to processes that are not well known is a very good way to decide how uncertain our outcomes are.
Perhaps someone else more familiar with the climate codes than I could improve on this. I did talk to some of our other modelers who are studying continental drift, convection in the earth's interior, etc. and they say their community does pretty much the same thing, so perhaps Jack could give us some examples of what modelers he knows, who do it right, are doing.
Regards,
Charles "Chick" F. Keller,
Subject: Re: Jaworowski's GW article
Date: Wed, 02 Feb 2000 17:18:14 -0800
From: Hugh Ellsaesser <hughel@home.com>
To: Wolfgang <Thuene@aol.com>
CC: <big list>
Dear Thuene, (Wolfgang)
The best way to get a concept of the GHW phenomenon is to remember that the outgoing IR radiation of the planet is emitted near the altitude where the optical depth of the absorber/radiator reaches a value of unity when integrated from outer space inward.
The wavelength of maximum energy, or color temperature, of the earth corresponds to about 255K; that means most outgoing radiation originates near an altitude where the temperature is 255K, or 33K lower than the surface temperature. Very little of the IR going out to space originates from the surface and makes it up through the atmospheric window without absorption and reradiation.
The principal radiator is water vapor and the first optical depth of H2O is around 10 km. So the humidity of the atmosphere between this level and the tropopause is one of the major factors in the equation and one that is almost completely unknown from current observations.
It is my argument that any surface warming will lead to intensification of the Hadley circulation; that is, not only will the ITCZs intensify, but also the downwelling over the subtropics, which creates our present deserts. This will force the outer optical depth of water vapor over the subtropics to a lower and warmer altitude, allowing even greater outflow of IR to space from this nearly half of the globe; i.e., over these regions the water vapor feedback will be negative. Since the water vapor feedback accounts for 30 to 60% of the radiative forcing, this will significantly reduce the computed 4 W/m2 of radiative forcing.
Regards, Hugh.
Subject: Re: sensitivity history
Date: Fri, 4 Feb 2000 09:05:56 +1300
From: "VINCENT GRAY" <vinmary.gray@paradise.net.nz>
To: Chick Keller
Dear Chick
<side remarks omitted> ... a sensitivity analysis on a minor parameter is "the best that you can do", as it diverts your attention from the overwhelming uncertainties in the models provided by the really important parameters, such as water vapour, clouds and aerosols. The authors of Chapter 5 of the draft TAR were the only ones in that report to realise this when they said
"the net forcing of the climate over the last 100 years (and since pre-industrial times) may be close to zero or even negative" (Chapter 5 page 50, Lines 8,9)
It will be interesting to see whether a truthful statement such as this survives processing.
The woeful ignorance of statistical methods amongst climate scientists is well illustrated by the recent Report of the National Science Council, which, given the task of trying to reconcile surface temperature readings with those from the MSU satellites, did not think to carry out a statistical correlation exercise between the two.
If they had done so, they would have found that there is a good correlation between the two sets of measurements for such climate events as volcanoes (El Chichon, Pinatubo), El Niño, and solar variability, but that the MSU record does not pick up the steady temperature increase shown by many of the surface stations. That increase is therefore an artifact of human improvements in heating and buildings around a proportion of the measurement sites, not of the greenhouse effect.
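The exercise Vincent proposes (correlate the two records after detrending, then compare the trends themselves) can be sketched in a few lines. The two series below are synthetic: a shared ENSO/volcano-like signal plus different linear trends and noise, invented purely to show the method, not real surface or MSU data.

```python
# Correlate two temperature series after removing linear trends, then
# compare the trends. Synthetic data for illustration only.
import math
import random

random.seed(0)
N = 240  # 20 years of monthly anomalies

shared = [0.3 * math.sin(2 * math.pi * t / 45) for t in range(N)]  # shared events
surface = [shared[t] + 0.0015 * t + random.gauss(0, 0.05) for t in range(N)]
msu     = [shared[t] + 0.0002 * t + random.gauss(0, 0.05) for t in range(N)]

def slope(y):
    """Least-squares trend per time step."""
    n = len(y); xm = (n - 1) / 2.0; ym = sum(y) / n
    num = sum((i - xm) * (yi - ym) for i, yi in enumerate(y))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

def corr(a, b):
    """Pearson correlation coefficient."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

sa, sm = slope(surface), slope(msu)
detrended_r = corr([surface[i] - sa * i for i in range(N)],
                   [msu[i] - sm * i for i in range(N)])

print(f"detrended correlation: {detrended_r:.2f}")
print(f"trends: surface {sa*120:.3f} K/decade, MSU {sm*120:.3f} K/decade")
```

The detrended correlation comes out high while the trends differ, the pattern Vincent describes: agreement on events, disagreement on the long-term rise.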
The Committee expressed the opinion that the regional distribution of temperature variability was similar for the MSU and surface records, but they included only the MSU map and conveniently omitted the regional map of the surface changes, so that people would not notice that they are wrong.
Happy Landings! Vincent Gray
Subject: Re: sensitivity history
Date: Thu, 3 Feb 2000 14:15:20 -0700
From: Chick Keller <cfk@lanl.gov>
To: "VINCENT GRAY" <vinmary.gray@paradise.net.nz>
Dear Vincent,
thanks for the comments. Answers below follow your comments IN CAPS for ease of finding.
<some side remarks omitted>
in the thought that a sensitivity analysis on a minor parameter is "the best that you can do", as it diverts your attention from the overwhelming uncertainties in the models provided by the really important parameters, such as water vapour, clouds and aerosols. The authors of Chapter 5 of the draft TAR were the only ones in that report to realise this when they said
"the net forcing of the climate over the last 100 years (and since pre-industrial times) may be close to zero or even negative" (Chapter 5 page 50, Lines 8,9)
INTERESTING, I'LL HAVE TO GET A COPY AND SEE WHAT THE CONTEXT OF SUCH AN EXTRAORDINARY STATEMENT IS.
It will be interesting to see whether a truthful statement such as this survives processing.
The woeful ignorance of statistical methods amongst climate scientists is well illustrated by the recent Report of the National Science Council, which, given the task of trying to reconcile surface temperature readings with those from the MSU satellites, did not think to carry out a statistical correlation exercise between the two.
If they had done, they would have found that there is a good correlation between the two sets of measurements for such climate events as volcanoes (El Chichon, Pinatubo), El Niño, and solar variability, but the MSU record does not pick up the steady temperature increase shown by many of the surface stations, which is therefore an artifact due to human improvements in heating and buildings around a proportion of the measurement sites, and not to the greenhouse effect..
WHO WAS IT WHO SAID: THERE'S LIES, DAMNED LIES, AND STATISTICS!!
I DON'T SUBSCRIBE TO THIS, BUT I DO KNOW THAT MINDLESS STATISTICAL STUDIES CAN BE VERY MISLEADING, AS THEY ARE IN YOUR EXAMPLE ABOVE.
I WILL REPEAT WHAT I'VE WRITTEN TO JOHN DALY'S LIST OVER THE PAST FEW WEEKS ABOUT WHAT WE'RE FINDING BY LOOKING CAREFULLY AT MSU VS SURFACE.
1. IN ONLY 2 OF THE FIRST 13 YEARS OF THE MSU RECORD DID THE SURFACE RECORD SHOW TEMPERATURE ANOMALIES LARGER THAN THE SATELLITE!!
2. YOU ARE CORRECT THAT STATISTICS SHOULD SHOW CORRELATIONS WITH ENSO AND LARGE VOLCANOES (I HAVEN'T FOUND MUCH WITH SOLAR VARIATION), BUT YOU DRAW THE WRONG CONCLUSION FROM THE LACK OF CORRELATION WITH "STEADY TEMPERATURE INCREASE".
3. THE REASON YOUR STATISTICS DON'T SHOW A STRONG CORRELATION BETWEEN SATELLITE AND SURFACE STEADY TEMPERATURE INCREASE LIES MOST PROBABLY IN THE FACT THAT THEY ARE THROWN OFF BY THE 5-YEAR EXTRA COOLING OF THE MIDDLE AND UPPER TROPOSPHERE RELATIVE TO THE SURFACE FOLLOWING THE PINATUBO ERUPTION. ABSENT THAT COOLING, THE TWO TRENDS AGREE PRETTY WELL. IN FACT FROM 1992 TO THE PRESENT THE TWO TRENDS ARE NEARLY IDENTICAL!!
4. AND FROM POINT #1 IT IS ILLOGICAL TO CONCLUDE THAT THE REASON FOR THE TREND DISAGREEMENT IS DUE TO URBAN HEAT ISLAND EFFECT. IF THE SURFACE RECORD IS GIVING TEMPERATURES THAT ARE TOO HIGH DUE TO UHI, THEN HOW TO EXPLAIN THAT THE SATELLITES RECORDED EVEN HIGHER TEMPERATURES UNTIL THE ERUPTION OF MT. PINATUBO AND AGAIN DURING THE EL NINO OF 1998? YOU CAN'T HAVE IT BOTH WAYS. THE SATELLITES ARE CLEARLY SHOWING THAT UHI EFFECTS MUST BE SMALL.
INSTEAD, WHAT WE'RE FINDING IS THAT OVER THE PAST 50 YEARS (USING NCEP REANALYSIS DATA) THE GENERAL RULE IS THAT MIDDLE AND UPPER TROPOSPHERE IS WARMER THAN SURFACE DURING EL NINO AND COOLER THAN SURFACE DURING LA NINA, (EXCEPTING IMMEDIATELY AFTER PINATUBO 1992-1996). (WE'RE FURTHER FINDING THAT THE UPPER TROPOSPHERE WARMS MORE THAN THE LOWER TROPOSPHERE DURING EL NINO AND COOLS MORE DURING LA NINA).
THUS FROM BOTH THESE OBSERVATIONS, WHATEVER THE REASON FOR THE TRENDS NOT AGREEING, IT'S NOT DUE TO UHI, AND IT'S PROBABLY DUE TO SOME SHORT-TERM (5 YR) EFFECT CAUSED BY CHANGES IN THE TROPOSPHERE AFTER PINATUBO'S ERUPTION.
5. AS ONE POSSIBLE CAUSE, OR AT LEAST STRONG CONTRIBUTOR TO THE EXTRA TROPOSPHERIC COOLING DURING THIS 5-YEAR TIME, WE'RE LOOKING AT THE SATELLITE RECORD FOR THE LOWER STRATOSPHERE, WHICH SHOWS A DRAMATIC STEPWISE DROP IN TEMPERATURE AFTER PINATUBO (FOLLOWING A FEW MONTHS OF WARMING DUE TO THE DUST). WE FURTHER NOTICE THAT DURING THIS TIME THE UPPER TROPOSPHERE IS UNCHARACTERISTICALLY COOLER THAN THE LOWER TROPOSPHERE (EL NINOS GOING ON), ALSO POINTING TO A STRATOSPHERIC INFLUENCE. WE NOTE THAT SEVERAL MODELING RESULTS SHOW A SIMILAR EFFECT, BUT NOT ENOUGH, POSSIBLY DUE TO INADEQUATE TREATMENT OF THE STRATOSPHERIC COOLING.
BOTTOM LINE, STATISTICS AND RELATED STUDIES ARE ONLY AS GOOD AS THE ASSUMPTIONS THAT GO INTO THEM. IF, AS WE SUSPECT, THAT 5 YEAR PERIOD IS AN ANOMALY, THEN IT CAN EASILY BIAS THE STATISTICAL RESULT ON LONG TERM TRENDS.
IN FACT IN EVALUATION OF ACCURACY OF CLIMATE CODES SIMILAR CONSIDERATIONS OFTEN RENDER PURE STATISTICAL CORRELATION STUDIES INAPPROPRIATE. MORE ABOUT THAT IN A LATER POSTING.
The Committee expressed the opinion that the regional distribution of temperature variability was similar for the MSU and surface record. but they included only the MSU map, and conveniently omitted the regional map of the surface changes, so that people would not notice that they are wrong.
HAVE YOU SEEN THE LATEST PLOTS OF GLOBAL TEMPERATURES OVER THE PAST 50 YEARS WITH THE FSU (FORMER SOVIET UNION) REMOVED? SAME WARMING TRENDS!! JUST SLIGHTLY LOWER TEMPERATURES ALL ALONG......
Happy Landings! Charles "Chick" F. Keller,
Subject: Re: Jaworowski's GW article
Date: Thu, 3 Feb 2000 10:54:14 +0200
From: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
To: "Sonja A. Boehmer-Christiansen" <S.A.Boehmer-Christiansen@geo.hull.ac.uk>
<...> you wrote:
Basing world climate predictions on black-body physics is not wrong, but it is only a small part of the truth. Am I right to assert that the GCMs are based on this inadequate truth? - Sonja
At least the old GCM models (about which I have more information than the new ones) perform a so-called "convective adjustment". A vertical temperature profile calculated theoretically from radiative heat transfer, which due to convection is very different from a real (measured) profile, is simply replaced by a realistic profile, and then the same calculations (based on radiative heat transfer) are continued starting from the corrected profile. In other words, radiative heat transfer gives the wrong profile. But no problem, let's adjust the profile so that we can go on computing interesting climate changes due to changes in radiative properties.
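A toy version of the adjustment step Jarl describes might look like the sketch below: wherever the radiatively computed profile cools with height faster than a critical lapse rate, it is reset to that rate. The profile and the 6.5 K/km criterion are illustrative only; real schemes also conserve energy when they adjust, which this sketch does not attempt.

```python
# Toy 'convective adjustment': cap the lapse rate at a critical value.
CRITICAL_LAPSE = 6.5  # K per km, roughly the mean tropospheric lapse rate

def convective_adjustment(temps_K, dz_km=1.0):
    """Sweep upward, capping the lapse rate at CRITICAL_LAPSE."""
    adjusted = list(temps_K)
    for i in range(1, len(adjusted)):
        coldest_allowed = adjusted[i - 1] - CRITICAL_LAPSE * dz_km
        if adjusted[i] < coldest_allowed:   # super-critical lapse: unstable
            adjusted[i] = coldest_allowed   # replace with the critical profile
    return adjusted

# A radiative-equilibrium-like profile that cools too fast with height:
radiative = [288.0, 275.0, 266.0, 259.0, 253.0]
print(convective_adjustment(radiative))
```

The radiative profile drops 13 K in the first kilometre; after adjustment no layer pair exceeds the critical 6.5 K/km, which is Jarl's point that the radiative calculation alone gives an unstable, unrealistic profile.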
As Richard Lindzen has put it: "The GCM models are just experimental tools, and now these tools are forced to make predictions that they are not able to make." When I worked as a computer modeler for the chemical industry (I made my first simulation model in 1969, using FORTRAN, for a simultaneous heat and mass transfer process on an IBM 1130 computer), nobody would have taken responsibility for the outputs of a model with as unreliable a structure as a GCM.
There is nothing wrong with the GCM modelers; they do the best job they are able to. The problem is that too many people believe in the unreliable predictions. The problem is thus not scientific, it is political.
have a nice day, Jarl
Subject: Re: Jaworowski's GW article
Date: Thu, 3 Feb 2000 10:25:42 GMT
From: richard@courtney01.cix.co.uk (Richard Courtney)
To: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
Dear Jarl:
I write to support your comments to Sonja, when you said:
"As Richard Lindzen has said it: "The GCM models are just experimental tools, and now these tools are forced to make predictions that they are not able to.." "
and
"There is nothing wrong with GCM modelers, they do the best job they are able to. The problem is, that too many people believe in the unreliable predictions. This problem is thus not scientific, it is political."
You may recall that in an earlier circulated email to Sonja (22 October 1999) I wrote:
"The sample range for GCMs has two components. One is the range of observed climate systems that the models accurately emulate, and the other is the maximum possible range of values of the physical constants put into the models. The latter of these two ranges permits error limits to be stated for the calculations. Indeed, no calculation result has any scientific value unless the inherent error of the result is stated. Inherent errors are not quoted for outputs of the GCMs and this omission alone is sufficient to declare the GCM results to be worthless [for predictive purposes].
The inherent errors of GCM predictions are not provided because they are not known. GCMs use finite error and finite difference iterations to achieve stability. Errors compound with each iteration, and the accumulated error depends on the number of iterations conducted. Unfortunately, the number of conducted iterations is not known and it is hard to see how it could be known.
I again stress that I am a great admirer of the GCMs and their constructors. The GCMs are one of the greatest achievements of mankind. They represent our best understanding of the climate system, and the GCMs offer indications of where our understanding of climate needs improvement because their predictions of climate behaviours can be compared to behaviours of the real climate. However, I am disgusted by the misuse of the models for predicting changes for which they are not validated. This misuse of the models and their outputs is not merely "junk science" (as Jarl Ahlbeck comments); it is an attack on science."
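The error-compounding argument in the quoted passage can be put in numbers: if each iteration contributes a small relative error eps, after n iterations the accumulated bound can grow like (1 + eps)^n - 1. The eps value below is invented; the real per-step error of a GCM is unknown, which is exactly the complaint.

```python
# Worst-case compounding of a per-iteration relative error eps.
def compounded_error(eps, n):
    """Accumulated relative error bound after n iterations."""
    return (1.0 + eps) ** n - 1.0

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} iterations: {compounded_error(1e-5, n):7.4f}")
```

With eps = 1e-5 the bound is about 1% after a thousand iterations but exceeds 170% after a hundred thousand, showing why the (unknown) iteration count matters to any honest error statement.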
I am writing to support your comments. Therefore, I now yet again say that I think the GCMs are one of the greatest achievements of mankind. They represent our best understanding of the climate system, and the GCMs offer indications of where our understanding of climate needs improvement because their predictions of climate behaviours can be compared to behaviours of the real climate.
The misuse of the GCMs is a scandal, and I fail to understand why so few people share my anger at it.
All the best Richard
Subject: sensitivity history
Date: Thu, 3 Feb 2000 15:56:50 +0100
From: "Volz, Dr. Hartwig" <Hartwig.Volz@rwedea.de>
To: 'Jack Barrett' <100436.3604@compuserve.com>
Dear Jack,
I would like to mention an old-fashioned piece of science generated when there was not yet any discussion about a catastrophe lurking around the corner (Kunde et al., 1974, Journal of Geophysical Research, Vol. 79, pages 777-784). The authors describe an experiment where they
* measured an IR-earth-emission-spectrum from space
* measured surface temperature
* measured atmospheric lapse rates
* measured concentration of certain IR-active gases, excluding ozone
* calculated the IR-earth-emission-spectrum using the then-accepted radiative transfer code and the then-accepted spectroscopic data base.
[A figure, not reproduced here, compares the observed and calculated results; the calculated spectrum is labelled "theoretical".]
There is indeed very good agreement between the two spectra, which shows that both the radiative transfer code and the spectroscopic data base are of good quality. My conclusion: to me it just does not make sense to cast doubt on the quality of the codes and data bases again and again. I would agree that radiative transfer is not that easy to understand for somebody not really firm in atmospheric physics. But this does not mean that radiative transfer does not describe atmospheric physics objectively and properly.
A second remark: in the lower part of your communication you describe the net energy flows and net radiative flow roughly correctly close to the surface of the earth, or in Manchester. The point is: the radiative energy flow in the direction of space (radiation out, positive sign) is much higher. However, IR-active gases in the atmosphere also radiate, in all directions, and thus also from atmosphere to earth (radiation in, negative sign). The net radiative flow is the sum of both values, one negative, one positive, the sum being essentially zero close to the surface of the earth. But it is nothing but a mathematical construct (the sum of two values of opposite sign); it is not a physical reality. The radiative energy going out into space (positive sign only) is much higher (compare my calculated example of the man in the balloon in John Daly's Hug debate some time ago: a hundred metres above ground the outgoing radiation is practically unchanged). This seems to be pretty confusing for somebody not familiar with radiative transfer codes. At the top of the atmosphere there is still plenty of radiation left going out into space (with positive sign), see the figure above. But no energy is left there other than radiation.
For a forthcoming presentation I have just prepared a chart with which you can nicely calculate the man-in-the-balloon example (equation (3)). Good luck if you want to try.
Schwarzschild equation:

    dI = -Io*k*rho*dz + B*k*rho*dz   (1)

with dI = differential change in radiation. The first term on the right-hand side is called the "sink function", the second term the "source function".

Modified Schwarzschild equation: with integration, and considering the simplest of all cases, one-dimensional radiation in only one direction (e.g. from the surface of a flat earth in the direction of space), one obtains the following, with transmittance instead of absorption for the first term:

    T [* Io] = (1 - A) [* Io]   (2)

"My greenhouse physics, including e.g. the quantification of forcings or calculation of anthropogenic temperature rises, are based on this equation. This is an equation intuitively easy to understand and very practical to work with." (Schwarzschild 1906; Volz 1997)
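The Schwarzschild equation Hartwig quotes, dI = -I*k*rho*dz + B*k*rho*dz, can be integrated numerically in the simple one-dimensional, single-direction case. All numbers below (surface radiance, source function, absorber strength) are invented for illustration; this is a sketch, not a real radiative transfer code.

```python
# Euler integration of dI/dz = k_rho * (B - I), a grey absorber with a
# constant source function, from the surface upward.
def integrate_upward(I0, B, k_rho, z_top, dz=0.01):
    """March the Schwarzschild equation from z=0 to z_top."""
    I = I0
    steps = int(z_top / dz)
    for _ in range(steps):
        I += k_rho * (B - I) * dz  # sink term -I*k*rho*dz plus source B*k*rho*dz
    return I

I_surface = 390.0  # W/m^2, roughly a 288 K black-body surface
B_layer   = 240.0  # W/m^2, a colder emitting atmosphere
I_top = integrate_upward(I_surface, B_layer, k_rho=0.3, z_top=10.0)
print(f"radiance at the top: {I_top:.1f} W/m^2")
```

The analytic solution is I(z) = B + (I0 - B)*exp(-k*rho*z): at an optical depth of 3 the surface contribution is nearly gone and the outgoing radiation is set by the colder emitting layer, which is Hartwig's point that plenty of radiation still leaves the top, but it no longer "remembers" the surface.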
Having said all this, the 4 W/m² for CO2-doubling is roughly correct.
Best wishes Hartwig
Subject: Re: sensitivity history
Date: Thu, 03 Feb 2000 19:15:31 -0700
From: Steve Hemphill <steve@hemphill.net>
Organization: Earth
To: VINCENT GRAY <vinmary.gray@paradise.net.nz>
Vincent,
Why is it that so many educated people can't seem to understand the simplicity of an insulator with depth? If the system is in equilibrium and the effectiveness of the insulation increases, the temperature response of the outer reaches of the insulator will be opposite to that of the warm side until equilibrium is again attained. What this means for the atmosphere is that if the effectiveness of its insulation increases, the outer reaches will be cooler, as less energy reaches them from inside, until the system again reaches equilibrium. Visualize a partial-vacuum thermos with a 5-watt light bulb and some mass at equilibrium. The temperature of the outer skin will be warmer than the surroundings. Now pump the rest of the air out. The outer skin temperature will drop. When equilibrium (temperature stability inside) is again reached, the temperature of the outer skin will return to the initial temperature. Thus, the stratosphere is cooling because the insulation of the atmosphere is increasing, and we are (and will probably be for centuries) increasing heat storage, until equilibrium.
So, your "therefore" statement needs reexamination.
Chick, do climate models consider the temperature drop happening at the end of the Holocene? Going by the response time of the earth during the Eemian, we should have been dropping in temperature at approximately the same rate as at the end of the Eemian. This is, coincidentally, the rate of change from ~1200 AD until ~1700 AD (during the "Little Ice Age"), after which we stabilized, then began warming. So the initial state is not at equilibrium, but dropping in temperature.
Steve H
Subject: Re: sensitivity history
Date: Fri, 4 Feb 2000 09:12:17 +0200
From: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
To: "Steve Hemphill" <steve@hemphill.net>, "VINCENT GRAY" <vinmary.gray@paradise.net.nz>
Dear Steve and others,
OK on the insulation theory (insulate more, and you get warming inside and cooling at the outer surface), but the main point is that your description is correct only for a steam pipe or anything else where 100% of the energy is created inside the system.
But here you have a more complex system, where 100% of the energy comes from outside. If your inside warming (the light bulb) due to better insulation creates a little more marine stratocumulus cloud, which reflects more of the incoming energy, you have a proportional temperature controller (a P-controller) with a small constant bias (a small warming) and no integral action (only a PI-controller can completely nullify the bias).
In that case you can still have a measurable cooling at the outer surface (in fact you have), but no significant warming inside (in fact no tropospheric warming has been measured).
Do you want the equations too?
In principle it is not impossible to imagine a natural integration effect too, but that would probably be too complicated.
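Jarl's P- versus PI-controller distinction can be sketched numerically (an editorial illustration with arbitrary gains, not the equations he offers): a proportional controller leaves the "small constant bias", while adding integral action drives the offset to zero.

```python
# Sketch of the control analogy (illustrative numbers only): a first-order
# system is pushed by a step disturbance d; a P-controller leaves a
# steady-state offset, while a PI controller integrates it away.
def run(kp, ki, steps=5000, dt=0.01, d=1.0):
    x, integral = 0.0, 0.0
    for _ in range(steps):
        error = 0.0 - x               # setpoint is zero
        integral += error * dt        # integral action (inactive if ki = 0)
        u = kp * error + ki * integral
        x += dt * (-x + d + u)        # plant: dx/dt = -x + d + u
    return x

offset_p = run(kp=4.0, ki=0.0)   # settles at d/(1 + kp) = 0.2, the bias
offset_pi = run(kp=4.0, ki=2.0)  # settles at ~0: integral removes the bias
```

The P-controller's residual offset d/(1 + kp) is the analogue of Jarl's "small warming"; only the integral term can null it completely.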
Simplified explanations are fun, aren't they?
regards, Jarl
Subject: Re: sensitivity history
Date: Fri, 4 Feb 2000 08:51:07 +0200
From: "Jarl Ahlbeck" <jarl.ahlbeck@abo.fi>
To: "Chick Keller" <cfk@lanl.gov>
Dear Chick and others,
After having looked at all kinds of local and global temperature data for many years with the eyes of an experienced statistician, and having performed numerous statistical calculations (only misused statistics creates lies; correctly used mathematical statistics is the basis of all scientific methodology), I have got the impression that the main difference in trend between the surface and the MSU/balloon records originates from the oceans. However, some part could be due to urban surface warming; it is impossible to say how much, because information on exactly how the land-surface data are corrected for urban warming is not available. So the surface record over the oceans shows warming, but the tropospheric record does not.
Three indicators show no tropospheric ocean warming: 1. MSU, 2. Radiosonde, 3. CO2-variation pattern (degassing/absorption).
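The trend calculations underlying these comparisons are ordinary least-squares fits; a minimal sketch on synthetic data (purely illustrative, not real MSU or surface records) shows the computation.

```python
# Ordinary least-squares trend of the kind used to compare temperature
# records (synthetic monthly anomalies, purely illustrative).
import random

def ols_trend(y):
    """Return the slope per time step of a least-squares line through y."""
    n = len(y)
    t = list(range(n))
    t_mean = sum(t) / n
    y_mean = sum(y) / n
    num = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y))
    den = sum((ti - t_mean) ** 2 for ti in t)
    return num / den

# A synthetic 20-year monthly series: 0.01 degC/month trend plus noise.
random.seed(0)
series = [0.01 * m + random.gauss(0.0, 0.2) for m in range(240)]
slope = ols_trend(series)   # recovers a slope close to 0.01 per month
```

Over 240 points the noise contributes very little uncertainty to the fitted slope, which is why even small trend differences between records can be statistically resolvable.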
I think there is enough information available to state a new consensus, for people who believe that consensus has a place in scientific discussion:
".....The hypothetical warming of the lower troposphere for ocean regions predicted by the GCM models cannot be verified by any reliable observation of temperature or any other indicator....."
Let's vote in the real IPCC style (we elect a new president in Finland next week). If you agree, answer YES, if you disagree, please answer NO.
I will write a Finnish press release immediately after receiving the results of the voting.
regards, Jarl