Taken By Storm: The Troubled Science, Policy and Politics of Global Warming

Contents

 

Part I: The Climate Contradiction

Part II: The Jolly Book of Climate: Some of the Physics and Mathematics

The Obsession with T-Rex

(Scientific) Revolution!

Greenhouses Don’t Work by the Greenhouse Effect

Models Without Theory

Detection from Data

Conclusion

Part III: The Heads-Tails Report

 

 

==============================

 

www.takenbystorm.info

 

TAKEN BY STORM

The Troubled Science, Policy and Politics Of Global Warming

 

A Briefing Sponsored by the Cooler Heads Coalition

Senate Dirksen Building Room 403

Washington D.C.

February 27, 2003

 

Prof. Christopher Essex

Department of Applied Mathematics

University of Western Ontario

London Ontario

and Visiting Professor, Ørsted Laboratory

Niels Bohr Institute

Copenhagen, Denmark

 

Prof. Ross McKitrick

Department of Economics

University of Guelph

Guelph Ontario

 

 

 

Taken By Storm:

The Troubled Science, Policy and Politics of Global Warming

 

 

 

Part I: The Climate Contradiction

 

About this time last year, headlines around the world were announcing the amazing news that it is getting colder in Antarctica. You probably heard about it. It was in all the papers, not to mention on the BBC and major North American news outlets. It was a big story because apparently some computer models of the climate had been predicting it should be getting warmer at the South Pole. So why did that make it a big news story? After all, about the same time some computer models predicted Toronto would get a snowfall that never came. That didn’t make news around the world. No one expects computer models of the weather to be that certain. Yet people have come to believe that computer models of climate should be so certain that a discrepancy between prediction and reality for a small region of the planet is a worldwide news event.

 

The truth is we have even less reason to expect certainty from climate models than from models of the weather. As we will explain today, climate is a big-time research issue. The uncertainties are as big and as fundamental as in any area of basic science. We don’t know how to measure or even define this thing called “climate”; nor is there a physical theory to guide scientific work. The familiar concepts on which the climate change discussion has been built, ideas like “global temperature,” the “greenhouse effect” and “radiative forcing,” have little or no physical basis. The system under study is a chaotic process whose media are turbulent fluids, and the physics and mathematics that might guide prediction and understanding do not exist. Computer modeling cannot be done from first principles. Classical statistical methods cannot reliably be applied. It’s quite a situation. We are confronted not just with uncertainty, but with nescience: that old Latin word meaning “not to know.” As to the future behaviour of climate, we are not merely uncertain, we are nescient.

 

We will go over these things today. We will walk you through the physics and even a bit of the math, letting you in on some of the revolution that has been going on at the foundations on which “climate science” is based. Not only are there new results and open secrets that have not been assimilated into the popular and scientific discourse, but they make the notion of certainty about climate a laughable claim.

 

But let’s first begin with a basic observation, not about the strangeness of climate but about the strangeness of the climate change discussion.

 

There are two remarkable features about the climate change policy discussion. First, the underlying phenomena are unusually uncertain. Second, the rhetoric surrounding the issue is built on an unusually forceful claim of certainty. This contradiction should serve as a clue that something has gone wrong in the relationship between science and policy over global warming.

 

If you have been paying any attention to the climate change issue you will have heard these claims of certainty. It began 14 years ago—perhaps in this very room—when a scientist known to you all claimed he was 99% certain global warming was happening and that humans were the cause. Then out of nowhere activist groups like the Environmental Defense Fund began telling their supporters there was a scientific consensus behind global warming, even before most of the research was underway. World politicians across the political spectrum declared “global warming is for real,” and newspaper columnists began calling it a reckless gamble with the planet’s future. The Canadian Environment Minister, David Anderson, wrote us to say scientists had “conclusive proof” that humans are changing the climate. Recently, Rajendra Pachauri, the new chairman of the UN Intergovernmental Panel on Climate Change, summed up the situation in an interview this way:

 

“…the fact is that our climate is changing and the consequences are very serious. Global warming demands dramatic behavioural changes on the part of individuals and societies, and we know that these changes are difficult to accept and to put into practice…. The Flat-Earth Society has only a handful of members today, and they continue to meet every year, to assert that the earth is indeed a slice. It is the same with climate change - you may deny it, but it is a fact.”

 

Last week, the French Prime Minister told a gathering of the IPCC: “Thanks to your work, carried out in total independence, the question of climate change is one of the rare domains where governments can count on a consensus of scientific analysis.”

 

None of the fundamental problems in climate have been solved. In fact, since the 1960s they have become progressively “unsolved” for the scientists working at the foundations on which the whole edifice rests. We are not moving closer to a solution but further away from one. The ongoing revolution in the scientific understanding of complex systems like climate (see Part II below) makes the notion of certainty even more far-fetched than it was in 1988. Empty homage is paid to scientific uncertainty, but only under duress; in no other area of policy does one routinely encounter such ironclad claims of certainty. This is obviously a strange situation.

 

This huge contradiction demands that we look critically at the interface between science and the policymaking process. Our book, Taken By Storm, argues that the connection between science and public policy is disintegrating. Climate change is a particularly acute example, and it is the one we focus on, but it is not the only area of concern. On many topics, especially, it seems, those to do with the environment, we are seeing more and more serious problems concerning the use and conduct of science in the policy process. Just when we need science following a mature, balanced and free path of inquiry, it seems to degenerate into partisan politics, fearmongering and tendentiousness. This is bad for the policymaking process, and bad for science. It is time to begin taking this problem more seriously.

 

The climate change issue illustrates this problem in cartoon-size proportions. A few weeks ago, President Clinton addressed the crowd at a free Rolling Stones concert, and urged them to greater activism to “stop the planet from burning up.” Let’s think about this. Someone wanting to educate people about climate dynamics paid the Rolling Stones to put on a concert. A former US President came and urged the attendees to prevent the combustion of the Earth. Now the Rolling Stones don’t claim to know much about the climate. Mr. Clinton probably thinks he knows a lot, but what he ended up saying was really very foolish. Those in the audience might have wondered why they were seeing thousands of kilowatts of power consumed on a sound and light show to encourage them to use less power. So we have a mixture of apathy, foolishness and the usual contradictions of celebrity activism, all rolled up in a festival of fearmongering and rock and roll. And all this was done to get nonexperts to take strong positions on scientific questions on which experts have no answers.

 

It would be easy to blame this nonsense on the general decline in education standards, or to conjure up stories of conspiracy or political intrigue. But we believe that the people involved in the climate debate really do have the best interests of the world at heart, and moreover there are too many people, working with conflicting goals and motives, to suggest a conspiracy at work. What has happened is more like a socioeconomic game, of the kind John Nash studied. In the famous prisoner’s dilemma, the players try to get away with the great bank heist, and pursue a strategy that seems best to each individual, but end up confessing and ratting each other out. Nash games are like that. The outcome is different from what everyone intended, and no one player is the cause. It’s also a bit like a physical phenomenon called phase-locking, in which separate mechanical systems start beating time with each other, like two clocks placed together on a shelf. Their interactions through the shelf are sufficient to coordinate their mechanisms, although no one intended to calibrate them this way. The players in the global warming game don’t mean to coordinate with one another either, but larger circumstances force them to. They end up beating the same tune: in this case, the relentless drumbeat of false certainty and climate alarmism.

 

Who are the players, and what are their motives?

 

There are six groups we need to distinguish: politicians, regular scientists, official science, the media, environmental groups and industry.

 

First come the politicians. To be fair, most seem to want to make the world a better place. But they need to get elected, and to do this they need to find issues around which to form a coalition that will get them elected. For some politicians, the environmental movement that took off in the 1980s was a good horse to ride. The issues got lavish public attention, seemed to have intense social importance, and were not seen as problems the private sector could resolve. They were perfect for political entrepreneurs. The only problem was that most of the conventional environmental problems like air pollution and water quality were getting better in the industrialized countries. The public was demanding action, but the environment seemed to be getting better on its own. Only in politics would this be a dilemma. The climate change issue presented a neat solution. It’s an environmental problem. It is so complex and baffling the public still has little clue what it’s really about. It’s global, so there’s the added attraction that you get to have your meetings in exotic locations. Policy initiatives could sound like heroic measures to save the planet (“from burning up”), but on the other hand the solutions are potentially very costly. So you need a high degree of scientific support if you are going to move on it. There’s a premium on certainty.

 

Then come the scientists. The regular ones. The ones you never hear about. Research is a personal journey that involves patience, hard work, humility and joy. You make mistakes, and you tend to make them in front of colleagues. You see grand systems of thought build up and then get overturned based on a new insight or some new data. You learn to sit loose to your intellectual convictions, and relentlessly test them against theory and observation. And if you’re lucky, then once in a while you are there to see a genuinely new discovery. This is the experience real researchers crave. But nature does not yield up these rewards easily, and if you want to pry her secrets loose you have to guard your intellectual independence, and remain fluent in the arcane mathematical language of advanced science. This makes political engagement difficult. It also means you are the last person to declare yourself certain when the subject at hand is an open research problem. On such matters certainty, like hubris, runs too close to the line of self-delusion. It’s dangerous territory for the practicing scientist.

 

That being the case, there stands between regular scientists and political decision-makers an intermediary body we call Official Science. Official Science is not science, but it is the layer of professionals that represents science to policy makers. This can include staff of scientific bureaucracies, editors of prominent magazines like Nature and Science, and directors of international panels, like the various UN bodies that work on scientific matters. Those in it represent only a minority of people involved with science, and they are not appointed by scientists to speak on their behalf. In fact, as often as not they are appointed by organizations that have little sympathy for the concerns of scientists. Official Science serves a necessary function, but it really has an impossible job: to strike a compromise between the need for certainty in policymaking and the aversion to claims of certainty in regular science. In regular science, expertise is strictly probationary. Even the most famous and respected scientists can be taken to task and put to the test—if scientific culture is working properly. In Official Science, expertise means authority, in the sense of being authoritarian. Firsthand insight is not so highly valued; instead, authorities define what is true and what is not. If an authority makes a pronouncement, doubting it or suggesting alternatives is not viewed as truth-seeking; it is taken as a challenge to power. So, in Official Science, testing an idea becomes a political struggle. This is not congenial to real scientists, who inevitably pull away from such areas of work, ironically just when their knowledge is most valuable.

 

While scientists are at odds with each other in many disputes, reflecting wide-ranging opinions, Official Science works to present a united front to its clientele of unschooled politicians and journalists. But how does this work when the subject at hand is something like climate, upon which there is a wide range of informed opinion and massive uncertainty on every side? We look to Sir John Houghton, former co-chair of the IPCC, to explain.

 

During the preparation of the reports, a considerable part of the debate amongst the scientists has centred on just how much can be said about the likely climate change next century. Particularly to begin with, some felt that the uncertainties were such that scientists should refrain from making any estimates or predictions for the future. However, it soon became clear that the responsibility of scientists to convey the best possible information could not be discharged without making estimates of the most likely magnitude of the change next century coupled with clear statements of our assumptions and the level of uncertainty in the estimates. Weather forecasters have a similar, although much more short-term responsibility. Even though they may feel uncertain about tomorrow’s weather, they cannot refuse to make a forecast….It has often been commented that without the clear message which came from the world’s scientists, orchestrated by the IPCC, the world’s leaders would never have agreed to sign the [Rio] Climate Convention. (Global Warming: The Complete Briefing, Cambridge University Press, 158-159).

 

This astounding statement by Sir John is the very anatomy of Official Science at work between scientists and politicians. It gathers in normal science in all its tumultuous reality: open debate, dissension and a refusal to make definitive claims where none are warranted. Then it trots off to Capitol Hill or Number 10 Downing St. with a serene and smiling certainty. Debate and dissent are extruded into a “clear message,” in this case orchestrated by Official Science. The reason for this orchestration is simple: without it “the world’s leaders would never have agreed to sign” a treaty. If things were as they should be, leaders would want a treaty because they observe that scientists are in agreement. What happens instead is that Official Science “orchestrates” agreement because leaders want to make a treaty.

 

At the heart of the disintegration between science and public policy is the feedback loop between politicians and Official Science. Politicians must decide how much certainty to declare on an issue; likewise, Official Science must counsel a degree of certainty. They reinforce each other’s opinion. We call this the Convection of Certainty. The more certain are the politicians, the stronger is the message of certainty from Official Science. The stronger the message from Official Science, the greater the degree of certainty from politicians. We explain how this works in Taken By Storm, and provide examples from the global warming file.

 

The media and environmental groups also have roles to play. In the climate case their interventions amplify the positive feedback between politicians and Official Science. We also explain this process in Taken By Storm. Print and TV media play up the worrisome side of climate stories, because worry sells. Environmental groups also have an interest in keeping up the worry; in fact, that’s why they exist. Industry is also relevant here: firms too have interests to defend. In the maelstrom of public opinion over global warming, it is easy to make assumptions about what industry’s strategy should be if it is to defend its interest. But the superficial predictions turn out to be wrong. Many firms are observed lobbying for global warming policy, and some even give money to environmental groups! It sounds odd, but if you want to know why it is happening, you’ll have to read the book.

 

For now it is time to move on to some technical details. We want to convince you that the message of certainty you have been hearing from politicians and Official Science is not warranted. And simply striking another panel, i.e. creating another division of the Office of Official Science, is not going to help. It’s time to break out of the feedback loop and look again at the basics. Taken By Storm challenges the reader to do this, and that is why it is not a light read. But if you are interested in understanding the real challenges facing climate science, it is a necessary read.

 

 

 

Part II: The Jolly Book of Climate: Some of the Physics and Mathematics

 

A recent study drew a connection between red squirrels and global climate that went as follows. Some red squirrels in the Yukon have been observed to have pups earlier in the Spring. Ergo their DNA has changed, ergo some adaptation and Darwinian natural selection is occurring, ergo some change in the environment can be inferred, ergo the Yukon must be warming, ergo global warming! Thus squirrel pups provide yet more proof that this thing called “global temperature” is changing.

 

We call this king of temperatures “T-Rex.”

 

The focus on T-Rex obscures the fundamental complexity and uncertainty of climate, and has prevented proper consideration of some very real obstacles to a scientific treatment of so-called “global” warming. This presentation covers five key topics fundamental to climate science. First come some remarks about the ongoing obsession with T-Rex. Then comes a look at the fundamental revolution in physics arising from the study of chaos. The third topic concerns greenhouses, and why they don’t work according to the “greenhouse effect.” That leads to a look at models, and why they are no substitute for a theory of climate. Finally there are some comments about what climate data can and cannot teach us.

 

The Obsession with T-Rex

 

Temperature is not energy. It is a thermodynamic variable with some special properties that make it far more interesting than it usually gets credit for. Consider that an ordinary laser pointer, powered by small flashlight batteries, generates peak temperatures of about 10^11 Kelvin. Yet you can shine it on your hand and not feel any warmth! This can happen because temperature represents the distribution of energy across physical states, and the fewer the states the higher the peak temperature, even at low energies. The laser distributes a small amount of energy across very few states, which allows the peak temperature to become enormous without being perceptible by touch. Lasers are idealized in thermodynamics as having infinite temperature because ideally the laser radiation would have all of its energy in one quantum state.

 

Convincing people that temperature is not energy is difficult. Many people have heard in their early education that temperature is just internal energy. But it isn’t. In the real world you can change the internal energy of a physical system without changing the temperature, and you can change the temperature without changing the internal energy. This disconnect happens routinely in the natural world around us. Ultimately this has to be so because temperature and energy belong to two fundamentally different classes of thermodynamic variables.

 

Thermodynamic variables are categorized as extensive or intensive. Extensive variables occur in amounts. There is an amount of mass, energy, length, and so forth. Intensive variables refer to conditions of a system, defined continuously throughout its extent. Temperature, pressure, relative humidity, and chemical potential are examples of intensive variables.

 

You can add up extensive variables to get a total amount. In a thermodynamic system there is a total energy, total mass, etc. But this does not work with intensive variables. There is no such thing as a “total” temperature in a system. If you join two identical closed systems, to get the properties of the new system as a whole you add up extensive quantities, but you don’t add up the temperatures. It just doesn’t work that way.

 

The entropy of a system is governed by a function of its independent extensive variables, and the customary approach in thermodynamics is to analyze it by taking its differential. This yields a sum of terms, each consisting of a partial derivative of the entropy function times the differential of one of those extensive variables. Each term is a conjugate pair: the intensive variables are defined in terms of the partial derivatives, and each is said to be conjugate to the extensive variable whose differential it multiplies. Intensive variables are never equated to extensive variables without some extensive quantity mediating the relationship; so it is with temperature and internal energy in an ideal gas. Intensive variables never appear in a summation by themselves, but only as a conjugate with an extensive variable.
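For reference, here is that differential written out for the textbook case of a simple one-component system (a standard thermodynamic identity, supplied here for illustration; it is not spelled out in the talk):

    dS = \frac{1}{T}\,dU + \frac{p}{T}\,dV - \frac{\mu}{T}\,dN

Each intensive quantity (1/T, p/T, μ/T) is a partial derivative of the entropy, and each multiplies the differential of its conjugate extensive variable (internal energy U, volume V, particle number N). Temperature never enters the sum on its own.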

 

It is not essential to understand that point, as long as you grasp that the physical equations involving temperature assign no meaning to a sum of temperatures on their own, only in conjugate pairs with extensive variables. So here is a question: if the mean is the “total T”, divided by the number of observations n, and “total T” (i.e. the sum of some temperature numbers) does not have a physical significance, what does the mean mean?

 

This is not an idle philosophical question. An average is nothing more nor less than a rule made up for an occasion where you want a single value to stand in for a list of values. There is not just one way to do this: there are an infinite number of ways to boil a list of numbers down to a single value. The rule that you should follow depends on the circumstances.

 

For extensive variables the physical circumstance often recommends a mean. But a mean over what? You can compute the average height of people, for example, because it makes geometric sense to imagine laying everyone end to end and finding the total ‘height’ of a group, then dividing that by the number of people in the group. That’s an ordinary arithmetic mean. But the physical meaning of this particular average pertains to this overall length. If the mean height grows, that means that the total height of the group changes. So the mean means something geometric. In other circumstances other means make sense.

 

You could average over the kinetic energy of identical particles by adding up the square of the speeds and not the speeds themselves. If the mean speed-squared increases, then that means the kinetic energy increases, giving a physical meaning to the mean speed-squared computed in this way. You can then just take the square root and get a speed that means something in terms of energy.

 

You could average over the radiation energy of identical volumes in equilibrium by adding up their temperatures to the fourth power, not temperatures to the first power, and only in absolute units. The mean of temperatures to the fourth power is connected in this way directly to the energy. If you take the fourth root of the mean you have a temperature that is connected to the energy.
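Written out, this rule rests on the Stefan-Boltzmann relation between radiative energy density and absolute temperature (u = aT^4, a standard result given here for illustration):

    \bar{T} = \left( \frac{1}{n} \sum_{i=1}^{n} T_i^{4} \right)^{1/4}

This \bar{T} is the single temperature a uniform system would need in order to hold the same total radiative energy as the n volumes together. An arithmetic mean of the T_i has no such connection to the energy.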

 

You can go on with this kind of thinking. For example you could figure out the total resistance in a parallel circuit by adding up the reciprocals of the resistances of the individual components. And so on.

 

If the physical context does not imply one type of average then mathematics steps in with an infinity of ad hoc possibilities. It includes an infinite variety of weights, exponents, algebraic formulas, functions and other mathematical cookware, all chosen to reduce a list of numbers to a single value. When any one of these cookware rules is used, it is normally presumed wishfully that the number produced is “representative.” But in reality an averaging rule can be found to rationalize any value in the list from the largest to the smallest as the “representative.”

 

Only in the case of a particular physical context can we select one from this range that has a specific physical meaning. Everything else is just statistics. Everything else is ad hoc. But no such physical rules are prescribed for intensive variables on their own. For them physics does not provide a rule for averaging them. This is certainly true for temperature observations, the ideal gas law notwithstanding. There is no such thing as a total temperature. It is mathematically possible to do the calculation, but not physically meaningful. Likewise you could take an average over telephone numbers. You could compute an average phone number in Washington DC if you like. But what significance does it have? In what way does it represent the phone system or anything else for that matter? If you dialed the number and asked whoever answers what the temperature is, maybe that could define the average temperature!

 

In the absence of physical guidance, any rule for averaging temperature is as good as any other. The folks who do the averaging happen to use the arithmetic mean over the field with specific sets of weights, rather than, say, the geometric mean or any other. But this is mere convention.

 

This matters not only because you will change your definition of “global average temperature” if you use a different averaging rule, but you can also change the meaning of “warming” and “cooling” themselves. This is no trivial matter, since “warming”—or not—is nearly the entire global warming question.

 

Consider a system consisting of a cup of coffee at 33 degrees C and a cup of ice water at 2 degrees C. What is the “one” temperature that describes both these liquids, at this moment, as they stand? Clearly there isn’t one temperature, there are two. The physics does not say in any way that there is a single temperature for the whole. But if you are interested in climate you might say that there is one temperature anyway, if your thinking has been clouded by the T-Rex obsession. To such a person combining the temperatures of the ice water and coffee into one number would be no different than combining the temperatures of the poles and equator into one number. Of course you can do it. You have an infinity of choices, but the physics doesn’t say which one to use.

 

Fine then: pick an averaging rule. Better yet, pick four out of the infinity of choices. Then ask whether this system is “warming” or “cooling” as the liquids relax to room temperature. As is conventional, cooling will mean our average is declining, while warming will mean the average is rising.

 

The four averages turn out to imply different things:

 

Figure 1: Two liquids, four averages.

 

The arithmetic mean says the system is “warming”. The root mean square says it’s “cooling.” Which is correct? They both are! Take your pick—the physics doesn’t say which one is the right one.
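Here is a minimal sketch of the calculation behind Figure 1, showing two of the four rules. The talk does not specify the room temperature, the relaxation rates or the scale on which the averages are computed; the sketch assumes a 20 degree C room, equal exponential relaxation and averaging on the Celsius scale:

    import numpy as np

    t = np.linspace(0.0, 5.0, 6)                  # arbitrary time units
    coffee = 20.0 + (33.0 - 20.0) * np.exp(-t)    # cools toward the room
    ice    = 20.0 + (2.0 - 20.0) * np.exp(-t)     # warms toward the room

    arith = (coffee + ice) / 2.0                  # arithmetic mean
    rms   = np.sqrt((coffee**2 + ice**2) / 2.0)   # root mean square

    print(np.round(arith, 2))  # starts at 17.50, ends near 20: "warming"
    print(np.round(rms, 2))    # starts near 23.38, ends near 20: "cooling"

Both rules are applied to the same pair of liquids at the same moments; one announces warming, the other cooling.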

 

The situation is more problematic when discussing changes on the surface of the Earth. There we do not have a “room temperature” to tell us where all the temperatures are headed. Furthermore there are not just two temperatures, but an infinite number of temperatures that form a continuous field. Means over temperature observations sampled from the field have no known physical connection to climate, any more than they do for the ice water and coffee. You could have a huge climate change without a given mean changing, and likewise you could have a change in the mean without any change in the underlying system. None of the governing equations of the climate system, so far as anyone knows, takes an arithmetic mean of temperatures as an argument, so we cannot say how temperature statistics are linked to climate.

 

As another example, we took the monthly temperature means for 1979-2001 for 10 stations ranging from Halley, Antarctica to Egedesminde, Greenland. How should these numbers be aggregated? The usual practice is to take an arithmetic mean, which yields Figure 2 below.

 

Figure 2: Global Warming. Arithmetic mean of the monthly temperatures at the 10 stations, 1979-2001; trend +0.17 C/decade.

 

Amidst all the seasonal variation a warming trend of +0.17 C per decade is discernible. Must be global warming!

 

But what if we aggregate the temperatures differently? Suppose we treat each month as a vector of 10 observed temperatures, and define the aggregate as the norm of the vector (with temperatures in Kelvins). This is a perfectly standard way in linear algebra to take the magnitude of a multidimensional array. Converted to an average it implies a root mean square rule. Of course the result can be represented on the original temperature scale too.

 

This was applied to the same data used for Figure 2, and the result is in Figure 3.

 

Figure 3: Global Cooling. Root mean square of the same monthly temperatures (in Kelvins), 1979-2001; trend -0.18 C/decade.

 

Amidst all the seasonal variability a cooling trend of 0.18 C per decade is discernible. Must be global cooling!

 

But wait—which is it? The same data can’t imply global warming and cooling, can they? No, they can’t. The data don’t imply “global” anything. That interpretation is forced on the data by a choice of statistical cookery. The data themselves only refer to an underlying temperature field that is not reducible to a single measure in a way that has physical meaning. You can invent a statistic to summarize the field in some way, but your statistic is not a physical rule and has no claim to primacy over any other rule.
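To make the mechanism concrete, here is a sketch with made-up station data (the actual series behind Figures 2 and 3 are not reproduced here). When cold stations warm a little more than warm stations cool, the arithmetic mean trends up while the root mean square, which gives the warm stations extra weight, trends down:

    import numpy as np

    years = np.arange(30)
    base  = np.array([250.0] * 5 + [300.0] * 5)  # Kelvin: 5 cold, 5 warm stations
    trend = np.array([0.10] * 5 + [-0.09] * 5)   # K/yr: cold warming, warm cooling
    T = base[:, None] + trend[:, None] * years   # 10 stations by 30 years

    mean_series = T.mean(axis=0)
    rms_series  = np.sqrt((T ** 2).mean(axis=0))

    print("mean trend: %+.3f K/decade" % (10 * np.polyfit(years, mean_series, 1)[0]))
    print("rms trend:  %+.3f K/decade" % (10 * np.polyfit(years, rms_series, 1)[0]))

The first trend comes out positive and the second negative, from exactly the same numbers.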

 

Going further, if you want to describe “climate” as “average” weather, what variables would you average? And how do you do the averaging over the dynamics of these variables (which is typically much harder)? You can make up rules for this, but ideally these things should be done in a physically meaningful way.

 

And no one knows how to do this. If you did you would have a theory for climate. We do not have one.

 

(Scientific) Revolution!

 

Let us move on from temperature to the system of which temperature is one feature. It has become commonplace to describe the climate system as chaotic. Chaos is a property of solutions of some nonlinear mappings and differential equations. It is interesting because in chaotic systems, very small changes in initial conditions can cause permanent and large changes in the behaviour of the system. But at the same time these changes are bounded. It is easy to find examples of mathematical systems in which small initial changes cause the system to race off to infinity. But in chaotic systems small changes have large effects, even while these effects stay within set bounds. This is like the natural world: small perturbations set the system on another path, but only within the bounds of the system itself. A change in the wind direction may trigger a series of other changes culminating in a thunderstorm, but not in the sun going nova.

 

One of the simplest examples of a chaotic system is the logistic map. This is a rule that takes a value x at some time t and maps it onto the next period, yielding x(t+1). The formula is x(t+1) = ax(t)(1 - x(t)), where a is a coefficient between 0 and 4; chaotic behaviour appears for values of a from about 3.57 up to 4, interspersed with periodic windows. If you start with x between 0 and 1 it remains bounded between those values. You can use the mapping over and over to generate a sequence of x values, all following from the first value according to this simple, deterministic formula.

 

Now suppose a=4 and you pick two initial values of x and run the logistic map on separate computers. You ought to get the same sequence of values. But let the initial values be slightly different: by less than one in 10,000,000,000,000,000 (ten thousand trillion). Many computers cannot spot numerical differences much smaller. The two sequences would remain the same for about 50 iterations, then they would fly apart. The magnitude of the differences would remain bounded between 0 and 1. But the sequences would become completely disconnected. Figure 4 illustrates this.
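A minimal sketch of this experiment in ordinary double precision (the starting value 0.3 is an arbitrary choice; any typical value in (0, 1) behaves the same way):

    import math

    a  = 4.0
    x1 = 0.3
    x2 = math.nextafter(0.3, 1.0)  # differs from x1 by roughly 6e-17

    for t in range(1, 101):
        x1 = a * x1 * (1.0 - x1)
        x2 = a * x2 * (1.0 - x2)
        if t % 10 == 0:
            print("t = %3d   |x1 - x2| = %.3e" % (t, abs(x1 - x2)))

    # The gap stays tiny for roughly 50 iterations, then grows to order one,
    # while both sequences remain bounded between 0 and 1 throughout.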

 

Figure 4: Differences between two logistic map sequences that differ initially by less than 1 in 10^16.

 

This sensitivity to initial conditions is a pervasive feature of our natural world. But it was only recently in the history of science that we knew it.

 

Suppose you are looking at the 500-period series in Figure 4 and—without knowing how the data were generated—trying to figure out what caused the sudden change in the system at period 50. You would probably look for a “big” cause somewhere around period 49 or 48. Good luck! Would you think to look for a minuscule “cause” on the scale of 10^-16 compared to the scale of the data itself, some 50 periods earlier?

 

Or to put it another way, if you were looking to explain “global warming” in the 1990s, would you think to study the activity of a housefly in the 1950s? Of course not. In classical physical theory small things typically cause small changes. In a chaotic system small things can cause big changes, and their effects can operate over long quiet intervals in which nothing appears to happen. The chaos revolution has meant the end of simple notions of predictability and change in deterministic systems.

 

We cannot even be sure that “effects” have causes, when we are talking about chaotic systems. The simple logistic equation with a different a, mapped onto itself three times, generated the pattern in Figure 5. Now suppose it is a graph of some climate variable. The variable hums along as steady as can be for just over 350 periods then wham! it jumps to a new level, holding steady thereafter through the next 400 periods. There was no exterior “cause.” It is just the nature of the system. You might go looking for a cause, and find some very plausible candidates. But nothing caused the jump.

 

Figure 5: 750 realizations of the 3-times logistic map.
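The talk does not give the coefficient or the starting value used for Figure 5. As a sketch, a value of a just below the period-3 window at 1 + sqrt(8) (about 3.828427) puts the thrice-composed map in the intermittent regime, where exactly this quiet-then-jump behaviour is typical:

    a, x = 3.82842, 0.5  # a just below the period-3 window; x0 is arbitrary
    series = []
    for t in range(750):
        for _ in range(3):           # apply the logistic map three times
            x = a * x * (1.0 - x)
        series.append(x)
    # During a quiet ("laminar") stretch the values hover near one branch of
    # the nascent period-3 cycle; the stretch ends in a burst, after which
    # the series settles near another branch: a jump with no external cause.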

 

These curious behaviours are features of a very simple chaotic system. Is there any reason to believe more complex systems are “better” behaved? Of course not. It was the realization that deterministic systems can exhibit sensitivity to initial conditions, sudden reorganizations and causeless changes that led scientists in the 1970s and 80s to question the very concept of predictability in physical theories of natural systems.

 

This change in thinking was marked by an unprecedented apology to the public from Sir James Lighthill, one of the 20th century’s most distinguished fluid dynamicists:

 

“We are deeply conscious today that the enthusiasm of our forebears for the marvelous achievements of Newtonian mechanics led them to generalizations in this area of predictability which, indeed, we may have generally tended to believe before 1960, but which we now recognize as false. We collectively wish to apologize for having misled the general educated public by spreading ideas about the determinism of systems satisfying Newton’s laws that, after 1960 were proved incorrect. In this lecture, I am trying to make belated amends by explaining both the very different picture that we now discern, and the reasons for it being uncovered so late.” (Proc. Roy. Soc. A 407, 35-50, 1986)

 

One cannot overstate the importance of seeing the “very different picture” that has emerged in physics as a result of this revolution. It affects all areas of science, though the message has not yet sunk in everywhere.

 

Fluid dynamics was always a problem, in particular through the old problem of turbulence. This revolution, however, gave the old problem new scope and a new language. The governing equations of fluid dynamics are a system called the Navier-Stokes equations. This is a set of vector partial differential equations for which no general solution is known. The flow is driven by gradients (rates of change) in variables like temperature, and trying to use the system for computation requires approximations and parameterizations, since the levels of the system variables cannot be solved out directly. In the case of turbulent fluids, we cannot even compute the average flow from first principles.
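For reference, here is the incompressible form of the system as it appears in standard texts (a simplified version; the atmospheric problem adds compressibility, rotation, moisture and more):

    \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u}
        = -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u} + \mathbf{f},
    \qquad \nabla \cdot \mathbf{u} = 0

Only gradients of intensive quantities such as pressure appear, and the nonlinear advection term (u · ∇)u is the root of the turbulence problem.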

 

Our atmosphere, and hence our climate, is composed of fluids in motion, like air and water. Turbulence is a pervasive feature of atmospheric processes, yet analysis from first principles is not possible because the relevant equations cannot be solved directly. The governing system is known to exhibit chaotic behaviour, which means in principle that prediction and causal interpretation of past events is not possible. When you mix the different components of classical physics together on this bubbling boiling turbulent foundation, the result is a disaster for classical forecasting.

 

This is the kind of uncertainty that lies at the heart of the climate problem.

 

Greenhouses Don’t Work by the Greenhouse Effect

 

Here is an example of where the problem of turbulence applies to the climate discussion. You have heard of the “greenhouse effect.” The beginning of the story is the top panel in Figure 6.

 

Figure 6: The Greenhouse Effect

 

Incoming solar radiation adds energy to the Earth’s surface. This energy must be drained to the top of the atmosphere where it will be radiated back to space, preserving the balance in the planet’s energy budget. The two mechanisms for energy transport through the atmosphere are infrared radiation and fluid dynamics, the two arrows pointing upwards.
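The top-of-atmosphere half of that budget is the one genuinely simple piece. With textbook round numbers (solar constant near 1368 W/m^2, planetary albedo near 0.3; standard values supplied here for illustration), the balance fixes an effective emission temperature:

    \frac{S_0}{4}(1 - \alpha) = \sigma T_e^{4} \quad \Rightarrow \quad T_e \approx 255\ \mathrm{K}

What the balance does not fix is how the energy gets from the surface up to the emission level, which is where the two upward arrows, and all the trouble, come in.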

 

Real greenhouses work according to the bottom picture. To keep the greenhouse air from moving away, fluid (air and water) motion is shut off by putting up glass or plastic. To maintain the energy drain the infrared radiative flow must increase. In this case the equation governing the radiative transfer is relatively simple and can be solved for absolute temperature. The physics is clear and certain: temperature must rise. It is theoretically and experimentally certain.

 

What is happening in the atmosphere however follows the middle picture. Adding CO2 to the air slows the radiative drain slightly. So the fluid dynamics has to adjust. OK, so all we have to do is solve the governing equations to see what temperature will do. But the equations (Navier-Stokes) can’t be solved, and absolute temperature doesn’t even appear in them (intensive variables usually only appear in gradients). So no one can compute from first principles what the climate will do. It may warm, or cool, or nothing at all! It is like natural air conditioning without knowing where the thermostat is set.

 

Unfortunately, calling the middle picture a “greenhouse effect” grafts the certainty of the bottom picture onto the middle picture. This is a very misleading abuse of terminology. The bottom picture could hardly be more simple. The middle picture could hardly be more uncertain. Moreover, in the popular discussion of the so-called “greenhouse effect,” the most important “greenhouse” gas gets forgotten: namely water vapour. People focus incessantly on CO2, yet good old H2O, the one truly important infrared-absorbing gas, is almost always left off the list! But water is not only the most important greenhouse gas; it is, unlike all the rest, deeply embroiled in the whole fluid dynamics problem. So leaving it off the list helps leave the turbulence problem out of the discussion, which strips away all the fundamental uncertainty.

 

Models Without Theory

 

So if the climate problem cannot be treated from first principles, what are climate models based upon? By necessity they rely on approximations and parameterizations that are chosen to stand in for all the underlying physical processes that cannot be treated directly. The relevant mathematics is just too big and complex to fit into any finite computer, no matter how large and powerful. So a lot of detail has to be left off. Is this a problem? Maybe yes, maybe no. The point is that without the underlying theory we cannot say. We do not know how “small” something can be before it does not matter. And the cracks that things can fall between in our best climate models are hundreds of kilometers wide.

 

Such is the reality of working with models rather than theory. There is no problem with this as long as people bear in mind that, unlike engineering models, climate models cannot be tested experimentally! No one knows if the parameterizations and approximations will work the same way in a future climate regime. They are assembled with a lot of care and wisdom, but all the good intentions in the world cannot overcome the fundamental problem that there is no theory of climate useable for computational purposes. Indeed it is fair to say there is no theory of climate, period. The equations of motion for climate have never been written down. That is, no one has derived from first principles a set of equations that takes ‘averaged’ climate variables and shows how they evolve under different circumstances, not to mention defining how the averages relate to the underlying variables themselves. Such a group of equations would constitute the beginnings of a theory of climate. But no one has derived these equations. Perhaps someone will, in time. Until then, there is no theory to guide computational modeling.

 

That being the case it is important to set aside unwarranted claims about the ‘accuracy’ of computer models. They are properly understood as cartoons of the climate. Adding more detail, in the form of more elaborate parameterizations, does not guarantee the cartoon will converge on the real thing, any more than adding detail to a cartoon mouse makes it converge on a real mouse. Nor does ‘ensemble’ averaging across different models necessarily converge to the real thing: just as averaging across Mickey Mouse and Speedy Gonzales doesn’t get you closer to a real mouse. And the current preoccupation with ‘regional climate forecasts’ overlooks a lot of conceptual barriers, one of which is that modelers are only supplying a sort of ‘blow-up’ of one part of their cartoon, but this does not imply greater realism is thereby obtained.

 

An example of the misuse of models is the old ‘proof’ of global warming that relies on a parameterization of something called the atmospheric lapse rate (the rate at which temperature falls in the troposphere as you gain altitude). The story is that adding CO2 to the air raises the effective emissions altitude. This in turn raises the temperature at the point where the radiation is emitted. By applying a parameter value (6.5 K/km) to map a drop in altitude onto an increase in temperature you can project all the way down to the surface to get the “result” that the surface temperature has to rise by about 2K in response to doubling CO2 levels. But what if the parameter value changes in the changed climate? It is observed to vary naturally from about 4 to about 10 K/km. It only has to change from 6.5 to 6.1 for the surface to experience cooling in response to the additional CO2. The ‘proof’ only applies to a cartoon atmosphere, not the real one.
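The arithmetic, using textbook round numbers (a 288 K surface, a 255 K effective emission temperature, and a CO2-doubling rise in emission altitude of roughly 0.3 km, the value consistent with the 2 K claim; these are illustrative assumptions, not figures from the talk):

    T_s = T_e + \Gamma z_e, \qquad z_e \approx \frac{288 - 255}{6.5} \approx 5.1\ \mathrm{km}

    \Gamma = 6.5\ \mathrm{K/km}: \quad T_s \approx 255 + 6.5 \times (5.1 + 0.3) \approx 290\ \mathrm{K} \quad (\text{about 2 K of warming})

    \Gamma = 6.1\ \mathrm{K/km}: \quad T_s \approx 255 + 6.1 \times (5.1 + 0.3) \approx 287.9\ \mathrm{K} \quad (\text{slight cooling})

A six percent change in one assumed parameter flips the sign of the “proof.”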

 

Detection from Data

 

If models don’t offer a clear demonstration of global warming, can we hope to rely on the data, and do a purely statistical analysis? With today’s big computers, is it possible to crunch through terabytes of observational data and pick up the ‘climate signal’ directly?

 

Not likely. First we face the fatal problems of averaging discussed above. But in addition it must be noted that statistical analysis is itself a form of modeling. It requires judgment and assumptions and the substitution of a made-up structure in lieu of theory.

 

Here is an example of how this matters. Figure 7 shows a plot of the distribution of monthly values of the Arctic Oscillation or AO Index (solid line). The AO index tracks pressure variations in the Arctic and is believed to play an important role in driving decadal-scale temperature variations. It is a widely-studied data series.

 

To do a statistical analysis requires assuming a distribution function, from which probability calculations can be made. The dashed line is one possible function, called the Gaussian curve. The Gaussian curve can be adjusted by choosing two parameter values: the mean and the variance. These values have been chosen to fit the Arctic Oscillation values as closely as possible. But you can see that the fit is not quite exact: the centre of the data profile is too narrow and tall, and the tails are a bit too fat.

 

Figure 7: Distribution of the Arctic Oscillation (solid line) and best-fit Gaussian density (dashed line).

 

Figure 8 shows a distribution of some temperature data: in this case the temperatures at Frobisher Bay, Nunavut, which have been standardized (by regressing each monthly observation from 1942:1 to 2001:12 on itself lagged once and a set of monthly dummy variables, then taking the residuals from this regression). Again the data are plotted along with a Gaussian curve, which does not quite fit. The distribution of temperature data is too narrow and spiky in the middle, and the tails are too fat at either side.

 

Figure 8: Distribution of Frobisher Bay temperatures with lag and monthly means removed. Dotted line shows a best-fit Gaussian distribution curve.

 

In either case if you use the Gaussian curve to compute probabilities (which is the standard assumption behind most of the formulas used by statistical packages) you will of course be a little off in the numbers. In many cases however it doesn’t matter if your data don’t exactly follow a Gaussian distribution. There are theorems which show that as long as your data follow any distribution with a well-defined mean and variance, you can average your data in the right way and the average will follow a Gaussian curve.

 

But there is some fine print. Not all distributions have a well-defined mean and variance. For instance there is the Levy class of distributions, for which the variance is infinite. Some members of this class do not even have a finite mean. If you sample data from a Levy distribution you can compute a mean and variance of the sample, but that tells you nothing about the rest of the data you didn’t observe. If more data come along they likely won’t yield the same mean. And the variance of samples will just keep growing to infinity the more data you collect. Furthermore all the convenient formulas used in standard spreadsheet programs and statistical packages on computers will be working on the wrong assumption, and will merrily generate illusory probability calculations.
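A sketch of the illusion, using the Cauchy distribution, the best-known member of the stable Levy class (and, as noted below, exactly the Lorentz line profile); it has infinite variance and no defined mean:

    import numpy as np

    rng = np.random.default_rng(0)
    for n in (10**3, 10**5, 10**7):
        gauss  = rng.standard_normal(n)
        cauchy = rng.standard_cauchy(n)
        print("n = %8d   Gaussian variance: %6.3f   Cauchy variance: %.1f"
              % (n, gauss.var(), cauchy.var()))

    # The Gaussian sample variance settles near 1 as n grows; the Cauchy
    # sample variance is erratic and keeps climbing roughly in proportion to
    # the sample size, so any formula that assumes a finite variance will
    # generate illusory probabilities.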

 

Would a researcher know when these statistical illusions are being generated? Does the Levy distribution look so weird that it would be immediately apparent that standard Gaussian methods should not be applied? We report, you decide. Figure 9 shows a Gaussian curve and a Levy curve plotted together.

 

Figure 9: A Gaussian density curve and a Levy density curve. The Levy curve is narrower and taller with fatter tails at each side.

 

They don’t look all that different from each other. And the misfit between the Levy and the Gaussian is like the misfit in Figures 7 and 8. Maybe it’s just a fluke. But maybe it points to a deep problem in using conventional statistics on these particular data series. Levy distributions have turned up elsewhere too: in rainfall statistics for instance. And the absorption lines for CO2 in the infrared spectrum, called the Lorentz profile in molecular spectroscopy, are Levy distributions. Indeed the 15 micron band of CO2 that all the fuss is about has rotation lines that are shaped like Levy distributions. Great irony, but also a solid lesson about Nature.

 

A researcher using statistical methods has to make an assumption about whether the data are Gaussian or not. If they are, that’s fine. If not, and the data instead follow a non-standard stable distribution like the Levy, all bets are off as to what the output of a statistical analysis means.

 

Conclusion

 

We could go on. In Taken By Storm we discuss these and other issues in much more detail. But the point here is simply that there are very fundamental uncertainties at hand when studying climate. There is no theory of climate. And there is no known physical meaning behind the statistic that everyone insists on computing by adding up temperature data and dividing by the number of observations.

 

Furthermore, once they have this number, no scientific basis exists to show that its behavior has any implications for our lives. There is a lot of hand waving and gesticulating, but no science. The people who like these statistics have a burden that they have not been made to bear: you have to prove a proposition in both directions in order to make an equivalence.

 

We can barely get past complete handwaving to establish it in one direction, and no one seems to have thought about the reverse at all. In fact there is every reason to believe it is not true in the reverse. Temperature statistics can go up by the very small amounts we speak of without any meaningful effect, and likewise they can remain fixed even during huge climate change.

 

Some types of statistic from the Earth’s temperature field are going up while others are going down. This reflects more on the nature of averages than on the state of the Earth’s climate.

 

The system under study is chaotic, and we know from studying simple chaotic systems that classical notions of causality and scale can be thrown out of the window in such situations. The fluid media of climate (air and water) are turbulent, ruling out computation of changes from first principles. The data show hints of following nonstandard distributions that rule out conventional statistical methods.

 

We don’t know what the climate would be like if we were not here. We may not be able to identify climate change, even after the fact. Climate research involves some of the greatest unsolved problems of basic science. It is not the stuff on which to base costly and far-reaching policy commitments.

 

To do so requires one to speak about climate with great certainty. Those who do so are only courting the laughter of the gods.

 

 

 

Part III: The Heads-Tails Report

 

If you were expecting a discussion of the difference between surface and satellite temperatures, or an examination of which parts of the world climate have not yet been modeled quite accurately, then the presentation so far may come as a surprise. Believe me, Taken By Storm is full of surprises, sort of like the climate.

 

That the climate is full of surprises is not the message of the IPCC Summaries. It is there in the Reports themselves if you know where to look. But the Summaries are the work of Official Science, and the aim there was to orchestrate a consensus. In reality there is no certainty to be had on this issue. Not yet, and perhaps not for a long time to come. And it is hard to have consensus amidst fundamental uncertainty and nescience.

 

This makes the climate contradiction that much more remarkable. One can scarcely imagine a research topic with so many unanswered and fiendishly complex questions on all the most fundamental issues. Yet it is precisely on this topic that a suffocating cloud of certainty has descended on the political discussion, which means all this rich science is cut off from the policy debate precisely when it is most needed.

 

This brings us to the final question, lurking behind the climate issue and many other controversies today:

 

How should policymakers gather technical information on uncertain scientific matters with controversial policy implications?

 

This question is not confined to the climate issue. Here are a few others that come to mind:

 

· What is the effect of abortion on women’s health?

 

· Does violence on TV cause adolescent delinquency?

 

· Do cellphones or high voltage power lines cause cancer?

 

· Should we bring back DDT for controlling mosquitoes?

 

All these topics have two things in common. First, they are important questions and we need all the help science can offer. Second, regular scientists are averse to working on them. Do you see the problem? On the very topics where we need the most scientific input, the fewest researchers want to get involved. The politics are too hot, the battles are too painful, and the dead hand of Official Science lurks in the background, ready to turn honest uncertainty into fictitious consensus, thereby stigmatizing holders of legitimate points of view as outsiders and skeptics, regardless of the merits of their position.

 

Early in the process of researching climate change, some regular scientists tried to warn the public about what was happening. For instance, Craig Bohren, an atmospheric physicist at Penn State, had this to say in 1994:

 

The government’s response to clamoring from an electorate frightened by global warmers to do something about global warming is to recklessly toss money to the wind, where it is eagerly grasped by various opportunists and porch-climbers… I have never understood Gresham’s law in economics--bad money drives good money out of circulation--but I do understand this law applied to science. Incompetent, dishonest, opportunistic, porch-climbing scientists will provide certainty where none exists, thereby driving out of circulation those scientists who can only confess to honest ignorance and uncertainty.

 

Alas such warnings went unheeded. And so these many years later the situation has deteriorated to the point where there is no easy way forward. It is no use simply setting up another expert panel, or asking the IPCC for a Fourth Assessment Report. So what should we do?

 

We would like to propose one possible solution to this dilemma.

 

Set aside the global warming example for a moment. Consider another topic that is scientifically complex but politically charged. Suppose we want to know whether lawn chemicals are a threat to health in cities.

 

The usual pattern looks something like this. The government wants advice on the science. It looks for a “neutral” expert. By some chain of rumour, acquaintance and political jockeying, Dr. Bland is selected to form a panel and write a report. The Panel quickly adopts a particular view. The Bland Report comes out, 500 pages long, dense with footnotes. It all boils down to a conclusion, which as it happens was precisely the view that Dr. Bland and the other panelists held before writing the report.

 

Then some other people start to object. They say they weren’t consulted, or that the Bland Panel overlooked important evidence. But by now the government has institutionalized the Bland Report. The opponents are the “skeptics,” the minority, the outsiders. It doesn’t matter how many of them there are, how big the errors are that they find in the Bland Report or how good their own arguments are. They do not have the money or the institutional credibility to produce a report of their own.

 

When they speak up they do so as individuals, and they can never carry the weight or gravitas of the Official Bland Panel. So they get frustrated and drop out of the debate. Their expertise gets lost just when we need it most. Thus it is that big policy decisions get made, time and again, on the basis of incomplete and unbalanced science.

 

In other areas of society, when a task requires adjudication—that is, a judgment as to the meaning of the available data by someone in a position of authority—a Bland panel is not the typical approach. Instead we use a forum in which contrasting opinions are deliberately sought out and given a full and fair hearing.

 

Isn’t this what happens every day in courts of law? Courts insist upon competent representation for both the prosecutor and the defendant, and will suspend proceedings if one or the other is missing. Each side is given all the time it needs to present its case. The testimony of each witness is cross-examined. Each side can bring in its own experts. Attention must focus on the facts of the matter and the logic of the case, and not on the character or motivation of the counsel presenting arguments. If the losing side can show that the court displayed prejudice, another court is asked to start over and re-try the case.

 

Should cities ban lawn pesticides? Here is how they ought to decide. A city should form two panels. One is asked to produce the strongest possible case for banning them. The other is asked to produce the strongest possible case for their use. Then each team gets to write a rebuttal to the other’s. The final report consists of all four documents, without a summary.

 

Does this sound strange? Two teams? Handpicked so they hold foregone conclusions? Sure. Let them be as biased as they like. Let them self-select their members and tilt together into their preferred position. In the end their reports will be set side by side. If they are evenly matched, so be it. That is the honest message of the science. And any process that fails to convey it is perpetrating a fraud on the public.

 

In the case of climate change, the day is far spent and it may take a generational change to rehabilitate this field of study. But if we were starting from scratch, we would begin by recognizing that there are opposing views, and it is not obvious from the outset which is correct on any particular question. We would form two groups with equal funding and adequate membership in each. One group would be called Heads and the other group Tails. The job of the Heads group would be to produce a report making as strong a case as possible that human activity is causing a significant climate change that will have harmful consequences. The Tails group would have the job of making as strong a case as possible to the contrary.

 

Since we would have done away with the artificial labels of “mainstream” and “marginal,” a wider range of participants would have come forward, especially on what today is maligned as the “skeptical” side.

 

Each group would be asked to produce, say, a 200-page report, as well as a 50-page rebuttal to the other group. The complete 500-page document would be released without a summary, but with an index. It would be submitted to the world’s governments without either panel being asked to render a decision on which team’s report is stronger.

 

Each government would then have to decide for itself. It could, if it likes, consult internal and external experts for their opinions. But even if one government made the mistake of setting up a national Official Science group to render a verdict and write a summary, the verdict would not be binding on any other country.

 

We can imagine the protests that supporters of the IPCC would raise against this sort of approach. “It would lead to confusion. The Report would not render a bottom-line decision. Everyone will conclude what they want from it.”

 

Oh really? Does the IPCC fear that the Heads and Tails reports would be so evenly matched that it would not be obvious which is the stronger case? That would seem to be an admission that the position espoused in recent years by the UN Panel is not nearly as conclusive as they have been claiming. But if they do think it will be obvious which is the stronger report, then what’s the problem? If the Heads group really have such a strong case, putting it alongside the Tails case will only sharpen the contrast, especially since the Heads group get to produce a rebuttal.

 

Or maybe the IPCC has been assuming all this time that people don’t actually read and understand the reports, instead they just look through the executive summary for an authoritative decree, rendered ex cathedra by the climate pontificate. Certainly some politicians talk this way. If that’s the case then a heads-tails model will only seem like a platform for heresy. The alternative is, admittedly, rather comforting. A solemn pronouncement of infallible doctrine arrives from the Papal see on finest parchment, sealed with red wax. But if you are up on your ecclesiological jargon, you will know that such a document is called a “bull”, so be careful about carrying the analogy over to the Summary for Policy Makers.

 

In any case, if the purpose is to increase the scientific understanding on which policy is based, a Heads-Tails model can only help. On scientific matters such as global warming the political sophistication now far exceeds the scientific understanding. We cannot cope wisely with complex technical issues while this is so. People have hoped that we can get by on authoritarian pronouncements from Official Science, or executive summaries from anonymous government committees. Others hope that we can just have a show of hands, or pass around petitions. But these are not issues that can be resolved by counting heads as a substitute for people thinking with their own heads. The inordinate and aggressive claims of certainty about global warming stand in such wild and obvious contrast with the reality of the scientific base that it is hard not to conclude that there is a serious problem in the relationship between science and the policy process. This is bad for science and perilous for society. Taken By Storm is an invitation to begin the serious effort to repair this relationship.

 

Thank you.

 

For more information about Taken By Storm see www.takenbystorm.info

 

 

==============================