Things I know #3: Models are not fact. Models are expressions of belief. Model output is what we think should happen based on the beliefs encapsulated in the model.
If the beliefs are close to reality, model output can be quite useful. If the beliefs are not close to reality, the output is garbage.
"Experiment is the sole source of truth. It alone can teach us something new; it alone can give us certainty." (Poincaré, 1902). Real evidence comes from observation. Models (in physics) are our way of mathematically "explaining" what we think might be true; the results they produce are imaginary, based on what we believe happens. Another way to put it: models are expressions of belief.
With that in mind, let's talk about things we know, and things we don't know.
First, we'll throw up a chart of "Global Temperature" change, as best we can determine "it", for about the past 120 years.
This is based on taking average temperature readings all over the world, by various methods, and averaging them. We won't go into any of the nuances of the validity of that; let's just accept it for our purposes here. It looks like temperature has gone generally up since about 1920, with a bit of a flattening from about 1940 to about 1977, followed by a surge leading up to about now (2000-ish).
Alarmists tend to point especially to the second part of the curve and say, "Look, it's increasing faster, and it's higher than 'ever'."
I want us to note two things here (the temperature scale here is Fahrenheit, as opposed to Celsius on most of the other charts; a one-degree Celsius change is 1.8 times a one-degree Fahrenheit change). Seriously, how many lay alarmists even know how much the earth has warmed so far in the last 150 years? Those two things are the scale we're talking about, and the fact that the rate of increase from 1920 to 1940 doesn't look a whole lot different from the rate of increase from 1977 to now. So the increase in the last 30 years is not an unusual rate of increase. As a matter of fact, one of the things we know from long-term data is that these global temperature increases tend to happen rapidly, while global temperature decreases happen a bit more slowly.
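Since the scale mix-up trips people up constantly, here's the arithmetic: temperature *changes* convert by the 1.8 factor alone; the +32 offset only applies to absolute readings. A quick sketch (the 1.5 °F figure below is purely illustrative, not a sourced warming number):

```python
# Converting temperature *changes* between scales: only the 1.8 factor
# applies; the +32 offset is for absolute readings, not differences.
def delta_f_from_c(delta_c):
    """Convert a temperature change in Celsius degrees to Fahrenheit degrees."""
    return delta_c * 1.8

def delta_c_from_f(delta_f):
    """Convert a temperature change in Fahrenheit degrees to Celsius degrees."""
    return delta_f / 1.8

# An illustrative (not sourced) 1.5 degF rise reads as a smaller number
# on a Celsius-scaled chart:
print(round(delta_c_from_f(1.5), 2))  # about 0.83 degC
```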
Now we're going to look at Mann's famous "hockey stick" chart.
This is the main chart behind a good chunk of the hubbub about our sudden, rapid temperature increase from "normal". Note that "normal" for this chart is defined as the average from 1961-1990. Further understand that that is an arbitrary choice as far as climate history is concerned. But that's OK; just make sure you keep it in mind. We have to put a deviation baseline somewhere.
The "proxy data" is data gathered from tree rings, ice cores, and possibly sediment analysis. Note the large (grey) margin of error in the proxy data. It gets smaller around 1600, when some records started being kept, and smaller still during the 1900s, when world population grew rapidly and more and more observations became available in more places.
But we're kind of comparing apples to oranges here. We've increasingly grafted actual observational data onto the right side of the graph as time goes on, until at the end there's virtually no proxy data and very little margin of error. But note that despite all this, the temperature today is still very close to the margin of error estimated for the proxy data. Also understand that the smooth line you see down the middle of the margin of error has been artificially smoothed up to the 1990s, while the shocking end of the chart is real data. What this means is that throughout the chart, from left to right, the actual "global temperature" may have been anywhere within the grey area. Anywhere, in any given year. The smooth line is very misleading.
On top of that, climatologists and statisticians have long since trashed this chart, yet Al Gore et al. still trot it out today. It is a computer construct, and a faulty one at that.
From Friends of Science: (and noted by many others -- this is easily verifiable)
Over the years, the graph has been subject to many criticisms from other scientists, for a number of reasons. Some complained that the well-documented Medieval Warm Period (approx. 1000-1400) and the even better known Little Ice Age (approx. 1450-1850) do not register on Mann’s chart. Others took issue with the tree ring proxy methods being used for the time span before thermometer readings were available, or with the inaccuracy of the temperature readings themselves. They maintain that instrument errors are often larger than the anomalies measured, or that the urban heat island issue skews surface data badly.
In this article in Energy & Environment (Vol 14/6, 2003) Canadians Stephen McIntyre and Ross McKitrick reject Mann’s methodology, point to numerous errors, unjustifiably truncated data and extrapolations, and other defects. They then use Mann’s original data and recognized methodology to prove that Mann’s graph shape is an artifact and that a proper interpretation finds that temperatures around 1400 were warmer than anything in the 20th century. The widely read work by McIntyre and McKitrick has elicited much discussion in scientific circles. A more definitive version was later published in Geophysical Research Letters.
Now let's take a look at how the public is being manipulated by selecting what you see and blurring the line in the public mind between what we know and what we don't know.
First of all, notice that this chart now goes back far enough to show the end of the last ice age and the Holocene warm period, and the Medieval Warm Period that Mann's chart obliterated is back. We can now see that temperatures are, as far as we can tell, about as warm as they've ever been since the end of the last ice age, and really not much higher, as far as climate history goes, than the lowest low since then. The range has been less than 2 degrees Celsius for about the last 10,000 years (and look how fast we came out of that ice age. Now that's a dramatic temperature increase!) But look at the right side of the chart. There's that worrisome model forecast, and it dwarfs any changes in the last 10,000 years.
But it's an illusion. It's a "thing we don't know". For the record, here's what we do know:
Well, that's not so alarming now, is it? And remember that "0" line and how I mentioned its arbitrariness? It is useful for looking at small changes, but still arbitrary. It doesn't mean, as our brains are likely to infer, that that's what the temperature "should" be. It's just what it's been for the past 10,000 years. We could put it anywhere. How about halfway in between the modern mean and the Ice Age mean? Just for fun.
Does that change our perspective any? It does, doesn't it? Looks even less alarming.
But we do have that pesky model output to address. Where did that come from? Have we figured climate out? Do we know how to forecast general average global temperature years, even decades into the future?
Well, we do have blackbody equations, which are, by the way, theoretical models of theoretical bodies. They have proved quite useful. However, they are theoretical models of very homogeneous "systems", if you will, and of course the earth, its atmosphere, and its chemistry are anything but. So we have taken these model equations and created a crude sun/earth/atmosphere model to try to explain why the earth is warmer than it would be if it were a simple blackbody. And not surprisingly, those equations work out, roughly.
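As a rough sketch of what those blackbody equations give: the standard Stefan-Boltzmann balance (real physics, though the solar-constant and albedo values below are round textbook numbers) puts a bare blackbody earth well below the observed surface temperature, and that gap is exactly what the crude greenhouse model is built to explain.

```python
# Radiative balance for a simple blackbody earth:
#   absorbed solar = S * (1 - albedo) / 4   (a sphere intercepts a disk's worth of sunlight)
#   emitted        = sigma * T**4           (Stefan-Boltzmann law)
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S = 1361.0        # solar "constant", W/m^2 (round textbook value)
ALBEDO = 0.3      # fraction of sunlight reflected (approximate)

def effective_temp(solar=S, albedo=ALBEDO):
    """Equilibrium temperature of an airless blackbody earth, in kelvin."""
    absorbed = solar * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25

T_eff = effective_temp()   # roughly 255 K, i.e. about -18 C
T_observed = 288.0         # rough observed mean surface temperature, K
print(T_eff, T_observed - T_eff)  # the ~33 K gap is the "greenhouse" discrepancy
```

The observed 288 K figure is itself an approximate conventional value; the point is only that the simple model underpredicts and the gap is what gets modeled next.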
I majored in Meteorology, and I know how we come up with these models and equations. We have little "constants" we throw in to adjust the equations so that they match our observations. This is often helpful when modeling. But it's still a model. Our weather forecasts are computer models. Much is made of how often they're wrong, although for what they are, and given the data we have, they are amazingly accurate over a period of a few days.
Well, then we took THAT sun/earth/atmosphere model, paying special attention to CO2 as a greenhouse gas, and integrated it with our climate models in an attempt to predict what might happen if we increased CO2.
Al's famous "did those two ever fit together?" graph in his slideshow/movie does show an amazing correlation between CO2 levels and mean global temperature.
Which brings us to Things I Know #4: Correlation does not mean causation.
Does more CO2 in the atmosphere cause mean global temperature to rise? Simple blackbody equations say it should, to some extent. However, there are a zillion other factors that they don't take into account. We simply don't know what many of those factors even are. One of the things we do know is that even in the equations (including the IR absorption windows in which CO2 absorbs IR radiation), the response is not linear. It's logarithmic. Far enough along a logarithmic curve, large increases in CO2 mean negligible increases in temperature. Eventually they become infinitesimal. And it looks as though we're getting to the high end of that curve.
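The shape of that curve is easy to see with the commonly used simplified forcing expression, dF = 5.35 × ln(C/C0) W/m² (the 5.35 coefficient is the Myhre et al. 1998 fit; the concentration values below are purely illustrative). Equal-sized additions of CO2 buy smaller and smaller forcing increments:

```python
import math

# Simplified expression for CO2 radiative forcing relative to a baseline
# concentration:  dF = 5.35 * ln(C / C0)   [W/m^2]
# (Myhre et al. 1998 fit; the concentrations below are illustrative only.)
def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing in W/m^2 for concentration c_ppm vs baseline c0_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Two equal 120 ppm additions produce shrinking forcing increments:
first_step = co2_forcing(400) - co2_forcing(280)   # 280 -> 400 ppm
second_step = co2_forcing(520) - co2_forcing(400)  # 400 -> 520 ppm
print(first_step, second_step)  # second increment is smaller than the first
```

Note the logarithm never actually flattens to zero (each *doubling* adds the same forcing), but the marginal effect of each added ppm does keep shrinking.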
At any rate, that is why we're inclined to believe that the correlation is evidence that CO2 is the driving factor. But is it? What if the models are wrong? Remember, models are not reality. They are expressions of belief.
One of the assumptions we had for a long time was that solar output was constant. Look in the models, and you will find the "solar constant". But it turns out it is not constant. Solar output varies over time, and solar output, as one might expect, affects earth's climate. In more ways than one, as it turns out: one by delivering varying amounts of energy to the earth's position in space, and another by "blowing" on the magnetosphere, altering how much cosmic radiation gets into the earth's atmosphere. That second one affects cloud formation. And it turns out that clouds have a significant role in how much of the sun's energy is kept in earth's system. They can act as a blanket, keeping some in. They can act as reflectors, keeping some out.
So, as I mentioned in an earlier, counter-snark post, let's take a look at sun activity and temperatures in the last 300 years.
Did ... those two ever ... fit together?
Seriously now, we have another correlation here. And, using Al's logic, we've got to believe that since CO2 levels correlate to temperature, and temperature correlates to sun cycle length.... that CO2 levels correlate to sun cycle length. So does more CO2 in the earth's atmosphere affect sun cycle length? I don't think even Al Gore would go that far.
So what other explanation might there be? Well let's try this one on for size. Sun cycles affect earth's temperatures. And earth's temperatures .... affect how much CO2 is in the atmosphere. So let's look at another chart which lays CO2 over these two.
Water can hold more of an atmospheric gas at colder temperatures, and less of it at warmer temperatures. The sun heats the earth, the ocean gets warmer and releases CO2 into the atmosphere. The sun delivers less input to the earth, the earth cools, the oceans cool and absorb CO2 from the atmosphere, and CO2 levels fall. Notice that the shape of these cycles is almost always abrupt (how alarming!) on the warming side, and more gradual on the cooling side. The warming forces CO2 out of solution fairly quickly, and it comes out of solution wherever the water is warmer, not just at the surface. Once it comes out of solution, the CO2 becomes a gas, rises quickly to the surface, and is expelled into the atmosphere, whereas absorption can only occur at the surface, which slows re-absorption into the ocean.
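The solubility mechanism described above is Henry's law with its textbook van 't Hoff temperature correction. A minimal sketch, with approximate handbook constants for CO2 in water (the exact values vary by source):

```python
import math

# Henry's law solubility of CO2 in water, with the van 't Hoff
# temperature dependence:  k_H(T) = k_ref * exp(C * (1/T - 1/T_ref))
K_REF = 0.034         # mol/(L*atm) at 298.15 K (approximate handbook value)
T_REF = 298.15        # reference temperature, K
C_VANT_HOFF = 2400.0  # K, approximate temperature-dependence constant for CO2

def co2_solubility(temp_k):
    """Henry's-law solubility coefficient at temp_k; higher means more dissolved CO2."""
    return K_REF * math.exp(C_VANT_HOFF * (1.0 / temp_k - 1.0 / T_REF))

cold = co2_solubility(278.15)  # water at ~5 C
warm = co2_solubility(298.15)  # water at ~25 C
print(cold / warm)  # cold water holds notably more CO2 (roughly 1.8x here)
```

This only quantifies the solubility side of the argument; it says nothing by itself about which way the cause runs, which is the point being argued in the text.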
What does Occam's razor tell you?
Now back to the model, which, as you may recall is an expression of belief. If you believe that CO2 level is a significant forcing factor for temperature, you are going to express that in your model. If you express it in your model, then your model is going to end up "confirming" that belief. How can it not? It's programmed right in there!
What we have shown here is that the evidence strongly suggests CO2 is not a significant forcing factor for temperature. Even if we ignore the ice core evidence (and I'm not saying we should) that warming cycles actually lead CO2 fluctuations, and guess that CO2 enhances the effect as the warming starts and helps drive the rapid temperature increases we see at the beginning of the cycles, there is clearly a limit to this phenomenon. The CO2 IR window is effectively closed, and as temperatures fall on the down side of the cycle, the cooling oceans drag CO2 levels down behind them as they absorb it out of the atmosphere.
There is also ice core data that shows that temperatures start to go up before CO2 levels rise, further corroborating the idea that CO2 is not a major forcing factor -- at least not past about 300ppm. Before we started burning fossil fuels at any sort of high rate, something besides us drove those CO2 levels up.
In the end, we have observation that says climate fluctuates, and that recent fluctuations are not anything extraordinary. We do have extraordinarily high CO2 levels, as far as we can tell (though remember, comparing inferred proxy data to the actual data we've only recently been able to obtain has its problems), and we have a theory that says more CO2 should trap more heat in the earth's atmosphere. What we're missing is anything that says this is actually happening, especially once we account for solar output and the associated processes. Observations to date are not alarming. All we have past today is model output. Model output is the logical conclusion of an expression of belief; it is not an observation. So when you see a chart like this, keep that in mind.
And remember: the extrapolated curve would fit just as easily if you cut the data off at 1940, or at various other points in history.