climate variation over time

The fabulous Watts Up With That ran a series of charts a few days ago showing temperature variation over time in Greenland ice cores. And by “over time” I mean going back over 400,000 years.

I thought it would make a cracking animation, with the fades and the arrows and the background music. But WordPress doesn’t play nice with Flash (which, in any case, is on my desktop machine in semi-exile). So I thought I’d do a simple Javascript dingus where you could click the picture to see the next one. Turns out, WordPress plays even less well with Javascript. Finally, in despair, I cut together a badly-paced lame-o animated .gif file that sucks ALL KINDS of ass.

Eh. Sorry. I really need to uncrate some of my old, professional tools and build some spiffier visuals. Reload the page to rewind.

Anyhow, do go read the article in the original Geek (not all the increments are in my retardimation). When he zooms all the way back to the longest view, it’s obvious that the earth’s most comfortable resting place is deep, deep cold. The last ten thousand years — you know, the period when our species crawled out of the muck and flourished — have all been much, MUCH warmer than the preceding umpty-ump hundreds of thousands.

And the 20th Century doesn’t come close to being the warmest of the warm.


Comment from dfbaskwill
Time: December 10, 2009, 7:37 pm

I’ve been following 10 or so “denier” blogs for quite a while. My college employs Michael Mann, one of the worst of the scientific quacks that instigated this hoax. In any event, at least us deniers have all the humor, something the quacks completely lack. Merry Christmas from the gang at Minnesotans for Global Warming: http://minnesotansforglobalwarming.com/m4gw/2009/12/its-a-climategate-christmas.html

Comment from Allen
Time: December 10, 2009, 7:59 pm

Be careful with that stick; you could put somebody’s eye out! I liked the slide show, Weasel.

After all the hullabaloo, I think what some of these researchers fear is that their primary metric is really not there. I’ve had it happen; developing measurement techniques is filled with cruel mistakes.

It’s like being on the playground in kindergarten again and you make a booboo. Laughing, pointing, mockery, yes that kind of humiliation.

Comment from Mike C.
Time: December 10, 2009, 8:06 pm

Well, the “longest view” is a piddly 500,000 years or so, a geological eyeblink, so to speak. Still, it’s at least an attempt to point out the importance of looking at planetary things on planetary time scales. It has been warmer than you see on that “long view”, and it most certainly has been much, much colder, back about 600-700 MYA. Look up “Snowball Earth.” Them tillites didn’t just magically appear, you know.

Comment from weirdsister
Time: December 10, 2009, 8:52 pm

dfbaskwill…very nice! We could use some Global Warming here in Georgia, as well. It is gonna be in the 20s tonight. Pretty damn cold for the South.

Stoatie, I saw those charts earlier today. Too bad the libs are too dense to understand them. 🙁 *sigh* I feel as if I am looking into a future where I am telling my grandchildren about the “good old days,” when people were free and prosperous, before the fascists took over the world.

I think I’m going to go crank up the space heater, now, just to give that lame fucker Al Gore the metaphorical carbon footprint finger.

Comment from Enas Yorl
Time: December 10, 2009, 9:15 pm

See, this is EXACTLY the sort of thing we should have been throwing at these bastards ever since they switched from “OMG ICE AGE IS COMING!!11! Come on – we have to destroy Western Civilization to save ourselves!!!” to “OMG GLOBAL WARMING!!!11!! FOR REALZ THIS TIME NO KIDDING!!! Come on – we have to destroy Western Civilization to save ourselves!!”

Comment from LordFlashHeart
Time: December 10, 2009, 9:48 pm

This is a dastardly Plot against civilisation by people within our very own society!

It’s not a conspiracy theory – but real … you don’t need me to tell you that. But what to do with these people?

Hang Phil Jones for a start, then the Greens

Comment from Red State Witch
Time: December 10, 2009, 10:00 pm

I think it’s time for a “global warming/no global warming” no-holds barred, cage-style, grudge match. I propose we place a space heater in front of an air conditioner and let them fight it out for supremacy. Come up with a catchy name like “The Smackdown in East Anglia” and we can sell tickets on Pay Per View.

Comment from S. Weasel
Time: December 10, 2009, 10:06 pm

The Medieval Warming Period was quite recent and much warmer than now. Surely we can work out from records whether sea levels were different, for example.

As far as I know, it was a period of prosperity. Or, at least, when it came to an end, there was a series of famines and misery. And then the Black Death (not that there’s a relationship).

Comment from Gromulin
Time: December 10, 2009, 11:00 pm

Hopefully, this will all lead to a climate “Scopes Trial”, like the Chamber of Commerce, or some other similarly bland organization here, called for earlier this year. Gimme judge Ito, baby! I’d love to sees me some real Perry Mason cross-examination of these fraudulent fuckers on C-Span!

Comment from Scubafreak
Time: December 10, 2009, 11:01 pm

Oh, there’s a relationship. Famine and stress lower people’s resistance to biological pathogens, making populations more susceptible to outbreaks of disease. It is nature’s way of culling the herd back to population levels that are more sustainable in the existing environment.

There has been evidence put forward that it is a built-in response in the human genome, a so-called ‘suicide gene’ that makes unsuccessful members of the population self-destruct to ensure the longevity of the whole.

See “The Lucifer Principle” and “Global Brain” by Howard Bloom if you want to learn more about it…..


Comment from EZnSF
Time: December 11, 2009, 12:12 am

I love you.

Can you make a version we can email to friends, family and various idiot co-workers?
I’ll send you a jar of Kraft Pimento Cheese Spread!

Comment from Roman Wolf
Time: December 11, 2009, 1:04 am

I sometimes wonder if I’m the only one of my generation who hasn’t fallen for this man-made global warming crap. Granted, the single most famous climate scientist at my university, Dr. William Gray (the hurricane prediction guy), doesn’t believe in man-made global warming.

Oh, and just to show you the bias in the grant system: apparently Dr. Gray can’t get grants to study “global warming” despite being one of the most famous scientists in the United States. He can still get his hurricane research grants, but not any for global warming…

Hrrm…and they insist that this is science and not politics? Surely Dr. Gray would have found some sort of man-made global warming if it were scientific…

This is what brought me to the conclusion that they must be covering up something. If this was science, they wouldn’t be afraid of sceptics.

Comment from Oldcat
Time: December 11, 2009, 1:27 am

Another relationship between colder temperatures and disease is due to the fact that poor folk then lived in the same house as livestock, so if all the stock was inside for longer due to cold, everyone passed around fleas, germs, and all that. And cold brings pests in from the fields if there’s no food out there to eat.

I see a similar thing every year here in CA. On the hottest days of the year, ants swarm into the house looking for water. If it’s cooler, they never come in.

Add in the reduced resistance mentioned above and it’s bad news for everyone.

Pingback from Still I look to find a reason to believe « jdwill07 blog
Time: December 11, 2009, 1:50 am

[…] instructive read) the Medieval Warming Period with the Hockey Stick, so we could very well be returning to the top of the next hill with very little CO2 […]

Comment from David Gillies
Time: December 11, 2009, 2:59 am

I’m a professional software engineer upon whose work is staked serious amounts of money (like enough to make the sums written on those six square foot cardboard cheques they give to lottery winners look like chump change.) I have a background in physics and electronic engineering with an emphasis on computer simulation in both. And I have to say that the revelations of the state of the CRU’s codebase shocked even me, who has seen the lamentable state of programming in purportedly serious academic environments up close. Now, it has to be noted that none of the leaked code is concerned with the climate simulations themselves (as far as I am aware). It’s really just very basic time-series analysis with a bunch of statistical jiggery-pokery thrown in. The real climate models (called General Circulation Models or GCMs) are a big deal. They are basically implementations of a Computational Fluid Dynamics simulation, which are in turn discretised formulations (in space and time in the case of finite element models and space and frequency in the case of spectral theory models) of the Navier-Stokes equations plus sundry additional factors like oceanic CO2 absorption vs. temperature etc. The code for these has not been leaked, and is not in the public domain. But given the shockingly low quality of the stuff we have seen, we can have little confidence that the GCMs are robust.

It is really, really hard doing science with computers. Most scientists and engineers are not trained in computer science, and as a consequence their programs tend to be ad hoc, fragile and grossly inefficient. There is often a very cavalier approach to data and code curation. The very existence of the HARRY_READ_ME.txt file shows that none of the code was under even the most elementary form of version control. As soon as a project of mine becomes more than an itch-scratching exercise, i.e. more than a few hours invested in it, straight it goes into CVS or Subversion (and this is my own, personal stuff, never mind the code I get paid to write.) If I were responsible for generating a codebase whose history was so unreliable as this, I and my entire team would be fired by the powers that be, and with good reason. The ‘throwing away the raw data for storage space reasons’ story is either a lie or the sort of stunningly negligent decision that would in a just world see bureaucrats doing hard time. You could put a hundred megabytes on a 2400′ tape, and that is still a hell of a lot of data even today. Anyone with the space to store a halfway-decent LP collection could archive ten gigabytes thirty years ago. To suggest otherwise is just insultingly stupid shit.
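(An aside for readers who have never met the “discretised formulation” idea David describes: the Python sketch below, with entirely made-up names, marches a one-dimensional diffusion equation forward on a grid with explicit finite differences. It is nothing remotely like a real GCM, but it shows how a continuous equation becomes a grid-update rule, and how a stability condition on the step size sneaks in.)

```python
def diffuse(u, alpha, dx, dt, steps):
    """March u_t = alpha * u_xx forward with the explicit (FTCS) scheme.

    Stability requires r = alpha*dt/dx**2 <= 0.5; a bigger time step
    blows the solution up, exactly the kind of trap a casual coder
    falls into. Boundary values are held fixed.
    """
    r = alpha * dt / dx ** 2
    for _ in range(steps):
        u = ([u[0]] +
             [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
              for i in range(1, len(u) - 1)] +
             [u[-1]])
    return u

# A hot spike in the middle of a cold bar smooths out over time.
u0 = [0.0] * 21
u0[10] = 100.0
u = diffuse(u0, alpha=1.0, dx=1.0, dt=0.4, steps=50)
```

Real GCMs trade this toy update rule for the Navier-Stokes equations on a spherical mesh, but the validation problem is the same in kind: a plausible-looking result tells you nothing about whether the scheme was stable and convergent.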

Comment from James
Time: December 11, 2009, 6:01 am

WRT sea levels–it isn’t quite as easy as it seems to figure out whether sea level is the same, because you are fighting erosion and deposition and in some places land levels rising or falling. If the land rises (or sea falls) and your old pier is no good anymore, it would make a handy source of materials to build a new one. So looking at things like piers has a bias in favor of studying sinking land (rising sea) where the old piers are unsalvageably drowned. It sounds like a long and hard study.

While I agree with most of what David wrote (speaking as a scientist who has written a lot of code myself 🙂 ), I suspect that the “raw data” wasn’t on tapes to begin with, but was keyed in from paper reports. I am sitting beside an untidy pile of papers with temporary notes and calculations on alignments. None of it is raw data, but I can easily imagine someone with the same desk pile ditching reports after he was done typing in the numbers: after all, he can always get them again from the reporting site. Except that 10 years later, he can’t; and can’t even point to which numbers came from where. In other words, it is the same coding carelessness writ large upon the dataset.

Comment from David Gillies
Time: December 11, 2009, 10:36 am

James, the situation you describe is familiar. I have seen it personally. But are the contents of your tottering piles of paper going to be used to justify the derangement of the entire world economy for the next hundred years? You’re providing an explanation, not an excuse.

Fifteen years ago I implemented a backup strategy for the small cluster of machines I administered for my University department. It encompassed a full weekly backup on DDS2 DAT tape with daily incrementals, plus periodic archiving. If my little group could do it, then anyone could. This was at most a million dollars’ worth of research data, unlike the jaw-dropping sums the CRU was allocated.

Comment from Mrs. Peel
Time: December 11, 2009, 10:47 am

David, I agree. I think another problem is that what they’re doing is based on statistics, and in addition to not understanding computer modeling, they (like the vast majority of people) don’t understand statistics.

I’m not sure how you teach critical thinking. A good statistics class is one way – it equips you to ask uncomfortable questions like “What’s your sample size?” “What assumptions did you use?” “What other factors could contribute to the effect you are studying?” Thing is, though, most people who even take statistics don’t take it until college, and you really need to form habits of critical thinking much younger.

(I was the annoying kid who was always pointing out the errors in the teachers’ statements. You know, the one who would remind the teacher that she meant to give a quiz today, and then correct the grammar on the quiz. So I guess it came naturally to me…)

Comment from Nicole
Time: December 11, 2009, 12:18 pm

Teaching critical thinking is well and good, as are all of the charts and such. The problem lies in the mindset you are attempting to get through to. Once emotion enters the picture, as it has via baby polar bears, etc., all the logic, facts and learning in the world will not change a mind. At that point you are changing a heart, or so they believe, and they will absolutely not let that happen, as they think that would make them cold-hearted conservatives, doncha know?

Oh, and good animation, Miz Weasel. And excellent points by commenters!

Comment from David Gillies
Time: December 11, 2009, 12:20 pm

Mrs. Peel, these people are supposed to be physicists. An extremely significant portion of the first year of my physics BSc at Imperial College London was devoted to statistics. It wasn’t merely to teach the techniques of statistical analysis, important as they were, but to equip us with the sort of robust bullshit detectors without which no halfway-competent scientist or engineer can function. That was only twenty years ago. When I say ‘statistical jiggery-pokery’ I mean to imply that this was overt. I don’t think from the tone of the HARRY_READ_ME.txt file and some of the emails that the protagonists were in any doubt that what they were doing was methodologically iffy. Computer power back then was just starting to be adequate to implement early versions of the GCMs, and as I recall none of the people I knew who worked in atmospheric physics put much faith in them. They were all well aware of the limitations of CFD, and in fact the biggest compute resource for studies in the area was in the aeronautical engineering department. As in so many things, the right question to ask when fishy doings occur is cui bono? Sadly it appears that money trumped scientific ethics in one of the starkest demonstrations of Public Choice Theory to date.

Comment from EW1(SG)
Time: December 11, 2009, 12:41 pm

Mrs. Peel sez:

I was the annoying kid who was always pointing out the errors in the teachers’ statements.

Gah! Tutoring the kid across the street in his math class, I must spend half my time explaining to the kid why the question in the book is wrong!

And bad enough that he’s a kid to start with (his attention span is even shorter than mine…I have no idea why we spent 15 minutes discussing biotoxins last week), but the constant “But the teacher said …” when explaining different ways to consider the problem at hand is disheartening.

Comment from Dawn
Time: December 11, 2009, 2:30 pm

I heart smart people.

Comment from EW1(SG)
Time: December 11, 2009, 3:33 pm

BTW, next time I see Einstein, I’m gonna kick his ass for that silly “Imagination is more important than knowledge” crap.

Imagination without knowledge (not what he said, but that’s the way current schoolteachers implement it) is nothing but fairy tales and unicorn farts.

Comment from Oldcat
Time: December 11, 2009, 3:37 pm

There’s another issue that so far is not mentioned – if they are doing finite modeling on a computer, then it is critical that the programmer know all about the problems with overflow of integer data and error propagation from floating point calculation. The code released so far shows the programmer didn’t even realize that fixed point overflow existed, which makes it likely that the more subtle problems with floating point numbers are also not handled correctly.

Even if the raw data were perfect, floating point error can grow to dominate a calculation. For example, 32-bit floating point has a resolution of about seven decimal digits. So if you take a variable holding 100,000,000 and add 1 to it, you can do that a million times without the variable ever changing. If you add the million 1s together first and then add the total, you get 101,000,000.
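That accumulation-order trap is easy to demonstrate. The sketch below (hypothetical names; Python rather than FORTRAN) simulates 32-bit floats by round-tripping every result through the standard struct module, since Python’s own floats are 64-bit:

```python
import struct

def f32(x):
    """Round a Python float to the nearest 32-bit float value."""
    return struct.unpack('f', struct.pack('f', x))[0]

big = f32(100_000_000.0)   # exactly representable; spacing between
                           # adjacent float32 values here is 8

# Add 1 a million times, rounding to float32 after each step.
# Since 1 is less than half the spacing, every addition rounds away.
acc = big
for _ in range(1_000_000):
    acc = f32(acc + 1.0)

# Add the million 1s together first (small values accumulate exactly),
# then add the total once.
ones = 0.0
for _ in range(1_000_000):
    ones = f32(ones + 1.0)
total = f32(big + ones)

print(acc)    # 100000000.0 -- the million additions vanished
print(total)  # 101000000.0
```

The same arithmetic, in a different order, differs by a million: exactly the kind of silent error a naive time-series summation can hide.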

Comment from David Gillies
Time: December 11, 2009, 4:22 pm

The GCMs will be written in FORTRAN, Oldcat, and most implementations will be using at least 64-bit IEEE754 double precision. But error propagation in numerical solutions of differential equations is really, really nasty. A big chunk of my second year Numerical Methods lectures was how to mitigate the effects. The problem typically comes when finite precision means you have a small admixture of an exponentially growing function with the real solution which is convergent. After enough time steps, the error comes to overwhelm the true solution. There are also big problems with singularities and discontinuities in your mesh (the aerodynamicists are the go-to guys here). You can sometimes do some clever tricks with coordinate transformations and the like but it’s still really easy to blow your code up. As I’ve said, the GCM codes appear to still be under wraps so it is impossible to comment on their numerical stability. And even if they’re stable and convergent, GIGO still applies.
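The blow-up David describes shows itself in even the simplest possible setting. A toy sketch (names and numbers invented for illustration): forward Euler applied to y' = -50y, whose true solution decays smoothly to zero, diverges explosively once the time step exceeds the method’s stability limit.

```python
def euler(lmbda, h, steps, y0=1.0):
    """Explicit (forward) Euler for y' = lmbda * y.

    Each step multiplies y by (1 + h*lmbda); the scheme is only
    stable when |1 + h*lmbda| <= 1.
    """
    y = y0
    for _ in range(steps):
        y = y + h * lmbda * y
    return y

# For lmbda = -50 the stability limit is h <= 0.04.
print(euler(-50.0, 0.01, 200))   # stable: decays toward zero
print(euler(-50.0, 0.05, 200))   # unstable: grows astronomically
```

Note that the unstable run produces a perfectly finite-looking number at small step counts; only by knowing the analysis in advance can you tell it is pure numerical artifact.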

Comment from Oldcat
Time: December 11, 2009, 6:13 pm

Giving that kind of power to that bad a programmer is like giving a running chainsaw to a toddler.

There was a note in the README where the programmer had no idea why a “sum of squares” function returned a negative number for large inputs.

When I was an undergrad CS/Physics major helping with analysis in academia in the 80s, most of the fellows I worked with had no idea double precision even existed. (“It’s a REAL. What are you talking about?”) I’ve had little confidence in the results of these “thousands and thousands of hours of computation” type claims. Thousands of hours makes the result MORE iffy, not less.

Comment from Mrs. Peel
Time: December 11, 2009, 8:28 pm

They’re SUPPOSED to be physicists, yes. My experience has been that people who claim to be “climate scientists” don’t actually have degrees in any kind of science (unless you count social science, which I don’t).

But then, CRU/GISS/GHCN apparently are deliberately falsifying data. And then people just swallow everything they say and repeat it on TV. So I think I am really complaining about stupid mouthpieces and useful idiots, rather than the people generating the fraudulent numbers.

There was a note in the README where the programmer had no idea why a “sum of squares” function returned a negative number for large inputs.

*smacks forehead*

(For those who don’t know: in computer programming, you store a value in a fixed number of bits. With signed integer types, if the value you are storing is larger than that fixed number of bits can accommodate, the bits “roll over” and you can get a negative number. This is, like, Programming 101. I mean, I haven’t taken a single programming class beyond high school C++, and I know this. There was even an xkcd about this.)
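A minimal sketch of that rollover, using Python’s ctypes to mimic a 32-bit signed accumulator (Python’s native integers never overflow, so the wraparound has to be simulated; the function name is illustrative, not anything from the CRU code):

```python
import ctypes

def sum_of_squares_i32(values):
    """Sum of squares using a simulated 32-bit signed accumulator,
    the way a careless C or Fortran INTEGER*4 loop would behave."""
    acc = 0
    for v in values:
        acc = ctypes.c_int32(acc + v * v).value  # wraps past 2**31 - 1
    return acc

small = sum_of_squares_i32(range(100))   # fits comfortably: 328350
big = sum_of_squares_i32([46341] * 5)    # 46341**2 exceeds 2**31 - 1

print(small)
print(big)   # negative: each addition wrapped around
```

A sum of squares is nonnegative by construction, so a negative result is an unambiguous red flag, which is what makes the README author’s bafflement so damning.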

Comment from Mrs. Peel
Time: December 11, 2009, 10:53 pm

Very educational post by Iowahawk, if you haven’t seen it yet.

Comment from EW1(SG)
Time: December 11, 2009, 11:50 pm

Mrs. Peel, you have my undying love and gratitude forever for leading me to this particular Burgeian nugget:

I’m also a big believer in learning by doing; if you really want to know how a carburetor works, nothing beats taking one apart and rebuilding it. That same rule applies to climate models.

I will treasure that amongst my stolen pet sayings for at least two weeks!

/Oh! Shiny! Buh-bye!

Comment from Waterhouse
Time: December 12, 2009, 1:16 am

Now if only Zazzle made an electronic shirt that could display an animated .gif.

Comment from JD Will
Time: December 12, 2009, 12:15 pm

David Gillies,

I am also a software engineer, though my work (mfg/commercial) is not so grand as yours. I too was shocked at Harry’s travails, though I have cleaned up similar messes taking on some legacy systems.

I am interested in your opinion on two things:

1. This reconstruction of the reconstruction. Who knew IowaHawk was more than a hot rodding comic genius?

2. Wouldn’t the GIGO principle invalidate any analysis of the GCMs themselves? Harry’s README told me that the data provenance was uncertifiable and cherry pickers could have had a field day with the ‘value added data’.
There also is lively web discussion going on as to the shifting nature of the land based temperature record.

It may truly be a house of cards, but I am waiting for the science to get better, because there is a basic prima facie argument for incremental AGW via GHG that has not been disproved.


PS – Weasel, loved the GIF, spot on!

Comment from Mrs. Peel
Time: December 12, 2009, 3:16 pm

heh, Iowahawk now has three links to his post in this thread alone. Time to move him up to the front page, Weas?

Comment from David Gillies
Time: December 12, 2009, 4:04 pm

1) I assume anyone as screamingly funny as David Burge is by default of exceptionally high intelligence and probably polymathic. In my experience the smarter someone is, the funnier he is. The converse is true.

2) We don’t know. All we have is the time series analysis code and data from CRU. The GCMs are a mystery. They could be written with the same shocking level of incompetence as the CRU stuff; they could be shining examples of good software engineering practise. Unless and until we see the code, we don’t know. GIGO is always there in potentia, but we can’t tell if it applies in this situation. We do know that the GCMs have failed to predict the decade-long period we are in where there has been no net warming, but we cannot meaningfully critique them without the code and (almost as crucially) the boundary conditions.

Comment from James
Time: December 13, 2009, 1:04 pm

David, I was not trying to make excuses, just explain how sloppiness turns into policy.
I agree that it is insane to try to plan intrusive and expensive public policies on the basis of Monte Carlo programs that can’t even seem to get the sign of the change right (I’m back in Wisconsin now, and spent an hour chipping global warming off the sidewalk). If you want to look at our Monte Carlos, feel free: pythia, herwig, etc. are all available for scrutiny. (I remember running across some dumbness in Isajet, so I won’t claim they’re bug-free.) Quite a lot of money was risked on those models when we built the detectors, but at least the models more or less work: under conditions X and Y pythia predicts Z, and when you do the experiment you find something that looks a lot like Z; so we guess it will still work when you have X and Y+epsilon. We have people devoted to refining the programs and trying to find the limits to their validity.
Where are CRU’s programs? Data analysis can be hairy because you have to know a lot about the measurement (calibration, variability, how it correlates with other measurements, failure rate, etc) before you know what it means, but the predictions should be a lot easier to understand.
If they can do something as simple as predict the average temperature in western India for the next decade, then maybe we can start talking. About whether the “warming/cooling/neither” is going to be a bad thing, for instance.
FWIW, you might be gratified to know that the software frameworks for our experiments were written by pros, and tested and code reviewed. The analysis modules–um… we find most of the bugs when funny things happen during use.

Pingback from Climate Change In Context « ricketyclick
Time: December 13, 2009, 1:08 pm

[…] favorite weasel-girl did an early version of this using WUWT’s original […]

Comment from S. Weasel
Time: December 13, 2009, 6:42 pm

Nethicus did a version of this that’s a lot better than mine and is going viral 🙁

On a happier note, though, I got t-shirts and postcards and greeting cards. Send your favorite hippie a taunting postcard!

Comment from David Gillies
Time: December 13, 2009, 8:57 pm

James, sorry; I wasn’t trying to be rude. But I think we’d both agree that lay people would be amazed at the general shoddiness of academic codesmithing. Engineers tend to be better than scientists, probably because they approach the field as an engineering discipline rather than just a problem-solving tool. There are examples of good practise in scientific computing (I fondly remember a summer spent validating a corner of the Swedish Lund University code used to simulate the LEP particle accelerator: specifically the method for calculating the ‘thrust’ of an interaction, which is effectively the sum of the absolute values of parton momenta – I wrote a technique completely orthogonal to the standard one and validated the model, for low multiplicity events anyway, i.e. fewer than five or six jets. Most of the code had been checked in this fashion. FORTRAN 77 on a VAX 8650 under VMS/DCL with a greenscreen VT100 terminal! Happy days. And I know for most people this is the compsci equivalent of glossolalia).
