Monday, Mar 19, 2012

Climate models for politicians

Some weeks ago, I invited readers to improve upon parts of a summary of global warming science, written by Julia Slingo for the benefit of readers in central government. The ground covered was mainly about surface temperatures. At some point I may well write this up into something more formal.

I think it would be interesting to say something about climate models and their uncertainties too, and I have been giving this some thought. My knowledge of climate models is somewhat sketchy, so some of my understanding may be incorrect, but here's the ground I think central government really ought to understand:

1. Climate models are based on well-understood physical laws. There is wide agreement that on its own a doubling of CO2 levels would produce an initial warming of around 1degC (a back-of-envelope check of this figure follows the list below).

2. However, the knock-on effects of this initial warming ("the feedbacks") are not well understood, particularly the role of clouds.

3. Most climate models suggest a warming of 2-6degC/century. It is not clear that this range actually covers the full envelope of possibilities because of uncertainties over the feedbacks.

4. The temperature predictions of climate models cannot be tested in the short-to-medium term; 30 years is required to properly assess their performance.

5. However, climate modellers derive comfort that their models are reasonable approximations of the climate system from a number of observations:

  • their models generally replicate the Earth's temperature history ("hindcasts"), although it should be noted that even models encapsulating very different sensitivities to CO2 can do this, demonstrating that the models are fudged.
  • some models spontaneously reproduce features of the real climate, such as the PDO and El Nino, although not well enough to make such models useful predictive tools.

6. However

  • when the detail of the "hindcasts" is examined, it is found that the models do not in general replicate regional temperature histories.
  • to the extent that models have had their predictions tested against outcome, their performance has been poor.
  • no model has been shown to be skillful at regional forecasts.
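As a rough check of the figure in point 1, here is a minimal sketch in Python. It uses the standard simplified forcing expression (about 5.35 ln(C/C0) W/m2) and the no-feedback Planck response at the Earth's effective emission temperature of roughly 255 K. These are textbook values; the script is purely illustrative and is not any model's actual code.

```python
import math

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def co2_forcing(c_new, c_old):
    """Standard simplified CO2 forcing expression, W m^-2."""
    return 5.35 * math.log(c_new / c_old)

dF = co2_forcing(2.0, 1.0)  # a doubling, whatever the absolute levels

# No-feedback (Planck) response: linearise F = sigma*T^4 about the
# effective emission temperature T_e ~ 255 K, so dT = dF / (4*sigma*T_e^3).
T_e = 255.0
planck = 4 * SIGMA * T_e**3  # ~3.8 W m^-2 per K

print(f"Forcing for 2xCO2:   {dF:.2f} W/m^2")         # ~3.7
print(f"No-feedback warming: {dF / planck:.2f} degC")  # ~1.0
```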

Have I got anything wrong? Have I missed anything? I also wonder if politicians actually need to know about feedbacks and physics and sciencey stuff like that. Don't they just need to know how good the models are?


Reader Comments (81)

Mar 19, 2012 at 11:07 AM | Billy Liar

Very well said. Let me emphasize for readers that the physics precedes the model and stands on its own. The model does not "add to" the physics in any way at all.

Mar 19, 2012 at 5:28 PM | Unregistered CommenterTheo Goodwin

"No one thought that climate models are based on alchemy"

Actually Theo, I do think that climate models equate to alchemy - a complete and total waste of time and money, for all of the reasons given above.

Mar 19, 2012 at 6:24 PM | Unregistered CommenterRoger Longstaff

Now that we are on page two, may I ask my questions again? If they are stupid, don't hesitate to point that out; I won't mind.

1. Do they throw away results or abort runs where things are 'obviously wrong'?

2. Did anybody ever model one square metre (or any small area) of surface and compare to measured heat flows?

Mar 19, 2012 at 6:28 PM | Unregistered CommenterRhoda

Climate model : Science ≈ Tower of Babel : Architecture

Mar 19, 2012 at 6:37 PM | Unregistered Commenterjorgekafkazar

@Rhoda

Good questions. I would ask the modelers over here: http://www.climate-lab-book.ac.uk/. They are 'Experimenting in open climate science – open for contributions' (that is their tagline, anyway).

My guesses would be...(but you should ask the modelers):

1. I doubt that they throw 'undesirable' results away. As the models just return assumed 'forcings' with noise - there isn't any doubt that they'll give the expected upward trend in temperature - and cherry picking is therefore not required.
2. The same answer as 1 pertains I believe. As the modelers effectively feed in the average temperature change per square meter as 'forcings' - this will eventually be returned to them.

The only thing that the GCMs do is introduce random noise (caused by long term turbulence and energy transfers) on the long term temperature trends which have been input into them.
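To make that picture concrete, here is a deliberately crude toy sketch of the view just described: a prescribed long-term trend plus autocorrelated noise. All numbers are invented, and of course this is not what a GCM actually computes.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2101)

# Prescribed long-term forced trend (the 'input'), degC relative to 1900
trend = 0.008 * (years - years[0])

# AR(1) noise standing in for internal variability (turbulence, energy
# transfers between reservoirs)
noise = np.zeros_like(trend)
for i in range(1, len(noise)):
    noise[i] = 0.9 * noise[i - 1] + rng.normal(0, 0.08)

temperature = trend + noise

# Whatever the noise does decade to decade, the century-scale answer is
# the trend that was put in - which is the claim being made above.
print(f"input trend over run:  {trend[-1]:.2f} degC")
print(f"model 'output' change: {temperature[-1] - temperature[0]:.2f} degC")
```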

A similar question to 2 might be 'has anyone ever created an experimental model of the atmosphere realistic enough to help understand climate'. When people try this (e.g. Jasper Kirkby) they become unpopular with climatologists, for some reason.

Mar 19, 2012 at 6:47 PM | Unregistered CommenterZT

I wouldn't bother spending time debunking the models. It won't be long before they have failed to the extent that even politicians can see it. Just put your feet up.

Mar 19, 2012 at 7:07 PM | Unregistered CommenterJames Evans

Just because a model is described as being based upon "well-understood physical laws", it does not necessarily follow that those physical laws have been correctly implemented, or even understood, in those models. As noted by others, the large range of IPCC model "projections" suggests that those "well-understood" physical laws are not implemented correctly or not understood by most IPCC modellers. And what if the system is inherently unpredictable within useful boundaries? Well, the conversations will be steered well clear of such obstacles. They won't go there. And the policy-maker doesn't need to.

Policy-makers may not feel able to follow the technical details presented to them, but there are two key questions that are available to anyone and everyone.

1)
Test the models against predictions/projections. NOT HINDCASTS. I can get you a set of equations in less than a day that would give an apparently good match with historical temperatures. The criteria of success and failure must be determined in advance and then adhered to. Drug companies cannot change the criteria of success in a clinical drug trial as the trial progresses. If it helps, policy-makers should first wipe their minds clear of anything to do with climate. Then pretend that the models being presented are predicting the stock-market, and ask whether they would like to invest some of their own money. That will help concentrate the mind wonderfully.

2)
When the predictions are initially made, ask: what would a reasonable prediction be that demonstrates genuine predictive skill and not luck? This is probably the harder question. Think of the stock market analogy again. Will the returns be better than the rate charged by the central bank or offered on low-risk deposits? Saying "it will go up a bit" or "down a bit" gets nothing as a prediction. Nul points. 0/10. Zero. Like stock-markets, temperatures have been going "up a bit" for a century, with some "down bits" mixed in.
http://www.woodfortrees.org/plot/hadcrut3gl/from:1912/to:2012
Many centuries, actually, and will probably continue to do so.

Of course people say similar kinds of things all the time about the stock-market [and some get very well paid for sounding credible]. Later, disappointed investors get pointed to the small print which explains how the investor had misunderstood what the analysts really meant. But if they really knew, then they probably wouldn't be telling us, would they? If they said "it will go up next week, and down the following week" then they start to get a little bit of credibility, if it comes true. But not much.

Global temperature trends have been essentially static, or even declining, for over a decade now.
http://www.woodfortrees.org/plot/hadcrut3gl/from:2002/to:2012

Yet I can't recall seeing anything similar in IPCC predictions from over a decade ago. And with no recent volcano prepared to take the blame, either. "Well, we need 30 years to be sure, Mr Policymaker, Sir." Reply: "You didn't say that very loudly more than 20 years ago when temperatures were rising, did you?" Doubtless the policy-maker will then hear reasons about all the aerosols from China being unexpected [or not well understood!]. Well, they weren't a surprise to me, and probably not to the policy-maker either. Or they will hear about "ENSO, PDO, PMO" etc. The policy-maker can then say "But you didn't predict them, did you? Does that mean you didn't understand it before, when you pretended that you did? Or is that just another "well-understood physical law" that you forgot to put into your model?"


Honestly. This is an easy game. Anyone can play it, even policy-makers. In fact they probably invented it.

Mar 19, 2012 at 7:15 PM | Unregistered Commentermichael hart

Aerosol cooling is used as a fiddle factor to fit the hindcasts: the climate models with the highest CO2 sensitivity also have the most cooling aerosols.
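To make the trade-off concrete, a schematic sketch, assuming (purely for illustration) an equilibrium relation dT = S x (F_ghg + F_aer); every number below is invented:

```python
# Two toy models, both required to hindcast the same observed warming.
observed_dT = 0.7   # degC over the 20th century (illustrative)
F_ghg = 2.6         # W/m^2 of greenhouse forcing (illustrative)

for S in (0.5, 1.0):  # climate sensitivity, K per (W/m^2)
    # Solve dT = S * (F_ghg + F_aer) for the aerosol forcing that fits.
    F_aer = observed_dT / S - F_ghg
    print(f"S = {S}: needs aerosol forcing {F_aer:+.2f} W/m^2")

# The high-sensitivity model needs much stronger (more negative) aerosol
# cooling to match the same history - which is the pairing noted above.
```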

Mar 19, 2012 at 7:32 PM | Unregistered CommenterHans Erren

The failure of the models to prove anything is apparent from a key change by Trenberth and Kiehl from their 1997 to 2009 Earth Energy Balances.

1997: http://www.cgd.ucar.edu/cas/abstracts/files/kevin1997_1.html

2009: www.cgd.ucar.edu/cas/Trenberth/trenberth.papers/TFK_bams09.pdf

Minor differences in the atmospheric flows apart, the latter adds 0.9 W/m^2 retained in the earth. This is to give the impression that heat is accumulating in an unknown way in the oceans so as to purport that steady atmospheric temperatures are no guarantee we won't have major warming in the future.

The fact is that the climate models make the unique-to-climate-science assumption that Prevost Exchange Energy can do thermodynamic work. It is this which gives imaginary positive feedback.

Also, they assume incorrect aerosol optical physics for clouds which gives dramatically higher cooling than reality [optical depth of low level clouds is double reality.]

Add to this faulty IR physics and a 3.7 times too high estimate of present GHG warming and it's a mess: no model can predict climate.

Mar 19, 2012 at 7:33 PM | Unregistered Commentermydogsgotnonose

There is absolutely no reason why politicians need to know much about climate models.
My wife knows very little about the workings of motor cars, but is an excellent driver, which allows her to make sensible decisions while using a motor car.
While the general public knows the difference between a Formula One car and a tractor, in very general terms, I am not convinced climate scientists know as much about modelling the climate as the general public knows about motor cars. There are a number of climate scientists I would not buy a used climate model from, in any sense of the word 'buy'. Which raises the real question: how relevant to the world are climate scientists?

Mar 19, 2012 at 8:31 PM | Unregistered CommenterAlexander K

Apocryphal models and a True Story.
As a student I once attended a wonderful talk given by Professor Philip Eaton from the University of Chicago. Famous for being the first Chemist to make the molecule “Cubane” in 1964, he gave a talk describing the far more difficult synthesis of “Octanitrocubane.”

This molecule has an elegant structure with a simple beauty that can be admired by all: Mathematicians, Chemists and Military Generals.
The latter also liked it because it was predicted to be a better chemical explosive than any they had in their armoury. The calculations with quantum-mechanical and thermochemical computer models are relatively easy to perform today, with good levels of accuracy.

The potential fly in the ointment, though, was that a good explosive also needs to have a sufficiently dense crystal structure, even if the thermodynamics are good. Such crystal-structure calculations are far less reliable as predictions. The military knew roughly what was needed [and so did the modellers] and provided the funding for the work. The initial calculations suggested that Octanitrocubane would fail by this criterion, but they set off in hope. As Prof Eaton and his students laboured their way slowly towards the final triumph, so did the modellers. At intermediate updates the modellers announced a series of incremental improvements suggesting that the crystal structure might indeed be close enough to that desired.

In a masterful demonstration of synthetic chemistry, the molecule Octanitrocubane was finally synthesised in 1999. But, yes, you guessed it, the structure of the isolated crystal proved inadequate. A modelling fail. [In fact cost alone would probably have ruled out its widespread use, whatever the outcome.]

The computer models were wrong. Wrong and, as I recall, going in the wrong direction: towards what the sponsors needed, but not where reality was pointing. Professor Eaton did not make any accusations of researcher bias, incompetence, or malpractice. He just drew it to our attention for consideration.

Mar 19, 2012 at 9:09 PM | Unregistered Commentermichael hart

There's another issue with the IPCC heat flows. It's taken a long process of distillation to get there.

Use the data with the incorrect assumptions of 'back radiation' and of the earth's surface emitting at the black body level into a vacuum, so that the sum of radiation and convection exceeds the real energy input (a perpetual motion machine), and the figures predict high AGW with high positive feedback.

Use the same data correctly, without assuming 'back radiation' and with the sum of radiation and convection equal to the energy input, and the figures predict much lower AGW and little positive feedback.

What are the odds that it was carefully set up to deceive?

Mar 19, 2012 at 9:47 PM | Unregistered Commentermydogsgotnonose

Rhoda -- I'll chip in on your questions as well, but with a different take from ZT.

1. Of course, they throw out bad results. I've seen reports of runs that have taken months to execute, with results that look "crazy", so are not used any further. Fundamentally, this must be so to an extent. No one will ever get something that is at all complex completely right the first time. So corrections must be made. The fundamental problem in something tremendously complex is that it is virtually impossible to tell when you are no longer correcting and are now "tweaking", "tuning", or even "fudging".

2. Due to computational speed limits, no global climate model can get anywhere near 1 square meter resolution. Most of them use blocks that are about 100km (horizontally) on a side. It is very difficult to compare model results for these to measurements, first given the difficulty of sampling enough points, and second given the difficulty of making comprehensive heat flux measurements anywhere.

But an example that made many people wide-eyed occurred about a decade ago, when NASA scientists used the GISS model to evaluate the earth's overall energy imbalance. Their stated result was that this global imbalance, averaged over the earth's surface and over a year in time, was 0.85 +/- 0.15 W/m^2. But their intermediate results showed that some of the internal energy flux densities between "blocks" of the model were off from observations by 80 (not 0.80) W/m^2. This raised a big red flag for many people, who wondered how the model could produce known internal errors hundreds of times bigger than the supposed final uncertainty.

By coincidence, today I received a reprint of a column I read 20 years ago by a man named Bob Pease, one of the best analog electronic designers ever. He died last year, and an electronics trade press is republishing a series of his "best of" columns. One of his recurrent themes was the dangers of over-reliance on computer models in any field. This particular column was on spreadsheets; others ranted about circuit models (like SPICE, for those of you familiar with the field).

Anyway, this column from 1992, entitled "What's All This Spreadsheet Stuff, Anyhow?", starts:

***********************

"Sometimes people ask for my opinions on spreadsheets. I usually reply that they are perfectly fine on beds. If the people persist in further inquiry, I ask them to show me a definition of the word in a dictionary. Of course, that slows them down a little, because that’s a word too new to be in most dictionaries. But when they try to explain what a spreadsheet is, I say, “Oh, yeah, that’s one of those computer programs that makes mistakes, and lies a lot. We used to use those, but they made too many errors.” Which, alas, is partly the truth.

At one time we had a procedure for computing Return On Investment, using paper and pencil and maybe a calculator. It worked fine. It actually took you 3 or 4 minutes to crank through a whole set of numbers, and if you changed one input, you could modify the rest of the computations in 2 or 3 minutes. Then a new guy showed up one day, and he said he had computerized the whole thing. Just plug in the numbers, push a key, and out come the results. If you wanted to try a different set of numbers, just plug them in and, Bingo, you get the new results. You could print out a dozen different versions on a stack of paper that weighed only 2 or 3 pounds…

About a year later, one of the managers, a singularly suspicious fellow, asked why the Spreadsheet’s outputs did not match with the numbers he had calculated at home at midnight. After a good bit of checking, he found that the Spreadsheet was doing it wrong—some kind of systematic error had been programmed in, and all of the ROIs for the last year had this error, and nobody had ever questioned it. Sigh… So much for people who always trust a computer, because nothing can go wrong..?!? Fortunately, we did not have to go back and correct all of those errors, because those ROI numbers don’t mean anything, anyhow.…"

********************

The full column can be found at:

http://electronicdesign.com/article/articles/what-s-all-this-spreadsheet-stuff-anyhow-13965?cid=ed_powernewsletter&NL=1&YM_RID=cwilson@deltatau.com

It continues with more "physics-based" examples, and has a lot of insights relevant to climate science, IMHO.

Mar 19, 2012 at 10:28 PM | Unregistered CommenterCurt

Andrew - I'll add a modelling perspective as I've done lots of iterative and some statistical modelling.

As far as I can tell the GCMs are fitted to a training period of data (eg temperatures) using effectively a (multiple) regression approach. If you are doing classic multiple regression and you leave out a significant independent variable which is covarying (ie solar magnetic, ocean cycles) then the regression method will assign too large a value to the parameter (ie 'a' in y = ax + b). That is why climate sensitivity comes out high in the GCMs: solar activity rose across the 20thC training period, as did the ocean cyclic component of temperature. Between them they were responsible for about 5/6th of the temperature rise, so it is no surprise the IPCC value for climate sensitivity expressed as '2XCO2' is about 6 times too high (compared to the CERES and ERBE directly derived values), since these variables were left out of the models.

It's easy to demonstrate using a multiple regression example - just set up a few datasets, of which two or three are roughly covarying, do a multiple regression model and record the parameters. Then leave out the significant covarying variables and see what happens.
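A minimal sketch of that demonstration, with invented coefficients and NumPy least squares standing in for the regression package:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100  # e.g. years in a training period

# Three roughly covarying 'drivers', all rising over the period
co2   = np.linspace(0, 1, n) + rng.normal(0, 0.02, n)
solar = 0.8 * np.linspace(0, 1, n) + rng.normal(0, 0.05, n)
ocean = 0.6 * np.linspace(0, 1, n) + rng.normal(0, 0.05, n)

# 'True' temperature: CO2 contributes only a small part (coefficient 0.3)
temp = 0.3 * co2 + 1.0 * solar + 0.8 * ocean + rng.normal(0, 0.05, n)

def fitted_coefs(predictors):
    """Ordinary least squares with an intercept column."""
    X = np.column_stack(predictors + [np.ones(n)])
    coefs, *_ = np.linalg.lstsq(X, temp, rcond=None)
    return coefs

print("full model, CO2 coefficient:     %.2f" % fitted_coefs([co2, solar, ocean])[0])
print("CO2-only model, CO2 coefficient: %.2f" % fitted_coefs([co2])[0])
# Leaving out the covarying drivers inflates the CO2 coefficient several-fold.
```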

Mar 19, 2012 at 10:36 PM | Unregistered CommenterBruce of Newcastle

I think most people are assuming that politicians have some knowledge of science. This is probably not the case. I would say:

Climate models do not predict the future - not even the IPCC claim that. The IPCC call model outputs merely "projections". Some models cannot even reliably predict the past!

They are toys for climatologists and should not be used for policy making.

Please be aware that many alleged climate experiments are based on the output from these models and not on real life observations.

Mar 19, 2012 at 11:25 PM | Unregistered Commentergraphicconception

You want to keep it simple with politicians. Your list starts off way too tough.

Start from a broader perspective. Remember, politicians think "computer models and physics" and will get lost. By the time you reach hind-casts they will be scratching their heads thinking "what the heck?"

I highly recommend starting off with an introduction that gives an illustration of what the models do, based on weather models first (or the stock market, or something that they as politicians can understand).

Otherwise, you will lose them before you can even begin.

Second, several posters have touched on the largest problem with hind-casting in general: how the heck does every GCM hind-cast the 20th century perfectly with such vastly different forcings for CO2 when it's the "dominant climate forcing"?

Again, an assumption built into the models...and again one of those things that you have to touch upon with an analogy.

But in the broad sense, if you want to re-trace how Hansen et al went wrong, just go back to their days working on the planet Venus and how they were wrong there....and how they simply applied their equations from Venus to Earth. They never fixed their mistakes there, and as such, with new information today, we know they were off by a factor of 10.

That should be enough to realize that Lindzen et al are on the right track and are probably doubling the actual effect of CO2....But that is another topic probably.

In actuality models tell us nothing of real science, and for politicians you can tell them that. They are simply a projection of what the modeler thinks reality will unfold as.

If the modeler has bias built in, either on purpose or by accident, the model will not show reality at all, but some sort of fairy tale. In the case of a weather model, if a weather modeler wants to see Miami destroyed, they will show a category 5 storm smash the city to bits whether or not it will happen, by unintentionally influencing the model. In the case of the stock market, if they want to see a certain stock do well, they influence the model to show it doing well. Either case shows that observer bias can and does have a serious impact on the reality shown by modelling.

In which case, models are only worth as much as the objectivity of the programmer, and in the case of climate scientists who believe without a doubt that CO2 has an impact on the planet with no proof, well, it goes without saying that they will find that CO2 does indeed have an impact.....

Mar 20, 2012 at 2:41 AM | Unregistered CommenterBenfromMO

At the start of my career I was told that when you check a document you should look not only at what is written but what is not written. To some extent we have fallen for this with climate models – we scrutinise what the modellers tell us but don’t ask about what they keep hidden.

There is some divergence between models when temperatures are expressed as anomalies. There’s an enormous amount when temperatures are expressed as Celsius relative to zero.

The models only represent the underlying trend of the 20th century temperatures. They do not represent the true rate of temperature change.

Models are very bad at modelling precipitation. This is important as the feedback mechanism depends on modelling how much of the extra water vapour remains as water vapour and how much falls as precipitation.

Mar 20, 2012 at 6:13 AM | Unregistered CommenterRon

Most people, including many scientists, have been fooled by the pseudoscience that is hidden in the climate models. Every single result produced by the use of radiative forcing is invalid. The results from all of the models that use the empirical radiative forcing constants given by the IPCC are fraudulent.

The full explanation is a little involved, but here goes:

In order to understand the issues it is necessary to go back to the beginning and look at the original 1967 paper by Manabe and Wetherald [J. Atmos. Sci. 24 241-249 (1967), ‘Thermal equilibrium of the atmosphere with a given distribution of relative humidity’]. This can be found at http://www.gfdl.noaa.gov/bibliography/related_files/sm6701.pdf
The assumptions made in the derivations are clearly stated on the second page.

The Earth’s climate is stable, which means that there is an approximate energy balance between the incoming solar radiation and the outgoing LWIR flux somewhere ‘at the top of the atmosphere’. This is simply a statement of the First Law of Thermodynamics. The energy is conserved – more or less. However, this does not justify the assumption that an ‘average climate equilibrium state’ exists in which the solar flux is exactly balanced by the LWIR flux. This ignores the time dependence of the energy transfer and avoids the application of the Second Law of Thermodynamics to the surface temperature flux. It assumes that the sun is shining all the time with a single average flux. All of the subsequent mathematical derivations of the equilibrium flux equations and the use of perturbation theory have no basis in physical reality. There are no forcings or feedbacks because there is no equilibrium in the real climate.

Assumption 5 in Manabe and Wetherald is that ‘The heat capacity of the Earth’s surface is zero’. It is assumed that the Earth’s surface temperature is set by black body equilibrium, which is most definitely not the case. There is a significant lag between the peak solar flux and the peak surface temperatures that is characteristic of a classic thermal storage oscillator. The daily surface temperature lags by up to ~2 hours and the seasonal surface temperature lags by a couple of months. Latent heat and wind driven evaporation are also fundamental in setting the surface temperature, especially over the oceans. The heat capacity of the ground is somewhere around 1.5 MJ.m-3 and that of water is ~4 MJ.m-3.

It is also important to understand that the ‘equilibrium average surface temperature’ calculated by radiative forcing is not a measurable climate variable. It has no existence outside of the ‘equilibrium Earth’ that resides only in the universe of computerized climate fiction found inside these radiative forcing models.

If we use a reverse argument, an increase of 1 K in a black body surface at 288 K will produce an increase in black body flux of 5.5 W.m-2. This is similar to the increase in downward LWIR flux that would be produced by a doubling of the atmospheric CO2 concentration (to 580 ppm). Reliable atmospheric radiative transfer calculations using HITRAN, untainted by climate science, give an increase in flux of about 4 W.m-2. However, there is simply no connection between either of these two numbers and the real surface temperature. The whole radiative forcing argument is nothing more than climate theology.

During the middle of the day, under full summer sun conditions, the peak solar flux is ~1000 W.m-2. The corresponding surface temperature is at least 50 C for dry ground. The increase in LWIR flux for a black body going from 288 to 323 K (+35 C) is about 227 W.m-2. Most of the solar heat is dissipated by convection. Heat is also stored below the surface and released later in the day. It is just impossible for a small change in LWIR flux from CO2 to have any effect on surface temperature when it is added correctly to the surface flux balance. At night, convection more or less stops and the surface cools mainly by LWIR emission. The downward LWIR flux from the first 2 km layer of the atmosphere slows the night time surface cooling, but the atmospheric heating process is controlled by convection, not LWIR radiation. Furthermore, this is not an equilibrium process. The troposphere acts as two independent thermal reservoirs. The upper reservoir radiates to space all the time, mainly from the water bands near 5 km. The lower reservoir acts as a night time ‘thermal blanket’. The atmosphere is an open cycle convective heat engine with a radiatively cooled cold reservoir.
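As a rough check of the black-body arithmetic in the last two paragraphs (Stefan-Boltzmann law only; a sketch, not anyone's model):

```python
SIGMA = 5.67e-8  # W m^-2 K^-4

def bb_flux(T):
    """Black-body flux at temperature T (kelvin)."""
    return SIGMA * T**4

# +1 K on a 288 K black-body surface
print(f"{bb_flux(289) - bb_flux(288):.1f} W/m^2")  # ~5.4

# 288 K (15 C) up to 323 K (50 C dry ground in full summer sun)
print(f"{bb_flux(323) - bb_flux(288):.1f} W/m^2")  # ~227
```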

Manabe and Wetherald were quite honest about what they were doing. They simply produced an invalid hypothesis that should have been superseded. Later workers just allowed themselves to be seduced by the mathematics of the flux equations and never bothered to validate the models or investigate the real physics. The result is the global warming dogma that we still have today. And the associated corruption.

Now we get to the fraudulent part. Manabe and Wetherald were quite clear that they were calculating a surface temperature, however it was defined. This means the temperature on the ground that we feel with our bare feet. However, there is no long term record of the surface temperature. Instead, the meteorological surface air temperature (MSAT) was substituted for the surface temperature. This is the ‘weather temperature’ that is the air temperature measured in a ventilated enclosure placed at eye level, 1.5 to 2 m above the ground. It is simply impossible for there to be any observable change in MSAT caused by a small change in LWIR flux at the surface below the weather station enclosure.

The MSAT or weather record has been treated as a mathematical number series to be manipulated to ‘prove’ that global warming exists. The correct interpretation of the record is that the minimum MSAT is, approximately, with caveats, a measure of the heat content (lapse rate) of the air mass of the weather system as it passes through. This is usually an indicator of the ocean surface temperatures in the region of origin of the weather system. The maximum MSAT is a measure of the daily surface convection at the MSAT thermometer produced by the solar heating of the ground. This convective heating is added to the minimum MSAT.

The US continental MSAT record shows a clear peak in the 1930s and 1940s from the dust bowl drought and a second peak from the recent modern maximum. These peaks can be explained using variations in ocean surface temperatures, particularly the Atlantic Multidecadal Oscillation (AMO) and the Pacific Decadal Oscillation (PDO). The observed global warming signal is nothing more than the warming phase of these ocean oscillations with some urban heat island effects and plain old data fiddling added. This has been explained by Joe D’Aleo and others. The oceans and the Earth’s climate have been cooling for over 10 years.

The climate models have been rigged using empirical pseudoscience as follows:

It has been assumed, a priori, that the observed increase in the MSAT record, the mutilated ‘average global temperature anomaly’, has been caused by an increase in atmospheric CO2 concentration. In particular it is assumed, without any justification whatsoever, that an increase in the downward LWIR flux from CO2 of 1 W.m-2 has produced an increase in the ‘average global temperature anomaly’ of 2/3 C. This allows the increase in LWIR flux from an increase in concentration of any ‘greenhouse gas’ to be converted to an increase in surface temperature. However, the real climate fluctuates and is now cooling, so ‘aerosol cooling’ has been added to compensate for the over-warming produced by just the LWIR flux from the greenhouse gases. Sulfate aerosols are used for basic cooling and volcanic aerosols are used for ‘fine tuning’. The hindcast is just an empirical fit and the models have no predictive skill whatsoever. The climate models are pure pseudoscience. They are nothing more than the hockey stick propagating itself with aerosol adjustments. This is all climate astrology, not climate science.

However, it is not sufficient to say what is wrong with the climate models, it is necessary to propose a viable replacement, or the global warming acrimony will continue ad nauseam.

Radiative forcing has to be replaced with a dynamic energy transfer model. There is no climate equilibrium and the time dependence of the energy transfer must be explicitly included. Once this is done, CO2 induced global warming disappears. The minimum description requires four thermal reservoirs and six energy transfer processes. This is discussed in the book ‘The Dynamic Greenhouse Effect and the Climate Averaging Paradox’. It is available on Amazon. There is also further discussion at http://www.venturaphotonics.com

There has to be a fundamental paradigm shift in the way we describe climate and climate change. There is no climate equilibrium on any time scale and all of the energy transfer processes are dynamic, not static. The sun only shines during the day. Sun, wind and water need no help from CO2 to set the Earth’s climate.

Sol Invicto Comiti

Mar 20, 2012 at 6:38 AM | Unregistered CommenterRoy Clark

Roy Clark, Mar 20, 2012 at 6:38 AM:

An impressive summary of the physics - both the mistakes of the "climate alchemists" and a sensible way ahead IMHO.

Would I be correct in assuming that this concurs with the recent work of Nikolov & Zeller, who have used Hölder's Inequality to more correctly describe energy transfer mechanisms? (noting that N&Z have concluded that only insolation, gravitation and the mass of a planet's atmosphere define "climate", regardless of the chemical composition of the atmosphere, and presenting empirical evidence for the theory).

I have come to the view that we should start with the only boundary condition that we understand - TOA, where only radiative energy transfer applies - and work backwards, with deterministic, dynamic, energy transfer calculations, and that ANY attempt to numerically simulate the "climate" (a non-linear, chaotic system with a large and uncertain number of dependent and independent variables) is fundamentally impossible.
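One piece of that argument can be shown in a few lines: because flux goes as T^4, averaging the flux and then converting to temperature gives a different answer from averaging the temperatures (the Hölder/Jensen point). A toy two-zone sketch, with invented fluxes:

```python
import numpy as np

SIGMA = 5.67e-8  # W m^-2 K^-4

# Two zones with very different absorbed flux (a day/night caricature)
flux = np.array([960.0, 20.0])  # W/m^2, invented for illustration

# Each zone in local radiative balance: T = (F / sigma)^(1/4)
T_zones = (flux / SIGMA) ** 0.25
print(f"mean of zone temperatures: {T_zones.mean():.0f} K")  # ~249

# Average the flux first, then convert
T_mean_flux = (flux.mean() / SIGMA) ** 0.25
print(f"temperature of mean flux:  {T_mean_flux:.0f} K")     # ~305

# Since x^(1/4) is concave, mean(F^(1/4)) <= (mean F)^(1/4): the order
# of averaging matters, which is the 'averaging' point referred to above.
```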

Mar 20, 2012 at 8:32 AM | Unregistered CommenterRoger Longstaff

Ok, thanks for the answers. I suspect ZT is right in that the models have been developed to give no bad results: they do not contain parameters which could go wild, or at least these have all been tweaked out.

I'm not sure I posed the square metre one clearly. Just one square metre. You know all the conditions. You get to measure an initial state. Then you model the heat flows and measure them too. Do the measurements match the model? No? Then you do not have your physical laws correct. My original idea was a piece of ocean surface. That might be too difficult. How about a piece of concrete in a Tucson car park. Measure upwelling IR, downwelling IR, over the diurnal cycle. Does it match CO2? Does it follow seasonal variations in CO2? What difference do clouds make? How much heat goes up, down and sideways? What about non-IR parts of the spectrum? Now it seems to me, if I were a scientist, that having formed my CO2 hypothesis, that would be my first stop. I don't need a satellite, just a patch of ground. Can I explain the heat flow in a patch of ground? If this work had EVER been done, would we still be arguing about Venus, or Sky Dragons, or gravitation (I don't buy N&Z)?

Mar 20, 2012 at 9:21 AM | Unregistered CommenterRhoda

Hi Bish

Interesting post, sorry I didn’t get to it yesterday.

Much of what you say is fair enough, there does need to be an awareness of the limitations of the models (and indeed we do make sure this is made clear when we talk to policymakers). However you won’t be surprised if I pick you up on a few points…. :-)

Most climate models suggest a warming of 2-6degC/century.

Actually it’s more like 1-6 degrees over the 21st Century (not the 20th). Giving a “rate per century” is not really appropriate as this depends to some extent on the emissions that are assumed as an input - the range you quote includes different emissions as well as the range of model responses. You are right that the latter is wide though.

The temperature predictions of climate models cannot be tested in the short-to-medium term

I think you mean the global mean response to greenhouse forcing, which changes gradually. The response to short-term forcings such as volcanic aerosols can be tested, and has been.

their models generally replicate the Earth's temperature history ("hindcasts"), although it should be noted that even models encapsulating very different sensitivities to CO2 can do this, demonstrating that the models are fudged.

Not really. The 20th century simulations vary between models, but there has not been enough of a CO2 rise so far to allow the differences between the models to become as apparent as they are in future projections.

Your point 6 applies to precipitation more than to temperature.

Cheers

Richard

Mar 20, 2012 at 9:52 AM | Registered CommenterRichard Betts

Richard Betts, Mar 20, 2012 at 9:52 AM:

"The 20th century simulations vary between models, but there has not been enough of a CO2 rise so far to allow the differences between the model to become as apparent as they are in future projections"

I am sorry Richard, but with all that has been written above, can you not see the glaring, logical absurdity in that statement?

Mar 20, 2012 at 10:11 AM | Unregistered CommenterRoger Longstaff

Richard

Re the precipitation point, are you saying that the hindcasting of regional temp is reasonable? (Not challenging you if you do - I'm somewhat in learning mode here).

Mar 20, 2012 at 10:19 AM | Registered CommenterBishop Hill

Andrew, re your point 5 about why modellers think that their models are reasonable approximations of the climate system, and the IPCC places faith in them, I think it might be worth quoting, in anything you write, the below comment about GCMs in a 2008 paper from the highly regarded MIT trio who have authored a number of studies involving climate models: Forest, Stone and Sokolov.

"Much of the work has focused on evaluating the models’ ability to simulate the annual mean state, the seasonal cycle, and the inter-annual variability of the climate system, since good data is available for evaluating these aspects of the climate system. However good simulations of these aspects do not guarantee a good prediction. For example, Stainforth et al. (2005) have shown that many different combinations of uncertain model sub-grid scale parameters can lead to good simulations of global mean surface temperature, but do not lead to a robust result for the model’s climate sensitivity.

A different test of a climate model’s capabilities that comes closer to actually testing its predictive capability on the century time scale is to compare its simulation of changes in the 20th century with observed changes. A particularly common test has been to compare observed changes in global mean surface temperature with model simulations using estimates of the changes in the 20th century forcings. The comparison often looks good, and this has led to statements such as: ”...the global temperature trend over the past century .... can be modelled with high skill when both human and natural factors that influence climate are included” (Randall et al., 2007). However the great uncertainties that affect the simulated trend (e.g., climate sensitivity, rate of heat uptake by the deep-ocean, and aerosol forcing strength) make this a highly dubious statement. For example, a model with a relatively high climate sensitivity can simulate the 20th century climate changes reasonably well if it also has a strong aerosol cooling and/or too much ocean heat uptake. Depending on the forcing scenario in the future, such models would generally give very different projections from one that had all those factors correct."

The "Randall et al., 2007" study that the above-quoted statement referred to as "highly dubious" comes from in fact constitutes the complete Chapter 8 "Climate Models and Their Evaluation" of AR4 WG1.

Pretty damning. I would likewise place very little weight on GCM estimates of key climate parameters such as sensitivity.

Mar 20, 2012 at 5:40 PM | Unregistered CommenterNic Lewis

“Climate models are based on well-understood physical law.”

This is an incorrect statement. The computer models utilize the “main-stream” science, which is flawed. Major flaws are: 1) The assumption that the atmosphere is transparent to solar radiation, which it is not. Our basic observations suggest that the atmosphere absorbs solar radiation. This assumption has in fact eliminated the atmosphere, an important earth subsystem, from the science. There can be no correct climate or atmospheric science without the atmosphere; 2) The assumption that carbon dioxide and water vapor exchange heat by radiation with their surroundings, including the atmospheric air, which is impossible. The reason is that, in order for radiation to occur, there has to be a temperature difference and an interface separating the radiant entities. Carbon dioxide is not a separate entity; it is part of a homogeneous mixture of the atmospheric air having the same temperature as that of the air. The only location where this air mixture can exchange heat by radiation is at the interface with the colder outer space at the mesopause elevation. As long as these flaws are not corrected, the climate models will continue to yield erroneous results.

Mar 21, 2012 at 1:00 AM | Unregistered CommenterNabil Swedan

I think that Clark's exposition shows the game is over. Climate science has set out with fundamental mistakes in many of its basic assumptions. The models predict an imaginary world which is calibrated to the real world by fudge factors based on yet more fantasy physics.

Thus they use double the optical depth of low level clouds compared with reality, and claim pollution increases cloud albedo when, by inhibiting droplet coarsening, the reverse is the case, as can be seen from the fact that when clouds prepare to rain they get darker underneath.

Time for a complete rethink: the game's over.

Mar 21, 2012 at 7:33 AM | Unregistered Commentermydogsgotnonose

Bishop Hill

Re the precipitation point, are you saying that the hindcasting of regional temp is reasonable? (Not challenging you if you do - I'm somewhat in learning mode here).

Basically yes, although the extent to which they are reasonable does vary from place to place according to the particular regional atmosphere-ocean phenomena that are important, eg: how strongly regional temperatures on certain land regions are affected by teleconnections to sea surface temperatures, which are more predictable as the ocean temperature varies more slowly.

Cheers

Richard

Mar 21, 2012 at 9:25 AM | Registered CommenterRichard Betts

A simple point, if you will allow. Farming (my occupation) is by its very nature a weather critical proposition. We scan the internet daily for weather information that might affect our very existence.

Naturally (and with all the alternative worldwide resources now poured into this industry), we surf the various channels looking vainly (and stupidly) for a better prediction that might suit our particular crop/cattle disposition. We are, after all, only human. But we do watch, and we do scrutinize it all in minute detail (pressure, wind, temp, etc).

In South Africa, we have available a) The National Weather Service.Gov, b) Wunderweather.com, c) the Norwegian yr.no, and d) Kobus Botha Satelite Weather (a "Most Excellent by any standards" site). Bear in mind that they all use the same basic information base available to all in order to make their predictions.

The fact is, they are all to some degree different. Some will have a 40pct chance of rain, while the others will have nil. How can this be?

Farmers (yes, we have some time on our hands) are continuously seeking some sort of certainty from what is clearly, to us, an uncertain science. We have however one thing in common that the city folks do not. We watch the weather forecasters and their forecasts continuously. And we remember when they were wrong. Not just tomorrow's weather, but up to seven days in advance. I doubt if the city folks go more than 2/3 days ahead.

We see painfully, in real life terms, the cost of the decision to cut Lucerne (Alf-Alfa to the Americans) knowing there would be dry days ahead before baling, only to find that not only do forecasters get it wrong up to 50 pct of the time (this kind of gets burned into your psyche, watching your crop rot in the field), but they are answerable to nobody.

And why should they be? It is, after all, only a "PREDICTION".

So, here's the point (sorry it took so long).

Why should I trust my future to persons' 'predictions' of what may or may not occur in 30 years time, when they have yet to display the competency to 'predict' reliably/accurately, in advance, what the weather will be in 24 hrs time?

While I have an audience, one other little bugbear I would like to relieve myself of, that none of the media ever seems to recognise. Rising Oceans.....

My Geography school text book had a wonderful diagram of the life of a coral atoll. How the volcano slips beneath the waves and coral begins to accumulate and grow on its highest point. It continues to sink, however. The fact is, it is inevitably doomed by the laws of physics to eventually disappear below the sea.

All these "Coral Island Nations" ( Atoll nations ) will eventually sink. So sad, but not my problem. No more than it is when people build their homes on the spring high tide mark. Of course , to them, the sea is percieved as rising..... And, the villain is the Industrialised nations and their Co2, and they must therefore pay....

This is nuts. At the most recent UN climate change conference in Durban, S.Africa, something like $600 billion was agreed to (not given yet) to satisfy this nonsense.... Oh, and they also predicted major droughts in ten years time for the horn of Africa that we should financially prepare for (no matter that the horn is in a drought region).

Please let somebody of real importance get up and say that the "Emperor has no clothes....."

Debunk this nonsense once and forever and pour resources in to the World's real problems.

I thank you for my vent....

Mar 21, 2012 at 10:51 AM | Unregistered CommenterFarmer Steve

'Yes, yes - all jolly interesting, dear boy - but let's toddle along to the Commons bar and let's see what we can work out in terms of increased taxation....'

Mar 21, 2012 at 1:04 PM | Unregistered CommenterDavid

Concerning the response to a simple doubling of CO2, you may find this paper (peer reviewed by no less than James Hansen) of interest.

Rasool S. & Schneider S., "Atmospheric Carbon Dioxide and Aerosols - Effects of Large Increases on Global Climate", Science, vol.173, 9 July 1971, p.138-141

This snippet:

"We report here on the first results of a calculation in which separate estimates were made of the effects on global temperature of large increases in the amount of CO2 and dust in the atmosphere. It is found that even an increase by a factor of 8 in the amount of CO2, which is highly unlikely in the next several thousand years, will produce an increase in the surface temperature of less than 2 deg. K."

Is of particular interest, because (as far as I am aware) the basic science relating to simple addition of CO2 in isolation has in no way altered since 1971.

Mar 21, 2012 at 10:16 PM | Unregistered CommenterCatweazle666

Bish: 1) The IPCC in AR4 calls its models "an ensemble of opportunity" because they were selected by sponsoring governments. The ensemble is not intended to cover the range of scientifically plausible models. The range of results obtained from the models is totally meaningless, because the models were not selected for any scientific reason.

2) Stainforth et al (Nature 2005, I think) took the Hadley GCM and randomly varied some of the parameters within known limits. They produced hundreds of plausible models with climate sensitivity ranging from 1.5 to 11 degC. They have been unable to prove that any of their new models are "wrong", ie inconsistent with observation. They have not explored varying any ocean parameters (too expensive computationally), especially thermal diffusivity, the rate at which heat penetrates below the mixed layer, which may be the most critical parameter.
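A toy sketch of why such perturbed-parameter ensembles produce a long upper tail: in the standard feedback formalism dT2x = dT0 / (1 - f), a modest spread in the total feedback fraction f maps into an enormous spread in sensitivity. The band of f values below is invented for illustration and is not Stainforth's actual parameter set.

```python
import numpy as np

rng = np.random.default_rng(1)
dT0 = 1.2  # no-feedback 2xCO2 warming, K (textbook value)

# Perturb the total feedback fraction within a plausible-looking band
f = rng.uniform(0.2, 0.9, 10_000)
sensitivity = dT0 / (1 - f)  # dT2x = dT0 / (1 - f)

print(f"min:    {sensitivity.min():.1f} K")  # ~1.5
print(f"median: {np.median(sensitivity):.1f} K")
print(f"max:    {sensitivity.max():.1f} K")  # ~12

# A flat spread in f gives a heavily skewed spread in sensitivity, which
# is why such ensembles can span roughly 1.5 to 11+ degC.
```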

The full range of uncertainty in climate models is huge. The IPCC has simply selected a few to use without any good reason.

Mar 22, 2012 at 7:26 AM | Unregistered CommenterFrank
