These models are meant to reflect the sum total of all of the knowledge of climate science. The fact that the models have such a poor record seems to have no impact on the Climate Faithful, for whom facts really do seem to be the real inconvenient truth.
Climateaudit is the home of Steve McIntyre, a good-natured and unassuming Canadian statistician who is not in the pocket of Big Oil, doesn't believe the earth is flat and doesn't dispute the science showing smoking is harmful (to counter the three accusations the Climate Faithful routinely level at their detractors). He is also the slayer of the infamous Hockey Stick and the person who pointed out the errors in NASA's US temperature data set. Steve is now posting a number of communications with well-known climate scientists, the latest from James Annan.
A few days ago, Judith Curry, another well known climate scientist, included with a comment at Climateaudit the following:
...the way clouds are currently parameterized in climate models, there is no way to build into the model an assumption like cloud fraction decreases 10% if surface temperature increases by 1 degree. The “tuning knobs”, if you will, that are available to modify the cloud properties in climate models are the specification of the threshold grid scale humidity for the onset of cloud formation (possibly different for ice vs liquid clouds), the treatment of particle fall speed, several tunables in the convective parameterization, the way in which cloud phase is determined to be liquid or ice (or mixed), and the thresholding for onset of rainfall/snowfall.
and:
So relationships like cloud fraction decreases 10% if surface temperature increases by 1 degree are used to evaluate climate models, not as input into climate models. If the climate model disagrees with the observations, then it is not straightforward to figure out what part of the model to “fix”: the problem may be the ocean dynamics influencing sea surface temperature, problems with the atmospheric dynamics, problems with the surface flux parameterization or the boundary layer parameterization, or problems with aspects of the cloud parameterization itself.
Climate models have become very complex and have substantial physical basis in the parameterizations, but the parameterizations do introduce errors owing to neglected degrees of freedom in the parameterization or coarse resolution. So it is actually fairly difficult to “tune” a climate model to give a specific result in terms of a relationship between two internally determined model variables. Other than “tweaking” parameterization elements such as described above, the only way you can tune a climate model is with the external forcing (and aerosols are the only real wiggle room you have).
The cloud parameterization problem in climate models is so difficult owing to the complexity of the scales of the problem (from the scale of a micron-sized drop to the scale of a tropical or extratropical storm system), with a model resolution that is order 100 km.
P.S. the word “definitive” was used in the sense of defining our current understanding of the problem, not to imply the problem is definitely solved.
The reason that the climate modelers were startled by the uniformly positive cloud feedbacks in the IPCC runs is that typically (previously) the cloud feedbacks have been all over the place. It is almost impossible to anticipate how the cloud feedback will turn out when you change an element of the cloud parameterization, since it interacts with so many things such as atmospheric dynamics, atmospheric stability, boundary layer, etc.
And to Steve McIntyre's current post:
I’ve been seeking an engineering-quality exposition of how 2.5 deg C is derived from doubled CO2 for some time. I posted up Gerry North’s suggestion here, which was an interesting article but hardly a solution to the question. I’ve noted that Ramanathan and the Charney Report in the 1970s discuss the topic, but these are hardly up-to-date or engineering quality. Schwartz has a recent journal article deriving a different number and, again, this is hardly a definitive treatment. At AGU, I asked Schwartz after his presentation for a reference setting out the contrary point of view, but he did not give a reference. I’ve emailed Gavin Schmidt asking for a reference and got no answer.
As I've said before, the whole global warming scam will seriously impact on the public's belief in science and that's a tragedy.
James Annan, a thoughtful climate scientist (see link to his blog in left frame), recently sent me an email trying to answer my long-standing inquiry. While it was nice of him to offer these thoughts, an email hardly counts as a reference in the literature. Since James did not include a relevant reference, I presume that he feels that the matter is not set out in existing literature. Secondly, a two-page email is hardly an “engineering quality” derivation of the result. By “engineering quality”, I mean the sort of study that one would use to construct a mining plant, oil refinery or auto factory - smaller enterprises than Kyoto.
Part of the reason that my inquiry seems to fall on deaf ears is that climate scientists seem to be so used to the format of little Nature and Science articles that they seem not to understand what an engineering-quality exposition would even look like.
Anyway on to James who writes:
I noticed on your blog that you had asked for any clear reference providing a direct calculation that climate sensitivity is 3C (for a doubling of CO2). The simple answer is that there is no direct calculation to accurately prove this, which is why it remains one of the most important open questions in climate science.
We can get part of the way with simple direct calculations, though. Starting with the Stefan-Boltzmann equation,
S (1-a)/4 = s T_e^4
where S is the solar constant (1370 Wm^-2), a the planetary albedo (0.3), s (sigma) the S-B constant (5.67×10^-8) and T_e the effective emitting temperature, we can calculate T_e = 255K (from which we also get the canonical estimate of the greenhouse effect as 33C at the surface).
The change in outgoing radiation as a function of temperature is the derivative of the RHS with respect to temperature, giving 4s.T_e^3 = 3.76. This is the extra Wm^-2 emitted per degree of warming, so if you are prepared to accept that we understand purely radiative transfer pretty well and thus the conventional value of 3.7Wm^-2 per doubling of CO2, that conveniently means a doubling of CO2 will result in a 1C warming at equilibrium, *if everything else in the atmosphere stays exactly the same*.
But of course there is no strong reason to expect everything else to stay exactly the same, and at least one very good argument why we might expect a somewhat increased warming: warmer air can hold more water vapour, and I’m sure all your readers will be quick to mention that water vapour is the dominant greenhouse gas anyway. We don’t know the size of this effect precisely, but a constant *relative* humidity seems like a plausible estimate, and GCM output also suggests this is a reasonable approximation (AIUI observations are generally consistent with this, I’m not sure how precise an estimate they can provide though), and sticking this in to our radiation code roughly doubles the warming to 2C for the same CO2 change. Of course this is not a precise figure, just an estimate, but it is widely considered to be a pretty good one. The real wild card is in the behaviour of clouds, which have a number of strong effects (both on albedo and LW trapping) and could in theory cause a large further amplification or suppression of AGW-induced warming. High thin clouds trap a lot of LW (especially at night when their albedo has no effect) and low clouds increase albedo. We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large, given our current state of knowledge. GCMs don’t do clouds very well but they do mostly (all?) suggest some further amplification from these effects. That’s really all that can be done from first principles.
If you want to look at things in the framework of feedback analysis, there’s a pretty clear explanation in the supplementary information to Roe and Baker’s recent Science paper. Briefly, if we have a blackbody sensitivity S0 (~1C) when everything else apart from CO2 is held fixed, then we can write the true sensitivity S as
S = S0/(1- Sum (f_i))
where the f_i are the individual feedback factors arising from the other processes. If f_1 for water vapour is 0.5, then it only takes a further factor of 0.17 for clouds (f_2, say) to reach the canonical S=3C value. Of course to some extent this may look like an artefact of the way the equation is written, but it’s also a rather natural way for scientists to think about things and explains how even a modest uncertainty in individual feedbacks can cause a large uncertainty in the overall climate sensitivity.
On top of this rather vague forward calculation there are a wide range of observations of how the climate system has responded to various forcing perturbations in the past (both recent and distant), all of which seem to match pretty well with a sensitivity of close to 3C. Some analyses give a max likelihood estimate as low as 2C, some are more like 3.5, all are somewhat skewed with the mean higher than the maximum likelihood. There is still plenty of argument about how far from 3C the real system could plausibly be believed to be. Personally, I think it’s very unlikely to be far either side and if you read my blog you’ll see why I think some of the more “exciting” results are seriously flawed. But that is a bit of a fine detail compared to what I have written above. Assuming I’ve not made any careless error, I think what I’ve written is entirely uncontentious among mainstream climate scientists (I certainly intended it that way).
Feel free to post and/or pick at as you please (maybe you’d like to LaTeX the maths first).
James
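For what it's worth, the arithmetic in James's note is easy to verify. Here is a minimal sketch in Python using only the constants he quotes - the variable names are mine, and nothing here comes from a GCM:

```python
# Stefan-Boltzmann balance: S(1-a)/4 = sigma * T_e^4
S = 1370.0       # solar constant, W m^-2
a = 0.3          # planetary albedo
sigma = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Effective emitting temperature
T_e = (S * (1 - a) / (4 * sigma)) ** 0.25
print(round(T_e))             # ~255 K, hence the canonical ~33 C greenhouse effect at a 288 K surface

# Derivative of outgoing radiation with respect to temperature: 4 sigma T_e^3
dF_dT = 4 * sigma * T_e ** 3
print(round(dF_dT, 2))        # ~3.76 W m^-2 emitted per degree of warming

# No-feedback warming for the conventional 3.7 W m^-2 doubled-CO2 forcing
print(round(3.7 / dF_dT, 2))  # ~0.98 C, James's "1C if everything else stays exactly the same"
```

Of course this only confirms the blackboard part of the argument; everything contentious (water vapour, clouds) enters after this point.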
A Few Comments
As noted above, the note contains only one (not very useful) reference and fails my request for something in the literature.
Annan says:
if you are prepared to accept that we understand purely radiative transfer pretty well and thus the conventional value of 3.7Wm^-2 per doubling of CO2
I do accept that we know radiative transfer of CO2 “pretty well”. I’m not as convinced that all the details of water vapor are understood as well. IPCC TAR GCMs all used a HITRAN version that included an (undisclosed) clerical error in water vapor NIR that amounted to about 4 Wm^-2 or so. This error had been identified prior to IPCC TAR, but not in time to re-do the GCMs. The error was not disclosed in IPCC TAR. The water vapor continuum seems to have a certain amount of hair on it yet.
Worse, as far as I’ve been able to determine, radiative transfer theory is not itself sufficient to yield the “conventional value of 3.7 Wm^-2 per doubling of CO2”. Getting to that value requires assumptions about the atmosphere and lapse rates and things like that - I’m not saying that any of these calculations are poorly done or incorrect, only that they are not simply a matter of radiative transfer.
Next, James identifies a second important assumption in the modern calculations:
constant *relative* humidity seems like a plausible estimate and GCM output also suggests this is a reasonable approximation
It may well be a “plausible estimate” but something better than this is required. I cannot imagine someone saying this in an engineering study. Lots of things “seem plausible” but turn out to be incorrect. That’s why you have engineers.
Annan goes on to say “GCM output also suggests this is a reasonable approximation”. I’m not sure entirely what he means by this as he did not provide any references. I interpret the statement to mean that GCMs use the constant relative humidity assumption and yield plausible results. Could one vary the constant relative humidity assumption and still get reasonable results from a GCM or a re-tuned GCM? I don’t know. Have people attempted to do so and failed? I don’t recall seeing references to such null experiments in AR4 or elsewhere, but might have missed the discussion as it’s not a section that I’ve read closely so far.
In an interesting Crowley paleoclimate article (one that I’ve not discussed yet but will at some point), Crowley questions this particular assumption on the basis that allowing for varying lapse rates could explain otherwise puzzling paleo data.
Obviously in an engineering-quality study, the constant relative humidity assumption would need to be thoroughly aired. I think that this is probably a very important topic and might take dozens of pages (if not a few hundred). A couple of sentences, as offered here by Annan, is merely arm-waving through the problem.
Clouds
Annan says quite candidly:
The real wild card is in the behaviour of clouds, which have a number of strong effects (both on albedo and LW trapping) and could in theory cause a large further amplification or suppression of AGW-induced warming. High thin clouds trap a lot of LW (especially at night when their albedo has no effect) and low clouds increase albedo. We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large, given our current state of knowledge. GCMs don’t do clouds very well but they do mostly (all?) suggest some further amplification from these effects. That’s really all that can be done from first principles.
If we go back to the Charney Report in 1979, clouds were even then identified as the major problem. Given the seeming lack of progress in nearly 30 years, one wonders whether GCMs are really the way to go in trying to measure CO2 impact and whether irrelevant complications are being introduced into the assessment. There was an interesting discussion of cloud feedbacks at RC about a year ago, in which Isaac Held expressed astonishment when a lay commenter observed to him that cloud feedbacks in the models were all positive - Held apparently expecting the effects to be randomly distributed between positive and negative.
James says:
We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large
This is a pretty disquieting statement. If we don’t know this and if this is needed to assess doubled CO2, how does one get to an engineering-quality study?
As far as I’m concerned, James’ closing paragraph about feedbacks is tautological: if you know the feedback ratio, you know the result. But you don’t know the feedback ratios so what has James done here other than re-state the problem?
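To make the tautology point concrete, here is Annan's feedback equation as a few lines of Python. The feedback values are illustrative inputs, not measured quantities - which is precisely the problem:

```python
def sensitivity(S0, feedbacks):
    # Roe and Baker form: S = S0 / (1 - sum(f_i))
    return S0 / (1 - sum(feedbacks))

S0 = 1.0  # blackbody (no-feedback) sensitivity, deg C per doubling

# Pick f_water = 0.5 and f_cloud = 0.17 and out comes the canonical ~3 C...
print(round(sensitivity(S0, [0.5, 0.17]), 2))  # ~3.03 C

# ...but nudge the unknown cloud feedback by 0.1 either way and the answer swings widely
print(round(sensitivity(S0, [0.5, 0.07]), 2))  # ~2.33 C
print(round(sensitivity(S0, [0.5, 0.27]), 2))  # ~4.35 C
```

The equation converts an assumed feedback sum into a sensitivity; it does not tell you the feedback sum, which is the whole question.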
Thus, James’ exposition, while meant kindly, is not remotely close to answering my question. So the search for an engineering-quality explanation remains.
As I’ve said on many occasions, I do not jump from the seeming absence of a reference to the conclusion that such an exposition is impossible - a jump that readers make much too quickly in my opinion. Murray Pezim, a notorious Vancouver stock promoter, actually had a couple of important mineral discoveries (e.g. Hemlo) - the point being that a poorly documented claim is not necessarily a false one. I do think that the IPCC has been seriously negligent in failing to provide such an exposition. Well before the scoping of IPCC AR4, I corresponded with Mike MacCracken and suggested that IPCC AR4 should include an exposition of how doubled CO2 leads to a 2.5-3 deg C overall temperature increase - the sort of exposition that readers here are thirsting for.
He undertook to pass the suggestion on to Susan Solomon. However, this idea was apparently rejected somewhere along the process. The first chapter of AR4 consists instead of a fatuous and self-congratulatory history of climate science that has no place whatever in a document addressed to policy-makers.
A side-effect of this IPCC failure is perhaps the dumbing down of the AGW debate, giving rise to shallow and opportunistic expositions like An Inconvenient Truth, in which we get polar bears, hockey sticks, Katrina, all artfully crafted to yield a promotional message. This places thoughtful climate scientists in a quandary, since, by and large, they agree with the AIT conclusion, but not the presentation and the details, and have tended to stay mute on AIT.
21 comments:
models that have never been accurate once, ever, at all - except for their predictions that the stratosphere would cool while the troposphere warms, their prediction of the subsequently observed planetary energy imbalance, their accurate prediction of what would happen to global temperatures following the eruption of Pinatubo, and their accurate prediction of greater warming in the Arctic than anywhere else.
You clearly have a deep-seated and irrational hatred of models. So put them aside and concentrate on basic physics. CO2 is a greenhouse gas, right? And its concentration has increased by 30% over the last 150 years, yes? Now tell me, please, in what possible way could that fail to make the atmosphere hotter?
And tell me what this could be described as, if not accurate.
To be honest anonymous (you really should post a name) making your computer model match historical data is not exactly difficult. Even I can do that & I'm not the greatest programmer alive.
The real problem as I see it is that the media et al are faithfully reporting the whole issue in hysterical Al Gore inconvenient truth terms, when even the IPCC worst case scenario for sea level rise etc is far below the Gore suggestions.
Surely we need some balance here. Shouldn't the media report the actuality and not the hysteria?
Just so you know it I'm still in the undecided camp and studying the science as you suggested, but I must say the more I read the more appalled I am by the behaviour of supposedly intelligent human beings what with all of this stupid trying to discredit perfectly decent people all of the time.
Oh well. This website shows that almost all of the warming that Australia has seen is due to an increase in sunshine duration.
Lenny,
You really are an ignorant cretin.
I build computer models every single week, which is why I understand how egregious the back-fitting of the climate models is.
The models did NOT predict what would happen after Pinatubo. The models were modified after the event to reflect the cooling that the eruption would produce. Not only that, but the modification made was not on the basis of physics but on the actual amount of temperature change that occurred.
CO2 has a minimal impact on temperature, in my view. Maybe 10% of the warming we've seen in the last 150 years can be attributed to all GHGs (CO2, CH4, O3).
Furthermore, climate science has never given a plausible explanation as to why the LIA ended and how much of whatever forced that change is still in effect. As I've mentioned before, the claim that nearly all of the temperature change is down to CO2 is appalling science and not supported by any of the data.
The other major, major problem for Climate Liars like yourself, for whom real scientists have complete contempt, is that the whole global warming myth is based on measuring UHI.
Hoppers, if it's easy to match the historical data, that tells us we've got the physics basically right.
Jonathan Lowe - sadly, what you say on your blog is worthless. Publish it in a peer-reviewed journal if you think it will stand up to the scrutiny.
Valerie - why lie? Why just make up pure fiction about Pinatubo to fit your denialist outlook? That's beyond stupid. James Hansen's testimony to congress in 1988 on global warming laid out three possible scenarios of how the climate would evolve over the next decade. He included a hypothetical volcanic eruption in 1995. Go away and have a look at how the predictions matched the subsequent observations.
CO2 has a minimal impact on temperature, in my view - why is that your view? Is it because your political views make you feel obliged to deny global warming? It certainly doesn't have any basis in science.
climate science has not ever given a plausible explanation as to why the LIA ended - well now you're just showing your total ignorance of the real world. You ever heard of the Maunder Minimum? Do you know what happened when the Maunder Minimum ended?
And tell me, how do urban heat islands warm up the ocean? How do they melt tropical glaciers? How do they increase hurricane intensities? How do they make the Arctic heat up faster than the rest of the planet?
Not really. It just demonstrates that you can draw a line that matches an existing line.
Difficult stuff!!
You think that's how they make the models? Then you are very very ignorant of the reality I'm afraid. Many of the climate models are open-source - you can look at the source code and I think you'll find it's rather more complicated than that.
Of course the models are "tuned to match known observations".
I didn't think there was any disagreement about that.
It's the predictive stuff that's the challenge.
BTW don't call me ignorant you horrible person.
Models are not 'tuned'. You can't tune physics. I don't think you or Lacton really understand the way the kind of computer programmes we're talking about are written. If you think it's a simple matter of writing a program that produces a line that looks a bit like what past temperatures look like, then I'm afraid you are ignorant.
Have you looked at Hansen's 1988 graphs, specifically the line for scenario B, and compared them to what has happened in the 19 years since? Do that please, and then consider whether 19 years on today's models will be doing better or worse than that.
I'm not here to be nice. I'm here to oppose the constant lies and vitriol of lacton. If I see someone who is ignorant about the facts, I'll tell them so. You can do the same.
The models are tuned actually.
Who said anything about tuning physics?
You tune the variables & constants that the physics apply to.
There are many variables and constants in a climate programme by definition.
You are obviously very ignorant about computer programming. - Now you're in my area of expertise!
Would the Hansen you are referring to be the same chap who makes the occasional "It's much worse than we thought" press releases while quietly toning down his gloomy estimates in the background?
Now he used to be a scientist - before he became politicised.
No, sorry, you're being ignorant - apparently wilfully. You claim not to have made your mind up about the issues, but I think your last comment gives the lie to that. Did you actually look up what the 1988 predictions were, as I suggested?
AR4: “ad hoc tuning of radiative parameters”
http://www.climateaudit.org/?p=2565
If you can bear to read independent research.
I'm beginning to turn towards the denial side having read the flimsy science which you support, and the deceit of its promoters.
I am reaching the conclusion that this is a gravy train which is now approaching the buffers.
I came, by the way, from belief in AGW when I first arrived at this site.
Lacton (and others) make convincing arguments.
You, and most of the promoters of what I am starting to believe is a myth indulge mostly in thuggish ad-hominem attacks.
Goodbye...I won't be engaging in any more discourse with you..It's a pointless exercise
Thanks, hoppers, for the spirited defence against poor, deluded Fudgie nee Lenny.
I have, indeed, looked at the source code of models, as some of them are open source. The fudge factors are there for all to see including the 1940-75 'aerosol' cooling, which different models adjust with different amounts because there's no empirical data.
The climateaudit stuff is pretty disturbing if correct. I want to do some of my own analysis though it does seem quite plausible.
Hello Jack,
I, like most others, accepted AGW as gospel right up to the time that I was told the debate was over...I wasn't aware there had been a debate! Then that fool Margaret Beckett tried to ban any "heretic denial talk" in the U.K.
I have lived long enough to know that when these tactics are employed, and the witch-hunts start , we have a problem, so I started digging.
I have researched long and hard. I am horrified by what I have found, and my views are changing.
There appears to be deception on a grand scale, if only in the measure of what the scientists believe will happen, and what the public are told will happen...Little kids scared to death by it all for goodness sake.
Thank you for providing an alternative view.
Whether you are right (I now suspect you are) or wrong, it is outrageous to try to gag you.
Anonymous et al. should understand that science, as with all things where opposite views collide, is about debate. A consensus only lasts until it is disproved, and gagging and ad-hominem smearing are the refuge of scoundrels.
I am a computer programmer, not a scientist. I have always respected science greatly. Now I fear it is being damaged for maybe generations to come because it has become politicised.
We are all the poorer for that.
If you're weak-minded enough to be persuaded away from rational science by an angry right-wing blogger, then there's not much I'm going to be able to do about it. Looks like you decided you wanted to disbelieve the science - well, go ahead. The science will carry on regardless of whether you believe it or not.
Lacton's blog is just one of many websites I read, actually, including the 2 Macs' (destroyers of the hockey stick), Mann's blog at realclimate.org, etc etc.
It takes a strong mind to stand up to the bullying tactics of agitators like you as it happens and make up your own mind in the face of pressure to conform.
You rather give your political leanings away by your description of Lacton as angry right wing. I suspect science is the last thing on your agenda.
Oh yes, the science. CO2 is an interesting theory, but increasingly it would appear to me at least that other factors are more important - I think others would conclude likewise were it not for the disinformation emanating from your side of the argument, which you have tried to stifle.
It's scary taking you people on. Are you lot given a one day science course and then sent out to monster dissenting websites?
Wouldn't surprise me.
other factors are more important - such as?
The Sun
No. It's been proven that the Sun cannot explain the warming seen since 1970. See Solanki and Lockwood and Fröhlich.