|
Post by glc on Jan 26, 2010 10:38:52 GMT
Jim Cripwell constantly brings up issues such as convection and latent heat when discussing the likely warming effect from CO2. I have tried to make it clear that these processes simply move energy around the climate system. They do not cause the climate system to cool. The climate system, as a whole, can only cool by emitting radiation to space. There is no other way. Therefore, if we reduce the amount of outgoing radiation, we will have more incoming energy (from the sun) than outgoing energy and the earth will warm. The actual reduction can be calculated reasonably accurately, and if CO2 doubles we can expect a warming of around 1 deg C. [Note: The earth's surface emits ~388 w/m2 but only ~235 w/m2 is actually emitted to space. The difference is due entirely to the greenhouse effect.]
Jim (and others) appear not to believe me, so here are a few quotes from some notable AGW-sceptical scientists.
1. Richard Lindzen (wattsupwiththat.com/2009/03/30/lindzen-on-negative-climate-feedback/): "For example, if one simply doubles the amount of CO2 in the atmosphere, the temperature increase is about 1°C."
2. Jack Barrett (www.barrettbellamyclimate.com/page45.htm): "If a forcing of 3.7 W m-2 is caused by a doubling of CO2 concentration then f = 231.3/390 = 0.593 and this gives Tsurface = 293.2 K, an increase of 1.3 K."
3. Garth Paltridge (www.sepp.org/Archive/weekwas/2005/May%2028.htm): "It is fairly easy to calculate the likely rise of global average temperature dT for the purely theoretical situation where atmospheric carbon dioxide (CO2) is doubled but nothing else about the atmosphere is allowed to change. The answer is about 1.2 degrees Celsius."
Jim, could you start by explaining what you have discovered which has been missed by these 3 specialists? Please don't try to latch onto Paltridge's "nothing else is allowed to change", because we are then into the area of feedbacks, which is a completely separate issue - and one which I have discussed at great length.
Actually, you are on dodgy ground if you use feedbacks as an uncertainty because a large number of sceptics believe that feedbacks are positive (if only slightly).
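The ~1 deg C figure that all three quotes converge on can be reproduced from first principles: differentiate the Stefan-Boltzmann law at the earth's effective emission temperature. A minimal sketch (the 3.7 W/m2 forcing and 255 K emission temperature are the standard round values used in the thread, nothing more):

```python
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0             # effective emission temperature of Earth, K (~235 w/m2 to space)
FORCING_2XCO2 = 3.7       # canonical radiative forcing for doubled CO2, W m^-2

# Planck (no-feedback) response: differentiate OLR = sigma*T^4 at T_EFF.
# d(OLR)/dT = 4*sigma*T^3, so dT = dF / (4*sigma*T^3).
planck_response = 4 * SIGMA * T_EFF**3       # W m^-2 per K of warming
dT = FORCING_2XCO2 / planck_response         # warming needed to restore balance

print(f"Planck response: {planck_response:.2f} W/m2/K")
print(f"No-feedback warming for 2xCO2: {dT:.2f} K")
```

This is the no-feedback ("Planck") response only; feedbacks, as the post says, are a separate issue.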
|
|
|
Post by jimcripwell on Jan 26, 2010 12:11:45 GMT
glc writes "Could you start by explaining what you have discovered which has been missed by these 3 specialists."
You are mistaken in what my objections are. I simply do not believe it is possible to separate the effect of radiation alone, neglecting the effects of the other modes of energy transfer within the atmosphere. All forms of energy transfer interact with each other within the atmosphere. I simply do not believe that it is possible to estimate the concept of "radiative forcing".
|
|
|
Post by glc on Jan 26, 2010 12:55:02 GMT
You are mistaken in what my objections are. I simply do not believe it is possible to separate the effect of radiation alone, neglecting the effects of the other modes of energy transfer within the atmosphere. All forms of energy transfer interact with each other within the atmosphere. I simply do not believe that it is possible to estimate the concept of "radiative forcing".
Jim
Around 390 w/m2 leaves the surface as radiation. Around 240 w/m2 is eventually emitted to space. By studying emission spectra we can track the energy transmission through the atmosphere. Lindzen, Barrett, Paltridge and countless others know this. They also know that the equations for IR transmission closely model what takes place in the atmosphere. They are well aware of the other processes, and aware that these processes are operating continuously, but they have no net effect on the earth's energy budget.
You say you do not believe what all these scientists are saying. I'd say that is due to a fault in your understanding. You would be better advised to adopt some humility on the issue. You are, without doubt, wrong and the fact that you can find no support from any leading scientist speaks for itself. I, on the other hand, am right and I have the entire climate science community (including sceptics) on my side.
Around 240 w/m2 enters the earth's climate system and around 240 w/m2 should leave it. What happens in between is interesting but is not terribly relevant to that basic fact. The addition of CO2 will alter the balance in the upper atmosphere.
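The two fluxes glc keeps quoting (~390 w/m2 from the surface, ~240 w/m2 to space) correspond to two temperatures via the Stefan-Boltzmann law; their difference is the usual ~33 K greenhouse effect. A quick sketch using the thread's own numbers:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def emission_temperature(flux):
    """Invert the Stefan-Boltzmann law: T = (F/sigma)^(1/4)."""
    return (flux / SIGMA) ** 0.25

t_surface = emission_temperature(390.0)   # flux leaving the surface
t_toa     = emission_temperature(240.0)   # flux actually reaching space

print(f"Surface emission temperature: {t_surface:.0f} K")   # ~288 K
print(f"Effective emission temperature: {t_toa:.0f} K")     # ~255 K
print(f"Greenhouse effect: {t_surface - t_toa:.0f} K")      # ~33 K
```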
|
|
|
Post by icefisher on Jan 26, 2010 13:03:09 GMT
glc wrote: "I have tried to make it clear that these processes simply move energy around the climate system. They do not cause the climate system to cool. The climate system, as a whole, can only cool by emitting radiation to space. There is no other way. Therefore, if we reduce the amount of outgoing radiation, we will have more incoming energy (from the sun) than outgoing energy and the earth will warm."
That's all very academic of you GLC. But it doesn't translate to the concerns of the average human. The average guy only cares about the climate system in the bottom 6 feet of the atmosphere. That which is above the clouds stays trapped above the clouds (clouds being opaque to IR) until it radiates to space, finds a path, which apparently takes practically no time at all since the Red Dog is missing and CO2 is far more transparent than clouds. Bottom line: if clouds increase, sunlight decreases, IR back radiance decreases, and the planet cools. It doesn't matter if it's a feedback or if it is caused by GCRs. Further, since the atmosphere is less dense up there, it should be registering 1.5 to 2 times the temperature increase, and it's just not there.
Ultimately what this is going to boil down to is not whether some bureaucrat-supervised scientist in some windowless office figures it's going to get warmer, but instead whether folks are feeling too warm or not. Other than the nuts and fruitcakes who are trying to make it an imprisonable offense in CA to drive around with under-inflated tires, the average guy is not going to start feeling guilty for his family moving north generations ago and making a livelihood, even if it does get a little warmer.
glc wrote: "The actual reduction can be calculated reasonably accurately and if CO2 doubles we can expect a warming of around 1 deg C."
Calculations? You have been repeatedly asked for evidence that the calculations are correct. These are hypothetical calculations based upon parameters such as no LIA recovery, and thus no natural long-term variation.
Indeed the calculations are correct if you can attribute the warming over the past century to CO2 . . . but that hasn't been done yet; though it has been attempted fraudulently, which does nothing but bring into question everything else surrounding the topic for the average guy. But the take-home point here is that your claim of the calculations being reasonably accurate has not been established in any way, shape or form. It remains a supposition.
glc wrote: "[Note: The earth's surface emits ~388 w/m2 but only ~235 w/m2 is actually emitted to space. The difference is due entirely to the greenhouse effect.]"
That's poppycock! A huge amount of the heat in the atmosphere is from condensation of water vapor and surface conduction. The heat so transported isn't trivial, much less zero.
|
|
|
Post by jimcripwell on Jan 26, 2010 14:30:06 GMT
glc writes "You say you do not believe what all these scientists are saying. "
If I said this I misspoke. I do not "understand" what these scientists are saying. It is now after breakfast, so let me try again. Let us take it as definite that the only way the earth loses energy is by radiation. I am ONLY talking about the way energy is transferred WITHIN the atmosphere. From all I have read, there are four ways in which energy is transferred WITHIN the atmosphere. My reading also leads me to understand that all 4 modes interact with each other in highly complex ways which we do not fully comprehend. In other words, my reading leads me to believe that the way radiation is transferred WITHIN the atmosphere is affected by the other 3 modes. I may well be mistaken in this belief.
However, assuming I am correct, it makes no sense to me to do any calculations whereby we study only the way radiation is transferred WITHIN the atmosphere and neglect to take into account the other 3 modes of energy transfer at the same time. Maybe it does make sense, but I cannot see why. Maybe you can explain this. I have searched for years to find any sort of reference on this issue. I have not been able to find one. Maybe, again, you can help.
So let me be clear where I do not understand. I do not understand why it makes sense to do calculations on the way energy is transferred WITHIN the atmosphere, considering ONLY radiation, and neglect the other 3 modes of transfer. Can you help?
|
|
|
Post by steve on Jan 26, 2010 15:10:09 GMT
Jim,
Sounds about right to me.
Five answers.
1. A lot of physics starts with applying the basic laws as a sanity check. E.g. someone was selling a gravity-powered lamp last year. A large weight gradually sank to the ground, and the potential energy was converted to electric energy. If you got the calculator out, you'd find you'd need a 30 tonne weight to run a 2 watt CFL for 3 minutes (I made that bit up). Basically, though, it was nonsense.
If you ran the calculation and found that CO2's effect was minuscule you'd feel more confident ignoring it (It wouldn't mean it *should* be ignored).
2. Climate and forecast models don't seem that sensitive to different causes of warming. E.g. the effect of solar warming is not that much different from the effect of increased CO2. They are qualitatively different in their details (e.g. see point 3), but not so different that it is not useful and/or interesting to compare the two.
Common sense would concur with this finding. Currently we are talking about transfers of hundreds of watts in radiation and many tens of watts in convection and evapotranspiration, while the variation from CO2 is calculated in amounts of 2-3 watts.
3. The calculations are not just used to give a headline figure. They are done in detail to show the differences everywhere in the atmosphere. And of course they are repeatedly done as a climate model is run. Notably, they show up the fact that greenhouse gas increases will tend to result in a cooler stratosphere, and contrary to some people's expectations, they don't show that extra CO2 causes measurable excess warming in high CO2 areas (ie. downwind of cities).
4. At a policy level, politicians have to agree on the relative cost of emitting CO2 or methane or CFCs or SF6 or whatever. Given Point 2. it is reasonable for them to use the sensitivity metric as one of the bases for calculating the cost.
5. Satellite observations from space can be done to check the numbers. Currently the numbers are probably not good enough to accurately detect the amount of forcing and the amount of feedback over the whole planet all the time. But maybe one day they will be. Calculating the theoretical forcing helps in planning and designing the instrumentation.
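Steve's point about relative magnitudes can be put in one line of arithmetic. A sketch using round flux values typical of published energy budgets (the exact split between latent and sensible heat is an assumption here, not from the thread):

```python
# Typical global-mean surface energy flows, W m^-2 (rounded)
surface_radiation = 390.0   # thermal IR emitted by the surface
latent_heat       = 80.0    # evapotranspiration (assumed round value)
sensible_heat     = 20.0    # convection/conduction (assumed round value)
co2_forcing       = 3.7     # perturbation from doubled CO2

total = surface_radiation + latent_heat + sensible_heat
share = 100 * co2_forcing / total
print(f"2xCO2 forcing as a share of total surface energy flows: {share:.1f}%")
```

The forcing is a roughly 1% perturbation on the background flows, which is why it is small yet calculable.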
|
|
|
Post by jurinko on Jan 26, 2010 16:06:34 GMT
Polar temperatures pretty much falsify this 1 degC assumption. We are halfway to the doubling, and the Antarctic shows nothing at all, while the Arctic shows just AMO-related variations, with present temperatures below the 1940s level. Dry and cold polar air should exhibit the CO2-induced warming the most, since there is only a little water vapor and the overall "GH effect" should increase mightily.
Either all those virtual calculations are completely wrong, or there are negative effects which swallow the CO2 effect without a trace.
|
|
|
Post by jimcripwell on Jan 26, 2010 17:06:32 GMT
steve writes "Five answers", none of which address the main question: is the way radiative energy is transferred WITHIN the atmosphere affected by the other three modes of energy transfer?
|
|
|
Post by northsphinx on Jan 26, 2010 17:15:39 GMT
glc wrote: "The climate system, as a whole, can only cool by emitting radiation to space. There is no other way. Therefore, if we reduce the amount of outgoing radiation, we will have more incoming energy (from the sun) than outgoing energy and the earth will warm."
Calm down now glc, I agree with you on this. But if I use your own numbers, there is something that makes the AGW view look a little odd. You say here that ~388 w/m2 is emitted from the earth's surface but only ~235 w/m2 is actually emitted to space. And in another place you say that the atmospheric window is about 70 w/m2 for dry air and no cloud. I have seen slightly larger numbers, but let us use 70 as an example.
Of the total radiation from the earth's surface, some leaves the top of the atmosphere without being absorbed on the way out. That is the atmospheric window. The remainder has been absorbed and re-emitted on the way out to space. Ready for this? All of it. That is the thing. All of it, except the part that passes through the atmospheric window. Most of the outgoing radiation from the earth is then from the atmosphere, not the surface, if you are right.
[Image: satellite map of outgoing radiation.] There it is obvious that outgoing radiation depends on cloud and water vapor, and the outgoing radiation is in every part of the world larger than your atmospheric window value. In fact it is about what the temperature of the emitting surface or cloud top would give, with an emission factor of about 0.9. Is that not really strange? The satellites measure an outgoing radiation far above the atmospheric-window level, but still it is in line with the surface temperature OR the cloud-top temperature. How is that possible?
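northsphinx's partition of the outgoing flux can be written out explicitly. A sketch using the thread's own round numbers (388 w/m2 from the surface, 235 w/m2 to space, a 70 w/m2 window):

```python
surface_emission = 388.0   # W m^-2 leaving the surface (glc's figure)
olr              = 235.0   # W m^-2 actually reaching space
window           = 70.0    # W m^-2 passing straight through the atmospheric window

absorbed = surface_emission - window    # surface IR absorbed on the way up
from_atm = olr - window                 # outgoing radiation emitted by the atmosphere itself

print(f"Surface IR absorbed by the atmosphere: {absorbed:.0f} W/m2 "
      f"({100 * absorbed / surface_emission:.0f}% of surface emission)")
print(f"Share of outgoing radiation emitted by the atmosphere: {100 * from_atm / olr:.0f}%")
```

On these figures roughly 80% of the surface emission is absorbed, and roughly 70% of what space "sees" was emitted by the atmosphere rather than the surface, which is exactly the point northsphinx is pressing.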
|
|
|
Post by steve on Jan 26, 2010 17:41:59 GMT
jimcripwell wrote: "steve writes 'Five answers', none of which address the main question. Is the way radiative energy is transferred WITHIN the atmosphere affected by the other three modes of energy transfer?"
I agreed with you that it does. Answers 2, 3 and 5 touch on your point. But I was answering the "what is the point of looking at radiative transfer?" question, and I wanted to keep the answers short.
|
|
|
Post by stranger on Jan 26, 2010 17:59:37 GMT
One thing that should be considered before any hypothesis is presented is "plausibility." We know water vapor absorbs energy over a much wider section of the infrared spectrum than CO2, and therefore has about seven times the "greenhouse gas effect" of CO2. Since the atmosphere typically has 15,000 to 20,000 ppm of water vapor, how plausible is the "greenhouse equivalent" of an extra 120 or so water vapor molecules per million in raising atmospheric temperature by a significant amount?
Another thing that should be considered is the simple fact that the Earth radiates over the entire electromagnetic spectrum. Yes, with a suitable antenna and receiver, well isolated from man-made sources, you can detect terrestrial radiation at audio, at radio, and at higher frequencies, well into the infrared. Satellites, with their relatively narrow-band "receivers", miss much of the total radiation from the Earth.
While both H2O and CO2 trap a certain amount of infrared, that energy is reradiated relatively quickly; the more quickly in the absence of clouds, of course. Experiments with atmosphere in relatively long tubes demonstrate that a temperature rise at the base is surprisingly quickly evident at the top, carried both by Brownian motion and convection. The effect is less noticeable in free air, since horizontal currents dissipate the extra energy.
Stranger
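The claim that satellites "miss much of the total radiation" can be checked against Planck's law: at earth-like temperatures almost all emission falls in the thermal infrared. A rough numerical sketch (the 4-100 um band edges and the 288 K temperature are illustrative assumptions):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law), W m^-3 sr^-1."""
    a = 2 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * K * temp_k))
    return a / b

def band_fraction(lo_um, hi_um, temp_k, steps=20000):
    """Fraction of total blackbody emission between two wavelengths (trapezoid rule)."""
    total = 5.670e-8 * temp_k**4 / math.pi        # total radiance via Stefan-Boltzmann
    lo, hi = lo_um * 1e-6, hi_um * 1e-6
    dw = (hi - lo) / steps
    s = 0.5 * (planck(lo, temp_k) + planck(hi, temp_k))
    for i in range(1, steps):
        s += planck(lo + i * dw, temp_k)
    return s * dw / total

frac = band_fraction(4.0, 100.0, 288.0)
print(f"Fraction of 288 K blackbody emission between 4 and 100 um: {frac:.3f}")
```

Around 99% of the emission from a 288 K surface sits inside that thermal-IR band; the radio-frequency tail is negligible, as steve notes in his reply below the quote.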
|
|
|
Post by steve on Jan 26, 2010 18:30:20 GMT
stranger wrote: "One thing that should be considered before any hypothesis is presented is 'plausibility.' We know water vapor absorbs energy over a much wider section of the infrared spectrum than CO2, and therefore has about seven times the 'greenhouse gas effect' of CO2. Since the atmosphere typically has 15,000 to 20,000 ppm of water vapor, how plausible is the 'greenhouse equivalent' of an extra 120 or so water vapor molecules per million in raising atmospheric temperature by a significant amount?"
The extra water vapour that would be expected in a warmer world is thought to add to the warming induced by CO2.
The proportion of the energy emitted at radio frequencies will be low. Lower frequencies are lower energy. At higher frequencies there is quite a sharp cut-off.
That's a slightly over-simplified way of thinking about the "greenhouse" effect. If you imagine looking at the earth from above with infrared eyes, at frequencies at the edges of the CO2 absorption peaks, you will see down to a particular average depth - like looking into fog. By increasing CO2, the average depth you see is less deep (because the CO2 thickens the fog a little bit). Because you are looking less deep, you are seeing colder air. Colder air radiates less energy. And that is why there is an energy imbalance.
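steve's fog picture can be quantified with a toy calculation: if extra CO2 lifts the mean emission level into colder air, outgoing radiation drops by roughly 4*sigma*T^3 times the temperature change down the lapse rate. A sketch assuming a single emission level at 255 K and a uniform 6.5 K/km lapse rate (both illustrative simplifications, not how a real radiative transfer code works):

```python
SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
LAPSE = 6.5e-3       # mean tropospheric lapse rate, K per m (assumed)
T_EMIT = 255.0       # current effective emission temperature, K

def olr_after_lift(delta_z_m):
    """Outgoing radiation if the mean emission level rises by delta_z into colder air."""
    return SIGMA * (T_EMIT - LAPSE * delta_z_m) ** 4

baseline = SIGMA * T_EMIT ** 4   # ~240 W/m2 at the current emission level
for dz in (50, 100, 150, 200):
    drop = baseline - olr_after_lift(dz)
    print(f"Raise emission level {dz:4d} m -> outgoing radiation drops {drop:.1f} W/m2")
```

On these assumptions a lift of roughly 150 m reproduces the 3.7 W m-2 forcing quoted earlier in the thread, which is why "thickening the fog" slightly produces an energy imbalance.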
|
|
|
Post by jimcripwell on Jan 26, 2010 20:14:37 GMT
steve writes "steve writes 'Five answers', none of which address the main question. Is the way radiative energy is transferred WITHIN the atmosphere affected by the other three modes of energy transfer?
I agreed with you that it does."
If you agree that the way radiative energy is transferred WITHIN the atmosphere is affected by the other modes of energy transfer, then what sense does it make to do calculations on radiative energy transfer while ignoring the other modes of energy transfer?
Or am I missing something?
|
|
|
Post by socold on Jan 26, 2010 22:16:06 GMT
Basic physics dictates that about 1 degree of warming will occur from the radiative imbalance of doubling CO2, if nothing else in the climate system changed except temperature.
That metric marks doubling CO2 apart from most things in the world, such as spitting or riding a horse, neither of which would produce any warming by the same metric, let alone 1 C.
The only reason doubling CO2 could cause less than 1 C of warming is if the climate resolves the imbalance by changing in some way other than temperature, i.e. clouds increasing.
Yet even in that case it would mean doubling CO2 causes a significant change in cloud cover, which is a climate change in itself. So knowing the imbalance from doubling CO2 is a whopping 3.7 W m-2 means a significant climate change is inevitable, whatever form it may take. You can't resolve the imbalance without the climate changing significantly.
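socold's closing point can be quantified: offsetting the 3.7 W m-2 imbalance with clouds alone would require the planetary albedo to rise by about one percentage point, a few percent in relative terms, which is itself a substantial climate change. A rough sketch (the solar constant and albedo are standard round values, not figures from the thread):

```python
S0      = 1361.0   # solar constant, W m^-2 (standard round value)
ALBEDO  = 0.30     # present planetary albedo (standard round value)
FORCING = 3.7      # 2xCO2 radiative forcing, W m^-2

absorbed = S0 / 4 * (1 - ALBEDO)        # global-mean absorbed sunlight, ~238 W/m2
delta_albedo = FORCING / (S0 / 4)       # albedo rise needed to reflect 3.7 W/m2 more

print(f"Absorbed solar: {absorbed:.0f} W/m2")
print(f"Albedo change needed to offset 2xCO2: {delta_albedo:.3f} "
      f"({100 * delta_albedo / ALBEDO:.0f}% relative increase)")
```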
|
|