Post by northsphinx on Mar 24, 2010 19:03:51 GMT
Steve; No, but it does not make any difference for Warren Buffett or his environment.
Post by nautonnier on Mar 25, 2010 1:30:14 GMT
And what you are giving him is also an unknown value. In the hypothetical world it could be $10 (of some country's currency), but the actual value in the real world is unknown.
Post by northsphinx on Mar 25, 2010 8:19:55 GMT
Quotes from ams.allenpress.com/archive/1520-0477/90/3/pdf/i1520-0477-90-3-311.pdf:
"The downward land flux associated with global warming (that accounts for melting land ice, etc.) is estimated to be less than about 0.01 PW, or 0.07 W m−2"
"Global precipitation values are considered most reliable (Trenberth et al. 2007b), and for 2000–04 the global mean is 2.63 mm day−1, which is equivalent to 76.2 W m−2 latent heat flux."
That is 0.07/76.2 => about 0.09%. Compare that with uncertainties of 10% and then tell me with a straight face that CO2 is mankind's largest threat. ;D
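A two-line check of that arithmetic (the two flux figures are the ones quoted above, not mine):

```python
# Figures as quoted above from Trenberth et al.; nothing here is my own.
land_flux = 0.07     # W m-2, downward land flux (melting land ice, etc.)
latent_heat = 76.2   # W m-2, global-mean latent heat flux from precipitation

ratio = land_flux / latent_heat
print(f"{ratio:.2%}")  # -> 0.09%
```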
Post by steve on Mar 25, 2010 11:30:02 GMT
nautonnier wrote: "And what you are giving him is an unknown value also. In the hypothetical world it could be $10 (of some country's currency) but the actual value in the real world is unknown."

This started off as a claim that the physics of AGW was wrong because CO2 cooled the atmosphere, and so the atmosphere could not warm if CO2 rose. When that argument was lost, it mutated into a claim that because variability in the system exists, any change that was less than the variability could not have any effect: nx+10x = nx because the uncertainty in n is much greater than 10. A completely unjustified assumption.
Post by poitsplace on Mar 25, 2010 14:26:13 GMT
ACTUALLY, steve...it started out with the basic premise that convection and latent heat account for the bulk of the heat transfer. While a gradient (any gradient) is maintained largely by greenhouse gases, the bulk of the effect is achieved at extremely low concentrations...after which convection takes a larger and larger portion of the energy through the troposphere. SURELY you must understand that since convection and latent heat carry over half of the energy, CO2 must necessarily have LESS than half the impact suggested by the raw absorption math...as will any other radiative "forcing" that attempts to establish a new gradient.
Few of us say it does absolutely nothing...but it sure as heck can't do a lot.
Post by steve on Mar 25, 2010 16:35:42 GMT
poitsplace wrote: "ACTUALLY, steve...it started out with the basic premise that convection and latent heat account for the bulk of the heat transfer."

Mainly I was referring to the suggestion that the bulk of the emission came from the stratosphere, where it cooled the atmosphere, and that therefore increasing CO2 would cool the (whole) atmosphere more. If you are right, we should surely be observing that the upper troposphere is warming much faster than the models say, much faster than the surface, and much, much faster than the satellites say. This is because the extra energy that cannot escape from the mid-troposphere due to the added greenhouse gas needs to be convected higher up in the troposphere to be radiated from there. Since you are in disagreement with everybody, I think you are wrong.
Post by poitsplace on Mar 25, 2010 19:17:12 GMT
No, you misunderstand. The energy is ALREADY being convected to higher up in the troposphere.
If you start with an atmosphere that magically has no ability to emit, it would rapidly reach equilibrium through convection from the surface...at which point convection would stop, with the atmosphere at the same temperature as the surface.
If you bring up the concentration of CO2 to just 20ppm, the atmosphere would have MOST of the radiative efficiency it does today. The energy would pour out into space...but the speed at which it radiates is far too high given the relative mass of the atmosphere and this causes a drop in temperature. The drop in temperature causes a skew between radiative efficiency and absorptive efficiency...decreasing the radiative efficiency of the atmosphere and creating the "greenhouse effect".
BUT...even at 20ppm, the gradient it forces is high enough that convection takes over an enormous chunk of the energy movement. By the time you're up around current or pre-industrial CO2 levels...and including the FAR more potent effects (on both sides of this equation) of water vapor...convection and latent heat...there's not much capability left for CO2 to force gradient changes. Its relative potency compared to water vapor and convection is negligible.
Post by nautonnier on Mar 26, 2010 0:21:39 GMT
Nice to have someone else carry the water
Post by sigurdur on Mar 26, 2010 0:50:33 GMT
poitsplace: You said it in a nutshell. Is it really that hard to understand?
Post by steve on Mar 26, 2010 12:41:23 GMT
poitsplace wrote: "No, you misunderstand. The energy is ALREADY being convected to higher up in the troposphere. If you start with an atmosphere that magically has no ability to emit, it would rapidly reach equilibrium through convection from the surface...at which point convection would stop, with the atmosphere at the same temperature as the surface. If you bring up the concentration of CO2 to just 20ppm, the atmosphere would have MOST of the radiative efficiency it does today. The energy would pour out into space...but the speed at which it radiates is far too high given the relative mass of the atmosphere and this causes a drop in temperature. The drop in temperature causes a skew between radiative efficiency and absorptive efficiency...decreasing the radiative efficiency of the atmosphere and creating the 'greenhouse effect'. BUT...even at 20ppm, the gradient it forces is high enough that convection takes over an enormous chunk of the energy movement. By the time you're up around current or pre-industrial CO2 levels...and including the FAR more potent effects (on both sides of this equation) of water vapor...convection and latent heat...there's not much capability left for CO2 to force gradient changes. Its relative potency compared to water vapor and convection is negligible."

You are contradicting yourself. You say that the increased CO2 is not sufficient to "force a gradient change". I'm not sure whether either of us understands what you mean by that. I think you are saying, in essence, that convection is a dominant process and that its dominance is the same with more or less CO2 above a certain amount. But we can calculate the amount of radiation that comes from CO2 at 1, 2, 3, 4, 5, 6 etc. km and goes into space, and none of the results is zero. If we increase CO2, then we can repeat the calculation, assuming no other change, and find that the amount from each of these levels is a bit less.

For you to be correct, the energy imbalance *does* therefore have to be resolved by another change, such as an increase in convection, an increase in temperature, a reduction in water vapour, a change in clouds, or a combination of all of these. An increase in convection is predicted, but not one sufficient to prevent lower layers from warming. If anything, the satellite obs say that this increase is *less* than the model predictions. You appear to be painting a picture of convection lifting the energy in the CO2 without it influencing the temperature of the air at different levels. It's really the other way around. Warm air is lifted; the warm air includes "warm CO2" which is kept warm by the air, but is continually "dumping" energy by radiation in proportion to the temperature of the air. Some of that radiation escapes to space. You can't get the energy to the top of the atmosphere without taking the warm air with it!
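The calculation I am describing can be sketched with a toy grey-gas column (purely illustrative numbers of my own: an 11-layer slab atmosphere, a 6.5 K/km lapse rate, and a crude per-kilometre optical depth standing in for the CO2 amount; this is nothing like a real line-by-line code such as MODTRAN):

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m-2 K-4
T_SURF = 288.0     # surface temperature, K (illustrative)
LAPSE = 6.5        # lapse rate, K per km (illustrative)
TOP = 10           # top layer, km

def olr(tau_per_km):
    """Outgoing longwave flux from a stack of 1 km grey layers.

    Each layer emits sigma*T**4 times its emissivity, attenuated by
    the optical depth of all the layers above it; tau_per_km is a
    crude stand-in for the CO2 amount.
    """
    total = 0.0
    for z in range(TOP + 1):
        temp = T_SURF - LAPSE * z
        emissivity = 1.0 - math.exp(-tau_per_km)           # this layer
        transmittance = math.exp(-tau_per_km * (TOP - z))  # layers above
        total += emissivity * SIGMA * temp**4 * transmittance
    # surface emission seen through the whole column
    total += SIGMA * T_SURF**4 * math.exp(-tau_per_km * (TOP + 1))
    return total

# More absorber -> the emission to space comes from higher, colder air,
# so less flux escapes until temperatures adjust.
print(olr(0.2) > olr(0.4))  # -> True
```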
Post by poitsplace on Mar 26, 2010 14:29:13 GMT
You're so close to understanding it that you are trying to criticize me by pointing out that it would show behavior that I'm saying it shows.
As CO2's emission/absorption efficiency increases it tries to force one temperature gradient...but as the gradient increases, convection and latent heat begin bypassing CO2, taking the energy up to the higher levels that way (and heating it) instead of it having to pass as radiation through CO2's absorption gauntlet.
CO2 emissions (of radiation) are indeed one way in which much of the atmospheric energy is lost...but it is convection and latent heat that transfer most of the energy for CO2 to emit in the first place. CO2's absorption is therefore far less important...and therefore has a far, far weaker "forcing" than suggested by the raw absorption math. In a slab-like atmosphere you would expect a doubling of CO2 to cause as much as a 1.2C increase in temperature...but in reality it must be substantially lower...less than half.
Post by northsphinx on Mar 26, 2010 16:55:24 GMT
Steve wrote: "This started off as a claim that the physics of AGW was wrong because CO2 cooled the atmosphere, and so the atmosphere could not warm if CO2 rose. When that argument was lost, it mutated into a claim that because variability in the system exists, any change that was less than the variability could not have any effect."

And since I started this thread, I am probably the "target". Let us then go back to: A, CO2's radiative warming/cooling impact; and then B, comparison with natural variability.

A is stated to be the widening of the CO2 absorption band, usually calculated with MODTRAN, giving an average figure of 3.7 W m−2 for twice as much CO2 as today. The forcing is then converted into temperature as part of the total greenhouse effect. From www-ramanathan.ucsd.edu/FCMTheRadiativeForcingDuetoCloudsandWaterVapor.pdf the total forcing is 131 W m−2, or in their words: "The global average Ga is 131 W m−2 or the normalized ga is 0.33, i.e., the atmosphere reduces the energy escaping to space by 131 W m−2". With that 131 W m−2 the earth is heated about 33 degrees, so another 3.7 W m−2 trapped should heat less than a degree.

I don't agree, even though my estimate comes out lower than most. First: the atmosphere's net energy balance is not primarily a radiative balance with the earth's surface. The atmosphere is not primarily NET heated by longwave radiation, but it IS primarily cooled by longwave radiation. The NET radiative heating from the earth's surface is 356 − 333 = 23 W m−2, while the net radiative cooling from the ATMOSPHERE to space is 239 − 40 = 199 W m−2. That is, radiation accounts for less than 10% (23/239) of the atmosphere's heating budget but 83% (199/239) of its cooling budget. Does it ring a bell? Of the claimed 33 degrees of atmospheric warming, radiative heating from the earth's surface is responsible for about 10%, or about 3 degrees. An increased absorption of even 3.7 W m−2 would then change the atmosphere's temperature by about 3.7/23 × 3 = 0.48 degrees.

UNLESS there is negative feedback, as when outgoing radiation increases more than incoming energy. Outgoing radiation is a function of temperature, but as the fourth power of T (T⁴) according to the Stefan–Boltzmann law. Incoming energy is not by radiation, so it does not scale as T⁴; outgoing radiation does. Does bell two ring? A well-known example: if we bring in 2 × 3.7 W m−2 and that increases the temperature by 1 degree, then one degree causes an outgoing radiation increase of about 3.5 W m−2 in the 250 K range; for 300 K it is about 6 W m−2. Oops about the supposed CO2 forcing. In short: it does not matter how much CO2 we have in the atmosphere, since atmospheric heating is by convection and cooling is by radiation. Convection scales with T, while radiation scales with T⁴. That covers both A and B, I guess.
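For what it's worth, the arithmetic can be reproduced in a few lines (a sketch only; the budget figures are the ones quoted above, and dF/dT = 4σT³ is just the derivative of the Stefan–Boltzmann law F = σT⁴):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m-2 K-4

# Budget figures as quoted above
net_surface_heating = 356 - 333   # W m-2, surface longwave exchange -> 23
net_space_cooling = 239 - 40      # W m-2, atmosphere to space -> 199

print(net_surface_heating / 239)      # fraction of heating budget, ~0.10
print(net_space_cooling / 239)        # fraction of cooling budget, ~0.83
print(3.7 / net_surface_heating * 3)  # temperature change, ~0.48 degrees

# Sensitivity of outgoing radiation to temperature: dF/dT = 4*sigma*T**3
def radiative_sensitivity(temp_k):
    return 4 * SIGMA * temp_k**3

print(radiative_sensitivity(250))  # ~3.5 W m-2 per K
print(radiative_sensitivity(300))  # ~6.1 W m-2 per K
```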
Post by steve on Mar 26, 2010 17:28:12 GMT
poitsplace wrote: "You're so close to understanding it that you are trying to criticize me by pointing out that it would show behavior that I'm saying it shows. As CO2's emission/absorption efficiency increases"

You mean as CO2 levels rise, I assume?

"it tries to force one temperature gradient...but as the gradient increases, convection and latent heat begin bypassing CO2, taking the energy up to the higher levels that way (and heating it) instead of it having to pass as radiation through CO2's absorption gauntlet."

Again you are not getting it. A body of air at a particular temperature contains a certain amount of CO2. The CO2 molecules are excited and de-excited continuously by collision. The proportion of time they spend in an excited state depends on the temperature. The higher the proportion, the more chance the CO2 has to emit a photon. There is no "bypassing" of anything.

"CO2 emissions (of radiation) are indeed one way in which much of the atmospheric energy is lost...but it is convection and latent heat that transfer most of the energy for CO2 to emit in the first place."

Transfers it to where? Eventually the energy needs to be radiated into space from somewhere. Where is this place to which the heat is transferred such that the extra CO2 cannot influence it?

"CO2's absorption is therefore far less important...and therefore has a far, far weaker 'forcing' than suggested by the raw absorption math. In a slab-like atmosphere you would suspect a doubling of CO2 might cause as much as a 1.2C increase in temperature...but in reality it must be substantially lower...less than half."
Post by steve on Mar 26, 2010 17:51:12 GMT
northsphinx
Post by poitsplace on Mar 26, 2010 19:01:41 GMT
steve wrote: "Again you are not getting it. A body of air at a particular temperature contains a certain amount of CO2. The CO2 molecules are excited and de-excited continuously by collision. The proportion of time they are in an excited state depends on the temperature. The higher the proportion, the more the CO2 has chance to emit a photon. There is no 'bypassing' of anything."

Your confusion is somehow both annoying and amusing. You and all the others in the AGW camp are claiming that CO2 is going to slow down the process of radiating energy into space...from the surface. Now imagine if there were some magical process by which the warmed surface of the earth were picked up and carried most of the way through the atmosphere, where it was allowed to emit from there...BYPASSING all of this CO2 that you're so hung up on you can't think straight. Well, it turns out mother nature is way ahead of me. As soon as any temperature gradient shows up, the warmed air at the surface starts rushing up...and is replaced by cold air rushing down. Water vapor does even more. The higher the gradient, the more powerful the forces of convection and latent heat become...and at a much higher rate than CO2. LOL, it is transferred to the point at which convection and water vapor no longer have any sway...which, not at all coincidentally, is the point at which the temperature gradient inverts and CO2 offers no resistance to outgoing radiation. It's dropped off at the tropopause.