Post by thingychambers69 on Dec 3, 2009 0:21:43 GMT
I just got a pop-up too, which I never get. I wonder... I have never gotten anything on here. I'd been here a while before commenting. I stopped looking for a while because it looked like cycle 24 was supposed to be off the charts, until I got an email. No sense in arguing when the sun was active. I'll have to check the logs for attacks. What the f**k is going on?
Post by thingychambers69 on Dec 3, 2009 0:25:06 GMT
Unsurprisingly, the answer seems reasonably straightforward.
1. Only some of the original data is lost.
2. The loss refers to ground stations; there are other sources of data, and in particular the 1940s bump is from the sea surface temperatures. It is not a ground-station issue, as the image below shows.
3. There are other sources of data coming online all the time (e.g. digitisation of old log books).
Which, of course, does not address the credibility issue (Note: source of info).
True. Can we really trust the Met Office anymore? Steve, do you work for the Met Office?
Post by glc on Dec 3, 2009 9:35:58 GMT
True. Can we really trust the Met Office anymore?
Use UAH then.
And while you're at it, explain why we have the warmest November on record during the deepest solar minimum in a century.
Post by dwerth on Dec 3, 2009 9:47:43 GMT
GLC -
That's actually pretty easy. Recorded temperature history only goes back reliably to the mid-19th century; let's call it 1850 for convenience's sake. At that point we were coming out of the LIA and recovering to "normal" temperatures for the Holocene optimum. That is the point we find ourselves at circa 2009: within half a degree C of a point in time we called "normal". We have had an enormous amount of heat energy added to the Earth's "balance sheet", if you will, and it takes time for that amount of energy to unwind.
Post by steve on Dec 3, 2009 14:13:55 GMT
Are all the temperature datasets fraudulent, or was there a Little Ice Age? Let me know when you've decided, and then maybe we can continue.
Post by icefisher on Dec 3, 2009 14:56:03 GMT
True. Can we really trust the Met Office anymore?
Use UAH then. And while you're at it, explain why we have the warmest November on record during the deepest solar minimum in a century.
Sure, several points:
1) The Svensmark solar-effect minimum is still off in the future.
2) The atmosphere is a poor measure due to wide short-term fluctuations, swinging by big chunks on a monthly basis. Likewise, you might explain why the decade of the 1930s is still the holder of most maximum-temperature records in the US.
3) Scientifically speaking, you are like a schizophrenic... getting excited by a warm excursion of one month, while next week pooh-poohing a 10-year atmospheric excursion as "weather".
4) It's been getting warmer by all measures for more than a century, so the lowest minimum in a century isn't exactly a cooling benchmark. Now, few sunspots for 70 years, that's a different matter.
5) Continued centennial/millennial-scale warming from natural variation hasn't been ruled out.
In short, a lot remains to be explained. You are welcome to ralph as another dip on this roller-coaster ride comes up; but kindly don't flip the ship by joining the stampede for the railing on one side... try ralphing on the uncrowded side. It's actually a lot better.
Post by thingychambers69 on Dec 3, 2009 15:28:50 GMT
Are all the temperature datasets fraudulent, or was there a Little Ice Age? Let me know when you've decided, and then maybe we can continue.
Maybe we can't trust any of them. I would definitely start questioning the validity of the data collected from the 1980s onwards. Another point: a lot of people are still confusing data obtained from modelling with actual recorded data. Climate scientists have done nothing to address this confusion.
Post by thingychambers69 on Dec 3, 2009 15:39:58 GMT
GLC - That's actually pretty easy. Recorded temperature history only goes back reliably to the mid-19th century; let's call it 1850 for convenience's sake. At that point we were coming out of the LIA and recovering to "normal" temperatures for the Holocene optimum. That is the point we find ourselves at circa 2009: within half a degree C of a point in time we called "normal". We have had an enormous amount of heat energy added to the Earth's "balance sheet", if you will, and it takes time for that amount of energy to unwind.
I agree. It stands to reason that if we started recording data at the end of the Little Ice Age, we would see rising temperatures as we came out of it. Then again, that's if you believe all the historical reports of crops failing and rivers freezing over.
Post by Pooh on Dec 3, 2009 15:53:01 GMT
Global warming controversy hits NASA climate data
(Duplicate of reply #768 on Re: Breaking news: CRU correspondence and data lea)
Stephen Dinan, Washington Times, December 3, 2009
www.washingtontimes.com/news/2009/dec/03/nasa-embroiled-in-climate-dispute/
Excerpts:
"The fight over climate science is about to cross the Atlantic, with a U.S. researcher poised to sue NASA, demanding the release of the same kind of information that landed a leading British center in hot water over charges that it skewed its data.
"Christopher C. Horner, a senior fellow at the Competitive Enterprise Institute, said NASA has refused for two years to provide information under the Freedom of Information Act that would show how the agency has shaped its climate data and explain why the agency has repeatedly had to correct its data dating as far back as the 1930s. ...
"NASA and CRU data are considered the backbone of much of the science that suggests the Earth is warming as a result of man-made greenhouse gas emissions. NASA argues that its data suggest this decade has been the warmest on record.
"On the other hand, data from the University of Alabama-Huntsville suggest temperatures have been relatively flat for most of this decade."
Post by smoore on Dec 3, 2009 19:29:15 GMT
I've visited the forums here several times... and usually don't have any reason to post. But this graph bothers me. The uncertainty in the top graph for 1850 looks like about +/-0.5 C; the middle graph, +/-0.2 C (eyeballing both, of course). The bottom graph is also +/-0.2 C. Since it combines the two graphs, shouldn't the uncertainty be at least equal to the largest uncertainty among the data sets? The Met Office also seems pretty certain about the SSTs going back to 1850. I would think there'd be greater uncertainty in the sea temperatures than the land temperatures.
Post by dwerth on Dec 3, 2009 20:45:20 GMT
Indeed, Smoore, that should be the case. One cannot have tighter precision in any endeavor than the tool or object with the worst precision. This is a basic tenet of manufacturing that holds true for any measurement of anything. You may be perfectly accurate when measuring something, hitting the target every time, but the precision of the measuring device sets the margin of error. You cannot have a margin of error on a combined data set that is smaller than the margin of error of any one of its components.
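That rule can be put numerically. A minimal sketch, with made-up uncertainties and the usual quadrature rule for independent errors (the function name and both sigma values are illustrative, not from any post): for a sum or difference of measurements, the combined uncertainty is indeed never smaller than its largest component. (Averaging many independent readings is a different situation, which Graeme's pendulum example further down gets at.)

```python
import math

# Quadrature combination of independent uncertainties (illustrative values).
# For a sum or difference, the combined uncertainty can never be smaller
# than the largest single component.
def combine_in_quadrature(*sigmas):
    """Root-sum-square of independent uncertainties."""
    return math.sqrt(sum(s * s for s in sigmas))

land_sigma = 0.2  # deg C -- hypothetical land-record uncertainty
sea_sigma = 0.5   # deg C -- hypothetical sea-surface uncertainty

combined = combine_in_quadrature(land_sigma, sea_sigma)
print(f"combined: +/-{combined:.3f} C vs largest component +/-{sea_sigma} C")
```

With these numbers the combined figure comes out a little above the sea-surface uncertainty alone, which is the intuition smoore is appealing to.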
Post by socold on Dec 3, 2009 20:49:45 GMT
Global warming controversy hits NASA climate data
Stephen Dinan, Washington Times, December 3, 2009 <snip: quoted in full in Pooh's post above>

The Washington Times is clearly a biased mouthpiece, albeit not as extreme as the Telegraph or the Register. CEI is the red flag; this has "PR attack" written all over it. The best distortion, though, is this part: "NASA argues that its data suggest this decade has been the warmest on record. On the other hand, data from the University of Alabama-Huntsville suggest temperatures have been relatively flat for most of this decade." UAH data also show this decade has been the warmest on record, so their "on the other hand" is false. But yeah, a nice wordplay trick was done there.
Post by Graeme on Dec 3, 2009 21:03:28 GMT
Indeed, Smoore, that should be the case. One cannot have tighter precision in any endeavor than the tool or object with the worst precision. You cannot have a margin of error on a combined data set that is smaller than the margin of error of any one of its components.

Well... generally, but not always. There are circumstances where the margin of error can be reduced, though you have to design your experiment accordingly. I remember from my high-school days doing an experiment to calculate the acceleration due to gravity by observing the period of a pendulum. The margin of error in timing a single swing was high, but the timing error was the same whether you timed one swing or a thousand. So timing a thousand swings and then dividing the time by a thousand reduces the margin of error on the period by a factor of a thousand.

The relevance to the topic at hand is that historical temperature measurements were not taken with a view to determining climatic trends. Older measurements probably have around a +/-0.5 degree reading error (since those measurements would have been taken manually by viewing a thermometer), but for the purpose for which they were taken, that was considered sufficient. When we're trying to look at trends of below 0.5 degrees, though, that accuracy isn't sufficient. All that can be done is to assume that, given enough readings from enough stations, the errors cancel out, i.e. that the readings that were 0.5 degrees too high will balance those that were 0.5 degrees too low. That's an assumption, though, and not provable.

And, of course, we're ignoring any temperature bias due to UHI or time of reading. I personally think the latter is probably insignificant, but the former can easily be larger than the trends we're looking at. In short, I agree that the error margins on these figures need to be explored in more depth to ensure they are being calculated accurately. The same applies to any adjustments made to historical temperatures (due to UHI, site-location moves, etc). How are the error margins on those adjustments determined?
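The pendulum arithmetic above can be sketched in a few lines. This is a toy illustration with an assumed 0.2 s stopwatch reading error (not a figure from the thread): one stopwatch reading carries a fixed error, so timing N swings in a single go divides the error on the period by N, whereas averaging N independent timings only shrinks random error by sqrt(N).

```python
import math

# Timing N swings in one go: the stopwatch error is fixed per reading,
# so the error on the inferred *period* falls as 1/N.
reading_error = 0.2  # seconds of reaction-time error per timing (assumed)
for n_swings in (1, 10, 1000):
    period_error = reading_error / n_swings
    print(f"{n_swings:5d} swings timed at once -> period error +/-{period_error:.4f} s")

# Averaging N *independent* timings is weaker: random error only falls
# as 1/sqrt(N) -- which is the situation with temperature stations, where
# each station contributes its own independent reading error.
n_readings = 1000
print(f"mean of {n_readings} separate timings -> +/-{reading_error / math.sqrt(n_readings):.4f} s")
```

The 1/N versus 1/sqrt(N) distinction is the crux: the pendulum experiment was deliberately designed to get the stronger reduction, while a station network can at best hope for the weaker one.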
Post by dwerth on Dec 3, 2009 21:20:04 GMT
Graeme-
I remember doing the same pendulum experiments in my high school physics class. However, I am not sure that analogy truly carries over to this case. That experiment relies on one pendulum being directly observed over a set period of time in a controlled environment. Here, by contrast, we have some 2,000 US monitoring stations being observed over time, but the conditions are not controlled. In fact, many of the stations have become contaminated with UHI effects, bad siting, or other factors.
Just my $.03 (inflation don't ya know)
Post by Graeme on Dec 3, 2009 21:25:12 GMT
Graeme- I remember doing the same pendulum experiments in my high school physics class. However, I am not sure that analogy truly carries over to this case. That experiment relies on one pendulum being directly observed over a set period of time in a controlled environment. Here, by contrast, we have some 2,000 US monitoring stations being observed over time, but the conditions are not controlled. In fact, many of the stations have become contaminated with UHI effects, bad siting, or other factors. Just my $.03 (inflation don't ya know)

I agree. The point I didn't make clearly was that the pendulum experiment was designed to minimise the error margin, whereas historical temperature collections were not, because minimising it was deemed unnecessary for the purpose for which those collections were made. Indeed, I suspect error margins were never even recorded (after all, who really wanted to know that yesterday's temperature reached 37+/-0.5 degrees?). Having a high error margin on one of the inputs doesn't necessarily mean a high error margin on the output, but it will unless you design things appropriately.
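The "errors cancel out" assumption is easy to poke at with a toy simulation. All numbers here are hypothetical (the 2,000 stations merely echoes dwerth's figure): each station gets an independent reading error uniform in [-0.5, +0.5] C, and we check how far the network-wide mean drifts from an assumed true value.

```python
import random

# Toy check of the "errors cancel out" assumption with hypothetical numbers.
random.seed(42)
true_anomaly = 0.3  # deg C, assumed true value
n_stations = 2000

# Independent, unbiased reading errors: the mean stays close to truth.
readings = [true_anomaly + random.uniform(-0.5, 0.5) for _ in range(n_stations)]
mean_reading = sum(readings) / n_stations
print(f"error of the network mean: {abs(mean_reading - true_anomaly):.4f} C")

# Cancellation only works for unbiased, independent errors. A shared
# systematic bias -- say UHI adding +0.2 C everywhere -- survives averaging:
biased_mean = sum(r + 0.2 for r in readings) / n_stations
print(f"error with a shared +0.2 C bias: {abs(biased_mean - true_anomaly):.4f} C")
```

This is exactly the split the two posters are circling: random per-station error does average away with enough stations, but a correlated bias like UHI does not, no matter how many stations you add.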