Post by glc on May 12, 2009 16:05:50 GMT
I do not accept 1992 as a reasonable start date for global temperature anomaly trends, land, sea, or combined. You claim that you chose that date for a reason other than that it yields the maximum warming-per-decade signal. Timo is correct to call you on this.
A start period of 1992 gives a large decadal warming trend because it follows the Pinatubo eruption (1991), which resulted in 2-3 years of cooler-than-'normal' temperatures. I accept that. But the size of the trends is not the issue; it's the similarity of the trends across the 4 main global temperature records.
Why choose 1992?
Because before 1992 there was some disagreement between UAH and RSS (and, by extension, GISS & Hadley also). Between 1979 and 1992, RSS measurements were cooler than UAH (see Roy Spencer's blog).
Why was RSS cooler than UAH?
I don't know, and I can't find anyone else who does either. But the effect of the cool early period is that the RSS trend from 1979 to date is higher than the UAH trend. However, since 1992 there has been much closer agreement between RSS and UAH. In fact, their trends are virtually identical at 0.22 deg per decade. Not only that, but because the GISS and Hadley trends are quite similar to RSS, all 4 trends are within a few hundredths of a degree of each other.
Without Pinatubo, the trends would be lower, but they would still be the same (or similar) for each record. I don't believe Timo's criticism is justified.
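For illustration, here is a minimal sketch of what "trend per decade from a given start year" means in this discussion. The series below is made up (it is not a real UAH/RSS/GISS/Hadley download), and the helper name is my own; it just shows an ordinary least-squares slope computed from different start years.

```python
# Illustrative only: OLS trend of a monthly anomaly series in deg C per decade,
# starting from a chosen year. The synthetic series stands in for real data.
import numpy as np

def decadal_trend(years, anomalies, start_year):
    """OLS slope in deg C per decade for data from start_year onward."""
    years = np.asarray(years, dtype=float)
    anomalies = np.asarray(anomalies, dtype=float)
    mask = years >= start_year
    slope_per_year, _intercept = np.polyfit(years[mask], anomalies[mask], 1)
    return slope_per_year * 10.0

# Fake 1979-2008 monthly series: ~0.2 deg/decade underlying trend plus noise
t = 1979 + np.arange(360) / 12.0
anoms = 0.02 * (t - 1979) + np.random.normal(0.0, 0.1, t.size)
print(decadal_trend(t, anoms, 1979))   # full-record trend
print(decadal_trend(t, anoms, 1992))   # trend computed from 1992 onward
```

The point of the argument above is not the size of either number but whether the four records give the same number when computed the same way.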
In anticipation of further comments: isn't it true that GISS has diverged from the other metrics in recent years?
Yes it is, but I believe this is due to the GISS extrapolation used to estimate the Arctic. In the last few years the Arctic has been relatively much warmer than other regions, and this has bumped up GISS anomalies. However, in late 2008/early 2009 the Arctic cooled a bit, and this showed up in the GISS record. Using the same base period as the satellites (1979-1997), GISS was much cooler than the others. For example, in February the GISS anomaly was +0.15 while UAH was +0.36.
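To make the base-period point concrete, here is a minimal sketch (synthetic numbers, not the real GISS or UAH files): anomalies quoted against different base periods can be compared by re-zeroing each series over the same period, e.g. the 1979-1997 period mentioned above.

```python
# Sketch of re-baselining anomaly series to a common base period so that
# different reference periods don't masquerade as real differences.
# Both series are synthetic stand-ins, not real GISS/UAH data.
import numpy as np

def rebaseline(years, anomalies, base_start, base_end):
    """Shift a series so its mean over [base_start, base_end] (inclusive) is zero."""
    years = np.asarray(years, dtype=float)
    anomalies = np.asarray(anomalies, dtype=float)
    in_base = (years >= base_start) & (years < base_end + 1)
    return anomalies - anomalies[in_base].mean()

t = 1979 + np.arange(360) / 12.0
giss_like = 0.30 + 0.02 * (t - 1979)   # same trend, different baseline offset
uah_like = 0.02 * (t - 1979)

# On the common 1979-1997 base the two series can be compared month by month
g = rebaseline(t, giss_like, 1979, 1997)
u = rebaseline(t, uah_like, 1979, 1997)
print(round(g[-1] - u[-1], 3))   # ~0: the apparent offset was just the baseline
```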
Post by socold on May 12, 2009 18:35:40 GMT
Rather than write a bit of code to presumably get rid of perceived errors, why don't they FIX THE FLAWS in the network? Because so far all attempts to build a time machine have ended in failure. That's the flippant answer.
Post by ron on May 12, 2009 23:18:58 GMT
Because so far all attempts to build a time machine have ended in failure. That's what I would have thought.
Post by crakar24 on May 13, 2009 0:56:16 GMT
Rather than write a bit of code to presumably get rid of perceived errors, why don't they FIX THE FLAWS in the network? Because so far all attempts to build a time machine have ended in failure. That's the flippant answer. Build a f@#$%^$#@g time machine is your response? I got a better idea: if you want to measure the global temps down to 3 decimal places and then use this data to show the world the sky is falling, then I suggest, in lieu of a time machine, you get off your big fat arse and build/repair and maintain a system capable of doing so. Until then I suggest Gore et al keep their mouths shut when it comes to "ooooh, looky here, the Jan anomaly was 0.12547893736578 above the average, the sky is falling, the sky is falling". And Stevesocold, don't tell me how to produce good, accurate, reliable data, as I do this for a living. If I had to work on a data collection system as broken as this one, I would reject the data out of hand unless I was keeping it for curiosity's sake.
Post by solartrack on May 13, 2009 3:12:19 GMT
Heh heh. I could hear the hammering on the keys. You'll get him next round, champ...
Post by glc on May 13, 2009 7:34:06 GMT
Heh heh. I could hear the hammering on the keys. You'll get him next round, champ... As well as 'contributing' on this blog, I've been arguing against catastrophic AGW on pro-AGW blogs for years, and I would dearly love to have been able to land the knockout blow. But believe me, the accuracy of the surface temperature record isn't going to produce it. If the surface record is significantly in error, then the satellite record is significantly in error and, presumably, the ocean temperature data is also in error.
crakar24 and the rest can bang away all they want, but over the past 20 years all 4 temperature records have been in close agreement. Whatever else it achieves, Anthony's survey is not going to alter the recent surface trend to any significant degree.
Post by steve on May 13, 2009 9:05:21 GMT
Because so far all attempts to build a time machine have ended in failure. That's the flippant answer. And Stevesocold, don't tell me how to produce good, accurate, reliable data, as I do this for a living. If I had to work on a data collection system as broken as this one, I would reject the data out of hand unless I was keeping it for curiosity's sake. It's a huge waste to reject data just because it is imperfect. Most remote sensing applications, in all areas, have to deal with errors and degradations in systems. No one says "Oh dear, the sensor has degraded a bit on that 200 million pound satellite we put up 5 years ago. Let's stop using the data." Furthermore, studying the errors helps you to improve future applications. If you are in a job where you can reject data out of hand, is it a job where it is cost-effective to repeat the measurements?
Post by woodstove on May 13, 2009 12:02:54 GMT
And Stevesocold, don't tell me how to produce good, accurate, reliable data, as I do this for a living. If I had to work on a data collection system as broken as this one, I would reject the data out of hand unless I was keeping it for curiosity's sake. It's a huge waste to reject data just because it is imperfect. Most remote sensing applications, in all areas, have to deal with errors and degradations in systems. No one says "Oh dear, the sensor has degraded a bit on that 200 million pound satellite we put up 5 years ago. Let's stop using the data." Furthermore, studying the errors helps you to improve future applications. If you are in a job where you can reject data out of hand, is it a job where it is cost-effective to repeat the measurements? You sound about as interested in getting good data as Jim Hansen.
Post by socold on May 13, 2009 22:55:38 GMT
Because so far all attempts to build a time machine have ended in failure. That's the flippant answer. Build a f@#$%^$#@g time machine is your response? I got a better idea: if you want to measure the global temps down to 3 decimal places and then use this data to show the world the sky is falling, then I suggest, in lieu of a time machine, you get off your big fat arse and build/repair and maintain a system capable of doing so. Until then I suggest Gore et al keep their mouths shut when it comes to "ooooh, looky here, the Jan anomaly was 0.12547893736578 above the average, the sky is falling, the sky is falling". And Stevesocold, don't tell me how to produce good, accurate, reliable data, as I do this for a living. If I had to work on a data collection system as broken as this one, I would reject the data out of hand unless I was keeping it for curiosity's sake. Additionally, you really answered your own question within your question: "Rather than write a bit of code to presumably get rid of perceived errors, why don't they FIX THE FLAWS in the network?" If a bit of code can fix it sufficiently, there's little need to fix the flaws in the network. It would be nice to do, but there is such a thing in the business world as putting in too much effort for negligible gain. If fixing the flaws would only be a rubber stamp, then I'd rather they didn't bother.
Post by kenfeldman on May 14, 2009 0:26:29 GMT
Early work on just using the data from the "high quality, rural" (CRN 1 and 2) surface stations basically validated the NASA GISS temperature data for the US. After that, it seems like most commenters on WUWT lost interest in the project. Um... did you notice how GISS is warmer most of the time? Um... did you notice that the rate of change is pretty damn close? The decadal trends would be within the margins of error of each other. In other words, what GISS is intended to measure (climate change, not the daily or monthly weather), it's doing quite well.
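As a rough, hedged illustration of the "within the margins of error" point (synthetic series, ordinary least squares, autocorrelation ignored, so the error bars are only indicative and the series names are made up):

```python
# Rough sketch: fit a trend to two anomaly series and report slope +/- 1-sigma
# standard error. Both series are synthetic; a real comparison would use the
# GISS and rural-station (CRN 1-2) averages. Note that OLS errors understate
# the true uncertainty because monthly anomalies are autocorrelated.
import numpy as np

def trend_with_error(t, y):
    """OLS slope in deg C per decade and its standard error, from polyfit covariance."""
    coeffs, cov = np.polyfit(np.asarray(t, float), np.asarray(y, float), 1, cov=True)
    return coeffs[0] * 10.0, np.sqrt(cov[0, 0]) * 10.0

t = 1979 + np.arange(360) / 12.0
series_a = 0.025 * (t - 1979) + np.random.normal(0.0, 0.15, t.size)
series_b = 0.023 * (t - 1979) + np.random.normal(0.0, 0.15, t.size)

trend_a, err_a = trend_with_error(t, series_a)
trend_b, err_b = trend_with_error(t, series_b)
# The trends "agree" in this loose sense if their difference is small
# compared with the combined error bars.
print(f"A: {trend_a:.2f} +/- {err_a:.2f}   B: {trend_b:.2f} +/- {err_b:.2f} deg/decade")
```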
Post by crakar24 on May 14, 2009 5:14:24 GMT
And Stevesocold, don't tell me how to produce good, accurate, reliable data, as I do this for a living. If I had to work on a data collection system as broken as this one, I would reject the data out of hand unless I was keeping it for curiosity's sake. It's a huge waste to reject data just because it is imperfect. Most remote sensing applications, in all areas, have to deal with errors and degradations in systems. No one says "Oh dear, the sensor has degraded a bit on that 200 million pound satellite we put up 5 years ago. Let's stop using the data." Furthermore, studying the errors helps you to improve future applications. If you are in a job where you can reject data out of hand, is it a job where it is cost-effective to repeat the measurements? Silly example with the satellite, but I understand your point. However, in my line of work we usually only get one shot at getting good data, and if we don't, we have to wait a year or two (and a couple of million dollars later) to have another go, so the data gathering system needs to be reliable. But it goes further than that: if the data you collect is rubbish, then what is the point of collecting it? It serves no purpose. You can add a software bias to the data, but that really boils down to a guess and no more. If there are errors in the US system, then address them so you can have confidence in the data you receive. To GLC: the point of doing a station survey is to highlight the inadequate standard the Yanks operate at. Whether Watts' actions cause temps to go up or down is not the issue; data integrity is. Well, it used to be, anyway.
Post by tacoman25 on May 14, 2009 5:21:25 GMT
Um... did you notice how GISS is warmer most of the time? Um... did you notice that the rate of change is pretty damn close? The decadal trends would be within the margins of error of each other. In other words, what GISS is intended to measure (climate change, not the daily or monthly weather), it's doing quite well. For the U.S., perhaps. But globally, GISS has measured a greater rate of change over the past century than any other temperature source.
Post by jimg on May 14, 2009 5:34:42 GMT
Kenfeldman.
Could you stretch out that Y-axis a little more?
That 0.1C should look a little bigger over 100+ years.
BTW, what was the margin of error on that?
Post by glc on May 14, 2009 8:09:27 GMT
If there are errors in the US system, then address them so you can have confidence in the data you receive. To GLC: the point of doing a station survey is to highlight the inadequate standard the Yanks operate at. Whether Watts' actions cause temps to go up or down is not the issue; data integrity is. Well, it used to be, anyway.
Trends for the contiguous 48 US states since 1979:
UAH: +0.26 deg per decade; GISS: +0.25 deg per decade
Whatever problems there might be with individual station data, they don't appear to affect the overall conclusion, i.e. that there has been warming over the past 30 years.
Whether Watts' actions cause temps to go up or down is not the issue; data integrity is. Well, it used to be, anyway.
That's fine, but that's not how it's been viewed by posters on here and other blogs.
Post by steve on May 14, 2009 9:45:17 GMT
It's a huge waste to reject data just because it is imperfect. Most remote sensing applications, in all areas, have to deal with errors and degradations in systems. No one says "Oh dear, the sensor has degraded a bit on that 200 million pound satellite we put up 5 years ago. Let's stop using the data." Furthermore, studying the errors helps you to improve future applications. If you are in a job where you can reject data out of hand, is it a job where it is cost-effective to repeat the measurements? Silly example with the satellite, but I understand your point. I'm not sure why it was a silly example. It derives from a paper I was reading about an Earth-observing satellite whose different components were degrading in different ways, leading to an incorrect trend. The *hypothesis* is that the data is rubbish. But the data have been thoroughly tested in numerous different ways: for example, by comparing trends from different subsets of stations, or by comparing trends between subsets of days when biases would be smaller and days when they would be larger. I get the impression that this is why there is more focus on satellites (which, as we have found, have their own issues). A rough sketch of the subset test is below.
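This is only an illustration: the station data here are randomly generated, and a real check would use actual station records and sensible groupings (e.g. well-sited vs poorly-sited stations) rather than arbitrary halves.

```python
# Sketch: if the network-wide trend were driven by a handful of bad stations,
# different subsets of stations would not keep giving the same answer.
# All station data below are randomly generated for illustration.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1979, 2009, dtype=float)
n_stations = 100

# Fake annual station anomalies: a shared 0.02 deg/yr signal plus station noise
signal = 0.02 * (years - 1979)
stations = signal + rng.normal(0.0, 0.3, size=(n_stations, years.size))

def trend_per_decade(t, y):
    """OLS slope in deg C per decade."""
    return np.polyfit(t, y, 1)[0] * 10.0

# Compare trends from two arbitrary halves of the network
subset_a = stations[: n_stations // 2].mean(axis=0)
subset_b = stations[n_stations // 2 :].mean(axis=0)
print(trend_per_decade(years, subset_a), trend_per_decade(years, subset_b))
```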