Post by Pooh on Jan 18, 2010 7:45:39 GMT
"GISS Data Manipulation? It is alleged that the GISS data set originally had 6000 temperature monitoring points. The base temperature (i.e. the base used for the above-average calculation) used the original 6000 temperature monitoring points. The current published GISS planetary temperature uses only 1500 data points. The points used appear to be the warmer ones. (The colder high-altitude data points and the high-latitude points have been removed.) In addition to the dropping of 4500 temperature points, an algorithm is applied to the 1500 measured temperature data points that 'homogenizes' the measured temperature data. The effect of the 'homogenizing' algorithm is to raise the temperature. GISS data manipulation might explain why the GISS data does not correlate with satellite measurements of planetary temperature. www.kusi.com/weather/colemanscorner/81559212.html"

"If you disproportionately remove sites from a given area, the weight of the neighbouring site rises in proportion. So this claim does not directly have an effect."

Good Grief! First, try averaging the non-zero integers from -5 to +5. Then from +1 to +5. The same result occurs if colder stations are removed from the GIStemp et al. databases, or if a series is truncated where it begins to decline (Briffa). But I do agree with you that if the colder sites are removed, the weight of the neighboring warmer sites does rise in proportion. And that is part of the problem. See E.M. Smith for a discussion of the misuse of anomalies.

Which Steve McIntyre and E.M. Smith have been doing. It is difficult and time-consuming, given that the code and original data are undocumented, often missing and sometimes corrupted. Given the collusion between certain analytical centers, agreement can be expected. However, the satellite data are diverging, and the UAH satellites are self-calibrating.

Spencer, Roy W., Ph.D. "How the UAH Global Temperatures Are Produced." Global Warming (drroyspencer.com), January 6, 2009. www.drroyspencer.com/2010/01/how-the-uah-global-temperatures-are-produced/

No. The science is in, and the facts are indisputable. See above. The time for shell games is over. See above. See also the ClimateGate emails coordinating refusal to comply with FOI requests.

For GIStemp, read E.M. Smith:
GIStemp – A Human View « Musings from the Chiefio: chiefio.wordpress.com/2009/11/09/gistemp-a-human-view/
GIStemp « Musings from the Chiefio: chiefio.wordpress.com/gistemp/
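Pooh's averaging example above can be checked directly; a minimal sketch in plain Python, using only the illustrative numbers from the post itself:

```python
# Pooh's example: average the non-zero integers from -5 to +5,
# then only the positive ones, to show how dropping the "cold half"
# of a sample raises the simple average.
full_sample = [t for t in range(-5, 6) if t != 0]   # -5..-1, 1..5
warm_only = [t for t in range(1, 6)]                # 1..5

mean_full = sum(full_sample) / len(full_sample)     # 0.0
mean_warm = sum(warm_only) / len(warm_only)         # 3.0

print(mean_full, mean_warm)
```

This only demonstrates the effect on a simple average of absolute values; whether it carries over to an anomaly-based analysis is exactly what is disputed later in the thread.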
Post by steve on Jan 18, 2010 10:27:39 GMT
Pooh
It actually tends to be some of the colder northern sites that suffer more UHI, so it is not obvious that simply removing cold stations will change the anomaly. And, as has been pointed out, choosing a different set of stations doesn't seem to give much of a different answer. For example, choosing the 70 "best US stations" according to the Watts project gives you a similar result to the one obtained from the whole set.
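The anomaly point can be illustrated with a toy sketch (hypothetical station values, not real GISS data): two stations with very different absolute temperatures but the same warming trend yield the same mean anomaly with or without the cold station, even though the mean absolute temperature shifts when it is dropped.

```python
# Two hypothetical stations: a cold one (-20 C base) and a warm one
# (+15 C base), both warming by 0.1 C per year over 10 years.
years = list(range(10))
cold = [-20.0 + 0.1 * y for y in years]
warm = [15.0 + 0.1 * y for y in years]

def anomalies(series):
    """Anomaly = departure from the station's own long-term mean."""
    base = sum(series) / len(series)
    return [t - base for t in series]

# Mean anomaly in the final year, with and without the cold station:
both = (anomalies(cold)[-1] + anomalies(warm)[-1]) / 2
warm_only = anomalies(warm)[-1]
print(both, warm_only)   # effectively identical

# ...but the mean *absolute* temperature jumps when the cold one is dropped:
print(sum(cold + warm) / 20, sum(warm) / 10)
```

The sketch assumes the dropped station shares the trend of the kept one; if the trends differ, removal does bias the anomaly, which is why the overlap-period trend comparisons mentioned later in the thread matter.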
That's a poor excuse. There is enough data out there to calculate your own temperature metric. You don't have to try to follow exactly the same procedure. Get the data. Do the analysis. Provide a metric with good global coverage. Publish it.
Post by icefisher on Jan 18, 2010 19:39:19 GMT
"GISS Data Manipulation? It is alleged that the GISS data set originally had 6000 temperature monitoring points. The base temperature (i.e. the base used for the above-average calculation) used the original 6000 temperature monitoring points. The current published GISS planetary temperature uses only 1500 data points. The points used appear to be the warmer ones. (The colder high-altitude data points and the high-latitude points have been removed.) In addition to the dropping of 4500 temperature points, an algorithm is applied to the 1500 measured temperature data points that 'homogenizes' the measured temperature data. The effect of the 'homogenizing' algorithm is to raise the temperature. GISS data manipulation might explain why the GISS data does not correlate with satellite measurements of planetary temperature. www.kusi.com/weather/colemanscorner/81559212.html"

"If you disproportionately remove sites from a given area, the weight of the neighbouring site rises in proportion. So this claim does not directly have an effect."

This would be true if you had a hard line across which synthetic spatial averaging was not occurring. Removing high-latitude stations creates gaps that denser lower-latitude stations will fill, effectively raising the average temperature by moving the virtual latitude line. For high-elevation stations there may not be a neighboring high-elevation station. The fact is, in order to identify whether you have a cold station that should be averaged only with other cold stations, you need to keep the station in the database, even if you don't use it directly, so as to document why it is not being used and how you compensated for it. It's inexcusable that an audit trail is not being created for the "value added" calculations. Each step down this path erodes the credibility of the data. There should be an investigation of this, as it seems to be willful destruction of public assets for any such work funded by government funds.
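The gap-filling argument can be sketched with a deliberately crude toy model (hypothetical latitudes and temperatures, with a nearest-neighbour infill standing in for the real spatial averaging):

```python
# Hypothetical 1-D "latitude" toy: stations keyed by latitude with
# absolute temperatures. If a high-latitude station is dropped, its
# location gets infilled from the nearest surviving (warmer) station.
stations = {70: -15.0, 60: -5.0, 50: 5.0, 40: 12.0}  # lat: temp

def infill(lat, kept):
    """Fill a location from the nearest remaining station."""
    nearest = min(kept, key=lambda k: abs(k - lat))
    return kept[nearest]

kept = {lat: t for lat, t in stations.items() if lat != 70}  # drop 70N
filled = infill(70, kept)
print(filled)   # infilled from 60N: -5.0, versus the real -15.0
```

Whether the real gridding behaves this way depends on how anomalies, not absolute temperatures, are interpolated; the toy only shows that infilling absolute values from a warmer neighbour raises the filled value.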
Post by AstroMet on Jan 18, 2010 23:28:54 GMT
"If you disproportionately remove sites from a given area, the weight of the neighbouring site rises in proportion. So this claim does not directly have an effect."

"This would be true if you had a hard line across which synthetic spatial averaging was not occurring. Removing high-latitude stations creates gaps that denser lower-latitude stations will fill, effectively raising the average temperature by moving the virtual latitude line. For high-elevation stations there may not be a neighboring high-elevation station. The fact is, in order to identify whether you have a cold station that should be averaged only with other cold stations, you need to keep the station in the database, even if you don't use it directly, so as to document why it is not being used and how you compensated for it. It's inexcusable that an audit trail is not being created for the 'value added' calculations. Each step down this path erodes the credibility of the data. There should be an investigation of this, as it seems to be willful destruction of public assets for any such work funded by government funds."

Agreed. What Climategate shows, and what it is going to reveal about science in general as it has been practiced since 1989, is just how low some in the climate and meteorological community have sunk. The tampering with data in favor of an ideology has led to serious disruption of science, and of global public policy in general. It seems what we have here is a failure to communicate. The religious fervor of those who bought, hook, line and sinker, into "man-made global warming" has convinced half the world of something that is a lie, proving what Hitler once said about some people; not everyone, but a lot of "some" people.

The propaganda has led to attempts to prevent free speech and scientific research, to shouting down critics, tampering with raw data, misappropriating climate funds, and even musing about "defining what the peer-review literature is". It has led to one of the worst scandals modern science has seen in centuries, all over an outright lie: that humanity was the cause of global warming. Now the mainstream news media is wringing its hands, hoping that "Climategate" will go away, but it's not. It has just begun, and in the weeks and months ahead we are going to discover just how deep the rabbit hole goes when it comes to scientific fraud and the attempted brainwashing of the world to the cause of AGW climate change.
Post by nautonnier on Jan 19, 2010 0:04:09 GMT
I think what 'climategate' shows is that the denizens of academia may be very well 'educated' in their own way, but when it comes to some basic issues they are amateurish in the extreme: not keeping original data and an audit trail, not documenting what they are doing (apart from, by accident, the in-line comments from 'Harry'), backbiting, trying to fix the results of peer review, and expressing dislike of people purely for holding opposing views. All these are signs of people who have never worked in the world outside academia and who probably are unable to do so.
In business one has to learn to work successfully and productively with people that you dislike and who are in competition with you.
In business it is expected to have software and data controls and apply trained and tested standard processes for all aspects of the software and information life cycle. This is called quality assurance. In the case of the University of East Anglia Climatic Research Unit - the entire world was depending on their output - yet they would not be able to pass even a basic narrow scope ISO-9000-3 audit.
Think about it - the entire world is dependent on the CRU output - and it is not even up to basic QA standards.
One wonders if the University of East Anglia is _proud_ of their performance or trying to conceal it.
Post by scpg02 on Jan 19, 2010 0:24:03 GMT
"I think what 'climategate' shows is that the denizens of academia may be very well 'educated' in their own way, but when it comes to some basic issues they are amateurish in the extreme: not keeping original data and an audit trail, not documenting what they are doing (apart from, by accident, the in-line comments from 'Harry'), backbiting, trying to fix the results of peer review, and expressing dislike of people purely for holding opposing views. All these are signs of people who have never worked in the world outside academia and who probably are unable to do so."

They were too invested in being right. Their behavior is very political, to say the least.
Post by sentient on Jan 19, 2010 3:10:46 GMT
nautonnier, I would like to think you are right, but the very existence of their own words relating how they manipulated, lost, threw away or distorted the data is just far too compelling. In 35 years now as a practicing geologist, I have never known a true scientist who does anything but preserve his or her raw data with something akin to religious zeal. I still have copies of my original geophysical research dating to the mid-1970s on gold prospects which have never been eroded (and therefore did not produce the placers that led to the '49 gold rush, meaning they are vestal virgins still), all carefully documented and archived. In fact, it has just become the focus of what may be a new exploration venture for a new client.
astromet is correct here. What we have witnessed, and are still witnessing, is a surreal corruption of the scientific process and scientific method. I witness this new "science" almost daily in environmental litigations where remarkable subterfuges have been undertaken to mask or promote all sorts of things. It often takes a team of specialists to meticulously unravel these monstrosities and daylight not only the truth of these matters, but to document each misprision, each step of the process of each corruption.
Which is what is going on right now with both the CRU and GISS emails, and perhaps more importantly, with the rest of the files released in FOI2009.zip. Again, I agree with astromet, this was just the crack that will be progressively wedged further and further open, if for no other reason than there is just so much money involved.
Suits, petitions, and FOI requests are being constructed and filed. With each revelation, such as the GISS files (dare we refer to this one as Hansengate?), new angles are daylighted, more avenues of pursuit blossom, more dung is heaped onto the legal fires which have been smoldering for just such a thing. Public funding of many of these escapades assures a certain level of success as regards disclosure and discovery motions.
As fast as I can, I am working towards getting a piece of this litigation support action. (1) because someone is going to get it, (2) because I do this well, (3) as complex as it will undoubtedly become, it may be like taking candy from a baby, (4) because it is the right thing to do, and last but not least (5) because I am deeply offended by the hijacking and corruption of science itself.
This will take years, if not decades, to play itself out in the courts. But there are forces out there already marshaling talent and girding for the fight that is to come. steve, socold and glc are already integral components in my strategy. Their sallies are carefully noted, as some, if not all, of their reasoning I can well anticipate as the avenues I must also follow and eventually expect to counter. All of you here are already participating in what will eventually transpire at the most serious level: the courtrooms.
And I cannot thank you all enough. Keep up the good work, everyone!
Post by itsthesunstupid on Jan 19, 2010 5:35:18 GMT
What the last few months of revelations have demonstrated to me is that there is almost no act of manipulation, distortion or ostracism by the AGW priests that won't be defended by the rank-and-file Carbo-chondriacs. It is the moveable masses that need to be educated about the myth of AGW, because they will, in time, ensure the destruction of the elitists who currently control public policy and the institutions that attempt to give that policy credibility.
Post by scpg02 on Jan 19, 2010 6:03:59 GMT
"What we have witnessed, and are still witnessing, is a surreal corruption of the scientific process and scientific method. I witness this new 'science' almost daily in environmental litigations where remarkable subterfuges have been undertaken to mask or promote all sorts of things."

Because the environment is not their real concern; regulation, and thus control of private property, is.
Post by bxs on Jan 19, 2010 7:06:02 GMT
"As fast as I can, I am working towards getting a piece of this litigation support action. (1) because someone is going to get it, (2) because I do this well, (3) as complex as it will undoubtedly become, it may be like taking candy from a baby, (4) because it is the right thing to do, and last but not least (5) because I am deeply offended by the hijacking and corruption of science itself. This will take years if not decades to play itself out in the courts. But there are forces out there already marshaling talent and girding for the fight that is to come. steve, socold and glc are already integral components in my strategy. All of you here are already participating in what will eventually transpire at the most serious level: the courtrooms. And I cannot thank you all enough. Keep up the good work, everyone!"

While a noble cause indeed, I don't know how much legal ground you are standing on. I saw you in the "AGW, IPCC are not braking any laws..." thread: solarcycle24com.proboards.com/index.cgi?board=globalwarming&action=display&thread=1018. There, aside from the international laws, I mentioned the "Internet Decency Act of 1996". I, along with millions of other internet users, as well as all the major ISPs at the time, was actually part of a lawsuit challenging that idiocy. I still have the documents somewhere; I would offer to scan them for you, but I found this instead: epic.org/free_speech/censorship/lawsuit/ Of course, the list of plaintiffs doesn't include all the individuals who signed up, but the major players are there. Hope it helps.
Post by sentient on Jan 19, 2010 13:45:55 GMT
Thanks bxs. I hope to have some time today to get back on to this. Busy day ahead.
Post by steve on Jan 19, 2010 14:46:01 GMT
There is an "audit trail". Analyses were done to show that the trends in neighbouring stations were close enough to make no difference to the final result. The analyses were published.
If you disagree with the results then the data is still there to reanalyse. For example, for periods prior to their removal, compare the trends of the data from sites that were removed and sites that were kept.
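A minimal sketch of the comparison steve suggests, with made-up series and an ordinary least-squares slope (not the actual GISS procedure):

```python
# Hypothetical check: over the overlap period, compare the linear
# trend of a dropped station with that of a kept neighbour. If the
# trends match, removal should not bias the anomaly record.
def trend(series):
    """Ordinary least-squares slope per time step."""
    n = len(series)
    mx = (n - 1) / 2                       # mean of 0..n-1
    my = sum(series) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(series))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

dropped = [-10.0 + 0.02 * y for y in range(30)]  # cold site, 0.02/yr
kept = [8.0 + 0.02 * y for y in range(30)]       # warm site, same trend

print(trend(dropped), trend(kept))   # both ~0.02
```

Real station records would of course be noisy, so the comparison would be between trend estimates and their uncertainties rather than exact equality.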
Post by steve on Jan 19, 2010 15:00:06 GMT
"I think what 'climategate' shows is that the denizens of academia may be very well 'educated' in their own way, but when it comes to some basic issues they are amateurish in the extreme: not keeping original data and an audit trail, not documenting what they are doing (apart from, by accident, the in-line comments from 'Harry'), backbiting, trying to fix the results of peer review, and expressing dislike of people purely for holding opposing views. All these are signs of people who have never worked in the world outside academia and who probably are unable to do so."

My experience in academia was that there was indeed very little support for maintaining processed data, but data that was expensive to obtain usually had a reasonably secure archive. Some of the allegations about deleting data relate to analysed data or to data that is still available from elsewhere. So to me, some of the issues are unsurprising but less discreditable than is being claimed.

Overstatement. The quality assurance is in the independent analyses that can be and are done. Failure to achieve ISO-9000 accreditation just means that you've not necessarily followed an agreed procedure; it doesn't mean the result is wrong. The software I work on did not always have an ISO-9000 standard, but it had a low level of defects because I followed good common sense. The entire world is *not* dependent on CRU output.
Post by hunterson on Jan 19, 2010 18:30:28 GMT
Steve, yes, if they were indeed using effective QA/QC procedures, they would not require ISO certification. But we know that they are not in fact practicing good QA/QC, so the lack of ISO is simply another way to confirm how poor the AGW promoters in fact are. CRU/GISS/IPCC have made the world hang on their every utterance. Now we know they are uttering gibberish.
Post by nautonnier on Jan 20, 2010 12:26:31 GMT
"My experience in academia was that there was indeed very little support for maintaining processed data, but data that was expensive to obtain usually had a reasonably secure archive. Some of the allegations about deleting data relate to analysed data or to data that is still available from elsewhere. So to me, some of the issues are unsurprising but less discreditable than is being claimed."

And then you find that the other group deleted the data because they thought that you were keeping a copy.

There should of course be a procedure that, at the very least, sets up a Memorandum of Understanding requiring the basic data supporting research costing millions to be given to you rather than destroyed. After all, we are not talking about an individual PhD student here; this is a large governmental research laboratory directly funded by the UK and US governments, with contracts running over many years.

"Overstatement. The quality assurance is in the independent analyses that can be and are done. Failure to achieve ISO-9000 accreditation just means that you've not necessarily followed an agreed procedure; it doesn't mean the result is wrong. The software I work on did not always have an ISO-9000 standard, but had a low level of defects because I followed good common sense."

It means that (among other things):
1. There may be no procedure defined for some or all of the activities of the unit.
2. There may be no training in how to work in accordance with, or how to apply, the procedures.
3. Organization members and staff may not be applying, or may be misapplying, the procedures.

As you say, this may not mean the result is wrong, but then perhaps one of the procedures would be to always follow a verification and validation process, to have a qualified statistician check any statistics before publication, and perhaps to set up procedures for ethical peer review. If those minor requirements had been in place, do you think Briffa's single tree would have been accepted as statistically significant? Unfortunately, this requires more than 'good common sense'.

One would also normally expect that the University of East Anglia would have some standard procedures in place, including routine auditing of the activities of CRU. The fact that they did not have these procedures, or did not follow them, is also a major problem, as it means that their current first-party audit is worthless.

I continue to be amazed that considerable UK and US government funding was provided to laboratories that obviously had no QA standards.