|
Post by glc on Aug 17, 2009 10:27:27 GMT
The idea from GLC that this kind of change will cause a stepwise change in temperatures seems a guess. More likely, in specific conditions you will get particular local heating influences that were not possible before. Given the fairly consistent prevailing windy conditions there, we can guess it won't be a constant influence of significance. But how does it affect clear-evening cold and temperature-inversion heat?
You are getting hung up on accurate and precise temperature measurement, whereas we're really only interested in obtaining a reasonable estimate of the warming (or cooling) over a given period. They are not the same thing. Anthony Watts has made the same mistake.
I'll use the 'lake' analogy again. If you want to know the change in water level in a lake over the past 10 years, say, you don't need an accurate measurement of the lake depth. You just need estimates with respect to some reference point. Provided you have enough readings, your errors are not likely to be significant.

Think of a room with a constant temperature of 10 deg. Using only a primitive thermometer with 1 deg increments, we could still probably provide a reasonable estimate of the warming if, say, the room warmed 0.5 deg in 20 years. The thermometer would measure 10 deg every day, so the average annual temperature would also be 10 deg. But what is the true temperature? If the thermometer can only record temperature to the nearest 1 degree, then the true temperature probably lies somewhere between 9.5 deg and 10.5 deg.
Let's assume that, on half the days in the year (182.5 days), the true temperature lies between 9.5 and 10.0 and on the other half of the year the true temperature lies between 10.0 and 10.5 deg.
So what happens if the room starts to warm? We have two cases: (1) it warms gradually over the 20 years; (2) it warms immediately.
If it warms gradually, then on some of the days on which the true temperature was above 10 deg and close to 10.5 deg, it will flip over the 10.5 threshold and be recorded as 11 deg. In the first year of warming we might get 20 days on which the thermometer recorded 11 deg; the rest will still say 10 deg. The annual average will now be ~10.05 deg. As warming continues, the following year may have 40 11-deg days, and so on. Eventually, after 20 years, all 182 days which were originally in the (10.0, 10.5) interval will be in the (10.5, 11.0) interval and will be recorded as 11 deg. The others will still be recorded as 10 deg. The average annual temperature will be ~10.5 deg, i.e. a 0.5 deg rise. So we will have calculated a reasonably accurate estimate of the rise in temperature despite the limitations of the available equipment.
If the room warms immediately - then the days in the [10,10.5] interval will immediately flip into the [10.5,11] interval and be recorded as 11 deg. The annual temperature will jump from 10 to 10.5 degrees. This will be evident in the data and show up in the LS trends.
This is a very crude example, but I hope it shows that a lot of the points made about precision and accuracy are not as relevant as one might think.
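For anyone who wants to check, the room example can be simulated in a few lines of Python. This is a minimal sketch, assuming (as above) that daily true temperatures are spread uniformly across the 1-deg band around the base temperature:

```python
import random

random.seed(42)

def annual_mean_recorded(true_base, warming):
    """Mean of 365 daily readings from a thermometer that can only
    record to the nearest whole degree (1-deg increments)."""
    readings = []
    for _ in range(365):
        # True daily temperature: spread uniformly across a 1-deg
        # band around the base, plus the warming accumulated so far.
        true_temp = true_base + random.uniform(-0.5, 0.5) + warming
        readings.append(round(true_temp))  # quantise to 1 deg
    return sum(readings) / len(readings)

before = annual_mean_recorded(10.0, 0.0)  # year 0: no warming
after = annual_mean_recorded(10.0, 0.5)   # year 20: +0.5 deg
print("recorded mean, year 0: ", round(before, 2))
print("recorded mean, year 20:", round(after, 2))
print("estimated warming:     ", round(after - before, 2))
```

With enough daily readings, the average of the quantised values recovers the 0.5 deg warming to within a few hundredths of a degree, even though no single reading is better than 1 deg.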
|
|
|
Post by northsphinx on Aug 17, 2009 12:34:43 GMT
So you are saying that since we nowadays measure with decimal accuracy, the heating trace in measurements is just a question of thresholds between degrees and decimals of degrees? A 0.5 degree heating MAY be a measurement threshold.
Good thought. But is that not something that argues against AGW?
|
|
|
Post by radiant on Aug 17, 2009 17:39:07 GMT
Using only a primitive thermometer with 1 deg increments we could still probably provide a reasonable estimate of the warming
1. Your lecture on thermometers is not connected to the text of mine you have quoted. But let's set that to one side and deal with your attempt to show how superior you are to me.
2. For the record, a primitive thermometer cannot be used to measure temperatures because, unless the degrees are etched into the glass of the stem, you cannot rely on the material carrying the marked degrees not to have shrunk, expanded with age, or moved out of position.
3. Even if a primitive thermometer can be found that measures true relative changes, it can be lost or broken, preventing the data series that follows the breakage from being connected to the data series prior to it.
4. Only a laboratory-standard thermometer can be used to measure weather temperatures to know if it is warmer or colder than some other point in time, so that if the thermometer is replaced the amount of error before and after is known and can be found via the referenced calibration chart that is part of the record.
5. In case it is not obvious to you, your claim of AGW requires the ability to measure temperatures very accurately, so that a huge number of different thermometers, read by a huge number of different people in a huge number of different, changing situations across the entire world, can be assembled together to get a meaningful answer.
6. If you only had to measure one room with one unchanging thermometer, things would be as simple as you have described. Instead, each different thermometer in each different room has to be linked to all of the other thermometers by unique events that happen on earth at certain temperatures, where the differences between the thermometers and those unique events are known about and charted during the calibration process. You need to learn why calibration is important rather than lecturing me on why you think it is unimportant.
As I said before, do you know anything about the scientific method? What is your scientific training, if any? I have a British degree in applied chemistry with special emphasis on instrumentation and analytical techniques.
|
|
|
Post by icefisher on Aug 17, 2009 18:52:10 GMT
[ UPDATE: It does appear that a record exists which shows the mean Armagh temperature in 1846 is 10.4 deg. Duh!!! I will look into this but there's either an error or a mismatch in instrumentation and/or method of recording. But rest assured the late 20th century warming is still evident (0.35 deg per decade) whichever data is used. So the warming is now unique to the late 20th century. OK. ]
Have you found anything regarding errors yet? As it sits, over 155 years the trend looks closer to 0.0035 deg per decade than 0.35 deg per decade.
Here is a conclusion from Butler: "Whilst there has been an exceptional number (five) of warm years over the past decade (1993–2002), when the mean annual temperature at Armagh has reached 10 °C or above, annual temperatures of this level have been reached before, e.g. in 1828, 1834, 1846, 1857, 1921, 1945, 1949 and 1959, so we are not yet beyond the range of normal variability."
Huh! As of 2002, "we are not yet beyond the range of normal variability"!!! Your 6 warmest years in the Armagh record are apparently shared with 8 others extending back to 1828!!!
|
|
|
Post by radiant on Aug 17, 2009 19:26:40 GMT
[ UPDATE: It does appear that a record exists which shows the mean Armagh temperature in 1846 is 10.4 deg. Duh!!! I will look into this but there's either an error or a mismatch in instrumentation and/or method of recording. But rest assured the late 20th century warming is still evident (0.35 deg per decade) whichever data is used. So the warming is now unique to the late 20th century. OK. Have you found anything regarding errors yet? As it sits, over 155 years the trend looks closer to 0.0035 deg per decade than 0.35 deg per decade. Here is a conclusion from Butler: "Whilst there has been an exceptional number (five) of warm years over the past decade (1993–2002), when the mean annual temperature at Armagh has reached 10 °C or above, annual temperatures of this level have been reached before, e.g. in 1828, 1834, 1846, 1857, 1921, 1945, 1949 and 1959, so we are not yet beyond the range of normal variability." Huh! As of 2002, "we are not yet beyond the range of normal variability"!!! Your 6 warmest years in the Armagh record are apparently shared with 8 others extending back to 1828!!! ]
Yes, but don't forget the thermometers were in a box on the north side of the east tower for a few decades. So we don't know whether that means it is now massively cooler (because the unheated box was not in the sun all day long like the Stevenson screens) or massively hotter (because the occupant of the room next to the thermometers liked a heated room in summer and winter).
|
|
|
Post by glc on Aug 17, 2009 20:23:31 GMT
UPDATE: It does appear that a record exists which shows the mean Armagh temperature in 1846 is 10.4 deg. Duh!!!
Yes - but it's not the reconstruction produced by Butler and it's not the data maintained by the Armagh Observatory.
Today at 2:33am, glc wrote:
I will look into this but there's either an error or a mismatch in instrumentation and/or method of recording. But rest assured the late 20th century warming is still evident (0.35 deg per decade) whichever data is used. So the warming is now unique to the late 20th century. OK.
Pretty much so - yes.
Have you found anything regarding errors yet?
Not yet.
As it sits, over 155 years the trend looks closer to 0.0035 deg per decade than 0.35 deg per decade.
The 30 year (1973-2002) trend is 0.35 deg per decade.
Here is a conclusion from Butler: "Whilst there has been an exceptional number (five) of warm years over the past decade (1993–2002), when the mean annual temperature at Armagh has reached 10 °C or above, annual temperatures of this level have been reached before, e.g. in 1828, 1834, 1846, 1857, 1921, 1945, 1949 and 1959, so we are not yet beyond the range of normal variability."
Huh! As of 2002, "we are not yet beyond the range of normal variability"!!!
"Not yet" - well not in 2002, but clearly there had been an unprecedented number of warm (10+ deg) years between 1993 and 2002. But there's more ....
Your 6 warmest years in the Armagh record are apparently shared with 8 others extending back to 1828!!!
No - the six warmest years include at least 4 years between 2003 and 2007. For obvious reasons, these are not included in Butler's summary. This means there have now been TEN warm (10+ deg) years since 1995 - SIX of which are the warmest on record. This compares with just 8 'warm' years in the 200 years before 1995.
Butler did not know this at the time, of course. The UK, as a whole, has had some pretty warm years recently. 2006 was completely off the scale and, believe me, it had nothing to do with urban heat and poor station siting. July 2006 was the hottest month any of us here have ever experienced. This was followed by September 2006 which was also unusually warm throughout.
|
|
|
Post by glc on Aug 17, 2009 20:29:18 GMT
Yes, but don't forget the thermometers were in a box on the north side of the east tower for a few decades. So we don't know whether that means it is now massively cooler (because the unheated box was not in the sun all day long like the Stevenson screens) or massively hotter (because the occupant of the room next to the thermometers liked a heated room in summer and winter).
Take it up with Armagh. Believe me it won't be "massively cooler" or "massively warmer".
|
|
|
Post by radiant on Aug 17, 2009 20:43:02 GMT
|
|
|
Post by glc on Aug 17, 2009 21:47:30 GMT
Today at 3:29pm, glc wrote:
Believe me
OK - don't. But do what I suggested and put your concerns to the Armagh Observatory staff. It's quite possible that they haven't considered the possible effects of heated buildings and the sun. The UK over the centuries has had some of the most conscientious and meticulous scientists - scientists and engineers like Newton and Brunel, and explorers who charted distant lands and oceans with incredible accuracy. But clearly the observers at Armagh, both past and present, cannot be trusted to provide a reasonably accurate temperature record.
I've given a number of examples and studies which support the reliability of the Armagh record and which confirm that the record has been largely unaffected by siting factors. This clearly isn't good enough for you, so you need to approach Armagh directly.
|
|
|
Post by radiant on Aug 18, 2009 6:03:03 GMT
Clearly, the observers at Armagh, both past and present, cannot be trusted to provide a reasonably accurate temperature record.
For the purposes of climate change I agree with you, because the temperature changes are so minute, and the weather so hugely variable decade to decade, that consistency and accuracy are required as staff and thermometers come and go. The only constant we have is, for example, regular spacings on a scale between the freezing point of pure water at one atmosphere and the boiling point of water at one atmosphere, which we attempt to determine as best we can.
For security reasons, after repeated theft, Armagh enclosed their galvanised steel instrument tower and Stevenson screen inside a galvanised steel fence about 6 feet high in 1999. A large metallic observatory has been placed nearby recently. A large area of stonework has been built nearby recently. An Astropark has been built nearby recently.
So it is clear that the records at Armagh are not suited for the purposes of long-term climate change measurements. There is nothing more to be said on this record because, given the tiny changes we want to measure, their data is hopelessly compromised.
|
|
|
Post by glc on Aug 18, 2009 9:10:11 GMT
So it is clear that the records at Armagh are not suited for the purposes of long term climate change measurements.
But I've provided you with studies and expert opinion which says exactly the opposite. Try reading the papers cited by Steve Milloy.
there is nothing more to be said on this record because it is clear that given the tiny changes we are wanting to measure their data is hopelessly compromised.
We don't want to measure tiny changes. We are looking for a trend. There is a sustained trend over the past 30-40 years which is not consistent with local environment changes. The data at Armagh is supported by other nearby records and by satellite data. Are all sources compromised by the same factors? Have new buildings been built adjacent to the orbiting satellites?
The problem is that your understanding of trend and error analysis is flawed.
|
|
|
Post by nonentropic on Aug 18, 2009 9:37:51 GMT
A relatively consistent bias is possible, as virtually all regions are being developed rather than disestablished. But your point is totally valid: if there was bias in sites from urban inputs, you should be able to discount it, and I would be surprised if this has not been done, and done well.
|
|
|
Post by radiant on Aug 18, 2009 9:59:08 GMT
We don't want to measure tiny changes. We are looking for a trend. There is a sustained trend over the past 30-40 years which is not consistent with local environment changes. The problem is that your understanding of trend and error analysis is flawed.
Accuracy is not important! Surely the motto of climatology in the current era. I think the problem is that I am old enough to know that 40 years is not a very long time. Any fool can cherry-pick some data, create a trend to support some harebrained theory, and give it meaning. Meanwhile, in 40 years' time your data has to be sufficiently accurate that you know that 40 years ago it was different. Evidently you and I are operating in different time and space dimensions scientifically, or you are just a politician hoping to win votes.
|
|
|
Post by glc on Aug 18, 2009 11:36:34 GMT
You don't want to measure tiny changes?
It's not a case of not wanting to measure them; it's a case of knowing and managing them. The micro-climate temperature effects are well understood. The Armagh Observatory will be well aware of the implications of measurement location.
The point is there will always be measurement error. The important thing is not necessarily to remove the error but to recognise it and to understand how sensitive it is with respect to your overall result.
Why are you looking at a tiny time period of 30 years and giving it meaning?
Three reasons:
1. A 30 year period is, in a statistical sense, long enough to smooth out most (but not all) of the natural fluctuations. Note that you can pick out a number of short term periods in the last 30 years which show a negative trend because of short term fluctuations, but when you get to 20 or 30 years the trends become less sensitive (that word again) to short term blips.
2. The satellite era began ~30 years ago, so we have more independent sources. The data has also been more closely analysed over the past 30 years.
3. It's only in the past 30-50 years that any warming effects from increased CO2 would have been detectable. Before the 1960s the change in CO2 concentration from the pre-industrial level was negligible, i.e. ~30 ppm or ~0.5 W/m2.
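Point 1 can be illustrated with a small sketch: an invented 40-year series with a 0.02 deg/yr underlying trend plus a regular 8-year oscillation standing in for natural variability (the numbers are assumptions for the sake of the example). Short 5-year windows give erratic, even negative, least-squares slopes; the full 40-year slope recovers the trend:

```python
import math

def ls_slope(ys):
    """Ordinary least-squares slope of ys against 0, 1, ..., n-1."""
    n = len(ys)
    xbar = (n - 1) / 2
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(ys))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# 40 years of annual means: 0.02 deg/yr trend plus a 0.3 deg
# oscillation with an 8-year period (a stand-in for short-term
# natural fluctuations).
series = [0.02 * t + 0.3 * math.cos(math.pi * t / 4) for t in range(40)]

# Slope over every 5-year window vs the slope over the full 40 years
five_year = [ls_slope(series[i:i + 5]) for i in range(36)]
print("5-yr slopes span:", round(min(five_year), 3),
      "to", round(max(five_year), 3))
print("40-yr slope:", round(ls_slope(series), 4))
```

The 5-year slopes swing between roughly -0.14 and +0.18 deg/yr depending on where the window falls in the oscillation, while the 40-year slope comes out close to the true 0.02 deg/yr: longer windows are much less sensitive to short-term blips.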
How do you know if it is warmer now than 100 years or 50 years or 150 years ago? Why is accuracy so unimportant to you?
We are not just looking at a single source of data, e.g. Armagh. There are lots of other records which extend back more than 150 years, e.g. CET, Uppsala, Stockholm, DeBilt etc. There are lots more which extend back more than 100 years. Most of them tell pretty much the same story. We also have ocean temperature data. We also have proxy data - such as borehole temperatures and ice core data.
I didn't mean that accuracy was unimportant. I meant that, provided we understand what is happening, it is not as important as we might think. E.g. take a station where the readings are always 2 deg too high. Does that matter? It depends what you want. If you're just looking for a trend (i.e. whether it's warming or cooling), it doesn't matter in the slightest. The trend will be totally unaffected.
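A quick sketch of why a fixed bias is harmless to the trend (the station record here is invented purely for illustration): shifting every reading up by a constant leaves the least-squares slope exactly unchanged.

```python
def ls_slope(ys):
    """Ordinary least-squares slope of ys against 0, 1, ..., n-1."""
    n = len(ys)
    xbar = (n - 1) / 2
    # sum(x - xbar) is zero, so a constant offset in ys cancels out
    num = sum((x - xbar) * y for x, y in enumerate(ys))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# Invented annual means, for illustration only
true_record = [9.8, 10.0, 9.9, 10.1, 10.2, 10.1, 10.3, 10.4]
biased_record = [t + 2.0 for t in true_record]  # always reads 2 deg high

print("true trend:  ", round(ls_slope(true_record), 4))
print("biased trend:", round(ls_slope(biased_record), 4))
```

The two printed slopes are identical: the offset shifts the level of the record, not its rate of change.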
Why are you discounting tiny changes to say that the earth has warmed by the tiny amount of 0.6C in 100 years?
That the earth has warmed by ~0.6 deg over 100 years is a highly significant result. It is not the same as saying Armagh or London has warmed 0.6 deg since last year. I assume you are familiar with the concept of sample size in statistics. If not, let me try to illustrate what I mean. Let's say we suddenly found out that the whole of the US was 1 deg cooler in 2009 than we previously thought, but that the temperatures in 1900 were correct. Let's also say this cooling was at a constant rate of ~0.1 deg per decade. How much effect do you think this would have on the overall 0.6 deg trend? The answer is barely anything. The 20th century warming would still be about 0.6 deg.
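The arithmetic behind "barely anything" can be checked with an area-weighted average. The 2% area share used here for the contiguous US is an assumed round figure for the sake of the sketch, not from the post:

```python
area_fraction = 0.02        # assumed area share of the region
global_trend = 0.6          # deg of warming over the century
regional_correction = -1.0  # region warmed 1 deg less than thought

# Area-weighted global mean change: 98% of the globe unchanged,
# 2% revised downward by 1 deg
corrected = ((1 - area_fraction) * global_trend
             + area_fraction * (global_trend + regional_correction))
print(round(corrected, 3))  # ~0.58 deg, barely different from 0.6
```

Even a full 1 deg correction over a region that size moves the global century trend by only ~0.02 deg, which is why a single country's (or station's) revision barely dents the overall result.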
Why is 30 years so important to you?
Already dealt with.
How on earth can you give meaning to 30 years when talking about climate change while telling me accuracy is not important?
Dealt with above.
Do you have any scientific training?
Yes - but this is irrelevant. If someone on this blog makes a valid point which debunks my opinion I don't play the qualifications card.
Why do you refer me to another person's opinion, with which you agree, in order to inform me you don't agree with me, as if people who agree with your opinion are able to know the absolute truth?
I refer you to others who may be able to offer a more convincing explanation than I can give.
Does the concept of independence have any meaning for you?
I don't know what you mean.
|
|
|
Post by glc on Aug 18, 2009 11:43:44 GMT
A relatively consistent bias is possible, as virtually all regions are being developed rather than disestablished. But your point is totally valid: if there was bias in sites from urban inputs, you should be able to discount it, and I would be surprised if this has not been done, and done well.
nonentropic
I assume this is targeted at me. You have summed it up more clearly than me, thank you.
Radiant
Please take note.
|
|