|
Post by jimg on Dec 23, 2008 17:53:20 GMT
Hi Ron.
Look at it this way.
When recording temperature you record it to two significant figures, with some readings recorded to three, i.e. 103F.
You then average the data, and you have to round the result to two significant figures - the precision of the least precise data.
I got dinged in high school science for not using significant figures correctly. The reason? You can't gain precision from imprecise data.
Assuming that the data "averages out" is bad science. You can only act upon what you know, not what you don't.
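A minimal Python sketch of jimg's rule (the readings here are hypothetical): average at full arithmetic precision, then round the result back to the precision of the least precise input.

```python
# Hypothetical whole-degree F readings - two significant figures each,
# per jimg's description of how temperatures are recorded.
readings = [98, 101, 103, 99, 100]

avg = sum(readings) / len(readings)   # 100.2 - raw arithmetic result
reported = round(avg)                 # 100   - rounded back to whole degrees

print(f"raw average: {avg}, reported: {reported}")
```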
|
|
|
Post by socold on Dec 24, 2008 2:57:07 GMT
An average can have more precision than a single data point.
For example the average number of children per family in a country might be 2.3, but I hope any specific family has a whole number of children.
|
|
|
Post by jimg on Dec 24, 2008 3:16:05 GMT
And hence .3 of a child is a meaningless number. So in actuality, the average family has 2 children. Some will have more, some less. But none will have less than a whole child.
More precision was read into the data than was there to begin with.
|
|
|
Post by socold on Dec 24, 2008 4:00:30 GMT
2.3 indicates that on average families have slightly more than 2 children and slightly fewer than 3. You have to use a fractional amount in the average to express that.
YouTube allows you to rate videos out of 5.
Next to each video is the average rating given by all voters. Some videos have a rating of 4.5. That indicates they are more popular than the 4s and less popular than the 5s.
If YouTube simply rounded the average to the nearest whole number you would lose that information.
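A small sketch of socold's point, with made-up ratings: the fractional part of the mean carries real information that rounding to a whole number destroys.

```python
# Two hypothetical videos, each rated out of 5 by eight voters.
video_a = [5, 4, 5, 4, 4, 4, 5, 4]   # mean 4.375
video_b = [4, 4, 4, 4, 4, 4, 4, 4]   # mean 4.0

mean_a = sum(video_a) / len(video_a)
mean_b = sum(video_b) / len(video_b)

print(mean_a, mean_b)                # 4.375 vs 4.0 - a is more popular
print(round(mean_a), round(mean_b))  # 4 vs 4 - rounding erases the gap
```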
|
|
|
Post by Acolyte on Dec 24, 2008 4:25:51 GMT
But while the average for children may be 2.3, the government doesn't pay out child allowance at a rate of 2.3 x allowance per family - it pays out according to the exact numbers.
What is being done with temps would, on YouTube, be the equivalent of saying all videos rate at a global average of 2.8 (or whatever) - which becomes, like the average wind speed across all bridges, a meaningless datum.
Sometimes the actual figures matter & the averages tell you sweet FA. Averaging errors, and then averaging those errors across unrelated locales, is worse than nothing - it's misleading. Given the effort put into defending the procedure, one has to suspect it is deliberately misleading.
|
|
|
Post by socold on Dec 24, 2008 5:01:43 GMT
You are confusing utility with meaning. The statistical concept of an average has meaning. Whether someone uses it in any case is another matter.
|
|
|
Post by jimg on Dec 24, 2008 6:14:38 GMT
socold. This is the exact reason so many people are frustrated with the AGW community.
You are failing to use the common practice of significant digits! As such, you are creating data where there is none.
In the case of children, since each family's count is a single digit, and that is the smallest number of significant digits in the data, that is what you round your result to: one single digit. As such, the average family has two kids. Please show me the families that have a third of one (and pregnant women don't count).
In the case of the temperature anomaly, the data is being expressed to four significant digits. (I don't recall the exact average for the "historical period," so I will use 52.11.)
You subtract 52.11 from a given year's value, which might be 52.33, and you end up with an anomaly of +0.22 (the leading 0 being a placeholder, not a significant digit).
However, if the recorded data had only two significant digits, i.e. 53 or 52, then you have created precision where there was none.
The margin of error would be +/- .5
You fudged the books.
This is middle school science.
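jimg's arithmetic as a sketch (his numbers; the worst-case error treatment is the one he and kiwistonewall argue for below, and ron disputes):

```python
baseline = 52.11            # jimg's stand-in for the historical-period mean
year     = 52.33            # a given year's mean
anomaly  = year - baseline
print(f"anomaly: {anomaly:+.2f}")          # +0.22

# If the underlying data were whole degrees, each term carries +/-0.5,
# and in the worst case the errors of the two terms add:
err = 0.5
print(f"worst-case band: {anomaly:+.2f} +/- {err + err}")   # +0.22 +/- 1.0
```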
|
|
|
Post by kiwistonewall on Dec 24, 2008 6:49:34 GMT
Leaping to socold's defense: when he says "You are confusing utility with meaning. The statistical concept of an average has meaning," he is being perfectly correct. There is a difference between real numbers and integers! Since integer values are 100% precise, they can be treated statistically. But real-number measurements are a whole different ball game.
The issue here is that the "anomaly" has a serious loss of significance, since we are subtracting two nearly equal numbers. The value produced is of the same magnitude as the errors, i.e. it has no significance. Since it is (in effect) a collection of errors, it can be cherry-picked in any way the collector of the information wants. Now if temperatures could be (or had been, in the past!) measured to +/- 0.01C instead of +/- 0.5 or 0.1, then an anomaly of +/- 0.5 might mean something. No amount of averaging at this level of (in)significance can produce valid data. This is more elegantly described here: www.bautforum.com/869198-post171.html
Numerical Methods is tough (it's the only paper I ever got a C in ;D). But one thing I did learn: never subtract two nearly equal numbers. Do the math differently, and NEVER DESIGN A SYSTEM WHERE THE SUBTRACTION OF TWO NEARLY EQUAL NUMBERS IS CRITICAL! (unless you are trying to confuse & mislead)
Which is why I don't bother with all that stuff - the ice & sunspots are more real & meaningful. Wake me up if we have a -5C anomaly - that WILL be meaningful.
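A minimal sketch (hypothetical numbers) of the loss-of-significance problem kiwistonewall describes: subtract two nearly equal measured values and the relative error of the result explodes, even though each input is quite precise.

```python
# "True" values differ by 0.05; each is measured to +/-0.1.
a_true, b_true = 14.73, 14.68
err = 0.1

a_meas, b_meas = a_true + err, b_true - err   # worst-case error pattern
diff = a_meas - b_meas

rel_err_input = err / a_true                  # ~0.7% on each input
rel_err_diff = abs(diff - (a_true - b_true)) / abs(a_true - b_true)

print(f"input relative error : {rel_err_input:.1%}")   # ~0.7%
print(f"measured difference  : {diff:.2f} vs true 0.05")
print(f"difference rel. error: {rel_err_diff:.0%}")    # 400%
```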
|
|
|
Post by dopeydog on Dec 24, 2008 13:44:16 GMT
The destruction of science continues: "It is now mainstream in the campaign to suppress speech to call for criminalization of skepticism (that is, of science) and imprisonment of its practitioners. British Foreign Secretary Margaret Beckett publicly demands that media outlets refuse to grant “skeptics” space, on the grounds that they are just like Islamic terrorists." www.humanevents.com/article.php?id=30028
|
|
|
Post by jimg on Dec 24, 2008 15:47:30 GMT
Wow dopey, that's rather disturbing. The comments on Australian citizenship reminded me of a letter between Governor Pliny and Emperor Trajan circa 110 AD.
"Meanwhile, in the case of those who were denounced to me as Christians, I have observed the following procedure: I interrogated these as to whether they were Christians; those who confessed I interrogated a second and a third time, threatening them with punishment; those who persisted I ordered executed. For I had no doubt that, whatever the nature of their creed, stubbornness and inflexible obstinacy surely deserve to be punished. There were others possessed of the same folly; but because they were Roman citizens, I signed an order for them to be transferred to Rome."
Source (also includes the Emperor's response): www.fordham.edu/halsall/source/pliny1.html
"History never repeats itself, but it often rhymes." - Mark Twain
|
|
|
Post by ron on Dec 24, 2008 19:58:44 GMT
.5 is NOOOOOOT the margin of error. Please. Trust me. It becomes statistically IMPOSSIBLE for the error to be .5 over a significant number of samples. While you cannot apply the average to a particular location, you can VERRRY ACCURATELY measure the whole, with GRRREAT confidence, using instruments that don't have the granularity you want. It's all about probability.
After measuring 1,000 locations and averaging their temps, the probability of an error of .5 due to rounding is... impossible. It would require every single one of the 1,000 measurements to have fallen at the exact spot that creates the full .5 rounding error, all in the same direction. Your positions are indefensible, and quite frankly you're coming off looking silly.
Unless you think that temperatures are not analog in nature and change only in 1-degree increments, you can say with 95% confidence that the measured average is within something like .005 of the actual. But I'd have to dig out my statistics books from 3 decades ago to get the exact formulas. Ya gotta trust that out of a couple hundred scientists, some of them know a little math here and there.
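A quick Monte Carlo sketch of ron's claim (my setup, not his): if station rounding errors are independent, the average of 1,000 whole-degree readings sits far closer to the true mean than +/-0.5.

```python
import random

random.seed(1)
trials, n_stations = 2000, 1000
worst_gap = 0.0
for _ in range(trials):
    true = [random.uniform(-10, 35) for _ in range(n_stations)]
    read = [round(t) for t in true]          # whole-degree recording
    gap = abs(sum(read) / n_stations - sum(true) / n_stations)
    worst_gap = max(worst_gap, gap)

print(f"largest error of the 1,000-station mean over {trials} trials: "
      f"{worst_gap:.3f}")                    # ~0.03 - nowhere near 0.5
```

The independence of the rounding errors is the load-bearing assumption here; systematic calibration or siting errors would not shrink with averaging.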
|
|
|
Post by kiwistonewall on Dec 24, 2008 20:49:33 GMT
Who are you talking to Ron?
If me, then you are wrong when you have an anomaly at the same order of magnitude as the error.
(10.5 +/- .5) - (10.2 +/- .5) = an anomaly of +0.3 +/- 1!
Remember that we track our anomalies back in time - so we are always restricted by the error in the past, no matter how good our modern instruments are. We have calibration errors, reading errors and limits on instrument precision.
It isn't the temperatures, it's the anomaly created by averaging errors. There is only noise. You can find patterns in noise. You can process noise. But at the end of the day what we get from the AGW record is mainly noise - double meaning intended.
No physical scientist would give these temperature anomalies much credence, yet the whole of climate science is based on anomalies.
When (and only when) the anomaly is an order of magnitude greater than the measurement error will we have useful information.
The political need to convince people that there is some sort of human-caused warming has introduced politics and the maths of "social science" into the picture. Those of us with hard-science or solid numerical-methods training (or both) find this stuff hard to swallow as meaningful.
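A short sketch contrasting the two error models the thread is arguing over, using kiwistonewall's numbers: in the worst case the +/-0.5 errors of the two terms add; if instead the errors are independent and random (ron's assumption), they combine in quadrature.

```python
err = 0.5
anomaly = 10.5 - 10.2                     # +0.3

worst_case = err + err                    # 1.0  - errors conspire
independent = (err**2 + err**2) ** 0.5    # ~0.71 - uncorrelated random errors

print(f"anomaly {anomaly:+.1f}: worst case +/-{worst_case}, "
      f"independent +/-{independent:.2f}")
# Either way the band swamps a +0.3 anomaly for one pair of readings;
# the dispute is over whether averaging many pairs shrinks the band.
```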
|
|
|
Post by Acolyte on Dec 24, 2008 23:19:55 GMT
Try this...
In a month of recorded temps, suppose the observer can read the gauge to +/- 0.5º - which seems fair: there's a line on the gauge for each degree, & the observer can estimate whether the temp is closer to the line above or the line below. The 'Some' column is the case where most readings are accurate but some aren't (6 of the 30 are off by +0.5).

Temp       .5 Up      .5 Down    Some
10         10.5       9.5        10.5
11         11.5       10.5       11
10         10.5       9.5        10
12         12.5       11.5       12
15         15.5       14.5       15
12         12.5       11.5       12.5
12         12.5       11.5       12
16         16.5       15.5       16
15         15.5       14.5       15
19         19.5       18.5       19.5
25         25.5       24.5       25
10         10.5       9.5        10
11         11.5       10.5       11.5
27         27.5       26.5       27
25         25.5       24.5       25
25         25.5       24.5       25
20         20.5       19.5       20
21         21.5       20.5       21
15         15.5       14.5       15
24         24.5       23.5       24
24         24.5       23.5       24
35         35.5       34.5       35.5
21         21.5       20.5       21
18         18.5       17.5       18
18         18.5       17.5       18
21         21.5       20.5       21
25         25.5       24.5       25.5
30         30.5       29.5       30
30         30.5       29.5       30
11         11.5       10.5       11
---------  ---------  ---------  ---------
18.93333   19.43333   18.43333   19.03333   (average)
19         19         18         19         (rounded)
So to have any real idea of what the record is telling us, we need to know which readings are off & by how much - the +/-0.5º error bars allow our final average to be out by the same amount as the original inputs.
So is the temp for the month 18º or 19º? For a lot of purposes it hardly matters, but when there are billions of dollars & people's lives riding on a result measured in 1/100ths of 1º, we really REALLY need an accurate survey of each & every instrument being used.
And note, this whole scenario gets WAAAAY more complex if there's even the slightest chance the instruments actually vary with temperature.
And the level of complexity (& actual error) goes way higher if the instruments are also non-uniform: i.e. this one is surrounded by tin, while that one has latex paint, & this one is in the jet-wash of LAX while that one is shaded by a new building... etc.
Now, let's take these uncertainties, throw in a dash of personal-preference manipulation, then make a few radical assumptions so we can handle the coding & computational requirements, then tell everyone that because the temp has gone up .005 this month it is evidence of AGW.
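A sketch reproducing the table above (Acolyte's numbers): 30 whole-degree readings, the everyone-high and everyone-low extremes, and the 'Some' column where 6 of the 30 readings are off by +0.5.

```python
temps = [10, 11, 10, 12, 15, 12, 12, 16, 15, 19, 25, 10, 11, 27, 25,
         25, 20, 21, 15, 24, 24, 35, 21, 18, 18, 21, 25, 30, 30, 11]
off_by_half = {0, 5, 9, 12, 21, 26}   # indices of the 6 'Some' readings

up   = [t + 0.5 for t in temps]       # every observer rounded high
down = [t - 0.5 for t in temps]       # every observer rounded low
some = [t + 0.5 if i in off_by_half else t for i, t in enumerate(temps)]

for name, col in [("Temp", temps), (".5 Up", up),
                  (".5 Down", down), ("Some", some)]:
    print(f"{name:>8}: mean = {sum(col) / len(col):.5f}")
# Temp 18.93333, Up 19.43333, Down 18.43333, Some 19.03333 - the monthly
# mean can land anywhere in a +/-0.5 band depending on how readings err.
```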
|
|
|
Post by jimg on Dec 25, 2008 0:14:14 GMT
Ron,
all I know is that if I had an instrument that could be read to two significant digits, and I gave an answer that was carried out to four, I would get dinged for my answer.
Why? Because I introduced greater precision than was present in the instrument. From what I have seen of historical records so far, they were reported to the nearest whole degree, not to fractions of a degree.
Have you not experienced that?
|
|
|
Post by socold on Dec 26, 2008 21:01:40 GMT
Let's do some examples with random data. I'll generate 3 random temperatures for 3 different locations and compute the actual average. Then I'll round the location measurements and average those.
t1 = 29.524773852678376C
t2 = 22.805751434949112C
t3 = 21.353941760656397C
actual average = 24.56148901609463C
sensor1 = 30C
sensor2 = 23C
sensor3 = 21C
sensor average = 24.67C
The sensor average is within 0.11C of the actual average.
If we kept repeating this process with different random numbers, the average of three rounded sensors would be within 0.2C of the actual average about 75% of the time. In contrast, a single sensor is within 0.2C of the actual temperature only 40% of the time.
With 100 sensors, the average of all sensor readings will be within 0.1C of the actual average 99.9% of the time. So the more sensors you use, the more fractional digits of the average are significant.
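The repeated experiment as a sketch (the simulation setup is mine; the percentages are the ones claimed above):

```python
import random

random.seed(0)
TRIALS = 20_000

def hit_rate(n_sensors, tol):
    """Fraction of trials where the mean of n rounded sensor readings
    lands within tol of the mean of the true temperatures."""
    hits = 0
    for _ in range(TRIALS):
        true = [random.uniform(15, 35) for _ in range(n_sensors)]
        sensed = [round(t) for t in true]     # whole-degree sensors
        if abs(sum(sensed) / n_sensors - sum(true) / n_sensors) <= tol:
            hits += 1
    return hits / TRIALS

print(f"1 sensor    within 0.2C: {hit_rate(1, 0.2):.0%}")    # ~40%
print(f"3 sensors   within 0.2C: {hit_rate(3, 0.2):.0%}")    # ~76%
print(f"100 sensors within 0.1C: {hit_rate(100, 0.1):.1%}")  # ~99.9%
```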
|
|