|
Chaos
Sept 30, 2019 8:44:57 GMT
via mobile
Post by acidohm on Sept 30, 2019 8:44:57 GMT
The Butterfly Effect was the reason. For small pieces of weather—and to a global forecaster, small can mean thunderstorms and blizzards—any prediction deteriorates rapidly. Errors and uncertainties multiply, cascading upward through a chain of turbulent features, from dust devils and squalls up to continent-size eddies that only satellites can see. The modern weather models work with a grid of points on the order of sixty miles apart, and even so, some starting data has to be guessed, since ground stations and satellites cannot see everywhere. But suppose the earth could be covered with sensors spaced one foot apart, rising at one-foot intervals all the way to the top of the atmosphere. Suppose every sensor gives perfectly accurate readings of temperature, pressure, humidity, and any other quantity a meteorologist would want. Precisely at noon an infinitely powerful computer takes all the data and calculates what will happen at each point at 12:01, then 12:02, then 12:03… The computer will still be unable to predict whether Princeton, New Jersey, will have sun or rain on a day one month away. At noon the spaces between the sensors will hide fluctuations that the computer will not know about, tiny deviations from the average. By 12:01, those fluctuations will already have created small errors one foot away. Soon the errors will have multiplied to the ten-foot scale, and so on up to the size of the globe.
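A rough back-of-envelope illustration of that cascade (the "errors double in scale at each step" picture is a simplifying assumption for the sketch, not a figure from the passage):

    # How many doublings does it take for an error at the one-foot scale to
    # reach the size of the globe, if it roughly doubles in extent each step?
    # (The doubling picture is an illustrative assumption, not a measured rate.)
    import math

    error_scale_m = 0.3      # roughly one foot
    globe_scale_m = 4.0e7    # roughly the Earth's circumference

    doublings = math.log2(globe_scale_m / error_scale_m)
    print(f"doublings from one foot to planetary scale: {doublings:.0f}")  # ~27

Twenty-seven or so doublings is not many; if each doubling took, say, a day or two (a purely hypothetical rate), the fluctuations hidden between the sensors would reach planetary scale within weeks, which is the intuition behind the passage's one-month horizon.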
|
|
|
Chaos
Sept 30, 2019 8:51:26 GMT
via mobile
Post by acidohm on Sept 30, 2019 8:51:26 GMT
This one is for Naut!
THIS NEW RUN should have exactly duplicated the old. Lorenz had copied the numbers into the machine himself. The program had not changed. Yet as he stared at the new printout, Lorenz saw his weather diverging so rapidly from the pattern of the last run that, within just a few months, all resemblance had disappeared. He looked at one set of numbers, then back at the other. He might as well have chosen two random weathers out of a hat. His first thought was that another vacuum tube had gone bad. Suddenly he realized the truth. There had been no malfunction. The problem lay in the numbers he had typed. In the computer’s memory, six decimal places were stored: .506127. On the printout, to save space, just three appeared: .506. Lorenz had entered the shorter, rounded-off numbers, assuming that the difference—one part in a thousand—was inconsequential. It was a reasonable assumption. If a weather satellite can read ocean-surface temperature to within one part in a thousand, its operators consider themselves lucky. Lorenz’s Royal McBee was implementing the classical program. It used a purely deterministic system of equations. Given a particular starting point, the weather would unfold exactly the same way each time. Given a slightly different starting point, the weather should unfold in a slightly different way. A small numerical error was like a small puff of wind—surely the small puffs faded or canceled each other out before they could change important, large-scale features of the weather. Yet in Lorenz’s particular system of equations, small errors proved catastrophic.
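For anyone who wants to see this on a modern machine, here is a minimal sketch of the rounding experiment, using the later three-variable Lorenz '63 system with its standard parameters as a stand-in for the twelve-variable model described above (an assumption purely for illustration); one run starts from the six-decimal value, the other from the rounded printout value:

    # Two runs of the same deterministic system, differing only by the rounding
    # described above (.506127 vs .506 in the first coordinate).
    import numpy as np

    def step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return state + dt * np.array([sigma * (y - x),
                                      x * (rho - z) - y,
                                      x * y - beta * z])

    a = np.array([0.506127, 1.0, 1.0])   # the six-decimal value in memory
    b = np.array([0.506,    1.0, 1.0])   # the rounded value from the printout

    for n in range(1, 3001):
        a, b = step(a), step(b)
        if n % 500 == 0:
            print(f"step {n:4d}  separation = {np.linalg.norm(a - b):10.6f}")
    # The runs track each other at first, then the separation grows to the full
    # size of the attractor: all resemblance between the two 'weathers' is gone.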
|
|
|
Chaos
Sept 30, 2019 8:54:42 GMT
via mobile
Post by acidohm on Sept 30, 2019 8:54:42 GMT
With his primitive computer, Lorenz had boiled weather down to the barest skeleton. Yet, line by line, the winds and temperatures in Lorenz’s printouts seemed to behave in a recognizable earthly way. They matched his cherished intuition about the weather, his sense that it repeated itself, displaying familiar patterns over time, pressure rising and falling, the airstream swinging north and south. He discovered that when a line went from high to low without a bump, a double bump would come next, and he said, “That’s the kind of rule a forecaster could use.” But the repetitions were never quite exact. There was pattern, with disturbances. An orderly disorder.
|
|
|
Chaos
Sept 30, 2019 9:03:46 GMT
via mobile
Post by acidohm on Sept 30, 2019 9:03:46 GMT
That's it for now 😁
I'm sure y'all understand the science of chaos and much/all of this isn't new.
However, I feel highlighting all of the above is worthwhile, as everything we discuss is entirely laced with it.
Chaos shouldn't be in the back of one's mind when looking at the atmosphere, oceans or sun, but at the front.
Observed periodicity, while comforting, is always going to diverge at some point.
Applying linear assumptions to a non-linear system is essentially impossible.
There is not a single chance we can ever calculate cloud cover for our planet in advance.
|
|
|
Chaos
Sept 30, 2019 9:44:22 GMT
Post by nautonnier on Sept 30, 2019 9:44:22 GMT
Just a random thought. Lorenz first saw his chaotic system when the start parameters of a model were entered in fixed point, so not quite the same as the floating-point values they were based on. Even with 'long float' values, the precision of the number in the computer will not be the precision in nature, which has no limit to the minuteness at the tail, out at the 10,000th decimal place and beyond. This is the mathematical butterfly wing that causes even the best computers to go wildly out of sync with what they are forecasting; and for each initial state value there is another butterfly-wing slight difference, a lack of precision, so after a few thousand iterations the model bears no relation to reality. All that is before we have considered the accuracy of the initial states copied from nature. Most models have reality-check parameters to stop them going wildly out of the realm of possibility, but perhaps there is that possibility and we are living on a knife edge. Just some thoughts.
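A tiny demonstration of that precision point, using the logistic map as a stand-in for any chaotic model (my choice purely for illustration, not anything a forecasting centre actually runs): the same deterministic iteration in single and double precision parts company within a few dozen steps.

    # The same deterministic iteration run in single and double precision. A
    # difference at the ~7th decimal place is enough for the runs to diverge.
    import numpy as np

    r32, x32 = np.float32(3.9), np.float32(0.2)
    r64, x64 = np.float64(3.9), np.float64(0.2)

    for i in range(1, 101):
        x32 = r32 * x32 * (np.float32(1.0) - x32)
        x64 = r64 * x64 * (np.float64(1.0) - x64)
        if i % 20 == 0:
            print(f"iteration {i:3d}   float32 = {float(x32):.6f}   float64 = {x64:.6f}")
    # They agree for the first few dozen iterations, then bear no relation to
    # each other - the 'mathematical butterfly wing' in miniature.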
|
|
|
Post by nautonnier on Sept 30, 2019 10:02:55 GMT
We were doing some research that required us to keep weather information that we could relate to aircraft traffic. There is a HUGE amount of weather information, so a decision has to be made on the size of the bucket you can use. One of the ideas was just to take a degree of latitude and longitude. This seemed OK until we found the case of a small, tight tropical storm inside the degree box: the winds averaged out over the box and 'hid' the storm. The only indication it was there came from the winds in the adjacent degree boxes.
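A toy version of that degree-box problem, with an idealised vortex (solid-body core, entirely made-up numbers) sitting inside one box: the averaged wind components nearly cancel even though the peak winds inside are hurricane strength.

    # An idealised tight vortex centred in a single lat/lon box. Averaging the
    # wind components over the box all but cancels the rotation and hides it.
    import numpy as np

    n = 200
    x, y = np.meshgrid(np.linspace(-0.5, 0.5, n), np.linspace(-0.5, 0.5, n))
    r = np.maximum(np.sqrt(x**2 + y**2), 1e-9)

    v_max, r_core = 50.0, 0.15            # ~50 m/s peak winds, tight core
    speed = np.where(r < r_core, v_max * r / r_core, v_max * r_core / r)
    u, v = -speed * y / r, speed * x / r  # tangential (counter-clockwise) winds

    print(f"peak wind inside the box : {np.hypot(u, v).max():6.1f} m/s")
    print(f"box-averaged (u, v)      : ({u.mean():6.2f}, {v.mean():6.2f}) m/s")
    # The averaged vector is essentially zero: the storm is 'hidden', and only
    # the boxes around it would show anything unusual.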
|
|
|
Chaos
Sept 30, 2019 10:03:42 GMT
via mobile
Post by acidohm on Sept 30, 2019 10:03:42 GMT
Indeed Naut, sensible forecasters like Joe B often state that models are a tool, but experience can often tell where the model is off and what a more likely outcome would be. This is a wise approach. I'd say outside of 2 weeks we can gain an idea of trends, but even 7-10 days out can have wildly different outcomes from model output. When forecasting snow, and how much and where, in the UK for example (or any precipitation/cloud for that matter), it really becomes a 'nowcast' scenario. Models aren't really included in this process; it's actually a matter of watching radar and seeing how known conditions move in real time. I watched a frontal system with snow in February: parts of it sped up, parts of it stalled, and no-one knew where or if any stalling might occur, though as you can imagine, where it did stall is where the greatest amounts of snow were deposited (4" 🤣🤣🤣). There was no computer that could tell people at any location what would happen even 1 hr prior.... Of course, generally, no-one is too fussed with this kind of accuracy. If people are told it's going to rain, then whether it drizzles, pours or stays dry, the emotions are much less involved. If they're told it's going to snow, they pay much more attention and notice what happens... then the moaning at forecasters starts!
|
|
|
Chaos
Sept 30, 2019 10:06:36 GMT
via mobile
Post by acidohm on Sept 30, 2019 10:06:36 GMT
One might even suggest the effect of the micro on the macro: your favorite, water molecules. These may or may not condense and affect the adjacent molecules, which, scaled up, determines cloud cover etc.... Thought experiment: does a hurricane at some point develop or not on the basis of whether a single water molecule does or does not condense 🤔 Mathematically, I think the answer would have to be yes....
|
|
|
Chaos
Sept 30, 2019 10:13:02 GMT
Post by nautonnier on Sept 30, 2019 10:13:02 GMT
In one 'previous existence', a decision I had to make was whether or not to keep the 'snow team' on to keep the runways clear, and which anti-ice, if any, to put on the runway. I can remember going down to the met forecaster and asking if he thought it would snow; he threw the latest synoptic chart onto the desk and said, 'Well, there it is - you tell me.' The height of the zero-degree isotherm is really difficult, and with precipitation it can change anyway. In a later existence we were doing research into probabilistic forecasting of weather, and we were told by a road-building company that they could save huge amounts of money if they could be told a few hours in advance what the _type_ of rain was likely to be. With light rain or drizzle, tarmac laid is good quality and sets well; with heavier rain (there was a particular limiting drop size), tarmac cannot be laid. The worst case for them was trucks loaded with tarmac that cannot be laid - apparently that tarmac is waste and cannot be used. These are the 'customers' that Joe B is talking about a lot of the time.
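That rain-type example is essentially the classic cost-loss decision problem. Here is a minimal sketch with entirely made-up costs and probabilities (nothing below comes from the actual project mentioned):

    # A cost-loss sketch of the tarmac decision. All numbers are hypothetical.
    def dispatch_tarmac(p_heavy_rain, cost_of_waiting=2000.0, loss_if_wasted=15000.0):
        """Send the trucks only if the expected loss from a wasted load is
        smaller than the known cost of standing the crew down and waiting."""
        return p_heavy_rain * loss_if_wasted < cost_of_waiting

    for p in (0.05, 0.10, 0.20, 0.50):
        print(f"P(heavy rain) = {p:.2f}  ->  dispatch: {dispatch_tarmac(p)}")
    # Break-even sits at 2000/15000, roughly a 13% chance of heavy rain - the
    # kind of question only a probabilistic forecast can answer.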
|
|
|
Chaos
Sept 30, 2019 10:20:26 GMT
via mobile
Post by acidohm on Sept 30, 2019 10:20:26 GMT
Good point Naut, I had the general public in mind, but yes, there's certainly a difference in the interests of some!
|
|
|
Post by blustnmtn on Oct 7, 2019 23:31:11 GMT
|
|
|
Chaos
Oct 8, 2019 11:42:59 GMT
Post by nautonnier on Oct 8, 2019 11:42:59 GMT
Computers are not making mistakes - people who should know better are trying to model chaotic systems with deterministic computers, with all sorts of limitations in the software, the compiling, the assembling, etc. Nowadays many software engineers have not even gone deeper than the virtual machines they are 'programming'. They have no idea about the inherent limitations of the instantiation of that virtual service. Then they happily iterate millions of times in recursive high-level programs to roll forward to the next century. Back at the end of the 1960s, when commercial computing was in its infancy, there were many jokes, usually about a clerk who would say, as two truckloads of paperclips were delivered, 'It must be right, the order was issued by the computer.' Now we have climate scientists effectively saying the same thing: 'The forecast of boiling oceans must be right, the forecast was generated by the computer model.'
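For what it's worth, the low-level limitation is easy to show: floating-point arithmetic is not the real-number arithmetic the equations assume, and the tiny round-off is exactly what accumulates when a program iterates millions of times. A minimal example:

    # Addition is not even associative in floating point, and round-off
    # accumulates over repeated iteration of the 'same' operation.
    print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))   # False

    total = 0.0
    for _ in range(1_000_000):
        total += 0.1          # the exact answer would be 100000.0
    print(total)              # ...but round-off has already crept in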
|
|
|
Post by Ratty on Oct 8, 2019 12:11:27 GMT
Back in the 1970s, automation was creeping into many of the systems associated with large airliners. One day, after the boffins and engineers had laboured mightily for many weeks, a fully-loaded Convair 880 stood ready at Heathrow, bound for New York. The cabin crew did the normal safety demonstration and the aircraft taxied out to the active runway, lined up and took off in the usual manner. As the Convair climbed through about 26,000 feet, an announcement came from the flight-deck:
"Ladies and Gentlemen, welcome onboard this the first fully-automated transatlantic flight from London to New York. So advanced are the automatic systems onboard this specially-equipped Convair 880, there is no actual flight crew onboard in the flight-deck, the door to which is therefore locked. The entire flight-plan, with all imaginable contingencies, has been programmed into quadruplicated flight management computers, all backup systems are duplicated and there is a fifth, entirely separate set of automatic systems in case of any unforeseen problems. So relax, sit back, enjoy the cabin service from our excellent crew, and again we hasten to assure any of you who may feel slightly apprehensive about this flight that nothing, I repeat, absolutely nothing can go wrong, go wrong, go wrong, go wrong, go wrong..."
|
|