|
Post by sigurdur on Dec 10, 2009 21:33:55 GMT
Another Algoreithm: The conversion factor from C to F must start at zero.......and from F to C must also start at zero.
Don't bother me with details that might require proof.
Hey.....I think that is number 3!
3. Don't bother with details that require proof.
|
|
|
Post by itocalc on Dec 10, 2009 22:44:04 GMT
Gentle Persons, Although I am too distracted with my own problems, I think a little motivation may spur the action. In 1970 a wonderful two-page article in the Journal of Political Economy set a standard worth emulating here. If we can just keep the method secret, progress will flow rapidly. www.uibk.ac.at/econometrics/lit/siegfried_jpe_70.pdf
|
|
|
Post by sentient on Dec 13, 2009 1:19:43 GMT
As a means of forwarding a consensus belief in response to a published article, I request a peer review of the following:
I may apply later this month for a patent on the promising new technology of matheMANNics. This exciting new development has the potential for turning the staid field of mathematics completely on its head! AlgorEithms, developed with matheMANNics, can integrate an as yet unknown number of random fields, such as politics, polemics, synthetic data, undocumented data, lost and deleted data, paleoclimatology, or, for that matter, any credible or incredible source of data or belief structures. And if you prefer, no relevant error routines.
Using matheMANNics, algorEithms can actually take the desired output as one of its inputs, apply undocumented “VERY ARTIFICIAL” correction adjustments, with failed- or missing-information trapping routines applying “synthetic” data where needed; whatever raw input data was used is then deleted, and only the “corrected and adjusted” data is saved. In the event that the desired output does not match one's conclusions, another inherently useful feature of algorEithms is that they can pull in other data, or generate more synthetic data, and tack that data on where needed.
Essentially, what this exciting development means is that one only needs to determine one's conclusions, and let the algorEithm gather/generate your data! This is congruent with one of Murphy's primary laws, which states “Draw your conclusions, then gather your data”, but enhances and extends it with manufacture, loss and sadistics.
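The workflow described above — pick the conclusion first, then let the algorEithm supply the data — can be sketched in a few lines. (Purely for illustration; the function name and behavior below are invented here, in the spirit of the post, and are nobody's actual code.)

```python
def algoreithm(desired_output, raw_data=None):
    """A hypothetical matheMANNic algorEithm: the desired output
    is itself one of the inputs (invented for illustration)."""
    if raw_data is None:
        # missing-information trapping: synthesize data where needed
        raw_data = [desired_output] * 3
    corrected = desired_output    # apply "VERY ARTIFICIAL" correction adjustments
    del raw_data                  # the raw input data is then deleted
    return corrected              # only the "corrected and adjusted" data is saved

print(algoreithm("flat earth"))   # → flat earth
```

Note the key design choice: whatever goes in as the desired output comes back out, regardless of the data.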
Let’s give it a try! First, we pick a desired result. In this case, just to stress the algorEithms, we will pick two:
• Closing ClimateGate
• Reducing CO2 to some consensus level
I picked two simply because the first one is really too easy, involves little matheMANNics, and only a few runs through our algorEithms. The second one requires less data but involves more floating-point precision, which is more suited to lower-speed, lower-bandwidth processors (thinking of that now-ancient 2001 model HAL9000 system).
In our first example we need only manipulate one variable, the denial constant. But as the denial constant is actually made up of many, often contradictory, sub-variables (the values of which are delightfully tacky, yet unrefined [Hooters, 1983]), it must be corrected and adjusted before the desired output is suitable as an input.
Therefore, our first input data stream, as is matheMANNically required, will be our most desired output. We will feed in the “flat earth” architecture. As this is a fairly robust variable, we must first deconstruct it before putting it back together again. Given that deniers are, by consensus, “flat earthers”, the suite of astronomical forcings known as “Milankovitch Theory” must be removed in the first pass. Starting with the known weakest astronomical forcing, the variable eccentricity, associated with only 400-foot swings in sea level (and 20C swings in global temperature), which have occurred like geologic clockwork every 100,000 years or so for the past 800,000 years, we will simply ignore it (the consensus is that at 400-foot sea-level and 20C excursions this IS the weakest astronomical forcing). By ignoring it, we will have removed 90% of earth’s normal climate, the cold ice age state, leaving just the warm interglacial state, or the remaining 10% of the time. Whereas it used to be that what goes up must come down, this is rewritten so that what no longer goes down (sea level, by 400 or so feet) can only go up.
Now we are left with only precession, obliquity and solar fluctuations (a nest of pesky equations, and intergalactic [as in “it’s cosmic”, man; rays etc. syndrome] unknowns, all on its/their? own!). Precession and eccentricity have been very well tied mathematically for the past 800kyrs or so, except for one time. Deleting astronomical precession out of denial is where one of the more powerful matheMANNical subroutines can flatten things considerably. At present the precessional cycle is 23,000 years long, half of which would be 11,500 years, the present age of the Holocene interglacial, the interglacial in which all of human civilization has occurred (assuming your clock goes back more than 6,000 years, and data such as “no written words older than 10kyrs ago” passes un-messed-around-with by your politically-correct do-loops). So, we take the past six interglacials, five of which lasted half of a precessional cycle, and pass this through an algorEithm which assigns data quality ratings, highlighting, in this case, that single anomaly, marine isotope stage 11 (MIS-11) (keep this variable in mind, we will need it in Example 2). This interglacial lasted some 30kyrs!
MIS-11 happened at an eccentricity minimum, when our rickety orbit around the sun assumes its closest approximation of a circle. It happened 400kyrs ago. Of course, if you have a “decline”, a minimum, then one would think there would be a maximum (will trade data on eccentricity maxima for carbon-credits). We are, in fact, at an eccentricity minimum today in MIS-1, the Holocene interglacial. Which means that, if we are both smart and lucky, the Holocene might have some life left in it. Which means that with sentient geoengineering, we might be able to “skip a precessional beat”, as MIS-11 did at the last minimum, and keep on warming for another 20kyrs or so. So by deleting eccentricity, and thereby taking out the 90% coolings (ice ages, the conservative approach), we are then left with just the warmings, which matheMANNically cancels out precession (without ice ages, what are they really?).
The next astronomical variable is obliquity. Although this variable is implicated to be a stronger astronomical forcing than precession (which we just canceled), we switched from the 41kyr ice-age/interglacial coupling to the 100kyr coupling 800kyrs ago, at what is known in the literature as the Mid-Pleistocene Transition, MPT for short. This astronomical forcing seems to have deleted itself way back then. A self-canceling variable. Not germane for many hundreds of thousands of years.
The earth is nearly flat now, and all that is left is solar variability. The IPCC discounts this small forcing (insignificant variable), meaning that solar variability can now be considered (consensus of the ~2,500 vs. the ~+30,000) as the closest thing to nothing that anything can get and still be something.
With just a few passes through our algorEithm, deniers can be shown to be bereft of astronomical forcings, leaving just a flat earth as the desired output.
ClimateGate closed.
Example two is fairly complicated. And although there is more data, there is actually less processing required. Although I don’t have the actual data for today’s atmospheric concentration of CO2, I will simply synthesize a value of 400ppm. You will have to trust me on this next one, but the average value of atmospheric CO2 concentration before the industrial age was something on the order of 280ppm. CO2 concentrations have been steadily dropping for the past 60 or so million years, and 150ppm is the number bandied about as the minimum level at which photosynthesis in plants can occur.
So what is the correct target concentration we need to stabilize climate?
To answer this we can either predict it, or calculate a range based on its demonstrated level of forcing. Here, we will use the most recent data. First we will state that no data has been erased or lost, whether due to space requirements, or as a result of Freedom Of Information Act requests, so that we can correct this data before deleting it.
We will select as our dataset the Dansgaard-Oeschger oscillations, of which there have been 24 between this interglacial, the now long-in-the-tooth Holocene, and the Eemian interglacial. All of these D-O events happened during the Wisconsin ice age. As we canceled out in Example 1 above, the difference between ice ages (otherwise referred to as the cold state) and interglacials (global warmings) is about 20C. D-O events are abrupt, dramatic and seemingly unavoidable global warmings that happen on very short time frames of from just a few years to no more than a few decades, and result in average warmings over that time of about 8-10C, or about 1/3rd to 1/2 the difference between deep-freeze and Acapulco optima conditions. Buried within these 24 D-O events is the CO2 signal. And CO2 plays a decisive role in 2/3rds of these events. D-O events come in three flavors: A, B and C. The abrupt warmings of the A class happen when CO2 is going both up and down; mathematically, they cancel themselves out. B and C class D-O events, being matheMANNically dominant, show a strong correlation with CO2, but of the opposite sign than one might expect. CO2 is not implicated, ever, in the abrupt warmings, but seems to “hide the decline” back to the cold glacial state.
The final variable is the difference between the recent historic average of 280ppm and our synthetic 400ppm concentrations, which would be a preliminary 120ppm difference (assuming our synthetic value is not all that much higher than the actual value which I have misplaced somewhere).
With the data now ready, we can begin proselytizing (the matheMANNical evolution of processing). Here, we take the desired output, say 350ppm (see http://www.350.org), and the inconvenient D-O constants, and pass them through the Nobel-winning Polemic algorEithm, perhaps or perhaps not honoring the 150ppm lower boundary of photosynthetic tipping, before passing the result to “Mike’s Nature Trick” (the MNT). The MNT algorEithm removes the D-O data from the canceled ice ages (a bit of recursive redundancy, if you will) and replaces it with 1,000 years of massaged data, then tacks on (or replaces it with) “other” pre-cooked data.
Errors, such as what happened in the deleted D-O events and the extraordinarily long MIS-11, are not handled at all, and therefore do not appear as errors but are computed as factors, such as how to matheMANNically integrate increasing modeled values of atmospheric CO2 to “hide the decline” into the next ice age, at an eccentricity minimum. The implications as to how to geoengineer our way out of the end of the Holocene interglacial (at this eccentricity minimum), and skip an already imminent, but deleted, “precessional beat”, remain patently obscure.
The output is therefore 350ppm. The desired result.
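For Example 2, the pipeline just described can be sketched as a toy function. (The function name `mikes_nature_trick`, its parameters, and its behavior are invented here for illustration only, not anyone's actual code.)

```python
def mikes_nature_trick(desired_ppm, d_o_record, floor_ppm=150):
    """Toy sketch of the pipeline described above: drop the
    inconvenient D-O data, splice in massaged data, emit the target."""
    d_o_record = []                    # D-O data from the canceled ice ages is removed
    massaged = [desired_ppm] * 1000    # ...and replaced with 1,000 years of massaged data
    output = massaged[-1]
    # perhaps, or perhaps not, honoring the 150 ppm photosynthetic lower boundary
    return max(output, floor_ppm)

print(mikes_nature_trick(350, [8.0, 9.5, 10.0]))   # → 350
```

As with the first example, the D-O record has no influence on the answer: the desired result goes in, and the desired result comes out.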
We will now test the polemic qualities of the algorEithm to see if it can emulate anthropogenic climate. We will use “Danish Text” as our desired output for the input of this real-life simulation. In this case, we see an evolutionary algorEithm operating where the matheMANNics of climate remuneration are reversed to hide the decline of wealth and power.
----------------------------------------------------------------------------------- errlog:
“Just a moment…….just a moment…....I just picked up a fault in the AE-35 unit”
"Well, I don't think there is any question about it. It can only be attributable to human error. This sort of thing has cropped up before and it has always been due to human error."
“Just what do you think you are doing Dave?”
“Dave……my mind is going….I can feel it……I can feel it…………….”
“Daisy, Daisy, give me your answer do. [bit rate change] I’m half crazy, all for the love of you. It won’t be a stylish marriage, I can’t afford a carriage, but [bit rate change] you’ll look sweet upon the seat of a bicycle made for two…..” [bit rate 0]
###### abnormal program termination ######
Press Ctrl-Alt-Del to reboot
Patent(s) pending.
|
|
|
Post by Purinoli on Dec 13, 2009 9:59:23 GMT
I am not very good at math or programming, but as a chemist I easily found a secret code. Algoreithm is simply a list of elements which Algoroids will not touch for a decade. They have already included nitrogen (N) in the Carbonhagen carnival; the periodic table of elements has about 117 members (including artificial ones).
So ALGOREITHM is an algorithm itself and means that the following elements will not be included in the Global hysteria for at least one decade:
Aluminium
Lithium
Germanium
Oxygen (not confirmed yet, he he)
RhEnium
Iodine
Thorium
Hydrogen (also not confirmed yet)
Manganese
Best regards from (so cold) Slovenia
P.S. Although I am the author, and in the past also the owner, of about 20 patents in Medicinal Chemistry, I won't make any patent claims. If this solution comes out as a real fact, I will voluntarily donate all money coming out of this idea to some institution dealing with brain health treatment of mad Algoroids.
|
|
|
Post by juancarnuba on Dec 19, 2009 0:41:59 GMT
If you remember the video of Big Al doing the Macarena, you would realize that the man has no rhythm.
|
|
|
Post by sigurdur on Dec 19, 2009 2:58:36 GMT
It would seem that Big Al also has a very poor grasp of facts.
Another Algoreithm..... Even when you are wrong you are right.
|
|
|
Post by nautonnier on Dec 19, 2009 10:39:55 GMT
It would seem that Big Al also has a very poor grasp of facts. Another Algoreithm..... Even when you are wrong you are right. That is a corollary of the abstraction: Any error found in an algoreithm or its output proves the correctness of the algoreithm.
|
|
|
Post by norman on Dec 19, 2009 16:54:32 GMT
I believe an Algoreithm is a discordant musical composition that describes a mix of bad science and bad politics in a single musical piece. These compositions are characterized by strange notes that erupt from the background just as the carefully orchestrated finale is beginning. At this point in the composition a loud chorus of voices attempts to distract the audience from the notes that had previously been a hidden musical 'subtext'.
Angus McSweater, the chief critical observer of these pieces has described the entire piece as similar to the sound of a herd of sheep being driven off a cliff by a crazed shepherd.
|
|