Eating your own dog food...

This topic is closed.

    #46
    Re: Eating your own dog food...

    Okay, I guess I just have to put a cent or two into this as well.

    Solar System's Path May Have Spurred Ice Ages
    Windows no longer obstructs my view.
    Using Kubuntu Linux since March 23, 2007.
    "It is a capital mistake to theorize before one has data." - Sherlock Holmes



      #47
      Re: Eating your own dog food...

      I read that article several years ago! A recent variation of that hypothesis explains that cosmic rays are not shielded as well when the solar system pops up out of the galactic dust, so both the earth and the atmosphere are heated by increased cosmic rays.

      Here is a physics paper by Latvian physicists on atmospheric science, discussing climate as a result of Earth heat reflection. An excellent IR spectrum graph on page 33 shows the IR spectrum of H2O, CO2, O2, CH4 and NO. The total absorption spectrum shows very clearly that the major part of the water vapor "window", which passes IR away from the planet, is not blocked by any significant changes in the levels of CO2. On page 36 they describe how more than half of the necessary IR radiation into space is accomplished by heated currents of air rising high into the atmosphere, where the water vapor is precipitated out, releasing heat above the lower levels where water vapor can be as much as 2% of the atmosphere. That latent heat radiates away without having to pass through a narrow window and can completely bypass most of the CO2, thus making the amount of CO2 irrelevant. This explains why, in the historical data, CO2 levels trail temperature. All in all, a great read, and one which takes care to explain the science and mathematics behind atmospheric behavior.
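      Just to get a feel for the size of that "window": here is a back-of-the-envelope sketch (mine, not the paper's) that integrates Planck's law for a 288 K surface and asks what fraction of the emitted power falls in the commonly quoted 8-13 micron atmospheric window. The surface temperature and band limits are my assumptions, not figures taken from the paper.

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
K = 1.380649e-23     # Boltzmann constant (J/K)

def planck(lam, T):
    """Planck spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return (2.0 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * T))

def band_power(lo, hi, T, n=20000):
    """Hemispheric emitted flux (W/m^2) between wavelengths lo..hi (m),
    via the trapezoidal rule; the factor pi converts radiance to flux
    for an isotropic (Lambertian) emitter."""
    step = (hi - lo) / n
    total = 0.5 * (planck(lo, T) + planck(hi, T))
    for i in range(1, n):
        total += planck(lo + i * step, T)
    return math.pi * total * step

T = 288.0                                   # rough global-mean surface temp, K
total = band_power(1e-7, 1e-3, T)           # essentially the whole spectrum
window = band_power(8e-6, 13e-6, T)         # the 8-13 micron "window"
sigma_T4 = 5.670374419e-8 * T**4            # Stefan-Boltzmann sanity check

print(f"sigma*T^4      : {sigma_T4:7.1f} W/m^2")
print(f"integrated     : {total:7.1f} W/m^2")
print(f"window fraction: {window / total:.2f}")
```

      In this toy calculation the window carries roughly a third of the surface emission, which is why that band matters so much as an escape route for IR.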
      "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
      – John F. Kennedy, February 26, 1962.



        #48
        Re: Eating your own dog food...

        Here's a galactic dust research paper, BUT it takes off on global warming using the CRU data, which the CRU's own emails have now shown to be totally worthless.
        "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
        – John F. Kennedy, February 26, 1962.



          #49
          Re: Eating your own dog food...

          I knew they were lying before this story came out. I'm glad they have been exposed. This is a scheme by The U.N and Socialists from Western Governments to increase the scope and size of Government both locally and globally. It's a huge tax and power grab. It is just another step to a loss of freedom. It will increase poverty. It will increase the gap between rich and poor. It will end in tyranny and oppression if the Socialists succeed.



            #50
            Re: Eating your own dog food...

            Interesting article in the Economist about trusting scientists:
            http://www.economist.com/blogs/democ...ust_scientists



              #51
              Re: Eating your own dog food...

              Ya, saw that. I went looking for the author, but the piece wasn't signed. Interesting, that. How do you check their credentials?

              I thought it was also interesting when he wrote:
              So is it reasonable, if the GHCN is using complex statistical tools to adjust the temperature readings at Darwin based on surrounding stations, that they might come up with the figures they came up with? Sure. No. Yes. I have no idea. And neither does Mr Eschenbach. Because in order to judge that, you would have to have a graduate-level understanding of statistical modeling.
              Complex tools, huh? Graduate level, huh? We'll see....

              He goes on to disqualify himself:
              I don't understand that formula. I don't have the math for it. The paper goes on to reject the Trewin formula for reasons which, again, I don't have the math to understand. This is academic-level statistics.

              Then he resorts to Al Gore style character assassinations:
              climate change denialists
              , which is supposed to end the debate, not by arguing data and its treatment, but by equating those who reject the AGW null hypothesis with those who deny the existence of Auschwitz.

              But, no matter. For his rebuttal he calls on members of the CRU gang, and their supporters, to rebut Eschenbach's analysis. But, Eschenbach rebuts the rebuttal.


              Oh, about the necessity of using "complex statistical tools" or possessing a "graduate-level understanding of statistical modeling" in order to understand or criticize the reports ... that is a really ludicrous charge to level at Eschenbach, when "HARRY", the fellow who wrote the 700 KB "HARRY_README.TXT" document that is part of the FOIA zip file, made significant criticisms of the "flagship product" (the "deliverables"):

              Line 3277:
              Back to the gridding. I am seriously worried that our flagship gridded data product is produced by Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived at from a statistical perspective - since we're using an off-the-shelf product that isn't documented sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?

              Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding procedure? Of course, it's too late for me to fix it too. Meh.
              The "homogenization" of data was the criticism that Eschenbach leveled at the CRU's data.

              And, "HARRY" wrote of his own ability:
              Line 8688:
              So, once again I don't understand statistics. Quel surprise, given that I haven't had any training in stats in my entire life, unless you count A-level maths.
              You may think that "A-Level Maths" is devoid of statistics. It is not. Comparing it to what I had to take, I would equate the statistics section of A-Level Maths to what I had to know in order to qualify for graduate school. That assumes, of course, that the URL I cited for A-Level Maths is similar to what "HARRY" had to take. Sorry, but it's not rocket science. In fact, it reads like the table of contents of a standard textbook on statistics.

              But, to his credit, "HARRY" had a conscience, at line 8718:

              Another problem. Apparently I should have derived TMN and TMX from DTR and TMP, as that's what v2.10 did and that's what people expect. I disagree with publishing datasets that are simple arithmetic derivations of other datasets published at the same time, when the real data could be published instead.. but no.

              Line 11182:
              So to CLOUD. For over a year, rumours have been circulating that money had been found to pay somebody for a month to recreate Mark New's coefficients. But it never quite gelled. Now, at last, someone's producing them! Unfortunately.. it's me.
              ...
              Line 12563:
              Sometimes life is just too hard. It's after midnight - again. And I'm doing all this over VNC in 256 colours, which hurts. Anyway, the above line counts. I don't know which is the more worrying - the fact that adding the CLIMAT updates lost us 1251 lines from tmax but gained us 1448 for tmin, or that the BOM additions added sod all.
              And yes - I've checked, the int2 and int3 databases are IDENTICAL. Aaaarrgghhhhh.
              ...
              Line 12652:
              I really thought I was cracking this project. But every time, it ends up worse than before.
              ...
              line 13228:
              ARGH. Just went back to check on synthetic production. Apparently - I have no memory of this at all - we're not doing observed rain days! It's all synthetic from 1990 onwards. So I'm going to need conditionals in the update program to handle that. And separate gridding before 1989. And what TF happens to station counts?

              OH F**K THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm hitting yet another problem that's based on the hopeless state of our databases. There is no uniform data integrity, it's just a catalogue of issues that continues to grow as they're found.
              ...
              Line 1367:
              So, we need to be able to write six-month binaries. Oh, my giddy aunt. What a crap crap system. We'll have to switch to monthly binaries, it's the only unambiguous way. Meaning major modifications to numerous IDL roglets. F**k. Everything from the main progs (vap_gts_anom, quick_interp_tdm2, etc) to the supporting ones (rdbin for one).
              As a programmer I found this interesting, because HARRY is on Linux, using the g77 Fortran compiler:

              Line 13963:
              **sigh** WHAT THE HELL'S GOING ON?! Well, time to ask the compiler. So I recompiled as follows:

              g77 -o update -Wall -Wsurprising -fbounds-check programs/fortran/update.for

              Then, I re-ran. This time I got an error almost immediately:

              Producing anomalies
              Subscript out of range on file line 1011, procedure programs/fortran/update.for/MAIN.
              Attempt to access the 6-th element of variable dobin25[subscript-1-of-2].
              Abort (core dumped)

              Hurrah! In a way.. thyat bug was easy enough, I'd just forgotten to put an extra test (ipar.le.5) in the test for binary production, so as it was in a 1..8 loop, there was bound (ho ho) to be trouble. There was a second, identical, instance.

              After all that - final success:
              ..... lots of data printout ...
              All work completed satisfactorarily
              see: logs/completion/infolog.0905070939.dat
              and: logs/logs.0905070939/update.0905070939.log
              ...
              Meaning that a complete 1901-2008 run will need about 14gb of working data and the resulting files will need approximately 16gb. All gzipped!!
              16 GB of data for the complete 1901-2008 run. Zipped!
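              The bug in that log excerpt is easy to reproduce in miniature: an 8-iteration parameter loop indexing a 5-element array, which g77's -fbounds-check catches at run time, and which the guard HARRY describes (his "ipar.le.5" test) fixes. A toy Python re-creation; every name here is invented for illustration, none of it is from update.for:

```python
# 5-element flag array standing in for the "dobin25"-like variable:
N_BINARY_PARAMS = 5
do_binary = [True] * N_BINARY_PARAMS

def process_unguarded():
    # The original bug: a 1..8 parameter loop indexing the 5-element array,
    # so parameters 6..8 read past the end.
    return [ipar for ipar in range(1, 9) if do_binary[ipar - 1]]

def process_guarded():
    # HARRY's fix: test ipar against the array length before indexing,
    # so parameters 6..8 never touch the flag array.
    return [ipar for ipar in range(1, 9)
            if ipar <= N_BINARY_PARAMS and do_binary[ipar - 1]]

try:
    process_unguarded()
except IndexError:
    # The Python analogue of g77's run-time "Subscript out of range" abort.
    print("Subscript out of range")

print(process_guarded())   # -> [1, 2, 3, 4, 5]
```

              Without bounds checking, a Fortran program would silently read whatever memory follows the array, which is exactly why recompiling with -Wall -Wsurprising -fbounds-check surfaced the bug "almost immediately".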

              I'm surmising that "HARRY" was a graduate student. What follows next is similar to what I experienced in grad school when I presented some preliminary data to my major professor and he picked it apart ... just the way "Tim" did with "HARRY":

              Line 14052:
              Then, of course (or 'at last', depending on your perspective), Tim O had a look at the data with that analytical brain thingy he's got. Oooops. Lots of wild values, even for TMP and PRE - and that's compared to the previous output!! Yes, this is comparing the automated 1901-2008 files with the 1901-June2006 files, not with CRu TS 2.1. So, you guessed it, bonnet up again.
              ...
              Line 14110:
              Now, this is a clear indication that the standard deviation limits are not being applied. Which is extremely bad news. So I had a drains-up on anomauto.for.. and.. yup, my awful programming strikes again.
              Notice that they are using the complete 1901-2008 data, indicating that this part of "HARRY"'s documentation was made sometime this year, for a paper printed later this year or yet to be printed.

              The anonymous Economist author remarked negatively about the use of Excel as a tool for statistics, but "HARRY" wasn't using anything much better. He was using a tool, Fortran, which he thought didn't fit the job, and he was doing quick-and-dirty work with Matlab:

              Line 14622:
              This time around, (dedupedb.for), I took as simple an approach as possible - and almost immediately hit a problem that's generic but which doesn't seem to get much attention: what's the minimum n for a reliable standard deviation?

              I wrote a quick Matlab proglet, stdevtest2.m, which takes a 12-column matrix of values and, for each month,
              calculates standard deviations using sliding windows of increasing size - finishing with the whole vector and what's taken to be *the* standard deviation.

              The results are depressing. For Paris, with 237 years, +/- 20% of the real value was possible with even 40 values. Windter months were more variable than Summer ones of course. What we really need, and I don't think it'll happen of course, is a set of metrics (by latitude band perhaps) so that we have a broad measure of the acceptable minimum value count for a given month and location. Even better, a confidence figure that allowed the actual standard deviation comparison to be made with a looseness proportional to the sample size.

              All that's beyond me - statistically and in terms of time. I'm going to have to say '30'.. it's pretty good apart from DJF. For the one station I've looked at.
              Excel or OOo Calc can be programmed to do what he did with Matlab.
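              HARRY's stdevtest2.m experiment is easy to re-create: take a long series of monthly values, compute the standard deviation of the first n values for growing n, and compare each against the whole-series value. A sketch following his description, with a seeded random stand-in series (not real Paris data):

```python
import random
import statistics

def stdev_vs_n(values, start=10, step=10):
    """For growing sample sizes n, compare the stdev of the first n values
    with the stdev of the whole series (taken as *the* value), returning
    (n, relative_error) pairs -- the experiment stdevtest2.m describes."""
    full = statistics.stdev(values)
    return [(n, abs(statistics.stdev(values[:n]) - full) / full)
            for n in range(start, len(values) + 1, step)]

# Hypothetical "237 years of January temperatures" -- a random stand-in.
random.seed(1)
jan = [5.0 + random.gauss(0.0, 3.0) for _ in range(237)]

for n, err in stdev_vs_n(jan):
    if n in (10, 40, 100, 230):
        print(f"n={n:3d}  relative error {err:5.1%}")
```

              The point HARRY was making drops straight out: for small n the estimate can sit a long way from the full-series value, and there is no single magic minimum count, which is why his '30' was an admitted compromise. And yes, a spreadsheet could run the same experiment.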

              "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
              – John F. Kennedy, February 26, 1962.



                #52
                Re: Eating your own dog food...

                I went looking for the author but it wasn't signed. Interesting that. How do you check their credentials?
                Why should I? To turn this into an argument from authority, or an ad hominem? What matters is the reasoning and the facts, not who makes the case.



                  #53
                  Re: Eating your own dog food...

                  Originally posted by Adrian
                  I went looking for the author but it wasn't signed. Interesting that. How do you check their credentials?
                  Why should I? ...... Important is the reasoning and the facts not who makes the case.
                  Because that is what the anonymous author of that article you referenced did. But, I agree with you. That's why I cited the FACTS in the "HARRY_README.TXT" document, and the rest of the files in the FOIA.zip file.

                  When Al Gore was asked if the contents of the FOIA emails would damage AGW credibility, he claimed he had read "most" of them, that the most recent was ten years old, and that none of them contained any information which would call into doubt the "science" behind AGW. But when you check the emails you see that the most recent was JUST A MONTH AGO, November 12, 2009, at 2:47pm, and the emails show that even the CRU staff discounted their own proxies and results. It was the "agenda" which was important, as Phil Jones related to a Greenpeace member.

                  When the media publish reports on the AGW controversy they always conclude with a scientist who claims that while the CRU may have "internal" problems, the "science" behind the data is sound. This is a strange claim, because for 300 years reputable scientists have published the data supporting their papers and the conclusions they arrived at, so that their work could be reviewed by their peers. The CRU staff conspired to withhold the data supporting the "Hockey Stick" graph on which the Kyoto accords were based. When Mann's graph was finally debunked as the fictitious result of a methodology which could take red noise and produce a hockey stick, they immediately replaced it with more "proxy" data, which "HARRY" refers to as "synthetic" data, to generate yet another hockey stick.

                  They resisted FOIA requests on the Yamal data. When the Yamal data was finally obtained, because Briffa made a blunder and published it in a journal which DEMANDED he archive the data, other researchers analyzed the data and revealed how Briffa fudged it to get yet another hockey stick by relying on a dozen hand-picked tree rings out of hundreds. When asked about the data they claimed (1256765544.txt) it was "thrown away to save room", but they had plenty of room for their synthetic proxy data.

                  When reputable scientists with research and data which contradicted AGW attempted to get their research published in the JoC, or some other climate journal, the CRU conspired behind the scenes to block their papers, or wrote scathing reviews of them, even questioning the quality of their PhD degrees (1255538481.txt) and the integrity of their major professors! And when the CRU's papers were criticized, the CRU conspired to get the reviewer thrown off the review panel, which they then got the journals to stuff with researchers who supported the AGW hypothesis. A real stacked deck.

                  Then, when Britain passed a Freedom of Information Act, and several researchers filed an FOIA request, Phil Jones and his pals conspired to delay or block a legal request. It was even revealed in the emails that Jones and the others at the CRU colluded with certain officials of the government office handling the request. Before that collusion, Jones wrote that if the FOIA request were to become a reality he would delete his files and emails, and he sent an email to his colleagues asking them to do the same.

                  It is important that those who want to fairly evaluate the CRU and the AGW work read the contents of the FOIA.zip.
                  The attitudes and actions in those emails are not the attitudes and behaviors of honest scientists.
                  "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
                  – John F. Kennedy, February 26, 1962.



                    #54
                    Re: Eating your own dog food...

                    Originally posted by Adrian
                    Interesting article in the Economist about trusting scientists:
                    http://www.economist.com/blogs/democ...ust_scientists
                    I noticed near the end of his criticism that the anonymous author stated:
                    Oh, and by the way: October was the hottest month on record in Darwin, Australia.
                    A little while ago, while perusing the Internet, I saw this weather news story:
                    A snowy dusting in Victoria's summer
                    Sam Terry, Friday December 11, 2009 - 18:12 EDT

                    Most people consider summer a time to wear shorts and thongs wherever one pleases, with little thought of ski jackets or snowboards. However Victoria's Mount Baw Baw saw a light dusting of snow, and it's already two weeks into summer.

                    A cold front crossed the nation's southeast during Thursday, bringing gusty winds and some good falls to southern Victoria.

                    Mount Baw Baw was one of the many locations to receive these falls, 31 millimetres in fact. Part of this was snow, around five centimetres, not enough to rejuvenate the ski season, but enough to create an excited buzz.

                    Melbourne itself didn't miss out. The city recorded 10 millimetres, now making a total (so far) of 30 millimetres.

                    Surrounding suburbs also had quenched rain gauges, with the largest falls over the eastern suburbs, right in the way of the southwesterly winds behind the front.

                    Saturday will see winds ease and isolated showers abate in southern Victoria. It is also unlikely that there'll be any more summer snow... at least until next Thursday.

                    - Weatherzone

                    © Weatherzone 2009
                    Amazing! TWO WEEKS AFTER the beginning of summer, Victoria's mountains are getting SNOW, and Melbourne itself collected 30 mm (over an inch) of rain.

                    Darwin is 12 degrees south of the equator. Melbourne is 38 degrees south, which is about as far south of the equator as Wichita, KS is north of it, except that Melbourne is on the coast of southern Australia, and water has a tendency to buffer local temperature changes. I've never heard of snow anywhere in Kansas on July 4th, two weeks after June 21st.


                    Melbourne is an area of extremes. Last month they reported the hottest November since records began (150 years), but this month, two weeks into December, they have snow.
                    "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
                    – John F. Kennedy, February 26, 1962.



                      #55
                      Re: Eating your own dog food...

                      For those wanting to search the FOIA emails from the British Climatic Research Unit (CRU): they are now online and, thanks to a MySQL database, searchable. You can search them HERE.
                      "A nation that is afraid to let its people judge the truth and falsehood in an open market is a nation that is afraid of its people.”
                      – John F. Kennedy, February 26, 1962.

