NCDC charged with manipulating data to “prove” global warming


SAN DIEGO, January 14, 2010 – Asheville’s National Climatic Data Center has been charged with manipulating data to achieve a desired global warming trend, according to John Coleman, founder of The Weather Channel.

Blogs and articles appeared on several reputable websites this week charging that a federal agency, the National Oceanic and Atmospheric Administration (NOAA), via the National Climatic Data Center (NCDC), has “cooked the books” in the Global Historical Climatology Network database that is used by governments, private agencies and non-governmental organizations around the world to understand the world’s climate.

Coleman, now the senior meteorologist for KUSI-TV in San Diego, CA, summarized the charges on the air and on the station’s website. (This article includes most of Coleman’s posting.)

It has been revealed, Coleman wrote, that a “sleight of hand” was used in the computer program that rated 2005 as “THE WARMEST YEAR ON RECORD.”

Skeptical climate researchers have discovered extensive manipulation of the data within the U.S. Government’s two primary climate centers: the National Climatic Data Center (NCDC) in Asheville, North Carolina, and the NASA Goddard Institute for Space Studies (GISS) at Columbia University in New York City. These centers are being accused of creating a strong bias toward warmer temperatures through a system that dramatically trimmed the number, and cherry-picked the locations, of the weather observation stations used to produce the data set on which temperature record reports are based.

According to Coleman and to a summary of reports by computer programmer E. Michael Smith, writing at chiefio.wordpress.com, the system has been distorted in several ways. Coleman summarized the charges in a TV special, “Global Warming: The Other Side,” which was shown in prime time at 9 p.m. Thursday, January 14, on KUSI-TV. The video may be seen on the KUSI website, www.kusi.com.

According to Coleman’s report, NOAA and NASA start with the unadjusted NOAA GHCN data. NASA then eliminates some stations and adds some in the polar regions. For NASA, the computer program that manipulates the data is known as GIStemp. NASA and NOAA both then add their own adjustments to calculate a global average temperature and a ranking for each month and year. The two inter-related U.S. Government agencies have so intertwined their programs and data sets that both are being challenged by the investigating team that has produced this “smoking gun of U.S. Climate-gate.”
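To make that pipeline concrete, here is a minimal sketch of the flow Coleman describes: raw station records go in, a subset of stations is selected, an adjustment is applied, and a global average and yearly ranking come out. Everything in it is a synthetic placeholder: the station names, temperatures, selection rule and adjustment are invented for illustration and are not NOAA’s or NASA’s actual code or values.

```python
# Illustrative sketch only: a synthetic stand-in for the GHCN -> GIStemp-style
# flow described above (station selection -> adjustment -> average -> ranking).
# None of the numbers or rules below are NOAA's or NASA's actual ones.
from statistics import mean

# Synthetic "raw" annual mean temperatures (deg C) per station and year.
raw = {
    ("station_cold_mtn", 1975): -2.1, ("station_cold_mtn", 2005): -1.8,
    ("station_city",     1975): 11.3, ("station_city",     2005): 12.1,
    ("station_coast",    1975): 16.4, ("station_coast",    2005): 16.9,
}

def select_stations(records, keep):
    """Step 1: keep only a chosen subset of stations (the contested step)."""
    return {(s, y): t for (s, y), t in records.items() if s in keep}

def adjust(records, offset=0.0):
    """Step 2: apply a uniform adjustment (placeholder for homogenization)."""
    return {k: t + offset for k, t in records.items()}

def global_average_by_year(records):
    """Step 3: average all remaining stations for each year."""
    years = {y for (_, y) in records}
    return {y: mean(t for (s, yy), t in records.items() if yy == y)
            for y in years}

# Run the toy pipeline with and without the cold-mountain station.
for kept in (["station_cold_mtn", "station_city", "station_coast"],
             ["station_city", "station_coast"]):
    avgs = global_average_by_year(adjust(select_stations(raw, kept)))
    ranking = sorted(avgs, key=avgs.get, reverse=True)  # warmest year first
    print(kept, {y: round(t, 2) for y, t in avgs.items()}, "warmest:", ranking[0])
```

Dropping the cold station raises both years’ averages by several degrees while leaving the 1975-to-2005 difference small; the dispute reported here is over whether the real selection step injects such shifts unevenly across time.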

“We suspect each center will try to hide behind ‘It’s them, not us’ and point fingers at each other,” says the computer programmer from San Jose behind these new revelations. He and a Certified Consulting Meteorologist from New Hampshire made their revelations public on January 14 in the KUSI-TV report.

Coleman said, “Perhaps that is why Dr. Richard Anthes, President of the University Corporation for Atmospheric Research, in testimony to Congress in March 2009 noted, ‘The present federal agency paradigm with respect to NASA and NOAA is obsolete and nearly dysfunctional in spite of best efforts by both agencies.’”

The National Weather Service uses the NCDC data in its record-temperature news releases, put out with much media fanfare on a regular basis as it declares that a given month or year has set a record for warmth, supporting the global warming agenda.

Also, the NCDC/NASA GISS data are regularly used by climate researchers at various research centers, and within university meteorology departments, doing studies to support the United Nations Intergovernmental Panel on Climate Change. This data is also shared with other global centers, such as the Climatic Research Unit at the University of East Anglia in England, whose files were recently hacked or leaked.

NCDC Deleted Data

Coleman also said that, according to E. Michael Smith and Joseph D’Aleo, the two men who did the research, there were no actual temperature records left in the computer database when it proclaimed “2005 WAS THE WARMEST YEAR ON RECORD.” In the transition to a computer averaging system, the National Climatic Data Center deleted actual temperatures at thousands of locations throughout the world as it evolved to a system of global grid boxes.

The number that goes into each grid box is determined by averaging the temperatures of two or more weather observation stations nearest that grid box. D’Aleo puts it this way: “Over 70 percent of the Earth’s surface is covered by water, and vast areas of land masses remain unpopulated as well. So it is reasonable to come up with some sort of grid method to simulate full global coverage. The problem arises because not all of the grid boxes have continuous temperature measurements from within them. So NCDC averages surrounding or nearby points and places that number in the box. In some cases those observations are from several hundreds of miles away. That produces a serious question: ‘Does the resulting number represent the average temperature for that region within meaningful limits?’” D’Aleo says it does not. “A vital issue,” he says, is that “temperatures are not linear over space, but instead vary enormously because of differences in terrain, elevation, vegetation, water versus land and urbanization.”
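As a hedged illustration of that fill-in step, the sketch below assigns a value to a station-less grid box by averaging its nearest stations and reports how far away those stations actually are. The coordinates, station names and temperatures are invented; the real NCDC procedure is more elaborate, and this only demonstrates the distance problem D’Aleo describes.

```python
# Illustrative sketch of the grid-box fill described above: when a box has no
# station of its own, average the nearest stations, however far away they are.
# Coordinates and temperatures are invented for the example.
from math import radians, sin, cos, asin, sqrt

def km_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres (haversine formula)."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = p2 - p1, radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Synthetic stations: (name, lat, lon, annual mean temp in deg C).
stations = [
    ("coastal_lowland", -12.0, -77.0, 19.5),
    ("jungle_lowland",  -12.5, -64.0, 26.0),
    ("distant_city",     -0.2, -78.5, 14.8),
]

def fill_grid_box(box_lat, box_lon, stations, n_nearest=2):
    """Assign a grid box the average of its n nearest stations,
    and report how far away those stations actually are."""
    ranked = sorted(stations,
                    key=lambda s: km_between(box_lat, box_lon, s[1], s[2]))
    chosen = ranked[:n_nearest]
    value = sum(s[3] for s in chosen) / len(chosen)
    distances = [(s[0], round(km_between(box_lat, box_lon, s[1], s[2])))
                 for s in chosen]
    return value, distances

# A high-altitude box with no station inside it (e.g. an Andean grid cell).
value, sources = fill_grid_box(-16.5, -68.2, stations)
print(f"grid-box value: {value:.1f} C, taken from: {sources}")
# The box inherits warm lowland readings from hundreds of kilometres away,
# which is exactly the representativeness problem D'Aleo raises.
```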

This problem is only the tip of the iceberg with the data being produced at NCDC. For one thing, it is clear that comparing data from previous years, when the final figure was produced by averaging a large number of temperatures, with figures produced from a much smaller temperature set with large data gaps is comparing apples and oranges. “When the difference between the warmest year in history and the tenth warmest year is less than three quarters of a degree, it becomes silly to rely on such comparisons,” Smith and D’Aleo say. But that is exactly what has been done in touting the late 1990s and the early 2000s as the warmest ten years in history. “It is clearly a travesty and agenda-driven by global warming advocates,” D’Aleo asserts.
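The apples-and-oranges point can be shown with trivial arithmetic. In the toy example below (all numbers invented), no station changes temperature at all, yet the average of the later, smaller set comes out far warmer simply because the cold stations are gone. Production datasets average anomalies rather than absolute temperatures partly to guard against exactly this, but the example shows why the authors insist the station mix matters.

```python
# Toy demonstration (invented numbers): the station mix alone can move the
# average by far more than the ~0.75 deg C separating ranked "warmest years",
# even when no individual station changes temperature.
early = {"arctic": -10.0, "mountain": -2.0, "plains": 10.0, "city": 12.0}
late  = {"plains": 10.0, "city": 12.0}  # same readings, cold stations dropped

avg_early = sum(early.values()) / len(early)  # all four stations
avg_late  = sum(late.values()) / len(late)    # only the warm stations remain

print(f"early-set average:  {avg_early:+.2f} C")  # +2.50 C
print(f"late-set average:   {avg_late:+.2f} C")   # +11.00 C
print(f"apparent 'warming': {avg_late - avg_early:+.2f} C with zero real change")
```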

For E. Michael Smith, this project was quite a test of his computer programming skills. “Opening, unraveling and understanding what is happening in a complex FORTRAN computer code, with 20 years of age and change in it, is a difficult and grueling task,” he says, “and the deeper I dug, the more amazing the details revealed. When doing a benchmark test of the program, I found patterns in the input data from NCDC that looked like dramatic and selective deletions of thermometers from cold locations.” Smith says after a while it became clear this was not a random strange pattern he was finding, but a well-designed and orchestrated manipulation process. “The more I looked, the more I found patterns of deletion that could not be accidental. Thermometers moved from cold mountains to warm beaches; from the Siberian Arctic to more southerly locations; and from pristine rural locations to jet airport tarmacs. The last remaining Arctic thermometer in Canada is in a place called ‘The Garden Spot of the Arctic,’ always moving away from the cold and toward the heat. I could not believe it was so blatant, and it clearly looked like it was in support of an agenda,” Smith says.
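Smith’s benchmark boils down to a census of which thermometers remain in the input files over time. Here is a hedged sketch of that kind of check against a made-up inventory in a loosely GHCN-like shape (station id, latitude, last year reported); the real analysis ran against NCDC’s actual files, which have a far richer format.

```python
# Sketch of the station-census check described above, on an invented inventory.
# Real GHCN files look different; this only shows the counting logic:
# how many stations survive into recent years, broken down by latitude band.
from collections import Counter

# (station_id, latitude_deg, last_year_reported) -- all synthetic.
inventory = [
    ("CA_arctic_1",  68.3, 1989), ("CA_arctic_2",  71.0, 1991),
    ("US_mtn_1",     39.1, 1992), ("US_city_1",    40.7, 2009),
    ("BR_coast_1",  -12.9, 2009), ("AU_airport_1", -33.9, 2009),
]

def band(lat):
    """Coarse latitude bands for the census."""
    return "60N+" if lat >= 60 else ("midlat" if lat >= 23.5 else "tropics+south")

still_reporting = Counter(band(lat) for _, lat, last in inventory if last >= 2005)
ever_reported   = Counter(band(lat) for _, lat, _ in inventory)

for b in ever_reported:
    kept, total = still_reporting[b], ever_reported[b]
    print(f"{b:>14}: {kept}/{total} stations still reporting after 2005")
# A pattern of zeros in the cold bands is the signature Smith says he found.
```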

Here are the numbers behind the startling findings of the new research paper. The number of actual weather observation points used as a starting point for world average temperatures has been reduced from about 6,000 in the 1970s to about 1,500 in the most recent years. Still more stations are dropped in related programs, and in the final NASA/GIStemp data file the count drops to about 1,000. “That leaves much of the world unaccounted for,” says Joseph D’Aleo of ICECAP.us and SPPI.org, who has released a research study of the global temperature pattern today. “Think of it this way,” he continues. “If Minneapolis and other northern cities suddenly disappeared but Kansas City and St. Louis were still available, would you think an average of Kansas City and St. Louis would provide an accurate replacement for Minneapolis, and expect to use that to determine how Minneapolis’ temperature has changed with any hope of accuracy?”

 

E. Michael Smith pointed out that the November 2009 “anomaly map” from GISS shows a very hot Bolivia, which is covered by high mountains. “One small problem: there have been no temperatures recorded in the NCDC data set for Bolivia since 1990. NASA/GISS have to fill in or make up the numbers from up to 1,200 km away. This is on the beach in Peru or in the Amazon jungle,” he said.
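The 1,200 km figure matches the station-influence radius in GISS’s published method, whose weighting tapers linearly to zero at 1,200 km. The sketch below applies that taper to an invented station list to infill an Altiplano cell that has no station of its own; the station names, coordinates and anomaly values are made up for illustration.

```python
# Sketch of distance-weighted infill for a station-less grid cell, using a
# linear taper (weight = 1 - d/1200 km) like the one in GISS's published
# method. All stations and anomaly values below are invented.
from math import radians, sin, cos, asin, sqrt

def km(a_lat, a_lon, b_lat, b_lon):
    """Great-circle distance in kilometres (haversine)."""
    p, q = radians(a_lat), radians(b_lat)
    h = sin((q - p) / 2) ** 2 + cos(p) * cos(q) * sin(radians(b_lon - a_lon) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

# Synthetic stations: (name, lat, lon, monthly anomaly in deg C).
reporting = [
    ("peru_coast",   -12.0, -77.0, +0.9),
    ("amazon_basin", -10.0, -65.0, +1.2),
    ("patagonia",    -45.0, -70.0, -0.3),  # beyond 1,200 km: weight is zero
]

cell_lat, cell_lon = -16.5, -68.2  # an Altiplano cell with no station of its own

weighted_sum = total_weight = 0.0
for name, lat, lon, anomaly in reporting:
    d = km(cell_lat, cell_lon, lat, lon)
    w = max(0.0, 1.0 - d / 1200.0)  # linear taper to zero at 1,200 km
    print(f"{name:>13}: {d:5.0f} km away, weight {w:.2f}")
    weighted_sum += w * anomaly
    total_weight += w

if total_weight > 0:
    print(f"infilled anomaly for the cell: {weighted_sum / total_weight:+.2f} C "
          "(computed without a single Bolivian reading)")
```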

He and D’Aleo say it is startling which temperatures have been dropped from the calculation. “A very high percentage of those dropped are from the more northern locations. Very few are left north of sixty degrees latitude.” “Clearly there is also a bias to leave in locations with warmer temperatures, i.e. from the arid areas and within the urban warmth of cities,” he adds. In the most heavily reduced list of locations, very few colder mountain stations are retained.

For more information: E. Michael Smith and Joe D’Aleo are interviewed as part of a report on this study in the television special “Global Warming: The Other Side” on KUSI-TV, channel 9/51, San Diego, California. That program is available on demand at KUSI.com. The detailed report by D’Aleo is available at http://icecap.us/images/uploads/NOAAroleinclimategate.pdf.

For more information, contact: E. Michael Smith at pub4all@aol.com.

Smith’s climate blog: http://chiefio.wordpress.com/gistemp/.

Contact Joseph D’Aleo at Jsdaleo6331@aol.com, or 603-689-5646.

D’Aleo’s website: http://www.icecap.us.

 
