IT systems run in exposed shed to prove reliability point

FRAMINGHAM (09/20/2011) - In an experiment that began in January, servers, networking gear and storage systems have been running in a simple shed without failure.

This experiment is giving David Filas, a data center engineer at the healthcare provider Trinity Health, the ammunition he needs to argue that IT equipment is a lot tougher than most think.

Through winter, spring and summer, these decommissioned systems have kept running despite big variations in temperature and humidity. And the uptime of the systems has been better than what Google and Amazon have delivered so far this year.

Filas wants to convince IT administrators at his company, which runs 47 hospitals and other health care facilities, that it’s OK to raise the temperature in data centers. But the IT staff has been reluctant to do so, he says.

The project was inspired by something Microsoft did a few years back. From November 2007 to June 2008, Microsoft employees ran five Hewlett-Packard servers in a tent and reported “zero failures or 100% uptime.”

Filas is running his equipment in a generator shed at the healthcare firm’s headquarters in Novi, Michigan, a suburb of Detroit.

A block heater on the generator provides some warmth, but otherwise “it’s more or less exposed to the same temperature and humidity conditions as the outdoors,” said Filas, who presented his work at the Afcom data center conference last week in Orlando.

The temperature inside the shed has ranged from 31 degrees Fahrenheit to nearly 105 degrees. The relative humidity has ranged from nearly 8% to about 83%. But the door of the shed has been accidentally left open a few times, once when the temperature reached 5 below zero.

Filas even tossed sawdust in the shed to make a point about the ability of these systems to handle dust. The dust issue pops up when arguments are made for using outside air to cool data centers, he said.

“I’m trying to dispel the myth that the data center has to be a clean room because it doesn’t; today’s electronics are extremely resilient,” said Filas.

The equipment that is running in this experiment was pulled out of production three to four years ago. There are about a dozen pieces of equipment in this test, including HP servers, Cisco switches, and an IBM disk array.

The plan had been to keep these systems running until January, but Filas said he may extend that and add some workloads to the systems to address a criticism that it isn’t a true test. He is considering networking the equipment and putting it under a heavy load.

Filas said he didn’t expect the systems to fail, but he is nonetheless surprised at how well the mechanical components, the hard drives in particular, have held up.

There hasn’t been a single hard drive failure, he said. The Cisco equipment has been especially resilient. Some Cisco equipment has a manufacturer’s upper temperature limit of 104 degrees, but Filas knows from experience that these systems can handle much more.

“Through unfortunate cooling outages, we have even had everything in a data center shut down except the Cisco equipment. The temperature was 117 degrees, and the Cisco equipment was purring away,” Filas said.

Filas argues that there is no reason the inlet temperatures on equipment cannot be between 80 and 82 degrees, which has been his goal in the main data center. He considers that an ideal range for a data center, one that also leaves a bit of a safety margin.

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) has also been raising its temperature recommendations as data center equipment has improved, to an upper limit of 81 degrees, and is expected to increase the ranges again.

“I want my IT staff to be more comfortable with the higher temperatures,” Filas said. “They are accustomed to having it be 65 degrees in the data center and they get very nervous when I dial up the temperature, even to the mid-70s.”

“I’m trying to dispel the myth among my own staff that it has to be that cold, because it doesn’t,” Filas said.

Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick’s RSS feed. His e-mail address is pthibodeau@computerworld.com.


