Monday, December 14, 2009

SU's Green Data Center

Take all the computer servers for all of Syracuse University, put them in a single building, connect them to SU’s network, and what do you get? Well, theoretically, you should get a huge energy sink – a building that sucks up energy to power the servers and then sucks up even more energy to remove the waste heat generated by powering the servers, typically via mechanical refrigeration.

But, instead, SU has just opened a new building (called a “green data center”) which contains its computer servers and which has the potential to use only half the energy of conventional data centers. The building has been engineered to showcase an array of the most energy-efficient technologies available for data center power and cooling. This is important because it turns out that 1.5% of all electricity generated in the US is used to power data centers and the air conditioning systems that cool them.

How did SU do this? By calling on the engineering expertise of its faculty, led by Dr. Ez Khalifa of the Mechanical and Aerospace Engineering Department; SU’s Chief Information Officer, Chris Sedore; and the university’s incredibly talented campus design and construction team, led by Kevin Noble. And, by combining this talent with IBM research engineers such as Dr. Roger Schmidt, an esteemed member of the National Academy of Engineering.

Together, this team designed a new kind of “green” data center. Our data center:

1. Uses natural gas-powered microturbines located in the building to generate electricity to power the servers, so the building can operate completely off the grid if necessary;
2. Converts waste heat from the microturbines into cooling for the servers and into cooling or heating for an adjacent campus building;
3. Delivers cooling via water rather than the usual power-hungry air conditioning arrangement, permitting greater cooling efficiency;
4. Generates direct current (DC) on site and distributes it throughout the data center, rather than converting between alternating current and direct current, which eliminates the energy losses inherent in switching between the two;
5. Is completely instrumented, so that sensors monitor server temperature and usage and send exactly the right amount of cooling to each server;
6. Contains a battery room with enough electrical power to run the servers for nearly 20 minutes, and a propane storage tank for a full day of operation. This ensures that if the natural gas supply is interrupted and the power grid goes down, the servers will run long enough for an orderly shutdown. As Vijay Lund, IBM Vice President, has said, the data center is so reliable it is nearly “bullet-proof.”
7. Serves as a research and analysis center for IBM clients who want to build new energy-efficient data centers. It is also a research facility for LCS and Dr. Khalifa’s team to study and continue to optimize the design of green data centers, with the aim of achieving further energy savings through thermally-aware, energy-optimized operation.
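For readers curious how savings like these are usually tallied up, data center efficiency is often summarized by Power Usage Effectiveness (PUE): total facility energy divided by the energy delivered to the IT equipment. Here is a minimal sketch of that arithmetic; the load and PUE figures below are illustrative assumptions for the sake of the example, not SU’s measured numbers.

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 2.0 means every watt of computing costs a second watt of cooling
# and power-delivery overhead. All numbers here are illustrative assumptions.

def annual_energy_kwh(it_load_kw, pue, hours=8760):
    """Total facility energy over a year, given average IT load and PUE."""
    return it_load_kw * pue * hours

it_load_kw = 500  # hypothetical average server load for this example

conventional = annual_energy_kwh(it_load_kw, pue=2.0)  # typical 2009-era facility
optimized = annual_energy_kwh(it_load_kw, pue=1.3)     # tightly optimized cooling/power

savings = 1 - optimized / conventional
print(f"Estimated savings: {savings:.0%}")  # prints "Estimated savings: 35%"
```

With these assumed figures, better cooling and power delivery alone saves about a third of the energy; techniques like on-site cogeneration and waste-heat reuse, which fall outside the simple PUE ratio, are what can push the total toward the "half the energy" mark cited above.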

Below is a computer-generated image (using principles of computational fluid dynamics) from Dr. Khalifa’s team showing a hypothetical cooling scenario. The colored lines represent the paths of parcels of air as they flow from the floor (blue grid) into the computer servers (white grid) and then into room air conditioners. The changes in line color indicate the temperature change of the air parcel as it moves through the room.

To see the building and get a virtual tour from Chris Sedore and Dr. Ez Khalifa, click here. And see if you can spot me in the video!