Sandy takes out some sites - Gawker, HuffPo

GigaOm's Barb Darrow posts on a few sites knocked out by Con Ed turning off the power in lower Manhattan.

Not everyone’s data centers were ready for Hurricane Sandy. Several major media sites — Buzzfeed, Gawker, Huffington Post — took major hits early Monday evening Eastern Time. The problems were all attributed to the massive storm, which made landfall near Atlantic City, N.J., at around 8 p.m. local time.

Update: Many of these sites were impacted when Con Ed shut off power to lower Manhattan.

We'll see how many other sites join this list.

Which East Coast Data Center will run out of diesel fuel first?

The East Coast data centers say they are ready for Hurricane Sandy.

With Sandy on horizon, East Coast data centers on high alert

Providers test generators, staff up facilities, ensure fuel deliveries

29 October 2012 by Yevgeniy Sverdlik - DatacenterDynamics

Satellite image of Hurricane Sandy by the US NOAA

Data center operators on the US East Coast are bracing for Hurricane Sandy’s landfall, expected by meteorologists on Monday evening.

Providers with substantial data center presence in the region, including Equinix, Savvis, Telx and Sungard Availability Services, have taken the basic steps to make sure their facilities are prepared to stay operational through prolonged power outages.

The basic steps all data center providers that have responded to our inquiries have taken: testing back-up generators, making sure onsite fuel-storage tanks are full, and getting in touch with fuel vendors to ensure timely deliveries in case the fuel stored at the data center sites runs out.

The President has declared emergencies in nine states.

The President's action authorizes FEMA to coordinate all disaster relief efforts which have the purpose of alleviating the hardship and suffering caused by the emergency on the local population, and to provide appropriate assistance for required emergency measures, authorized under Title V of the Stafford Act, to save lives and to protect property and public health and safety, and to lessen or avert the threat of a catastrophe in all counties in the State of New Jersey.

Specifically, FEMA is authorized to identify, mobilize, and provide at its discretion, equipment and resources necessary to alleviate the impacts of the emergency.  Emergency protective measures, limited to direct federal assistance, will be provided at 75 percent federal funding. 

Those who have run data centers through federal emergency conditions know that a contract with a fuel vendor is not worth much when the federal gov't has established control of critical resources to address the emergency.

Elizabeth Turner has been named as the Federal Coordinating Officer for federal response operations in the affected area. 

FEMA's mission is to support our citizens and first responders to ensure that as a nation we work together to build, sustain, and improve our capability to prepare for, protect against, respond to, recover from, and mitigate all hazards.

Depending on how bad the power outages are and how refineries are impacted, diesel fuel could be extremely scarce. A data center could run out of fuel even if it has multiple contracts with fuel vendors.
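To make the fuel math concrete, here is a minimal back-of-the-envelope sketch in Python. The tank size, critical load, and burn rate are all assumed, illustrative numbers, not figures from any specific facility, but they show how fast an extended outage drains even a large tank.

```python
# Back-of-the-envelope diesel runtime estimate for a data center on generator
# power. All numbers below are illustrative assumptions, not figures from any
# specific facility.

TANK_GALLONS = 20_000     # assumed onsite fuel storage
CRITICAL_LOAD_MW = 5.0    # assumed load carried by the generators
GAL_PER_MWH = 70.0        # rough diesel burn per MWh generated

burn_gal_per_hour = CRITICAL_LOAD_MW * GAL_PER_MWH
runtime_hours = TANK_GALLONS / burn_gal_per_hour

print(f"Burn rate: {burn_gal_per_hour:.0f} gal/hour")
print(f"Runtime on a full tank: {runtime_hours:.1f} hours "
      f"({runtime_hours / 24:.1f} days)")
```

At those assumed numbers a full 20,000-gallon tank buys a bit over two days, which is why the refueling contracts matter so much.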

There is a way to get fuel during a FEMA-managed event. A smart data center guy asked this question over a year ago, and we talked about how to solve the problem. I found a guy who has been in the fuel business for years, supplying airlines and remote power plants with petroleum fuel, and he has been thinking about how you can make sure you get fuel.

BTW, one of the easier ways to get fuel during a disaster is to have your facility defined as critical infrastructure by gov't agencies. But that also brings a huge amount of oversight and regulation.

Wired's Steven Levy tells the Google Hamina Story - mines, views of Russia, sizzling water, swirling blood & Batman

Wired's Steven Levy has an insider story on Google's Hamina data center. It is entertaining. Here are a few parts that caught my eye - the kind of details I wouldn't put in a story, but I am not Steven Levy.

And as with most of Google data centers, the company’s secrecy fomented loony speculation about what those geeks were up to. In this case a rumor sprung up that Google had planted mines in the sea to keep away fishermen.

...

But the signature feature of the Hamina center arose from its coastal setting on the Gulf of Finland. This allows Google to claim, on a very clear day: “I can see Russia from my data center!” The border is only 40 miles away.

...

The pipes are color-coded: Blue represents cool water, and orange is hot. Both sea water pipes and those carrying hot water from The Floor go into giant heat exchangers whereupon the chilly seawater heats up and the sizzling data center water cools down. (Another connection to the sea is a thick fiber cable that Google submerged to connect the Hamina center to the rest of Europe.)

...

The space is a vast industrial ruin, big and high enough to entertain a reasonable amusement park. It has its own misty microclimate, the dust sometimes stirred by dive-bombing birds. Basically, it’s the kind of place where the early Batman might wind up fighting an all-star squad of marquee villains.

The highlight of the Hamina tour is tracking the journey of the seawater, the swirling blood of this data center.

I've been to Finland, and following a hot sauna with a dip in cold water is a custom there - one that is kept up as a tradition at the Hamina data center.

When Google bought the site, the sauna remained, and in keeping with its egalitarian ethic, the company opened this once-exclusive perk to all employees. Local Googlers accustomed to the true Finnish regimen are welcome to dash out of the hot steam room for a dunk into the same chilly seawater that cools Hamina’s servers.

Google shares its data center cooling best practice - water and hot aisle containment "hot huts"

Google has an end-user-friendly explanation of its data center cooling.

Our emphasis on cooling systems might come as a surprise, until you consider how warm a personal computer can become during use. Data centers, which house thousands of computers, need to stay within a specific operating temperature range. Even though we run our facilities hotter than a typical data center, we need cooling systems - both to prevent server breakdowns and to provide a reasonable working environment for technicians working on the data center floor.

After servers, the second largest consumer of power in a data center is the cooling system. We needed a cooling system which minimized our overall energy consumption. For this reason, we designed our own cooling systems from the ground up. 
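Google's point that cooling is the second-largest power consumer is easy to see with PUE-style arithmetic. A minimal sketch, with all the facility loads assumed for illustration:

```python
# Rough PUE arithmetic showing why cooling efficiency matters.
# All inputs are illustrative assumptions.

it_load_kw = 10_000    # assumed IT (server) load
cooling_kw = 2_000     # assumed cooling system draw
other_kw = 500         # assumed lighting, distribution losses, etc.

pue = (it_load_kw + cooling_kw + other_kw) / it_load_kw
print(f"PUE: {pue:.2f}")  # 1.25

# Halving the cooling load drops PUE noticeably:
pue_better = (it_load_kw + cooling_kw / 2 + other_kw) / it_load_kw
print(f"PUE with cooling halved: {pue_better:.2f}")  # 1.15
```

Since the cooling draw is the biggest non-IT term, it is the obvious place to design from the ground up.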

The interior of a hot hut row

Google uses hot aisle containment ("hot huts"), creating a higher delta T across the water cooling coils at the top of the hot huts.
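The reason a higher delta T matters is plain heat-transfer arithmetic: for a fixed heat load, the water flow you must pump is inversely proportional to the temperature rise across the coil. A rough sketch with assumed numbers:

```python
# Why a higher delta T across the cooling coils matters: for a fixed heat
# load, required water flow is inversely proportional to delta T.
# Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT)
# Numbers below are illustrative assumptions, not Google's figures.

HEAT_LOAD_KW = 500    # assumed heat picked up by one coil bank
CP_WATER = 4.186      # specific heat of water, kJ/(kg*K)

for delta_t in (5, 10, 15):  # coil delta T in Kelvin
    flow_kg_s = HEAT_LOAD_KW / (CP_WATER * delta_t)
    print(f"delta T {delta_t:2d} K -> {flow_kg_s:5.1f} kg/s of water")
```

Tripling the delta T cuts the required flow, and the pumping energy that goes with it, to a third.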

IBM has used water cooling in its supercomputers for years and even used the waste heat for heating homes.

SuperMUC combines its hot-water cooling capability, which removes heat 4,000 times more efficiently than air, with 18,000 energy-efficient Intel Xeon processors. In addition to helping with scientific discovery, the integration of hot-water cooling and IBM application-oriented, dynamic systems management software, allows energy to be captured and reused to heat the buildings during the Winter on the sprawling Leibniz Supercomputing Centre Campus, for savings of one million Euros ($1.25 million USD) per year.

Now, for those of you who think Google should use its waste heat to heat homes, there is the problem that Google's data centers are not close to residential areas or commercial businesses that can use the low-grade heat.
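Still, the scale of the SuperMUC savings passes a quick sanity check. A rough sketch where the load, the recovered fraction, the heating season, and the heat price are all assumptions for illustration, not IBM's actual numbers:

```python
# Sanity-check arithmetic on reusing data center waste heat for building heat.
# All inputs are assumptions for illustration; none are IBM's actual numbers.

it_load_mw = 3.0              # assumed average IT load
recovered_fraction = 0.8      # assumed share of waste heat captured as hot water
heating_season_hours = 5_000  # assumed hours per year the heat is useful
heat_price_eur_per_mwh = 50   # assumed cost of the heat it displaces

recovered_mwh = it_load_mw * recovered_fraction * heating_season_hours
savings_eur = recovered_mwh * heat_price_eur_per_mwh
print(f"Recovered heat: {recovered_mwh:,.0f} MWh/season")
print(f"Displaced heating cost: ~{savings_eur:,.0f} EUR/year")
```

With those guesses you land in the ballpark of the quoted one million Euros a year.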

In some data centers there is a hard-and-fast rule of keeping water out of the data center, but if you want to be the most efficient you need to break some rules.

Google's Data Center Videos: 2.2 mil views in 1 week vs. 1.3 mil views in 2 1/2 yrs

Google made a lot of news with its data center photography and video.

One way to gauge how well the videos have done is to look at the traffic. The latest video exceeded the first video's lifetime views in less than a week, versus 2 1/2 years of steady views.

The Latest Video.

Video statistics - Views and discovery

Views: 2,228,718

Key discovery events:

A. First referral from: Google - Oct 11, 2012 - 76,628 views
B. First embedded on: wired.com - Oct 16, 2012 - 53,618 views

The first Container Data Center video in 2009.

Video statistics - Views and discovery

Views: 1,315,216

Key discovery events:

A. First embedded on: www.google.com - Apr 7, 2009 - 14,171 views
B. First embedded on: blogoscoped.com - Apr 7, 2009 - 15,559 views
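To put the gap in numbers, a quick rate comparison using the view counts above (elapsed times are approximate):

```python
# Quick rate comparison of the two videos' view counts (figures from the
# YouTube stats above; elapsed times are approximate).

latest_views, latest_days = 2_228_718, 7        # ~1 week after posting
first_views, first_days = 1_315_216, 365 * 2.5  # ~2.5 years since Apr 2009

latest_rate = latest_views / latest_days
first_rate = first_views / first_days
print(f"Latest video: ~{latest_rate:,.0f} views/day")
print(f"2009 video:   ~{first_rate:,.0f} views/day")
print(f"Ratio: ~{latest_rate / first_rate:.0f}x")
```

Roughly two orders of magnitude more views per day for the new video.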