SearchDataCenter.com reports from AFCOM with a joint interview with Microsoft's Mike Manos and Christian Belady, titled "Microsoft spills the beans on its data center strategy at AFCOM."
On Tuesday, April 1, Microsoft Senior Director of Data Center Services Michael Manos delivered the keynote presentation at AFCOM's Data Center World conference. The company has opened its kimono in non-Redmond fashion -- sharing its insights on data center operations with anyone who's interested. We spoke with Manos and Microsoft Principal Power and Cooling Architect Christian Belady about Microsoft's experience with a rapidly expanding data center footprint, the problems the company has faced and challenges for the industry ahead.
What does Microsoft have to offer the AFCOM attendee?
Michael Manos: Most of the presentation focuses on two things. One is to talk about the challenges we've faced at Microsoft. But more importantly, we're going to talk about what everyone at this conference is going to face over the next two to three years and, to a large degree, show how Microsoft has solved these problems.
How much of the secret sauce of operating your data centers can you give away without losing the competitive advantage?
Manos: What's competitive advantage, and what's the right thing to do? You see people solving the same problems in different ways over and over. There's no key driver or direction in the industry, because each of us is solving the same problem 30 other people just solved. We have to share the findings each of us is coming up with in order to make an impact on the industry at large.
Christian Belady: The industry is very fragmented, and efficiency opportunities are being lost. If we share and others share, we start converging on a vision of what the future should be.
Speaking of convergence, it seems like the message has taken hold in terms of infrastructure efficiency metrics like power usage effectiveness. Lots of data centers now work to make the power-and-cooling infrastructure as efficient as possible. But when will we get to the next step: measuring useful work? For example, what is the usefulness of an "efficient" server that runs an application twice a month?
Manos: I think it's coming. Some [of our] product groups have started to make the transition. You can't get there without effective monitoring in place. Also, exposure of that information to the developers is key. Most developers never think about energy, but we have a program that charges our developers for the energy they use. Measuring and exposing that internal chargeback brings focus to the product groups. You can't get there unless you can effectively measure what you're doing and expose it.
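The chargeback model Manos describes can be sketched in a few lines. This is a hypothetical illustration, not Microsoft's actual system; the product groups, usage figures, and energy rate are all invented for the example.

```python
# Hypothetical internal energy chargeback: bill each product group for
# the energy its servers consume, so developers see the cost they create.
# All names, usage figures, and the rate below are invented.

ENERGY_RATE_PER_KWH = 0.08  # assumed utility rate, dollars per kWh

# Metered energy use per product group over a billing period (kWh)
usage_kwh = {"search": 120_000, "mail": 95_000, "maps": 40_000}

# Compute each group's bill from its measured consumption
chargebacks = {group: kwh * ENERGY_RATE_PER_KWH for group, kwh in usage_kwh.items()}

# Expose the numbers, largest bill first, so the cost is visible
for group, cost in sorted(chargebacks.items(), key=lambda kv: -kv[1]):
    print(f"{group}: ${cost:,.2f}")
```

The point of the exercise is the exposure step: once consumption is metered per group and priced, energy stops being invisible to the people writing the software.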
Belady: We're looking at using containers inside our future data centers. One of the things we like about them is we can take a bunch of servers and look at the output of that box and look at the power it draws. At the end of the day, we can determine, "What is the IT productivity of that unit? How many search queries were executed per box? How many emails sent or stored?" You can get into some really interesting metrics. A lot of people say you can't look at the productivity of a data center, but if you compartmentalize it -- not as small as the server level, but at some chunk in between -- you can measure productivity.
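Belady's compartmentalized productivity idea can be sketched alongside the facility-level PUE metric raised in the question. This is a minimal illustration with invented numbers, assuming PUE is computed the standard way (total facility power divided by IT equipment power) and productivity as useful work per unit of IT energy at the container level.

```python
# Two metrics from the discussion above, with invented example figures.

def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_kw

def container_productivity(queries_served: int, energy_kwh: float) -> float:
    """Useful work per unit of energy, measured at the container level."""
    return queries_served / energy_kwh

# A facility drawing 1,500 kW total to power 1,000 kW of IT load
facility_pue = pue(total_facility_kw=1500.0, it_kw=1000.0)

# A container that served 2.4M search queries on 800 kWh of energy
queries_per_kwh = container_productivity(queries_served=2_400_000, energy_kwh=800.0)

print(f"PUE: {facility_pue:.2f}")                 # 1.50
print(f"Queries per kWh: {queries_per_kwh:,.0f}")  # 3,000
```

PUE captures infrastructure overhead only; the container-level ratio is the kind of "useful work" measure Belady argues becomes tractable once you meter at some chunk between the server and the whole data center.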
I've heard rumors that Google is reconsidering its data center disclosure given Microsoft's big moves. Wouldn't it be great if we had Google and Microsoft competing to show who has the most efficient data centers and who is greener?
Google chose the path of being an electric company with its renewable energy initiative. Microsoft took a different path and chose to help people immediately save energy in their data centers. Who chose the greener path? Let's see what Mike Manos presents at his next keynote at the Uptime Institute's Symposium 2008.