Building your first data center? Learn some lessons from Microsoft, which says it can build for 50% less

Building your first data center can be a challenge. Many have tackled this task over the past few years - Microsoft, Yahoo, Intuit, ask.com, eBay, Apple, and Facebook. Building your first is an opportunity to consolidate your IT loads and reduce costs. Given the difficulty of getting all your ducks in a row to get the project going, the budget for the first data center can be over $250 million.

DataCenterKnowledge reports on Microsoft's latest Quincy data center.

The new data center is being built next to Microsoft’s existing 470,000 square foot data center in Quincy, which was built in 2007 and is now one of the largest data centers in the world. But the new facility will be dramatically different in both its cost and design. After years of investing up to $500 million in each data center project, Microsoft plans to spend about $250 million or less on each data center going forward.

One trap I have seen many fall into is building a big data center as their first. Why? Well, part of what drives this is that data centers are the highest-profit-margin business for the construction industry, and there are plenty of people who will tell you bigger is better. The analysts will help you justify that a $250 million data center is the sweet spot for getting an ROI.

But a different way of thinking about this problem is to build ten $25 million data centers instead of one. The first one may be a bit more than $25 million, but you can cut costs on the next, and the next. Then after your third, you realize, "Hey, there is a different way we can be doing this. Let's change the design." You build three more, then you go, "Wow, we learned a lot. Let's really push for something innovative." The last three now cost $12.5 million instead of $25 million.
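To see how the arithmetic could play out, here is a quick sketch. The per-build figures are my own illustrative assumptions fitted to the scenario above, not Microsoft's actual numbers:

```python
# Illustrative cost of ten iterative builds vs. one monolithic build.
# All per-build figures (in $M) are assumptions for the sake of the sketch.

monolithic = 250.0  # one big $250M data center

iterative = [
    30.0,                    # first build runs a bit over $25M
    25.0, 22.0,              # costs trimmed on the next two
    20.0, 18.0, 16.0,        # builds 4-6 after a design change
    14.0,                    # another incremental improvement
    12.5, 12.5, 12.5,        # the last three at half the original cost
]

total = sum(iterative)
print(f"Ten iterative builds: ${total:.1f}M vs ${monolithic:.0f}M monolithic")
# → the iterative path comes in well under the monolithic build
```

The exact numbers matter less than the shape of the curve: each build is a chance to learn, and the savings compound.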

This is what Microsoft has done, but spending $500 million per data center. They built Quincy 1, San Antonio, Dublin (air-side economizer), and Chicago (containers). And the 4th-generation data center is next.


One additional benefit of building a $25 million data center is that you don't end up with consultants, designers, and construction companies swarming to get your business. If you choose an incremental data center design, you'll learn a lot about what is real and what is hype. Google, Microsoft, and Amazon can do this; why can't you?

BTW, another thing Microsoft has done is figure out how to build the 4th-generation data center faster than the 1st generation. Part of the reason the first data center is so big is that it was so hard to get the project going. Speed is important in addition to capabilities.

I've discussed these ideas with a few data center designers, and we have used the metaphor that data centers are designed like battle tanks. But not all businesses are the same, so not all data centers should be the same. If you have geo-redundant software like Google, Amazon, and Microsoft, it can be more cost-effective to build different data center types, for the same reason there are light and heavy tanks.

Which brings up another benefit of the Microsoft 4th-generation data center: the design is not in a concrete bunker, which means it could be moved much more easily if need be.

This next-generation design allows Microsoft to forego the concrete bunker exterior seen in the original Quincy facility in favor of a steel and aluminum structure built around a central power spine. The data centers will have no side walls, a decision guided by a research project in which the company housed servers in a tent for eight months.

What happens if you focus on building iterative data centers with a range of capabilities that adapt to business needs and can be moved if business or power conditions change in a location? Doesn't this sound like a better way to spend $250 million? But the data center ecosystem is not going to promote this idea, as it changes their profits and business models.

Microsoft, Google, and Amazon's battle for cloud computing is going to continue to drive some of the most innovative thinking.  And you don't have to wait to start thinking like they do.


Freedom to think of things others don't, accepting different belief structures - Human Factors and the Data Center

Last week I blogged about some big thinking I participated in, in Portland, with a cloud computing director and ten others. One of the things I do in big meetings is drop into an analyst mode: watching the conversations, saying little, listening, and watching the dynamics in the meeting. Normally, I would be discussing big ideas, but with 12 people in the room, plenty of brain power going, and an extremely smart guy presenting, I could be quiet. I frequently find I learn more and figure out things by being quiet and watching the dynamics between the people. Isn't it funny how your brain stops listening when you want to talk?

One of the entertaining moments was when a VC came into the meeting late and spent 2 minutes telling the group how important he is and how much influence he has. I didn't say one word to him, even though he was the man with the money. I had more important conversations, and for a person like this, it is often difficult to explain my role, and that this meeting wouldn't be happening if I hadn't been architecting the solution.

I was sitting next to my friend, and we scribbled notes and whispered ideas during the presentation, taking the time to highlight important concepts. One of the big concepts was modeling different belief structures to interpret data differently, which allows you to put data in the context of the user.

A side story: I worked with some data center construction guys, and I found their belief and value system was totally different from what I had assumed. The construction guys thought I was not that smart the more I worked with them. What I came to understand is that their value and belief system was brittle when exposed to openness and transparency, which is a requirement for building a knowledge model for the data center. Luckily I escaped that project. In the process, I learned a valuable lesson in why it is so hard to bridge thinking across data center design, construction, and operation. Most people are fixed in their belief and value system; they can't translate what others do into their beliefs, and vice versa. Openness and transparency are not compatible with many existing approaches in data centers, where keeping things secret is a standard practice. Also, keeping secrets maximizes control and profits for the suppliers, as the customers are mystified by the black-magic skills required to build a data center.

Note: I have met other data center construction guys who don't exhibit this behavior, so don't think I mean everyone in data center construction is this way. And I have met data center designers who believe data center efficiency can be achieved with simpler designs that are easier to operate and maintain when the black magic is not part of the design.

After this lesson, I've spent more time analyzing people and companies for how well they fit in open approaches like we intend to use in the "Open Source Data Center Initiative."

While most people in the meeting were down in the details reviewing the ideas presented, I was watching the people and their beliefs, trying to figure out if they accept other people's belief systems. The more arrogant a person is, the less they accept another person's view as being right in its context.

How many different belief systems do you accept as valid in the data center system?

Executive & CIO

Business Unit VP

Facility Operations

IT Operations

Application/Services Operations - Dev and Test

Enterprise Architect

Security

Networking

Database & Storage

Finance

Public Relations

Environmental Impact & Sustainability

Customers

Partners

Suppliers

Government, Finance, and Compliance Regulations

One view I haven't heard is Human Factors. I wrote the above yesterday but knew it was not finished enough to post; this morning at 6 a.m. it clicked. One view that touches almost all of the above, but is not discussed, is the holistic view from Human Factors. I studied Human Factors in college and believed it was key to being a better Industrial Engineer. When I interviewed at IBM, one of the questions was, "How do you know what to change?" Being young and naive, I said you have to care about the people. The IBM engineers probably thought I was a leftist tree-hugging radical thinker, as I was graduating from UC Berkeley.

What is Human Factors?

Human factors involves the study of all aspects of the way humans relate to the world around them, with the aim of improving operational performance, safety, through life costs and/or adoption through improvement in the experience of the end user.

An area where Human Factors shows up in most data centers is in facilities due to the maturity of the equipment used, regulations like OSHA, and safety requirements around large mechanical and power systems.  But, the application of Human Factors in data centers is relatively new.  In talking to Mike Manos, he described how Microsoft designed its data centers to make it easier to receive fully assembled racks and deploy the heavy racks to their location.

In software and hardware, User Interface design is a more popular term.

In the industrial design field of human-machine interaction, the user interface is (a place) where interaction between humans and machines occurs. The goal of interaction between a human and a machine at the user interface is effective operation and control of the machine, and feedback from the machine which aids the operator in making operational decisions. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to or involve such disciplines as ergonomics and psychology.

Human Factors and User Interface are discussed in isolated areas within the data center, but I don't think I have heard anyone discussing human factors and data centers in the same breath.

The #1 risk to data center operations is human-related. How much did it cost Microsoft and Hitachi Data Systems for the T-Mobile data loss disaster that was a human error?

While users will be relieved that their information looks likely to be recovered, the episode poses several questions over the competence of Danger’s staff; the technical ability of contractor Hitachi Data Systems; and the inherent stupidity of the Cloud concept.

While we are unlikely ever to be told the full story, it looks very much as if Hitachi’s attempts to upgrade Danger’s Storage Area Network failed big time and that the data was put at risk not by hardware failure, but by good old-fashioned human error.

This one event, which involved multiple human errors, did hundreds of millions of dollars of damage to Hitachi and Microsoft. Can Hitachi sell a storage system? Can Microsoft sell its smartphones?

This problem was caused by people who didn't spend the time to think about how people interact with the data center systems.

This is one of my more rambling posts; there are some good ideas here, but I need to think about them a bit more.


Mike Manos presents on data centers' CO2 impact, with Yahoo and Koomey supporting the issue

DataCenterKnowledge has a post on an Uptime presentation by Mike Manos on data centers' CO2 impact.

  • Manos: Industry Must Prepare for ‘CO2K’

    May 19th, 2010 : Rich Miller

    Mike Manos of Nokia speaks Tuesday at the Uptime Institute Symposium 2010 in New York.

    In calling the data center industry to prepare for carbon regulation, Mike Manos invoked the Y2K crisis of the late 1990s, warning that “CO2K” threatens to be similarly disruptive.

It's great to see Mike Manos use his speaking spot to discuss carbon impact.

Jonathan Koomey supports the same issues.

The impact of a “carbon tax” was also highlighted by data center energy expert Jonathan Koomey, who said the issue is “not on the radar screen” of corporations.

‘A Price for Carbon’
“There will be a price for carbon,” Koomey said in his Monday keynote at Uptime. “We have to start thinking about how that price affects the economics of data centers. Carbon taxes will have an impact on where you locate your data centers.”

Koomey used the framework of the UK’s recently enacted Carbon Reduction Commitment (CRC) to illustrate the potential impact. At the CRC rate of $19 per ton of carbon emissions, a 130,000 square foot data center with coal-sourced utility power might pay an additional $5 million a year.

“That’s real money,” said Koomey. “If you have a data center in a place that’s all coal, that’s the business risk you’re taking on.”
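Koomey's $5 million figure can be sanity-checked with a quick back-of-the-envelope sketch. The facility load and emissions factor below are my assumptions for illustration; only the $19-per-ton CRC rate comes from the talk:

```python
# Back-of-the-envelope carbon cost for a coal-powered data center.
# The load and emissions factor are assumed; the $19/ton rate is from the CRC.

HOURS_PER_YEAR = 8760

def annual_carbon_cost(load_mw, tons_co2_per_mwh, price_per_ton):
    """Annual carbon cost in dollars for a constant electrical load."""
    mwh = load_mw * HOURS_PER_YEAR
    tons = mwh * tons_co2_per_mwh
    return tons * price_per_ton

# Assumed: a 130,000 sq ft facility drawing ~30 MW total,
# coal at ~1 ton CO2 per MWh, CRC rate of $19 per ton.
cost = annual_carbon_cost(30, 1.0, 19)
print(f"${cost:,.0f} per year")  # → $4,993,200 per year, close to Koomey's $5M
```

The takeaway is that carbon pricing turns your grid's emissions factor directly into a line item, which is why siting matters.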

And Yahoo's Christina Page chimes in too.

Manos’ assessment of the role of data centers was echoed by other speakers at the Uptime event. Yahoo initially bought offsets to address its carbon output, according to Christina Page, the company’s director of Climate and Energy Strategy. But the company soon shifted its focus to improving the energy efficiency of its data centers.

75 Percent of Carbon Footprint
“We quickly realized that 75 percent of our carbon footprint was from data centers,” said Page. “The best opportunities for leadership were in that area as well.”

Facebook is currently catching flak for its coal-powered data center in Prineville, OR. The count is up to 442,000 members on English, Spanish, and French Facebook pages asking for 100% renewable energy for Facebook.

Start measuring your carbon impact and think about how you can lower your carbon impact.


Breaking the rules for Data Center Site Selection: HP discusses farm waste as an energy supply

One of the smart people I get to have regular conversations with is Pat Kennedy, Founder and CEO of OSIsoft.  Pat is the one who got me thinking about green data centers when he asked a simple question three years ago, "how do you measure the power consumption of an application in a data center?"  This got me started down a whole path of monitoring and modeling.

One of the latest topics Pat and I have discussed is microgrids. Google thinks about this too. See this Google video; I can see some of the Google data center team in the audience.

HP is making news today with their paper on a microgrid for data centers powered by cow manure.


ABSTRACT
In this paper, we design a supply-side infrastructure for data centers that runs primarily on energy from digested farm waste. Although the information technology and livestock industries may seem completely disjoint, they have complementary characteristics that we exploit for mutual benefit. In particular, the farm waste fuels a combined heat and power system. The data center consumes the power, and its waste heat feeds back into the combined system. We propose a resource management system to manage the resource flows and effluents, and evaluate the direct and indirect economic benefits. As an example, we explain how a hypothetical farm of 10,000 dairy cows could fulfill the power requirements of a 1MW data center.
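The paper's headline number works out to roughly 100 W of net electrical output per cow. Here is a rough sketch of how that could pencil out; the biogas yield, energy content, and generator efficiency below are all my own assumed figures, not HP's:

```python
# Rough per-cow electrical output from digested dairy manure.
# Every figure below is an illustrative assumption, not from the HP paper.

BIOGAS_M3_PER_COW_DAY = 1.5   # assumed biogas yield per cow per day
KWH_THERMAL_PER_M3 = 6.0      # assumed energy content of biogas
ELECTRICAL_EFFICIENCY = 0.30  # assumed generator electrical efficiency

def farm_output_mw(cows):
    """Continuous electrical output in MW for a herd of the given size."""
    kwh_per_cow_day = (BIOGAS_M3_PER_COW_DAY * KWH_THERMAL_PER_M3
                       * ELECTRICAL_EFFICIENCY)
    watts_per_cow = kwh_per_cow_day * 1000 / 24  # kWh/day → average watts
    return cows * watts_per_cow / 1e6

print(f"{farm_output_mw(10_000):.2f} MW for 10,000 cows")  # roughly 1 MW
```

Under these assumptions, 10,000 cows land right around the 1 MW the abstract describes, though real yields vary widely with diet, digester design, and climate.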

Pat Kennedy long ago was making the point that data centers could be a lot more efficient if sites were chosen next to power generation, biomass, and/or other large consumers of power. But this idea is controversial, in that a standard practice for data center risk reduction is to place data centers far away from hazardous materials. I think a large methane store would typically get classified as a risk to a data center. So, if you are totally risk averse and don't pay the power bill, why not skip the site with methane? Most would.

Plus, there are risks that HP doesn't mention in its brief statement on financial and associated risks.

Financial cost and associated risks are perhaps the most important consideration. Existing farms that have invested in supply-side infrastructure often do so only if a power-purchase agreement can be signed. Otherwise, the return could be too speculative to justify the capital investment. A data center has substantial, continuous, and long-term power needs. Thus the data center owner could sign the power purchase agreement and provide the assured return desired by the farmer.

You are now dependent on a farm. What is the #1 risk to your manure production? Water!!! When there is a drought, there is an impact on agricultural production, and cattle need a lot of water. This article says it takes 2,000 gallons of water to make a gallon of milk.

It can take up to 2,000 gallons of water to produce one gallon of milk. The cow needs water to perform basic biological functions from day to day, and only a fraction of the water the cow consumes is actually converted into milk. The fact that it takes so much water to produce cow's milk means that anytime you or any consumer chooses to drink milk, the burden you place on the natural environment is a thousand times greater than if you were to consume water itself. Drinking one gallon of milk is like pouring 1,999 gallons of fresh water down the drain.

Actually putting a data center into operation using a farm carries risks like water and methane gas. There are a bunch of other issues, like water, that can be addressed.

Mike Manos and I regularly discuss that water is the next scarce resource for data centers. Be careful not to overlook the secondary and tertiary effects of a change in the water supply.

I congratulate the guys at HP for creating more awareness that a microgrid data center strategy has merit.


Blogging like a Dog

I've been on a writing frenzy the last few days, and my family says I am on the computer too much.

I can't tell if the dog is telling me I spend too much time at the computer or she figures one way to get me to stop is if she is using it.  :-)

