Data Center Construction Competition – DLR, MSFT, GOOG, AMZN – The Winner is?

Data center construction is a competitive business and we hear lots about the top companies – Digital Realty Trust (DLR), Microsoft (MSFT), Google (GOOG), and Amazon (AMZN). 

Who is winning?

Well, if you want to judge based on stock price over the last 5 years, check out this stock performance chart. It looks like the winner is Digital Realty Trust (DLR).

[Stock performance chart: DLR vs. MSFT, GOOG, and AMZN over the last 5 years]

It is interesting to wonder what would have happened if MSFT, GOOG, and AMZN were just in the data center business like DLR.

Also, keep in mind DLR is a REIT, so it pays out its profits as dividends and currently has a 3.20% yield.
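For reference, the yield figure is simple arithmetic: annual dividend per share divided by share price. Here is a minimal sketch with hypothetical numbers (the actual dividend and price change daily):

```python
# Hypothetical figures for illustration only -- check a current quote for real values.
annual_dividend_per_share = 1.32   # assumed annual dividend, USD
share_price = 41.25                # assumed share price, USD

# Dividend yield = annual dividend / share price
dividend_yield = annual_dividend_per_share / share_price
print(f"Dividend yield: {dividend_yield:.2%}")   # ~3.20% with these assumed numbers
```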


Bridging the Gap between Facilities and IT, with a Carbon Footprint discussion

I was listening to Mike Manos’s Chiller Side Chat discussion hosted by DataCenterKnowledge’s Rich Miller.  One point Mike made was the challenge of facilities and IT having different views of what the data center needs to be.  Many of the questions were about power and cooling systems, so I think most of the listeners didn’t know how to address Mike’s challenge of getting facilities and IT to work together.


Later in Mike’s conversation he discussed the issue of the carbon footprint from power production as an issue for data centers and the need to think about this as part of a green data center strategy.  In fact, Mike mentioned green IT often.

I was going to ask Mike, “What is an example of something you have found that works to bridge the gap between facilities and IT?”  Unfortunately, there were technical problems, so I couldn’t ask the question.

So I came up with my own answer: “Discuss the carbon footprint of site selection as facilities and IT personnel evaluate locations.”  Getting to the lowest carbon footprint requires the teams to work together to evaluate the alternatives and the trade-offs.
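To make that concrete, a rough way the two teams can compare candidate sites is annual energy (IT load × PUE × hours) multiplied by the local grid’s carbon intensity. Here is a minimal sketch; the loads, PUEs, and emission factors are assumptions for illustration, not numbers from Mike’s talk.

```python
# Rough annual carbon footprint comparison for candidate data center sites.
# All numbers below are hypothetical, for illustration only.

HOURS_PER_YEAR = 8760

def annual_co2_tonnes(it_load_mw, pue, grid_kg_co2_per_kwh):
    """Annual CO2 in metric tonnes: IT load * PUE * hours * grid carbon intensity."""
    total_mwh = it_load_mw * pue * HOURS_PER_YEAR
    return total_mwh * 1000 * grid_kg_co2_per_kwh / 1000  # kWh * kg/kWh -> kg -> tonnes

# Hypothetical candidate sites: (description, assumed PUE, assumed kg CO2 per kWh)
sites = [
    ("Coal-heavy grid, cheap power", 1.5, 0.85),
    ("Hydro-heavy grid, cooler climate", 1.3, 0.05),
]

for name, pue, intensity in sites:
    print(f"{name}: {annual_co2_tonnes(10, pue, intensity):,.0f} tonnes CO2/yr")
```

Run the numbers and the cheap-power site can carry more than an order of magnitude more carbon, which is exactly the kind of trade-off facilities and IT have to weigh together.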

I know Mike uses this strategy as he selects sites for Digital Realty Trust and its customers.

Do you?


KC Mares Asks The Tough Questions, Rewarded by PUE of 1.04

DataCenterKnowledge has a post on Ultra-Low PUE.

Designing for ‘Ultra-Low’ Efficiency and PUE

September 10th, 2009 : Rich Miller

The ongoing industry debate about energy efficiency reporting based on the Power Usage Effectiveness (PUE) metric is about to get another jolt. Veteran data center specialist KC Mares reports that he has worked on three projects this year that used unconventional design decisions to achieve “ultra-low PUEs” of between 1.046 and 1.08. Those PUE numbers are even lower than those publicly reported by Google, which has announced an average PUE of 1.20 across its facilities, with one facility performing at a 1.11 PUE in the first quarter of 2009.

KC’s post has more details.

Is it possible, a data center PUE of 1.04, today?

I’ve been involved in the design and development of over $6 billion of data centers, maybe about $10 billion now, I lost count after $5 billion a few years ago, so I’ve seen a few things. One thing I do see in the data center industry is more or less, the same design over and over again. Yes, we push the envelope as an industry, yes, we do design some pretty cool stuff but rarely do we sit down with our client, the end-user, and ask them what they really need. They often tell us a certain Tier level, or availability they want, and the MWs of IT load to support, but what do they really need? Often everyone in the design charrette assumes what a data center should look like without really diving deep into what is important.

And KC asks the tough questions.

Rarely did I get the answers from the end-users I wanted to hear, where they really questioned the traditional thinking and what a data center should be and why, but we did get to some unconventional conclusions about what they needed instead of automatically assuming what they needed or wanted.

We questioned what they thought a data center should be: how much redundancy did they really need? Could we exceed ASHRAE TC9.9 recommended or even allowable ranges? Did all the IT load really NEED to be on UPS? Was N+1 really needed during the few peak hours a year or could we get by with just N during those few peak hours each year and N+1 the rest of the year?

KC provides background we wish others would share.

Now, you ask, how did we get to a PUE of 1.05? Let me hopefully answer a few of your questions: 1) yes, based on annual hourly site weather data; 2) all three have densities of 400-500 watts/sf; 3) all three are roughly Tier III to Tier III+, so all have roughly N+1 (I explain a little more below); 4) all three are in climates that exceed 90F in summer; 5) none use a body of water to transfer heat (i.e. lake, river, etc); 6) all are roughly 10 MWs of IT load, so pretty normal size; 7) all operate within TC9.9 recommended ranges except for a few hours a year within the  allowable range; and most importantly, 8) all have construction budgets equal to or LESS than standard data center construction. Oh, and one more thing: even though each of these sites have some renewable energy generation, this is not counted in the PUE to reduce it; I don’t believe that is in the spirit of the metric.
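For a sense of scale, PUE is total facility power divided by IT equipment power, so everything above 1.0 is overhead for cooling, power distribution losses, and so on. Here is a minimal sketch of what that overhead means for a 10 MW IT load like the ones KC describes (the utility rate is my assumption):

```python
# PUE = total facility power / IT equipment power, so overhead = (PUE - 1) * IT load.
# The utility rate is a hypothetical assumption for illustration.

HOURS_PER_YEAR = 8760
IT_LOAD_MW = 10          # roughly the size KC describes
RATE_PER_KWH = 0.07      # assumed utility rate, $/kWh

for pue in (1.05, 1.20, 2.0):
    overhead_mwh = (pue - 1) * IT_LOAD_MW * HOURS_PER_YEAR
    annual_cost = overhead_mwh * 1000 * RATE_PER_KWH
    print(f"PUE {pue:.2f}: {overhead_mwh:,.0f} MWh/yr of overhead, ~${annual_cost:,.0f}/yr")
```

At 10 MW of IT load, the gap between a traditional PUE of 2.0 and KC’s 1.05 is more than 80,000 MWh a year, which is why the questioning pays off.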

If you want higher efficiencies and lower costs, you need to be ready to ask the tough questions.

The easy thing to do is collect the requirements of the various stakeholders and say this is what we need built, without ever asking how much each requirement costs.

I know KC’s blog entry has others curious, and it is earning him plenty of new appointments.

Hopefully this will wake up many others to ask the tough questions of “how much does that data center requirement cost?”


Future Nuclear Reactors Simplify to Improve Reliability

WSJ has an article about future nuclear reactors.

The New Nukes

The next generation of nuclear reactors is on its way, and supporters say they will be safer, cheaper and more efficient than current plants. Here's a look at what's coming -- and when.

By REBECCA SMITH

If there ever were a time that seemed ripe for nuclear energy, it's now.

For the first time in decades, popular opinion is on the industry's side. A majority of Americans thinks nuclear power, which emits virtually no carbon dioxide, is a safe and effective way to battle climate change, according to recent polls. At the same time, legislators are showing renewed interest in nuclear as they hunt for ways to slash greenhouse-gas emissions.


The article makes interesting points that forward thinkers like Mike Manos have also made about the path for future data centers.

"A common theme of future reactors is to make them simpler so there are fewer systems to monitor and fewer systems that could fail," says Revis James, director of the Energy Technology Assessment Center at the Electric Power Research Institute, an independent power-industry research organization.

And, a specific example is discussed of simplification.

The current generation of nuclear plants requires a complex maze of redundant motors, pumps, valves and control systems to deal with emergency conditions. Generation III plants cut down on some of that infrastructure and rely more heavily on passive systems that don't need human intervention to keep the reactor in a safe condition—reducing the chance of an accident caused by operator error or equipment failure.

For example, the Westinghouse AP1000 boasts half as many safety-related valves, one-third fewer pumps and only one-fifth as much safety-related piping as earlier plants from Westinghouse, majority owned by Toshiba Corp. In an emergency, the reactor, which has been selected for use at Southern Co.'s Vogtle site in Georgia and at six other U.S. locations, is designed to shut down automatically and stay within a safe temperature range.

The reactor's passive designs take advantage of laws of nature, such as the pull of gravity. So, for example, emergency coolant is kept at a higher elevation than the reactor pressure vessel. If sensors detect a dangerously low level of coolant in the reactor core, valves open and coolant floods the reactor core. In older reactors, emergency flooding comes from a network of pumps—which require redundant systems and backup sources of power—and may also require operator action.

Gallup has a public opinion poll.



Mike Manos Expands His Role, Again – Repeats an Organizational Pattern

Mike changes roles so often; here is the latest on his job change.

Talking with a few friends, we were discussing Mike Manos running Digital Realty Trust’s POD development.  The conversation went, “Oh, did you hear?  He got more.  He did???  What?  Mike got Operations, Operations Engineering, and Future Innovation with Jim Smith, the CTO.  Hey, doesn’t this look like the way Mike organized his group at Microsoft?  Yes it does.  And now he reports to the CEO.”

As we look to the challenges ahead we are faced with the kinds of problems all companies wish they had.  We are challenged by an increased amount of customer demand for capacity coupled with a desire for the most technologically advanced facilities in the market today.   Additionally new offerings such as Pod Architecture Services is giving us visibility and penetration into opportunities that historically we could not be a part of.    This considerable growth is combined with an increasing amount of complexity in managing a world-wide facility portfolio of tens of facilities with power capacity  that is measured in the hundreds of megawatts!

Mike has used his organizational skills to pull together a team that rivals will have a hard time matching.  Why?  Because Mike can perform data center organizational magic, and people like to work for him.

You may not have thought of Digital Realty Trust as a construction company, and they aren’t a typical construction management operation.  They are out to drive a change in the data center industry by looking at the TCO of providing data center services.  This may not be as sexy as watching Apple’s and Google’s data center moves, but it is going to drive significant changes.

One of the changes coming through the industry is the combination of IT with Facilities/Real Estate in site selection, design, and construction of data centers.  There are data center rules ready to be broken as others figure out how much those rules limit options and increase data center costs.  We are about to see data centers built in totally different ways, in places you would not typically consider.  In this recession, there are huge opportunities for those who can see a different way of doing business to take advantage of the economic incentives offered by federal, state, and local governments.

Here is a tough question I haven’t seen many people ask: we have these long lists of site selection and design criteria, but how does each criterion affect the TCO of our data center?  If we want a lower TCO, shouldn’t we question the requirements and understand how much each one costs?

You mean I could lower my TCO by changing the requirements?  Well, yeah.  Don’t you think that is easier to do than adopting the latest technology in the hope of a high ROI?
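Here is a toy sketch of the idea: treat each requirement as a TCO line item and see what relaxing it does to the total. Every dollar figure below is made up purely to illustrate the point, not drawn from any real project.

```python
# Toy TCO model: every requirement is a cost line item you can question.
# All figures are hypothetical, purely to illustrate the point.

baseline = {
    "shell and fit-out": 60_000_000,
    "N+1 cooling year-round": 12_000_000,
    "100% of IT load on UPS": 15_000_000,
    "10 years of energy at PUE 1.8": 55_000_000,
}

# Same facility with relaxed requirements (N at the few peak hours, partial UPS, lower-PUE design)
relaxed = {
    "shell and fit-out": 60_000_000,
    "N cooling at peak, N+1 otherwise": 8_000_000,
    "critical IT load only on UPS": 9_000_000,
    "10 years of energy at PUE 1.3": 40_000_000,
}

print(f"Baseline TCO: ${sum(baseline.values()):,}")
print(f"Relaxed TCO:  ${sum(relaxed.values()):,}")
```

The numbers don’t matter; the point is that the biggest savings come from questioning the requirements, not from bolting on new technology.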

There are a few who are thinking this way, and it is fun discussing how data center costs could be dramatically lower than the competition’s.  You know Google thinks this way, but we usually have to wait 3 years before they share their ideas.

I think Mike sees the way things are shifting, and has a vision which is why he has acted so quickly at Digital Realty Trust.

Are you ready?
