Where the Clouds Meet the Ground

The Economist has a feature on cloud computing. An audio version is available here.

Nicholas Carr summarizes The Economist article.

The Economist tours the cloud

October 25, 2008

The new issue of The Economist features a good primer on cloud computing, written by Ludwig Siegele, which looks at trends in data centers, software, networked devices, and IT economics and speculates about the broader implications for businesses and nations. A free pdf of the entire report is also available.

Siegele notes that the hype surrounding the term "cloud computing" may have peaked already - Google searches for the phrase have fallen after a big spike in July - but that "even if the term is already passé, the cloud itself is here to stay and to grow. It follows naturally from the combination of ever cheaper and more powerful processors with ever faster and more ubiquitous networks. As a result, data centres are becoming factories for computing services on an industrial scale; software is increasingly being delivered as an online service; and wireless networks connect more and more devices to such offerings." The "precipitation from the cloud," he concludes (milking the passé metaphor one last time), "will be huge."

Part of the report is a specific feature on Data Centres.

CORPORATE IT

Where the cloud meets the ground

Oct 23rd 2008
From The Economist print edition

Data centres are quickly evolving into service factories

Illustration by Matthew Hodson

IT IS almost as easy as plugging in a laser printer. Up to 2,500 servers—in essence, souped-up personal computers—are crammed into a 40-foot (13-metre) shipping container. A truck places the container inside a bare steel-and-concrete building. Workers quickly connect it to the electric grid, the computer network and a water supply for cooling. The necessary software is downloaded automatically. Within four days all the servers are ready to dish up videos, send e-mails or crunch a firm’s customer data.

This is Microsoft’s new data centre in Northlake, a suburb of Chicago, one of the world’s most modern, biggest and most expensive, covering 500,000 square feet (46,000 square metres) and costing $500m. One day it will hold 400,000 servers. The entire first floor will be filled with 200 containers like this one. Michael Manos, the head of Microsoft’s data centres, is really excited about these containers. They solve many of the problems that tend to crop up when putting up huge data centres: how to package and transport servers cheaply, how to limit their appetite for energy and how to install them only when they are needed to avoid leaving expensive assets idle.

But containers are not the only innovation of which Mr Manos is proud. Microsoft’s data centres in Chicago and across the world are equipped with software that tells him exactly how much power each application consumes and how much carbon it emits. “We’re building a global information utility,” he says.

Engineers must have spoken with similar passion when the first moving assembly lines were installed in car factories almost a century ago, and Microsoft’s data centre in Northlake, just like Henry Ford’s first large factory in Highland Park, Michigan, may one day be seen as a symbol of a new industrial era.

Read more

Microsoft CIO Uses MPG Analogy for Problem with Green IT

I ran across this post by Microsoft’s CIO Tony Scott on news.com.

You are here for an environmental conference, EcoForum. I'm curious, what is going on in that area? I know power is a big issue.
Scott: Most CIOs have come to recognize that both their employees and the customers of the company want to know that the company that they are either working for or buying products from is acting in an ecologically responsible way and that you take these issues seriously. From a Microsoft standpoint, we have some great products on virtualization. We're also here talking about that and here learning what other companies are doing.

In our own space we've gone from 8 percent to 25 percent virtualization in our data centers in just a year. Next year we think we are going to hit 50 percent. That's as dramatic a progress as I've seen, any company anywhere.

One of the things I am convinced of is that the entire technology community is going to have to come together to solve some of these issues. I came out of automotive. There was a day when if you wanted to know car gas mileage you had to write down the mileage, then drive and write down the mileage again. Then you went to the gas station and did long division to figure out what your gas mileage was. Eventually as the world got interested in this a chip got built in every car. Most cars have a chip built in to tell you what your miles per gallon is.

We don't have the functional equivalent to that in the IT world. As a CIO, you really want to know, what is this app costing me, all up? It's the people resources and the energy costs. The tools to do it are emerging but we are not there yet.

It shouldn't be that hard. If the technology community works together and develops the right standards and interfaces, one day you will be able to say here's my compute factor or my miles per gallon in terms of the technologies we use. With that we should be able to do a better job of managing our resources. I'm hopeful we could get that done.

For all Microsoft’s talk about PUE, the data on power consumed by servers already exists in the data center.

The next step is getting applications to calculate the power used to complete their work, publishing a watts-per-unit-of-work figure.
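As a rough illustration of what such a metric could look like, here is a minimal sketch. The function name and all numbers are hypothetical; real figures would have to come from the data center's power-monitoring instrumentation.

```python
# Hypothetical sketch of a "watts per unit of work" metric.
def watts_per_unit_of_work(avg_power_watts, units_completed, hours):
    """Energy (watt-hours) consumed per unit of work completed."""
    energy_wh = avg_power_watts * hours
    return energy_wh / units_completed

# Example: a server drawing 250 W on average that completes
# 1.8 million requests in one hour.
per_request_wh = watts_per_unit_of_work(250, 1_800_000, 1)
print(per_request_wh)
```

The hard part, of course, is not the arithmetic but agreeing on what a "unit of work" is for each class of application.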

Hopefully one of these days application developers will pull their heads out of a feature-focused view and think about the resources they consume.

Read more

US CTO Candidates from GOOG, AMZN, or MSFT - Purpose: Develop Environmentally Friendly Tech

BusinessWeek writes on Obama’s idea for a US CTO.

The Short List for U.S. Chief Technology Officer

Barack Obama has pledged to name a cabinet-level CTO to oversee a job-creating national broadband buildout if he's elected. Big names abound

By Tom Lowry


Barack Obama says that the U.S. is not doing nearly enough to create jobs through technology. Shortly after he launched his campaign, the Illinois Senator promised that if elected, he would create the first-ever Cabinet-level post of chief technology officer. The economic crisis has since made it certain that a White House CTO would become one of Obama's most important advisers, should he triumph in November.

Candidates for the job come from Google, Amazon, and Microsoft.

Among the candidates who would be considered for the job, say Washington insiders, are Vint Cerf, Google's (GOOG) "chief internet evangelist," who is often cited as one of the fathers of the Internet; Microsoft (MSFT) chief executive officer Steve Ballmer; Amazon (AMZN) CEO Jeffrey Bezos; and Ed Felten, a prominent professor of computer science and public affairs at Princeton University.

Broadband is one focus, as is a $50 billion VC fund for environmentally friendly technology.

A White House CTO would be expected to help create incentive programs to expand broadband's reach, particularly tax credits for smaller carriers. But the tech czar would almost certainly be deeply involved in overseeing a federally-backed $50 billion venture capital fund that Obama has proposed to develop more environmentally friendly technology.

It is interesting to imagine one of these high-tech executives as US CTO. Tech company politics would be taken to a new level.

Read more

Yes, PUE is the Next Battle Ground

Microsoft’s Steve Clayton posted an interesting question.

Is PUE the new battleground?

I’m reading a lot about data centers of late – so much so that I’m even spelling it the US way already (sigh). What is becoming increasingly clear to me though is that PUE may well be the new battleground between some of the industry heavyweights.

In very basic terms, PUE (Power Usage Effectiveness) is the ratio of power incoming to a data center to power used. The theoretical ideal is 1.0 of course which means you’re not wasting any energy. As Mike Manos points out this is all part of the “Industrialization of IT” that he and our GFS team works on. Google and Sun have both been waxing lyrical about PUE of late with some impressive numbers, particularly from Google who cite a PUE of 1.13 in one datacenter. Very impressive indeed.
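The calculation Clayton describes can be sketched in a few lines. The numbers below are hypothetical, chosen only to land near the 1.13 figure Google cites.

```python
# Minimal sketch of the PUE calculation: total facility power
# divided by the power actually used by IT equipment.
# A PUE of 1.0 would mean no energy is wasted on overhead.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 10 MW total draw, 8.85 MW reaching IT gear.
print(round(pue(10_000, 8_850), 2))  # close to Google's cited 1.13
```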

Yes, PUE is the next battleground.

This is why how many servers you have is not a number to be proud of. As Google has found, being the largest data center operator brings a critical eye.


A better number for people to relate to the market is PUE. It is the closest thing we have to an MPG rating. Ideally there would be a performance-per-watt figure, but given the range of work done in data centers I can’t think of what the performance metric would be.

Last week I talked to Google’s Erik Teetzal about Google’s PUE calculations, and I’ve been thinking of a good post based on our conversation. What I think surprised Google is how much coverage they received regarding their PUE. The main benefit of Google’s press is there are thousands more people who have heard the term PUE.

As Steve Clayton mentions, Microsoft’s Mike Manos says the container data center has a PUE of 1.22, while Google’s is 1.21. Microsoft’s number is higher, though, because they count their office space in the overhead to run the data center. I think Microsoft is willing to count the office space given that they have a third of the number of employees Google reportedly has in their data centers.
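A quick sketch shows how counting office space shifts a reported PUE. All the load figures below are hypothetical illustrations, not disclosed numbers from either company.

```python
# Hypothetical illustration: including office load in the
# "total facility power" numerator raises the reported PUE.
def pue(total_kw, it_kw):
    return total_kw / it_kw

it_load = 1000.0        # kW drawn by servers
infrastructure = 200.0  # kW for cooling, power distribution, etc.
office = 20.0           # kW for attached office space

without_office = pue(it_load + infrastructure, it_load)
with_office = pue(it_load + infrastructure + office, it_load)
print(without_office, with_office)  # 1.2 vs 1.22
```

In other words, two data centers with identical server and cooling efficiency can report different PUE numbers depending on what they choose to count as overhead, which is exactly why comparisons across companies need care.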

Sun has achieved a PUE of 1.28 in their data centers, and I need to talk to Sun’s Dean Nelson to get more details.

Noticeably absent from the PUE discussion are IBM and HP. How efficient are the data centers that IBM and HP build? What is their PUE?

Should there be an independent auditing of PUE to ensure accuracy?

A good side effect is that data center vendors are thinking about how to position their products as improving PUE.

Read more