Google's Data Center System Engineering approach

There was recent news naming systems engineer the best job in America.

1. Systems Engineer


Anne O'Neil, a chief systems engineer at the N.Y.C. Transit Authority, is one of five female senior managers in a department of 1,500.

Top 50 rank: 1
Sector: Information Technology

What they do: They're the "big think" managers on large, complex projects, from major transportation networks to military defense programs. They figure out the technical specifications required and coordinate the efforts of lower-level engineers working on specific aspects of the project.

Why it's great: Demand is soaring for systems engineers, as what was once a niche job in the aerospace and defense industries becomes commonplace among a diverse and expanding universe of employers, from medical device makers to corporations like Xerox and BMW.

CNET News wrote about it as well.

Systems engineer deemed best job in America

by Chris Matyszczyk

If you're a systems engineer who wonders whether you've chosen the right profession, I bring you good news.

But what got me to write a blog entry was Google's job posting for a Data Center Systems Engineer.

The role: Data Center Control Systems Engineer

Data Center Control Systems Engineers possess demonstrated design, operation, and construction experience in the areas of complex and mission critical facilities. You will have extensive knowledge of large-scale facilities controls and monitoring systems for all infrastructural systems.

As the Data Center Control Systems Engineer, you have excellent communication skills and are able to work in teams and matrix organizations. You are expected to develop and maintain strong functional relationships across multidisciplinary teams to anticipate future controls and monitoring design requirements. You will be continuously involved in the improvement of plant performance based on historical data collected and collaborate on retrofit projects to improve plant efficiency based on business case justifications.

And on top of that, there is a Data Center Strategic Negotiator job, a business/technical role suited to working alongside the systems engineer.

The role: Data Center Strategic Negotiator

As a Data Center Strategic Negotiator, you will lead a team to collect and analyze large sets of location data, execute extensive on-the-ground due diligence, and ultimately lead negotiations to develop comprehensive legal contracts for data centers, real estate, power, and networking services around the world, for both new and existing assets, of all sizes. You must have substantial knowledge of global markets, in-depth technical expertise, and strategic analytical skills, in addition to rock-solid negotiation and collaboration capabilities. All location strategy and site selection initiatives are team efforts spearheaded by the Global Infrastructure Group (“GIG”). You will need to be a flexible, proactive team player who understands and seeks to support the larger strategic initiatives of the company. You are a proven professional with a track record that matches our philosophy of leading by innovation, who has a detailed understanding of both the technological and the commercial sides of data centers, and who has the ability to deliver against aggressive deadlines with a driving passion for cost reduction and highly effective solutions.

The Data Center Strategic Negotiator will carry out the selection and negotiations process for new data centers from start to finish. You will have experience designing and executing large-scale international site selection initiatives; deep and broad transactional knowledge; strong technical negotiation skills in the areas of data centers; real estate leases, purchase agreements, and entitlements; energy and other utilities; telecommunications; and economic development incentives. Technical knowledge and experience negotiating collocation space, racks, power circuits, cross connects and remote hands in conventional data centers is preferred. You will be adept at strategizing, structuring, negotiating, and closing a range of mission critical transactions in diverse settings and with diverse parties.

I spent more time going through the Google job postings for Mountain View.  Google is building the kind of teams I was used to working with at Apple when developing hardware.  But Apple didn't have systems engineers like the above, as data centers back when I worked there were just for enterprise applications.

It will be hard to discover what Google's data center systems engineer and strategic negotiator actually do, but keep in mind, they are developing systems for the way Google operates as a business.  Copying their actions could cause more problems than it solves unless you think of the whole system.

It is great to see that Google has reached a stage of maturity where it identifies systems engineering and holistic system negotiation as keys to its continued growth and cost reduction.  On the other hand, the job for these people would have been much easier had they been hired 10 years ago, as they now need to work against the momentum of dozens of entrenched groups.

The biggest challenge to doing the jobs above is whether you have the organizational skills to instill change in groups.

My next read is Switch.


Why is it so hard to make lasting changes in our companies, in our communities, and in our own lives?

The primary obstacle is a conflict that’s built into our brains, say Chip and Dan Heath, authors of the critically acclaimed bestseller Made to Stick. Psychologists have discovered that our minds are ruled by two different systems—the rational mind and the emotional mind—that compete for control. The rational mind wants a great beach body; the emotional mind wants that Oreo cookie. The rational mind wants to change something at work; the emotional mind loves the comfort of the existing routine. This tension can doom a change effort—but if it is overcome, change can come quickly.


Who is Monitoring Greenhouse Gases in the atmosphere? Top scientific minds or cash-strapped, well-intended individuals?

Here is something that will leave you thinking.  Who and what measures and monitors the greenhouse gases in the atmosphere?

The Orbiting Carbon Observatory (OCO) satellite developed by NASA/JPL was supposed to do this, but it was lost shortly after launch on Feb 24, 2009.

Scientists to NASA: We Need A Reliable Way to Track Global Emissions - 07.31.2009

By Keith Johnson

Forget all the haggling with China, India, and parts of the U.S. Congress—the real obstacle to a global climate-change treaty might be accurately measuring greenhouse-gas emissions in the first place.

That’s the warning from the National Academy of Sciences’ National Research Council to the head of NASA. The upshot? Without a sophisticated satellite that can track global emissions, it will be hard to know what everybody is really up to: “[C]urrent methods for estimating greenhouse gas emissions have limitations for monitoring a climate treaty.”

NASA had such a sophisticated satellite—the Orbiting Carbon Observatory—which failed to reach orbit in February. The space agency is considering trying again—thus the letter from the NAS pointing out just how useful such satellites can be.

The monitoring approach in OCO was conceptually simple.

The satellite carried a single instrument that would have taken the most precise measurements of atmospheric carbon dioxide ever made from space. The instrument consisted of three parallel, high-resolution spectrometers, integrated into a common structure and fed by a common telescope. The spectrometers would have made simultaneous measurements of the carbon dioxide and molecular oxygen absorption of sunlight reflected off the same location on Earth’s surface when viewed in the near-infrared part of the electromagnetic spectrum, invisible to the human eye.

Here is a video that gives you background on the OCO satellite.

The Economist discusses the issue of monitoring greenhouse gases at length.

Monitoring greenhouse gases

Highs and lows

You might think that measuring the levels of greenhouse gases in the atmosphere would be a priority. If you did think that, though, you would be wrong

Mar 4th 2010 | From The Economist print edition

IN NEGOTIATIONS on nuclear weapons the preferred stance is “Trust but verify”. In negotiations on climate change there seems little opportunity for either. Trust, as anyone who attended last year’s summit in Copenhagen can attest, is in the shortest of supplies. So, too, is verification.

Barack Obama was asked when he was in Copenhagen whether a provision by which countries could peek into each others’ assessment processes was strong enough to be sure there was no cheating. He answered reassuringly that “we can actually monitor a lot of what takes place through satellite imagery”. That statement conjured up thoughts of the sort of cold-war satellite system that America used to identify and count Russian missiles. But the president was being a bit previous; at the moment, no such system exists, because America’s Orbiting Carbon Observatory (OCO), a satellite that would have fulfilled the role, was lost on launch this time last year. The purpose of OCO was to work out the fate of carbon dioxide that is emitted by industrial processes but does not then stay in the atmosphere—about 60% of the total.

The Economist author points out the problem with the system.

America is planning to build a new OCO. In the meantime, however, a small group of scientists labours away on Earth, doing its best to monitor emissions at ground level. At the end of February a number of these researchers met at the Royal Society in London, to discuss what they were up to.

Measuring gas levels day in, day out can look a little humdrum to outsiders, including those who hold the purse strings. They tend to prefer scientists to experiment and test hypotheses, not just tally things. But that attitude galls the greenhouse-gas measurers, and not only because it denies them money. It also ignores the fact that careful measurement is a way of discovering new things, not just of checking the status quo. Monitoring is not just a necessary handmaiden of science—it is the real thing.

And, what people do in the short term.

Indeed, for all the noise that is made about climate change, much of this research is done with next to no money. Asked how she paid for her monitoring of various greenhouse gases in Baden Württemberg, Ingeborg Levin of Heidelberg University replied “by stealing”—meaning not that she robs banks, but that the monitoring work is cross-subsidised by grants intended for other studies.

How broken is the discussion on greenhouse gases that there is no worldwide GHG monitoring system?

Let's hope the NASA budget gets approved for OCO-2.

Proposed reflight

Three days after the failed February 2009 launch, the OCO science team sent NASA headquarters a proposal to build and launch an OCO "carbon copy", with the replacement satellite to be launched by late 2011.[16] On February 1, 2010, the FY 2011 NASA budget request included US$170 million for NASA to develop and fly a replacement for the Orbiting Carbon Observatory.[17]


Defining a Data Center API, on the list of things to do for Open Source Data Center Initiative

I have spent so much of my life working with operating system nerds, both at Apple and Microsoft, that I take the concept of an API for granted.

An application programming interface (API) is an interface implemented by a software program to enable interaction with other software, similar to the way a user interface facilitates interaction between humans and computers.

A critical concept of an API is abstraction.

An API is an abstraction that defines and describes an interface for the interaction with a set of functions used by components of a software system.

The data center is waiting for abstraction.

In computer science, the mechanism and practice of abstraction reduces and factors out details so that one can focus on a few concepts at a time.

But many people are concrete-minded thinkers. Concrete thinking is:

Thinking characterized by a predominance of actual objects and events and the absence of concepts and generalizations.

Google’s Urs Hölzle and Luiz André Barroso wrote on the concept of the data center as a computer.


Well, if the data center is a computer, it should have a set of APIs.  It is a fact that Google has interfaces for its data centers.  I haven’t talked to a single Google employee about this concept, but it has to be true.  How else are you going to interface with all the data centers in Google’s inventory around the world?  If you search the Google document, you see multiple references to API.


In Google’s paper on warehouse-scale computers, they close with:

At one level, WSCs are simple—just a few thousand cheap servers connected via a LAN. In reality, building a cost-efficient massive-scale computing platform that has the necessary reliability and programmability requirements for the next generation of cloud-computing workloads is as difficult and stimulating a challenge as any other in computer systems today.

Google thinks about the programmability, the APIs, of the data center.

I don’t need any more proof that data centers need APIs.  But concrete thinkers will not believe it until multiple customers are already doing it.
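To make the abstraction argument concrete, here is a minimal sketch of what a data center API could look like. Every name, method, and number below is hypothetical, invented for illustration; nothing here reflects Google's actual interfaces.

```python
from abc import ABC, abstractmethod

class DataCenterAPI(ABC):
    """Hypothetical abstraction: tools program against this interface
    without knowing how any particular facility implements it."""

    @abstractmethod
    def total_power_kw(self) -> float:
        """Current total facility power draw in kilowatts."""

    @abstractmethod
    def it_power_kw(self) -> float:
        """Power delivered to IT equipment in kilowatts."""

    def pue(self) -> float:
        """Power Usage Effectiveness, derived from the primitives above."""
        return self.total_power_kw() / self.it_power_kw()

class SimulatedDataCenter(DataCenterAPI):
    """A stand-in implementation with fixed readings; a real one would
    talk to the facility's monitoring and control systems."""

    def __init__(self, total_kw: float, it_kw: float):
        self._total_kw = total_kw
        self._it_kw = it_kw

    def total_power_kw(self) -> float:
        return self._total_kw

    def it_power_kw(self) -> float:
        return self._it_kw

# Any tool written against DataCenterAPI works for every facility that
# implements it -- that is the payoff of the abstraction.
fleet = [SimulatedDataCenter(12000, 8000), SimulatedDataCenter(6000, 5000)]
for dc in fleet:
    print(f"PUE: {dc.pue():.2f}")
```

The point is not the specific methods, which are made up here, but that a fleet-wide tool never needs to know whether a given facility is in Oregon or Belgium, chiller-cooled or free-cooled.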

In the short term, we can use Johnson Controls' GridLogix solution, which I blogged about, as a reference point.

Information Management for Sustainability
Gridlogix provides your organization with the tools for sustaining your enterprise. More than going "Green", Gridlogix helps you continuously cut wasteful costs, prolong the life of your facilities’ equipment, and maintain a comfort level throughout your enterprise. With Gridlogix's Automated Enterprise Management solution, Gridlogix empowers anyone in your organization with the real time data that allows your organization to improve the efficiency of your facilities, typically reducing energy and maintenance costs by 10-20% with a payback of less than 18 months. Gridlogix delivers the best form of Green Energy, conservation.


Microsoft Research Paper, Measuring Energy use of a Virtual Machine

Microsoft TechFest is going on now.

About TechFest

TechFest is an annual event that brings researchers from Microsoft Research’s locations around the world to Redmond to share their latest work with fellow Microsoft employees. Attendees experience some of the freshest, most innovative technologies emerging from Microsoft’s research efforts. The event provides a forum in which product teams and researchers can discuss the novel work occurring in the labs, thereby encouraging effective technology transfer into Microsoft products.

I used to go when I was a full time employee, but there are some good things you can learn by going to the demo site.

One that caught my eye is the Networked Embedded Computing group, which has consistently worked on data center energy sensor systems.

Their latest project is Joulemeter, which can measure the energy usage of VMs, servers, clients, and software.

Joulemeter: VM, Server, Client, and Software Energy Usage

Joulemeter is a software based mechanism to measure the energy usage of virtual machines (VMs), servers, desktops, laptops, and even individual softwares running on a computer.

Joulemeter estimates the energy usage of a VM, computer, or software by measuring the hardware resources (CPU, disk, memory, screen etc) being used and converting the resource usage to actual power usage based on automatically learned realistic power models.
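The core idea, a learned model that maps resource counters to watts, can be sketched as a simple linear power model. The coefficients below are invented for illustration; Joulemeter learns realistic ones automatically per machine.

```python
# Sketch of a linear power model in the spirit of Joulemeter:
#   P = P_idle + a * cpu_util + b * disk_io + c * mem_activity
# All coefficients are made-up illustrative values, not the ones
# Joulemeter actually learns.

IDLE_WATTS = 50.0         # baseline draw with no activity (assumed)
CPU_WATTS_PER_PCT = 0.8   # watts per % CPU utilization (assumed)
DISK_WATTS_PER_MBPS = 0.2 # watts per MB/s of disk I/O (assumed)
MEM_WATTS_PER_GB = 0.5    # watts per GB of active memory (assumed)

def vm_power_watts(cpu_util_pct: float, disk_mbps: float,
                   mem_active_gb: float) -> float:
    """Estimate power draw from resource counters.

    Charging each workload for its own share of resource activity is
    how software-level metering sidesteps the fact that a VM's power
    cannot be measured purely in hardware.
    """
    return (IDLE_WATTS
            + CPU_WATTS_PER_PCT * cpu_util_pct
            + DISK_WATTS_PER_MBPS * disk_mbps
            + MEM_WATTS_PER_GB * mem_active_gb)

# A VM at 60% CPU, 10 MB/s disk, 4 GB active memory comes out to
# roughly 102 W under these made-up coefficients.
print(vm_power_watts(60, 10, 4))
```

In practice the interesting part, which this sketch omits, is fitting those coefficients from hypervisor counters against measured server power, and deciding how to apportion the idle baseline among VMs sharing a host.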

Joulemeter overview

Joulemeter can be used for gaining visibility into energy use and for making several power management and provisioning decisions in data centers, client computing, and software design.

For more technical details on the system here is their paper.

Virtual Machine Power Metering and Provisioning
Aman Kansal, Feng Zhao, Jie Liu (Microsoft Research); Nupur Kothari (University of Southern California); Arka Bhattacharya (IIT Kharagpur)

ABSTRACT
Virtualization is often used in cloud computing platforms for its several advantages in efficient management of the physical resources. However, virtualization raises certain additional challenges, and one of them is lack of power metering for virtual machines (VMs). Power management requirements in modern data centers have led to most new servers providing power usage measurement in hardware, and alternate solutions exist for older servers using circuit and outlet level measurements. However, VM power cannot be measured purely in hardware. We present a solution for VM power metering. We build power models to infer power consumption from resource usage at runtime and identify the challenges that arise when applying such models for VM power metering. We show how existing instrumentation in server hardware and hypervisors can be used to build the required power models on real platforms with low error. The entire metering approach is designed to operate with extremely low runtime overhead while providing practically useful accuracy. We illustrate the use of the proposed metering capability for VM power capping, leading to significant savings in power provisioning costs that constitute a large fraction of data center power costs. Experiments are performed on server traces from several thousand production servers, hosting Microsoft’s real-world applications such as Windows Live Messenger. The results show that not only does VM power metering allow reclaiming the savings that were earlier achieved using physical server power capping, but also that it enables further savings in provisioning costs with virtualization.

Note there will be a desktop and laptop version available soon.

Download: A freely downloadable version of the Joulemeter software that measures laptop and desktop energy usage will be available in a few weeks. Watch this space!

But I want to get access to the VM, server, and software versions for the data center.  Maybe I can get this group involved with GreenM3 as it transitions into an NPO.  The University of Missouri is another connection, as Mizzou professors worked with the Microsoft researchers in a prior job developing sensor networks.
