Microsoft's Data Center Solutions sponsors innovative Microsoft Research project - an always-on data collection and visualization system

Microsoft's Data Center Solutions (DCS) sponsored the Microsoft Research project described in this article.

Monitoring the conditions: This sensor, a prototype developed by the Networked Embedded Computing group at Microsoft Research, is sensitive to heat and humidity. The group envisions using sensors like these to monitor servers in data centers, enabling significant energy savings. The sensors could also be used in homes to manage the energy use of appliances.
Credit: Microsoft Research

The sensors, says Feng Zhao, principal researcher and manager of the group, are sensitive to both heat and humidity. They're Web-enabled and can be networked and made compatible with Web services. Zhao says that he envisions the sensors, which are still in prototype form, as "a new kind of scientific instrument" that could be used in a variety of projects. In a data center, the idiosyncrasies of a building and individual servers can have a big effect on how the cooling system functions, and therefore on energy consumption. Cooling, Zhao notes, accounts for about half the energy used in data centers. (He believes that the sensors, which he says could sell for $5 to $10 apiece, could be used in homes as well as in data centers, where they could work in tandem with a Web-based energy-savings application.)
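Zhao's point about idiosyncratic buildings and servers is easy to see in code. Here is a minimal sketch of what summarizing a rack of cheap heat/humidity sensors might look like; the sensor IDs, readings, and the temperature threshold are all made-up illustrations, not anything from the MSR prototype.

```python
# Sketch: flagging hot spots from a set of hypothetical heat/humidity
# sensors. All IDs, readings, and the threshold are assumed values.

def find_hot_spots(readings, temp_limit_c=27.0):
    """Return sensor IDs whose temperature exceeds the limit.

    readings: dict mapping sensor ID -> (temp_C, relative_humidity_pct)
    """
    return sorted(sid for sid, (temp, _rh) in readings.items()
                  if temp > temp_limit_c)

readings = {
    "rack1-top": (31.5, 40.0),   # hot spot near the rack exhaust
    "rack1-mid": (26.0, 42.0),
    "rack1-bot": (22.5, 45.0),   # cool air near the floor tiles
}
print(find_hot_spots(readings))  # -> ['rack1-top']
```

With sensors this cheap, a whole room could be mapped this way and the cooling tuned to the hot spots instead of the whole floor.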

The connection between MSR and DCS is producing an always-on data collection and visualization system for data center operations. As Microsoft continues to develop the system, its potential beyond the data center is huge, both for other industries and for the energy savings it could unlock.

I want to thank Jie Liu for showing me the early prototype. And now that there is a public article, I can write about how cool this technology is.

Read more

Microsoft Demos Auto-Shift: Energy-Aware Server Provisioning at TechFest

Microsoft Research has a post about one of their green projects at their TechFest event.  If you are not familiar with TechFest, here is a Google News search and the Microsoft website.

Auto Shift: Energy-Aware Server Provisioning

Green, as we now know, is indisputably the new black. It seems you can't turn on the television or pick up a newspaper without reading about the latest green initiative. Lots of people are talking.

Feng Zhao is doing something about it.

Zhao's Networked Embedded Computing group is showing a TechFest demo called Auto-Shift: Energy-Aware Server Provisioning, which addresses server resource management for Internet services, such as Live Messenger and Hotmail. Data centers for such services require potentially expensive decisions about how many computers to allocate and how those are deployed.

"No. 1," Zhao says, "you have to buy the servers. No. 2, once you buy a server, you have to manage it. And third, you have to have an infrastructure, such as power supply. In this particular study, we looked at the power usage of the servers that are running one of our largest Web services. If you look at the load as it varies over the course of the day, it peaks around noon and slows down around midnight. That clearly shows that not all the servers are needed all the time. Can we shut down some of the servers? Can we actually save energy?"

This demo is based on the same paper I referred to in an earlier post.

The blog entry continues with the following points made by Feng.

"We also have all these sensors in the data centers," Zhao says. "Some of the machines work harder than others. If we can move the workload around, from hotspots to cool spots, the air conditioning doesn’t have to work as hard, because of the efficiency of cooling the hottest spots. If you move that workload and even out the temperature disparities, that means good energy savings. Incorporating environmental-sensor readings such as temperature and humidity, and couple that with smart scheduling and workload migration, and we believe we can even save more resources."

That sounds green, indeed--and economical, too. 

"What it translates to," Zhao concludes, "is that you use less power and that, with these smarts, we can figure out that maybe we don’t need to buy that many machines to start with, because we can do the same work, with very little difference in performance, and actually run it on a smaller set of machines. Reduce energy cost and reduce hardware investment in the first place--that would reduce service cost, reduce staffing, and reduce the space you need to build."

Read more

Energy Efficiency Drives Higher Consumption, Downside of Virtualization?

OK, I was on vacation this week, so I didn't write as much, and I wasn't planning on doing much work. But I made a call to Microsoft's Christian Belady to discuss some energy-saving ideas from the top of Stevens Pass.

If you haven't skied Stevens Pass: on the backside, you ski right under high-voltage power lines from the Columbia Basin's hydroelectric dams, and in some areas you can get close enough that you'll feel a slight charge holding your pole up in the air.  So, thinking about power feels like the right thing to do.


After talking to Christian, I was thinking about his point on energy efficiency driving higher consumption, and where it could be illustrated in a specific case.

An interesting scenario is virtualization. If you take an existing IT environment and virtualize servers, you assume a reduction in energy costs. But virtualization removes the physical-server barrier, so users can create new VMs far more easily than they could ever order physical servers. How long before new VMs are being created faster than physical servers ever were?

Will energy-efficient VMs drive higher energy consumption over the long run?

It would be interesting to know what happens to energy consumption after people have virtualized their environments, enjoyed the initial savings, and then adjusted to how easy the new VM environment makes provisioning.

Are people missing the point because they are not thinking about life-cycle management of servers, and what the impact of virtualization is?
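The rebound effect I'm asking about can be put into a toy model: each VM draws far less power than a physical server, but if VM count grows faster because VMs are so easy to create, total consumption eventually passes the all-physical baseline. Every number here is an illustrative assumption, not measured data.

```python
# Toy model of the VM rebound effect.  A VM uses a tenth the power of
# a physical server, but VMs multiply much faster.  All figures are
# assumed for illustration.

def crossover_year(phys_kw=0.3, vm_kw=0.03, phys=1000,
                   phys_growth=1.05, vm_growth=1.60):
    """Year in which total VM power passes the stay-physical baseline,
    or None if it never does within 30 years."""
    vms = phys                     # day one: one VM per retired server
    baseline = phys * phys_kw      # power if we had stayed physical
    for year in range(1, 31):
        baseline *= phys_growth
        vms *= vm_growth
        if vms * vm_kw > baseline:
            return year
    return None

print(crossover_year())   # with these assumptions, year 6
```

The exact year is meaningless - the point is that with any sustained gap in growth rates, the crossover is a "when," not an "if."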

Read more

Microsoft Creates Video Measuring Desktop Power Efficiency in Their Lab

One of Microsoft's labs created a video on measuring power consumption on desktop devices. This video is appropriate for someone who is new to measuring energy consumption on the desktop.

So how much power do your PCs draw?  And how do you figure that out?
Those were the questions that I asked Grant after he recently updated the machines in some of the classrooms he manages.  His problem was even more complex:  He needed stronger/better/faster machines, but he was at capacity on his circuits, so he needed to do it without increasing power draw.
In this video, Grant walks us through some of his methodology in measuring and benchmarking power consumption on a few machines, and shares the results with us. 
It leads to some interesting conclusions, and some good food for thought for anyone trying to make energy conscious decisions around PC purchases.

In this video they use the Watts up? meter, but hopefully they'll put the Smart Watt device in their small server room.
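The arithmetic behind an exercise like Grant's is simple but worth writing down: turn a measured draw in watts into annual kWh and dollars, and check a classroom full of machines against a circuit's capacity. The wattage, electricity rate, and circuit size below are sample assumptions, not figures from the video.

```python
# The math behind a watt-meter exercise: watts -> kWh -> dollars, and
# a circuit-capacity check.  All sample figures are assumed.

def annual_cost(watts, cents_per_kwh=10.0, hours_per_year=8760):
    """Dollar cost of running a load 24x7 for a year."""
    kwh = watts * hours_per_year / 1000.0
    return kwh * cents_per_kwh / 100.0

def fits_on_circuit(watts_per_machine, machines, volts=120, amps=20,
                    derate=0.8):      # 80% continuous-load derating
    return watts_per_machine * machines <= volts * amps * derate

print(round(annual_cost(150), 2))    # one 150 W desktop, 24x7
print(fits_on_circuit(150, 12))      # 12 such machines on a 20 A circuit
```

Grant's constraint - more performance with no added draw - is the second function: the circuit is fixed, so watts per machine is the only knob.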

Read more

Energy Efficient Mermaid (Mobile Earthquake Recorder in Marine Areas)

One group that has energy efficiency in its DNA is remote-sensor engineers. The Microsoft Research team that wrote this paper is responsible for sensorweb, and they were Xerox PARC guys.

I found this article in The Economist, and it reminds me of some people who are investigating battery-operated sensors in the data center to collect power and environmental data. Why would someone go with battery power? Because the easier you can install a device in a data center on its own network and power, the easier it is to deploy. The cost to deploy monitoring solutions that assume they'll plug into the existing data center network infrastructure will many times be more than the cost of the equipment.

To do that, they need to float near the sea floor, since most of an earthquake's energy travels through the rock rather than the water. So a Mermaid can operate at a depth of up to 1,500 metres (about a mile). When she hears something that might be pertinent, she runs the signal through her on-board computer to decide just how significant it really is. If it does turn out to be significant, she surfaces by pumping air into a bladder and makes contact with a satellite that has been co-opted into the project. Once she has delivered her message, the air is sucked back out of the bladder and she returns to her gloomy underwater station.

The main engineering problem Dr Simons faces—apart from making something that will work reliably in the salty ocean depths—is energy conservation. When a Mermaid runs out of power, she dies. That power is provided by lithium-ion batteries and is reckoned sufficient for between 50 and 100 surfacings.

One of the ways Dr Simons saves power is in the computer. The decision to surface is made by an algorithm that depends on a mathematical function called a wavelet. This divides an earthquake wave into separate components which can be studied independently. That allows the computer to restrict energy-intensive high-resolution analyses to those sections of the waves that really need it. The other sections receive a more cursory (and thus less power-consuming) glance.
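The triage idea - spend the expensive high-resolution analysis only on the sections that need it - can be sketched with the simplest wavelet there is, the Haar transform. This is my illustration of the principle, not the Mermaid's actual algorithm; the signal and the energy threshold are invented.

```python
# Sketch of wavelet-based triage: a cheap one-level Haar pass flags
# which sections of a signal deserve the energy-intensive full
# analysis.  The signal and threshold are made up for illustration.

def haar_details(signal):
    """One-level Haar detail coefficients, one per pair of samples.
    A large detail means rapid change within that section."""
    return [(signal[i] - signal[i + 1]) / 2.0
            for i in range(0, len(signal) - 1, 2)]

def sections_worth_analyzing(signal, threshold=1.0):
    """Indices of sample pairs whose detail energy exceeds threshold;
    only these sections would get the full high-resolution analysis."""
    return [i for i, d in enumerate(haar_details(signal))
            if d * d > threshold]

quiet = [0.1, 0.2, 0.1, 0.0]      # background noise
burst = [0.1, 4.0, -3.5, 0.2]     # a sharp arrival
print(sections_worth_analyzing(quiet + burst))   # only the burst sections
```

The quiet sections get the "cursory glance" at almost no cost; the battery is spent only where the signal changes fast - the same trade the Mermaid makes to stretch its 50 to 100 surfacings.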

Read more