Christian Belady migrates to Cloud Computing, joins Microsoft Research eXtreme Computing Group

I’ve started discussing Cloud Computing more because cloud infrastructure does almost all the things a green data center does and has the industry’s attention.  I can’t think of anyone who wants to go to the cloud to over-provision hardware, create silos that don’t work together, and ignore their energy use.

Mike Manos announced his move to the Mobile Cloud.

I am extremely happy to announce that I have taken a role at Nokia as their VP of Service Operations.  In this role I will have global responsibility for the strategy, operation and run of infrastructure aspects for Nokia’s new cloud and mobile services platforms.

Christian Belady announced his move to Microsoft Research’s eXtreme Computing Group.

But even with all of this change, I see there is even more opportunity now than there was when I started at Microsoft almost three years ago. Cloud computing has made mining and developing the “right” opportunities even that much more important. We need to think about how we tie together the complete ecosystem of the software stack, the IT, the data center and the grid today and what efficiencies we can drive from our research and development for the future. For those of you that know me – this is the kind of opportunity that makes me salivate. There aren’t many people around tasked with this kind of challenge and this is the opportunity I have been given in the evolution of my career at Microsoft. This week I begin tackling these projects within the Microsoft Research group in a team called the Extreme Computing Group.

How big is the Cloud for Microsoft Research?  The top three demos listed on their TechFest demo page are about cloud computing.

Client + Cloud Computing for Research

Scientific applications have diverse data and computation needs that scale from desktop to supercomputers. Besides the nature of the application and the domain, the resource needs for the applications also vary over time—as the collaboration and the data collections expand, or when seasonal campaigns are undertaken. Cloud computing offers a scalable, economic, on-demand model well-matched to evolving eScience needs. We will present a suite of science applications that leverage the capabilities of Microsoft's Azure cloud-computing platform. We will show tools and patterns we have developed to use the cloud effectively for solving problems in genomics, environmental science, and oceanography, covering both data and compute-intensive applications.

Cloud Faster

To make cloud computing work, we must make applications run substantially faster, both over the Internet and within data centers. Our measurements of real applications show that today's protocols fall short, leading to slow page-load times across the Internet and congestion collapses inside the data center. We have developed a new suite of architectures and protocols that boost performance and the robustness of communications to overcome these problems. The results are backed by real measurements and a new theory describing protocol dynamics that enables us to remedy fundamental problems in the Transmission Control Protocol. We will demo the experience users will have with Bing Web sites, both with and without our improvements. The difference is stunning. We also will show visualizations of intra-data-center communication problems and our changes that fix them. This work stems from collaborations with Bing and Windows Core Operating System Networking.
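
The page-load claim is easy to relate to. As a rough illustration of the kind of before-and-after measurement a demo like this rests on, here is a minimal Python sketch that times repeated fetches of a page and reports median and 95th-percentile latency. The URL and sample count are placeholders, and this is my own sketch, not the MSR measurement tooling.

```python
# A rough, self-contained sketch (not the MSR tooling) of a before-and-after
# page-load measurement: fetch a page repeatedly and report latency
# percentiles. The URL and sample count are placeholders.
import statistics
import time
import urllib.request


def measure_page_load(url: str, samples: int = 20) -> list:
    """Time complete fetches of `url` and return sorted latencies in seconds."""
    latencies = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as resp:
            resp.read()  # read the whole body so the full transfer is timed
        latencies.append(time.perf_counter() - start)
    return sorted(latencies)


if __name__ == "__main__":
    results = measure_page_load("https://www.bing.com/")
    p95 = results[int(0.95 * (len(results) - 1))]
    print(f"median: {statistics.median(results):.3f}s  p95: {p95:.3f}s")
```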

Energy-Aware VMs and Cloud Computing

Virtual machines (VMs) become key platform components for data centers and Microsoft products such as Win8, System Center, and Azure. But existing power-management schemes designed at the server level, such as power capping and CPU throttling, do not work with VMs. VMmeter can estimate per-VM power consumption from Hyper-V performance counters, with the assistance of WinServer2008 R2 machine-level power metering, thus enabling power management at VM granularity. For example, we can selectively throttle VMs with the least performance hit for power capping. This demo compares VMmeter-based with hardware-based power-management solutions. We run multiple VMs, one of them being a high-priority video playback on a server. When a user requests power capping with our solution, the video playback will maintain high performance, while with hardware-capping solutions, we see reduced performance. We also will show how VMmeter can be part of System Center management packs.
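
To make the VM-granularity idea concrete, here is a minimal sketch in Python of one way per-VM power could be apportioned and a throttling victim chosen. The idle-power constant, the linear CPU-share model, and the priority scheme are my own assumptions for illustration; this is not how VMmeter itself is implemented.

```python
# A minimal sketch, not VMmeter itself: apportion a measured server power
# reading across VMs in proportion to their CPU utilization, then pick the
# lowest-priority VM to throttle when a power cap is exceeded. The idle-power
# constant, counter model, and priorities are assumptions for illustration.
from dataclasses import dataclass
from typing import Dict, List, Optional

IDLE_POWER_W = 120.0  # assumed idle power draw of the host server


@dataclass
class VmSample:
    name: str
    cpu_utilization: float  # fraction of total host CPU, 0.0 to 1.0
    priority: int           # higher means more important (e.g. video playback)


def estimate_vm_power(server_power_w: float, vms: List[VmSample]) -> Dict[str, float]:
    """Split the dynamic (above-idle) power among VMs by their CPU share."""
    dynamic = max(server_power_w - IDLE_POWER_W, 0.0)
    total_cpu = sum(vm.cpu_utilization for vm in vms) or 1.0
    return {vm.name: dynamic * vm.cpu_utilization / total_cpu for vm in vms}


def pick_vm_to_throttle(server_power_w: float, cap_w: float,
                        vms: List[VmSample]) -> Optional[str]:
    """Return the name of the VM to throttle first, or None if under the cap."""
    if server_power_w <= cap_w:
        return None
    per_vm = estimate_vm_power(server_power_w, vms)
    # Throttle the least important VM, and among equals the biggest consumer,
    # so high-priority workloads keep their performance.
    victim = min(vms, key=lambda vm: (vm.priority, -per_vm[vm.name]))
    return victim.name


if __name__ == "__main__":
    samples = [VmSample("video-playback", 0.30, priority=10),
               VmSample("batch-job-1", 0.45, priority=1),
               VmSample("batch-job-2", 0.15, priority=1)]
    print(pick_vm_to_throttle(server_power_w=310.0, cap_w=280.0, vms=samples))
```

The real system works from Hyper-V performance counters and the machine-level metering in Windows Server 2008 R2, but the selection logic above captures the demo's point: when a power cap is requested, throttle the VMs that matter least so high-priority workloads like video playback keep their performance.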

Microsoft Research is lucky to find someone who is influential in the industry and has spent three years in Microsoft's own data center operations.  There are few who could make the jump from data center operations to Microsoft Research, and I am sure Christian will constantly be working on knowledge transfers across the teams.

For the rest of the industry, I’ve got a feeling we are going to see even more of Christian’s ideas out there now that he is in Microsoft Research.

This is exciting in itself, but what really gets me “charged” is the opportunity I have now to work between my former group Global Foundation Services (that drives the current Microsoft cloud infrastructure) and my new group Microsoft Research (MSR). Taking the best practices from what we have learned with our current and future Gen 4 data centers and combining them with the resources of one of the best research organizations in the world (MSR), I am convinced that many new and exciting things will come. And best of all, I am lucky to be right smack in the middle of it and will still be working closely with the teams driving the hardware architecture for the cloud today and in the future. So actually, I am really not leaving GFS but rather extending the reach of GFS into the future. Who can ask for a better opportunity….man I love this company!

Open Source Data Center Initiative Story by Mike Manos

I wrote a post announcing GreenM3 partnering with the University of Missouri and ARG Investments, with Mike Manos as an industry advisor.  I spent a few paragraphs explaining the use of an Open Source Software model applied to data centers.

Mike Manos took the time to write his own post in response to mine, and it is a well-written story that explains why we are using this approach.

My first reaction was to cut and paste relevant parts and add comments, but the whole story makes sense. So for a change, I am going to copy his whole post below to make sure we have it in two places.

Open Source Data Center Initiative

March 3, 2010 by mmanos

There are many in the data center industry that have repeatedly called for change in this community of ours.  Change in technology, change in priorities, change for the future.  Over the years we have seen those changes come very slowly, and while they are starting to move a little faster now (primarily due to the economic conditions and scrutiny over budgets more so than a desire to evolve our space), our industry still faces challenges and resistance to forward progress.   There are lots of great ideas, lots of forward thinking, but moving this work to execution and educating business leaders as well as data center professionals to break away from those old stand-by accepted norms has not gone well.

That is why I am extremely happy to announce my involvement with the University of Missouri in the launch of a Not-For-Profit, data center specific organization.   You might have read the formal announcement by Dave Ohara, who launched the news via his industry website, GreenM3.   Dave is another of those industry insiders who has long been perplexed by the lack of movement and initiative we have had on some great ideas and stand-outs doing great work.  More importantly, it doesn’t stop there.  We have been able to put together quite a team of industry heavy-weights to get involved in this effort.  Those announcements are forthcoming, and when they come, I think you will get a sense of the type of sea-change this effort could potentially have.

One of the largest challenges we have with regards to data centers is education.   Those of you who follow my blog know that I believe that some engineering and construction firms are incented not to change or implement new approaches.  The cover of complexity allows customers to remain in the dark while innovation is stifled. Those forces who desire to maintain an aura of black box complexity around this space and repeatedly speak to the arcane arts of building out data center facilities have been at this a long time.  To them, the interplay of systems requiring one-off monumental temples to technology on every single build is the norm.  It's how you maximize profit, and keep yourself in a profitable position.

When I discussed this idea briefly with a close industry friend, his first question naturally revolved around how this work would compete with that of the Green Grid, or Uptime Institute, Data Center Pulse, or the other competing industry groups.  Essentially, was this going to be yet another competing thought-leadership organization?  The very specific answer to this is no, absolutely not.  

These groups have been out espousing best practices for years.  They have embraced different technologies, they have tried to educate the industry, they have been pushing for change (for the most part).  They do a great job of highlighting the challenges we face, but for the most part have waited around for universal good will and monetary pressures to make them happen.  It dawned on us that there was another way.   You need to ensure that you build something that gains mindshare, that gets the business leadership attention, that causes a paradigm shift.   As we put the pieces together we realized that the solution had to be credible, technical, and above all have a business case around it.   It seemed to us the parallels to the Open Source movement and the applicability of the approach were a perfect match.

To be clear, this Open Source Data Center Initiative is focused around execution.   It's focused around putting together an open and free engineering framework upon which data center designs, technologies, and the like can be quickly put together and, moreover, standardizing the approaches that both end-users and engineering firms take to the data center industry.

Imagine, if you will, a base framework upon which engineering firms, or even individual engineers, can propose technologies and designs, and specific solution vendors can pitch technologies for inclusion and highlight their effectiveness.  More than all of that, it will remove much of the mystery behind the work that happens in designing facilities and normalize conversations.   

If you think of the Linux movement, and all of those who actively participate in submitting enhancements, features, even pulling together specific build packages for distribution, one could even see such things emerging in the data center engineering realm.   In fact, with the myriad of emerging technologies assisting in more energy efficiency, greater densities, differences in approach to economization (air or water), and use or non-use of containers, it's easy to see the potential for this component-based design. 

One might think that we are effectively trying to put formal engineering firms out of business with this kind of work.  I would argue that this is definitely not the case.  While it may have the effect of removing some of the extra profit that results from the current ‘complexity’ factor, this initiative should specifically drive common requirements, lead to better educated customers, drive specific standards, and result in real-world testing and data from the manufacturing community.  Plus, as anyone knows who has ever actually built a data center, the devil is in the localization and details.  And as this is an open-source initiative, we will not be formally signing the drawings from a professional engineering perspective.

Manufacturers could submit their technologies, sample applications of their solutions, and have those designs plugged into a ‘package’ or ‘RPM’, if I could steal a term from the Red Hat Linux nomenclature.  Moreover, we will be able to start driving true visibility of costs, both upfront and operating, and associate those costs with the set designs, with differences and trending from regions around the world.  If it's successful, it could be a very good thing. 

We are not naive about this however.  We certainly expect there to be some resistance to this approach out there and in fact some outright negativity from those firms that make the most of the black box complexity components.

We will have more information on the approach and what it is we are trying to accomplish very soon. 

\Mm
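
Mike's "package" or "RPM" analogy is worth making concrete. Below is a purely hypothetical sketch of what a data center design package manifest might look like in such a framework, written in Python; every field name and number is an assumption for illustration, not part of any announced specification.

```python
# A hypothetical sketch of a data center design "package" manifest for the
# kind of framework described above. Every field name and number here is an
# assumption for illustration, not part of any announced specification.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DesignPackage:
    name: str
    version: str
    contributor: str                       # engineering firm, vendor, or individual
    cooling_approach: str                  # e.g. "air-side economization"
    target_density_kw_per_rack: float
    upfront_cost_usd_per_kw: float         # capital cost visibility
    operating_cost_usd_per_kw_year: float  # operating cost visibility
    regions_validated: List[str] = field(default_factory=list)
    depends_on: List[str] = field(default_factory=list)  # other packages

    def total_cost_per_kw(self, years: int) -> float:
        """Rough lifetime cost per kW over the given number of years."""
        return self.upfront_cost_usd_per_kw + years * self.operating_cost_usd_per_kw_year


# Example: compare two made-up packages over a ten-year horizon.
air = DesignPackage("air-econo-modular", "1.0", "example-firm",
                    "air-side economization", 8.0, 7000.0, 550.0,
                    regions_validated=["US-Midwest"])
water = DesignPackage("water-econo-container", "0.9", "example-vendor",
                      "water-side economization", 20.0, 9500.0, 400.0)
for pkg in (air, water):
    print(pkg.name, round(pkg.total_cost_per_kw(years=10)))
```

The point of a structure like this is exactly the cost visibility Mike describes: once upfront and operating costs travel with the design, comparing packages across regions becomes a straightforward query rather than a one-off consulting engagement.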

GreenM3 partners with University of Missouri, 400 MW Data Center Site with Mike Manos on advisor’s list

This is the beginning of a change in what I write about on this blog.  I’ll keep commenting on various things in the data center industry that help you go green.  But, it was becoming clear that there was more I could do.

On March 1, 2010, the University of Missouri signed a Statement of Support for GreenM3, Enginuity, and ARG Investments. The document is here: Download University of Missouri Statement of Support.

The following are excerpts from this six-page document that describe the partnership and the role of GreenM3.

In the early discussions with the University of Missouri, there was a clear role for an NPO, a Not-For-Profit Organization, and one of the suggestions that stuck was: why don't we turn GreenM3 into an NPO? So, thanks to some volunteers, we are in the process of getting 501(c)(3) status for GreenM3.

WHEREAS, the Greentech Research Foundation, Inc (GreenM3), Non-Profit Organization (NPO) is established as an independent and objective source that will contribute to the step change advancement of data center and ancillary services operations that strives to provide a platform for stable, secure, efficient and sustainable state-of-the-art operations that can be replicated world-wide, accomplished through public/private investment and;

The partnerships started from the idea that data center innovation requires public-private partnerships.

WHEREAS, the disruptive and transformative nature of computer driven communications and commerce is little understood and creates an environment of extreme risk and opportunity, the harnessing, of which, can only be accomplished through applied research from a consortium of successful market operators and research specialists; there is a common need shared within the industry for a clearing-house or central hub of research needed to advance data center operation

To be innovative we needed a different model of operation, one where ideas can easily develop and be evaluated.  The Open Source Software model made sense given the data center focus.

Open source describes practices in production and development that promote access to the end product's source materials—typically, their source code.[1] Some consider open source a philosophy, others consider it a pragmatic methodology. Before the term open source became widely adopted, developers and producers used a variety of phrases to describe the concept; open source gained hold with the rise of a public, worldwide, computer-network system called the Internet, and the attendant need for massive retooling of the computing source code. Opening the source code enabled a self-enhancing diversity of production models, communication paths, and interactive communities. [2] Subsequently, a new, three-word phrase "open source software" was born to describe the environment that the new copyright, licensing, domain, and consumer issues created.

We accept the fact that there are multiple agendas and embrace the idea of driving different designs versus a centralized approach.

The open source model includes the concept of concurrent yet different agendas and differing approaches in production, in contrast with more centralized models of development such as those typically used in commercial software companies. [3] A main principle and practice of open-source software development is peer production by bartering and collaboration, with the end-product (and source-material) available at no cost to the public.

As a demonstration, ARG Investments is a private company ready to implement ideas from the GreenM3 team at Ewing Industrial Park.  Individuals who want to accelerate changes in the data center industry are the ideal members, and one of the first industry advisors for GreenM3’s new role is Mike Manos.  Mike and I have had numerous conversations, and we can now work together to implement some disruptive data center ideas.  There are another 5 to 10 individuals lined up, going through the various approvals to be an advisor to the NPO, and I’ll blog about each as they formalize their commitment.

One of the main purposes of the GreenM3 blog is to share ideas for a greener data center. With University of Missouri resources and a data center site to build on, there is now a place to put those ideas into practice.

Facilitating Data Center Advancement:

With not-for-profit partner, GreenM3, a nonprofit organization, the team is dedicated to developing and sharing best practices for total sustainability, to reduce the carbon impact and water use in power generation, building, data center operations, and education. Collaboration with members of the Industry Advisory Council, led by Michael J. Manos of Nokia, will be facilitated by GreenM3 and Enginuity Worldwide LLC, an innovation-based business.

To test the ideas, there is the partnership with Ewing Industrial Park and ARG Investments.

WHEREAS, ARG Investments LLC, a Missouri based company, (ARG) desires to build a technology campus and innovation-led development at Ewing Business Park, Columbia, Missouri as well as foster improvement in the global data center and cloud-computing space through its not-for-profit partner Greentech Research Foundation, Inc (GreenM3), including the following actions and initiatives:

Implementing the Best and Most Compelling Innovation: Data Centers can be at the nexus of cloud computing, mobile devices, and renewable energy with the right team of people. Based on business model frameworks, and by using an operating business park as the vessel for technology implementation, the technology improvements are more likely to uncover step-change advances and facilitate industry adoption.

And what do we get out of the University of Missouri?  Executive support from the University.

I. SINGLE POINT OF CONTACT AND CORPORATE OFFICE

University of Missouri will make available the Vice Provost of Economic Development, or his designee, as single point of contact for ARG & GreenM3 through its Office of Economic Development and can provide meeting space for ARG & GreenM3 with on-campus contacts.

II. UNIVERSITY OF MISSOURI TEAM

University of Missouri can establish a high-level administrative response team to assure responsiveness and delivery of requested services and programs.

The team can consist of the appropriate administrators or their designees from the list below:

Chancellor

Provost

Vice Provost for Economic Development

Vice Chancellor for Research

Vice Chancellor for Administrative Services

Vice Provost for Undergraduate Studies

Vice Provost and Director of Cooperative Extension

Dean from various MU Colleges and Schools

Key Center Directors

Other Administrative or Technical Staff as needed

This team can meet periodically with ARG & GreenM3 administrators to further the success of the relationship and provide a continuous communication network.

Educational resources and centers of innovation are of course included.

This is the beginning of exciting changes for GreenM3, and part of what this blog will do is share a new way to partner with a data center developer (ARG Investments) and a university (Mizzou) to green the data center.

Al Gore is a Meme for Environmentally sensitive business, results at IBM and Apple events

I have modified one of the “M”s in GreenM3 to represent Memetics.

Memetics purports to be an approach to evolutionary models of cultural information transfer.

Al Gore acts as a Meme.

A meme (pronounced /ˈmiːm/, rhyming with "cream"[1]) is a postulated unit of cultural ideas, symbols or practices, which can be transmitted from one mind to another through writing, speech, gestures, rituals or other imitable phenomena.

Al Gore represents environmentally sensitive business, and he presented on Feb 24, 2010 at the IBM Pulse 2010 conference.

Below is a video from the beginning of his entertaining presentation, which I sat in on.  Al Gore was a good salesman for the concepts of IBM’s Smarter Planet, and the audience said good things about his presentation.

Two days later in Cupertino, Al Gore attends the Apple shareholder meeting, and his meme follows him: environmentally sensitive business.  But does Steve Jobs want environmental issues brought up in his shareholders’ meeting?

February 25, 2010 12:40 PM PST

Al Gore a lightning rod at Apple shareholder meeting

by Erica Ogg

CUPERTINO, Calif.--The presence of one of the world's pre-eminent environmentalists at Apple's shareholder meeting Thursday was the subject of much of the morning's pointed discussion.

As expected, Apple's attitude on environmental and sustainability issues was one of the main concerns of the stockholders present Thursday, followed closely by the company's immense pile of cash. But early harsh comments about former Vice President Al Gore's record set the tone.

Gore was seated in the first row, along with his six fellow board members, in Apple's Town Hall auditorium as several stockholders took turns either bashing or praising his high-profile views on climate change.

At the first opportunity for audience participation just several minutes into the proceeding, a longtime and well-known Apple shareholder--some would say gadfly--who introduced himself as Sheldon, stood at the microphone and urged against Gore's re-election to the board. Gore "has become a laughingstock. The glaciers have not melted," Sheldon said, referring to Gore's views on global warming. "If his advice he gives to Apple is as faulty as his views on the environment then he doesn't need to be re-elected."

Another shareholder immediately got up to defend Gore and endorse his presence as an Apple director. And that wasn't the end of it. Two different proposals from shareholders were presented in regard to Apple's environmental impact. One was from the nonprofit As You Sow, which for the second straight year asked Apple to publicly commit to specific greenhouse gas reduction goals and publish a formal sustainability report; the second came from Herrington Investments, which proposed that Apple's board establish a sustainability committee, just like a compensation or personnel committee.

As You Sow's representative, Conrad MacKerron, praised Gore, but also challenged him on not doing more to encourage the company to set specific public commitments. Forest Hill, Herrington Investment's senior portfolio manager, also addressed some of his comments directly to Gore, saying making board members responsible for Apple's environmental impact "would make Apple a corporate leader."

This was not a serious enough issue to jeopardize Al Gore’s Board position.

Despite his apparently polarizing nature, Gore was re-elected with the rest of the slate in preliminary results.

BTW, at IBM’s Tivoli event every IBM employee had a Lenovo ThinkPad except the creative designers, who had Macs.  I know a few Apple employees would grin knowing even the creatives at IBM choose Apple computers.

Is Mike Manos a data center celebrity? :-)

After another day at IBM’s Pulse 2010 conference, which is a pretty serious event discussing service management, I ran across this article in the NYTimes, and it made me laugh.

Mike Manos comes as close as you can get to a data-center celebrity.

Now, there is another Michael Manos who was infamous as a con artist pretending to be a celebrity, and who was recently arrested.

The Dallas Morning News reports that the man, whose real name is Michael Manos, was arrested last week “on a parole violation warrant out of New York.” He has “an extensive rap sheet stretching back to 1979 for convictions ranging from kidnapping to robbery,” and was released from prison in 2004, the DMN reported.

He kept his past at bay for years. He strolled red carpets in New York saying he was “a rich promoter seeking to do a reality TV show” and spent a good deal of time in Dallas before heading to San Francisco. In city after city, the article said, he raised money for charity then allegedly left “a wake of unpaid bills behind him.”

Jane Fonda with a man who was billing himself as a wealthy philanthropist at an October 2007 G-CAPP fundraiser in Atlanta.

The con artist celebrity schmoozer has no resemblance to the data center Mike Manos.

Mike Manos

And here is a picture of Mike in Quincy.

Is Mike Manos a data center celebrity?  We can all debate this over a beer, but the NYTimes author thought it was a useful description.

FYI, our Mike Manos is the son of a policeman.

Lessons from Dad

January 23, 2010 by mmanos

Last week my father retired from the Chicago Police Department.   After 38 years on the force (34 of which he spent as a Homicide Detective in the worst areas of Chicago) he finally called it quits.   It's an event I truly thought I would never see.  My dad has had quite a colorful and storied career, not just with the Police Department but with the various other jobs he held over the years as well. It includes everything from security jobs, to being a member of the US Marshals Fugitive Apprehension Unit, to being a regular bodyguard to Frank Sinatra.  Even retired, he has started his own security and investigations company.  Some things never change I guess.
