Technical Peek behind the IBM and Johnson Controls Partnership for Smarter Buildings

IBM released a press release today on its partnership with Johnson Controls for Smarter Buildings.  The partnership extends an effort started in 2007 in data centers to buildings of any kind.

IBM and Johnson Controls Join Forces to Make Buildings Smarter

Combined Offering to Enhance Energy and Operational Efficiencies

LAS VEGAS, - 22 Feb 2010: IBM (NYSE: IBM) and Johnson Controls (NYSE: JCI), today announced a new relationship to create a new era of smarter buildings.  Together, the companies will team to provide a Smart Building Solution that can improve operations and reduce energy and water consumption in buildings worldwide.

Building on an existing relationship formed between the two organizations in 2007 to create energy efficient datacenters, this new offering benefits any building or portfolio of buildings.  Johnson Controls will combine its global leadership in energy efficiency and sustainable services and technologies with IBM's global leadership in software, hardware and services.  The result will help clients address the growing pressure they face to improve energy and asset management performance across their enterprises.

I was in the press room when Al Zollar and Rich Lechner made the announcement.  I wasn’t that excited, as I didn’t have enough details to write a blog entry.  But it is interesting how some of the press reported it.

GreenTechMedia has this.

IBM and Johnson Controls Team Up: Bad News for Building Start-Ups?

The giant of computing and the giant of building management are pals.

It's one step closer to the merger of building systems and IT systems. But it could be scary for start-ups and others promoting building management software.

Steve Evans from Computer Business Review reports.

Rich Lechner, VP, energy and environment for IBM, said that more intelligent building management systems are vital to future growth as well as to environmental benefits.

"Smart buildings are the cornerstone of our strategy to deliver a smarter infrastructure. They are critical to the long-term environmental and economic sustainability of cities around the world," he said. "This is not just about reducing waste, it's about reducing greenhouse gas emissions and enabling the infrastructure to support the dramatic growth in urbanisation that we're seeing."

This may seem scary for companies like Hara Software.

I had a chance to wander the show floor and ask more questions of the Johnson Controls group.  A little Google searching turned up something that didn’t get reported in the IBM announcement.  The Johnson Controls solution is built on their October 2008 acquisition of GridLogix.

Gridlogix was acquired by Johnson Controls (NYSE: JCI) effective on October 16. We are now part of the largest global provider of integrated products, systems and services for commercial, industrial and residential buildings. With over 500 locations in more than 125 countries, Johnson Controls creates quality indoor environments that are energy efficient, safe and comfortable. To learn more about Johnson Controls, please visit www.johnsoncontrols.com.

As one of the Johnson Controls staff explained, they are working to abstract all the building management systems with their offering, enabling fault detection in addition to data feeds.
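A rough sketch of what that abstraction might look like (my illustration in Python, not actual Johnson Controls or GridLogix code; the adapter class, point names, and setpoints are all made up): a common point-reading interface over heterogeneous protocols, with a simple fault-detection rule layered on top.

```python
from abc import ABC, abstractmethod

class BMSAdapter(ABC):
    """Common interface over heterogeneous building management protocols."""
    @abstractmethod
    def read_point(self, point_name):
        """Return the current value of a named sensor point."""

class FakeBACnetAdapter(BMSAdapter):
    """Stand-in for a real protocol driver; serves canned sensor values."""
    def __init__(self, values):
        self.values = values
    def read_point(self, point_name):
        return self.values[point_name]

def detect_setpoint_fault(adapter, point, setpoint, tolerance):
    """Flag a fault when a point drifts outside tolerance of its setpoint."""
    value = adapter.read_point(point)
    return abs(value - setpoint) > tolerance

# Hypothetical air-handling unit reporting supply air temperature.
ahu = FakeBACnetAdapter({"supply_air_temp_c": 19.5})
```

The point of the adapter layer is that the fault-detection logic never has to know which protocol the sensor value came from.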

Anyone who wants to work with Johnson Controls' building management solution can gain much of what IBM has from a technical standpoint.  Note this commitment from Johnson Controls to open standards.

Systems Integration


Whether you need open systems protocol integration, proprietary building automation integration or a custom integration solution, Johnson Controls has the technological depth and facilities management expertise to meet your business needs.

Open Systems Standards

The Metasys® building management system was designed from the ground up with open standards of communications in mind. Whether your needs demand BACnet, BACnet MSTP, LonWorks, Modbus, N2, XML, Web Services or OPC, Johnson Controls can provide you with a complete integrated solution.

Proprietary Standards
The Metasys platform provides a wide array of paths to integrate proprietary protocols and systems from most major manufacturers' equipment, including Honeywell, Siemens, Andover, CSI, Invensys, Trane, and others.
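To make concrete what protocol integration like the Modbus support above means at the wire level, here is a minimal sketch (my own illustration in Python, with arbitrary example register addresses, not Johnson Controls code) that builds a Modbus TCP "Read Holding Registers" request frame.

```python
import struct

def modbus_read_request(transaction_id, unit_id, start_addr, count):
    """Build a Modbus TCP 'Read Holding Registers' (function 0x03) request.

    MBAP header: transaction id, protocol id (0 for Modbus), remaining
    byte count, unit id; followed by the 5-byte PDU.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, count)   # func, addr, quantity
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

# Example: read 3 registers starting at 0x006B from unit 17.
frame = modbus_read_request(transaction_id=1, unit_id=17,
                            start_addr=0x006B, count=3)
```

A building integration platform does this framing (and its BACnet, LonWorks, and OPC equivalents) behind a uniform point interface, which is why one system can talk to many vendors' equipment.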

Some numbers quoted today by IBM sound just like GridLogix results from their 2008 website.

Information Management for Sustainability
Gridlogix provides your organization with the tools for sustaining your enterprise. More than going "Green", Gridlogix helps you continuously cut wasteful costs, prolong the life of your facilities’ equipment, and maintain a comfort level throughout your enterprise. With Gridlogix's Automated Enterprise Management solution, Gridlogix empowers anyone in your organization with the real time data that allows your organization to improve the efficiency of your facilities, typically reducing energy and maintenance costs by 10-20% with a payback of less than 18 months. Gridlogix delivers the best form of Green Energy, conservation.

Here is a video of GridLogix system.

Read more

Google shares production data center for compute clusters

Google Research has a post reaching out to the academic community.

Google Cluster Data

Thursday, January 07, 2010 at 08:11:00 AM

Posted by Joseph L. Hellerstein, Manager of Google Performance Analytics
Google faces a large number of technical challenges in the evolution of its applications and infrastructure. In particular, as we increase the size of our compute clusters and scale the work that they process, many issues arise in how to schedule the diversity of work that runs on Google systems.

The areas of interest for Google are:

We have distilled these challenges into the following research topics that we feel are interesting to the academic community and important to Google:

  • Workload characterizations: How can we characterize Google workloads in a way that readily generates synthetic work that is representative of production workloads so that we can run stand alone benchmarks?
  • Predictive models of workload characteristics: What is normal and what is abnormal workload? Are there "signals" that can indicate problems in a time-frame that is possible for automated and/or manual responses?
  • New algorithms for machine assignment: How can we assign tasks to machines so that we make best use of machine resources, avoid excess resource contention on machines, and manage power efficiently?
  • Scalable management of cell work: How should we design the future cell management system to efficiently visualize work in cells, to aid in problem determination, and to provide automation of management tasks?

The Google Cluster data is here.

This project is intended for the distribution of data of production workloads running on Google clusters.

The first dataset (data-1), provides traces over a 7 hour period. The workload consists of a set of tasks, where each task runs on a single machine. Tasks consume memory and one or more cores (in fractional units). Each task belongs to a single job; a job may have multiple tasks (e.g., mappers and reducers).

The data have been anonymized in several ways: there are no task or job names, just numeric identifiers; timestamps are relative to the start of data collection; the consumption of CPU and memory is obscured using a linear transformation. However, even with these transformations of the data, researchers will be able to do workload characterizations (up to a linear transformation of the true workload) and workload generation.

The data are structured as blank separated columns. Each row reports on the execution of a single task during a five minute period.

Time (int) - time in seconds since the start of data collection

JobID (int) - Unique identifier of the job to which this task belongs

TaskID (int) - Unique identifier of the executing task

Job Type (0, 1, 2, 3) - class of job (a categorization of work)

Normalized Task Cores (float) - normalized value of the average number of cores used by the task

Normalized Task Memory (float) - normalized value of the average memory consumed by the task

Please let us know about issues you have with the data.
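Given the published column order, a parser for the trace is a few lines of Python. The sample rows below are invented; only the format follows the description above.

```python
from collections import namedtuple, defaultdict

TaskSample = namedtuple(
    "TaskSample", "time job_id task_id job_type norm_cores norm_memory")

def parse_trace_line(line):
    """Parse one blank-separated row of the Google cluster trace (data-1)."""
    t, job, task, jtype, cores, mem = line.split()
    return TaskSample(int(t), int(job), int(task), int(jtype),
                      float(cores), float(mem))

def cores_per_job(lines):
    """Sum normalized core usage per job across all five-minute samples."""
    totals = defaultdict(float)
    for line in lines:
        s = parse_trace_line(line)
        totals[s.job_id] += s.norm_cores
    return dict(totals)

# Two invented rows in the published column order:
sample = ["0 101 1 2 0.25 0.10", "300 101 2 2 0.50 0.20"]
```

Because the CPU and memory values are only obscured by a linear transformation, aggregates like these still support workload characterization up to that same transformation.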

So far there have been 230 downloads.

Filename: google-cluster-data-1.csv.gz
Summary: 7+ hours of workload traces from a Google production cluster
Uploaded: Dec 18
Size: 29.8 MB
Download count: 230

Read more

Why Cloud Computing motivates green data center behavior

I read a post on Cloud Computing made ridiculously easy.

Making Cloud Computing Ridiculously Easy

One small click for man, one giant cloud for mankind!

BY CHRISTOPHER KEENE

With all the hullabaloo about cloud computing, it is easy to get caught up in the trend of the day and miss the big picture. The big picture is that cloud computing disrupts the data center world by slashing the capital and skills required to deploy a web application.


If that is the big prize, then most of what passes for news in cloud computing is more along the lines of "me speak cloud too."

This ease of use and the business/economic model are driving the growth of Cloud Computing.

Amazon Web Services Economics Center, comparing AWS/cloud computing vs co-location vs owned data center

Amazon Web Services has a post on the Economics of AWS.

The Economics of AWS

For the past several years, many people have claimed that cloud computing can reduce a company's costs, improve cash flow, reduce risks, and maximize revenue opportunities. Until now, prospective customers have had to do a lot of leg work to compare the costs of a flexible solution based on cloud computing to a more traditional static model. Doing a genuine "apples to apples" comparison turns out to be complex — it is easy to neglect internal costs which are hidden away as "overhead".
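To make the "apples to apples" point concrete, here is a toy comparison sketch. Every number below is invented for illustration; the point is that the owned-hardware side needs an explicit markup for the hidden internal overhead the post mentions.

```python
def monthly_cloud_cost(instances, hourly_rate, hours_per_month=730):
    """On-demand cloud: pay only for the instance-hours actually used."""
    return instances * hourly_rate * hours_per_month

def monthly_owned_cost(servers, capex_per_server, amortization_months,
                       opex_per_server, overhead_fraction):
    """Owned/colo: amortized capital plus per-server operating cost,
    marked up by the 'hidden' internal overhead (staff, networking,
    spares) that is easy to neglect."""
    base = servers * (capex_per_server / amortization_months + opex_per_server)
    return base * (1.0 + overhead_fraction)

# All figures invented for illustration only.
cloud = monthly_cloud_cost(instances=10, hourly_rate=0.34)
owned = monthly_owned_cost(servers=10, capex_per_server=3000,
                           amortization_months=36, opex_per_server=150,
                           overhead_fraction=0.30)
```

Under these assumed inputs the cloud side comes out cheaper, but flip the overhead fraction or utilization assumptions and the answer can reverse, which is exactly why the comparison is hard to do honestly.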

After watching multiple presentations and efforts to get people to measure their energy consumption in the data center, I am ready to throw in the towel on changing human behavior in this area.  Not to say energy monitoring shouldn’t be done, but moving beyond the current user base is difficult.

Here is an example to think about: when consumers get their bills, how much attention do they spend on their credit card bill vs. their electricity bill? 10 to 1?  20 to 1?  Maybe 100 to 1.  It is ingrained in human behavior to look at the money more than the electricity.

Cloud Computing infrastructure is getting easier too, even though his article makes it seem difficult.

Today, cloud development and deployment is still the exclusive domain of highly paid web experts and just as highly paid hosting providers and systems administrators. As much as cloud providers like Amazon and Rackspace have done to simplify web hosting and eliminate people from the equation, it still takes far too much expertise and effort to get applications built and deployed in the cloud.


The goal of cloud computing is to make web development and deployment something that any bum can do and charge it on their credit card with nary a care in the world.

In fact, I think it is easier to get people to discuss cloud computing infrastructure than energy monitoring infrastructure. 

Eucalyptus provides AWS compatible infrastructure.

Eucalyptus turns data center resources such as machines, networks, and storage systems into a cloud that is controlled and customized by local IT. Eucalyptus is the only cloud architecture to support the same application programming interfaces (APIs) as public clouds, and today Eucalyptus is fully compatible with the Amazon Web Services cloud infrastructure.
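"Same APIs" means the same client code can target either endpoint. As a sketch, here is the AWS Query API Signature Version 2 computation in stdlib Python; the Eucalyptus hostname below is hypothetical, though /services/Eucalyptus was the historical default service path.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_query_request(secret_key, host, path, params):
    """AWS Query API Signature Version 2: the identical signing code
    works whether the endpoint is EC2 or a Eucalyptus cloud exposing
    the same API."""
    params = dict(params, SignatureMethod="HmacSHA256", SignatureVersion="2")
    canonical = "&".join(
        f"{quote(k, safe='-_.~')}={quote(v, safe='-_.~')}"
        for k, v in sorted(params.items()))
    to_sign = "\n".join(["GET", host.lower(), path, canonical])
    digest = hmac.new(secret_key.encode(), to_sign.encode(),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()
```

Only the host and path change between providers; the request construction, parameter names, and signing logic stay the same, which is the practical meaning of API compatibility.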

If you want Google App Engine cloud computing compatibility, then there is AppScale.

AppScale is an open-source implementation of the Google AppEngine (GAE) cloud computing interface from the RACELab at UC Santa Barbara. AppScale enables execution of GAE applications on virtualized cluster systems. In particular, AppScale enables users to execute GAE applications using their own clusters with greater scalability and reliability than the GAE SDK provides. Moreover, AppScale executes automatically and transparently over cloud infrastructures such as the Amazon Web Services (AWS) Elastic Compute Cloud (EC2) and Eucalyptus, the open-source implementation of the AWS interfaces.

If you are going to manage the cloud, there is RightScale.

The RightScale Cloud Management Platform

Overview

RightScale is the leading provider of cloud management solutions that enable you to design, deploy, manage, and automate business-critical applications on the cloud. To date, hundreds of thousands of deployments have been launched on the RightScale Cloud Management Platform – running everything from scalable websites to complex grid applications. Cloud computing represents a tidal shift in the way IT infrastructure operates, enabling greater agility and lower costs across company sizes. RightScale delivers the power of the cloud to every business.

I’ve already blogged about Elastra’s management tools.

  1. Is Elastra one of Amazons Cloud Computing infrastructure tools? An ...

    Jan 19, 2010 ... I plan on having a meeting with Elastra next week when I am in the bay area. I wrote about their tools last week. Elastra's Cloud Computing ...
    www.greenm3.com/.../is-elastra-one-of-amazons-cloud-computing-infrastructure-tools-an-awesome-pdf-to-understand-a-better-approach-to-...

  2. Elastras Cloud Computing Application Infrastructure = Green IT ...

    Jan 14, 2010 ... Elastra connects the power use in the data center to the application architects and deployment decision makers. Plan Composer function lets ...
    www.greenm3.com/.../elastras-cloud-computing-application-infrastructure-green-it-with-a-model-approach.html

And in fact Elastra can be used as a power metrics tool in cloud computing.

Plan Composer function lets customers set their own policies based on application needs and specific power metrics (such as wattage, PUE, number of cores, etc.). Therefore, if an application requires 4GB of RAM and two cores for optimal performance, and if the customer is concerned with straight wattage, Elastra’s product will automatically route it to the lowest-power 4GB, dual-core virtual machine available.
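The described policy is essentially constrained minimization over a VM catalog. A minimal sketch of the idea (my illustration with a made-up catalog, not Elastra's actual code):

```python
def place(vms, ram_gb, cores):
    """Pick the lowest-wattage VM type that meets the RAM and core needs,
    mirroring the power-aware placement policy described above."""
    candidates = [v for v in vms
                  if v["ram_gb"] >= ram_gb and v["cores"] >= cores]
    return min(candidates, key=lambda v: v["watts"]) if candidates else None

# Hypothetical catalog of available VM types and their power draw.
catalog = [
    {"name": "a", "ram_gb": 4, "cores": 2, "watts": 180},
    {"name": "b", "ram_gb": 8, "cores": 4, "watts": 240},
    {"name": "c", "ram_gb": 4, "cores": 2, "watts": 150},
]
```

For the 4GB, dual-core example in the quote, the policy would select type "c" here: both "a" and "c" satisfy the constraints, and "c" draws less power.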

So, I think it will be easier to create greener data centers riding the momentum for cloud computing deployments than educating the masses on the benefits of energy monitoring in the data center.

Keep in mind the goal of green/energy metrics is to change behavior, not to sell energy monitoring solutions.

Read more

Energy Management shows up as a Vancouver Olympic event

CNET news has a post on the Vancouver Olympics reporting on the energy consumption from Olympic buildings.

A new Olympic sport: Tracking building energy

by Martin LaMonica

While people are watching the Olympic Games, building managers will be watching their energy dashboards.

Energy management software company Pulse Energy on Friday showed off an energy-monitoring system developed to make the Olympic Games in Vancouver more efficient.

Building energy is the latest spectator sport in Vancouver.

(Credit: Screenshot by Martin LaMonica/CNET)

The software gives facilities managers a real-time readout of energy consumption at different venues. By tracking that data, building managers can make adjustments to save energy, such as turning off equipment that's not in use. The information is also available online at VenueEnergyTracker.com.

BC Hydro has been trying to attract data centers to the BC area.

Here is more information about VenueEnergyTracker.

about the venue energy tracker

BC Hydro and the Vancouver Organizing Committee of the 2010 Olympic and Paralympic Winter Games (VANOC) created the Venue Energy Tracker to showcase the innovative sustainability measures implemented in various 2010 Winter Games venues and associated sites through energy management software.

challenge

The communities involved were challenged with producing world class facilities with a minimal environmental footprint, while maximizing the long term legacies for their residents. The Venue Energy Tracker communicates the energy consumption and savings being realized by various partner venue buildings.

actions

Employing the latest in energy management software technology, BC Hydro tracks, analyzes and reports on real-time energy consumption from the venue sites in order to measure energy and greenhouse gas savings and set benchmarks against which similar venues can compare themselves.

Learning from past Games, applying best practices in green design, construction, and occupant engagement, the communities were able to mitigate local and global sustainability challenges and embrace opportunities to make a difference. Actions taken include implementing green building features, including but not limited to: implementing energy saving technology, sequestering BC Pine Beetle wood as a construction material, rainwater capture and reuse, waste heat reuse, targeting LEED (Leadership in Energy and Environmental Design) building certification, and incorporating green principles and practices into operations and events.

Read more

Gartner predicts by 2014 carbon impact will be common practice

In Gartner’s predictions beyond 2010, there is a point about carbon reporting being common practice by 2014.

By 2014, most IT business cases will include carbon remediation costs. Today, server virtualization and desktop power management demonstrate substantial savings in energy costs, and those savings can help justify projects. Incorporating carbon costs into business cases provides a further measure of savings, and prepares the organization for increased scrutiny of its carbon impact.

Gartner goes on to predict the vendors will need to provide carbon life cycle costs.

Economic and political pressure to demonstrate responsibility for carbon dioxide emissions will force more businesses to quantify carbon costs in business cases. Vendors will have to provide carbon life cycle statistics for their products or face market share erosion. Incorporating carbon costs in business cases will only slightly accelerate replacement cycles. A reasonable estimate for the cost of carbon in typical IT operations is an incremental one or two percentage points of overall costs. Therefore, carbon accounting will more likely shift market share than market size.
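Gartner's "one or two percentage points" estimate is easy to sanity-check with back-of-the-envelope carbon arithmetic. All numbers below are invented for illustration:

```python
def carbon_cost(annual_kwh, kg_co2_per_kwh, price_per_tonne):
    """Annual cost of the CO2 attributable to an operation's electricity use."""
    tonnes = annual_kwh * kg_co2_per_kwh / 1000.0
    return tonnes * price_per_tonne

# Invented example: 2 GWh/yr, 0.5 kg CO2 per kWh grid mix, $20/tonne carbon.
cost = carbon_cost(2_000_000, 0.5, 20.0)
share = cost / 1_500_000  # fraction of a hypothetical $1.5M annual IT budget
```

Under those assumed inputs the carbon line item lands between 1% and 2% of the budget, in line with Gartner's estimate, which supports their conclusion that carbon accounting shifts market share rather than market size.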

Read more