ISO 50001 and Data Centers

ZDNET Asia has a post on Singapore, Data Centers, and Green IT.

Pro-biz, green incentives give S'pore datacenter edge

The Singapore Green Data Centre Standard is here, with part of the standard built on ISO 50001.

The Green DC Standard helps organisations establish systems and processes necessary to improve the energy efficiency of their DCs. It provides them with a recognised framework as well as a logical and consistent methodology to achieve continuous improvement in their DC facilities. This standard is modelled after the ISO 50001 standard on energy management (currently under development by ISO) but is specifically tailored to meet the needs of DCs in Singapore. The standard adopts the Plan-Do-Check-Act (PDCA) methodology, an iterative, four step problem-solving process used for continuous process improvement. The PDCA cycle forms the basis for many established management standards, which have successfully stimulated substantial, continual efficiency improvements within organisations around the world.

Here is a press announcement on a Taiwan data center being ISO 50001 certified.

TSMC Leads in ISO 50001 Certification for Data Center


Hsinchu, Taiwan, R.O.C. – November 3, 2011 – TSMC today announced that its Fab 12 Phase 4 data center in the Hsinchu Science Park has completed the ISO 50001 Energy Management System certification, becoming Taiwan’s first company to earn this certification for a high density computing data center.

The ISO 50001 Energy Management System was established by the International Standards Organization (ISO) Energy Management Committee (ISO/PC242), and was announced in the second quarter of this year. The Fab 12 Phase 4 data center which completed certification provides data and control systems for factory automation, and supports both manufacturing and R&D. Adoption of the ISO 50001 Energy Management System is expected to reduce the data center’s power consumption by 8%, conserving 2.21 million kilowatt-hours of electricity and eliminating 1,350 tons of carbon emissions per year. In addition to upgrading existing data centers, TSMC also plans to apply ISO 50001 standards to future data centers and implement the most up-to-date energy-saving designs. TSMC estimates that the company can conserve 59.62 million kilowatt-hours of electricity and eliminate 36,490 tons of carbon emissions per year.
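The press release's two sets of figures can be sanity-checked against each other. A quick sketch (the quoted kWh and CO2 numbers come from the release above; the grid emission factor is derived here, not stated by TSMC):

```python
# Quoted figures from the TSMC press release
kwh_saved_fab12 = 2_210_000      # kWh/year saved at the Fab 12 Phase 4 data center
tons_co2_fab12 = 1_350           # tons CO2/year eliminated at Fab 12 Phase 4

kwh_saved_company = 59_620_000   # kWh/year, company-wide estimate
tons_co2_company = 36_490        # tons CO2/year, company-wide estimate

# Implied grid emission factor in kg CO2 per kWh (derived, not stated)
factor_fab12 = tons_co2_fab12 * 1000 / kwh_saved_fab12
factor_company = tons_co2_company * 1000 / kwh_saved_company

print(f"Fab 12 implied factor:       {factor_fab12:.3f} kg CO2/kWh")
print(f"Company-wide implied factor: {factor_company:.3f} kg CO2/kWh")
```

Both work out to roughly 0.61 kg CO2 per kWh, so the two claims are internally consistent.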

The ISO 50001 standard has a video.

ISO 50001 — What is it? ISO 50001:2011, Energy management systems – Requirements with guidance for use, is a voluntary International Standard developed by ISO (International Organization for Standardization). ISO 50001 gives organizations the requirements for energy management systems (EnMS). ISO 50001 provides benefits for organizations large and small, in both public and private sectors, in manufacturing and services, in all regions of the world. ISO 50001 will establish a framework for industrial plants; commercial, institutional, and governmental facilities; and entire organizations to manage energy. Targeting broad applicability across national economic sectors, it is estimated that the standard could influence up to 60% of the world’s energy use.*

Telling Data Center Stories: think about the memories vs. the experience

I was talking to a data center executive yesterday about a project, and he asked me whether I think the project is really interesting.  We then discussed the issue of telling a good story to get people interested.

Check out this video to get some ideas on how memories and experiences are not the same.

The following text starts at 3:51.

Now, the remembering self is a storyteller. And that really starts with a basic response of our memories -- it starts immediately. We don't only tell stories when we set out to tell stories. Our memory tells us stories, that is, what we get to keep from our experiences is a story. And let me begin with one example. This is an old study. Those are actual patients undergoing a painful procedure. I won't go into detail. It's no longer painful these days, but it was painful when this study was run in the 1990s. They were asked to report on their pain every 60 seconds. And here are two patients. Those are their recordings. And you are asked, "Who of them suffered more?" And it's a very easy question. Clearly, Patient B suffered more. His colonoscopy was longer, and every minute of pain that Patient A had Patient B had and more.

But now there is another question: "How much did these patients think they suffered?" And here is a surprise: And the surprise is that Patient A had a much worse memory of the colonoscopy than Patient B. The stories of the colonoscopies were different, and because a very critical part of the story is how it ends -- and neither of these stories is very inspiring or great -- but one of them is distinct ... (Laughter) but one of them is distinctly worse than the other. And the one that is worse was the one where pain was at its peak at the very end. It's a bad story. How do we know that? Because we asked these people after their colonoscopy, and much later, too, "How bad was the whole thing, in total?" and it was much worse for A than for B in memory.

Data shows 4x revenue for location ads with real time bidding

GigaOm has a post that supports one of the points I have been making on where changes will be occurring: real-time ad bidding combined with contextual information.

Mobile advertisers paying 4x more for location-based impressions

Location, location, location: it’s not just for real estate. Mobile advertisers are increasingly prizing location-based ads, according to real-time bidding exchange Nexage, which said that mobile publishers and developers are getting 3.8 times more for eCPMs, or ad impressions, that include location data in the last three months.

Demand for location-based ads is also going up, jumping by 170 percent over the same period. More and more, advertisers are pursuing mobile ads that include location data because they can find users where they are, target specific areas and drive consumers to take actions locally.

As Google expands around the world, they are in a position to support real-time ad bidding and location.  Location may have privacy issues, but it is valuable information and provides a huge context.  Are you at home, at work, at a mall, or on the road? You should get different ads.

“We don’t see any end in sight for demand. As people see the value especially for offer-based advertising and publishers manage their privacy issues, we think it will continue to grow. If publishers do the things we talk about today, it’s not really theory anymore. It’s fact; you will see an uplift in revenue,” Cormier said.

Jonathan Ive provides his perspective on working with Steve Jobs

Check out this video to get an idea of Steve Jobs from one of his most trusted advisors, Jonathan Ive.

Top Comments

  • Jony should be Apple's new spokesman, not Tim Cook, at upcoming keynotes. Jony talks with such passion and conviction. He is the natural successor to Steve on stage.

  • Who'd have thought Jony was such a brilliant public speaker. Fantastic tribute from a great man to another.

Why isn't Calxeda's ARM processor running at 2GHz or more?

Calxeda made a lot of news, with multiple articles covering the announcement.

HP Builds Servers With Cellphone Chips

New York Times (blog)
Hewlett-Packard announced on Tuesday a new design for some of the world's largest computer centers and says it could reduce power consumption in some cases by 90 percent. The design, called Project Moonshot, replaces the conventional...

Calxeda Stretches ARM into the Clouds

Wired News
By Jon Stokes. On Tuesday, Austin-based startup Calxeda launched its EnergyCore ARM system-on-chip (SoC) for cloud servers. At first glance, Calxeda's looks like something you'd find inside a smartphone, but the product is essentially a complete server ...

Calxeda Introduces EnergyCore ARM Processor for Servers

PC Magazine
The EnergyCore can draw as little as 1.5 watts for a dual-core server chip, and can create a full quad-core server, including 4G of DRAM and an SSD, that draws just five watts of power. But what makes the chip stand out, ...

ARM Breaks Into One Of Intel's Strongholds (ARMH, INTC, HPQ)

San Francisco Chronicle
Chip designer ARM has long dominated the market for smartphones and tablets, leaving Intel scrambling to catch up. Now, it's getting chance to break into one of Intel's strongholds -- server hardware. This morning, Hewlett-Packard announced a plan ...

HP Plans Low-Power Servers Using Calxeda ARM Chips

InformationWeek
HP's tiny servers built on Calxeda's energy efficient Cortex chip are designed to handle large Web data streams, video processing, picture uploading, or Hadoop-style big data analysis. By Charles Babcock, InformationWeek. HP on Tuesday launched Project ...

But I don't know if I am as excited as the press is.  Why?  I have been talking to ARM for over three years about the opportunities for ARM servers in data centers, and have been waiting for over a year for Calxeda to make a chip announcement.

So, looking at the specifications, one of the questions I have is: why does the Calxeda processor run at 1.1 to 1.4 GHz and not at 2 GHz?

EnergyCore™ ECX-1000: Technical Specifications

Processor Cores

  • Up to four ARM® Cortex-A9 cores @ 1.1 to 1.4 GHz

Here is the spec for the A9 processor.

Speed Optimized: The speed-optimized hard macro implementation provides system designers with an industry standard ARM processor incorporating aggressive low-power techniques to further extend ARM’s performance leadership into high-margin consumer and enterprise devices within the power envelope necessary for compact, high-density and thermally constrained environments. This hard macro implementation operates in excess of 2GHz when selected from typical silicon and represents an ideal solution for high-margin performance-oriented applications.

In the HP press announcement there is a quote that emphasizes performance needs.  So, why not a 2 GHz clock rate?

“The volume of data processed in financial markets has increased exponentially, and traditional scale-up or scale-out architectures are struggling to keep up with demand without vastly increasing cost and power usage,” said Niall Dalton, director of High-Frequency Trading at Cantor Fitzgerald, a company that is currently evaluating the technology. “HP is taking a holistic approach to solving this problem and working to bring unprecedented energy and cost savings for tomorrow’s large-scale, data-intensive applications.”

Another question I have is: what is the architecture to manage the EnergyCards?  This could be the opportunity for HP.

EnergyCard Reference Designs

EnergyCards are production-ready boards designed by Calxeda to demonstrate the full breadth of capabilities offered by the EnergyCore platform. With this as a building block, system OEMs can leverage Calxeda’s design expertise, allowing them to easily bring hyper-scale solutions to market in a fraction of the time required for ground up custom designs.

...

The HP Redstone Server Development Platform is the first in a line of HP server development platforms that feature extreme low-energy server processors. Initially incorporating Calxeda EnergyCore™ ARM® Cortex™ processors, future Redstone versions will include Intel® Atom™-based processors as well as others. HP Redstone is designed for testing and proof of concept. It incorporates more than 2,800 servers in a single rack, reducing cabling, switching and the need for peripheral devices, and delivering a 97 percent reduction in complexity.(1) The initial HP Redstone platform is expected to be available in limited volumes to select customers in the first half of next year.
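A back-of-envelope check on the density claim: combining the 2,800 servers per rack quoted above with PC Magazine's 5 W per quad-core node figure gives a rough compute power budget per rack. This sketch counts compute nodes only; fans, switching, and power-supply losses are left out, which is my assumption, not something HP stated:

```python
# Rough rack power for HP Redstone, using figures quoted in this post.
# Excludes fans, switching, and PSU losses (an assumption, not from HP).
servers_per_rack = 2_800   # from the HP Redstone announcement
watts_per_server = 5       # quad-core node with 4G DRAM and SSD, per PC Magazine

rack_watts = servers_per_rack * watts_per_server
print(f"Compute power per rack: {rack_watts / 1000:.1f} kW")  # 14.0 kW
```

At roughly 14 kW of compute per rack, the design sits within the power envelope of a conventionally provisioned data center rack, which is what makes the density claim plausible.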