Another example of why I am so glad I have my Verizon 4G MiFi device: Uptime's event network runs at 0.21 Mbps during the keynote

I am sitting in the Uptime Institute Symposium, and I just wrote a blog entry on Facebook's data center design.  The upload took an excruciating 3 minutes or more.  Curious, I ran www.speedtest.net twice and looked at the download and upload speeds.

[image: Speedtest.net results on the event network]

Then I switched to my Verizon 4G MiFi modem.

[image: Speedtest.net results on the Verizon 4G MiFi]

Ahh, now I feel like I can breathe.  Crawling at 210 Kbps may be fine for free, but it is way too painful for blogging at an event.
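For a sense of what that speed means in practice, here is a quick back-of-the-envelope sketch in Python. The 3 MB post size and the 5 Mbps MiFi figure are assumptions for illustration, not measurements from my session.

```python
# Rough upload-time comparison: Uptime's event network vs. a 4G MiFi link.
# The 3 MB payload and the 5 Mbps 4G figure are illustrative assumptions.

def upload_seconds(payload_megabytes: float, link_mbps: float) -> float:
    """Seconds to move a payload over a link, ignoring protocol overhead."""
    payload_megabits = payload_megabytes * 8
    return payload_megabits / link_mbps

post_size_mb = 3.0  # assumed size of a blog entry with a few images

for label, mbps in [("event network", 0.21), ("4G MiFi (assumed)", 5.0)]:
    secs = upload_seconds(post_size_mb, mbps)
    print(f"{label}: {secs / 60:.1f} minutes")
# event network: ~1.9 minutes; 4G MiFi: a few seconds
```

Add real-world protocol overhead and retries on a congested link, and the "3 minutes or more" I saw is easy to believe.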

Ready to write about:

Designing a Cloud-Friendly Data Center
Peter Panfil, Vice President and General Manager, Liebert AC Power
James Kennedy, Director - Data Center Operations and Construction, RagingWire Enterprise Solutions

Facebook's Latest Data Center Design presentation at Uptime

Facebook gave a keynote presentation on its data center design.

Facebook's Latest Innovations in Data Center Design
Paul Hsu, Senior Electrical Engineer, Facebook
Dan Lee, Data Center Mechanical Engineer, Facebook

Below is a side-by-side slide Paul presented on the difference between a typical data center power conversion chain and the Facebook design.

[image: power conversion comparison slide]
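To show why that comparison matters, here is an illustrative sketch of how per-stage losses compound along a power path. The stage efficiencies are rough, assumed values, not the numbers from Paul's slide; the Facebook-style chain simply reflects the publicly described idea of removing the centralized double-conversion UPS and extra transformer steps.

```python
# Illustrative comparison of end-to-end power-path efficiency.
# Stage efficiencies are assumed, round numbers, not figures from the slide;
# the point is how losses multiply with every conversion stage.

from functools import reduce

def chain_efficiency(stages: dict[str, float]) -> float:
    """Multiply per-stage efficiencies to get delivered/utility power."""
    return reduce(lambda acc, eff: acc * eff, stages.values(), 1.0)

typical_chain = {
    "UPS (double conversion)": 0.94,
    "PDU transformer (480V -> 208V)": 0.98,
    "server power supply": 0.90,
}

facebook_style_chain = {
    "480V -> 277V distribution": 0.995,
    "high-efficiency server power supply": 0.945,
}

for name, chain in [("typical", typical_chain), ("Facebook-style", facebook_style_chain)]:
    eff = chain_efficiency(chain)
    print(f"{name}: {eff:.1%} delivered, {1 - eff:.1%} lost in conversion")
```

Fewer stages, each a little better, and the compounded loss drops sharply; that is the core of the slide's argument.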

Dan has a slide with a side-by-side comparison of a typical mechanical system vs. the Facebook design.

[image: mechanical system comparison slide]

A couple of other slides shared cover the Reactor Power Panel and the battery cabinet.

[images: Reactor Power Panel and battery cabinet slides]

Here are the results Facebook shared.

[image: Facebook's results slide]
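Facebook's efficiency results are typically framed as PUE (Power Usage Effectiveness): total facility power divided by IT equipment power. Here is a minimal sketch of the calculation, with assumed loads rather than the figures from the slide.

```python
# PUE = total facility power / IT equipment power.
# Loads below are assumed for illustration, not the numbers Facebook shared.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: 1.0 would mean zero overhead."""
    return total_facility_kw / it_load_kw

conventional = pue(total_facility_kw=1500.0, it_load_kw=1000.0)  # -> 1.50
optimized = pue(total_facility_kw=1080.0, it_load_kw=1000.0)     # -> 1.08

print(f"conventional facility PUE: {conventional:.2f}")
print(f"optimized facility PUE:    {optimized:.2f}")
```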

For more details, see Facebook's Open Compute Project web site.

If you want to see pictures of the inside of the Facebook data center, check out http://scobleizer.com/2011/04/16/photo-tour-of-facebooks-new-datacenter/ and http://www.datacenterknowledge.com/archives/2011/04/19/video-facebooks-penthouse-cooling-system/

Disrupting the Data Center: Uptime Institute focuses on Cloud, Cost, Capacity, and Carbon

Watching the initial keynotes at the Uptime Institute Symposium, it is great to see the Green Data Center idea manifest in the tag line of Cloud, Cost, Capacity, and Carbon.

[image]

The keynote kicked off with the idea that this is the conference for the Disrupted Data Center, pulling together the different groups of The 451 Group.

In the coming five years, a series of major technological innovations, coupled with significant, external legislative and market disruptions, will make an ever greater impression on the planning, design and operation of data centers. The economics, the operational practices and the underlying design principles of data centers, and of IT service provision, may be about to undergo some fundamental, disruptive shifts.

[image]

And it lists the top underwriters of the conference.

[image: conference underwriters]

But then I ask the question: will disruption of the data center come from the above list of companies?  Facebook is presenting part of its Open Compute Project.

Open sourcing designs is disruptive.

As I spend the rest of my time at Uptime, I'll keep thinking about what is disruptive in data centers.  Are Cloud, Cost, Capacity, and Carbon disruptive?  Or is it the companies who are not on the underwriters list?

Data Center Analytics supports better decision making, Power Assure ships new capabilities

Power Assure has a press release on their new analytics capabilities. 

Energy Management version 4 (EM/4) software enables actionable-intelligence for maximizing data center efficiency

Santa Clara, Calif. – May 9, 2011 - Power Assure®, Inc., a data center infrastructure management solutions provider, today introduced at Uptime Institute’s Symposium 2011 Data Center Analytics for its Energy Management software platform, version 4 (EM/4). Data Center Analytics gives data center operators for the first time the ability to analyze and synthesize the overwhelming amount of raw data now available on data center equipment performance and turn it into useful business information to improve the efficiency, capacity and performance of their data centers.

The Analytics capability exists side-by-side with the monitoring and automation modules.

[image]

Here is a sample dashboard from Power Assure to visualize data center systems.

[image: sample Power Assure dashboard]
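As a thought experiment, here is a hypothetical sketch of the kind of roll-up an analytics layer like EM/4 is described as doing: turning raw power samples into per-rack summaries an operator can act on. The function and field names are mine for illustration; they are not Power Assure's API or data model.

```python
# Hypothetical sketch: rolling raw power samples up into per-rack summaries.
# Field names and structure are invented; not Power Assure's actual product.

from collections import defaultdict
from statistics import mean

# (rack_id, watts) samples as a monitoring module might collect them
samples = [
    ("rack-01", 4200), ("rack-01", 4350), ("rack-01", 4100),
    ("rack-02", 6100), ("rack-02", 6250), ("rack-02", 5900),
]

by_rack: dict[str, list[int]] = defaultdict(list)
for rack_id, watts in samples:
    by_rack[rack_id].append(watts)

for rack_id, readings in sorted(by_rack.items()):
    print(f"{rack_id}: avg {mean(readings):.0f} W, peak {max(readings)} W, "
          f"headroom vs. assumed 8 kW circuit: {8000 - max(readings)} W")
```

The value of the analytics layer is in doing this kind of synthesis continuously and at scale, instead of leaving operators staring at raw readings.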

An interesting problem: how to organize information in Facebook's Open Compute Project

I got a chance to meet some of the Facebook Open Compute Project team last week, and the meeting went much better than I expected.  One of the great questions Facebook asked was how to organize the Open Compute Project's efforts.  One typical approach would be a taxonomy of the different parts of the system - power, cooling, servers, etc.

A taxonomy, or taxonomic scheme, is a particular classification ("the taxonomy of ..."), arranged in a hierarchical structure.

A hierarchical approach makes sense for a technical crowd.

A hierarchy (Greek: hierarchia (ἱεραρχία), from hierarches, "leader of sacred rites") is an arrangement of items (objects, names, values, categories, etc.) in which the items are represented as being "above," "below," or "at the same level as" one another.

A different way to look at the problem is to take an ontological approach and apply knowledge management techniques.

DEFINITION

In the context of computer and information sciences, an ontology defines a set of representational primitives with which to model a domain of knowledge or discourse. The representational primitives are typically classes (or sets), attributes (or properties), and relationships (or relations among class members). The definitions of the representational primitives include information about their meaning and constraints on their logically consistent application. In the context of database systems, ontology can be viewed as a level of abstraction of data models, analogous to hierarchical and relational models, but intended for modeling knowledge about individuals, their attributes, and their relationships to other individuals.

Note the last sentence, where the ontological model is compared to hierarchical and relational models.
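To make the contrast concrete, here is a minimal sketch of the same data center content organized both ways: a hierarchical taxonomy (a tree of categories) versus an ontology-style set of typed relationships. The categories and relations are examples I made up, not the Open Compute Project's actual structure.

```python
# Two ways to organize the same data center content.
# Categories and relations are made-up examples, not OCP's structure.

# 1. Hierarchical taxonomy: every item lives in exactly one branch of a tree.
taxonomy = {
    "power": ["reactor power panel", "battery cabinet"],
    "cooling": ["evaporative cooling system"],
    "servers": ["chassis", "power supply"],
}

# 2. Ontology: items are individuals with typed relationships, so one item
#    can connect to things that sit in several different "branches".
relations = [
    ("battery cabinet", "backs_up", "power supply"),
    ("power supply", "installed_in", "chassis"),
    ("reactor power panel", "feeds", "power supply"),
    ("evaporative cooling system", "cools", "chassis"),
]

def related_to(item: str) -> list[tuple[str, str, str]]:
    """All relationships an item participates in, in either direction."""
    return [(s, r, o) for s, r, o in relations if item in (s, o)]

print(related_to("power supply"))
# A pure tree would force "power supply" under a single branch; the ontology
# lets it relate to both power items and server items at the same time.
```

That cross-branch linking is exactly what a strict taxonomy struggles with, and why the ontological approach is appealing for a project that spans power, cooling, and servers.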

I think the ontological approach could work for Open Compute Project.  I'll spend more time over the next couple of weeks circulating the idea and getting feedback.