Do you see who and what is behind the standards? If you did, would you adopt them?

Standards are typically thought of as a good thing. DCK just posted an AFCOM one on education, and it got me thinking: do you really know the story behind a standard, and if you did, would it change whether you adopt it? I've sat on many standards initiatives and gradually learned what often goes on behind the scenes of something like an IEEE standard.


WHAT ARE STANDARDS?

Standards are published documents that establish specifications and procedures designed to maximize the reliability of the materials, products, methods, and/or services people use every day. Standards address a range of issues, including but not limited to various protocols to help maximize product functionality and compatibility, facilitate interoperability and support consumer safety and public health.

The top players in a standards effort are those who have the most to gain from a new standard, or the most to lose.

Working on standards can be time-consuming, especially when part of the game is to slow down the development of a standard to give companies more time to adapt.

Ultimately there is a scorecard each company keeps: how does this standard affect my products and my company? Does it help us or hurt us? Does it help our competitors more than it helps us?

Chris Crosby gets into this subject as well in his post, and he makes some great points.

Unfortunately, when evaluating data center providers, customers often have to navigate between what is real and a vendor’s standard-inspired puffery.

...

This pattern of devolution from industry standards places a greater burden on today’s data center customers. Failure to ask for, and receive, objective evidence of a provider’s adherence to the standards that underlie their performance claims places the customer in the position of having to make their decision based more on the sizzle than the steak. Caveat emptor (let the buyer beware) was the advice of the ancient Romans to wary prospective customers; in the world of data center standards compliance, it’s probably still good advice.

The Perfect Data Center

Dilbert's Scott Adams has a blog post on the Perfect Room and a piece of software that could support it.

You often see rooms that can't be furnished properly because furniture placement was an afterthought. The design of a room should start with the perfect arrangement of furniture and fixtures. I would think that for every budget and set of preferences there are a few furniture arrangements that stand out as the best. How hard would it be to catalog those best arrangements?

I imagine a time when a user can design a home simply by checking boxes on a long digital form. Questions for a living room might include:

1. Do you want a TV in this room?
2. Do you want a cozy reading chair?
3. Do you want a fireplace?
4. Etc.

Once the user selects all of his preferences for each room, he clicks a "shuffle" button and it spits out a house layout complete with external windows, doors, hallways, stairs, and engineering support structures. All of that stuff is fairly rules-based. If you don't like the first design, click the shuffle button again. In every case, the rooms will have exactly the features you specified but arranged differently. And of course you can walk through your model in 3D mode.
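
To make the idea concrete, here is a minimal sketch of what such a rules-based shuffle could look like in Python. Everything in it (the room list, the plumbing rule, the repair step) is invented for illustration; a real layout engine would also handle geometry, windows, doors, and the 3D walkthrough Scott describes.

```python
import random

# Hypothetical feature checklist, in the spirit of Scott Adams' form.
# Rooms and features are invented for illustration.
PREFERENCES = {
    "living_room": ["tv", "reading_chair", "fireplace"],
    "kitchen": ["island", "pantry"],
    "bathroom": ["double_sink"],
}

# Toy rule: rooms that need plumbing should sit next to each other
# to reduce costs (one of the savings Scott lists below).
PLUMBED = {"kitchen", "bathroom"}

def shuffle_layout(rooms):
    """Return one candidate layout: a random ordering of rooms that
    keeps all plumbed rooms adjacent."""
    order = list(rooms)
    random.shuffle(order)
    # Repair step: pull the plumbed rooms together as one cluster.
    plumbed = [r for r in order if r in PLUMBED]
    others = [r for r in order if r not in PLUMBED]
    insert_at = random.randint(0, len(others))
    return others[:insert_at] + plumbed + others[insert_at:]

# Each click of the "shuffle" button yields a new layout with the
# same checked features, arranged differently.
for _ in range(3):
    layout = shuffle_layout(PREFERENCES)
    print(" | ".join(f"{room} ({', '.join(PREFERENCES[room])})" for room in layout))
```

The point is that every shuffle preserves exactly the features you checked while varying the arrangement, which is what makes the rules-based approach plausible.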

Scott closes with points on the savings and the issues.


1. Rooms that need plumbing should be near each other to reduce costs.
2. Orientation to the sun makes a huge difference in heating/cooling/insulation.
3. Some designs require fewer hallways, which saves space.
4. Some designs require more support structures, doors, windows, etc.
5. Some designs have ductwork issues.

Those are just some obvious examples of potential savings. You'd also cut your architect expense by 80%. And you'd save on labor and materials because the building materials would be measured and cut at the factory, including everything from lumber to floor tiles to carpet.

My observation is that the building industry is slow to innovate and fairly disorganized. Builders, architects, and materials companies are all their own little silos. So my guess is that the "shuffle design" program will originate in some sort of online game environment before it gets ported to the real world.

I think this is what Compass Datacenters is attempting to do.

If you’ve ever sat behind a foul pole at a baseball game you know what a pain columns can be. That’s why your 10,000 square feet of 36” raised floor is column free. At Compass, your data center floor accommodates anything from a tape library or a Cisco 7000 with side-to-side airflow to OpenCompute’s new larger rack sizes. This degree of flexibility even extends to cable management to support your preference, whether it’s above the rack, hanging from the ceiling, or below the floor. At Compass, the only option you don’t have is to strand your IT capacity.

Speaking of your data center floor flexibility, can you control how much of the server and storage capacity your software uses? We didn’t think so. The reality of the world is that most software does not drive full use of the server (virtualized or otherwise). As a result, it’s tough to predict what your actual usage will be from rack to rack. Not to mention the patch, network and storage… That’s why your Compass solution will support rack densities that cover the spectrum up to 20kW without containment (from 0 to 400 W/sf). Just imagine what you can do using ASHRAE TC 9.9 best practices including containment.
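
A quick back-of-the-envelope check shows how the quoted W/sf and kW-per-rack figures relate. The 50 square feet of floor attributed to each rack (footprint plus its share of aisle space) is my assumption for illustration, not a Compass number:

```python
# Back-of-the-envelope check on the quoted density figures.
# The 50 sq ft of floor per rack (footprint plus aisle share)
# is my assumption for illustration, not a Compass number.
FLOOR_SQFT = 10_000      # the column-free raised floor from the post
WATTS_PER_SQFT = 400     # top of the quoted 0-400 W/sf range
SQFT_PER_RACK = 50       # assumed floor area attributed to each rack

total_kw = FLOOR_SQFT * WATTS_PER_SQFT / 1_000
kw_per_rack = WATTS_PER_SQFT * SQFT_PER_RACK / 1_000

print(f"Total IT load at 400 W/sf: {total_kw:,.0f} kW")            # 4,000 kW
print(f"Per-rack density at 50 sq ft/rack: {kw_per_rack:.0f} kW")  # 20 kW
```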

Perfection is hard to achieve, though; once you live in the space, you find out things you didn't consider.

Kfir Godrich discusses Data Center Commissioning's role in delivering availability

I've had the pleasure of some great conversations with Kfir Godrich. Kfir has a guest post on the Compass Datacenters blog that discusses data center commissioning.

Kfir starts with a subject that reminds me of my first summer jobs at HP, working in quality engineering on warranty and reliability issues.

The data center commissioning (or Cx) journey starts with understanding the basics of reliability engineering contained in the IEEE Gold Book. First, we need to define the difference between reliability and availability. Availability is the probability that a system will be operational when required during the mission, while reliability is the probability that the system will operate without failure for the duration of the mission. The related term that helps us introduce Cx is the data center predicted performance model. This model follows a failure pattern typical of electronic equipment, also known as the “bathtub curve” (see Fig. 1).

[Fig. 1: The bathtub curve]

In Phase 1, also called the Infant Mortality Period, data centers go through a period of decreasing failure rate, which we want to keep as short as possible. This can be achieved by performing a full commissioning as described later. It is the author’s humble opinion that the level of commissioning must be proportional to the level of criticality and design Tier (per Uptime Institute) of the data center.

In Phase 2, referred to as the Random Failure Period, the failure rate is constant; MTBF (Mean Time Between Failures) is calculated during this phase. The desire here is to push that flat curve as low as possible. Phase 3, the Wear-out Period, is where components begin to reach the end of their usable life. Replacing components proactively helps delay the ultimate upturn in the graph.
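
For readers who want the math behind the flat part of the curve: with a constant failure rate λ, reliability over a mission of length t is R(t) = e^(−λt) with MTBF = 1/λ, and steady-state availability is MTBF / (MTBF + MTTR). A small sketch with illustrative numbers (not from Kfir's post):

```python
import math

# Phase 2 of the bathtub curve has a constant failure rate, so the
# textbook exponential model applies. The MTBF/MTTR figures below
# are illustrative, not from Kfir's post.
mtbf_hours = 50_000     # mean time between failures
mttr_hours = 4          # mean time to repair
mission_hours = 8_760   # a one-year mission

failure_rate = 1 / mtbf_hours                            # lambda
reliability = math.exp(-failure_rate * mission_hours)    # R(t) = e^(-lambda * t)
availability = mtbf_hours / (mtbf_hours + mttr_hours)    # steady-state A

print(f"Reliability over one year:  {reliability:.3f}")   # ~0.839
print(f"Steady-state availability: {availability:.5f}")   # ~0.99992
```

Note how a system can be highly available (repairs are quick) yet have a much lower probability of running a full year without any failure, which is exactly the reliability/availability distinction Kfir draws.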

This post is the first in a series, so if you are interested in this topic there will be more to come.

Therefore, data center commissioning is about enabling the business through performance validation and functional testing of integrated platforms. This should typically be performed by an independent agent as part of the customer’s trusted advisory team and as a core part of the overall project schedule. The cost for a commissioning agent can be in the range of 0.8-2% of the total budget. Since commissioning is essential for government facilities, the US Department of Energy publishes guidelines for commissioning scope and cost. Geographically, commissioning is more popular and comprehensive in North America and parts of Western Europe, while the rest of the world is becoming more familiar with these concepts. Our next blog post will go a bit deeper into integrated testing, so stay tuned. Till next time, Kfir
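
As a quick sanity check on that 0.8-2% range, here is the arithmetic on a hypothetical project budget (the $50 million figure is my example, not from the post):

```python
# The quoted commissioning-agent cost range, applied to an example
# budget. The $50 million project cost is hypothetical.
budget = 50_000_000
low, high = 0.008, 0.02   # 0.8% to 2% of total budget

print(f"Commissioning cost: ${budget * low:,.0f} to ${budget * high:,.0f}")
# Commissioning cost: $400,000 to $1,000,000
```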

Kfir's new company is here.

Shh, a secret on why the Green Data Center is popular: it's the money

One of the things I figured out a long time ago as an industrial engineer is that efficiency is good to some, but not all. If you talk about being green, almost everyone says being green is good, with few fighting the green initiative. And what is behind much of being green? A big part is being efficient.

Chris Crosby does a good job of giving an insider's view of the secret of being green in the data center business.

“We all talk about being green like it’s our ticket to corporate sainthood, but really we just keep improving the energy efficiency of our data center operations because it helps us make more money”.

...

You really thought that all this incessant talk about being “green” was about saving the planet? How quaint. Well, why don’t you come sit up front here and let me explain a few things.

Got your attention?  Chris explains more.

Marketing, and its very close friend Public Relations, are all about making people want things because they think they are important. So about five years ago, some very bad people began to say that data centers used too much energy and that wasn’t good for the environment. While a whole bunch of folks in the data center business panicked, some marketing people got together and said, “This is awesome. We know that using too much power hurts our profit margins, and people think that’s bad anyway, so let’s jump on the bandwagon and call our energy efficiency efforts “green initiatives” and then everyone will be happy”. This is what’s known as a win/win proposition. Naturally, the whole industry cheered.


Seizing on this new vision, data center companies began to improve their energy efficiency. They even came up with a new standard to help measure the improvements in performance, called PUE, so customers could prove it to themselves. This new standard has become so popular that now data center providers and operators use it as part of their marketing and PR efforts. Really big operators love this concept. They do all kinds of wild things like spending large sums of money on horribly inefficient technologies like solar panels so they can turn around and talk about their commitment to being green. Now of course we all know that this is just to keep large groups of generally unshaven, Birkenstock-wearing extremists from causing a big fuss and driving down their stock price, so we all play along and show our support by saying things like, “Man, that’s what I call a real commitment to green”. See the double meaning there? A lot of people don’t, but for obvious reasons we don’t bother to correct them.
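
For anyone new to the metric Chris is poking fun at: PUE (Power Usage Effectiveness) is total facility power divided by IT equipment power, so 1.0 is the theoretical ideal and lower is better. A tiny illustration with made-up numbers:

```python
# PUE = total facility power / IT equipment power.
# 1.0 would mean every watt reaches the IT gear; the numbers
# below are made up for illustration.
it_load_kw = 1_000     # servers, storage, network
overhead_kw = 500      # cooling, UPS losses, lighting

pue = (it_load_kw + overhead_kw) / it_load_kw
print(f"PUE: {pue:.2f}")   # 1.50
```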

Compass Datacenters achieves funding milestone: $45 million for first round

Compass Datacenters is trying to be different from the rest, and part of building something different takes money. Today, Compass Datacenters announced funding.

Compass Datacenters Completes First Major Round of Funding


For Immediate Release

$45 Million Credit Facility Is Expandable to $100 Million; Will Finance Data Center Construction Projects Across the United States

Dallas – September 4, 2012 – Compass Datacenters has completed its first major round of funding: a $45 million credit facility with an accordion feature up to $100 million from KeyCorp’s commercial finance unit, KeyBank, NA. This is the company’s first formal round of funding since it began operations earlier this year, and it will be used to finance its aggressive growth plan over the coming months.

Compass convinced its financier that its business model will work.

“Compass Datacenters is pioneering a major area of growth for the data center industry by offering wholesale standalone data center products that meet the needs of companies that are in geographic areas that are not well served by traditional providers. They have assembled a strong management team comprised of executives with proven track records in data centers and real estate development. Their wholesale business model and proprietary design architecture have the company positioned to be the leader in this growth market,” said John Murphy, Managing Director of Real Estate Capital at KeyBank.

The next step is to see data centers built by Compass, which should happen quickly with a modular design.