Top 5 Data Center Construction Companies

It can be hard to figure out who the top data center construction companies are, and in this economy it is more difficult than ever. With the Fight Club rule of data centers, it is hard for a non-insider to figure out who does what, as almost everyone says they are a leader.

Wal-Mart, Data Centers and The Fight Club Rule

June 3rd, 2006 : Rich Miller

“The first rule of Fight Club is – you do not talk about Fight Club. The second rule of Fight Club is – you DO NOT talk about Fight Club.”

Based on conversations with people who should know who builds what in the USA (not a scientific survey), here are what I can gather to be the top 5 data center construction companies, in order.

Also, the nice thing is every one of these construction companies has a green data center skill set.

#1 Holder Construction

Data & Technology

Holder Construction Company is the industry leader in Data Center construction. Holder has maintained the #1 ranking on ENR’s Top 10 Data Center Contractors list for the past three years. Holder’s reputation for delivering the highest level of service on mission critical data center facilities is second to none.

  • Experience on over 100 data center projects in the last 10 years
  • Over 7 million square feet of space
  • Over 3.5 million square feet of raised floor
  • Over 50 new construction projects
  • Majority of facilities have a fault tolerant, concurrently maintainable design
  • Experience in data center construction in 21 states and 2 foreign countries
  • Leader of LEED data center construction

#2 StructureTone

When it comes to mission critical construction, we deliver 24/7/365.

Featured Project

Retail Client

Texas

As a joint-venture partner, we managed construction of a new, 98,000sf data center facility that  More…

Having built over 21,000,000sf of mission critical facilities at all levels of density and redundancy, we are acutely aware of the quality and resiliency demands that are unique to mission critical spaces. We are also attuned to the specific, and differing, requirements that these demands place on operators, end-users and designers.
Not simply a mission critical builder, Structure Tone offers our mission critical customers 360° solutions that encompass technology, facilities, design and construction. Our dedicated mission critical construction staff comprises mechanical, electrical, technology, commissioning and construction professionals who have unmatched, hands-on experience developing, installing, building and commissioning complex, redundant infrastructure. In addition, many of our mission critical specialists have walked in our customers’ shoes as mission critical operators and/or end-users.

#3 Turner Construction


Project Management to Meet Your Specific Needs

Turner believes in collaboration and bringing value to every aspect of a project. Turner’s mission critical facility experience and service offerings include:

  • LEED Accredited staff experienced in critical facilities projects, including construction managers, electrical and mechanical specialists and supply chain managers with extensive product, manufacturing and commissioning experience
  • Customized software applications to increase communication for real-time updates and proactive issue resolution in preconstruction, construction, commissioning, and post turnover operations

#4 DPR Construction

Web-hosting. Colocation. Telecom. Data processing. Call Center. DPR’s proven technical expertise hyper-tracks the delivery of mission critical facilities. Every day presents new opportunities for exploring alternative techniques to improve design and construction in a 24x7 environment. DPR’s building specialists look at each project with a fresh approach to provide the right team and services for the job. Offering customers a single point of contact and up-front collaboration to shorten schedules and control costs, DPR takes the process to new heights with its program management, construction management and design/build capabilities, ensuring that facilities are ready to ramp up to full running capacity immediately upon completion and continue operating without failure.

View All Mission Critical Projects

#5 Skanska Construction

Skanska is a world leader in data center and resilient infrastructure construction. Capitalizing on our mission critical expertise, Skanska has developed the Mission Critical Center of Excellence (“COE”). Our team of experts offers an end-to-end service from initial design through commissioning and close-out. We also offer energy optimization services for new and existing data centers.

Mission Critical

Read more

Marketing Cloud Computing: why Amazon and Google have the advantage

I have been resisting the cloud computing hype, as I am sure many of you have, but I’ve started to look at cloud computing as an agent to change the behavior of IT to be greener in the data center.

There are skeptics to the hype.

Merrill Lynch: Cloud Computing Market Will Reach $160 Billion...Really?

Written by Alex Williams / November 25, 2009 11:40 AM


The estimates for cloud computing can make you wonder sometimes about what to believe. Analyst firms, and it looks like investment houses too, can be notorious for wild estimates of market sizes.

So we have to wonder about the estimates from Merrill Lynch, which is estimating the cloud computing market to reach $160 billion by 2011. The estimate includes $95 billion in business and productivity applications.

Whoa! That makes cloud computing one of the fastest growing markets in the world.

But Merrill Lynch is not alone in its lofty estimates. Earlier this year, Gartner pegged the market at $150 billion by 2013.

In other words, cloud computing will be huge but to call it a $160 billion market seems like a form of hype that can lead to all kinds of issues. It's almost reminiscent of the dot-com bubble.

And look what happened there.

The same report discusses the business and technical reasons why cloud computing is good.

But then you need to look at the dynamics in play. IT is built on legacy systems: custom, built-to-order environments. Cloud computing provides a level of automation.

From the PriceWaterhouseCoopers summer Technology Forecast:

"Legacy IT soaks up much of the available IT budget and is a primary barrier to IT responsiveness and overall business agility."

The report goes on to say that cloud will be necessary for automating the world of IT:

"...IT must adopt an architecture that creates loose coupling between the IT infrastructure and application workloads. It also must modernize and automate IT's own internal business processes for provisioning, managing, and orchestrating infrastructure resources."

With all this speculation it is hard to know what works in cloud computing.

Which is why I think Amazon and Google have the advantage.  They have the data that shows where cloud computing is actually being adopted.

The post “Marketing Cloud Computing: Uncharted Territories” makes an interesting point here.

May 12, 2009

Marketing Cloud Computing: Uncharted Territories

One of the aspects of cloud computing that receives too little attention is the massive change it brings to how software and IT infrastructure are marketed, sold, purchased and serviced. Through my work at GigaSpaces, and now advising start-ups and large companies with various cloud offerings, I have come to realize how much marketing cloud computing is still uncharted territory  -- and especially when it comes to the enterprise.

Many of the value propositions cloud brings to the table have been commonplace in the consumer Internet for more than a decade: self-service, ease-of-use, pay-by-the-drink pricing and so on. The same is true from the vendor's point-of-view: a low-touch, low-value, high-volume and short sales cycle. It's no surprise then that consumer-oriented companies, such as Amazon and Google, are the ones leading the charge in what is essentially a B2B market.

Read more

Are you “we” or are you “me” – social networking influencers, P.S. the “we” crowd is more fun

Part of what I enjoy about working on green data centers is meeting interesting people and figuring out how they fit in my social network.  There are people who definitely do, and there are many who I don’t bother with.  The mistake you can make in social networking is signing up for too many networks and trying to be friends with everyone.  This is not a race for quantity.

Knowledge@Wharton has an article that touches on this topic.  The specific area it discusses is word-of-mouth marketing in the pharmaceutical industry, and many of the ideas apply to data center innovation and marketing.

The Buzz Starts Here: Finding the First Mouth for Word-of-Mouth Marketing

Published: March 04, 2009 in Knowledge@Wharton

Call it viral, buzz or word-of-mouth advertising: Getting customers to spread the word about a new product through their social or professional networks is a hot strategy in the marketing world. Its proponents insist that the technique -- whether online or face-to-face -- is sure to boost a company's return on investment (ROI).

But how can companies find the right individuals to deliver the message? Marketers may wonder if they are finding the best "seeding points" -- that is, well-connected people at the hub of social networks who will latch on to a product and promote it widely among the people they know.

The traditional approach is to find the leader: the person who says look at “me.”

Who's the Leader?

The study indicates that the spread of a product by word-of-mouth -- what the authors call "contagion" -- can and does happen over social networks. The study also indicates that marketers may need to re-think whom they identify as the best seeding points in their word-of-mouth campaigns.

Traditionally, drug companies have focused their efforts on reaching notable community leaders, believing well-known experts to be the most effective emissaries of a new product. In other industries, said Iyengar, marketers and their market research companies have tried to find opinion leaders through direct surveys, asking people, in essence, "Are you an opinion leader?" and then linking those answers to observable characteristics such as age, income, education level, media habits and so on. That, however, has proved rather ineffective, leading some companies to give up on finding seeding points and go for flashy "buzz" campaigns everyone talks about, such as when British fashion retailer French Connection UK put its four-letter acronym in large letters on its bags and shopping windows.

There is another group the researchers categorized, and this is the “we.”

The researchers also asked all physicians to name up to eight other doctors with whom they felt comfortable discussing the clinical management and treatment of the disease, and up to eight doctors to whom they typically referred patients. These nominations from fellow physicians produced a second group, whom researchers called "sociometric leaders" -- the most influential and well-respected physicians in the community based on how often they were mentioned by their peers.

What did the study find as the aha moment?

"That was the biggest 'a-ha!' for the company," said Van den Bulte. Physician 184 "was not the most important in the number of connections he was getting, but he was vitally important in linking the networks."

More about Physician 184 characteristics as a “we” person.

Physician 184, for example, didn't fit the description of an individual who marketers thought would be the most effective promoter of their product -- an outgoing, high-profile doctor whose name often pops up on research papers or on conference speaker lists. "Physician 184 was self-effacing. He did not want to stand on a soap box," said Van den Bulte. "He was respected, but not in a flashy fashion. He was the opposite of a rock star."

And they found that the “we” people were actually earlier adopters than the “me” people.

Matching the network data with prescription records, the study showed that sociometric leaders like Physician 184 were quicker than the self-reported opinion leaders to use the new drug, and were also more likely to influence other physicians to try it. The study also found that sociometric leaders did take into account what their colleagues were doing. For marketers, this implies that word-of-mouth can affect opinion leaders as well as followers, in contrast to what is often believed and taught -- that only followers are affected by social influence.
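The Physician 184 finding maps onto a standard network measure: a node can have a modest number of direct connections (degree) yet sit on most of the shortest paths between groups (betweenness). A minimal, self-contained sketch on a toy referral graph, with illustrative names (the node "D184" is just a stand-in for the study's bridging physician):

```python
from collections import deque
from itertools import combinations

# Toy social graph: two tight clusters joined only through node "D184".
edges = [("A", "B"), ("A", "C"), ("B", "C"),   # cluster 1
         ("E", "F"), ("E", "G"), ("F", "G"),   # cluster 2
         ("C", "D184"), ("D184", "E")]         # the bridge
graph = {}
for u, v in edges:
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

def shortest_paths(s, t):
    """All shortest paths from s to t, by breadth-first search over paths."""
    paths, best = [], None
    queue = deque([[s]])
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break                      # longer than the shortest found: stop
        node = path[-1]
        if node == t:
            best = len(path)
            paths.append(path)
            continue
        for nbr in graph[node]:
            if nbr not in path:
                queue.append(path + [nbr])
    return paths

def betweenness(v):
    """Fraction of shortest paths between all other pairs passing through v."""
    score = 0.0
    for s, t in combinations(graph, 2):
        if v in (s, t):
            continue
        paths = shortest_paths(s, t)
        score += sum(v in p for p in paths) / len(paths)
    return score

degree = {n: len(graph[n]) for n in graph}
ranking = sorted(graph, key=betweenness, reverse=True)
print(degree["D184"], max(degree.values()))  # D184 has the lowest degree (2 vs 3)...
print(ranking[0])                            # ...but the highest betweenness: "D184"
```

Like the quiet Physician 184, the bridging node is easy to miss if you only count connections, but it is the one that links the networks.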

Whenever I go to data center events I watch for the “we” vs. the “me.”  I filter out the “me” people and don’t spend much time with them.  What I want to do is build better connections to the “we” people, as they are the social network influencers.

I’ve used this method for so long it feels obvious and natural, but the “we” vs. “me” framing became clear thanks to an IM conversation with a “we” friend about another “we” person’s behavior.

Are you a “we” or are you a “me”?

Read more

Via’s Big Little Green Server: 64 bit, virtualization, low power

Slashgear has a post on Via’s home mini server.

VIA M’SERV S2100 home mini server arrives

By Chris Davies on Wednesday, Jan 13th 2010

 

One of the reasons we have a soft spot for VIA is that they eat their own dogfood: not only do they produce processors, mainboards and other chipsets, they also put out a range of products (often to OEMs rather than end-users) that actually use them.  Latest is the VIA M’SERV S2100, a boxy little server intended for home and small business users that’s powered by the VIA Nano CPU.


The M’SERV S2100 measures in at 10.2-inches long and 4.7-inches high, yet can be stuffed with up to 4TB of storage space.  There’s also a 1.3+GHz VIA Nano CPU, two memory slots, two SATA bays and an internal Compact Flash socket which the S2100 can boot from.  As for ports, you’re looking at dual gigabit ethernet, three USB 2.0 and a VGA output.

The Via product page has more details.

VIA M’SERV S2100 is a data-oriented 64-bit Mini Server with a large storage capacity, low power consumption and strong network connectivity. The M’SERV comes equipped with two Gigabit Ethernet ports that make this unique mini server system a perfect fit not only for home download applications, but also as a small business/SOHO/personal server that provides ample storage space in an energy-efficient, compact, low-noise system.


64-bit Processor
Powered by the VIA Nano 64-bit processor, the M’SERV S2100 mini server is the first and smallest server to support a 64-bit environment. The VIA Nano processor also features hardware assisted virtualization technology, enabling users to experience improved performance across multiple virtual environments.

CF Socket
Built-in bootable Compact Flash socket is perfect for installing a slimmed-down version of Windows or another embedded OS.


Dual Gigabit LAN
Two high-speed Ethernet ports on a speedy PCI Express bus for both Internet and intranet connections.

Low Noise
A quiet ball-bearing fan silently cools the system with noise levels remaining below a mere 26.8 dB.


Two RAM Sockets
Dual onboard DDR2 SO-DIMM sockets for convenient system upgrades.

Low Power
Based on a VIA processor and chipset combination, the M’SERV S2100 is an energy-efficient system with remarkably low power consumption.

Read more

Elastra’s Cloud Computing Application Infrastructure = Green IT with a Model approach

Elastra connects the power use in the data center to the application architects and deployment decision makers.

The Plan Composer function lets customers set their own policies based on application needs and specific power metrics (such as wattage, PUE, number of cores, etc.). Therefore, if an application requires 4GB of RAM and two cores for optimal performance, and if the customer is concerned with straight wattage, Elastra’s product will automatically route it to the lowest-power 4GB, dual-core virtual machine available.
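To make the routing idea concrete, here is a minimal, hypothetical sketch of metric-driven placement in the spirit of the Plan Composer described above. The class, field names, and numbers are illustrative assumptions, not Elastra’s actual API:

```python
from dataclasses import dataclass

@dataclass
class HostOption:
    """One candidate VM placement (illustrative fields, not Elastra's schema)."""
    name: str
    ram_gb: int
    cores: int
    watts: float      # power draw of the VM on this host class
    pue: float        # Power Usage Effectiveness of the hosting facility

def place(requirement, options, metric="watts"):
    """Pick the option that satisfies the app's needs at the lowest
    value of the customer's chosen power metric."""
    fits = [o for o in options
            if o.ram_gb >= requirement["ram_gb"]
            and o.cores >= requirement["cores"]]
    if not fits:
        raise ValueError("no host satisfies the requirement")
    key = {"watts": lambda o: o.watts,                 # straight wattage
           "facility_watts": lambda o: o.watts * o.pue}[metric]
    return min(fits, key=key)

options = [
    HostOption("old-blade", ram_gb=8, cores=4, watts=220.0, pue=1.1),
    HostOption("new-blade", ram_gb=4, cores=2, watts=140.0, pue=2.0),
    HostOption("tiny-node", ram_gb=2, cores=2, watts=60.0,  pue=1.1),  # too small
]
req = {"ram_gb": 4, "cores": 2}
print(place(req, options).name)                    # "new-blade": lowest straight wattage
print(place(req, options, "facility_watts").name)  # "old-blade": 220*1.1 < 140*2.0
```

Note how the winner changes with the metric: once PUE is factored in, the hungrier server in the efficient facility beats the leaner server in the inefficient one, which is exactly why letting customers pick the metric matters.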

Gigaom has a post on Elastra’s Cloud Computing infrastructure addressing greener services.

Elastra Makes Its Cloud Even Greener

By Derrick Harris Jan. 12, 2010, 2:51pm


Elastra has incorporated energy efficiency intelligence into its Cloud Server solution, allowing customers to define which efficiency metrics are important to them and then rely on the software to route each application to the optimal resources within their internal cloud environments. Elastra’s efforts are just the latest in a growing trend toward saving data center costs by using the least possible amount of power to accomplish any given task. Especially in the internal cloud space, power management capabilities are becoming a must-have, with vendors from Appistry to VMware offering tools to migrate workloads dynamically and power down unneeded servers.

Digging into the press release, I found Elastra uses a modeling approach.

Elastra accomplishes this through two technologies available in the product. The first technology is the ECML and EDML semantic modeling languages. ECML is a language used to describe an application (software, requirements, and policies) and EDML is used to describe the resources (virtual machines, storage, and network) available in a data center. These languages can be easily extended to enhance the definition of the applications and resources.

These modeling languages coupled with the Plan-Composer in the Elastra Cloud Server enables users to synthesize a plan for execution. The Plan-Composer analyzes the proposed application designs (expressed thru ECML) and data center resources (expressed thru EDML), comparing them against a library of actions and outcomes. It then generates a plan based on the energy efficiency policies of the organization that can be executed by the Cloud Server against a customer’s infrastructure.

The cool part is Elastra uses OWL and RDF to support its modeling approach.

Elastic Modeling Languages
The Elastic Modeling Languages are a set of modular languages, defined in OWL v2, that express the end-to-end design requirements, control and operational specifications, and data centre resources & configurations required to enable automated application deployment & management.

While the foundation of the modeling languages is in OWL and RDF, developers can interoperate with the Elastra Cloud Server through its RESTful interfaces; all functions available to the Elastra Workbench are available through this interface, which is based on Atom collections and serialized JSON, XML, or RDF (XML or Turtle) entries.

Declarative models are useful ways to drive complexity out of IT application design and configuration, in favor of more concise statements of intent. Given a declaration of preferences or constraints, an IT management system can compose multiple models together much more effectively than if the models were predominantly procedural, and also formally verify for conflicts or mistakes. On the other hand, not everything can be declarative; at some point, procedures are usually required to specify the “last mile” of provision, installation, or configuration.
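The declarative-vs-procedural point can be sketched in a few lines: the inputs are pure statements of desired and current state, and a small composer derives the procedural steps (the “last mile”). This is a generic illustration of the idea, not Elastra’s model format:

```python
# Declarative: two statements of state, no instructions on how to get there.
desired = {"web-vm": 3, "db-vm": 1}   # what we want running
current = {"web-vm": 1}               # what the data center runs now

def plan(current, desired):
    """Compose a procedural plan from the two declarations."""
    steps = []
    for kind, want in desired.items():
        have = current.get(kind, 0)
        if want > have:
            steps += [("provision", kind)] * (want - have)
    for kind, have in current.items():
        extra = have - desired.get(kind, 0)
        steps += [("deprovision", kind)] * max(0, extra)
    return steps

steps = plan(current, desired)
print(steps)  # two web-vm provisions and one db-vm provision
```

Because the intent is data rather than code, a management system can also check two declarations against each other for conflicts before anything runs, which is much harder when the input is an opaque procedure.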

Read more