A good perspective on Google's Data Center efforts, myth vs. reality

Mike Manos writes a great post on Google's latest rumors.  To give you Mike's perspective, let's start at the end.

Editors Note: I have many close friends in the Google Infrastructure organization and firmly believe that they are doing some amazing, incredible work in moving the industry along especially solving problems at scale.   What I find simply amazing is in the search for innovation how often our industry creates things that may or may not be there and convince ourselves so firmly that it exists. 

Mike then starts at the beginning, speculating on what Google would do with deep earth mining equipment.

Google Purchase of Deep Earth Mining Equipment in Support of ‘Project Rabbit Ears’ and Worldwide WIFI availability…Posted on October 31, 2013

(10/31/2013 – Mountain View, California) – Close examination of Google’s data center construction related purchases has revealed the procurement of large scale deep earth mining equipment.   While the actual need for the deep mining gear is unclear, many speculate that it has to do with a secretive internal project that has come to light known only as Project: Rabbit Ears. 

Mike then references an excellent point about a floating data center.

While the technical intricacies of the project fascinate many, the initiative does have its critics like Compass Data Center CEO, Chris Crosby, who laments the potential social aspects of this approach, “Life at sea can be lonely, and no one wants to think about what might happen when a bunch of drunken data center engineers hit port.”  Additionally, Crosby mentions the potential for a backslide of human rights violations, “I think we can all agree that the prospect of being flogged or keel hauled really narrows down the possibility for those outage causing human errors. Of course, this sterner level of discipline does open up the possibility of mutiny.”

Read all of Mike's post.  It tells a story of myth vs. reality and how much myth gets told as the truth.

All is fair in the fight for media traffic.

Sad that $300 mil spent on Obamacare creates no innovation, the benefits of a risk-less approach

One of the sad things about the $300 mil spent on Obamacare is that no innovation comes from the effort.  NASA's mission to put a man on the moon carried huge risks and can claim many innovations.  Here is a NASA pdf you can check out.

From a technical standpoint there isn't anything Obamacare is doing that is innovative.  In fact, you can think of the flaw of Obamacare as living in the 60's, applying a procurement process designed for buying commodities to IT.

That cancer is called “procurement” and it’s primarily a culture driven cancer one that tries to mitigate so much risk that it all but ensures it. It’s one that allowed for only a handful of companies like CGI Federal to not only build disasters like this, but to keep building more and more failures without any accountability to the ultimate client: us. Take a look at CGI’s website, and the industries they serve: financial services, oil and gas, public utilities, insurance. Have you had a positive user experience in any of those industries?

The cancer starts with fear. Contracting officers — people inside of the government in charge of selecting who gets to do what work — are afraid of their buys being contested by people who didn’t get selected. They’re also afraid of things going wrong down the line inside of a procurement, so they select vendors with a lot of “federal experience” to do the work. Over time, those vendors have been consolidated into pre-approved lists like GSA’s Alliant schedule. Then, for risk mitigation’s sake, they end up being the only ones allowed to compete for bids.

This results in a culturally accepted idea that cost implies quality. To ensure no disasters happen, throw lots of money at it. And when things go terribly wrong, throw more money at the same people who caused the problem to fix the problem. While this assumption may work well with commodities (want to ensure that you get lots of high-quality gravel? Buy a lot more gravel than you need, then throw out the bad gravel) the evidence points to the contrary with large IT purchases: they usually fail.

Lego rendering of Data Center History

Data centers are what almost all of you care about.  And some of you may enjoy Legos.  Here is a blog post that combines both.

Datacenter History: Through the Ages in Lego

The data center has changed dramatically through the ages, as our Lego minifigures can testify!

As a rule, I don’t participate in contests: There’s usually little reward, considering chances of winning. But when Juniper Networks asked me to build a datacenter from Lego bricks, I took a second look. And, seeing that the winner can support a charity of their choice, I felt that this was an excellent opportunity for me to have some fun while doing some good!

The above post goes through history.  For those of you who won't click on the post, here is the modern Lego data center.

The Modern Datacenter

We now turn to today. Our modern datacenter evolved from the history shown here: We retain the same 19-inch rack mount system used for Colossus way back during World War II. All of our machines are “Turing Complete” like the ENIAC. We run UNIX and Windows Server on CPUs spawned from the PDP-11, and our Windowed GUIs reflect the Xerox Alto. Today’s multi-core servers and multi-threaded operating systems carry the lessons learned by Cray and Thinking Machines.

A modern datacenter, complete with an EMC VMAX, Juniper router, and rackmount servers

My Lego datacenter tour ends here, with two racks of modern equipment. At the rear is an EMC Symmetrix VMAX which, like the CM-5, calls attention to its black monolith shape with a light bar. At front is a Juniper T-Series router (white vertical cards with a blue top) rack-mounted with a number of gold servers. Our technician holds an iPad while walking across a smooth raised floor. I even used a stress-reducing blue color for the walls!

Although the Symmetrix model only has three Lego axes, the router rack features four: The servers sit on forward-facing studs while the router is vertical. Both use black side panels, reflecting today’s “refrigerator” design.

Reporter uses Facebook's Rooftop to check out Apple's data center in Prineville

Facebook has been getting some news coverage with the opening of its cold storage facility.

http://sustainablebusinessoregon.com/articles/2013/10/exclusive-a-look-at-facebooks.html?s=image_gallery

http://www.bendbulletin.com/article/20131016/news0107/310160339/

http://readwrite.com/2013/10/16/facebook-prineville-cold-storage-photos#awesm=~oluaddHy6afA5O

What is funny is that one reporter used the Facebook rooftop to check out Apple's data center.

As it turns out, Apple's complex, code-named "Pillar"—and completely devoid of any markings identifying it as an outpost of the Cupertino company—is a literal stone's throw from Facebook's Prineville, Ore. hub. Tracking down the location of Apple's stealth site was just as easy as peering southeast from Facebook's roof, which ironically offered what was probably the best view in town. The Facebook employees pointed it out to me while cracking jokes about its apparently not-so-secret alias.

Construction began on the Apple data center last October, and now the first phase's main building (the large black one) appears to be complete, to the untrained, telephoto-lens equipped eye, anyway. Eventually the project will encompass two full 338,000-square foot data centers sprawling across Apple's 160-acre Prineville plot. And because everything is spookier and more fascinating when it's built out in the desert, we bring you the photographic fruits of our Veronica Mars-style investigation of Apple's Area 51.

Hybrid as a Cloud choice, Webinar on Oct 31, 2013

I am on a webinar on Oct 31, 2013 at 10am PT on Hybrid Clouds.  Hope you can join in the discussion that David, Ted, and I will have with Paul Miller as moderator.

Balancing performance and cost in hybrid clouds

October 31, 2013
10:00am — 11:00am PDT

FEATURED PANELISTS

Dave Ohara
Ted Chamberlin, Vice President Market Development, CoreSite

MODERATED BY

Paul Miller

A hybrid cloud strategy is not a destination. It is an ongoing balancing act in which enterprises weigh a shifting landscape of cost, security, and performance against business needs.

Decision criteria have never been murkier. What used to be "showstopping" issues such as regulatory requirements for private connectivity or a need for real-time data access can now be overcome through secure, low-latency interconnections, for a price. The challenge for IT is pairing applications, services, and data sets with the right balance of public and private cloud services, at the right price.
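
To make that "pairing" idea concrete, here is a minimal sketch of my own (not from the panel) of how one might score a workload against public and private options. The workload attributes, weights, and dollar figures are all hypothetical assumptions for illustration, not real criteria.

# Minimal sketch: one hypothetical way to weigh cost, latency, and
# compliance when pairing a workload with a public or private option.
# All weights and numbers below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    needs_private_connectivity: bool   # e.g., a regulatory requirement
    latency_sensitive: bool            # real-time data access
    monthly_budget_usd: float

@dataclass
class CloudOption:
    name: str
    is_private: bool
    monthly_cost_usd: float
    interconnect_latency_ms: float

def score(workload: Workload, option: CloudOption) -> float:
    """Higher is better; a naive weighted balance of fit and cost."""
    s = 0.0
    # "Showstoppers" become penalties that budget headroom can offset.
    if workload.needs_private_connectivity and not option.is_private:
        s -= 5.0
    if workload.latency_sensitive and option.interconnect_latency_ms > 10:
        s -= 3.0
    # Reward options that leave budget headroom, penalize ones that don't.
    s += (workload.monthly_budget_usd - option.monthly_cost_usd) / 1000.0
    return s

if __name__ == "__main__":
    wl = Workload("billing-db", needs_private_connectivity=True,
                  latency_sensitive=True, monthly_budget_usd=20000)
    options = [
        CloudOption("public-cloud", is_private=False,
                    monthly_cost_usd=8000, interconnect_latency_ms=25),
        CloudOption("private-colo", is_private=True,
                    monthly_cost_usd=15000, interconnect_latency_ms=2),
    ]
    best = max(options, key=lambda o: score(wl, o))
    print(f"Best fit for {wl.name}: {best.name}")

The point of the sketch is the balancing act itself: as interconnect latency and pricing shift, the same workload can flip from one side of the hybrid line to the other.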