How advanced is your data center strategy? Learning from Modern Military Strategist John Boyd

The data center is more and more strategic to many businesses, and it is now common for outages to cost $10k-$100k per minute.  Many data center executives have military backgrounds and are used to defending their country, and some data centers are built like fortresses, with armed guards inside the building.  But often it is not an outsider who brings down a data center; it is an insider who makes a mistake in operations and maintenance.  Those employees are not the enemy.  The enemy that has attacked you is the outage itself.  When an outage occurs you can run through a playbook that lists the standard approved operating procedures, which is fine if you have the time and the outage scenario was covered in your planning.

What happens when the outage is something you had not planned for?  You run the playbook, can't figure out how to address the outage, and now you are thinking, crap, what do we do now?  Outages can kill a company or business unit if data is destroyed or downtime is excessive.  Think of the T-Mobile Sidekick outage.

The incident caused a public loss of confidence in the concept of cloud computing, which had been plagued by a series of outages and data losses in 2009.[7] 

Was the enemy the Microsoft employees who ran the services?  No.  The enemy was a collection of ideas about what the right thing to do was, which eventually caused an outage.

A company statement said the mishap was due to "a confluence of errors from a server failure that hurt its main and backup databases supporting Sidekick users."[2] T-Mobile blamed Microsoft for the loss of data.[1]

Someone had the idea that the way to ensure uptime at the Super Bowl was to install a protection relay.

“The purpose of it was to provide a newer, more advanced type of protection for the Superdome,” Dennis Dawsey, an executive with Entergy Corp., told members of the City Council. Entergy is the parent company of Entergy New Orleans.

Entergy officials said the relay functioned with no problems during January’s Sugar Bowl and other earlier events. It has been removed and will be replaced.

Four years ago I read about John Boyd and his OODA loop approach and posted on it.  I tried to find more details on what John Boyd presented, but his presentations are difficult to understand, and unfortunately Boyd did not write his ideas down well enough for others to follow.  Then I found a PhD thesis by a military student who did take the time to explain John Boyd's ideas.  You can find it here.  Warning: this paper is for people who really want to understand modern military strategy.  The OODA loop concept has been transferred to business on the idea that the winners are those who can move faster and out-think their opponents.
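Boyd's loop is simple enough to sketch in code: observe the situation, orient (interpret what you see), decide, act, and feed the changed situation back into the next pass.  Here is a toy rendering applied to outage response; every state, alarm name, and action below is invented purely for illustration, not taken from any real runbook:

```python
# Toy sketch of Boyd's OODA loop applied to outage response.
# All states, alarms, and actions are invented for illustration.

def observe(system):
    """Collect raw facts about the system."""
    return {"alarms": system.get("alarms", []), "load": system.get("load", 0)}

def orient(observation):
    """Interpret the facts against what we already know."""
    if "power" in observation["alarms"]:
        return "power_event"
    if observation["load"] > 90:
        return "overload"
    return "nominal"

def decide(diagnosis):
    """Pick an action for the diagnosis."""
    return {"power_event": "switch_to_generator",
            "overload": "shed_load",
            "nominal": "monitor"}[diagnosis]

def act(system, action):
    """Apply the action; the changed system feeds the next loop."""
    if action == "switch_to_generator":
        system["alarms"] = []
    elif action == "shed_load":
        system["load"] = 50
    return system

system = {"alarms": ["power"], "load": 95}
for _ in range(3):  # each pass is one trip around the loop
    action = decide(orient(observe(system)))
    system = act(system, action)
```

The point of the loop is the feedback: each action changes what you observe next, and the side that cycles through it faster sets the tempo.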

Who is John Boyd?

a tribute written two days after Boyd's death on 9 March 1997 which describes him as

a towering intellect who made unsurpassed contributions to the American art of war. Indeed, he was one of the central architects in the reform of military thought which swept the services, and in particular the Marine Corps, in the 1980's. From John Boyd we learned about the competitive decision making on the battlefield - compressing time, using time as an ally. Thousands of officers in all our services knew John Boyd by his work on what was to be known as the Boyd Cycle or OODA loop. His writings and his lectures had a fundamental impact on the curriculum of virtually every professional military education program in the United States - and many abroad [..] he was the quintessential soldier-scholar - a man whose jovial outgoing exterior belied the vastness of his knowledge and the power of his intellect.

The problem the author, Frans Osinga, was trying to address was the lack of explanation of how Boyd came to his conclusions.  What were his logic and assumptions?

There are a number of short papers. Most if not all deal almost exclusively with the OODA loop concept. Recently, two biographies have appeared. Robert Coram's work focuses in particular on Boyd's life and less on Boyd's strategic theory, although he does provide a good synopsis of it. Boyd's biographer Grant Hammond surpasses Coram in his rendering of Boyd's strategic theory but the book nevertheless falls short of offering a comprehensive account of Boyd's work. Instead it must be considered an authoritative and very accessible description of Boyd's ideas. Moreover, as it does not contain an integral rendering of Boyd's work, the educational experience contained within Boyd's slides, his unique use of words and the way he structures his arguments, does not receive the emphasis it deserves. Finally, although touching upon Boyd's wide array of sources underlying his work, space restrictions prevented a proper discussion of the intellectual background of Boyd's work.

I am slowly digesting the PhD thesis.  You can also buy it as a book.


This book aims to redress this state of affairs and re-examines John Boyd’s original contribution to strategic theory. By highlighting diverse sources that shaped Boyd’s thinking, and by offering a comprehensive overview of Boyd’s work, this volume demonstrates that the common interpretation of the meaning of Boyd’s OODA loop concept is incomplete. It also shows that Boyd’s work is much more comprehensive, richer and deeper than is generally thought. With his ideas featuring in the literature on Network Centric Warfare, a key element of the US and NATO’s so-called ‘military transformation’ programmes, as well as in the debate on Fourth Generation Warfare, Boyd continues to exert a strong influence on Western military thinking. Dr Osinga demonstrates how Boyd’s work can help us to understand the new strategic threats in the post-9/11 world, and establishes why John Boyd should be regarded as one of the most important (post)modern strategic theorists.

EMC's Innovation group builds thermal profile data center robot for $200

One of my readers, Vivek, sent a link to a cool robot used in the data center to collect thermal profile data.  You could install permanent thermal sensors in your data center, have someone wander around to collect data, or send a robot around.  Having the robot go around 24x7 seems like a good choice, and given it is built on an iRobot Roomba you get a clean floor too. :-)


The idea is to build a low cost platform to monitor environmental parameters in a data center. We initially planned to take an Arduino with DS18B20 temp sensors around and build a temperature map of the data center. But we would need to capture indoor location information as well with this method, which looked tedious and error prone. It is a good thermal detector but not good for building a thermal map. So we brainstormed with our team and someone joked about putting it on a Roomba and driving it around. The idea looked frugal because you can either put hundreds of sensors in your data center or take a few sensors and walk around. The two approaches differ technically, but the latter is very low cost and good enough for quick data center cooling fixes.
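A rough sketch of the "few sensors, drive around" approach: tag each temperature reading with the robot's (x, y) position and bin the readings into grid cells to get a thermal map.  The sample data and cell size below are made up for illustration:

```python
from collections import defaultdict

# Hypothetical samples: (x_m, y_m, temp_C) collected as the robot drives.
samples = [(0.5, 0.5, 22.1), (0.7, 0.4, 22.5),
           (3.2, 0.5, 27.8), (3.4, 0.6, 28.2),
           (0.6, 3.1, 21.9)]

CELL = 1.0  # bin size in meters; coarse is fine for spotting hot aisles

def thermal_map(samples, cell=CELL):
    """Average all readings that fall in the same grid cell."""
    cells = defaultdict(list)
    for x, y, t in samples:
        cells[(int(x // cell), int(y // cell))].append(t)
    return {k: sum(v) / len(v) for k, v in cells.items()}

tmap = thermal_map(samples)
hot = max(tmap, key=tmap.get)  # the warmest cell in the map
```

With the map in hand, the warmest cells point you at the hot aisles worth a closer look.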

I had a chance to have an e-mail discussion with Vivek, and one of the questions I had was how he knows where the robot is in the data center.  The answer: they know the start point, and they track the wheel movement, which creates a path of where the robot has been.  But if the robot is kicked, the location is unknown.
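That "start point plus wheel movement" scheme is classic dead reckoning.  A minimal differential-drive version looks like the sketch below; the encoder resolution and wheel spacing are invented values, not the robot's real specs:

```python
import math

# Minimal dead-reckoning sketch for a differential-drive robot.
# TICKS_PER_M and WHEEL_BASE are invented values for illustration.
TICKS_PER_M = 1000.0   # encoder ticks per meter of wheel travel
WHEEL_BASE = 0.3       # distance between the wheels, meters

def step(pose, left_ticks, right_ticks):
    """Update (x, y, heading) from one pair of encoder readings."""
    x, y, theta = pose
    dl = left_ticks / TICKS_PER_M
    dr = right_ticks / TICKS_PER_M
    d = (dl + dr) / 2.0                 # distance traveled by the center
    theta += (dr - dl) / WHEEL_BASE     # change in heading, radians
    return (x + d * math.cos(theta), y + d * math.sin(theta), theta)

# Drive straight one meter: equal tick counts on both wheels.
pose = (0.0, 0.0, 0.0)
pose = step(pose, 1000, 1000)
```

The weakness is exactly the one Vivek describes: the estimate is only an integral of wheel motion, so anything that moves the robot without turning the wheels, like a kick, silently corrupts the position until you re-zero at a known landmark.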
 
Seems like this is a good project for a summer intern to try at your data center.
 
Here is a video of the robot.

 

Rankings of Top Respected companies with big data center footprints

Harris Interactive has a poll on the most respected companies.  What I thought would be interesting is to see where the companies with big data center footprints rank on the list.


The following are some of the big players in data centers, and it is interesting to think about how data centers play a role in their business.

Amazon.com #1

Apple #2

Google #4

Microsoft #15

Dell #26

IBM #28

HP #34

Verizon #36

AT&T #39

Facebook #42

Are you ready for the Pacific NW Megaquake?

With all the news about Sandy in the Northeast, many learned whether their data centers could ride out a 100-year flood.

In the Pacific NW, the big risk is a megaquake.


Oregon Live estimates the financial impact to Oregon alone at $32 billion.

The next great Cascadia subduction-zone earthquake will kill thousands in Oregon and cause at least $32 billion in economic losses unless preparations are radically overhauled, a state panel says.

When, not if, the magnitude-9.0 quake strikes -- let alone an accompanying tsunami -- Oregon will face the greatest challenge in its history, the state earthquake commission said in a 290-page draft report released Monday to The Oregonian.

Now, you may think that Eastern Washington and Eastern Oregon are a safe distance from the threat of a tsunami.  But when a quake this big hits, it affects all parts of the infrastructure.  Pacific NW fiber cables could be broken, water lines break, electrical systems have cascading failures, and diesel fuel comes under federal management.

Transmission towers may topple into the river, blocking ships. Fires, landslides and explosions will proliferate. Hydrants and sprinkler systems won't work.

There will be no water or sewer service, no electricity and no ATMs, telephones, television, radio or Internet. Willamette River bridges will be impassable. Food will soon run out.

Responding to the disaster will be difficult, experts found, because of a sort of emergency gridlock. To restore phone service, crews will need restored electricity. To bring back power, workers will require repaired roads and bridges. To fix highways, crews will need restored fuel delivery and distribution.
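The gridlock the experts describe is a dependency graph: each repair waits on another repair.  A small topological sort makes the restoration order explicit; the graph below encodes only the dependencies named in the quote:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Restoration dependencies from the quote above: phones need power,
# power needs roads and bridges, roads need fuel delivery.
deps = {
    "phone_service":     {"electricity"},
    "electricity":       {"roads_and_bridges"},
    "roads_and_bridges": {"fuel_delivery"},
}

order = list(TopologicalSorter(deps).static_order())
# Fuel delivery comes first and phone service last.
```

If the chain had a cycle, say fuel delivery also needing phone service, `static_order` would raise a `CycleError`, which is the formal version of the emergency gridlock the report warns about.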

One issue, though, is that Cascadia earthquakes are spaced out by hundreds of years.

Earthquake magnitude

The Cascadia subduction zone can produce very large earthquakes ("megathrust earthquakes"), magnitude 9.0 or greater, if rupture occurs over its whole area. When the "locked" zone stores up energy for an earthquake, the "transition" zone, although somewhat plastic, can rupture. Great Subduction Zone earthquakes are the largest earthquakes in the world, and can exceed magnitude 9.0. Earthquake size is proportional to fault area, and the Cascadia Subduction Zone is a very long sloping fault that stretches from mid-Vancouver Island to Northern California. It separates the Juan de Fuca and North American plates. Because of the very large fault area, the Cascadia Subduction Zone could produce a very large earthquake. Thermal and deformation studies indicate that the locked zone is fully locked for 60 kilometers (about 40 miles) downdip from the deformation front. Further downdip, there is a transition from fully locked to aseismic sliding.[6]

In 1999, a group of Continuous Global Positioning System sites registered a brief reversal of motion of approximately 2 centimeters (0.8 inches) over a 50 kilometer by 300 kilometer (about 30 mile by 200 mile) area. The movement was the equivalent of a 6.7 magnitude earthquake.[7] The motion did not trigger an earthquake and was only detectable as silent, non-earthquake seismic signatures.[8]

Great Earthquakes

| estimated year (2005 source[9]) | estimated year (2003 source[10]) | interval (years) |
|---|---|---|
| about 9 pm, January 26, 1700 (NS) | | 780 |
| 780-1190 CE | 880-960 CE | 210 |
| 690-730 CE | 550-750 CE | 330 |
| 350-420 CE | 250-320 CE | 910 |
| 660-440 BCE | 610-450 BCE | 400 |
| 980-890 BCE | 910-780 BCE | 250 |
| 1440-1340 BCE | 1150-1220 BCE | unknown |

Earthquake timing

The last known great earthquake in the northwest was the 1700 Cascadia earthquake. Geological evidence indicates that great earthquakes may have occurred at least seven times in the last 3,500 years, suggesting a return time of 300 to 600 years. There is also evidence of accompanying tsunamis with every earthquake; one line of evidence for these earthquakes is tsunami damage, along with Japanese records of tsunamis.[11]