Microsoft announces 300k servers for Xbox One, trouble for Sony and Nintendo

The TV console wars are fought between Microsoft, Sony, and Nintendo. Sony, remember, had a serious outage in 2011 that brought down its service.

The PlayStation Network outage was the result of an "external intrusion" on Sony's PlayStation Network and Qriocity services, in which personal details from approximately 77 million accounts were stolen and prevented users of PlayStation 3 and PlayStation Portable consoles from playing online through the service.[1][2][3][4] The attack occurred between April 17 and April 19, 2011,[1] forcing Sony to turn off the PlayStation Network on April 20. On May 4 Sony confirmed that personally identifiable information from each of the 77 million accounts appeared to have been stolen.[5] The outage lasted 24 days.

I would guess anyone who has worked in Sony's online services group has a bad taste in their mouth from that incident, and that it has been hard to get more resources since.

Xbox One launched this past week, and Xbox Live is a big part of its services.

DatacenterKnowledge mentions that Microsoft will have 300,000 servers as part of Xbox One. Running a Google search and a Bing search didn't turn up the source of the information, such as a Microsoft transcript of the event.

When we launched Xbox Live in 2002, it was powered by 500 servers. With the advent of the 360, that number had grown to over 3,000. Today, 15,000 servers power the modern Xbox Live experience. But this year, we will have more than 300,000 servers for Xbox One, more than the entire world's computing power in 1999. (Cheers, applause.)

This matches the DCK article.

“When we launched Xbox Live in 2002, it was powered by 500 servers,” Microsoft’s Marc Whitten said in introducing the new platform. “With the advent of the 360, that had grown to over 3,000. Today, 15,000 servers power the modern Xbox Live experience. But this year, we will have more than 300,000 servers for Xbox One.”

Curious, I wanted to see what was actually said, so I found the Xbox One launch event on YouTube; the 23:13 mark is where the Xbox server reference is made. And thanks to the YouTube transcript, here is the text.

23:13
when we launch xbox live in two thousand two it was powered by five hundred
23:16
servers
23:17
with the advent of the three sixty that number had grown to over three thousand
23:22
today
23:22
fifteen thousand servers power the modern xbox live experience
23:27
but this year
23:29
we'll have more than three hundred thousand servers for xbox one
23:33
more than the entire world computing power in nineteen ninety nine

Part of what I do for some clients is provide research services, where it is important to get to the original source of information and show where the public disclosures were made. Thanks to YouTube and other online services, it is so much easier to get to the source of information, which is transforming how news is reported and how analysis can be done.

The story behind the 400 ppm CO2 milestone

There were all kinds of news stories about the CO2 level reaching 400 ppm.

National Geographic: "Climate Milestone: Earth’s CO2 Level Passes 400 ppm"

The Energy Collective (Lou Grinzo, May 11, 2013): "Climate Change and CO2 400 ppm ... on 400 ppm, which is to say, an amount of CO2 in the atmosphere that's 400 parts per million, by volume."

The New York Times: "Heat-Trapping Gas Passes Milestone, Raising Fears"

Out of all the hype, National Geographic and The Economist tell the story behind the 50+ years of measurement started by Charles David Keeling.

Here is the National Geographic post.

Climate Milestone: Earth’s CO2 Level Passes 400 ppm

Greenhouse gas highest since the Pliocene, when sea levels were higher and the Earth was warmer.

Two teams of scientists at the Mauna Loa Observatory in Hawaii have been measuring carbon dioxide concentration there for decades, and have watched the level inch toward a new milestone.

Photograph by Jonathan Kingston, National Geographic

Robert Kunzig

National Geographic News

Published May 9, 2013

An instrument near the summit of Mauna Loa in Hawaii has recorded a long-awaited climate milestone: the amount of carbon dioxide in the atmosphere there has exceeded 400 parts per million (ppm) for the first time in 55 years of measurement—and probably more than 3 million years of Earth history.

And here is The Economist post.

Environmental monitoring

Four hundred parts per million

The only good news about the Earth’s record greenhouse-gas levels is that they have been well measured

May 11th 2013 |From the print edition

CHARLES D. KEELING, mostly known as Dave, was a soft-spoken, somewhat courtly man who changed the way people and governments see the world. A slightly aimless chemistry graduate with an interest in projects that took him out into the wild, in 1956 he started to build instruments that could measure the proportion of carbon dioxide in the atmosphere, a scientific topic which, back then, was barely even a backwater. In 1958, looking for a place where the level of carbon dioxide would not be too severely influenced by local plants or industry, he installed some instruments high up on Mauna Loa, a Hawaiian volcano. He found that the level fluctuated markedly with the seasons, falling in northern summer as plants took up carbon dioxide and rising in northern winter as dead foliage rotted. And he found that the annual average was 315 parts per million (ppm).

The Economist honors Dave's effort:

Scientists involved in other measurements of the Earth, and those who pay for their work, need to build on his legacy. So does anyone taking a position on global-warming, where numbers as clear as Keeling’s are a rarity. Measurements of the temperature of the ocean depths and the acidity of its surface waters, of the volume of the planet’s forests and the mass of its ice sheets (see article), need to be made not just for the few years of a specific research project. Their ceaseless continuance needs to be built into the planet’s infrastructure. A world in which governments claim to be committed to spending trillions of dollars to change the shape of the Keeling curve decades hence, but do not find the funds to produce consistent records of the change going on today, is one that still has lessons to learn from the patient chemist.

And National Geographic does as well:

When the elder Keeling started at Mauna Loa, the CO2 level was at 315 ppm. When he died in June 2005, it was at 382. Why did he keep at it for 47 years, fighting off periodic efforts to cut his funding? His father, he once wrote, had passed onto him a "faith that the world could be made better by devotion to just causes." Now his son and the NOAA team have taken over a measurement that captures, more than any other single number, the extent to which we are changing the world—for better or worse.

Watch out, DCIM and other industrial data systems vendors: IBM will be showing up with the MessageSight appliance in the future

On Tuesday I had a chance to sit in on a discussion with IBM's Michael Curry.

Michael has his own blog here.

By way of background, I work for IBM, live in Massachusetts, and have about 20 years of experience in all aspects of software. However, the postings on this site are my own and don’t necessarily represent IBM’s positions, strategies or opinions. At this point in time, I’m most interested in topics like mobile,  Cloud APIs & API management, SOA, security, big data analytics,  and data protection, so I’ll likely be talking about some of those. However, as I said, my interests have a tendency to shift…

One of the questions from the other media and analysts was what Michael is excited about. Michael discussed MessageSight, which is in beta and ships by the end of May.

Michael is a reader of this blog and is quite technical.   

One of the questions I asked is whether the MessageSight appliance is designed for fail-over and/or mesh environments. Yes, it is. One example could be a regional approach to collecting transportation data: in the local area you could have two appliances set up for fail-over, then network those nodes in a mesh to share data with other MessageSight servers. With a goal of 99.999% availability, this design makes sense.
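Since MessageSight speaks standard MQTT (more on that below), the client side of such a fail-over pair is easy to sketch. Here is a minimal Python example using the open-source paho-mqtt library; the appliance hostnames, client ID, and topic are my own inventions, not anything IBM disclosed.

```python
# Minimal sketch: publish transportation telemetry to a primary/standby
# pair of MQTT brokers. Hostnames and topic are hypothetical; assumes
# the paho-mqtt 1.x client API.
import paho.mqtt.client as mqtt

BROKERS = [("messagesight-a.example.com", 1883),  # primary appliance
           ("messagesight-b.example.com", 1883)]  # fail-over standby

def connect_with_failover():
    client = mqtt.Client(client_id="region-west-sensor-42")
    for host, port in BROKERS:
        try:
            client.connect(host, port, keepalive=60)
            return client
        except OSError:
            continue  # primary unreachable, try the standby
    raise RuntimeError("no MessageSight appliance reachable")

client = connect_with_failover()
client.loop_start()  # background thread handles keepalives and acks
# QoS 1 means the broker acknowledges receipt, a reasonable default
# for telemetry feeding a 99.999%-availability service.
client.publish("transport/region-west/vehicle/42/speed", "57.2", qos=1)
```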

When I was out in the exhibit hall I had a chance to chat with a technical person. Hint: when you walk around with a press badge, you have to wait to talk to the people who are approved to talk to the press. The funny thing is that I was on the other side of this when I worked for Microsoft, where you needed to have press training. There were a few people I geeked out with, discussing hardware and software systems; at the end they realized my badge said press and they were really nervous. I told them not to worry, as I don't write about things that aren't public disclosures. Back to the public disclosures by approved people.

One of the things I learned from a press-trained technical person is that even though MQTT is emphasized, the MessageSight appliance works with Java Message Service (JMS) and other messaging protocols. Great, the MessageSight appliance is messaging-protocol agnostic. IBM likes MQTT, but it will work with many other protocols.

I asked Michael if MessageSight is targeted at use cases like oil and gas. Yes. Working with Modbus, SCADA, and other protocols is also part of what the MessageSight appliance does. Telecom and transportation are also interesting.

On Feb 22, 2010, IBM announced the Johnson Controls partnership. I remember that one, as I was there and had a chance to talk to the Johnson Controls guys. It would make sense for Johnson Controls to team up with IBM to let MessageSight work with their systems for the Smarter Buildings initiative.

IBM and Johnson Controls Join Forces to Make Buildings Smarter

Combined Offering to Enhance Energy and Operational Efficiencies

LAS VEGAS, - 22 Feb 2010: IBM (NYSE: IBM) and Johnson Controls (NYSE: JCI), today announced a new relationship to create a new era of smarter buildings.  Together, the companies will team to provide a Smart Building Solution that can improve operations and reduce energy and water consumption in buildings worldwide.

Next, has IBM looked at the DCIM market? It has been mentioned, but it is not a targeted scenario in the short term, which may be a sigh of relief for the DCIM vendors.

One of the scenarios IBM is targeting is Machine to Machine (M2M) systems.

My simple definition is M2M is the set of systems, networks, processes and data that connects machines, being technology in the field, with machines that are computers, primarily for the purpose of asset management and physical security. 

This definition seems workable but let’s explore it a bit further.  The first machine is the technology in the field, being the terminal or the endpoint of the network, and the second machine is the computer, typically located in the data centre.  The machine in the field has a routable IP address and collects data which is sent over a communications network to the computer for processing.  For example the computer correlates the sensor data with other data, it ingests, stores and analyses the live video, and it stores the smart meter data to track usage and generate a bill.
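To make that definition concrete, here is a tiny sketch of the "second machine" side of the smart-meter example: the data-center computer takes cumulative register reads sent from the field and turns usage into a bill. The readings, tariff, and schema are all invented for illustration.

```python
# Sketch of the data-center side of the smart-meter M2M example:
# ingest cumulative meter readings from the field and compute a bill.
# The readings, tariff, and schema are invented for illustration.

RATE_PER_KWH = 0.12  # hypothetical tariff, dollars per kWh

# Each reading: (meter_id, cumulative kWh register at read time)
readings = [
    ("meter-001", 10_250.0),  # start of billing period
    ("meter-001", 10_612.5),  # end of billing period
]

def bill_for_period(start_kwh: float, end_kwh: float) -> float:
    """Usage is the difference between cumulative register reads."""
    usage_kwh = end_kwh - start_kwh
    return round(usage_kwh * RATE_PER_KWH, 2)

start, end = readings[0][1], readings[-1][1]
print(f"usage: {end - start:.1f} kWh, bill: ${bill_for_period(start, end):.2f}")
# usage: 362.5 kWh, bill: $43.50
```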

Now, if you are an industrial control system vendor, you need to think about whether IBM's MessageSight appliance is a competitor or how you are going to work with it. Johnson Controls is partnering with IBM.

Think about these statements in the press release.

“When we launched our Smarter Planet strategy nearly five years ago, our strategic belief was that the world was going to be profoundly changed as it became more instrumented, interconnected and intelligent. IBM MessageSight is a major technological step forward in continuing that strategy,” said Marie Wieck, general manager, WebSphere, IBM. “Until now, no technology has been able to handle this volume of messages and devices. What's even more exciting is that this only scratches the surface of what's to come as we continue down this path of a Smarter Planet.”  

...

The ability of IBM MessageSight to handle and route tremendous volumes of messages makes it ideal for use by governments and organizations looking to connect and infuse intelligence into cities and across industries such as automotive, healthcare and finance. 

...

“To realize the vision of a Smarter Planet, we must first enable the universe of instrumented sensors, devices and machines to communicate more efficiently while sharing, managing and integrating large volumes of data at a rate much faster than ever before,” said Bob S. Johnson, director of development for Sprint’s Velocity Program. “We have been testing IBM MessageSight for some initial projects and are excited about the capabilities that it could help us deliver to the vehicle and beyond.” 

There is no reason why IBM's MessageSight couldn't be the repository of operational data in a data center or other industrial systems.

After 5 years of its Smarter Planet initiative, IBM launches MessageSight to connect sensors and things

I remember when IBM announced its Smarter Planet initiative 5 years ago, and I had lots of questions about how sensor networks would work. Today IBM announced the MessageSight appliance, built on MQTT.

LAS VEGAS - 29 Apr 2013: IBM’s (NYSE: IBM) Smarter Planet strategy took a major technological step forward today with the introduction of IBM MessageSight, a new appliance designed to help organizations manage and communicate with the billions of mobile devices and sensors found in systems such as automobiles, traffic management systems, smart buildings and household appliances.

Over the next 15 years, the number of machines and sensors connected to the Internet will explode. According to IMS Research, there will be more than 22 billion web-connected devices by 2020[i]. These new devices will generate more than 2.5 quintillion bytes of new data every day[ii], while every hour enough information is consumed by Internet traffic to fill seven million DVDs.[iii]

GigaOm’s Stacey Higginbotham has a post that goes into more detail.

IBM uses the example of the hundreds of sensors in your car recognizing a problem, turning on your check engine light, and then notifying the dealer so it can do remote diagnostics. As someone who is heading to the dealer tomorrow for a check engine light, this example caught my eye. Yet, I’m not sold on the need for a special box over more intelligence at the sensor, or perhaps a mesh network with nominal “intelligence.”

The internet of things exaflood is coming!

The idea is compelling, but it also grossly simplifies the flow of data inside the internet of things. For example, it assumes all sensor data must be processed in “real time.” It also assumes all the data must be processed. Both of these are untrue, especially in the early days of the internet of things. But IBM is looking ahead. From its release on the MessageSight appliance:

I am at the conference and I’ll see if there is anything else interesting on MessageSight.

Disclosure: I work with GigaOm Pro as a freelance analyst.

Facebook's PUE and WUE Dashboard https://www.fbpuewue.com/prineville

I found the announcement for Facebook's PUE and WUE dashboard on The Register.

Bit barn efficiency metrics on a minute-by-minute basis

Facebook has heaped pressure on major data center operators to be more transparent, publishing a dashboard that gives up-to-the-minute figures on the efficiency of the social network's gigantic bit barns.

But there was no reference to the Facebook URL.

I looked at DatacenterKnowledge; again, no URL for the Facebook dashboard.

Then I hit GigaOm and found the URL.

The facilities are still under construction, and, as a result, the data in the two dashboards can have abnormalities, but it should become more stable over time. The company detailed its plans in a Thursday blog post on the Open Compute Project site.

Here is the blog post that The Register and DCK used for their reporting.

A new way to report PUE and WUE

Thursday, April 18, 2013 · 09:10 AM

Today Facebook launched two public dashboards that report continuous, near-real-time data for key efficiency metrics – specifically, PUE and WUE – for our data centers in Prineville, OR and Forest City, NC. These dashboards include both a granular look at the past 24 hours of data and a historical view of the past year’s values. In the historical view, trends within each data set and correlations between different metrics become visible. Once our data center in Luleå, Sweden, comes online, we’ll begin publishing for that site as well.

It is a bit ironic that for a post about transparency it took me so long to find the original Facebook blog post and the dashboard: https://www.fbpuewue.com/prineville, with one year's worth of data.
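For anyone new to the two metrics on the dashboard, both are simple ratios: PUE (Power Usage Effectiveness) is total facility energy divided by IT equipment energy, and WUE (Water Usage Effectiveness) is site water use divided by IT equipment energy, in liters per kWh. A quick sketch with made-up sample numbers:

```python
# The two dashboard metrics as plain ratios. PUE and WUE follow the
# standard Green Grid definitions; the sample values below are made up.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.
    1.0 would mean every watt goes to IT gear; traditional enterprise
    data centers have historically run closer to 2.0."""
    return total_facility_kwh / it_equipment_kwh

def wue(site_water_liters: float, it_equipment_kwh: float) -> float:
    """Water Usage Effectiveness: site water use / IT energy, in L/kWh."""
    return site_water_liters / it_equipment_kwh

# Hypothetical hour of operation at a highly efficient facility:
it_kwh = 10_000.0
facility_kwh = 10_900.0  # IT load plus cooling, power distribution, lighting
water_l = 2_800.0

print(f"PUE: {pue(facility_kwh, it_kwh):.2f}")    # PUE: 1.09
print(f"WUE: {wue(water_l, it_kwh):.2f} L/kWh")   # WUE: 0.28 L/kWh
```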

Disclosure: I work for GigaOm Pro as a freelance analyst, and I like the fact that they embrace transparency in reporting.