Google's Pulp NonFiction Data Center, Wired Magazine's article on repurposing a pulp mill

Wired has an article on Google's Hamina Data Center.

Google Reincarnates Dead Paper Mill as Data Center of Future

Google's Finland data center is the ultimate metaphor for the Internet Age (Photos: Google)

Joe Kava found himself on the southern coast of Finland, sending robotic cameras down an underground tunnel that stretched into the Baltic Sea. It’s not quite what he expected when he joined Google to run its data centers.

In February of 2009, Google paid about $52 million for an abandoned paper mill in Hamina, Finland, after deciding that the 56-year-old building was the ideal place to build one of the massive computing facilities that serve up its myriad online services. Part of the appeal was that the Hamina mill included an underground tunnel once used to pull water from the Gulf of Finland. Originally, that frigid Baltic water cooled a steam generation plant at the mill, but Google saw it as a way to cool its servers.

Nothing really new that hasn't been covered already, but it is worth noting that Wired's coverage reaches an audience maybe 100x bigger than a data center publication's.

It does sound like the author was frustrated at not getting more info, and he closes with this.

The complaint, from the likes of Facebook, is that Google doesn’t share enough about how it has solved particular problems that will plague any large web outfit. Reports, for instance, indicate that Google builds not only its own servers but its own networking equipment, but the company has not even acknowledged as much. That said, over the past few years, Google is certainly sharing more.

We asked Joe Kava about the networking hardware, and he declined to answer. But he did acknowledge the use of Spanner. And he talked and talked about that granite tunnel and Baltic Sea. He even told us that when Google bought that paper mill, he and his team were well aware that the purchase made for a big fat internet metaphor. “This didn’t escape us,” he says.

Is the author, by referencing Facebook, voicing his own complaint by inference?

SAP wakes up to what every search developer knows: if you want speed, be in memory and never touch the disk

WSJ has an article on SAP's radical new software: the data access is all in memory. Ooh.
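To make that concrete, here is a minimal Python sketch of why in-memory access wins. It is my illustration, not SAP's code: the disk path naively re-reads the file on every query, which is exactly the kind of round trip an in-memory engine avoids.

```python
import json
import tempfile
import time

# Toy dataset held in RAM: one million records keyed by id.
records = {i: {"id": i, "value": i * 2} for i in range(1_000_000)}

# Write the same records to disk to stand in for a disk-resident store.
tmp = tempfile.NamedTemporaryFile(mode="w", suffix=".json", delete=False)
json.dump({str(k): v for k, v in records.items()}, tmp)
tmp.close()

def lookup_in_memory(key):
    return records[key]  # a single hash lookup in RAM

def lookup_on_disk(key):
    with open(tmp.name) as f:
        data = json.load(f)  # naive: reload the whole file per query
    return data[str(key)]

start = time.perf_counter()
lookup_in_memory(123_456)
print(f"in-memory: {time.perf_counter() - start:.6f}s")

start = time.perf_counter()
lookup_on_disk(123_456)
print(f"on disk:   {time.perf_counter() - start:.6f}s")
```

The gap is several orders of magnitude, and that is before a real database adds network hops and locking.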

In-memory computing could be crucial for cloud computing, because offering services online requires companies to rapidly process large volumes of data. In December, SAP said it would pay $3.4 billion to acquire San Mateo, Calif.-based SuccessFactors Inc., which offers online services that help manage employees and carry out performance reviews. The company also paid $5.8 billion in 2010 to acquire Sybase Inc., which makes software that can send business information securely to mobile workers on their devices, easing a potential concern with HANA.

I wonder if the Google and Facebook developers look at this and say this is innovative?

SAP convinced Charité Universitätsmedizin Berlin, a large university hospital, to drop its Oracle software and switch to HANA. Together SAP and Charité developed a prototype of an iPad software application that uses the HANA machine to analyze three million data points for 140,000 admitted patients annually and determine if they are a fit for a clinical trial. Using the application, the hospital said it reduced the time it takes to find patients from weeks to less than one second.

"We need all this data in real time," says Martin Peuker, deputy chief information officer for the hospital.

Sounds like a problem that has been solved over and over. Now, that would be scary to SAP if Google, instead of Oracle, said, "We can find the patients in your data."

Or Facebook could say this is a social networking problem: "We can find the people you should connect with."


Data Center Energy Simulation, Fujitsu's Tool coming soon, an alternative to Romonet

At Fujitsu's North America Tech Forum, the green data center topic came up in many presentations, and there was a tech booth on Data Center Energy Efficiency through Simulation. The idea was announced in 2009.

Fujitsu Advances Green Data Centre Strategy with Total CO2 and Value Analysis Solution

Fujitsu Laboratories of Europe unveils its latest Green Data Centre development at the European Technology Forum


London, 16th Sep 2009 — Fujitsu Laboratories of Europe Limited announced today the launch of its latest Green Data Centre development at the European Technology Forum, hosted by Fujitsu Laboratories of Europe in London (16-17 September 2009). Fujitsu's Total CO2 and Value Analysis solution is the result of extensive research and development, in conjunction with the Carbon Trust in the UK, the company set up by the UK Government to accelerate the move to a low carbon economy.

Based on a core simulator developed through industry collaboration and with the support of the Carbon Trust to analyse energy use and carbon emissions in data centres and identify potential reductions, Fujitsu Laboratories' new technology represents a revolutionary approach. It breaks new ground in enabling a holistic analysis of energy usage within a data centre environment to be captured, quantitatively analysed and profiled, from the physical infrastructure, to the software, applications and delivered services.
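Fujitsu has not published the simulator's internals, but the accounting at the heart of any such tool is easy to sketch: sum the energy of each layer, then derive PUE and CO2. The overhead factors and carbon intensity below are my assumptions, not Fujitsu's numbers.

```python
# Simplified data center energy and CO2 model. All constants are assumed
# for illustration; a real simulator models per-component load curves
# and lets you run what-if scenarios against them.

it_load_kw = 1_000.0        # servers, storage, network
cooling_overhead = 0.45     # cooling draws 45% of IT load (assumed)
power_dist_overhead = 0.10  # UPS and distribution losses (assumed)
lighting_kw = 20.0
grid_co2_kg_per_kwh = 0.45  # assumed grid carbon intensity

total_kw = it_load_kw * (1 + cooling_overhead + power_dist_overhead) + lighting_kw
pue = total_kw / it_load_kw  # Power Usage Effectiveness

hours_per_year = 24 * 365
annual_kwh = total_kw * hours_per_year
annual_co2_tonnes = annual_kwh * grid_co2_kg_per_kwh / 1_000

print(f"PUE: {pue:.2f}")
print(f"Annual energy: {annual_kwh:,.0f} kWh")
print(f"Annual CO2: {annual_co2_tonnes:,.0f} tonnes")
```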

Given that the demonstration was done by Fujitsu Labs Europe, I was curious about how this product relates to Romonet.


It turns out both Fujitsu's and Romonet's tools came from the same beginnings.

The challenge for any tool from these companies, though, is going to market. Romonet is a software product you buy. Fujitsu is looking at lower-cost business models that put the product on the web. Fujitsu's tool will launch later this year.

Fujitsu's Prototype Server pools CPUs and Hard Drives for flexibility and performance

Fujitsu has a web page on its new server prototype that pools CPUs and hard drives.

At the tech product showcase, I'll see if I can discover more than what is on the website.

September 26, 2011
Fujitsu Laboratories Ltd.

Fujitsu Develops Prototype of World's First Server that Simultaneously Delivers High Performance and Flexibility

Proprietary resource pool architecture enables the development of new ICT services

Kawasaki, Japan, September 26, 2011 — Fujitsu Laboratories Limited today announced the construction of a next-generation server that, using resource pool architecture, is the world's first to succeed in the simultaneous delivery of high performance and flexibility.
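The press release is thin on detail, but the resource pool idea itself is easy to picture: CPUs and disks sit in shared pools, and logical servers are composed from them on demand instead of being fixed boxes. Here is a hypothetical software sketch of the concept, not Fujitsu's design; their prototype does this in hardware over a high-speed interconnect.

```python
# Illustrative sketch of a resource-pool architecture.

class ResourcePool:
    def __init__(self, cpus, disks):
        self.free_cpus = list(range(cpus))
        self.free_disks = list(range(disks))

    def compose_server(self, n_cpus, n_disks):
        """Carve a logical server out of the shared pools."""
        if len(self.free_cpus) < n_cpus or len(self.free_disks) < n_disks:
            raise RuntimeError("pool exhausted")
        return {
            "cpus": [self.free_cpus.pop() for _ in range(n_cpus)],
            "disks": [self.free_disks.pop() for _ in range(n_disks)],
        }

    def release_server(self, server):
        """Return a logical server's resources to the pools."""
        self.free_cpus.extend(server["cpus"])
        self.free_disks.extend(server["disks"])

pool = ResourcePool(cpus=64, disks=256)
db_server = pool.compose_server(n_cpus=16, n_disks=48)  # I/O-heavy workload
web_server = pool.compose_server(n_cpus=8, n_disks=2)   # CPU-bound workload
pool.release_server(web_server)                         # resources return
print(f"{len(pool.free_cpus)} CPUs and {len(pool.free_disks)} disks free")
```

The payoff is the flexibility in the headline: the same hardware can be rebalanced between I/O-heavy and CPU-heavy services without physically rewiring anything.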


Big Data Innovation Opportunity: McKinsey study

I am at the Fujitsu Technology Forum at the Santa Clara Convention Center.

One presentation you can look at as a PDF is McKinsey's Michael Chui discussing Big Data as a new frontier.

Big data: The next frontier for innovation, competition, and productivity

May 2011 | by James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, Angela Hung Byers
Contributing Practices: Business Technology

The amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus, according to research by MGI and McKinsey's Business Technology Office. Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers. The increasing volume and detail of information captured by enterprises, the rise of multimedia, social media, and the Internet of Things will fuel exponential growth in data for the foreseeable future.

The bottom-line value I saw in the PDF is the following list of ways big data creates value.

1. Creating transparency

2. Enabling experimentation to discover needs, expose variability, and improve performance

3. Segmenting populations to customize actions

4. Replacing/supporting human decision making with automated algorithms

5. Innovating new business models, products and services
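Of the five, item 3 is the easiest to show in code. A toy Python sketch with invented customer data: segment a population by a single attribute, then attach a different action to each segment.

```python
# Toy illustration of item 3: segment a population to customize actions.
# Customer data is invented.

customers = [
    {"id": 1, "annual_spend": 120.0},
    {"id": 2, "annual_spend": 4_500.0},
    {"id": 3, "annual_spend": 950.0},
    {"id": 4, "annual_spend": 15_000.0},
]

def segment(customer):
    spend = customer["annual_spend"]
    if spend >= 10_000:
        return "vip"     # e.g., assign an account manager
    if spend >= 1_000:
        return "growth"  # e.g., targeted upgrade offers
    return "mass"        # e.g., standard email campaigns

for c in customers:
    print(c["id"], segment(c))
```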
