Studying Tenants in Colocation Data Center Space: What Is in Digital Realty Trust

Digital Realty Trust's 2009 10-K statement lists its top 20 tenants on page 35.

image

I've been staring at this table for a while, seeing what could be learned from this public disclosure.

The top 3 tenants are resellers of space: Savvis, Equinix, and Qwest together comprise 24.8% of the sq ft, but only 18.9% of the rent.

The three financials (JPMorgan Chase, Morgan Stanley, and HSBC) are 2.8% of the sq ft and 7.1% of the revenue, which can be explained by the high-availability requirements for their space and the need to support financial transactions.

Google is not listed in the Top 20, but Microsoft, Yahoo!, and Facebook are. Microsoft has the most space of the three, with 2.5% of the space and 1.6% of the revenue. Yahoo! has 0.9% of the sq ft and 1.9% of the revenue. Facebook is #4 in revenue at 3.6%, yet has only 1.1% of the space in Digital's data centers. As a young company, Facebook could be in high-density space, so judging by sq ft alone without knowing power draw leaves open questions.

Comverse Technology appears to have the most space in one location, with 2.9% of the sq ft but only 1.4% of the revenue. But all spaces are not created equal, and it would seem the space Comverse moved into had little demand from others, which may explain why there is so much of it.
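A minimal sketch of the ratio math behind these comparisons, using the percentages quoted above. A revenue-to-space ratio above 1.0 means a tenant pays more than the portfolio-average rent per sq ft; below 1.0 means less (as you would expect for wholesale resellers):

```python
# Percent of total sq ft and percent of annualized rent for the tenants
# discussed above, as quoted from Digital Realty's 2009 10-K.
tenants = {
    "Savvis/Equinix/Qwest (resellers)": (24.8, 18.9),
    "JPMorgan/Morgan Stanley/HSBC (financials)": (2.8, 7.1),
    "Microsoft": (2.5, 1.6),
    "Yahoo!": (0.9, 1.9),
    "Facebook": (1.1, 3.6),
    "Comverse Technology": (2.9, 1.4),
}

for name, (sqft_pct, rent_pct) in tenants.items():
    # Ratio of rent share to space share: a proxy for rent per sq ft
    # relative to the portfolio average.
    ratio = rent_pct / sqft_pct
    print(f"{name}: {ratio:.2f}x average rent per sq ft")
```

The financials come out at roughly 2.5x the average rent per sq ft and Facebook over 3x, while the resellers sit below 1.0x, which is consistent with the reading above.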

Why are these numbers important?  As companies decide to build data centers, they use their colocation costs as part of the business justification.  But how many look at what others are paying for colocation space when judging the ROI of a data center?


Google TV vs. Apple TV: Microsoft Couldn't Last in This Battle

The latest moves by Google and Apple for the TV experience all require big data centers.  Newsweek covers how these companies are remaking the Tube.

Geek TV

Computer makers take over the tube.


For the past few years, tech companies have been trying to find a way to bring the Internet and television together, without much success. Sure, there are lots of little boxes you can attach to your TV that let you download content from the Internet, including Vudu, Roku, TiVo, Boxee, and Apple TV, not to mention game consoles from Microsoft and Sony. Each one gives you a little something different. But no single box gives you the whole Internet.

Now Google is out to replace all those crazy little boxes with Google TV. The software program will come built right into some TV sets and it will basically turn your TV into a computer.

In all the news about Google and Apple there is almost no mention of MSN TV.

image

Because you can’t buy MSN TV anymore.

image

MSN TV is the rebranded WebTV Microsoft acquired.

MSN TV (formerly WebTV) is the name of both a thin client which uses a television for display (rather than a computer monitor), and the online service that supports it.

The product and service was developed by WebTV Networks, Inc., a company purchased by Microsoft Corporation and absorbed into MSN (the Microsoft Network). While most thin clients developed in the mid-1990s were positioned as diskless workstations for corporate intranets, WebTV was positioned as a consumer device for web access.

A good friend from my Apple days worked at WebTV which eventually made him a Microsoft employee.  I hired him to help evangelize Windows XP before he moved on to senior architect position in Windows.  Ironically he now works at Google, not on the TV product.

Going back further in time I worked for a short period in the Microsoft Interactive TV team which was run by Craig Mundie.  Craig is the one who drove the acquisition of WebTV.

Microsoft takes notice

In February 1997, in an investor meeting with Microsoft, Steve Perlman was approached by Microsoft's Senior Vice President for Consumer Platforms Division, Craig Mundie. Despite the fact that the initial WebTV sales had been modest, Mundie expressed that Microsoft was impressed with WebTV and saw significant potential both in WebTV's product offering and in applying the technology to other Microsoft consumer and video product offerings.

Things have changed a lot in the 14 years since WebTV was launched.  The first servers for WebTV were run from the company's office in an old BMW dealership.

WebTV's online service running from servers in its tiny office, still based in the former BMW dealership

Apple and Google now have some of the top data center infrastructure in the industry.  What has changed in the 14 years is the requirement to use data centers to power the user experience. 

I wonder whether part of what contributed to Microsoft's inability to keep the MSN/WebTV platform going was a lack of data center capabilities during product development.

It is interesting how, not too long ago, servers were viewed as support devices, like file and print servers.  Now servers do the heavy lifting so the client experience is faster, more efficient, and more flexible.


Cost of Water Continues to Rise, More Sources Go Private

Water is a precious resource.  Newsweek has a long article on water, "The New Oil."

The New Oil

Should private companies control our most precious natural resource?



Sitka, Alaska, is home to one of the world’s most spectacular lakes. Nestled into a U-shaped valley of dense forests and majestic peaks, and fed by snowpack and glaciers, the reservoir, named Blue Lake for its deep blue hues, holds trillions of gallons of water so pure it requires no treatment. The city’s tiny population—fewer than 10,000 people spread across 5,000 square miles—makes this an embarrassment of riches. Every year, as countries around the world struggle to meet the water needs of their citizens, 6.2 billion gallons of Sitka’s reserves go unused. That could soon change. In a few months, if all goes according to plan, 80 million gallons of Blue Lake water will be siphoned into the kind of tankers normally reserved for oil—and shipped to a bulk bottling facility near Mumbai. From there it will be dispersed among several drought-plagued cities throughout the Middle East. The project is the brainchild of two American companies. One, True Alaska Bottling, has purchased the rights to transfer 3 billion gallons of water a year from Sitka’s bountiful reserves. The other, S2C Global, is building the water-processing facility in India. If the companies succeed, they will have brought what Sitka hopes will be a $90 million industry to their city, not to mention a solution to one of the world’s most pressing climate conundrums. They will also have turned life’s most essential molecule into a global commodity.

Lack of water is going to take out more data centers.  Think about this.

In the U.S., federal funds for repairing water infrastructure—most of which was built around the same time that Henry Ford built the first Model T—are sorely lacking. The Obama administration has secured just $6 billion for repairs that the EPA estimates will cost $300 billion. Meanwhile, more than half a million pipes burst every year, according to the American Water Works Association, and more than 6 billion gallons of water are lost to leaky pipes. In response to the funding gap, hundreds of U.S. cities—including Pittsburgh, Chicago, and Santa Fe, N.M.—are now looking to privatize. On its face, the move makes obvious sense: elected officials can use the profits from water sales to balance city budgets, while simultaneously offloading the huge cost of repairing and expanding infrastructure—not to mention the politically unpopular necessity of raising water rates to do so—to companies that promise both jobs and economy-stimulating profits.

Of course, the reality doesn’t always meet that ideal. “Because water infrastructure is too expensive to allow multiple providers, the only real competition occurs during the bidding process,” says Wenonah Hauter, executive director of the nonprofit, antiprivatization group Food and Water Watch. “After that, the private utility has a virtual monopoly. And because 70 to 80 percent of water and sewer assets are underground, municipalities can have a tough time monitoring a contractor’s performance.” According to some reports, private operators often reduce the workforce, neglect water conservation, and shift the cost of environmental violations onto the city. For example, when two Veolia-operated plants spilled millions of gallons of sewage into San Francisco Bay, at least one city was forced to make multimillion-dollar upgrades to the offending sewage plant. (Veolia has defended its record.)

The smart people are looking to reduce water use in the data center, as water is one of the biggest cost risks and availability issues.

If you don’t think water prices will change, consider this:

The bottom line is this: that water is essential to life makes it no less expensive to obtain, purify, and deliver, and does nothing to change the fact that as supplies dwindle and demand grows, that expense will only increase. The World Bank has argued that higher prices are a good thing. Right now, no public utility anywhere prices water based on how scarce it is or how much it costs to deliver, and that, privatization proponents argue, is the root cause of such rampant overuse. If water costs more, they say, we will conserve it better.

A green data center needs a low water strategy.


Turning waste heat into power

The current state of the art in data centers is to use the least amount of energy to remove heat from the data center, yielding a low PUE number.
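For reference, PUE (Power Usage Effectiveness) is total facility power divided by IT equipment power, so cooling overhead shows up directly in the ratio. A minimal sketch with hypothetical numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """PUE = total facility power / IT equipment power (the ideal is 1.0)."""
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,000 kW of IT load plus 400 kW of cooling,
# power distribution losses, and lighting.
print(pue(1000 + 400, 1000))  # 1.4
```

Every kW of cooling avoided lowers the numerator, which is why heat removal is where the energy efficiency work concentrates.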

What if the heat could be used to generate electricity?  A dream?   Yes.

Here is one attempt to turn heat into electricity.

Turning Waste Heat Into Power

ScienceDaily (Oct. 3, 2010) — What do a car engine, a power plant, a factory and a solar panel have in common? They all generate heat -- a lot of which is wasted.

University of Arizona physicists have discovered a new way of harvesting waste heat and turning it into electrical power.

Using a theoretical model of a so-called molecular thermoelectric device, the technology holds great promise for making cars, power plants, factories and solar panels more efficient, to name a few possible applications. In addition, more efficient thermoelectric materials would make ozone-depleting chlorofluorocarbons, or CFCs, obsolete.

A "forest" of molecules holds the promise of turning waste heat into electricity. UA physicists discovered that because of quantum effects, electron waves traveling along the backbone of each molecule interfere with each other, leading to the buildup of a voltage between the hot and cold electrodes (the golden structures on the bottom and top). (Credit: Justin Bergfield, University of Arizona)

The article doesn't discuss data centers, but it does discuss excess heat from photovoltaics and cars.

"Solar panels get very hot and their efficiency goes down," Stafford said. "You could harvest some of that heat and use it to generate additional electricity while simultaneously cooling the panel and making its own photovoltaic process more efficient."

"With a very efficient thermoelectric device based on our design, you could power about 200 100-Watt light bulbs using the waste heat of an automobile," he said. "Put another way, one could increase the car's efficiency by well over 25 percent, which would be ideal for a hybrid since it already uses an electrical motor."
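A quick back-of-envelope check shows the quote's two claims are self-consistent: 200 bulbs at 100 W is 20 kW recovered, and if that represents a 25% gain, the implied baseline output is about 80 kW, roughly a mid-size car engine:

```python
# 200 bulbs x 100 W each = 20 kW of electrical power recovered
# from automotive waste heat, per the quote.
recovered_kw = 200 * 100 / 1000

# If recovering 20 kW raises the car's output by about 25 percent,
# the implied baseline useful power is 20 / 0.25 = 80 kW.
implied_baseline_kw = recovered_kw / 0.25

print(recovered_kw, implied_baseline_kw)  # 20.0 80.0
```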


Google Data Centers used to save energy and lives

Google’s official blog shares a vision to use Google’s data centers to save energy and lives.

What we’re driving at

10/09/2010 12:00:00 PM

Larry and Sergey founded Google because they wanted to help solve really big problems using technology. And one of the big problems we’re working on today is car safety and efficiency. Our goal is to help prevent traffic accidents, free up people’s time and reduce carbon emissions by fundamentally changing car use.

Data Centers are key.

This is all made possible by Google’s data centers, which can process the enormous amounts of information gathered by our cars when mapping their terrain.

Enabling a bunch of smart people.

To develop this technology, we gathered some of the very best engineers from the DARPA Challenges, a series of autonomous vehicle races organized by the U.S. Government. Chris Urmson was the technical team leader of the CMU team that won the 2007 Urban Challenge. Mike Montemerlo was the software lead for the Stanford team that won the 2005 Grand Challenge. Also on the team is Anthony Levandowski, who built the world’s first autonomous motorcycle that participated in a DARPA Grand Challenge, and who also built a modified Prius that delivered pizza without a person inside. The work of these and other engineers on the team is on display in the National Museum of American History.

Lots of sensor data is used.

Our automated cars use video cameras, radar sensors and a laser range finder to “see” other traffic, as well as detailed maps (which we collect using manually driven vehicles) to navigate the road ahead.

Is the future a Google logo’d car?  :-)

We’ve always been optimistic about technology’s ability to advance society, which is why we have pushed so hard to improve the capabilities of self-driving cars beyond where they are today. While this project is very much in the experimental stage, it provides a glimpse of what transportation might look like in the future thanks to advanced computer science. And that future is very exciting.
