Where are Facebook's data centers? Follow the money

Rich Miller at DataCenterKnowledge has a good post that answers the question: where are Facebook's data centers?

Facebook: $50 Million A Year on Data Centers

September 16th, 2010 : Rich Miller

A look at the fully-packed racks inside a Facebook data center facility.

An analysis of Facebook’s spending with data center developers indicates that the company is now paying about $50 million a year to lease data center space, compared to about $20 million when we last analyzed its leases in May 2009.

When you spend $50 million a year, you can follow the money trail.

  • Facebook is paying $18.13 million a year for 135,000 square feet of data center space it leases from Digital Realty Trust (DLR) in Silicon Valley and Virginia, according to data from the landlord’s June 30 quarterly report to investors.
  • The social network is also leasing data center space in Ashburn, Virginia from DuPont Fabros Technology (DFT). Although the landlord has not published the details of Facebook’s leases, data on the company’s largest tenants reveals that Facebook represents about 15 percent of DFT’s annualized base rent, which works out to about $21.8 million per year.

...

Read more

Can Google go below 1.10 PUE with Sea Water Cooling?

DataCenterKnowledge reports on Google's use of Sea Water in its Hamina, Finland data center.

Google Using Sea Water to Cool Finland Project

September 15th, 2010 : Rich Miller

Google will use cool sea water in the cooling system for its new data center in Hamina, Finland, which is under construction and scheduled to go live early next year. The initiative continues Google’s focus on data center efficiency and sustainability. Using cool water allows Google to operate without energy-hungry chillers, and also limits the facility’s impact on local water utilities.

Where there are money savings, there is typically less waste.  So, will this allow Google to go below a 1.10 PUE?

Here are Google's latest numbers, for Q2 2010.

Q2 2010 Performance

Quarterly energy-weighted average PUE:
1.17

Trailing twelve-month energy-weighted avg. PUE: 
1.18

Individual facility minimum quarterly PUE:
1.13, Data Center J

Individual facility minimum TTM PUE*:
1.13, Data Center B

Individual facility maximum quarterly PUE:
1.22, Data Center A

Individual facility maximum TTM PUE*:
1.23, Data Center H

* Only facilities with at least twelve months of operation are eligible for Individual Facility TTM PUE reporting
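PUE is simply the ratio of total facility energy to IT equipment energy, and the quarterly and trailing-twelve-month figures above are energy-weighted averages. Here is a minimal sketch of how those numbers are computed; the kWh values are illustrative, not Google's actual meter readings.

```python
# Sketch of PUE math. PUE = total facility energy / IT equipment energy.
# Values below are made up for illustration.

def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness for a single period."""
    return total_facility_kwh / it_equipment_kwh

def energy_weighted_pue(periods):
    """Energy-weighted average PUE across periods: total facility energy
    divided by total IT energy, NOT a simple mean of per-period PUEs."""
    total_facility = sum(facility for facility, _ in periods)
    total_it = sum(it for _, it in periods)
    return total_facility / total_it

# Four hypothetical quarters as (facility_kwh, it_kwh) pairs
quarters = [(118.0, 100.0), (117.0, 100.0), (119.0, 100.0), (118.0, 100.0)]
print(round(energy_weighted_pue(quarters), 2))  # 1.18
```

The energy weighting matters: a quarter where the facility used more total energy pulls the average toward that quarter's PUE, which is why the TTM number can differ from the latest quarterly number.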

Here are more details on Google's latest data center construction.

The company’s plans were discussed in an article in Computer Sweden (translation in English), which got a tour of the construction site in Hamina. There are no servers in sight yet, but the story reports that Google has refurbished the water pumps used at the former newsprint plant, and will use large pipes to draw cool water from the nearby Baltic Sea.

Google has a great goal for reducing its water consumption.

Google hopes to eventually use recycled water for up to 80 percent of the company’s total data center water consumption. “The idea behind this is simple: instead of wasting clean, potable water, use a dirty source of water and clean it just enough so it can be used for cooling,” Google says on its water management web page. “Cooling water still needs to be processed, but it’s much easier to treat it enough for data center use compared to cleaning it for drinking use.”

Read more

Environmental Impact of Data Center Diesel Generators - Quincy, WA

DataCenterKnowledge has a post about Washington State's Department of Ecology studying the environmental impact of Data Center Diesel Generator operation in Quincy, WA.

Quincy Generator Cluster Draws Scrutiny

September 13th, 2010 : Rich Miller

An aerial view of the Microsoft data center in Quincy, Washington

Economic development officials love clusters of huge Internet data centers. But environmental officials are less enthused about large clusters of diesel generators. The town of Quincy, Washington has both, serving as home to major data centers for Microsoft, Yahoo and Intuit (with another project from Sabey Corp. on the way). 

As much as the big data center operators would prefer not to discuss data center details, there is a social and environmental impact that the public has the right to review.

The Washington State Department of Ecology has approved Microsoft’s additional permit, but has also scheduled a public hearing in Quincy on Sept. 28 to hear from residents on the topic. The Ecology department conducted an evaluation of the health risks from diesel engine exhaust particulates, and found that the Microsoft expansion, viewed in isolation, is not likely to impact public health.

State officials and Microsoft are required to appear at a public meeting to present and discuss the generator expansion. The Department of Ecology took the opportunity to seek feedback from area residents, citing the growing concentration of data centers.

“Due to the interest expressed by other data companies to expand or build in the Quincy area, Ecology was concerned that the cumulative effect of diesel engine emissions should be assessed,” the state said in announcing the meeting.

Being green means more than your PUE and energy efficiency of equipment.

Here is the presentation referenced.  Gary Palcisko is the presenter.

[Slides from the Department of Ecology presentation]

And note this slide for potential future requirements.

[Slide on potential future requirements]

Read more

Data Center Hunter or Harvester/Farmer, looking for customers

In the data center industry there are many people who enjoy game hunting.

And a dominant method to find customers follows a hunting methodology as opposed to a harvester/farmer approach.

Here is an article that talks about the Hunter vs. Harvester approach.

In working with business owners and entrepreneurs over the years, I’ve noticed that when it comes to acquiring new customers, most of them are hunters. They pounce on new leads, chase the prospects, make themselves readily available to the prospect and then bend over backwards to land the new customer.

On the other hand, I’ve noticed that the most successful business owners and entrepreneurs take a different approach to customer acquisition: they are harvesters. They gather in all their leads, work hard to prevent any from slipping through the cracks, cultivate those leads and then harvest them when the time is right for the customer.

The most interesting thing about these two styles is that the hunter usually gets tired, a bit humiliated and ends up getting small margins. On the other hand, the harvester stays fresh, confident and usually earns higher margins.

I just saw this post on DataCenterKnowledge about data center planning, and it got me thinking about hunting vs. harvesting.

How to Avoid Data Center Planning Mistakes

September 8th, 2010 : Kevin Normandeau

Why do so many data center build outs and expansion projects fail? This white paper from Lee Technologies addresses this question by revealing the top nine mistakes organizations make when designing and building new data center space. It also examines an effective way to achieve success through the Total Cost of Ownership (TCO) approach.

One person may think this is harvesting, but I think it is more like hunting.

Here is an example of what I think of as a harvesting/farmer approach.

I've been watching my top 5 data center construction companies post.  I get about 50 hits a week - every week for the past 8 months.  I am amazed there are 470 keywords that point to my post.

[Google Analytics screenshot: keyword traffic to the post]

And here is an example of this last week.

[Google Analytics screenshot: last week's traffic]

Looking at the ISPs, the list shows the following companies besides a long list of ISPs: Capital One, GM, Global Crossing, JP Morgan Chase, Network Appliance, and Yahoo.

Here are the top 10 cities for this week.  Note the ability to look at ranges of time to see what cities the customers are in.  If I looked across 8 months, I would see every major city.

[Google Analytics screenshot: top 10 cities]

A group of people in Cleveland are looking for a data center.  Who?  This one is obvious.

Capital One Partners


1300 East 9th Street
Cleveland, OH 44114-1506

Pretty cool I can do this research from a blog post and Google Analytics!

Here are the top Google keywords used to find my post.

[Google Analytics screenshot: top keywords]

If you were thinking like a Farmer/Harvester you would be figuring out how to reach the customers who are looking for these keywords in the cities I listed.

It is common for data center vendors to pay over $10K for a booth at a conference and maybe buy a speaking spot in front of as few as 2 dozen people.  But few people think like a harvester; most prefer hunting.

Not only that, but hunting for business is tough stuff, even for those who manage to make a living at it.  When you’re in “hunting” mode, you’re dialing for dollars; you feel resistance at every turn; rejection is common; you get “price shopped” against competitors so margins are thin; and you waste tons of time working with prospects who simply aren’t ready to buy.

Seems more efficient to be a Harvester.

On the other hand, when you’re in “harvesting” mode, you’re working smart and scooping up sales left and right. You’re like the fisherman with the irresistible bait, drawing your prospects to you. You can spend your time closing deals on the phone with hot leads or go out on the golf course because you know your prospects will call you when they’re ready to move forward.

And, now that I think about it, the data center people I enjoy talking to are Harvesters, and funny enough many of them enjoy game hunting.

Lead warming is about communicating with your prospects from the moment they express interest. And if they don’t buy right away, that’s OK, because you then don’t let them slip away; instead you breadcrumb them with information they’ll find valuable about your product, service or company.

If you think you want to be a Harvester and want to leverage my post you can drop me an e-mail dave@greenm3.com.  The easiest thing to do is to drop an inline advertisement in my post, and you'll have 50 eyeballs a week.  :-)  But, there are many more interesting things to try to be a data center harvester/farmer.

Read more

Why Twitter has dedicated servers for Justin Bieber, 3% of the load

There are 247 news articles about Twitter having dedicated servers for Justin Bieber, who accounts for 3% of Twitter traffic.

[Screenshot: news search results for the story]

CNET broke the news, citing a tweet.

Report: Justin Bieber is 3 percent of Twitter

by Chris Matyszczyk

If you believe that Twitter is full of inane, immature narcissism, here's one in your solar plexus.

For an allegation has reached my eyes and baffled them into blindness. The allegation is that, at any given moment, at any given movement of your lungs and toes, 3 percent of Twitter's infrastructure is dedicated solely to the one person who most defines our hopes and our times.

I am not speaking of Kim Kardashian, nor of Rep. Jack Kimble. I am speaking of the one person who can unite men and women, young and old, sane and slightly less so: Justin Bieber.

A fascinatingly hopeful post at Gizmodo offers that Dustin Curtis, a designer of some repute, was told by an employee at Twitter that 3 percent of the company's infrastructure is dedicated to the little man with the unreal voice and the even more unreal hair.

(Credit: Screenshot: Chris Matyszczyk/CNET)

Read more: http://news.cnet.com/8301-17852_3-20015781-71.html#ixzz0ywhpUh60

What I haven't seen anyone attempt to explain is why there would be racks of servers dedicated to one account.

What follows is common-sense reasoning, with no inside information from the Twitter folks.

When you want speed in your servers you want to keep your whole search index in RAM in a server. Google, eBay, and Amazon are good examples of companies who use this method in their search servers to support keyword lookup and result match-up.  Any time you go to disk or another server you will slow down the search results by orders of magnitude.

The cost to put your whole search index in RAM is too high, so you break down the problem.  Analyzing the flow of tweets, Twitter figured out many people are focused on a few areas, and probably stick to that topic watching the tweets.
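One plausible way to break the problem down, sketched below, is to route ordinary accounts to shards by hash while pinning exceptionally hot accounts to dedicated servers, so their huge read volume doesn't evict everyone else's data from RAM. This is my speculation, not Twitter's actual design; the account names and server labels are hypothetical.

```python
# Sketch (hypothetical, not Twitter's real architecture): hash-based
# sharding with a manual override table for extremely hot accounts.

import hashlib

# Hand-curated overrides: accounts hot enough to get dedicated hardware
HOT_ACCOUNTS = {"justinbieber": "dedicated-rack-1"}

NUM_SHARDS = 64  # ordinary accounts spread over a fixed pool of shards

def shard_for(account: str) -> str:
    """Return the server/shard responsible for an account's data."""
    if account in HOT_ACCOUNTS:
        return HOT_ACCOUNTS[account]
    # Deterministic hash so the same account always maps to the same shard
    digest = int(hashlib.md5(account.encode("utf-8")).hexdigest(), 16)
    return f"shard-{digest % NUM_SHARDS}"

print(shard_for("justinbieber"))  # dedicated-rack-1
```

The override table is the interesting part: it lets operators isolate the 3% of traffic that one account generates without re-architecting the hash ring for everyone else.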

Second Life was started with the idea of virtual reality and avatars, but many people are too lazy to keep up their 3D appearance.  Tweets are a much more efficient way to support a virtual presence.

In the same way that Saudi Arabia wanted access to RIM's encrypted data, the information and analytics available in Justin Bieber's servers can be useful.  And, Twitter is probably figuring out ways to monetize access to analytics.

Which brings up an interesting point: is Twitter the next service which governments want access to crawl?  For many people, e-mail and blogging are in the past, and Twitter is their main method to communicate.

There was a rumor that part of the reason Twitter chose Salt Lake City was to be near the NSA's new data center there.

Room to grow: a Twitter data center

Wednesday, July 21, 2010

Later this year, Twitter is moving our technical operations infrastructure into a new, custom-built data center in the Salt Lake City area. We're excited about the move for several reasons.
First, Twitter's user base has continued to grow steadily in 2010, with over 300,000 people a day signing up for new accounts on an average day. Keeping pace with these users and their Twitter activity presents some unique and complex engineering challenges (as John Adams, our lead engineer for application services, noted in a speech last month at the O'Reilly Velocity conference). Having dedicated data centers will give us more capacity to accommodate this growth in users and activity on Twitter.
Second, Twitter will have full control over network and systems configuration, with a much larger footprint in a building designed specifically around our unique power and cooling needs. Twitter will be able to define and manage to a finer grained SLA on the service as we are managing and monitoring at all layers. The data center will house a mixed-vendor environment for servers running open source OS and applications.
Importantly, having our own data center will give us the flexibility to more quickly make adjustments as our infrastructure needs change.
Finally, Twitter's custom data center is built for high availability and redundancy in our network and systems infrastructure. This first Twitter managed data center is being designed with a multi-homed network solution for greater reliability and capacity. We will continue to work with NTT America to operate our current footprint, and plan to bring additional Twitter managed data centers online over the next 24 months.

The probability of the NSA talking to Twitter at some point in time is almost a certainty.

Are Twitter's dedicated servers a help for the NSA?  There are Twitter servers dedicated to President Obama.  And Oprah too. :-)  Is the number of dedicated servers the new status symbol and measure of popularity?  It used to be records sold and movie revenue.  Is it now billions of tweets?

Read more