Canadians Pitch Data Center Location to Reduce Costs

Canada’s “The Globe and Mail” has an article on data centres.

Technology for Tomorrow
Businesses seek ways to reduce data centre costs

As software-as-a-service, cloud computing and streaming Internet services grow, companies big and small face exorbitant electricity costs

Ian Harvey

Globe and Mail, updated Tuesday, Sep. 01, 2009 08:39AM EDT

When crude oil goes up, drivers can shop around for gas, but when it comes to electricity, businesses are subject to paying high rates because most operate during daylight hours, when prices peak. They can't simply up and move their homes or businesses to the cheapest energy location.

One major headache for business is data centres, those specially secured and cooled rooms with rack upon rack of monolithic black towers humming away amidst a tangle of cables, consuming vast amounts of electricity.

“Up to 30 per cent of energy costs in a business can come just from the servers,” says Bill St. Arnaud, Chief Research Officer at CANARIE (Canada's Advanced Internet Development Organization). “And if that's dirty power – from coal or oil – under Carbon Tax legislation proposed in the U.S. it could be triple the cost it is now.”

The article cites a variety of industry experts and facts to educate the reader on data centre issues, then closes with a pitch for Canada versus Buffalo, N.Y.'s win of a Yahoo data centre.

Similarly, Yahoo is building a 200,000 square foot data centre outside Buffalo because it will save $100-million over 15 years by accessing Niagara Falls' sustainable and affordable hydro. They're also planning to offset cooling costs by using the frigid Buffalo winter air.

Canada has many suitable locations, says Mr. St. Arnaud, pointing to our abundance of hydro, cold climate, deep lakes and rivers, and political and geographic stability.
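To put the location pitch in rough numbers, here is a back-of-the-envelope sketch of how much the electricity rate alone moves annual cost. The load, PUE, and rates below are my own illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope data centre electricity cost comparison.
# All loads, PUE, and rates are illustrative assumptions.

HOURS_PER_YEAR = 8760

def annual_cost(it_load_kw, pue, rate_per_kwh):
    """Facility electricity cost for a year of continuous operation."""
    facility_kw = it_load_kw * pue          # PUE scales IT load to total load
    kwh = facility_kw * HOURS_PER_YEAR
    return kwh * rate_per_kwh

# Hypothetical 1 MW IT load at PUE 1.8, compared across example rates
for region, rate in [("coal-heavy grid", 0.10), ("hydro region", 0.04)]:
    print(f"{region}: ${annual_cost(1000, 1.8, rate):,.0f}/year")
```

At these assumed rates the hydro region comes out at less than half the cost, which is the whole pitch in one loop.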

Read more

Who Owns Data from Tax-Funded Projects? 3 Examples from Transit Systems

News.com has a post on the issue of data access in three different transit systems – NY, SF, and Portland. This is something to think about when considering smart grid projects and other projects that could affect data centers. 

Who owns transit data?

by Rafe Needleman

Commuters on public transit want to know two fundamental things: when can I expect the bus or train to pick me up? And when will it drop me off at my destination?

Nowadays, they may also be wondering whether their local transit agency is willing to share that data with others to put it into new and helpful formats.

How likely is it that the arrival and departure information will be available on a site or service other than the official one? That depends on how open your local agency is. In some metro areas, transit agencies make data--routes, schedules, and even real-time vehicle location feeds--available to developers to mash into whatever applications they wish. In others, the agencies lock down their information, claiming it may not be reused without permission or fee.
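For the static data, many agencies publish schedules in the GTFS format (plain CSV files such as stop_times.txt) that Google and TriMet helped popularize. As a rough sketch of what developers do with an open feed (stop_times.txt and its columns are standard GTFS; the file path and stop ID are hypothetical):

```python
# Minimal "next departures" lookup over a GTFS-style stop_times.txt.
# The file path and stop_id below are hypothetical examples.
import csv

def next_departures(stop_times_path, stop_id, after, limit=3):
    """Return the next few (departure_time, trip_id) pairs at a stop.

    GTFS times are zero-padded HH:MM:SS strings, so plain string
    comparison sorts them correctly.
    """
    departures = []
    with open(stop_times_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["stop_id"] == stop_id and row["departure_time"] >= after:
                departures.append((row["departure_time"], row["trip_id"]))
    return sorted(departures)[:limit]

# e.g. the next three departures at stop "1234" after 5:30 pm
for time_, trip in next_departures("stop_times.txt", "1234", "17:30:00"):
    print(time_, trip)
```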

When taxpayers' money is involved, there are interesting views on what should happen.

In local blogs and on transit sites, outrage over agencies and companies that claim ownership of the data is growing. The core argument against locking down such data is that it's collected by or paid for by public, taxpayer-funded agencies and thus should be open to all citizens, and that schedule data by itself is not protectable content. The argument for locking it down is that the agencies might be able to profit from the data if they can maintain control of it. The counter to that is the belief that if the data is open, clever developers will create cool apps that make transit systems more usable, thus increasing ridership and helping transit agencies live up to their charters of moving people around and getting as many private cars as possible off the roads.

StationStops gives New York metro rail commuters a timetable in their iPhones. (Credit: StationStops)

Each city and metro area with a transit system is unique, but there are three cases in the U.S. that highlight the way the transit data drama can play out.

New York's view treats the data as a copyrighted work.

New York locks down subway schedules
As reported last week at ReadWriteWeb and elsewhere, the New York Metropolitan Transportation Authority believes its public train schedules fall under copyright law and thus applied an interpretation of the Digital Millennium Copyright Act (DMCA) to send a takedown notice to the developer of StationStops, an iPhone app that gives people access to train schedules on the Metro-North lines.

SF is taking a more open approach, but has its hiccups.

San Francisco writes data accessibility into contracts

The Routesy iPhone app uses NextBus data to predict transit arrival times. (Credit: Routesy)

In San Francisco last week, Mayor Gavin Newsom unveiled (via TechCrunch) the Datasf.org initiative, which aims to put all the city's data online for open access. Included in the program is the San Francisco Municipal Transportation Agency's schedule data. There's no question that this is a positive development for San Francisco Bay Area transit app developers and that it sets a good precedent for developers elsewhere. However, static schedule data is not the whole story for transit apps, especially on systems where route schedules are poorly adhered to (on New York's Metro-North lines, the schedules are somewhat reliable; for San Francisco's MUNI buses, they are not). The most useful new apps collect real-time vehicle location data, and access to that information is not yet available from DataSF.

In many cities, a company called NextBus gathers location data from vehicles and then makes that information available to the subscribing cities as well as on its own Web site. Developers of real-time transit iPhone apps, such as San Francisco's Routesy and iCommute, have had mixed results in getting access to that data.
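NextBus publishes its predictions through a public XML feed, and polling it looks roughly like the sketch below. The endpoint, parameter names, agency tag, and stop ID are my best understanding of that feed, so treat them as assumptions to verify against NextBus's own documentation.

```python
# Sketch: poll a NextBus-style public XML feed for arrival predictions.
# Endpoint, parameter names, agency tag, and stop ID are assumptions.
import urllib.request
import xml.etree.ElementTree as ET

URL = ("http://webservices.nextbus.com/service/publicXMLFeed"
       "?command=predictions&a=sf-muni&r=N&s=5205")

with urllib.request.urlopen(URL) as resp:
    tree = ET.parse(resp)

# Each <prediction> element carries the estimated wait as attributes.
for pred in tree.iter("prediction"):
    print(f"arriving in {pred.get('minutes')} min")
```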

Portland is the best.

Visit Portland for the best in transit apps
In Portland, Ore., openness on the part of the local transit agency has been a blessing for transit app developers. There are more than 25 apps that use the public TriMet data stream. Many of the apps duplicate others' functions and features, but it's just this kind of competition that makes apps better over time. When companies control data about their services and are the only ones to provide the apps that use the data, users do not get the same benefit of rapid application evolution.

Then there is Google, working with all three.

Google drives the bus
Google is the most aggressive company in the transit planning business. If you ask Google Maps for directions, by default it will route you by car, but you can also ask it to give you directions by public transit. In many metro areas, it will even direct you among different transit systems (from a local bus line to a commuter rail system, for example).

I travel to all three cities, and I agree Portland has the best system and is the most enjoyable to visit for public transit. The Portland TriMet system has this site for apps, and a developer center.

Read more

Should You Build a Mega Data Center? Washington, D.C. vs. Washington State

There is more news than I thought on the subject of Microsoft moving Windows Azure from Washington state to Texas. There are fifteen articles on the subject; here are a few.

Cloud Computing: Washington vs. Washington

BusinessWeek - Om Malik - 4 minutes ago

When I spoke with US CIO Vivek Kundra last month, he outlined a pragmatic approach to federal technology that involved adopting a hybrid model ...

Will Google regret the mega data center?

Register - Aug 7, 2009

In the wake of Microsoft's decision to remove its Windows Azure infrastructure from the state of Washington - where a change in local tax law has upped the ...

Microsoft's Drag-And-Drop Windows Azure Cloud

InformationWeek - John Foley - Aug 7, 2009

Citing an unfavorable change in tax laws, Microsoft is moving its Windows Azure cloud from a data center in Washington state to one in Texas. ...

I was curious what Om Malik would say in BusinessWeek. He points out that Washington, D.C. is excited about the cloud, while Washington state is going to build a mega data center near the state capital of Olympia instead of in Eastern Washington.

Cloud Computing: Washington vs. Washington

The feds want cloud computing services as part of their tech infrastructure, but Washington State plans to build its own data center

By Om Malik

When I spoke with U.S. CIO Vivek Kundra last month, he outlined a pragmatic approach to federal technology that involved adopting a hybrid model of data centers and cloud computing solutions. Buying infrastructure as a service instead of banking solely on energy-guzzling data centers is a good way to stretch tax dollars, he argued. Kundra's colleague, Aneesh Chopra, Chief Technology Officer of the U.S., shares his approach.

And while cloud computing is all the rage in Washington, D.C., it seems Washington State doesn't much care for cloud computing. Instead of buying cloud computing services from homegrown cloud computing giant Amazon.com (AMZN) (or newly emergent cloud player, Microsoft), the state has opted to build a brand-new, $180 million data center, despite reservations from some state representatives.

Microsoft (MSFT) is moving the data center that houses its Azure cloud services to San Antonio, from Quincy, Wash.—mostly because of unfavorable tax policies. The data centers are no longer covered by sales tax rebates—a costly proposition for Microsoft, which plans to spend many millions on new hardware for the Azure-focused data center.

It figures: just as industry experts question the mega data center, state governments pick up on the idea and use taxpayers' money to build one.

Read more

A Different Environmental Roof, Greenpeace Paints “Hazardous” on HP’s Rooftop

We're used to seeing solar panels put on rooftops as an environmental statement. Greenpeace made its own environmental statement by painting "Hazardous Products" on HP's roof.

Greenpeace Paints 'Hazardous' on HP's Roof Over Toxics Use

Jeff Bertolucci, PC World

Jul 28, 2009 2:32 pm

Greenpeace, upset by what it calls Hewlett-Packard's "backtracking" on commitments to build greener tech gadgets, today sent a very large message to HP management.

Activists from the international environment group painted the message "Hazardous Products" on the rooftop of HP's global headquarters in Palo Alto, Calif. Greenpeace also sent automated phone calls from actor William Shatner (yes, Captain Kirk) to the company, urging it to phase out toxic chemicals in its products.

"Earlier this year, HP postponed its 2007 commitment to phase out dangerous substances such as brominated flame retardants (BFRs) and polyvinyl chloride (PVC) plastic from its computer products (excluding their server and printer lines) from 2009 to 2011," said Greenpeace in a statement.

A statement on HP's Website promises to phase out BFR and PVC in its personal computing products in 2011.

Read more

Data Centers as Targets for Regulation

Mike Manos just posted a great entry on his latest visit to London.

CRC – it's not just a cyclic redundancy check

I have been tracking the energy efficiency work being done in the United Kingdom for quite some time, along with developments in the Carbon Reduction Commitment (CRC). My recent trip to London afforded me the opportunity to dive significantly deeper into the draft and discuss it with a user community (at the Digital Realty round table event) who will likely be among the first impacted by such legislation. For those of you unfamiliar with the initiative, let me give a quick overview of the CRC and how it will work.

The CRC is a mandatory carbon reduction and energy efficiency scheme aimed at changing energy-use behaviors and further incenting the adoption of efficient technology and infrastructure. While not specifically aimed at data centers (it's aimed at everyone), you can see that by its definition data centers will be significantly affected. It was introduced as part of the Climate Change Act 2008.

How are data centers targets?

In effect it is an auction-based carbon emissions trading scheme designed to operate under a cap-and-trade mechanism. While its base claim is that it will be revenue neutral to the government (except, of course, for penalties resulting from non-compliance), it provides a very handy vehicle for future taxation and revenue. This is important because, as data center managers, you are now placed in a position where you have primary regulatory reporting responsibilities for your company. No more hiding under the radar; your roles will now be front and center.

All organizations, including governmental agencies, that consumed more than 6,000 MWh in 2008 are required to participate. The mechanism is expected to go live in April 2010. Please keep in mind that this consumption requirement is called out as MWh and not megawatts. What's the difference? It's energy use over time for your whole company. If you as a data center manager run a 500 kilowatt facility, that works out to roughly 4,380 MWh a year, nearly three quarters of the qualification threshold on its own. You can bet you will be front and center on that issue, especially when the proposed introductory price is £12/tCO2 (about $19.48/tCO2). It's real money. Again, while not specifically focused on data centers, you can see that they will be an active contributor and participant in the process. For those firms with larger facilities, let's say 5 MW of data center space (don't forget to add in your annual average PUE), the data centers will qualify all by themselves.
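Here is a quick sketch of that MWh-versus-MW arithmetic. The grid carbon intensity is my own rough assumption for illustration, not a number from the CRC draft.

```python
# Sketch of CRC exposure for a data centre: qualification is energy used
# over the year (MWh), not peak draw (MW). Grid carbon intensity is an
# assumed illustrative figure, not from the CRC text.

HOURS_PER_YEAR = 8760
CRC_THRESHOLD_MWH = 6000           # qualification threshold cited above
PRICE_PER_TCO2_GBP = 12            # proposed introductory allowance price
GRID_KG_CO2_PER_KWH = 0.5          # assumed grid carbon intensity

def annual_mwh(avg_load_kw, pue=1.0):
    """Annual facility energy in MWh for a continuous average load."""
    return avg_load_kw * pue * HOURS_PER_YEAR / 1000

mwh = annual_mwh(500)              # the 500 kW example facility
print(f"{mwh:,.0f} MWh/yr = {mwh / CRC_THRESHOLD_MWH:.0%} of the threshold")

tonnes_co2 = mwh * GRID_KG_CO2_PER_KWH   # MWh * (kg/kWh) comes out in tonnes
print(f"~{tonnes_co2:,.0f} tCO2/yr, about "
      f"£{tonnes_co2 * PRICE_PER_TCO2_GBP:,.0f} in allowances")
```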

While many of you may read this, feel for your brothers and sisters in Great Britain, and sigh in relief that it's not you, keep in mind that there are already other mechanisms being put in place. The EU has the ETS, and the Obama Administration has been very public about a similar cap-and-trade program here in the United States. You can bet that the US and other countries will be closely watching the success and performance of the CRC initiative in the UK. They are likely to model their own versions after the CRC (why reinvent the wheel when you can just localize it to your country or region?). So it might be a good idea to read through it and start preparing how you and your organization will respond and comply.

And here Mike helps illustrate why data centers are targets.

Someone once declared to me that data centers are actually extremely efficient: they have to integrate themselves into the grid, they generally purchase and procure the most energy-efficient technologies, and they are incented from an operating-budget perspective to keep costs low. Why would the government go after them before it went after the end users, who typically do not have the most energy-efficient servers, or the OEMs that manufacture them? The simple answer is that data centers are easy, high-energy-concentration targets. Politically, going after users is a dicey affair, and as such DCs will bear the initial brunt.

Ideally, what we need is regulation that supports transparency, simplicity, fairness, and access, but this is a new idea, as the WSJ discusses.

About Time: Regulation Based On Human Nature

By JASON ZWEIG

Franklin D. Roosevelt sent Wall Street to the torture rack. Barack Obama is sending Wall Street to the psychology lab.

A key component of President Obama's financial-reform package is its proposed Consumer Financial Protection Agency, which would apply findings from the science of human behavior to ensure "transparency, simplicity, fairness, and access" for borrowers, savers and other financial consumers.

That could make it a lot harder for a part-time worker to end up with an exploding mortgage that eats all her take-home pay. It might even mean that regulators will finally pay attention to the visual presentation of financial data -- color, graphics and other factors that exert powerful sway over your decisions.

(Illustration: Heath Hinegardner)

The proposal is an outgrowth of "Nudge," the brilliant book published last year by two University of Chicago scholars, economist Richard H. Thaler and law professor Cass R. Sunstein. A longtime friend of President Obama, Prof. Sunstein has been nominated to head the White House's Office of Information and Regulatory Affairs, a job often described as "the regulation czar."

In my view, a behavioral approach is decades overdue. Financial regulations always have been written mainly by lawyers and legislators -- then promptly shot full of holes by promoters who understand how real human beings think and behave.

Unfortunately for data centers, I doubt we are going to get regulators who think like this.

Regulation that recognizes the limits of human rationality is an idea whose time has come. Like any good psychology lab, the proposed new agency will gather reams of data on how real people actually behave and adjust its rules accordingly, in real time. Of course, the financial industry will adjust its own behavior, trying to outsmart the new rules as fast as they are printed. But the war between the regulators and the regulated might finally be based on a realistic view of human nature, not fantasy.

Read more