    Monday, Apr 12, 2010

    Google, Microsoft, Amazon, Nokia, Digital Realty Trust, Dupont Fabros vs. ASHRAE Standard 90.1's economizer requirement, which limits innovation - comment to be heard

    Google's Public Policy blog has a post co-signed by some of the most innovative data center operators:

    Chris Crosby, Senior Vice President, Digital Realty Trust
    Hossein Fateh, President and Chief Executive Officer, Dupont Fabros Technology
    James Hamilton, Vice President and Distinguished Engineer, Amazon
    Urs Hoelzle, Senior Vice President, Operations and Google Fellow, Google
    Mike Manos, Vice President, Service Operations, Nokia
    Kevin Timmons, General Manager, Datacenter Services, Microsoft

    This group, and probably many others, is appealing to ASHRAE to change the economizer requirement:

    Unfortunately, the proposed ASHRAE standard is far too prescriptive. Instead of setting a required level of efficiency for the cooling system as a whole, the standard dictates which types of cooling methods must be used. For example, the standard requires data centers to use economizers — systems that use ambient air for cooling. In many cases, economizers are a great way to cool a data center (in fact, many of our companies' data centers use them extensively), but simply requiring their use doesn’t guarantee an efficient system, and they may not be the best choice. Future cooling methods may achieve the same or better results without the use of economizers altogether. An efficiency standard should not prohibit such innovation.

    I know many of the people above, and thanks to a friend who forwarded me the link to Google's blog post, I speculated on what drove the economizer requirement:

    1. Without talking to anyone, one assumption is that this group, who are active in ASHRAE, brought up the energy efficiency issue early on, and ASHRAE stakeholders, most likely vendors who make economizers, saw an opportunity to make specific equipment a requirement for energy-efficient data centers. I could be wrong, but it would explain why an organization that sets standards would choose to specify equipment instead of performance.
    2. In many established data center organizations, such as financial firms, economizers were unacceptable in data centers a few years back. So is the move to mandate economizers a reaction to those who refused to use them for energy-efficient cooling?
    3. The ASHRAE consulting community sees a need for its services in meeting ASHRAE's economizer requirement. For example, if a given area has X hours a year available for running economizers, does the economizer need to run for a specific percentage of them? Hire an ASHRAE consultant to interpret the standard; I sure can't. (A rough sketch of that kind of hours calculation follows this list.)
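
    To make the hours question in point 3 concrete, here is a rough sketch of the kind of back-of-the-envelope calculation involved: count the hours in a year of hourly weather data that are cool enough for an air-side economizer to run. The 18 C threshold, the CSV format, and the file name are assumptions for illustration only; a real analysis also weighs humidity, altitude, and equipment limits, which is exactly where the consultants come in.

    import csv

    THRESHOLD_C = 18.0  # assumed maximum outside-air temperature for free cooling

    def economizer_fraction(weather_csv: str) -> float:
        """Fraction of the year the outside air is cool enough for an economizer.

        Expects a CSV with one row per hour and a 'dry_bulb_c' column
        (a hypothetical file format used only for this sketch).
        """
        with open(weather_csv) as f:
            temps = [float(row["dry_bulb_c"]) for row in csv.DictReader(f)]
        return sum(1 for t in temps if t <= THRESHOLD_C) / len(temps)

    # Example (hypothetical file name):
    # print(f"{economizer_fraction('hourly_weather_2009.csv'):.0%} of hours available")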

    The data center group above proposes the following as a better approach for the ASHRAE standard:

    Thus, we believe that an overall data center-level cooling system efficiency standard needs to replace the proposed prescriptive approach to allow data center innovation to continue. The standard should set an aggressive target for the maximum amount of energy used by a data center for overhead functions like cooling. In fact, a similar approach is already being adopted in the industry. In a recent statement, data center industry leaders agreed that Power Usage Effectiveness (PUE) is the preferred metric for measuring data center efficiency. And the EPA Energy Star program already uses this method for data centers. As leaders in the data center industry, we are committed to aggressive energy efficiency improvements, but we need standards that let us continue to innovate while meeting (and, hopefully, exceeding) a baseline efficiency requirement set by the ASHRAE standard.
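
    For readers who haven't run the numbers, PUE is simply total facility energy divided by the energy delivered to the IT equipment, so a facility drawing 1.5 MW to power 1.0 MW of servers has a PUE of 1.5. A trivial sketch of the calculation (the sample numbers are made up):

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power Usage Effectiveness = total facility power / IT equipment power.

        1.0 is the theoretical ideal (no cooling or power-distribution
        overhead); typical enterprise facilities of this era run around 2.0.
        """
        return total_facility_kw / it_equipment_kw

    print(pue(total_facility_kw=1500, it_equipment_kw=1000))  # -> 1.5

    The point of the proposal above is that the standard would cap that ratio and leave the how up to the operator, economizer or not.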

    It doesn't make sense that all data centers built to ASHRAE's standard have to use economizers. If you chose to build a waterfront data center and could use the body of water as a heat sink for your cooling, would ASHRAE not allow it, or would they?

    The public comment period is open until April 19. If you disagree with ASHRAE's economizer requirement, comment on this blog or on Google's blog post.

    I was able to talk to Google's Chris Malone on this topic after I wrote the above. The main concern Google has is that if you are trying to be innovative in energy efficiency, the last thing you want is a barrier saying you must use a particular technology.

    In other words, the standard should set the required efficiency without prescribing the specific technologies to accomplish that goal. That’s how many efficiency standards work; for example, fuel efficiency standards for cars specify how much gas a car can consume per mile of driving but not what engine to use.

    Imagine if MPG numbers were mandated to be achieved with a hybrid, diesel, or turbocharger. It is obvious that the most innovative MPG gains are going to come from those who have the freedom to use any technology.

    You should soon see other data center bloggers write on this issue. If you think the requirement is wrong, comment on the Google blog post or one of the others.


    Sunday, Apr 11, 2010

    Open Data Map movement demonstrates innovation opportunity for Open Source Data Center Initiative

    Tim Berners-Lee has a six-minute TED presentation on the year open data went worldwide.

    Map and location services are top scenarios for mobile devices. Google and Microsoft have their own maps. Nokia bought Navteq and MetaCarta. Apple bought Placebase. With all these companies creating proprietary services, volunteers collaborating through an open approach can still beat them.

    The Mercury News reports on OpenStreetMap.

    Volunteers create new digital maps

    By Mike Swift

    mswift@mercurynews.com

    Posted: 04/09/2010 09:08:55 PM PDT

    Updated: 04/10/2010 01:36:26 PM PDT

    Photo: Ron Perez hikes by a waterfall while a portable GPS device records his tracks. (Jim Gensheimer)

    When Brian "Beej" Hall first heard about an audacious volunteer effort to create an Internet map of every street and path in every city and village on the planet, he was hooked. At the time, the nascent effort had only a few American members, and the U.S. map was essentially a digital terra incognita.

    Just a few years later, the Berkeley software engineer is editing digital maps so precise they include drinking fountains and benches in the Bay Area parks where he hikes, and the mapping community has swelled to more than 240,000 global members. The effort, OpenStreetMap, is a kind of grass-roots Wikipedia for maps that is transforming how map data is collected, shared and used — from the desktop to smartphones to car navigation.

    The reporter observes how a nonprofit community can change the map business.

    But increasingly, the nonprofit community collaboration model behind OpenStreetMap, which shares all the cartographic data in its maps for free, is also changing the business of mapping, just as Wikipedia changed the business of reference. More and more, the accuracy of searches on Google Maps or directions issued by your car's navigational device are based on data collected by volunteers like Hall and other members of OpenStreetMap's do-it-yourself army.

    Part of the reason OpenStreetMap is popular is that the end users are creating the maps.

    OpenStreetMap users say that because their data is collected by people who actually live in a place, it is more likely to be accurate.

    "It's the people's map," said Paul Jarrett, director of mapping for CloudMade.

    If you are interested in the use of OpenStreetMap in Haiti, go here.

    We chose to tell the story of 'OpenStreetMap - Project Haiti'.
    We all followed the crisis that unfolded following the Haiti earthquake, many of us chose to donate money, a few were flown out and deployed as part of the relief effort. But what practical impact can many have without being there in Haiti itself? Well, during this crisis a remarkable story unfolded; of how people around the world could virtually collaborate and contribute to the on-the-ground operations.

    OpenStreetMap - Project Haiti 1

    With the little existing physical, political and social infrastructure  now destroyed or damaged, the situation was especially challenging for aid agencies arriving on the ground. Where are the areas most in need of assistance, how do we get there, where are people trapped under buildings, which roads are blocked? This information is important to the rescue agencies immediately after the event, and to the longer rebuilding process. In many developing countries, there is a lack of good mapping data and particularly after a crisis, when up-to-date information is critical to managing events as they evolve.
    Enter OpenStreetMap, the wiki map of the world, CrisisMappers and an impromptu community of volunteers who collaborated to produce the most authoritative map of Haiti in existence. Within hours of the event people were adding detail to the map, but on January 14th high resolution satellite imagery of Haiti was made freely available and the Crisis Mapping community were able to trace roads, damaged buildings, and enter camps of displaced people into OpenStreetMap. This is the story of OpenStreetMap - Project Haiti:

    There are many who think the Open Source Data Center Initiative will not work. There were also a lot of people who thought OpenStreetMap wouldn't work.


    Saturday, Apr 10, 2010

    Nokia acquires MetaCarta, continues investment in geolocation services beyond Navteq

    GigaOm's Om Malik has a post with Nokia's CEO on the future of the mobile industry.

    Nokia’s CEO on the Challenges & Promise of the New Mobile Industry

    By Om Malik Apr. 8, 2010, 10:50am PDT 11 Comments


    Nokia Chairman, CEO and President Olli-Pekka Kallasvuo has the second-toughest job in the mobile industry — that of turning the decades-old, set-in-its-ways, $58-billion-a-year mobile handset maker into a services-driven, Internet-oriented monster that not only catches up to but surpasses new upstart rivals Apple and Google. The good news is that unlike Palm CEO Jon Rubenstein (who has the toughest mobile gig), he doesn’t have to worry about running out of money anytime soon.

    Part of the interview is the hot topic of location services.

    Location Gives the Internet Relevance

    One of the things that gets Kallasvuo excited is location — or more specifically, location-based services. “Location is not an app, instead it adds a whole new dimension (and value) to the Internet,” he said, explaining why his company has made huge investments in location, including its $8 billion purchase of mapping company Navteq. Nokia earlier this year released a new Ovi Maps application that allows it to compete in markets such as India, Brazil and Russia, places where Google and Apple haven’t made inroads just yet.

    “Putting location elements into different type of services is a big opportunity which makes the Internet more exciting,” Kallasvuo said. (I’ve written about Nokia’s location-oriented strategy in the past.) Location, along with different types of sensors and augmented reality, will open the mobile world up to different possibilities, he said.

    For two weeks, thanks to a friend who works on geolocation solutions, I've known Nokia was acquiring MetaCarta.

    MetaCarta Inc. is the leading provider of geographic intelligence solutions. MetaCarta’s unique technology combines geographic search and geographic tagging capabilities so users can find content about a place by viewing results on a map. MetaCarta’s products make data and unstructured content "location-aware" and geographically relevant. These innovative solutions make it possible for customers to discover, visualize, and act on important location-based information and news.
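
    In plain terms, geographic tagging means scanning unstructured text for references to places and attaching coordinates so the content can be put on a map. A toy sketch of the general idea, using a tiny hard-coded gazetteer; the place list and coordinates are illustrative, and MetaCarta's real technology does far more, such as disambiguating which "Cambridge" a document means.

    # Toy geotagger: attach coordinates to place names found in text.
    GAZETTEER = {
        "Cambridge, Massachusetts": (42.37, -71.11),
        "Espoo": (60.21, 24.66),
        "Haiti": (18.97, -72.29),
    }

    def geotag(text: str):
        """Return (place, lat, lon) for each gazetteer entry mentioned in text."""
        return [(place, lat, lon)
                for place, (lat, lon) in GAZETTEER.items()
                if place in text]

    print(geotag("Nokia announced today that it has acquired MetaCarta Inc., "
                 "based in Cambridge, Massachusetts."))
    # -> [('Cambridge, Massachusetts', 42.37, -71.11)]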

    And, yesterday the press release went out on MetaCarta's website. And Nokia's. So, now I can reference public sources on the acquisition.

    Nokia acquires MetaCarta Inc.

    Espoo, Finland – Nokia announced today that it has acquired MetaCarta Inc. MetaCarta, based in Cambridge, Massachusetts, is a privately owned company which employs over 30 people and has expertise in geographic intelligence solutions. MetaCarta's technology will be used in the area of local search in Location and other services.

    Who is MetaCarta? Here is what IT analysts say.

    Dave Sonnen, Consultant, Spatial Information Management Research
    Sue Feldman, Research Vice President, Content Technologies

    Whit Andrews, Vice President Research / Analyst
    Jeff Vining, Vice President Research / Analyst
    Allen Weiner, Managing VP

    Mike Boland, Senior Analyst

    Here are some of the companies that worked with MetaCarta and awards it has won.

    • Technology Partners: ArdentMC, Clickability, EMC/Documentum, Enterprise Search Solutions (ESS), ESRI, Google, Microsoft, MITRE, Northrop Grumman, OpenText, Raytheon, and SAIC
    • Awards: IndustryWeek Technologies of the Year, KMWorld 100 Companies That Matter in Knowledge Management, and 2-time KMWorld Trend-Setting Products, Red Herring Top 100 Private Companies, Red Herring Top Innovator

    If you believe the media, Nokia is irrelevant in the battle between Apple, Google, and RIM.

    Apple's iPhone OS 4 may have more than 100 new features, but it established three big targets for Apple: Microsoft, Google and RIM. To some extent, it also showed that Apple considers Palm and Nokia to be irrelevant.

    But I would guess this view exists because media reporters are mostly iPhone users, then RIM and Android, with Nokia having almost no market share among US media reporters. Note: I have a Nokia E71 I can use when I want a high-quality phone, and thanks to the April 6 Ovi Maps release I can get free Ovi Maps for the phone.

    After listening to your overwhelmingly positive feedback and feeling your love for your favourite mobile phone, we have now created a custom version that works on Nokia E71 and Nokia E66.

    However, because of technical constraints, it isn’t possible to offer premium content such as Michelin and Lonely Planet guides on these devices.

    I wouldn't count out Nokia the way most media does.  Om Malik doesn't.

    If there was one point Nokia’s big boss wanted to make before we ended our conversation, it was that the Nokia in 2010 is going to be a lot different from the Nokia of the past. The company has its work cut out for it. The good news, if you can call it that, is that its CEO knows what to do. Acceptance is the first step toward recovery. And for me that’s a good start. I look forward to falling in love with Nokia all over again.

    It will be interesting to see Nokia's new phones in 2010.

    I am sure we'll hear about big data center plans from Nokia to support its growth in services.


    Wednesday, Apr 7, 2010

    Apple will need to stop designing state-of-the-art hardware and start designing insanely great data centers, says David Siegel

    David Siegel has a book called Pull: The Power of the Semantic Web to Transform Your Business.

    The Problem

    On the Web today, we see millions of web sites, each of which presents web pages and documents. These are simply electronic versions of the old paper-based ways of doing things: writing checks, filing taxes, looking at menus, catalog pages, magazines, etc. When you search for something on Google, you get a list of web sites that may or may not have what you’re looking for, based on keywords found in the text. You have to look at each one and decide whether it answers your question. Google doesn’t know where the information or answers are; it just knows which pages have which keywords and who links to them.

    Our information infrastructure isn’t scaling up very well at all. The average person now sees over 1,000,000 words and consumes 34 gigabytes of information every day. Mike Bergman estimates white-collar workers spend 25% of their time looking for the documents and information they need to do their work. One billion people are online now, and 4 billion have mobile phones. Exhaustion of IPv4 addresses (limit is 4 billion) is predicted for sometime in 2011. By 2030, there will be a minimum of 50 billion devices connected via internet and phone networks. Our information infrastructure is built to haul electronic versions of 19th century documents for humans to read, and it’s keeping us from using information effectively.

    The solution to our information problem is the semantic web and the pull paradigm.
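
    To make "semantic web" a little more concrete: instead of publishing a page for a human to read, you publish facts as machine-readable triples that any program can pull and query. A minimal sketch using the Python rdflib library; the example.org vocabulary is made up for illustration.

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/")  # hypothetical vocabulary for this sketch

    g = Graph()
    g.add((EX.pull_book, RDF.type, EX.Book))
    g.add((EX.pull_book, EX.title, Literal("Pull")))
    g.add((EX.pull_book, EX.author, Literal("David Siegel")))

    # A program can now pull exactly the fact it needs instead of a person
    # reading a page: "who is the author of the thing typed as a Book?"
    query = """
    SELECT ?who WHERE {
      ?thing a <http://example.org/Book> ;
             <http://example.org/author> ?who .
    }
    """
    for row in g.query(query):
        print(row.who)  # -> David Siegel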

    One section that jumped out is "The Computerless Computer Company," where he makes the statement:

    Apple will need to stop designing state-of-the-art hardware and start designing insanely great data centers.

    David is a big Apple supporter, having worked on the Tekton typeface, and he has a blog post on why he should lead Apple.

    The irony I just realized is that David's vision actually describes Google's plans.

    The huge strength Google has vs. Apple is that its advertising system gives it a huge advantage in the "pull" from consumers. Apple is a push company.

    Google has insanely great data centers.


    Monday, Apr 5, 2010

    Next Advisor for GreenM3 NPO, Peter Horan, pushing the edge of the network to be close to customers

    Our first industry advisor was Mike Manos; our next is Peter Horan. Peter is unknown to most of the data center audience, as he is an executive who has worked on the edge of innovation, not in the hub of data center activity. Peter does have data center experience, though, as the senior executive for InterActive Media at the time of Ask.com's data center construction at Moses Lake, WA. Chuck Geiger was CTO of Ask.com at the time, and stated:

    “Moses Lake is an ideal location due to its cooperative business environment, access to low cost, renewable power and superior network connectivity,” said Chuck Geiger, Chief Technology Officer of Ask.com. “With these inherent benefits, Eastern Washington is the right choice for Ask.com as we expand our computing infrastructure to support our growth and expanded search services.”

    Peter has had the executive's view of building a large data center, yet he also has some very innovative, forward-thinking ideas and a powerful network. Which brings up a presentation Peter made discussing the "Edge of the Network."


    I've known Peter for many years, including his time as Sr. VP/Publisher of Computerworld and CEO of DevX.com, About.com, and AllBusiness.com, and he was an obvious candidate for the GreenM3 NPO.


    Here is a video where Peter presents his ideas on getting closer to customers. In the same way Peter encourages the audience to get close to customers, the goal of GreenM3 is to build a closer connection to customers, using open source techniques.


    A person we want to talk to in Peter's network is Chuck Geiger.

    Chuck Geiger
    Partner - Technology

    Chuck has significant experience running some of the largest online transaction product organizations and most visited sites in the world, including as CTO of Ask.com, CTO of PayPal, VP Architecture of eBay, and executive positions at InterActive Corp., Gateway and Travelocity.


    At InterActive Corp, Chuck was responsible for managing a consolidated data center strategy for IAC portfolio companies including Ask.com, Evite.com, CitySearch.com, Ticketmaster, and Match.com. Chuck also was responsible for the technology organization at Ask.com including Engineering, Architecture, Program Management, QA, IT, and Operations.


    At PayPal, Chuck was responsible for the product development organization which includes Product Management, Design, Engineering, Architecture, Operations, IT, QA, Project Management, Content, and Localization, running a team of approximately 550 professionals.
    At eBay, Chuck was responsible for the migration to the new generation system architecture and platform.

    BTW, Peter's day job is Chairman of Goodmail.

    About Goodmail Systems

    Goodmail Systems is the creator of CertifiedEmail™, the industry’s standard class of email. CertifiedEmail provides a safe and reliable means for consumers to easily identify authentic email messages from legitimate commercial and nonprofit email senders. Each CertifiedEmail is sent with a cryptographically secure token that assures authenticity and is marked in the inbox with a unique blue ribbon envelope icon, enabling consumers to visually distinguish email messages which are real and sent from email senders with whom they have a pre-existing relationship.
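
    Goodmail doesn't publish the details here, so purely as an illustration of what a "cryptographically secure token" can mean, here is a sketch of one common approach: an HMAC computed over message fields with a secret shared with the certifying service. This is not Goodmail's actual CertifiedEmail design, just the general authenticity idea.

    import hashlib
    import hmac

    SHARED_SECRET = b"example-secret-issued-to-sender"  # hypothetical credential

    def token_for(message_id: str, sender: str) -> str:
        """Derive a tamper-evident token bound to the message and sender."""
        payload = f"{sender}|{message_id}".encode()
        return hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

    def verify(message_id: str, sender: str, token: str) -> bool:
        """Receiver-side check: recompute the token and compare in constant time."""
        return hmac.compare_digest(token_for(message_id, sender), token)

    t = token_for("msg-001", "news@example.org")
    print(verify("msg-001", "news@example.org", t))   # True
    print(verify("msg-001", "spoof@example.org", t))  # False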

    We welcome Peter's passion for technical innovation and the environment.
