The open map data movement demonstrates an innovation opportunity for the Open Source Data Center Initiative

Tim Berners-Lee has a six-minute TED presentation on the year open data went worldwide.

Map and location services are top scenarios for mobile devices.  Google and Microsoft have their maps.  Nokia bought Navteq and MetaCarta.  Apple bought PlaceBase.  With all these companies building proprietary services, volunteers collaborating in the open can still beat them.

The Mercury News reports on OpenStreetMap.

Volunteers create new digital maps

By Mike Swift

mswift@mercurynews.com

Posted: 04/09/2010 09:08:55 PM PDT

Updated: 04/10/2010 01:36:26 PM PDT

Ron Perez hikes by a waterfall while a portable GPS device records his tracks as... (Jim Gensheimer)

When Brian "Beej" Hall first heard about an audacious volunteer effort to create an Internet map of every street and path in every city and village on the planet, he was hooked. At the time, the nascent effort had only a few American members, and the U.S. map was essentially a digital terra incognita.

Just a few years later, the Berkeley software engineer is editing digital maps so precise they include drinking fountains and benches in the Bay Area parks where he hikes, and the mapping community has swelled to more than 240,000 global members. The effort, OpenStreetMap, is a kind of grass-roots Wikipedia for maps that is transforming how map data is collected, shared and used — from the desktop to smartphones to car navigation.

The reporter makes the observation of how a nonprofit community can change the map business.

But increasingly, the nonprofit community collaboration model behind OpenStreetMap, which shares all the cartographic data in its maps for free, is also changing the business of mapping, just as Wikipedia changed the business of reference. More and more, the accuracy of searches on Google Maps or directions issued by your car's navigational device are based on data collected by volunteers like Hall and other members of OpenStreetMap's do-it-yourself army.

Part of the reason OpenStreetMap is popular is that the end users themselves are creating the maps.

OpenStreetMap users say that because their data is collected by people who actually live in a place, it is more likely to be accurate.
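As a small illustration of how accessible that volunteer-collected data is, here is a sketch that builds a query for OpenStreetMap's public Overpass API. The bounding box is a rough guess at the Berkeley area, and the endpoint mentioned in the comment is the community-run server; treat both as assumptions.

```python
# Sketch: querying OpenStreetMap for the kind of micro-detail volunteers
# like Hall add (drinking fountains) via the Overpass API. The bounding
# box below is a rough guess at Berkeley, CA; "amenity=drinking_water"
# is a real OSM tagging convention.

def overpass_query(amenity: str, bbox: tuple) -> str:
    """Build an Overpass QL query for nodes tagged with an amenity."""
    south, west, north, east = bbox
    return (
        "[out:json][timeout:25];"
        f'node["amenity"="{amenity}"]({south},{west},{north},{east});'
        "out body;"
    )

query = overpass_query("drinking_water", (37.85, -122.30, 37.90, -122.24))
print(query)
# To run it for real, POST the query to https://overpass-api.de/api/interpreter
```

Because all of the underlying data is free, anyone can pull it this way — no licensing deal with a proprietary map vendor required.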

"It's the people's map," said Paul Jarrett, director of mapping for CloudMade.

If you are interested in the use of OpenStreetMap in Haiti, go here.

We chose to tell the story of 'OpenStreetMap - Project Haiti'.
We all followed the crisis that unfolded following the Haiti earthquake, many of us chose to donate money, a few were flown out and deployed as part of the relief effort. But what practical impact can many have without being there in Haiti itself? Well, during this crisis a remarkable story unfolded; of how people around the world could virtually collaborate and contribute to the on-the-ground operations.

OpenStreetMap - Project Haiti 1

With what little physical, political and social infrastructure existed now destroyed or damaged, the situation was especially challenging for aid agencies arriving on the ground. Where are the areas most in need of assistance, how do we get there, where are people trapped under buildings, which roads are blocked? This information is important to the rescue agencies immediately after the event, and to the longer rebuilding process. In many developing countries there is a lack of good mapping data, particularly after a crisis, when up-to-date information is critical to managing events as they evolve.
Enter OpenStreetMap, the wiki map of the world, CrisisMappers and an impromptu community of volunteers who collaborated to produce the most authoritative map of Haiti in existence. Within hours of the event people were adding detail to the map, but on January 14th high-resolution satellite imagery of Haiti was made freely available and the Crisis Mapping community were able to trace roads and damaged buildings, and enter camps of displaced people into OpenStreetMap. This is the story of OpenStreetMap - Project Haiti:

There are many who think the Open Source Data Center Initiative will not work.  A lot of people thought OpenStreetMap wouldn't work, too.

Read more

Nokia acquires MetaCarta, continues investment in geolocation services beyond Navteq

GigaOm's Om Malik has a post with Nokia's CEO on the future of the Mobile industry.

Nokia’s CEO on the Challenges & Promise of the New Mobile Industry

By Om Malik, Apr. 8, 2010, 10:50am PDT


Nokia Chairman, CEO and President Olli-Pekka Kallasvuo has the second-toughest job in the mobile industry — that of turning the decades-old, set-in-its-ways, $58-billion-a-year mobile handset maker into a services-driven, Internet-oriented monster that not only catches up to but surpasses new upstart rivals Apple and Google. The good news is that unlike Palm CEO Jon Rubenstein (who has the toughest mobile gig), he doesn’t have to worry about running out of money anytime soon.

Part of the interview covers the hot topic of location services.

Location Gives the Internet Relevance

One of the things that gets Kallasvuo excited is location — or more specifically, location-based services. “Location is not an app, instead it adds a whole new dimension (and value) to the Internet,” he said, explaining why his company has made huge investments in location, including its $8 billion purchase of mapping company Navteq. Nokia earlier this year released a new Ovi Maps application that allows it to compete in markets such as India, Brazil and Russia, places where Google and Apple haven’t made inroads just yet.

“Putting location elements into different type of services is a big opportunity which makes the Internet more exciting,” Kallasvuo said. (I’ve written about Nokia’s location-oriented strategy in the past.) Location, along with different types of sensors and augmented reality, will open the mobile world up to different possibilities, he said.

For two weeks, thanks to a friend who works on geolocation solutions, I've known Nokia was acquiring MetaCarta.

MetaCarta Inc. is the leading provider of geographic intelligence solutions. MetaCarta’s unique technology combines geographic search and geographic tagging capabilities so users can find content about a place by viewing results on a map. MetaCarta’s products make data and unstructured content "location-aware" and geographically relevant. These innovative solutions make it possible for customers to discover, visualize, and act on important location-based information and news.
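MetaCarta's actual technology is proprietary, but the core idea of geographic tagging — making unstructured text "location-aware" — can be sketched as a toy gazetteer lookup. The place list and matching below are invented for illustration; real geotagging involves disambiguation and ranking far beyond this.

```python
# A toy sketch of geographic tagging: scan text for known place names and
# attach coordinates so content becomes "location-aware." The tiny
# gazetteer is illustrative only; MetaCarta's real products do much more.

GAZETTEER = {
    "cambridge": (42.37, -71.11),   # Cambridge, Massachusetts
    "espoo": (60.21, 24.66),        # Espoo, Finland
}

def geotag(text: str):
    """Return (place, (lat, lon)) pairs for gazetteer hits in the text."""
    hits = []
    for word in text.lower().replace(",", " ").split():
        if word in GAZETTEER:
            hits.append((word, GAZETTEER[word]))
    return hits

print(geotag("Nokia, based in Espoo, acquired MetaCarta of Cambridge"))
# → [('espoo', (60.21, 24.66)), ('cambridge', (42.37, -71.11))]
```

Once content carries coordinates like this, it can be searched and plotted on a map — the "find content about a place" capability described above.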

And, yesterday the press release went out on MetaCarta's website. And Nokia's. So, now I can reference public sources on the acquisition.

Nokia acquires MetaCarta Inc.

Espoo, Finland – Nokia announced today that it has acquired MetaCarta Inc. MetaCarta, based in Cambridge, Massachusetts, is a privately owned company which employs over 30 people and has expertise in geographic intelligence solutions. MetaCarta's technology will be used in the area of local search in Location and other services.

Who is MetaCarta?  Here are what IT analysts say.

  • Dave Sonnen, Consultant, Spatial Information Management Research
  • Sue Feldman, Research Vice President, Content Technologies
  • Whit Andrews, Vice President Research / Analyst
  • Jeff Vining, Vice President Research / Analyst
  • Allen Weiner, Managing VP
  • Mike Boland, Senior Analyst

Here are some of the companies that worked with MetaCarta and awards it has won.

  • Technology Partners: ArdentMC, Clickability, EMC/Documentum, Enterprise Search Solutions (ESS), ESRI, Google, Microsoft, MITRE, Northrop Grumman, OpenText, Raytheon, and SAIC
  • Awards: IndustryWeek Technologies of the Year, KMWorld 100 Companies That Matter in Knowledge Management, and 2-time KMWorld Trend-Setting Products, Red Herring Top 100 Private Companies, Red Herring Top Innovator

If you believe the media, Nokia is irrelevant in the battle between Apple, Google, and RIM.

Apple's iPhone OS 4 may have more than 100 new features, but it established three big targets for Apple: Microsoft, Google and RIM. To some extent, it also showed that Apple considers Palm and Nokia to be irrelevant.

But I would guess this view exists because media reporters are mostly iPhone users, then RIM and Android, with Nokia holding almost no market share among US media reporters.  Note: I have a Nokia E71 I can use when I want a high-quality phone, and thanks to the April 6 release of Ovi Maps I can get free Ovi Maps for the phone.

After listening to your overwhelmingly positive feedback and feeling your love for your favourite mobile phone, we have now created a custom version that works on Nokia E71 and Nokia E66.

However, because of technical constraints, it isn’t possible to offer premium content such as Michelin and Lonely Planet guides on these devices.

I wouldn't count out Nokia the way most media does.  Om Malik doesn't.

If there was one point Nokia’s big boss wanted to make before we ended our conversation, it was that the Nokia in 2010 is going to be a lot different from the Nokia of the past. The company has its work cut out for it. The good news, if you can call it that, is that its CEO knows what to do. Acceptance is the first step toward recovery. And for me that’s a good start. I look forward to falling in love with Nokia all over again.

It will be interesting to see Nokia's new phones in 2010.

I am sure we'll hear about big data center plans from Nokia to support its growth in services.

Read more

Apple will need to stop designing state-of-the-art hardware and start designing insanely great data centers, says David Siegel

David Siegel has a book called Pull: The Power of the Semantic Web to Transform Business.

The Problem

On the Web today, we see millions of web sites, each of which presents web pages and documents. These are simply electronic versions of the old paper-based ways of doing things: writing checks, filing taxes, looking at menus, catalog pages, magazines, etc. When you search for something on Google, you get a list of web sites that may or may not have what you’re looking for, based on keywords found in the text. You have to look at each one and decide whether it answers your question. Google doesn’t know where the information or answers are; it just knows which pages have which keywords and who links to them.

Our information infrastructure isn’t scaling up very well at all. The average person now sees over 1,000,000 words and consumes 34 gigabytes of information every day. Mike Bergman estimates white-collar workers spend 25% of their time looking for the documents and information they need to do their work. One billion people are online now, and 4 billion have mobile phones. Exhaustion of IPv4 addresses (limit is 4 billion) is predicted for sometime in 2011. By 2030, there will be a minimum of 50 billion devices connected via internet and phone networks. Our information infrastructure is built to haul electronic versions of 19th century documents for humans to read, and it’s keeping us from using information effectively.

The solution to our information problem is the semantic web and the pull paradigm.
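To make the contrast with keyword search concrete, here is a minimal sketch of the semantic web idea: facts published as machine-readable subject-predicate-object triples can be queried for direct answers instead of keyword-matching documents. The triples and names below are invented for illustration.

```python
# Sketch of the "pull" idea: data stored as (subject, predicate, object)
# triples answers questions by pattern matching, not keyword guessing.
# The facts below are invented examples for illustration.

TRIPLES = [
    ("IndianPoint", "type", "PowerPlant"),
    ("IndianPoint", "capacityMW", "2000"),
    ("Ask.com", "dataCenterLocation", "MosesLakeWA"),
]

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [
        t for t in TRIPLES
        if (subject is None or t[0] == subject)
        and (predicate is None or t[1] == predicate)
        and (obj is None or t[2] == obj)
    ]

# "What is Indian Point's capacity?" answered directly:
print(query(subject="IndianPoint", predicate="capacityMW"))
# → [('IndianPoint', 'capacityMW', '2000')]
```

A keyword engine can only return pages that mention "Indian Point" and "capacity"; structured data returns the answer itself — that is the pull paradigm in miniature.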

One section that jumped out is "The Computerless Computer Company," where he makes the statement:

Apple will need to stop designing state-of-the-art hardware and start designing insanely great data centers.

David is a big Apple supporter working on the Tekton typeface and has a blog post on why he should lead Apple.

The irony I just realized is that David's vision actually describes Google's plans.

The huge strength Google has vs. Apple is that its advertising system gives it a huge advantage in the "pull" from consumers.  Apple is a push company.

Google has insanely great data centers.

Read more

Next Advisor for GreenM3 NPO, Peter Horan, pushing the edge of the network to be close to customers

Our first industry advisor was Mike Manos; our next is Peter Horan. Peter is unknown to most of the data center audience, as he is an executive who has worked on the edge of innovation, not in the hub of data center activity.  Peter does have data center experience: he was the Sr. VP for InterActive Media at the time of Ask.com's data center construction at Moses Lake, WA.  Chuck Geiger, CTO of Ask.com at the time, stated:

“Moses Lake is an ideal location due to its cooperative business environment, access to low cost, renewable power and superior network connectivity,” said Chuck Geiger, Chief Technology Officer of Ask.com. “With these inherent benefits, Eastern Washington is the right choice for Ask.com as we expand our computing infrastructure to support our growth and expanded search services.”

Peter has had the executive's view of building a large data center, yet he has some very innovative, forward-thinking ideas and a powerful network, which brings up a presentation Peter made discussing the "Edge of the Network."


I've known Peter for many years, including his time as Sr. VP/Publisher of ComputerWorld and CEO of DEVX.com, about.com, and allbusiness.com; he was an obvious candidate for the GreenM3 NPO.


Here is a video where Peter presents the ideas to get closer to customers.  In the same way Peter encourages the audience to get close to customers, the goal of GreenM3 is to build a closer connection to customers, using open source techniques.


A person who we want to talk to in Peter's network is Chuck Geiger.

Chuck Geiger
Partner - Technology

Chuck has significant experience running some of the largest online transaction product organizations and most visited sites in the world, including as CTO of Ask.com, CTO of PayPal, VP Architecture of eBay, and executive positions at InterActive Corp., Gateway and Travelocity.


At InterActive Corp, Chuck was responsible for managing a consolidated data center strategy for IAC portfolio companies including Ask.com, Evite.com, CitySearch.com, Ticketmaster, and Match.com. Chuck also was responsible for the technology organization at Ask.com including Engineering, Architecture, Program Management, QA, IT, and Operations.


At PayPal, Chuck was responsible for the product development organization which includes Product Management, Design, Engineering, Architecture, Operations, IT, QA, Project Management, Content, and Localization, running a team of approximately 550 professionals.
At eBay, Chuck was responsible for the migration to the new generation system architecture and platform.

BTW, Peter's day job is Chairman of Goodmail.

About Goodmail Systems

Goodmail Systems is the creator of CertifiedEmail™, the industry’s standard class of email. CertifiedEmail provides a safe and reliable means for consumers to easily identify authentic email messages from legitimate commercial and nonprofit email senders. Each CertifiedEmail is sent with a cryptographically secure token that assures authenticity and is marked in the inbox with a unique blue ribbon envelope icon, enabling consumers to visually distinguish email messages which are real and sent from email senders with whom they have a pre-existing relationship.

We welcome Peter's passion for technical innovation and the environment.

Read more

The relationship of electricity generation and water changes the game: 2 GW Entergy nuclear power plant renewal permit denied based on warm water discharge

The WSJ and others report that New York environmental regulators, not the US EPA, denied Entergy's renewal request for a 2-gigawatt nuclear power plant that supplies 30% of NYC's electricity.


New York Regulators Deny Water Permit for Nuclear Plant

By MARK LONG

NEW YORK -- New York environmental regulators have denied a key water-quality certification Entergy Corp. needs to extend by 20 years its license to operate the 2,000-megawatt Indian Point nuclear-power plant.

The New York Department of Environmental Conservation said in a letter to Entergy dated April 2 that the two units of the plant "do not and will not comply with existing New York State water quality standards," even with the addition of a new screening technology favored by Entergy to protect aquatic life. The plant's existing "once-through" system withdraws and returns as much as 2.5 billion gallons of Hudson River water a day for cooling, a system blamed by environmentalists for damaging the river's ecosystem and killing millions of fish a year, including the endangered shortnose sturgeon.

Certification under the Clean Water Act is required before the U.S. Nuclear Regulatory Commission can approve an extension of the operating license for Indian Point, which generates enough electricity to power approximately 2 million homes and is major power source for New York City. The licenses for Indian Point units 2 and 3, which came online in the 1970s, are due to expire in September 2013 and December 2015, respectively.
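Putting the quoted figures together (2,000 MW of capacity, up to 2.5 billion gallons of river water a day) gives a back-of-the-envelope feel for the water intensity of once-through cooling. This assumes the plant runs at full output around the clock, so it is a rough sketch, not an engineering number.

```python
# Back-of-the-envelope water intensity of Indian Point's once-through
# cooling, from the figures quoted above. Assumes full output 24 hours/day.

capacity_mw = 2000
gallons_per_day = 2.5e9

mwh_per_day = capacity_mw * 24                       # 48,000 MWh/day
gallons_per_kwh = gallons_per_day / (mwh_per_day * 1000)
print(round(gallons_per_kwh, 1))                     # → 52.1
```

Roughly 50 gallons of Hudson River water withdrawn per kilowatt-hour generated; it is returned (warmer) rather than consumed, but the scale explains both the ecological objection and why cooling design dominates siting decisions.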

What is humorous is the environmental group Riverkeeper thinking that 2 gigawatts of baseload can be brought online by 2015.

"That power is replaceable," said Alex Matthiessen, president of environmental group Riverkeeper. "The evidence for why the plant doesn't meet state water-quality standards is overwhelming," he said, adding Indian Point accounts for the deaths of about a billion fish a year and that the group estimates cooling towers could be constructed for $200 million to $300 million.

The following is from a published study on air (dry) or hybrid cooling for power plants vs. water cooling.

Emerging Issues and Needs in Power Plant Cooling Systems

Water availability is affecting power plant placement.  You need to be thinking the same for data center placement.

However, with the construction of new power plants in recent years, perhaps the most prevalent concern with wet cooling systems has been water availability. Growing competition from municipal and agricultural users has decreased the amounts and increased the prices of good quality water resources available to industrial users. This competition is most apparent in the southwestern U.S. where the need for new electric power generation is significant, but regional surface water sources are minimal and groundwater sources are highly prized and may have designated use restrictions. But even in areas usually considered “water rich”, such as the northeastern U.S., the combination of environmental, safety & health, and resource availability concerns has resulted in an increasing interest in dry and hybrid cooling systems as alternatives to wet cooling systems.

Size of Dry Cooling system vs. Wet Cooling - 2.2 times larger

Size. By definition, dry cooling involves the transfer of heat to the atmosphere without the evaporative loss of water (i.e., by sensible heat transfer only). Because sensible heat transfer is less efficient than evaporative heat transfer, dry cooling systems must be larger than wet cooling systems. For example, to achieve a comparable heat rejection, one study estimates that a direct dry cooling system (ACC) will have a footprint about 2.2 times larger than a wet cooling tower and a height about 1.9 times greater. [2]
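Applying the study's rule of thumb to a hypothetical wet cooling tower shows how quickly the dry-cooling alternative grows; the baseline footprint and height below are made-up illustration numbers, not from the study.

```python
# The study's rule of thumb, applied: a direct dry cooling system (ACC)
# needs roughly 2.2x the footprint and 1.9x the height of a comparable
# wet cooling tower. The wet-tower baseline is a hypothetical example.

wet_footprint_m2 = 1000   # hypothetical wet cooling tower footprint
wet_height_m = 20         # hypothetical wet cooling tower height

dry_footprint_m2 = wet_footprint_m2 * 2.2
dry_height_m = wet_height_m * 1.9

print(dry_footprint_m2, dry_height_m)
```

More than double the land and nearly double the height for the same heat rejection — which is why dry cooling is chosen for water scarcity, not for convenience.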

Maintenance of operations.

• Maintenance. Both direct and indirect dry cooling systems, as well as hybrid cooling systems, are larger and mechanically more complex than corresponding wet cooling systems. In addition to the larger heat transfer surface area, dry and hybrid cooling systems will have more fans, meaning more electrical motors, gearboxes and drive shafts. As such, labor requirements for a large ACC can be substantial. At one site with a 60-cell ACC (three 20-cell bays for three separate steam turbines), the maintenance staff was increased by two people for such activities as cleaning fan blades and heat exchanger tube fins, monitoring lube-oil systems, and leak checking the vacuum system. [3]
• Energy penalties. Because sensible heat transfer is directly related to the ambient dry-bulb temperature, a dry cooling system must have the flexibility to respond to typical daily temperature variations of 20-25 °F. A dry system that maintains an optimum turbine backpressure at ambient dry-bulb temperatures of 90-95 °F may not be able to do so as the temperature increases, meaning a lower generating efficiency.


From a design perspective, more surface area (i.e., a larger dry cooling system) can compensate for the decline in heat transfer at high ambient temperatures; but the greater size and associated operational control are also concerns, as previously discussed.

When all things are equal, it comes down to the cost of the systems.

Costs. If performance, availability and reliability appear to be equal, then the single issue that will most likely govern the selection and use of a power plant cooling system is cost. Unfortunately, the economics of power plant cooling systems are complex, which means cost estimates are frequently mistaken, misunderstood or misrepresented.
This complexity results from the complicated relationships of three key costs: installed equipment capital cost, annual operating and maintenance (O&M) cost, and energy penalty cost. For most manufacturing processes, the first two costs can be fairly well defined and, to a certain extent, contractually guaranteed by the vendor/supplier. But the energy penalty cost is somewhat unique to power plant cooling systems because it reflects a direct performance link between the cooling system and the low-pressure turbine-generator. Consequently, the potential for and the magnitude of an energy penalty cost can dictate cooling system design and operating changes that directly affect the capital and O&M costs. So in a competitive market, generating power in the most cost-effective manner depends upon a company's ability to balance all three key costs and optimize the overall life-cycle cost of the cooling system.
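The three-cost balance the study describes can be sketched as a simple net-present-value calculation. All dollar figures, the lifetime, and the discount rate below are hypothetical illustrations, not values from the study.

```python
# Sketch of the three-cost tradeoff: life-cycle cost of a cooling system
# = capital + discounted O&M + discounted energy penalty. All numbers
# below are hypothetical.

def life_cycle_cost(capital, annual_om, annual_energy_penalty,
                    years=30, discount_rate=0.07):
    """Net present value of total cooling-system cost over its life."""
    npv = capital
    for year in range(1, years + 1):
        npv += (annual_om + annual_energy_penalty) / (1 + discount_rate) ** year
    return npv

# A dry system might trade higher capital and a bigger energy penalty for
# lower water-related O&M; the optimum depends on all three together.
wet = life_cycle_cost(capital=50e6, annual_om=2e6, annual_energy_penalty=0.5e6)
dry = life_cycle_cost(capital=90e6, annual_om=1.5e6, annual_energy_penalty=3e6)
print(wet < dry)
```

The same life-cycle framing applies directly to data center cooling choices, which is the point of quoting the study here.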

What is the water footprint of the power plant supplying your data center?

Are you planning for water as a scarce resource affecting the cooling systems for your data center?

Here is what Google presented on water use at its data center event a year ago.

Multiple Speakers Discuss Water Issues at Google’s Efficiency Data Center Summit

I have been blogging about water issues in the data center for a while, and have a category for tagging posts for “water.”

Read more