Saturday, Apr 10, 2010

    Nokia acquires MetaCarta, continues investment in geolocation services beyond Navteq

GigaOm's Om Malik has a post on his conversation with Nokia's CEO about the future of the mobile industry.

    Nokia’s CEO on the Challenges & Promise of the New Mobile Industry

By Om Malik, Apr. 8, 2010, 10:50am PDT


    Nokia Chairman, CEO and President Olli-Pekka Kallasvuo has the second-toughest job in the mobile industry — that of turning the decades-old, set-in-its-ways, $58-billion-a-year mobile handset maker into a services-driven, Internet-oriented monster that not only catches up to but surpasses new upstart rivals Apple and Google. The good news is that unlike Palm CEO Jon Rubenstein (who has the toughest mobile gig), he doesn’t have to worry about running out of money anytime soon.

Part of the interview covers the hot topic of location services.

    Location Gives the Internet Relevance

    One of the things that gets Kallasvuo excited is location — or more specifically, location-based services. “Location is not an app, instead it adds a whole new dimension (and value) to the Internet,” he said, explaining why his company has made huge investments in location, including its $8 billion purchase of mapping company Navteq. Nokia earlier this year released a new Ovi Maps application that allows it to compete in markets such as India, Brazil and Russia, places where Google and Apple haven’t made inroads just yet.

    “Putting location elements into different type of services is a big opportunity which makes the Internet more exciting,” Kallasvuo said. (I’ve written about Nokia’s location-oriented strategy in the past.) Location, along with different types of sensors and augmented reality, will open the mobile world up to different possibilities, he said.

For two weeks, thanks to a friend who works on geolocation solutions, I've known Nokia was acquiring MetaCarta.

    MetaCarta Inc. is the leading provider of geographic intelligence solutions. MetaCarta’s unique technology combines geographic search and geographic tagging capabilities so users can find content about a place by viewing results on a map. MetaCarta’s products make data and unstructured content "location-aware" and geographically relevant. These innovative solutions make it possible for customers to discover, visualize, and act on important location-based information and news.
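MetaCarta's technology is proprietary, but to make the "geographic tagging" idea concrete, here is a minimal sketch of the general geoparsing approach: spot place names in unstructured text and resolve them against a gazetteer so the document becomes location-aware. The tiny gazetteer, coordinates, and function below are illustrative assumptions, not MetaCarta's code or data.

```python
# Minimal geoparsing sketch: find known place names in free text and attach
# coordinates, making the document "location-aware". The gazetteer is a tiny
# hypothetical example, not MetaCarta's data.
GAZETTEER = {
    "cambridge, massachusetts": (42.3736, -71.1097),
    "espoo": (60.2055, 24.6559),
    "moses lake": (47.1301, -119.2781),
}

def geotag(text):
    """Return (place, (lat, lon)) pairs for gazetteer entries found in the text."""
    lowered = text.lower()
    return [(place, coords) for place, coords in GAZETTEER.items() if place in lowered]

doc = "Nokia announced today that it has acquired MetaCarta Inc., based in Cambridge, Massachusetts."
print(geotag(doc))
# [('cambridge, massachusetts', (42.3736, -71.1097))]
```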

And yesterday the press release went out on MetaCarta's website, and Nokia's. So now I can reference public sources on the acquisition.

    Nokia acquires MetaCarta Inc.

Espoo, Finland – Nokia announced today that it has acquired MetaCarta Inc. MetaCarta, based in Cambridge, Massachusetts, is a privately owned company which employs over 30 people and has expertise in geographic intelligence solutions. MetaCarta’s technology will be used in the area of local search in Location and other services.

Who is MetaCarta? Here is what IT analysts say.

• Dave Sonnen, Consultant, Spatial Information Management Research
• Sue Feldman, Research Vice President, Content Technologies
• Whit Andrews, Vice President Research / Analyst
• Jeff Vining, Vice President Research / Analyst
• Allen Weiner, Managing VP
• Mike Boland, Senior Analyst

Here are some of the companies that worked with MetaCarta and awards the company has won.

    • Technology Partners: ArdentMC, Clickability, EMC/Documentum, Enterprise Search Solutions (ESS), ESRI, Google, Microsoft, MITRE, Northrop Grumman, OpenText, Raytheon, and SAIC
    • Awards: IndustryWeek Technologies of the Year, KMWorld 100 Companies That Matter in Knowledge Management, and 2-time KMWorld Trend-Setting Products, Red Herring Top 100 Private Companies, Red Herring Top Innovator

    If you believe the media, Nokia is irrelevant in the battle between Apple, Google, and RIM.

    Apple's iPhone OS 4 may have more than 100 new features, but it established three big targets for Apple: Microsoft, Google and RIM. To some extent, it also showed that Apple considers Palm and Nokia to be irrelevant.

But I would guess this view exists because media reporters are mostly iPhone users, followed by RIM and Android, with Nokia having almost no share among US media reporters. Note: I have a Nokia E71 I can use when I want a high-quality phone, and thanks to the April 6 Ovi Maps release I can get free Ovi Maps for the phone.

    After listening to your overwhelmingly positive feedback and feeling your love for your favourite mobile phone, we have now created a custom version that works on Nokia E71 and Nokia E66.

    However, because of technical constraints, it isn’t possible to offer premium content such as Michelin and Lonely Planet guides on these devices.

    I wouldn't count out Nokia the way most media does.  Om Malik doesn't.

    If there was one point Nokia’s big boss wanted to make before we ended our conversation, it was that the Nokia in 2010 is going to be a lot different from the Nokia of the past. The company has its work cut out for it. The good news, if you can call it that, is that its CEO knows what to do. Acceptance is the first step toward recovery. And for me that’s a good start. I look forward to falling in love with Nokia all over again.

    It will be interesting to see Nokia's new phones in 2010.

I am sure we'll hear about big data center plans from Nokia to support its growth in services.


Wednesday, Apr 7, 2010

    Apple will need to stop designing state-of-the-art hardware and start designing insanely great data centers, says David Siegel

David Siegel has a book called Pull: The Power of the Semantic Web to Transform Business.

    The Problem

    On the Web today, we see millions of web sites, each of which presents web pages and documents. These are simply electronic versions of the old paper-based ways of doing things: writing checks, filing taxes, looking at menus, catalog pages, magazines, etc. When you search for something on Google, you get a list of web sites that may or may not have what you’re looking for, based on keywords found in the text. You have to look at each one and decide whether it answers your question. Google doesn’t know where the information or answers are; it just knows which pages have which keywords and who links to them.

    Our information infrastructure isn’t scaling up very well at all. The average person now sees over 1,000,000 words and consumes 34 gigabytes of information every day. Mike Bergman estimates white-collar workers spend 25% of their time looking for the documents and information they need to do their work. One billion people are online now, and 4 billion have mobile phones. Exhaustion of IPv4 addresses (limit is 4 billion) is predicted for sometime in 2011. By 2030, there will be a minimum of 50 billion devices connected via internet and phone networks. Our information infrastructure is built to haul electronic versions of 19th century documents for humans to read, and it’s keeping us from using information effectively.

    The solution to our information problem is the semantic web and the pull paradigm.

One section that jumped out is "The Computerless Computer Company," where he makes this statement:

    Apple will need to stop designing state-of-the-art hardware and start designing insanely great data centers.

David is a big Apple supporter who worked on the Tekton typeface, and he has a blog post on why he should lead Apple.

The irony I just realized is that David's vision actually describes Google's plans.

The huge strength Google has vs. Apple is that its advertising system gives it a huge advantage in the "Pull" from consumers. Apple is a push company.

    Google has insanely great data centers.


Monday, Apr 5, 2010

    Next Advisor for GreenM3 NPO, Peter Horan, pushing the edge of the network to be close to customers

Our first industry advisor was Mike Manos; our next is Peter Horan. Peter is unknown to most of the data center audience because he is an executive who has worked on the edge of innovation, not in the hub of data center activity. Peter does have data center experience, though, as the Sr. VP executive for InterActive Media at the time of Ask.com's data center construction in Moses Lake, WA. Chuck Geiger was CTO of Ask.com at the time, and stated:

    “Moses Lake is an ideal location due to its cooperative business environment, access to low cost, renewable power and superior network connectivity,” said Chuck Geiger, Chief Technology Officer of Ask.com. “With these inherent benefits, Eastern Washington is the right choice for Ask.com as we expand our computing infrastructure to support our growth and expanded search services.”

Peter has had the executive's view of building a large data center, yet he has some very innovative, forward-thinking ideas and a powerful network, which brings up a presentation Peter made discussing the "Edge of the Network."


I've known Peter for many years, including his time as Sr. VP/Publisher of Computerworld and CEO of DevX.com, About.com, and AllBusiness.com, and he was an obvious candidate for the GreenM3 NPO.


    Here is a video where Peter presents the ideas to get closer to customers.  In the same way Peter encourages the audience to get close to customers, the goal of GreenM3 is to build a closer connection to customers, using open source techniques.


A person we want to talk to in Peter's network is Chuck Geiger.

    Chuck Geiger
    Partner - Technology

    Chuck has significant experience running some of the largest online transaction product organizations and most visited sites in the world, including as CTO of Ask.com, CTO of PayPal, VP Architecture of eBay, and executive positions at InterActive Corp., Gateway and Travelocity.


    At InterActive Corp, Chuck was responsible for managing a consolidated data center strategy for IAC portfolio companies including Ask.com, Evite.com, CitySearch.com, Ticketmaster, and Match.com. Chuck also was responsible for the technology organization at Ask.com including Engineering, Architecture, Program Management, QA, IT, and Operations.


    At PayPal, Chuck was responsible for the product development organization which includes Product Management, Design, Engineering, Architecture, Operations, IT, QA, Project Management, Content, and Localization, running a team of approximately 550 professionals.
    At eBay, Chuck was responsible for the migration to the new generation system architecture and platform.

    BTW, Peter's day job is Chairman of Goodmail.

    About Goodmail Systems

    Goodmail Systems is the creator of CertifiedEmail™, the industry’s standard class of email. CertifiedEmail provides a safe and reliable means for consumers to easily identify authentic email messages from legitimate commercial and nonprofit email senders. Each CertifiedEmail is sent with a cryptographically secure token that assures authenticity and is marked in the inbox with a unique blue ribbon envelope icon, enabling consumers to visually distinguish email messages which are real and sent from email senders with whom they have a pre-existing relationship.
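Goodmail doesn't spell out its token format in this blurb, but conceptually a "cryptographically secure token" can be as simple as a keyed hash over the message that the receiving mail system verifies before showing the blue ribbon. Here is a rough sketch of that idea using Python's standard hmac module; the key handling and message format are assumptions on my part, not Goodmail's actual CertifiedEmail protocol.

```python
# Conceptual sketch of a sender-certification token: a keyed hash (HMAC) over
# the message that a receiving system verifies before marking the mail as
# certified. Illustration only, not Goodmail's actual CertifiedEmail protocol.
import hashlib
import hmac

SHARED_KEY = b"key-provisioned-to-a-certified-sender"  # hypothetical key

def issue_token(message: bytes) -> str:
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify_token(message: bytes, token: str) -> bool:
    return hmac.compare_digest(issue_token(message), token)

msg = b"From: bank@example.com\nSubject: Your statement is ready\n..."
token = issue_token(msg)
print(verify_token(msg, token))                 # True  -> show the blue ribbon
print(verify_token(msg + b" tampered", token))  # False -> treat as unverified
```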

    We welcome Peter's passion for technical innovation and the environment.


Monday, Apr 5, 2010

Relationship of Electricity Generation and Water changes the game: 2 GW Entergy Nuclear Power Plant renewal permit denied based on warm water discharge

The WSJ and others report on New York environmental regulators, not the US EPA, denying the water permit Entergy needs to renew its 2-gigawatt nuclear power plant, which supplies 30% of NYC's electricity.


    New York Regulators Deny Water Permit for Nuclear Plant

    By MARK LONG

NEW YORK -- New York environmental regulators have denied a key water-quality certification Entergy Corp. needs to extend by 20 years its license to operate the 2,000-megawatt Indian Point nuclear-power plant.

    The New York Department of Environmental Conservation said in a letter to Entergy dated April 2 that the two units of the plant "do not and will not comply with existing New York State water quality standards," even with the addition of a new screening technology favored by Entergy to protect aquatic life. The plant's existing "once-through" system withdraws and returns as much as 2.5 billion gallons of Hudson River water a day for cooling, a system blamed by environmentalists for damaging the river's ecosystem and killing millions of fish a year, including the endangered shortnose sturgeon.

Certification under the Clean Water Act is required before the U.S. Nuclear Regulatory Commission can approve an extension of the operating license for Indian Point, which generates enough electricity to power approximately 2 million homes and is a major power source for New York City. The licenses for Indian Point units 2 and 3, which came online in the 1970s, are due to expire in September 2013 and December 2015, respectively.
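A quick back-of-the-envelope calculation from the two figures quoted above (2.5 billion gallons a day and 2,000 megawatts) shows what that cooling water means per unit of electricity, assuming the plant runs near full output around the clock.

```python
# Rough water-per-kWh arithmetic from the quoted figures: up to 2.5 billion
# gallons of Hudson River water withdrawn per day to cool a 2,000 MW plant,
# assuming it runs near full output around the clock.
gallons_per_day = 2.5e9
plant_mw = 2000

kwh_per_day = plant_mw * 1000 * 24               # ~48 million kWh per day
gallons_per_kwh = gallons_per_day / kwh_per_day

print(f"~{gallons_per_kwh:.0f} gallons withdrawn per kWh generated")  # ~52
```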

What is humorous is the environmental group Riverkeeper thinking that 2 gigawatts of baseload can be brought online by 2015.

    "That power is replaceable," said Alex Matthiessen, president of environmental group Riverkeeper. "The evidence for why the plant doesn't meet state water-quality standards is overwhelming," he said, adding Indian Point accounts for the deaths of about a billion fish a year and that the group estimates cooling towers could be constructed for $200 million to $300 million.

The following is from a published study on air and hybrid cooling for power plants vs. water cooling.

Emerging Issues and Needs in Power Plant Cooling Systems

Water availability is affecting power plant placement. You need to be thinking the same way about data center placement.

    However, with the construction of new power plants in recent years, perhaps the most prevalent concern with wet cooling systems has been water availability. Growing competition from municipal and agricultural users has decreased the amounts and increased the prices of good quality water resources available to industrial users. This competition is most apparent in the southwestern U.S. where the need for new electric power generation is significant, but regional surface water sources are minimal and groundwater sources are highly prized and may have designated use restrictions. But even in areas usually considered “water rich”, such as the northeastern U.S., the combination of environmental, safety & health, and resource availability concerns has resulted in an increasing interest in dry and hybrid cooling systems as alternatives to wet cooling systems.

    Size of Dry Cooling system vs. Wet Cooling - 2.2 times larger

Size. By definition, dry cooling involves the transfer of heat to the atmosphere without the evaporative loss of water (i.e., by sensible heat transfer only). Because sensible heat transfer is less efficient than evaporative heat transfer, dry cooling systems must be larger than wet cooling systems. For example, to achieve a comparable heat rejection, one study estimates that a direct dry cooling system (ACC) will have a footprint about 2.2 times larger than a wet cooling tower and a height about 1.9 times greater.
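To put those ratios in concrete terms, here is a trivial sizing calculation; the wet-tower dimensions are made-up round numbers, and only the 2.2x and 1.9x factors come from the study.

```python
# Back-of-the-envelope sizing from the study's ratios: a direct dry cooling
# system (ACC) needs roughly 2.2x the footprint and 1.9x the height of a wet
# cooling tower for comparable heat rejection. The wet-tower numbers below
# are illustrative assumptions, not figures from the study.
wet_footprint_m2 = 4_000   # assumed wet cooling tower plot area
wet_height_m = 20          # assumed wet cooling tower height

dry_footprint_m2 = 2.2 * wet_footprint_m2
dry_height_m = 1.9 * wet_height_m

print(f"Dry cooling footprint: ~{dry_footprint_m2:,.0f} m2")  # ~8,800 m2
print(f"Dry cooling height:    ~{dry_height_m:.0f} m")        # ~38 m
```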

    Maintenance of operations.

• Maintenance. Both direct and indirect dry cooling systems, as well as hybrid cooling systems, are larger and mechanically more complex than corresponding wet cooling systems. In addition to the larger heat transfer surface area, dry and hybrid cooling systems will have more fans, meaning more electrical motors, gearboxes and drive shafts. As such, labor requirements for a large ACC can be substantial. At one site with a 60-cell ACC (three 20-cell bays for three separate steam turbines), the maintenance staff was increased by two people for such activities as cleaning fan blades and heat exchanger tube fins, monitoring lube-oil systems, and leak checking the vacuum system.
• Energy penalties. Because sensible heat transfer is directly related to the ambient dry-bulb temperature, a dry cooling system must have the flexibility to respond to typical daily temperature variations of 20-25 °F. A dry system that maintains an optimum turbine backpressure at ambient dry-bulb temperatures of 90-95 °F may not be able to do so as the temperature increases, meaning a lower generating efficiency.


    From a design perspective, more surface area (i.e., a larger dry cooling system) can compensate for the decline in heat transfer at high ambient temperatures; but the greater size and associated operational control are also concerns, as previously discussed.

When all things are equal, it comes down to the cost of the systems.

Costs. If performance, availability and reliability appear to be equal, then the single issue that will most likely govern the selection and use of a power plant cooling system is cost. Unfortunately, the economics of power plant cooling systems are complex, which means cost estimates are frequently mistaken, misunderstood or misrepresented.

This complexity results from the complicated relationships of three key costs: installed equipment capital cost, annual operating and maintenance or O&M cost, and energy penalty cost. For most manufacturing processes, the first two costs can be fairly well defined and, to a certain extent, contractually guaranteed by the vendor/supplier. But the energy penalty cost is somewhat unique to power plant cooling systems because it reflects a direct performance link between the cooling system and the low-pressure turbine-generator. Consequently, the potential for and the magnitude of an energy penalty cost can dictate cooling system design and operating changes that directly affect the capital and O&M costs. So in a competitive market, generating power in the most cost-effective manner depends upon a company’s ability to balance all three key costs and optimize the overall life-cycle cost of the cooling system.
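The balancing act the study describes, capital cost against O&M against the energy penalty, is easy to sketch as a simple net-present-cost comparison. All of the dollar figures below are hypothetical placeholders meant only to show the structure of the trade-off, not data from the study.

```python
# Sketch of the life-cycle cost trade-off described above: capital cost plus
# discounted annual O&M and energy-penalty costs over the plant's life.
# All figures are hypothetical placeholders, not data from the study.
def life_cycle_cost(capital, annual_om, annual_energy_penalty,
                    years=30, discount_rate=0.07):
    """Net present cost of one cooling system option."""
    npc = capital
    for year in range(1, years + 1):
        npc += (annual_om + annual_energy_penalty) / (1 + discount_rate) ** year
    return npc

options = {
    # option: (capital $, annual O&M $, annual energy penalty $)
    "wet tower": (50e6, 2.0e6, 0.5e6),
    "dry (ACC)": (90e6, 3.5e6, 4.0e6),
    "hybrid":    (75e6, 3.0e6, 1.5e6),
}

for name, (cap, om, penalty) in options.items():
    print(f"{name:10s} net present cost: ${life_cycle_cost(cap, om, penalty) / 1e6:,.0f}M")
```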

    What is the water footprint of the power plant supplying your data center?

    Are you planning for water as a scarce resource affecting the cooling systems for your data center?

Here is what Google presented on water use at its data center event a year ago.

    Multiple Speakers Discuss Water Issues at Google’s Efficiency Data Center Summit

    I have been blogging about water issues in the data center for a while, and have a category for tagging posts for “water.”


Friday, Apr 2, 2010

Alternative to Google's hiring a Renewable Energy Systems Modeling Engineer

I am spending more time researching Low Carbon Data Center ideas, and I ran across Google's job posting for a Renewable Energy System Modeling Engineer.

    The role: Renewable Energy System Modeling Engineer - Mountain View

RE<C will require development of new utility-scale energy production systems. But design iteration times for large-scale physical systems are notoriously slow and expensive. You will use your expertise in computer simulation and modeling to accelerate the design iteration time for renewable energy systems. You will build software tools and models of optical, mechanical, electrical, and financial systems to allow the team to rapidly answer questions and explore the design-space of utility-scale energy systems. You will draw from your broad systems knowledge and your deep expertise in software-based simulation. You will choose the right modeling environment for each problem, from simple spreadsheets to time-based simulators to custom software models you create in high-level languages. The models you create will be important software projects unto themselves. You will follow Google's world-class software development methodologies as you create, test, and maintain these models. You will build rigorous testing frameworks to verify that your models produce correct results. You will collaborate with other engineers to frame the modeling problem and interpret the results.

It's great that Google sees the need for this person, but I was curious whether anyone else has done renewable energy system modeling. Guess what, there is, since 1993 in fact. NREL has this page on HOMER.

    New Distribution Process for NREL's HOMER Model

    Note! HOMER is now distributed and supported by HOMER Energy (www.homerenergy.com)

To meet the renewable energy industry’s system analysis and optimization needs, NREL started developing HOMER in 1993. Since then it has been downloaded free of charge by more than 30,000 individuals, corporations, NGOs, government agencies, and universities worldwide.

    HOMER is a computer model that simplifies the task of evaluating design options for both off-grid and grid-connected power systems for remote, stand-alone, and distributed generation (DG) applications. HOMER's optimization and sensitivity analysis algorithms allow the user to evaluate the economic and technical feasibility of a large number of technology options and to account for uncertainty in technology costs, energy resource availability, and other variables. HOMER models both conventional and renewable energy technologies:

[Image: list of conventional and renewable energy technologies modeled by HOMER]
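HOMER itself is far more detailed, but the core idea, enumerating candidate system configurations for a given load and ranking them by net present cost, can be sketched in a few lines. Every number below (component costs, capacity factors, the load) is a made-up toy assumption, not HOMER's algorithm or data.

```python
# Toy sketch of HOMER-style design-space search: enumerate candidate
# generation mixes for a fixed average load and rank them by net present
# cost. All costs and capacity factors are hypothetical, for illustration.
from itertools import product

LOAD_KW = 100                  # average load to serve
YEARS, RATE = 25, 0.06         # project life and discount rate
PV_FACTOR = sum(1 / (1 + RATE) ** y for y in range(1, YEARS + 1))

# component: (capital $/kW, O&M $/kW-yr, capacity factor), toy numbers
COMPONENTS = {
    "solar PV": (2500, 20, 0.20),
    "wind":     (2000, 40, 0.30),
    "diesel":   (800, 300, 0.90),  # O&M figure loosely stands in for fuel too
}

best = None
for sizes in product(range(0, 601, 100), repeat=len(COMPONENTS)):
    config = dict(zip(COMPONENTS, sizes))
    avg_output = sum(kw * COMPONENTS[c][2] for c, kw in config.items())
    if avg_output < LOAD_KW:   # skip mixes that cannot serve the load
        continue
    npc = sum(kw * (COMPONENTS[c][0] + COMPONENTS[c][1] * PV_FACTOR)
              for c, kw in config.items())
    if best is None or npc < best[0]:
        best = (npc, config)

print(f"Lowest net present cost: ${best[0]:,.0f} for {best[1]}")
```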

I signed up for the HOMER Energy site, which has 510 users, none apparently Google engineers.


I hope to make contact with the HOMER Energy team, as we are trying to have a session at DataCenterDynamics Seattle on a Low Carbon Data Center.

Maybe Google doesn't have to hire the Renewable Energy System Modeling engineer after all.  :-)
