Google bypasses Utility for Smart Meters

GigaOM posts on Google's latest PowerMeter efforts.

Why Google’s PowerMeter Gadget Partnership Is a Big Power Play

By Katie Fehrenbacher | Tuesday, October 6, 2009 | 11:53 AM PT

With Google’s endless projects — from book search to a browser killer to Blogger — you’re probably wondering why I’m so excited about a new partnership deal for the company’s PowerMeter energy management tool. Well, here’s why: For the first time, consumers can now access PowerMeter via a gadget called the TED-5000, made by startup Energy Inc., and users don’t need to go through their utility or have a smart meter (a digital two-way electricity meter) installed to access it. In other words, Google has finally bypassed the utility with PowerMeter, which is an important step for both bringing consumer energy management products to the mainstream, and pushing utilities to more quickly embrace information technology networks and broadband.

Smart meters are great, but the problem is that just a little over 6 percent of households in the U.S. currently have them. While that percentage will grow dramatically in the coming years, it will take time, and PowerMeter’s former smart grid strategy would have meant the tool was only available to a small portion of the population for quite some time. The other drawback to the smart meter architecture is that utilities are installing smart meters attached to networks that aren’t exactly the most robust. Utilities commonly build networks that can significantly delay the time it takes the energy information to reach the customer — smart meters will often grab energy info every 15 minutes to an hour, but then the utility network will bring that data to the data center and display it back to the customer in a 24-hour period.
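The delay described above can be put in rough numbers. This sketch uses the article's figures (15-minute metering intervals, 24-hour batch delivery); the function itself is just an illustration, not anything a utility actually runs:

```python
# Back-of-envelope staleness of utility-delivered interval data.
# A smart meter records a reading every `interval_min` minutes, but the
# utility only ships the accumulated readings to its data center (and on
# to the customer) once every `batch_hours` hours.

def worst_case_staleness_hours(interval_min: float, batch_hours: float) -> float:
    """Oldest age a reading can reach before the customer sees it."""
    # A reading taken just after a batch upload waits a full batch cycle,
    # plus the interval that elapsed before it was even recorded.
    return batch_hours + interval_min / 60.0

# 15-minute intervals delivered on a 24-hour batch cycle (the article's numbers)
print(worst_case_staleness_hours(15, 24))  # 24.25 hours
```

A device like the TED-5000, reading the meter locally, collapses that entire delivery delay to near zero, which is the point of bypassing the utility network.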

I wonder if this will bleed into corporate and commercial environments?

One point mentioned:

Utilities haven’t traditionally been very good at IT — they haven’t had to be — but that’s all changing, and next-generation utilities will need to be as proficient in running data networks as they are at managing power networks. Some forward-thinking utilities like San Diego Gas & Electric know that and are building multimillion-dollar wireless networks to manage their smart grid deployments. PowerMeter and the TED-5000 are just a small piece of that equation, but they’re an important first step in giving consumers easy access, and ownership over, their energy information.

Read more

80 years of Weather Data for Data Center Location

Looking for a place to put a data center with free cooling? Or maybe for photovoltaics? You need weather data.

AWS blogs about 20 GB of daily weather data.

New Public Data Set: Daily Global Weather

The folks at Infochimps have just released the Daily Global Weather Public Data Set.

This 20 GB data set incorporates daily weather measurements (temperature, dew point, wind speed, humidity, barometric pressure, and so forth) from over 9000 weather stations around the world. The data was originally collected as part of the Global Surface Summary of the Day (GSOD) by the National Climatic Data Center and is available from 1929 to the present, with the data from 1973 to the present being the most complete.
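As a sketch of how this data might feed a siting decision, here is a minimal free-cooling screen over daily mean temperatures. The 65 °F economizer threshold and the sample data are assumptions for illustration, not part of the data set:

```python
# Count the days at a candidate site cool enough for free (air-side) cooling.
# GSOD daily mean temperatures are in Fahrenheit; 65 F is an assumed
# economizer cutoff, not a standard.

def free_cooling_days(daily_mean_temps_f, threshold_f=65.0):
    """Days whose mean temperature is at or below the threshold."""
    return sum(1 for t in daily_mean_temps_f if t <= threshold_f)

# Illustrative year: 120 cold days, 120 mild days, 125 warm days
sample_year = [30.0] * 120 + [60.0] * 120 + [80.0] * 125
print(free_cooling_days(sample_year))  # 240 of 365 days
```

Run over 35+ years of a station's GSOD records, the same count gives a distribution of free-cooling days rather than a single-year anecdote.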

The map at right contains one yellow dot for each data collection station.

-- Jeff;

September 28, 2009

For those who need to connect to the data feed, here is the entry from AWS Public Data Sets.

Daily Global Weather Measurements, 1929-2009 (NCDC, GSOD)

 

A collection of daily weather measurements (temperature, wind speed, humidity, pressure, &c.) from 9000+ weather stations around the world.

Submitted By: infochimps
US Snapshot ID (Linux/Unix): snap-ac47f4c5
US Snapshot ID (Windows): snap-8547f4ec
Size: 20 GB
Creation Date: 08/22/09
Last Updated: 08/22/09
License: Other
Source: National Climatic Data Center (NCDC)


Data originally collected as part of the Global Surface Summary of the Day (GSOD) by the National Climatic Data Center (NCDC). Data collected, transformed, and uploaded by Infochimps.org.
Global summary of day data for 18 surface meteorological elements are derived from the synoptic/hourly observations contained in USAF DATSAV3 Surface data and Federal Climate Complex Integrated Surface Data (ISD). Historical data are generally available for 1929 to the present, with data from 1973 to the present being the most complete. For some periods, one or more countries' data may not be available due to data restrictions or communications problems. In deriving the summary of day data, a minimum of 4 observations for the day must be present (this allows for stations which report 4 synoptic observations per day). Since the data are converted to constant units (e.g., knots), slight rounding error from the originally reported values may occur (e.g., 9.9 instead of 10.0).
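The rounding caveat at the end of that note is easy to see concretely. This sketch (the conversion factor is standard; the 5.1 m/s sample reading is an assumption) shows why a round trip through one-decimal knots does not reproduce the original value:

```python
# Illustration of the NCDC rounding note: values are stored in constant
# units (e.g. knots) to one decimal place, so converting back does not
# always reproduce the originally reported reading exactly.

MS_PER_KNOT = 0.514444  # metres per second in one knot

def ms_to_knots_stored(v_ms: float) -> float:
    """Convert m/s to knots as GSOD would store it: one decimal place."""
    return round(v_ms / MS_PER_KNOT, 1)

def knots_to_ms(v_kt: float) -> float:
    return v_kt * MS_PER_KNOT

stored = ms_to_knots_stored(5.1)          # a 5.1 m/s report becomes 9.9 kt
print(stored)                             # 9.9
print(round(knots_to_ms(stored), 2))      # 5.09, not the original 5.1
```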

Read more

Intel Labs – Future of Energy Efficiency Processors – Self-Tuning Performance

I learned more than I thought I would at Intel Developer Forum. There was a lot of excitement about the latest processors and news.com has thorough coverage of IDF.

IDF 2009: Intel plays to its strengths

by CNET News staff

At the annual developer forum, Intel shows off what it can do with silicon and what to look forward to from systems built around its chips.

Intel unveils system-on-a-chip for TVs
The CE4100 is designed to bring Internet content and services to digital TVs, DVD players, and advanced set-top boxes.
(Posted in Nanotech: The Circuits Blog by Brooke Crothers)
September 24, 2009 1:30 PM PDT
Intel's Maloney: Our business is do or die
Sean Maloney, a favorite to eventually become Intel's CEO, says there are good reasons the chipmaker is pushing back against Europe's antitrust charges.
(Posted in Nanotech: The Circuits Blog by Brooke Crothers)
September 24, 2009 10:26 AM PDT

With all the hype, I was filtering, looking for something really game-changing. Something that would make things more efficient. I found it at an Intel Labs booth tucked in the back of the exhibit area, where Shih-Lien Lu was demonstrating Self-Tuning Processors.


By modifying Vcc voltage and clock frequency, the processor can be tuned for energy efficiency or for performance.

How big?  21% more throughput or 37% less power!!!


And there is a middle ground of 5% better performance and 28% less power.
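The tradeoff Intel Labs demonstrated lines up with the textbook CMOS dynamic-power model. The scale factors below are illustrative assumptions chosen to roughly match the quoted percentages, not Intel's measured numbers:

```python
# Textbook CMOS dynamic power model: P ~ C * V^2 * f, throughput ~ f.
# Raising the clock at fixed Vcc buys throughput linearly (at more power);
# lowering Vcc at fixed frequency cuts power quadratically.

def relative_power(v_scale: float, f_scale: float) -> float:
    """Power relative to baseline, given voltage and frequency scale factors."""
    return v_scale ** 2 * f_scale

# Hypothetical operating points (scale factors are assumptions):
print(relative_power(1.0, 1.21))   # ~21% faster clock -> ~21% more power
print(relative_power(0.79, 1.0))   # ~21% lower Vcc -> ~38% less power
```

The interesting part of the demo is that self-tuning lets the same silicon sit at different points on this curve, instead of being fused to one conservative operating point at the factory.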


A prototype board was also on display.

There must be a catch. Why isn't Intel shipping this concept already?

Because it requires a different mindset for the market and users. The guardband diagram Intel showed illustrates the Vcc voltage and temperature/Fclk margin that typically exists in processors. There is a margin of safety to ensure Intel processors' reliability over a seven-year period. Huh? But what if I don't want seven years? Welcome to the problem with enterprise computing: lowest-common-denominator thinking to reach the market masses means you get burdened with conservative designs.

What happens if you only want a guardband designed for a three-year period? You could, in theory, do what Intel Labs shows and run lower Vcc voltages with higher clock frequencies, but this would require Intel marketing and finance to rethink how they price processors. What is the value of a change from seven to three years of product reliability?


Why go through all this effort? 

  1. Do you want a 21% performance improvement for the same power?
  2. Do you want to save 37% processor power for the same performance?
  3. Do you want 5% more performance for 28% less processor power?

Sound confusing? Yes, and it will make customers' procurement processes more complicated, as they are handed a performance/energy design envelope instead of a single spec.

This is another example of the Flaw of Averages where people want a single number when in reality there is a distribution of performance.
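A tiny numeric sketch of the Flaw of Averages (the sample values here are made up): two workloads can share the same average while behaving completely differently.

```python
# Two workloads with identical average performance but very different spreads.
# The single "average" number hides exactly the information that matters.
from statistics import mean, pstdev

steady = [100, 100, 100, 100, 100]
bursty = [40, 60, 100, 140, 160]

print(mean(steady), pstdev(steady))  # 100  0.0
print(mean(bursty), pstdev(bursty))  # 100  ~45.6
```

Handing a buyer only the mean of either series tells them nothing about which chip, or operating point, they actually need.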

Read more

Smart Grid Issues for Electric Vehicles looks like Data Center Power Monitoring Complexity

Like fractals, complex systems can be self-similar.

Fractal

From Wikipedia, the free encyclopedia


The Mandelbrot set is a famous example of a fractal

A fractal is generally "a rough or fragmented geometric shape that can be split into parts, each of which is (at least approximately) a reduced-size copy of the whole,"[1] a property called self-similarity. Roots of mathematical interest on fractals can be traced back to the late 19th Century; however, the term "fractal" was coined by Benoît Mandelbrot in 1975 and was derived from the Latin fractus meaning "broken" or "fractured." A mathematical fractal is based on an equation that undergoes iteration, a form of feedback based on recursion.[2]

A fractal often has the following features:[3]

  • It has a fine structure at arbitrarily small scales.
  • It is too irregular to be easily described in traditional Euclidean geometric language.
  • It is self-similar (at least approximately or stochastically).
  • It has a Hausdorff dimension which is greater than its topological dimension.
  • It has a simple and recursive definition.

Because they appear similar at all levels of magnification, fractals are often considered to be infinitely complex (in informal terms). Natural objects that approximate fractals to a degree include clouds, mountain ranges, lightning bolts, coastlines, snow flakes, various vegetables (cauliflower and broccoli), and animal coloration patterns. However, not all self-similar objects are fractals—for example, the real line (a straight Euclidean line) is formally self-similar but fails to have other fractal characteristics; for instance, it is regular enough to be described in Euclidean terms.
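The iterated equation behind the Mandelbrot set mentioned above is short enough to write down: iterate z → z² + c and see whether the orbit stays bounded.

```python
# Mandelbrot membership test: iterate z -> z^2 + c starting from z = 0 and
# check whether the orbit stays bounded (|z| <= 2) for max_iter iterations.

def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:   # once |z| exceeds 2 the orbit is known to escape
            return False
    return True

print(in_mandelbrot(0))       # True  (the origin never escapes)
print(in_mandelbrot(1 + 1j))  # False (the orbit escapes within two steps)
```

That one-line feedback rule generating unbounded visual complexity is the self-similarity point in miniature.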

Gigaom has an article about the complexity of a smart grid electric vehicle system.

Report: IT and Networking Issues for the Electric Vehicle Market (GigaOM Pro, subscription required)

Summary:

This Pike Research report focuses on the IT and networking requirements associated with technology support systems for the emerging Electric Vehicle (EV) market. Key areas covered include vehicle connection and identification, energy transfer and vehicle-to-grid systems, communications platforms, pricing and billing systems and implementation issues.

The new generation of mass-produced EVs (including both plug-in and all-electrics) that will start arriving in 2010 will be able to charge at the owner’s residence, place of business, or any number of public and private charging stations. Keeping track of the ability of these vehicles and the grid to transfer energy will require transmitting data over old and new communications pathways using a series of developing and yet-to-be-written standards.

Industries that previously had little to no interaction with each other are now collaborating, determining new technologies and standard protocols and formats for sharing data. Formerly isolated networks must be able to handshake and seamlessly share volumes of financial and performance data. EV charging transactions will, for the first time, bring together platforms including vehicle operating systems and power management systems, utility billing systems, grid performance data, charging equipment applications, fixed and wireless communications networks, and web services.
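To make the data-integration point concrete, here is a hypothetical sketch of the kind of record a single charging transaction would have to carry across those formerly isolated systems. Every field name and the tariff arithmetic here are assumptions for illustration, not any standard:

```python
# Hypothetical charging-session record spanning the vehicle, the charging
# equipment, and the utility billing system. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class ChargingSession:
    vehicle_id: str            # from the vehicle identification handshake
    station_id: str            # from the charging equipment application
    meter_kwh: float           # energy transferred, from the meter
    tariff_usd_per_kwh: float  # utility real-time price for the session

    def billed_amount(self) -> float:
        """Dollar amount the utility billing system would charge."""
        return round(self.meter_kwh * self.tariff_usd_per_kwh, 2)

session = ChargingSession("EV-1234", "STN-07", 18.5, 0.12)
print(session.billed_amount())  # 2.22
```

Even this toy record touches four of the report's domains; the hard part is the standards work to agree on the real fields and who owns them.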

When you look at the complexity of this system, it looks amazingly like the issues involved in putting a real-time energy monitoring system into the data center.

  1. Executive Summary
  2. Vehicle Connection and Identification
    1. Building Codes
    2. Battery Status
    3. Managing Vehicle-Grid Interaction
    4. Power Transfer
      1. Timed Power Transfer
  3. Communications Between Charging Locations and the Grid
    1. Home Area Networks
      1. Smart Meters
    2. Communications Channels
      1. Broadband
      2. ZigBee
      3. Powerline Networking
      4. Cellular Networks
  4. Utility Interaction with Customers
    1. Real-time Energy Pricing
    2. Enabling Vehicles to Respond to Grid Conditions
    3. Renewable Energy
    4. Future Vehicle to Grid (V2G) Applications
  5. Implementation Issues
    1. Cost
    2. Standards in Flux
    3. Clash of Multiple Industries
      1. Control
    4. Privacy
Read more

Best Possible Conditions for a Green Data Center, Intel & T-Systems launch DataCenter2020

Intel and T-Systems have launched DataCenter2020.

Bavaria's Silicon Valley: DataCenter 2020

Sep 18, 2009

  • T-Systems and Intel join in researching the efficient data center of the future

Opening in Munich: At DataCenter 2020, T-Systems and Intel are working on the industrial implementation and automation of ICT services. Their goal: bringing them to market with maximum energy and cost savings. In an initial phase at Euroindustriepark, the two companies are researching how to create the best possible conditions for a green data center. Initial findings will be published this year. They will serve as the basis for ecological improvements to new and existing data centers.

Some of the technical details:

Highlights of DataCenter 2020 include a ceiling height that can be adjusted from 2.50 meters to 3.70 meters and a smoke generator that makes air flows visible. The test environment, roughly 70 square meters, and an equipment room of the same size are located in the T-Systems data center. Intel is providing about 180 servers for the project, while the corporate customer arm of Deutsche Telekom is supplying the infrastructure necessary to operate them.


Read more