
    Attending Intel Developer Forum, Sept 22 - 24

    I am attending Intel Developer Forum, Sept 22 – 24 next week.


    Intel has energy efficiency as a big topic for data centers, mobile, and smart grid.

    Eco-tech Community

    See how Smart Grid technology is changing the way society generates, distributes, and manages energy — from Smart Wind Turbines, to Smart Transmission Networks, to Smart Meters, to Enterprise IT, and Smart Homes. Learn about embedded Atom Processor applications, Intel Xeon processors for Grid Modeling and Simulation, and WiMAX for Smart Grid communications. See how Home Energy Management Systems, running on either the PC, Living Room TV, or In-Home Display, are helping consumers reduce their energy costs and carbon footprints. In the Enterprise, see how much energy can be saved by utilizing the latest Xeon Processors in Data Centers, and by utilizing Intel vPro Technology, the latest Core 2 Duo Desktops, and latest Core 2 Duo Notebooks in Enterprise PC networks.

    Virtualization is another big area.

    Virtualization Community

    Industry leaders in the Virtualization Community, located in the Technology Showcase, will highlight their latest usage models and new technologies, all aimed at adding value to virtualization solutions and the enterprise cloud. Virtualization has become a key enabler of innovation in the dynamic data center. Come see how we are expanding IT capabilities with increased product offerings, new usage models, and flexible infrastructure solutions to meet customers’ ever-changing business requirements. Join these leaders to discuss the next generation of virtualization and enterprise cloud technologies in IT.

    If you think you will be at IDF, feel free to send me an email and we can try to meet at the show.



    KC Mares Asks The Tough Questions, Rewarded by PUE of 1.04

    DataCenterKnowledge has a post on Ultra-Low PUE.

    Designing for ‘Ultra-Low’ Efficiency and PUE

    September 10th, 2009 : Rich Miller

    The ongoing industry debate about energy efficiency reporting based on the Power Usage Effectiveness (PUE) metric is about to get another jolt. Veteran data center specialist KC Mares reports that he has worked on three projects this year that used unconventional design decisions to achieve “ultra-low PUEs” of between 1.046 and 1.08. Those PUE numbers are even lower than those publicly reported by Google, which has announced an average PUE of 1.20 across its facilities, with one facility performing at a 1.11 PUE in the first quarter of 2009.

    KC’s post has more details.

    Is it possible, a data center PUE of 1.04, today?

    I’ve been involved in the design and development of over $6 billion of data centers (maybe about $10 billion now; I lost count after $5 billion a few years ago), so I’ve seen a few things. One thing I do see in the data center industry is, more or less, the same design over and over again. Yes, we push the envelope as an industry, and yes, we do design some pretty cool stuff, but rarely do we sit down with our client, the end user, and ask them what they really need. They often tell us a certain Tier level or availability they want, and the MWs of IT load to support, but what do they really need? Often everyone in the design charrette assumes what a data center should look like without really diving deep into what is important.

    And KC asks the tough questions.

    Rarely did I get the answers I wanted to hear from the end users, where they really questioned the traditional thinking about what a data center should be and why, but we did get to some unconventional conclusions about what they needed instead of automatically assuming what they needed or wanted.

    We questioned what they thought a data center should be: how much redundancy did they really need? Could we exceed ASHRAE TC9.9 recommended or even allowable ranges? Did all the IT load really NEED to be on UPS? Was N+1 really needed during the few peak hours a year or could we get by with just N during those few peak hours each year and N+1 the rest of the year?
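KC's N versus N+1 question is worth making concrete. Here is a hypothetical sketch of the arithmetic; the unit capacity and loads below are made-up numbers, not from his post:

```python
import math

def units_required(load_kw: float, unit_kw: float, spares: int = 0) -> int:
    """Units needed to carry a load at capacity unit_kw, plus spare units."""
    return math.ceil(load_kw / unit_kw) + spares

peak_kw, typical_kw, unit_kw = 10_000, 8_000, 1_000  # illustrative only

n_plus_one_at_peak = units_required(peak_kw, unit_kw, spares=1)  # 11 units
n_at_peak = units_required(peak_kw, unit_kw)                     # 10 units

# Installing 10 units instead of 11 means running at N during the few
# peak hours a year, while still giving at least N+1 coverage at the
# 8 MW typical load the rest of the year.
```

The point of the exercise is that the extra unit buys redundancy only for a handful of hours a year; whether that is worth the capital cost is exactly the question KC is urging designers to ask.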

    KC provides background we wish others would share.

    Now, you ask, how did we get to a PUE of 1.05? Let me hopefully answer a few of your questions: 1) yes, based on annual hourly site weather data; 2) all three have densities of 400-500 watts/sf; 3) all three are roughly Tier III to Tier III+, so all have roughly N+1 (I explain a little more below); 4) all three are in climates that exceed 90F in summer; 5) none use a body of water to transfer heat (i.e., lake, river, etc.); 6) all are roughly 10 MWs of IT load, so pretty normal size; 7) all operate within TC9.9 recommended ranges except for a few hours a year within the allowable range; and most importantly, 8) all have construction budgets equal to or LESS than standard data center construction. Oh, and one more thing: even though each of these sites has some renewable energy generation, this is not counted in the PUE to reduce it; I don’t believe that is in the spirit of the metric.
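To put those numbers in perspective, PUE is simply total facility power divided by IT power, so the overhead a given PUE implies at a 10 MW IT load is easy to work out. A quick sketch (the loads are illustrative):

```python
def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_kw / it_kw

def overhead_kw(it_kw: float, pue_value: float) -> float:
    """Non-IT load (cooling, power distribution, lighting) implied by a PUE."""
    return it_kw * (pue_value - 1.0)

it_kw = 10_000  # ~10 MW IT load, the size KC cites

for p in (1.05, 1.11, 1.20):
    print(f"PUE {p:.2f}: ~{overhead_kw(it_kw, p):,.0f} kW of overhead")
```

At PUE 1.05 the non-IT overhead is about 500 kW, versus roughly 2 MW at Google's reported 1.20 average, which is why an ultra-low PUE at this scale is such a big deal.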

    If you want higher efficiencies and lower costs, you need to be ready to ask the tough questions.

    The easy thing to do is collect the requirements of various stakeholders and say this is what we need built, without ever asking how much each requirement costs.

    I know KC’s blog entry has others curious, and he has lots more appointments.

    Hopefully this will wake up many others to ask the tough questions of “how much does that data center requirement cost?”



    Real-Time Water Data for US

    Water is a critical resource for running a data center, yet few think about monitoring its supply chain.  You can buy generators to provide backup power and run for weeks on them.  But how long can you run when you lose your water supply?

    Found a US Geological Survey web site with Real-Time Water Data for the US.

    It is nice to know the US gov’t is providing real-time information through web services that let you get notifications of changes in water availability.
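The USGS serves this data through its NWIS web services; here is a minimal sketch of how you might query it. The endpoint and the streamflow parameter code are real, but the site number and the simplified JSON handling are my assumptions, so check the service documentation before relying on them.

```python
# Sketch of querying the USGS real-time water data service (NWIS).
import json
from urllib.parse import urlencode

NWIS_IV = "https://waterservices.usgs.gov/nwis/iv/"

def build_request_url(site: str, parameter: str = "00060") -> str:
    """Build an instantaneous-values request (00060 = streamflow, cfs)."""
    query = urlencode({"format": "json", "sites": site, "parameterCd": parameter})
    return f"{NWIS_IV}?{query}"

def latest_reading(payload: str) -> float:
    """Pull the most recent value out of a (simplified) NWIS JSON response."""
    series = json.loads(payload)["value"]["timeSeries"][0]
    return float(series["values"][0]["value"][-1]["value"])

# Canned response standing in for a live network call; the shape here is
# a simplified assumption about the service's JSON output.
sample = json.dumps({"value": {"timeSeries": [
    {"values": [{"value": [{"value": "1820", "dateTime": "2009-09-14T12:00"}]}]}
]}})
url = build_request_url("01646500")  # hypothetical site number
```

A monitoring setup could poll a URL like this on a schedule and alert when streamflow at the sites feeding a facility drops below a threshold.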


    For New York City here is a status of the reservoirs.




    Intuit Acquires Mint, Greening its Personal Financial Services

    I have used Mint, and friends and I have discussed how being free and based in the cloud changes personal finance.


    Now, I admit to being a bit biased in investigating this, as I have friends who work at Intuit.  Well, this morning Intuit announced its $170 million acquisition of Mint, so now it is all one company.

    Intuit to swallow Mint for $170 million

    by Don Reisinger

    Financial software maker Intuit has signed an agreement to acquire personal finance service Mint for $170 million.

    "With this transaction, Intuit will gain another fast-growing consumer brand and a highly successful Software as a Service (SaaS) offering that helps people save and make money," Intuit CEO Brad Smith said in a statement Monday. "This move will enhance Intuit's position as a leading provider of consumer SaaS offerings that connect customers across desktop, online and mobile."

    TechCrunch reported the deal Sunday night, citing unnamed sources.

    Mint, a start-up launched two years ago that tracks personal finance data, became a CNET Webware 100 winner in 2008 and again in 2009. It was also the 2007 winner of the TechCrunch50, which kicks off once again Monday in San Francisco.

    It was a smart move for Intuit to acquire another brand that is a completely cloud-based personal finance service.  One of the interesting differences is the demographics of Quicken vs. Mint.

    Mint's features have apparently helped it attract a younger, more diverse demographic than Intuit's Quicken Online. Mint founder and CEO Aaron Patzer told CNET News last year that 40 percent of his company's users are women. He claimed Quicken's demographic was still "85 percent men." Assuming that's true, it would appear that Intuit can significantly expand its base with the Mint acquisition.



    Future Nuclear Reactors Simplify to Improve Reliability

    WSJ has an article about future nuclear reactors.

    The New Nukes

    The next generation of nuclear reactors is on its way, and supporters say they will be safer, cheaper and more efficient than current plants. Here's a look at what's coming -- and when.


    If there ever were a time that seemed ripe for nuclear energy, it's now.

    For the first time in decades, popular opinion is on the industry's side. A majority of Americans thinks nuclear power, which emits virtually no carbon dioxide, is a safe and effective way to battle climate change, according to recent polls. At the same time, legislators are showing renewed interest in nuclear as they hunt for ways to slash greenhouse-gas emissions.


    The article makes interesting points that forward thinkers like Mike Manos have cited as a path for future data centers.

    "A common theme of future reactors is to make them simpler so there are fewer systems to monitor and fewer systems that could fail," says Revis James, director of the Energy Technology Assessment Center at the Electric Power Research Institute, an independent power-industry research organization.

    And a specific example of simplification is discussed.

    The current generation of nuclear plants requires a complex maze of redundant motors, pumps, valves and control systems to deal with emergency conditions. Generation III plants cut down on some of that infrastructure and rely more heavily on passive systems that don't need human intervention to keep the reactor in a safe condition—reducing the chance of an accident caused by operator error or equipment failure.

    For example, the Westinghouse AP1000 boasts half as many safety-related valves, one-third fewer pumps and only one-fifth as much safety-related piping as earlier plants from Westinghouse, majority owned by Toshiba Corp. In an emergency, the reactor, which has been selected for use at Southern Co.'s Vogtle site in Georgia and at six other U.S. locations, is designed to shut down automatically and stay within a safe temperature range.

    The reactor's passive designs take advantage of laws of nature, such as the pull of gravity. So, for example, emergency coolant is kept at a higher elevation than the reactor pressure vessel. If sensors detect a dangerously low level of coolant in the reactor core, valves open and coolant floods the reactor core. In older reactors, emergency flooding comes from a network of pumps—which require redundant systems and backup sources of power—and may also require operator action.

    Gallup has a public opinion poll.

