Dell Powers Microsoft Windows Azure

Dell has a post about the news on its Cloud Computing Blog.

Dell Powers Microsoft Azure

Mon. Oct. 27, 2008

Today marks the start of an important chapter in the unfolding story of cloud computing. Microsoft has entered the ranks of cloud platform service providers with the launch of Windows® Azure™ and the Azure Services Platform.  Microsoft selected cloud-optimized Dell servers to power the Azure platform.  

Forrest Norrod, Vice President and General Manager of Dell's Data Center Solutions Division, took a moment to share his thoughts about the announcement in this video:

Here is the Dell hardware used by Ask.com: /2008/05/more-details-on.html

Read more

Not a Bogus Green Data Center Initiative, says Intel

Intel’s Brently Davis, manager of Intel’s data center efficiency project is quoted in Baseline Magazine.

This isn’t some bogus green data center initiative, executives say. This is an efficiency exercise that saves the green that matters most in these tough economic times.

“Our main concern is not about being green,” says Brently Davis, manager of Intel’s data center efficiency project. “It is more so about being efficient, figuring out how we can run our computing environment better.”

The problems Intel faces are the same ones any company faces.

Intel developed Davis’ group within its IT operations department just so it could tackle such issues. “You’ve always got to pull people in to help look horizontally,” he says. “A lot of times, the focus of IT groups is myopic: They like to concentrate on their own vertical. But you’ve got to have a horizontal overview, and I think that’s what we try to help them do.”

Their solution was to get finance people on board.

One of the first things that Davis did to make that happen was to bring in the financial folks to get a clear picture of spending beyond the overarching price tag—to see where the dollars were going. That’s the key step in any efficiency project, he says.

“You can’t do this if you don’t understand what you’re spending,” Davis says. “We brought in our finance team and said, ‘Pull all of this stuff together so we can figure out what we’re spending. Put these numbers together, get it validated and make sure it makes sense.’”

Intel addresses these issues with standardization.

Standardization Begets Better Utilization

With the business case laid out and the “but why-ers” on board, Intel is now putting all the logistical puzzles into place. The first step has been to work on standardizing the environments and practices to reduce redundancies and improve the way all the data centers work together. Because, as Davis says, when Intel surveyed the data center landscape in 2006, it had 150 centers and “there was no synergy to anything; we were all over the place.”

And executes with specific goals.

This was enabled through a number of strategies, including virtualization, grid computing and cloud computing. And it was coupled with efforts to do a better job refreshing servers—replacing them with fewer servers along the way.

Before the program started, Intel was on track to grow from 90,000 servers to 225,000 by 2014, Davis reported. The goal, he says, is to hold that number at 100,000 in six years’ time while significantly reducing the cost and power draw of each server.

“The only way we could do that was by getting off the old hardware,” he recalls. “We were just as guilty as everyone else. We were sitting on servers that were possibly seven or eight years old. We needed to start refreshing those servers to reduce the power consumption in the data center.”
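The fleet math behind that goal is worth spelling out. A minimal sketch using the figures Davis cites (the interpretation that workload growth drives the projected fleet size is my assumption):

```python
# Rough fleet-planning arithmetic based on the figures Davis cites.
# Treating the projected fleet growth as a proxy for workload growth
# is an illustrative assumption.

current_servers = 90_000      # fleet size when the program started
projected_servers = 225_000   # where the fleet was headed by 2014
target_servers = 100_000      # the goal after refresh and consolidation

# Implied workload growth if nothing changed (2.5x)
workload_growth = projected_servers / current_servers

# Each refreshed server must absorb this much more work than an
# old one to hold the fleet at the target size (2.25x).
required_consolidation = projected_servers / target_servers

print(f"Implied workload growth: {workload_growth:.2f}x")
print(f"Required per-server consolidation: {required_consolidation:.2f}x")
```

In other words, refreshing off seven- and eight-year-old hardware has to deliver servers that each do a bit more than twice the work of the machines they replace.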

Read more

Sun’s Software Labs Has a PUE of 1.28

I had blogged before about Sun’s Modular Data Center.

What I missed was the entry on page 15 of the document, where Sun discusses how it consolidated 32 different server labs into one space and achieved a PUE of 1.28.

Efficiency in Sun’s Santa Clara Software Datacenter
In 2007 Sun completed the largest real-estate consolidation in its history. We closed our Newark, California campus as well as the majority of our Sunnyvale, California campus, shedding 1.8 million ft² (167,000 m²) from our real-estate portfolio. One example in this consolidation is the Software organization’s datacenter, where 32,000 ft² (2,973 m²) of space distributed across 32 different rooms was consolidated into one 12,769 ft² (1,186 m²) datacenter. In the end, 405 racks were configured in this space, using an average of 31.5 ft² (2.9 m²) per rack. The initial power budget was 2 MW (out of 9 MW overall), with the ability to expand to 4 MW in the future. This design supports today's current average of 5 kW with the ability to grow to 9 kW average per rack. Keep in mind that even though the averages are 5 kW and 9 kW, racks ranging from 1 kW to 30 kW can be deployed anywhere in this datacenter.


We measured the power usage effectiveness of the Santa Clara Software organization’s datacenter, and if any single number testifies to the value of our modular design approach, it is the PUE of 1.28 that we were able to achieve. By using our modular approach, which includes using a high-efficiency variable primary-loop chiller plant, close-coupled cooling, efficient transformers, and high-efficiency UPS, an astonishing 78 percent of incoming power goes to the datacenter’s IT equipment.
Figure 4. Our Santa Clara Software organization’s datacenter achieved a PUE of 1.28, which translates to a savings of $402,652 per year compared to a target datacenter built to a PUE of 2.0.



In traditional raised floor datacenters the efficiencies worsen as densities increase. Our Pod design is efficient from day one, and remains efficient regardless of density increases.

Sun compares its savings to a PUE of 2.0, but the labs’ actual PUE was most likely above 2.0, given that the 32 labs were scattered across conventional office space rather than housed in a data center.
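The arithmetic behind Sun’s numbers is easy to check. A minimal sketch, taking the 2 MW facility budget from the quote and assuming an electricity rate of about $0.041/kWh (my assumption, not a figure from Sun):

```python
# Back-of-the-envelope check of Sun's PUE figures.
# The 2 MW facility power comes from the quoted budget; the
# electricity rate is an assumption for illustration.

facility_power_kw = 2_000   # total power entering the datacenter
pue = 1.28                  # measured power usage effectiveness
baseline_pue = 2.0          # the "target datacenter" Sun compares against
rate_per_kwh = 0.041        # assumed electricity cost, $/kWh

# PUE = total facility power / IT power, so the IT fraction is 1/PUE.
it_power_kw = facility_power_kw / pue
it_fraction = 1 / pue       # ~0.781, matching the "78 percent" claim

# A PUE-2.0 facility carrying the same IT load draws it_power * 2.0 total.
baseline_facility_kw = it_power_kw * baseline_pue
saved_kw = baseline_facility_kw - facility_power_kw

annual_savings = saved_kw * 8760 * rate_per_kwh   # 8760 hours per year

print(f"IT fraction: {it_fraction:.1%}")
print(f"Annual savings vs PUE 2.0: ${annual_savings:,.0f}")
```

With these assumed inputs the result lands close to the $402,652 figure in Sun’s caption, which suggests the comparison is straightforward energy-cost arithmetic at roughly four cents per kWh.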


Note that the Santa Clara datacenter is a Tier 1 facility, by choice, with only 20 percent of the equipment on UPS. Choosing Tier 1 was a corporate decision to match the datacenter with the functions that it supports. The design approach is the same for our Tier 3 datacenters with one exception: the amount of redundancy. As you increase the redundancy (to N+1 or 2N) you can lose efficiency. If you make the correct product and design decisions that make efficiency the highest priority, you can maintain a PUE of 1.6 or less in a Tier 3 datacenter.

Read more

Apple’s Green MacBook and Display

Apple has been a favorite target for Greenpeace looking at its environmental impact.

With its new MacBook, Apple is now marketing the machine’s green features.


Apple is most proud of what isn’t in the MacBook.

What’s common in other notebooks is missing in the new MacBook. Take, for example, mercury used in CCFL backlights and arsenic contained in the glass of traditional LCD displays. Apple engineers have said no to both. They’ve chosen LED technology and arsenic-free glass. They’ve also said no to brominated flame retardants (BFRs) in logic boards and PVC in cables and connectors. In fact, Apple has done more than remove these toxins from the new MacBook. They’ve done the same for the rest of the new MacBook family, the Apple LED Cinema Display, every iPod, and iPhone 3G. Sometimes saying no is a good thing.


Fewer parts.
Greener parts.

The new MacBook is built with significantly fewer parts. And the parts that remain are significantly greener. Take the unibody, the foundation of the notebook itself. It’s a single piece of solid, recyclable aluminum that replaces dozens of extraneous pieces once destined for landfill.

Small volume speaks volumes.

Made from recycled material, the new MacBook packaging is 41 percent smaller than the previous generation. And that’s huge. It means less paper used for smaller boxes. It also means Apple can use fewer planes to transport the same number of products.


Buried in Apple’s new laptop announcements is Apple’s Greenest Display.

The greenest Apple display ever.

The LED Cinema Display is the most environmentally friendly display Apple has ever created.

Toxin free

One thing that makes the LED Cinema Display so remarkable is what it lacks. Namely, environmentally harmful mercury. And like the latest-generation iPod, iPhone, and Mac computers, the glass used in the display is arsenic-free. Even the internal cables and components are BFR- and PVC-free.

Highly recyclable

Because of its glass and aluminum construction, the LED Cinema Display is highly recyclable. So when you eventually part with it, rest assured it can be remade into something new.

ENERGY STAR

The LED Cinema Display is designed to meet the low power requirements set by the EPA and the U.S. Department of Energy, giving it the ENERGY STAR certification. The result is a display that reduces energy consumption and your carbon footprint.

EPEAT Gold

The Electronic Product Environmental Assessment Tool, or EPEAT, ranks the performance of a product throughout its lifecycle according to its environmental attributes. The LED Cinema Display earned the highest rating of EPEAT Gold.


Read more

Sun’s 7 Quick, Simple Steps to Green The Data Center

Found this Sun entry on its Green Data Center page, which Sun calls “Eco Responsible Data Centers.”

Quick, Simple Steps to Save Energy

Take these quick actions to save energy and money in your datacenter, and help the planet in just 30 days.

  1. Adjust the thermostat set point to 77°F per ASHRAE standards.
  2. Identify and turn off unused machines (typically 8-10% of machines).
  3. Use the Sun Eco Assessment Service to fix airflow problems.
  4. Enable power management features where available (PowerNow, SpeedStep).
  5. Refresh old servers; use trade-in and rebate programs to help pay.
  6. Refresh disk technology (replace <70GB drives with >500GB).
  7. Move old data to tape (tape provides 20x-200x more TB/kW).
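Step 2 alone is worth estimating before you start. A minimal sketch of the payoff, where the fleet size, per-server draw, PUE, and electricity rate are all assumptions you would replace with your own numbers:

```python
# Estimate the annual savings from turning off unused servers (step 2).
# Every input here is an illustrative assumption.

servers = 1_000            # machines in the room
unused_fraction = 0.09     # midpoint of Sun's "typically 8-10%"
watts_per_server = 300     # assumed average draw of an idle-but-on server
pue = 2.0                  # facility overhead multiplier (cooling, power loss)
rate_per_kwh = 0.10        # assumed electricity cost, $/kWh

# Power the unused machines pull at the wall, including facility overhead.
unused_kw = servers * unused_fraction * watts_per_server / 1000 * pue

annual_kwh = unused_kw * 8760          # hours in a year
annual_savings = annual_kwh * rate_per_kwh

print(f"Unused load at the wall: {unused_kw:.0f} kW")
print(f"Annual savings: ${annual_savings:,.0f}")
```

Even a modest 1,000-server room yields tens of thousands of dollars a year under these assumptions, which is why the unused-machine sweep comes second on the list.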

The last two are good points, as many people don’t think of storage as part of their green efforts.

Read more