CA about to launch EcoSoftware for Green IT

CNET news reports CA will be releasing an EcoSoftware solution.

CA jumps into eco-software market

by Larry Dignan

CA next week will unveil an integrated sustainability suite designed to track carbon emissions, environmental assessments, metering, and policy compliance in one dashboard.

CA calls the suite EcoSoftware and will launch it Monday, according to Christopher Thomas, vice president of energy and sustainability. I ran into Thomas at the Gartner IT Symposium, where the carbon-monitoring software caught my eye.

There are other efforts designed to track carbon emissions. Hara and SAP, for instance, have various applications, and others use metering to measure sustainability efforts.

Read more of "CA jumps into eco software market; Plans to launch carbon tracking suite" at ZDNet's Between the Lines.

I have written in the past that it was natural for management tool vendors – Tivoli, OpenView, and CA – to add Green IT management, so this is no surprise.

We’ll get more details next week as the launch is scheduled for Oct 26.


Watching a person read my blog – Gartner employee

I use TypePad for my blog and go through the site statistics to see what people are reading, checking out which Google search terms they use to find my entries.  This gives me an idea of what people are searching for and who my readers are.  The following visitor, I am pretty sure, is a Gartner employee.

This morning at 9:30 a.m. PST I wrote a blog post on Gartner’s recommendation for a Pattern-Based Strategy.  At 2:45 and 2:48 p.m. I got the following hits.

[Screenshot: site statistics showing the two referral hits]

The first was a Google search for “gartner advanced analytics.”  My entry from five hours earlier is result #6, beating out NetworkWorld.

[Screenshot: Google results for “gartner advanced analytics”]

The second is a Google search for “reshaping the data center gartner.”  I have the #1 search result, behind Google News, but beating Gartner’s blog, CIO.com, and news.cnet.com.

[Screenshot: Google results for “reshaping the data center gartner”]

I was visiting a friend who works at Google yesterday, and we chatted briefly about how well my blog works with Google search, but I swear I have no insider information.

All I know is to keep on writing and keep on looking at my results.

Thanks for reading my blog.


Gartner says companies must implement a Pattern-Based Strategy

In my day job, I help clients be innovative leaders, constantly looking for what it takes to be better than the rest. Gartner recently announced a new initiative called Pattern-Based Strategy.

It is a pleasant surprise to see nine Gartner analysts arrive at a recommendation I’ve been using for over five years in IT infrastructure.

Introducing Pattern-Based Strategy

7 August 2009

Yvonne Genovese, Valentin T. Sribar, Stephen Prentice, Betsy Burton, Tom Austin, Nigel Rayner, Jamie Popkin, Michael Smith, David Newman

The environment after the recession means business leaders must be more proactive in seeking patterns from conventional and unconventional sources that can positively or negatively impact strategy or operations, and set up a consistent and repeatable response by adjusting business patterns.

One of the best groups I worked with at Microsoft, and one where I still have many friends, is the Patterns & Practices group, and I still have regular discussions about how data centers and IT could and should use a patterns-based approach.

You’ve probably guessed from the first half of our name that we’re rather enthusiastic about design patterns.  Design patterns describe solutions to common issues that occur in application design and development. A large part of what we do involves identifying these common issues and figuring out solutions to them that can be used across different applications or scenarios. Once we have the patterns, we typically package them up in what we call an application block.

Software people have been some of the early adopters of patterns, but the history of patterns comes from Christopher Alexander, a building architect.

A pattern must explain why a particular situation causes problems, and why the proposed solution is considered a good one. Christopher Alexander describes common design problems as arising from "conflicting forces" -- such as the conflict between wanting a room to be sunny and wanting it not to overheat on summer afternoons. A pattern would not tell the designer how many windows to put in the room; instead, it would propose a set of values to guide the designer toward a decision that is best for their particular application. Alexander, for example, suggests that enough windows should be included to direct light all around the room. He considers this a good solution because he believes it increases the enjoyment of the room by its occupants. Other authors might come to different conclusions, if they place higher value on heating costs, or material costs. These values, used by the pattern's author to determine which solution is "best", must also be documented within the pattern.

A pattern must also explain when it is applicable. Since two houses may be very different from one another, a design pattern for houses must be broad enough to apply to both of them, but not so vague that it doesn't help the designer make decisions. The range of situations in which a pattern can be used is called its context. Some examples might be "all houses", "all two-story houses", or "all places where people spend time." The context must be documented within the pattern.

For instance, in Christopher Alexander's work, bus stops and waiting rooms in a surgery center are both part of the context for the pattern "A PLACE TO WAIT."

I’ve spent most of my career working on the Mac OS/hardware and Windows OS/hardware.  The use of patterns seemed like a natural thing to do, but it is not intuitive for the people who deploy IT infrastructure.  With Gartner’s Pattern-Based Strategy, my persuasion challenge is dramatically decreased.

So, what is good about Gartner’s Pattern-Based announcement?  Their first two paragraphs identify the need well.

Gartner Says Companies Must Implement a Pattern-Based Strategy™ to Increase Their Competitive Advantage

Analysts Discuss the Framework for Implementing a Pattern-Based Strategy During Gartner Symposium/ITxpo, October 18-22, in Orlando

STAMFORD, Conn., October 8, 2009 —

The economic environment rapidly emerging from the recession will force business leaders to look at their opportunities for growth, competitive differentiation, and cost controls in a new way. A Pattern-Based Strategy will help leaders harness and drive change, rather than simply react to it, according to Gartner, Inc.

A Pattern-Based Strategy provides a framework to proactively seek, model and adapt to leading indicators, often-termed "weak" signals that form patterns in the marketplace. Not only will leading organizations excel at identifying new patterns and exploiting them for competitive advantage, but their own innovation will create new patterns of change within the marketplace that will force others to react.

They identify the need for closed loop feedback systems to measure the effectiveness of change.

A CONTINUOUS CYCLE: SEEK, MODEL AND ADAPT

Most business strategy approaches have long emphasized the need to seek better information and insights to inform strategic decisions and the need for scenario planning and robust organizational change management. Few have connected this activity directly to the execution of successful business outcomes. According to Gartner, successful organizations can achieve this by establishing the following disciplines and proactively using technology to enable each of these activities:

For the same reason I added modeling and social networking to the list of topics I discuss and blog about, Gartner explains:

Modeling for pattern analysis — Once new patterns are detected or created, business and IT leaders must use collaborative processes, such as scenario planning, to discuss the potential significance, impact and timing of them on the organization's strategy and business operations. The purpose of modeling is to determine which patterns represent great potential or risk to the organization by qualifying and quantifying the impact.

"Successful organizations will focus their pattern-seeking activities on areas that are most important to their organization," said Ms. Genovese. "Using models to do scenario planning will be critical to fact-based decisions and the transparency of the result."

I have my black belt in Aikido, and one of the most important lessons I learned about getting better is that you must develop the skill to change.  Gartner adds this as well.

Adapting to capture the benefits — Identifying a pattern of change and qualifying the potential impact are meaningless without the ability to adapt and execute to a successful business outcome. Business and IT leaders must adapt strategy, operations and their people's behaviors decisively to capture the benefits of new patterns with a consistent and repeatable response that is focused on results.

Clients – I told you that taking a modeling-based approach to discovering patterns, with real-time monitoring systems, would put you ahead of the competition.  And what better proof than Gartner now promoting the same ideas.  :-)


Gartner’s Top Strategic Technologies for 2010 – IT for Green and Reshaping the Data Center make the list

ZDNet has an article on Gartner’s top 2010 strategic technology list.

Gartner: Cloud computing, analytics top 2010 strategic tech list

Posted by Larry Dignan @ 5:46 am

Gartner unveiled its top 10 strategic technology list for 2010. Unified communications, servers and specialized systems are out. Client computing, data center do-overs, flash memory and mobile applications are in.

The list, presented Tuesday at the Gartner Symposium in Orlando by analysts David Cearley and Carl Claunch, looks like this:

For the data center crowd, look at #5, “Reshaping the Data Center.”  From Gartner’s press release:

Reshaping the Data Center. In the past, design principles for data centers were simple: Figure out what you have, estimate growth for 15 to 20 years, then build to suit. Newly built data centers often opened with huge areas of white floor space, fully powered and backed by an uninterruptible power supply (UPS), water- and air-cooled and mostly empty. However, costs are actually lower if enterprises adopt a pod-based approach to data center construction and expansion. If 9,000 square feet is expected to be needed during the life of a data center, then design the site to support it, but only build what’s needed for five to seven years. Cutting operating expenses, which are a nontrivial part of the overall IT spend for most clients, frees up money to apply to other projects or investments either in IT or in the business itself.

Green IT has morphed into IT for Green, which aligns well with Intel’s latest message that IT is the 2% that can save the other 98% of the carbon footprint.

Gartner’s topic of advanced analytics fits with why I started discussing modeling on this blog.

Advanced Analytics. Optimization and simulation is using analytical tools and models to maximize business process and decision effectiveness by examining alternative outcomes and scenarios, before, during and after process implementation and execution. This can be viewed as a third step in supporting operational business decisions. Fixed rules and prepared policies gave way to more informed decisions powered by the right information delivered at the right time, whether through customer relationship management (CRM) or enterprise resource planning (ERP) or other applications. The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more decision flexibility at the time and place of every business process action. The new step looks into the future, predicting what can or will happen.

The focus on flash memory is interesting; last year the list had server hardware instead.

Flash Memory. Flash memory is not new, but it is moving up to a new tier in the storage echelon. Flash memory is a semiconductor memory device, familiar from its use in USB memory sticks and digital camera cards. It is much faster than rotating disk, but considerably more expensive; however, this differential is shrinking. At the rate of price declines, the technology will enjoy more than a 100 percent compound annual growth rate during the next few years and become strategic in many IT areas including consumer devices, entertainment equipment and other embedded IT systems. In addition, it offers a new layer of the storage hierarchy in servers and client computers that has key advantages including space, heat, performance and ruggedness.

Collaboration has been replaced by Social Computing.

Social Computing. Workers do not want two distinct environments to support their work – one for their own work products (whether personal or group) and another for accessing “external” information. Enterprises must focus both on use of social software and social media in the enterprise and participation and integration with externally facing enterprise-sponsored and public communities. Do not ignore the role of the social profile to bring communities together.

After going through the Gartner list, I realized it closely matches what I blog about in discussing green data centers.  I had already registered for the Gartner Data Center Conference in Las Vegas on Dec 1 – 4, and it will be interesting to learn how Gartner aligns with approaches that I see working for others.


Google, Intel, NetApp fund Wimpy Node/Server Research

News.com has an article on low-power servers/nodes, research funded by Google, Intel, and NetApp.

Researchers tout 'wimpy nodes' for Net computing

by Stephen Shankland

Mainstream servers are growing increasingly brawny with multicore processors and tremendous memory capacity, but researchers at Carnegie Mellon University and Intel Labs Pittsburgh think 98-pound weaklings of the computing world might be better suited for many of the jobs on the Internet today.

This first-generation FAWN system has an array of boards, each with its own processor, flash memory card, and network connection.

(Credit: Carnegie Mellon University)

The alternative the researchers advocate is named FAWN, short for Fast Array of Wimpy Nodes. It's described in a paper just presented at the Symposium on Operating Systems Principles.

In short, the researchers believe some work can be managed with lower expense and lower power consumption using a cluster of servers built with lower-end processors and flash memory than with a general-purpose server. And these days, with green technology in vogue and power costs no longer an afterthought, efficient computing is a big deal.

"We were looking at efficiency at sub-maximum load. We realized the same techniques could serve high loads more efficiently as well," said David Andersen, the Carnegie Mellon assistant professor of computer science who helped lead the project.

It's not just academic work. Google, Intel, and NetApp are helping to fund the project, and the researchers are talking to Facebook, too. "We want to understand their challenges," Andersen said.

What scenarios are they looking at?

The FAWN approach can be adjusted with hard drives or conventional memory to match various sizes of datasets or rates of the queries retrieving that data.

(Credit: Carnegie Mellon et al.)

The key value of FAWN
So where exactly is FAWN useful? Andersen makes no claims that it's good for everything--but the use cases are often central to companies at the center of the ongoing Internet revolution.

Specifically, it's good for situations where companies must store a lot of smaller tidbits of information that's read from the storage system much more often than it's written. Often this data is stored in a form called "key-value pairs." These consist of an indexing key and some associated data: "The key might be 'Dave Andersen update 10,579.' The update value might be 'Back in Pittsburgh.'"
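To make the key-value idea concrete, here is a minimal sketch of a read-heavy key-value store using the article's example. This is my own illustration, not code from the FAWN project, which runs its datastore across a cluster of flash-backed nodes rather than in a single in-memory dictionary.

```python
# Minimal illustration of a key-value store: an indexing key maps to a
# small piece of associated data, and reads far outnumber writes.
# My own sketch, not FAWN code.
store = {}

def put(key, value):
    """Write path: store the value under its indexing key."""
    store[key] = value

def get(key):
    """Read path: look up the value by key (the common operation)."""
    return store.get(key)

# The example from the article: a status-update key and its value.
put("Dave Andersen update 10,579", "Back in Pittsburgh")
print(get("Dave Andersen update 10,579"))  # prints: Back in Pittsburgh
```

FAWN's insight is that this get-dominated access pattern needs fast random reads and little CPU per request, which is exactly where wimpy processors paired with flash shine.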

How much power can they save?  52 queries per joule for a typical server vs. 346 queries per joule for FAWN.

The researchers compared how many datastore queries could be accomplished per unit of energy and found FAWN compelling: a conventional server with a quad-core Intel Q6700 processor, 2GB of memory, and an Mtron Mobi solid-state drive measured 52 queries per joule of energy compared to 346 for a FAWN cluster. And tests of a newer design show even more promise: "Our preliminary experience using Intel Atom-based systems paired with SATA-based Flash drives shows that they can provide over 1,000 queries per Joule," the paper said.
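The efficiency gap is easy to quantify from the figures quoted above; the ratios below are my own arithmetic on the article's numbers, not from the paper.

```python
# Queries-per-joule figures quoted in the article; ratios are my arithmetic.
conventional_qpj = 52   # quad-core Intel Q6700 server with an SSD
fawn_qpj = 346          # first-generation FAWN cluster
atom_qpj = 1000         # newer Atom/SATA-flash design, "over 1,000"

print(f"FAWN vs. conventional server: {fawn_qpj / conventional_qpj:.1f}x")
print(f"Atom design vs. conventional server: {atom_qpj / conventional_qpj:.1f}x")
```

That works out to roughly a 6.7x efficiency advantage for the first-generation cluster and at least 19x for the newer Atom-based design.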
