Another VMware Tool, Lab Manager 3.0; Is a Production Manager for the Virtual Data Center Next?

VMware has rolled out Lab Manager 3.0, the latest version of the tool. NetworkWorld covers the release.

VMware tool aims to help developers

By Peter Sayer , IDG News Service , 08/04/2008

VMware wants to make it easier for IT departments to manage the virtual machine configurations that they use for testing or developing software, and will release an upgraded Lab Manager tool on Thursday.

Testing enterprise applications is becoming increasingly complex. With the adoption of service-oriented architectures and Web 2.0 techniques, the interactions of multiple servers must be taken into account, making it important that quality assurance be able to deploy realistic test set-ups easily and repeatably.


VMware's Lab Manager allows software developers and testers to choose a virtual machine set-up from a library of previously stored configurations and deploy it to a server pool, while IT staff retain control of who does what. The configurations can include multiple virtual machines.

I am glad Lab Manager has added the capability to use a library of stored VM configurations, as this is something we had to do manually for customers four years ago to control the proliferation of VM iterations.
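To make the library idea concrete, here is a minimal sketch of the workflow: pull a stored multi-VM configuration out of a library and deploy it to a server pool that still enforces who may deploy. The classes and configuration names are invented for illustration; they are not Lab Manager's actual API.

```python
# Hypothetical illustration of a Lab Manager-style workflow: pull a stored
# multi-VM configuration from a library and deploy it to a server pool.
# None of these classes correspond to a real VMware API.

from dataclasses import dataclass, field


@dataclass
class VMConfig:
    name: str        # e.g. "web-frontend"
    cpus: int
    memory_mb: int


@dataclass
class LabConfiguration:
    """A named set of VMs captured in the library."""
    name: str
    vms: list[VMConfig] = field(default_factory=list)


class Library:
    """Stores known configurations so testers can redeploy them repeatably."""
    def __init__(self):
        self._configs: dict[str, LabConfiguration] = {}

    def save(self, config: LabConfiguration) -> None:
        self._configs[config.name] = config

    def get(self, name: str) -> LabConfiguration:
        return self._configs[name]


class ServerPool:
    """A pool of hosts IT controls; deployments are checked against an allow-list."""
    def __init__(self, allowed_users: set[str]):
        self._allowed_users = allowed_users

    def deploy(self, config: LabConfiguration, user: str) -> None:
        if user not in self._allowed_users:
            raise PermissionError(f"{user} may not deploy to this pool")
        for vm in config.vms:
            print(f"Deploying {vm.name}: {vm.cpus} vCPU, {vm.memory_mb} MB")


if __name__ == "__main__":
    library = Library()
    library.save(LabConfiguration(
        name="soa-regression-test",
        vms=[VMConfig("web-frontend", 2, 2048),
             VMConfig("app-server", 4, 4096),
             VMConfig("database", 4, 8192)],
    ))
    pool = ServerPool(allowed_users={"qa-team"})
    pool.deploy(library.get("soa-regression-test"), user="qa-team")
```

The point is that the whole multi-VM topology comes out of the library in one step, so QA can redeploy the same realistic test set-up repeatably while IT keeps control of who deploys where.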

Going with the idea of VMware running a Virtual Data Center, VMware could manage a library of known-good VM configurations that have been tested and verified for security, compliance, and so on. This would offload a huge burden from IT shops, and VMware could make server configurations a commodity it sells as a service.
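If VMware (or anyone else) offered such a vetted library as a service, the gate in front of it could be as simple as a policy check over each configuration's manifest before it is published. The manifest fields and rules below are invented purely for illustration.

```python
# Hypothetical compliance gate for a shared VM-configuration library.
# The manifest fields and rules below are invented for illustration only.

REQUIRED_PATCH_LEVEL = "2008-07"
APPROVED_GUEST_OS = {"windows-server-2003-sp2", "rhel-5.2"}


def violations(manifest: dict) -> list[str]:
    """Return the reasons a VM manifest fails the policy (empty list = pass)."""
    problems = []
    if manifest.get("guest_os") not in APPROVED_GUEST_OS:
        problems.append(f"unapproved guest OS: {manifest.get('guest_os')}")
    if manifest.get("patch_level", "") < REQUIRED_PATCH_LEVEL:
        problems.append("patch level is older than the required baseline")
    if not manifest.get("antivirus_installed", False):
        problems.append("antivirus agent missing")
    return problems


if __name__ == "__main__":
    candidate = {"guest_os": "rhel-5.2", "patch_level": "2008-06",
                 "antivirus_installed": True}
    issues = violations(candidate)
    print("publish to library" if not issues else f"rejected: {issues}")
```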

VMware is following the same strategy I recommend to many others getting into the data center: go for the dev and test labs first, then move into the production data center. So, even though the product is being promoted as a lab tool, you can expect VMware either to create a Production Manager tool next or to integrate Lab Manager with its lifecycle management tools.

In this white paper VMware discusses the Virtual Data Center.


VMware white paper

Conclusion: The New Automated Virtual Datacenter

VMware virtualization is changing IT organizations dramatically, bringing unprecedented levels of flexibility to the datacenter by eliminating complex hardware and software interdependencies and creating a standard platform for deploying mission-critical applications. This shift will require IT organizations to think differently about managing the IT infrastructure.

VMware Infrastructure accelerates the rate of change and has the potential to turn the datacenter on its head: services and IT resources become fluid rather than static, meaning that the traditional ways of managing systems no longer apply. Datacenter administrators need to account for this new dynamic IT environment as they deploy and manage virtualization on an increasingly greater scale.

Ultimately, the automated virtual datacenter provides several key benefits:

• Eliminates manual, error-prone, repetitive tasks, increasing ROI and freeing up IT to focus on strategic projects
• Minimizes risk and improves compliance with IT and business policies
• Reduces configuration issues by maintaining consistent systems across the datacenter
• Allows IT to increase responsiveness to the business, dramatically accelerating time-to-market for IT services

To do this effectively, IT organizations need best-of-breed tools designed to automate the new virtual datacenter. VMware Infrastructure provides the core building blocks for the new virtual datacenter, and solutions for automating IT service delivery and business continuity let customers leverage their existing investments in VMware to accelerate the rate of change while maintaining predictability and control, providing greater responsiveness to the business. Using VMware Infrastructure as a strategic platform for the datacenter, organizations can automate the virtual datacenter.

There is no reason why VMware could not host the Virtual Data Center.
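To make the white paper's "fluid rather than static" point concrete, here is a toy sketch of the kind of repetitive decision an automated virtual datacenter takes off administrators' hands: a loop that migrates VMs off overloaded hosts. The host and VM structures are invented stand-ins, not VMware's actual APIs, though tools like DRS automate this class of decision.

```python
# Toy rebalancing loop illustrating "fluid" placement of VMs across hosts.
# The Host/VM structures here are invented stand-ins, not a real VMware API.

from dataclasses import dataclass


@dataclass
class VM:
    name: str
    cpu_load: float  # fraction of one host's CPU this VM is using


@dataclass
class Host:
    name: str
    vms: list

    @property
    def load(self) -> float:
        return sum(vm.cpu_load for vm in self.vms)


def rebalance(hosts: list, threshold: float = 0.8) -> None:
    """Move the smallest VM off any host whose load exceeds the threshold."""
    for hot in hosts:
        while hot.load > threshold and len(hot.vms) > 1:
            target = min(hosts, key=lambda h: h.load)
            if target is hot:
                break  # nowhere better to put it
            vm = min(hot.vms, key=lambda v: v.cpu_load)
            hot.vms.remove(vm)
            target.vms.append(vm)
            print(f"migrated {vm.name}: {hot.name} -> {target.name}")


if __name__ == "__main__":
    hosts = [Host("esx01", [VM("db", 0.5), VM("web1", 0.3), VM("web2", 0.2)]),
             Host("esx02", [VM("batch", 0.1)])]
    rebalance(hosts)
```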

Read more

Adobe’s Cloud Computing Initiative, Greening their Data Centers

GigaOm has an interview with Adobe CTO Kevin Lynch discussing the company's cloud computing initiative.

The GigaOM Interview: Kevin Lynch, CTO, Adobe Systems
Om Malik, Monday, August 4, 2008 at 12:01 AM PT

Sitting across from me in the lounge of a posh Half Moon Bay, Calif., resort recently, Kevin Lynch, chief technology officer of Adobe Systems, a software company based in nearby San Jose, outlined his vision of the technology world at large. In particular, Lynch, who bears an uncanny resemblance to Harry Potter (picture the impish wizard as a grown-up), talked about how the confluence of cloud computing, web-centric applications and the emergence of the mobile Internet was going to impact our collective future.

Below are edited excerpts from our conversation:

Om Malik: How is the emergence of cloud computing impacting desktop-centric Adobe Systems as a company?

Kevin Lynch: Adobe is a 25-year-old company and that’s a great achievement because we have had the ability to change. We have changed with technological shifts. And now we are in that situation again. How software is made and sold is changing, so we are changing.

We are taking a balanced approach, and are building a hosted infrastructure. It’s not just about the cloud, but also about the desktop. There are some who are all about the cloud while others think about the desktop first. We have a hybrid approach, and we are doing that with our products like AIR.

Om: Can you talk about your online software-on-demand strategy?

Lynch: We have products like Buzzword, Photoshop Express and Acrobat.com that we are doing online. We are not deploying at the level of raw storage and raw hosting. Instead we are looking at application hosting from our customers’ perspective.

Reading about Photoshop Express's cloud computing experience reminded me of some old colleagues who work at Adobe and have been building an energy-efficient server infrastructure for a green data center. The last time I checked with them, a year ago, they were custom-building their own servers to get the performance they needed. Adobe is going to be one of those companies that says nothing about its data centers to protect its assets, especially in a competitive area like photo sharing, but it has some unique advantages, like the Photoshop code base and its customers.

Om: So that means you guys need to learn a whole new language of building scalable infrastructure?

Lynch: We have 600,000 users of Photoshop Express and 500,000 unique visitors to the site every month. About 8 million Flash players are installed every day, and that needs a lot of bandwidth and infrastructure. So we know that, but it is a question facing all software companies going forward.

Read more

Cloud Computing Announcements

There have been so many cloud computing announcements this week that I don't even want to attempt individual entries. So here is a list of a few.

Sun - http://www.theregister.co.uk/2008/07/31/sun_utility_computing_spin_out/ - where Sun's chief sustainability officer now runs the cloud computing service

Exclusive: Sun Microsystems' utility computing operation is being turned into a separate cloud business unit led by Sun's chief sustainability officer Dave Douglas.

The Reg has learned Douglas will run Network.com, which provides hosted applications, servers and storage charged on a per-use basis, and will report directly to Sun chief executive Jonathan Schwartz. Network.com had been run inside Sun's software business, under executive vice president Rich Green.

IBM - http://gigaom.com/2008/07/31/even-ibms-got-computing-clouds/

Even IBM’s Got Computing Clouds
Om Malik, Thursday, July 31, 2008 at 9:00 PM PT

It looks like after Amazon, a mere book retailer, showed them the way, all the technology powerhouses have fallen in love with cloud computing. Hewlett-Packard, Intel and Yahoo earlier this week said they’ve teamed up with three universities to create a cloud computing testbed, and Michael Dell talked about his company’s cloud computing plans with me in a recent interview as well.

HP, Intel & Yahoo - http://news.cnet.com/8301-1001_3-10001390-92.html

HP, Intel, Yahoo join forces on cloud computing research

Posted by Caroline McCarthy


Hewlett-Packard, Intel, and Yahoo announced Tuesday that they've teamed up to create a "test bed" project for research in cloud computing, the umbrella term for outsourcing hardware and software capabilities rather than handling them locally.

With the rather dry name of The HP, Intel, and Yahoo Cloud Computing Test Bed, the open-source project will consist of data centers around the globe "to promote open collaboration among industry, academia, and governments by removing the financial and logistical barriers to research in data-intensive, Internet-scale computing." They've partnered with the Infocomm Development Authority of Singapore, Germany's Karlsruhe Institute of Technology, and the University of Illinois at Urbana-Champaign as well as the National Science Foundation.

Ebay - http://www.informationweek.com/blog/main/archives/2008/07/is_ebay_getting.html

Is eBay Getting Into Cloud Computing?


Posted by J. Nicholas Hoover, Jul 29, 2008 06:39 PM

Another of the Web giants, eBay (NSDQ: EBAY), may be joining the growing and nebulous (pun intended) field of cloud computing, if a new job listing is any indication.

The company's looking for a director of cloud computing engineering, who will be in charge of leading eBay's "Cloud Computing initiative," according to the listing. The question is whether this job is really about managing eBay's internal cloud -- the company's in the midst of a three-year "grid computing" effort -- or doing something entirely new.

Microsoft - http://arstechnica.com/news.ars/post/20080729-microsoft-bets-on-cloud-computing-as-amazon-suffers-outage.html

At Microsoft's Financial Analyst Meeting a few days ago, Chief Software Architect Ray Ozzie alluded to the company's cloud computing plans. Microsoft will offer a hosted solution for software vendors and value-added resellers to move applications to the cloud. As well as providing the basic tools to deploy these services, Ozzie said that Microsoft would deliver "a programming model leveragable on-premise and in the cloud."

The goal is to produce a set of tools to allow developers to create programs that work just as well on a single server as they do on a datacenter of servers.
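As a rough illustration of what "a programming model leveragable on-premise and in the cloud" could mean in practice, the sketch below codes the application against a small storage interface and swaps the backend between a local implementation and a hosted stand-in. Everything here is invented; Microsoft had not yet published the actual APIs at the time.

```python
# Illustration of one way to write code that runs the same on a single box
# or against a hosted backend: program to a small interface and swap the
# implementation. The classes here are invented; they are not Microsoft APIs.

import os
from abc import ABC, abstractmethod


class BlobStore(ABC):
    """Minimal storage interface the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalBlobStore(BlobStore):
    """On-premise implementation: files on the local disk."""

    def __init__(self, root: str = "./blobs"):
        self._root = root
        os.makedirs(root, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        with open(os.path.join(self._root, key), "wb") as f:
            f.write(data)

    def get(self, key: str) -> bytes:
        with open(os.path.join(self._root, key), "rb") as f:
            return f.read()


class HostedBlobStore(BlobStore):
    """Stand-in for a cloud-hosted store; a real one would call a remote service."""

    def __init__(self):
        self._data: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data

    def get(self, key: str) -> bytes:
        return self._data[key]


def save_report(store: BlobStore) -> None:
    """Application code: identical whether the store is local or hosted."""
    store.put("daily-report.txt", b"transactions processed: 42")
    print(store.get("daily-report.txt").decode())


if __name__ == "__main__":
    save_report(LocalBlobStore())   # single server
    save_report(HostedBlobStore())  # "cloud" backend
```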

Read more

The Virtual Data Center, VMware's next move?

VMware has made management changes, and it will occupy a new data center at Sabey's facility in Eastern Washington.

EAST WENATCHEE — Palo Alto-based software company VMware will become the next tenant at a data center under construction near Pangborn Memorial Airport.

Company spokeswoman Melinda Marks said by e-mail that VMware will use its new data center to expand research and development.

The company's plans, under review by Douglas County officials, call for a new 189,000-square-foot building at Intergate.Columbia, a data-center complex developed by Tukwila-based Sabey Corp. on 30 acres just east of Pangborn Memorial Airport.

Sabey is already building a data center there for telecom giant T-Mobile. That center should be finished by early next year.

Douglas County plans examiner Lars Peterson said Sabey submitted the plans for the VMware building in June.

The plans show VMware would occupy about two-thirds of the new building.

Two-thirds of a 189,000-square-foot building seems like a lot of space for R&D. VMware is planning something big.

VMware is getting ready for big announcements in October, a month after Microsoft's Hyper-V launch.

Add up the management changes under Paul Maritz, roughly 120,000 square feet of data center space, and VMware's upcoming announcements, and I am guessing VMware will announce its Virtual Data Center, a cloud computing initiative.

Imagine what VMware could build with all its tools and then lease out as a Virtual Data Center. It could change the hosting model, which today runs between the extremes of renting your own space and Amazon's Web Services.

One company already going down the path of Virtual Data Center hosting is Digital Sense.

Leading the way in High Density Design

Availability, redundancy, efficiency, flexibility, security.

Digital Sense is dedicated to building the most energy-efficient, comprehensively monitored, high-density data centres. Built to future-proof your investment in our data centres, Digital Sense is able to cater for IT computing loads from 3 kW to 25 kW per rack across the entire floor space. Designed to be the ultimate in high-availability design, Digital Sense has undertaken a massive design and implementation process to deliver a truly redundant facility built to cater for the emerging virtualisation evolution.
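To put that 3 kW-to-25 kW per-rack range in perspective, here is a back-of-the-envelope sketch; the rack count and electricity price are assumed purely for illustration.

```python
# Back-of-the-envelope IT load and energy cost for a floor of racks.
# Rack count and electricity price are assumed for illustration only.

RACKS = 100                 # hypothetical floor of 100 racks
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10        # assumed $/kWh

for kw_per_rack in (3, 25):
    total_kw = RACKS * kw_per_rack
    annual_kwh = total_kw * HOURS_PER_YEAR
    print(f"{kw_per_rack:>2} kW/rack: {total_kw:,} kW IT load, "
          f"~${annual_kwh * PRICE_PER_KWH:,.0f}/year in electricity "
          f"(before cooling overhead)")
```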

Over the next couple of weeks I'll keep testing this idea, but I wanted to blog it now to get the idea out there.

Read more

eBay Software Developer Discusses Energy-Efficient Applications

Redmond Developer has an interview with Dan Pritchett, a Technical Fellow at eBay who thinks in terms of transactions per second per watt (TPS/W).

It's no secret that power consumption is a worrying issue among datacenter managers. As system hardware becomes cheaper and energy costs continue to rise, IT managers might find that they'll spend more to power and cool a system over its lifetime than to actually buy it.

Which is why guys like Dan Pritchett, a technical fellow at eBay, have moved beyond thinking about transactions per second (TPS) in their applications to focusing on transactions per second per watt (TPS/W).

"One of the primary challenges we started to face in 2006 was power. The datacenters were maxed out and we were still running at capacity," said Pritchett, who noted that local municipalities were often physically incapable of delivering enough power to meet eBay's growing energy needs.

The ideas discussed include database design and threading code for parallel processing.

Virtualization has played a huge role in datacenter operations, enabling companies like Google and eBay to maintain ample hardware redundancy while driving up utilization -- a key for energy-efficient design.

But dev shops can do more, Pritchett said. He urged dev managers to look to established best practices and to focus on efficient, scalable designs. At eBay, for instance, databases are always sharded -- split into smaller pieces. "Those things definitely come at a cost," Pritchett said. "But if you are wanting to move into hundreds of millions of entities in your system, and you're wanting to deal with tens or hundreds of millions of transactions per day, that's what you are going to move to."
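Sharding at its simplest is a deterministic mapping from a key to one of several smaller databases; the sketch below is a generic illustration, not eBay's actual scheme.

```python
# Minimal illustration of sharding: route each entity to one of N smaller
# databases by hashing its key. This is a generic sketch, not eBay's design.

import hashlib

SHARDS = ["users_db_0", "users_db_1", "users_db_2", "users_db_3"]


def shard_for(user_id: str) -> str:
    """Pick a shard deterministically so the same user always lands in one place."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]


if __name__ == "__main__":
    for uid in ("alice", "bob", "carol"):
        print(uid, "->", shard_for(uid))
```

The cost Pritchett mentions shows up when a single query needs to touch users that live on different shards.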

He also urged developers to work toward parallel programming, so that fully threaded code can work efficiently across multicore processors. "I think going forward this is definitely going to be a huge issue. We are going to start having to leverage the parallelization of the hardware into the app space," he said.
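A small sketch of the kind of change Pritchett is pointing at: fan independent work items out across a pool of workers instead of handling them in a serial loop. The workload is invented, and as the comment notes, CPU-bound work in CPython would use processes rather than threads because of the GIL, but the structure is the same.

```python
# Sketch of fanning independent work items across worker threads instead of a
# serial loop. The "work" here is invented. In CPython, CPU-bound work would
# typically use ProcessPoolExecutor because of the GIL; the structure is identical.

import time
from concurrent.futures import ThreadPoolExecutor


def handle_transaction(txn_id: int) -> str:
    time.sleep(0.1)            # stand-in for I/O or remote calls
    return f"txn {txn_id} done"


def serial(ids):
    return [handle_transaction(i) for i in ids]


def parallel(ids, workers: int = 8):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handle_transaction, ids))


if __name__ == "__main__":
    ids = range(16)

    start = time.time()
    serial(ids)
    serial_seconds = time.time() - start

    start = time.time()
    parallel(ids)
    parallel_seconds = time.time() - start

    print(f"serial: {serial_seconds:.2f}s, threaded: {parallel_seconds:.2f}s")
```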

Read more