48 hours at a Data Center Conference - Gartner DC LV 2010 as an example

I learn a lot going to data center conferences, but I go to almost no sessions.  My calendar is booked with numerous meetings, but I leave time to network and find new connections.

This was my 2nd Gartner DC Conference in LV, and it was well worth my time, though I have a somewhat unique way of leveraging the conference.

Given my blog, I attend the conference as media with a press badge.  Weeks before attending I start to get e-mails and phone calls from public relations companies wanting to set up meetings with vendors who have executives at the conference.  I choose carefully who I will set up an appointment with.  Some of the companies on the list were HP, Dell, SGI, Nimbula, Equinix, and APC.

The Gartner keynotes start on Monday morning at 8 a.m.  I arrive in LV at about noon and start my meetings, and within 48 hours I am on a plane leaving LV.

While I am at the Gartner conference I don’t speak to a single Gartner employee.  And, I actually don’t listen to any of their presentations.  There are many who find the sessions educational, but I don’t learn enough new content to make it worth the time.  Would I rather spend a half hour with a senior HP executive, or 45 minutes listening to Gartner explain its surveys of a group who are not the innovators?  I spend so much of my time talking to innovators that listening to how the masses think is convenient for tracking the market, but not for being more competitive than the rest.

All my interviews are about what people see the future data center looking like, which many times means greener data centers too.

As soon as I get close to the conference I start saying hi to people I have seen at other conferences.  My first stop after registering is the press room.  The press room is one of the sparsest media rooms - no food, water, or coffee.  Some mints.  Rich Miller and Kevin Normandeau from DataCenterKnowledge are the first media guys I see and we catch up a bit.  Two months ago we were all at AFCOM LV, so it has only been a short period since we chatted.  I also see Matt Stansberry from SearchDataCenter, who I hadn’t seen for a while.

Matt and I discuss Oregon Duck Football as he lives in Eugene, OR and his wife is completing her post graduate degree at UO.  I tell Matt that I educate many people that it was he who broke the story on Facebook’s choice of coal power.

Maybe Facebook should have bought a Bloom Box to diffuse Greenpeace’s campaign against a coal powered data center

Thanks to Matt Stansberry’s reporting on SearchDataCenter, attention was drawn to Facebook’s Prineville Data Center being coal powered.

Tiered energy rates bring higher prices for new customers
By 2012, Oregon's Bonneville Power Administration (BPA) will charge tiered rates for power. Customers that signed 20-year contracts in 2008 will pay tier-one (i.e., inexpensive) pricing for their current electricity demand. These customers use most of the power produced by the dams.


To meet new customer demand or increased demand from existing customers, BPA also purchases power from other sources. In 2012 this electricity will be classified as tier two, and it will be charged at a much higher rate than the BPA's current hydropower.

Which brings us back to Facebook: The company's new data center is being built in Prineville, Ore., a small town on Oregon's high desert. Pacific Power, a utility owned by PacifiCorp, will provide the electricity. While Pacific Power gets some hydropower from BPA, its primary power-generation fuel is coal, according to Jason Carr, the manager of the Prineville office of economic development for Central Oregon.

With the price of hydropower increasing in the Northwest, Facebook opted to bet on the incremental price increases associated with coal rather than face tier-two pricing from BPA.
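The economics behind that bet can be sketched with a back-of-envelope calculation.  All the rates and the load figure below are hypothetical assumptions for illustration, not actual BPA or Pacific Power tariffs.

```python
# Hypothetical comparison of annual power cost for a new data center load:
# BPA tier-two hydro pricing vs. a coal-heavy utility's rate.
# Every number here is an assumed illustration, not a real tariff.

LOAD_MW = 30                 # assumed flat data center load
HOURS_PER_YEAR = 8760

TIER2_RATE = 0.06            # $/kWh, assumed BPA tier-two rate for new demand
COAL_RATE = 0.045            # $/kWh, assumed coal-heavy utility rate

def annual_cost(rate_per_kwh, load_mw=LOAD_MW):
    """Annual energy cost in dollars for a flat load at the given rate."""
    kwh_per_year = load_mw * 1000 * HOURS_PER_YEAR
    return rate_per_kwh * kwh_per_year

# A new customer gets no tier-one allocation, so the whole load is tier two.
tier2_cost = annual_cost(TIER2_RATE)
coal_cost = annual_cost(COAL_RATE)

print(f"Tier-two hydro: ${tier2_cost:,.0f}/yr")
print(f"Coal utility:   ${coal_cost:,.0f}/yr")
print(f"Annual difference betting on coal: ${tier2_cost - coal_cost:,.0f}")
```

At these assumed rates the coal-heavy utility comes out millions of dollars per year cheaper for a large flat load, which illustrates why a new customer facing tier-two pricing might make the bet Facebook made.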

I’ll see Matt and Rich many times at the conference as we often interview the same executives.  I’ll ask what they are finding interesting.

Besides interviewing, there are attendees who come to do business, and we’ll meet to discuss what is going on in the industry and where there are new opportunities - who is doing some of the best work and who is starting up new projects.  I’ll start looking for new connections and interesting people to have discussions with at the conference.

When the exhibit area opens, I’ll look for people I know and watch which vendors are getting lots of traffic.  I rarely spend much time at an exhibit unless I know people at the company; most of the time when I talk to a booth person, I find I can learn more by surfing their website.  One booth I went to was Splunk, which is one of the fastest growing IT management tools.  I ended up spending over 1/2 hr at the Splunk booth as the guy I was talking to just happened to have worked for a good friend of mine at Microsoft, and I knew he would have interesting insight.  Within 24 hrs, I was able to have a telephone conversation with Splunk’s CTO to discuss an innovative use of Splunk, which I hope to write more about in the future as we prove a scenario to green the data center using Splunk.

Throughout the 48 hours I talk to business friends in site selection, engineering services, construction, facility operations, containers, server hardware, cloud SW, networking, and management tools.  Looking for how the pieces fit together in interesting ways.  Making introductions, and discussing new ideas.

There were a couple of good “aha” moments when I figured out some new things.  One example is that the big whales aren’t at Gartner.

In the end I talked to some amazing data center executives, found some new technologies sooner as they were brought to my attention, reinforced established connections, made new connections, and had a good time discussing new ideas.

BTW, I don’t expect the Gartner folks to talk to me, as I am not going to pay for their advice and am not a client.  But I will help talk about what goes on at Gartner DC LV.  Everybody gets a different experience.  The above is my 48 hrs at Gartner DC LV 2010.

Read more

A Logistics Lesson for Container Data Centers, US Navy’s F-35 Fighter engine too big to be shipped

Container data centers are hot topics and there are lots of new players in the game.  The military has used containers for a long time to ship supplies.  Here is a story about a goof in US Navy supply logistics with the F-35 fighter engine.

Yet Still Another Embarrassing F-35 Problem

December 3, 2010: The U.S. Navy has yet another problem with the new F-35 fighter it will soon be operating off its carriers. It seems that no one bothered to check if the engine for the F-35C could fit into the C-2 aircraft the navy currently uses to deliver jet fighter engines to carriers. Normally, carriers go to sea with 30-35 spare engines for their F-18 fighters (that the F-35s will replace). In the course of a six month deployment, a dozen or more of these engines will be flown to, or from, the carrier.

The F-35 engine can be disassembled into five major components, and the largest of these can be carried by sling under an MH-53E helicopter or V-22 tilt-rotor aircraft. Both of these aircraft are normally carried by amphibious ships, along with a battalion of marines, and are usually near a carrier task force. But the range for the MH-53E (carrying the heaviest component) is only 550 kilometers, if the weather is good. The V-22 has had problems landing heavy sling loads on carriers, and more research is needed there. The heaviest component, including the shipping container, weighs 4.3 tons, and is too heavy to transfer at sea using the normal methods of underway replenishment (with the supply ship moving along side and using cables and hoses to move material and fuel.) This leaves delivering the engine via the supply ship. This requires very calm weather, and getting close enough to use cranes to haul the engine aboard the carrier. This can be tricky, even in good weather, on the high seas. All this is a big problem, as within eight years, F-35Cs will be operating off Nimitz class carriers, and getting fresh engines on, and broken ones off, will become a real issue. The navy will improvise some kind of solution, but this is not the first major hassle with F-35s operating on carriers.

If you are thinking about containers for data centers, make sure you think of the lifecycle and logistics to support the maintenance and repair of containers.

Read more

Whale Hunting at Gartner DC LV 2010

I had dinner with a couple of senior data center executives who engineer and build data centers for some of the top players in the industry - players whose businesses must have large capacity and the best designs.

Part of the conversation was comparing data center conferences to go to.  We were all at Gartner DC LV 2010.


Other conferences we agreed were good are DataCenterDynamics, 7X24 Exchange, Uptime Symposium, and SVLG DCEE.  All of these conferences have more of a data center facilities focus, whereas Gartner has little facilities discussion.

Gartner is different, with a focus on data center operations from an IT perspective, not a facilities one.

But then I made the point that the big boys in data centers generally don't go to Gartner.  The exceptions are the executives who attend to give presentations and meet with clients, and those who sell to the rest of the attendees.

For example, Google sent two people who didn't look like they were from the data center group.  Microsoft sent over a dozen, but again not from the data center group.  No Facebook or Yahoo.  No Twitter or Zynga.  No AT&T.  No Apple.  A couple from Verizon Wireless, but again not the data center group.

Who does attend en masse, with 5 or more people?

Canada Dept of Defense

Delta Airlines

DePaul University

DirecTV

FAA

GSA

Kaiser Permanente

Medtronic

McKesson

NASA

PG&E

Royal Bank of Canada

Sandia National Labs

Southern CA Edison

Social Security Administration

State Farm

US Dept of Defense

US Dept of Veterans Affairs

US Marine Corps

Lots of big fish.  But not the whales of data center.

So is the Gartner DC LV conference really the data center industry?  Or is it those who look to Gartner for advice on data centers?

I run into many people I know in the industry at Gartner DC, but now that I think about it, it is the suppliers to the data center industry I run into at the event, not the end users I know who are the most innovative and biggest.

The end users I run into are at DataCenterDynamics, Uptime, and 7X24 Exchange.

If you are hunting for the big whales of the data center industry, Gartner is not the place to look, but there are still plenty of big fish.  On the other hand, getting access to the right people is part of the challenge, which is why the exhibit area is used so much.

Read more

Dave Barry's Guest Keynote Video at #gartnerdc LV 2010, making fun of the Cloud and IT

Dave Barry gave an entertaining presentation of his comedy act Wit and Wisdom.

Guest Keynote Speaker

Dave Barry

Pulitzer Prize-Winning Author and Humorist

Dave Barry is a humor columnist. For 25 years he was a syndicated columnist whose work appeared in more than 500 newspapers in the United States and abroad. In 1988 he won the Pulitzer Prize for Commentary.

Many people are still trying to figure out how this happened. Dave has also written a total of 30 books, although virtually none of them contain useful information. Two of his books were used as the basis for the CBS TV sitcom "Dave's World," in which Harry Anderson played a much taller version of Dave. Dave plays lead guitar in a literary rock band called the Rock Bottom Remainders, whose other members include Stephen King, Amy Tan, Ridley Pearson and Mitch Albom.

At dinner I was talking to some friends who missed Dave's presentation.  To share the experience, here is a YouTube video of Dave discussing Cloud Computing and IT.

Dave Barry Wit and Wisdom of Cloud

Enjoy and have a good laugh.  I did, which made it hard to keep the camera steady.

Read more

What will the Private Cloud bring? Really Bad $h*!

I had a full day at the Gartner DC LV conference.  At the end of the day I got a good question on what I see coming in the future.  Cloud is at the top of the topics being discussed.

Lots of people are thinking about building private clouds, but how many people know how to build an operating system for the cloud?  A common tweet from #GartnerDC:

barton808 Barton George

by sean_kelley_ms

66% of folks here say they will be pursuing private cloud by 2014.#gartnerDC

So a safe answer is that the private cloud is the future of IT.  Highly utilized hardware.  Dynamic infrastructure.


Gartner has been saying the private cloud is coming for a while; here is a post from 2009.

I believe that enterprises will spend more money building private cloud computing services over the next three years than buying services from cloud computing providers. But those investments will also make them better cloud computing customers in the future.

Building a private cloud computing environment is not just a technology thing – it also changes management processes, organization/culture, and relationship with business customers (our Infrastructure and Operations Maturity Model has a roadmap for all four). And these changes will make it easier for an IT organization and its customers to make good cloudsourcing decisions and transitions in the future.

The ability for people to understand the private cloud is daunting.  The choices are large and growing faster than people can understand.  All of this reminds me of the arrival of Desktop Publishing, with new issues for typography, color matching, images, layout, printers, scanners, and SW.

Desktop publishing began in 1985 with the introduction of MacPublisher, the first WYSIWYG layout program, which ran on the original 128K Macintosh computer. (Desktop typesetting, with only limited page makeup facilities, had arrived in 1978–9 with the introduction of TeX, and was extended in the early 1980s by LaTeX.) The DTP market exploded in 1985 with the introduction in January of the Apple LaserWriter printer, and later in July with the introduction of PageMaker software from Aldus which rapidly became the DTP industry standard software.

Before the advent of desktop publishing, the only option available to most persons for producing typed (as opposed to handwritten) documents was a typewriter, which offered only a handful of typefaces (usually fixed-width) and one or two font sizes. The ability to create WYSIWYG page layouts on screen and then print pages at crisp 300 dpi resolution was revolutionary for both the typesetting industry and the personal computer industry. Newspapers and other print publications made the move to DTP-based programs from older layout systems like Atex and other such programs in the early 1980s.

Now if you are an experienced operating system developer and have a team who can make the design trade-offs in designing a private cloud, the transition to private cloud will be like the print publications that moved to Mac-based DTP.  But the number of IT organizations with this skill set is only a handful - Google, Microsoft, VMware, Yahoo, Facebook, Amazon, etc.  Maybe at most 6% of the installed base has these skills and the ability to recruit top talent, so what happens to the remaining 60% of the 66% that are building private clouds?
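The back-of-envelope math works like this (the 66% figure is from the conference tweet; the 6% is my own rough estimate, not survey data):

```python
# Rough math behind the 6% / 60% claim: of everyone pursuing private
# cloud, how many lack OS-level engineering talent?  The 6% figure is
# an assumed estimate, not a measured number.
pursuing_private_cloud = 0.66   # share of attendees pursuing private cloud by 2014
with_os_talent = 0.06           # assumed share with OS-level talent

without_os_talent = pursuing_private_cloud - with_os_talent
print(f"{without_os_talent:.0%} of the installed base would be building "
      "a private cloud without OS-level talent")
```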

We are going to see some really bad $h*!.

Private clouds that are bad performers.  Clouds that have bad UI.  Manageability requires giving a UI to the private cloud.  How many IT organizations have a user interface design team?

Building a private cloud is like building an operating system to manage the resources in IT with UI for system administrators designed for your internal users.

Now the smart guys have figured out they can hire experienced operating system staff.  Why do you think Google hired so many Microsoft guys?  Microsoft hired a bunch of DEC guys to work on NT.

If you don't want to build some really bad $h*! you should think of hiring some OS guys.  I have a friend who runs a technical executive placement company and I think she should start up a private cloud placement service.

Are you in the 6% group with OS-level talent, or in the 60% group that is new to DTP and has an organization that sees the private cloud as the answer to take control?

Keep in mind this Gartner statement.

I believe that enterprises will spend more money building private cloud computing services over the next three years than buying services from cloud computing providers.

The analysts and vendors are going to market the private cloud, so it is unstoppable.  Just saying no to the private cloud is not an option.

Gartner DC LV is a great event to meet people and circulate ideas.  Today was a full day of interviews, business discussions, and making new connections.

Read more