Oops, Cleantech doesn't move at the pace of Moore's Law, expectations reset

GigaOm's Katie Fehrenbacher has a post on the VC cleantech bust.

SUMMARY:

One of the key misplaced assumptions that Valley VCs made in cleantech boom times is that the rapid progress of Moore’s Law could be created for cleantech with a little bit of VC funding and Valley smarts.

Katie starts discussing the controversy over cleantech VC by taking a stance that agrees with Wired's post.

One of the more in-depth pieces on the cleantech venture capital boom and bust cycles was published in Wired this week. While not all of my peers will agree with me (I have already gotten in some heated debates over this), I think the story was a solid analysis of how a lot of VCs piled into cleantech investing in 2007 and 2008 with not a whole lot of knowledge of the sector and now have backed out of it (we have covered this a lot, too). The long-term promise of cleantech itself isn’t dead, but the boom VC cycle has clearly ended, much the way the dotcom boom went bust and the promise of the Internet arrived later on.

Katie jumps to the flaw in VC thinking.

But another layer to this story is that one of the key misplaced assumptions that VCs made in the cleantech boom times is that the rapid progress of Moore’s Law— which says that the number of transistors that can be placed on a chip doubles every two years — could be created for cleantech with a little bit of VC funding and Silicon Valley smarts. The notion (which is seductive but not true in most cases) is that the traditional energy industries throughout the world just didn’t do the right kind of innovation and that the Valley’s can-do spirit and open wallets would be able to unleash this potential.

I was always surprised that VCs chose to invest in physics-constrained problems like solar cells, fuel cells, and batteries. But, hey, the environmental stuff looked like big money to be made. Until the Chinese came in as well, with a government dominated by engineers.
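To see how different the two curves are, here is a rough back-of-the-envelope sketch. The two-year doubling period is Moore's observation; the 20% cost reduction per doubling of cumulative production for a physics-constrained technology like solar is an illustrative assumption on my part, not a figure from Katie's post.

```python
# Rough sketch: Moore's Law doubling vs. a physics-constrained
# learning curve. The 20% learning rate is an illustrative
# assumption, not a number from the article.

def moores_law_factor(years, doubling_period=2.0):
    """Improvement factor if capability doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

def learning_curve_cost(doublings, learning_rate=0.20):
    """Remaining cost fraction after `doublings` of cumulative production."""
    return (1 - learning_rate) ** doublings

# Over a typical 10-year fund horizon:
print(moores_law_factor(10))    # 32x improvement for chips
print(learning_curve_cost(3))   # ~0.51: roughly half the cost after 3 doublings
```

Chips improve 32x over a fund's life on a fixed calendar clock; a learning-curve business only improves when cumulative production doubles, which takes capital and years, not Valley smarts.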

A good memory helps you imagine the future

The WSJ has an interesting article on the new power of memory.

Memory allows for a kind of mental time travel, a way for us to picture not just the past but also a version of the future, according to a growing body of research.

The studies suggest that the purpose of memory is far more extensive than simply helping us store and recall information about what has already happened.

Researchers from University College London and Harvard University have made strides charting how memory helps us draw a mental sketch of someone's personality and imagine how that person might behave in a future social situation. They detailed their latest findings in work published in the journal Cerebral Cortex last week.

I find this article interesting because it describes a great strategy for networking. Imagining how the people you meet fit into your future is built on a good memory of who they are and what they do.


How many of you think you have data center systems that provide a good memory of past performance? If you don't have a good memory of the past, how can you imagine the future?

Journalist taking Photos of NSA's Utah Data Center gets a good scare

Forbes has a post by Kashmir Hill on what happened when she took some pictures at the NSA's Utah Data Center.

Surprise Visitors Are Unwelcome At The NSA's Unfinished Utah Spy Center (Especially When They Take Photos)

[Photo caption: Officers said the sign was jokingly programmed this way by a construction worker]

Most people who visit Salt Lake City in the winter months are excited about taking advantage of the area’s storied slopes. While skiing was on my itinerary last week, I was more excited about an offbeat tourism opportunity in the area: I wanted to check out the construction site for “the country’s biggest spy center.”

Many of you have had the hassle of dealing with prying eyes from journalists, and this journalist had a bit of a scare.

My outing to the facility last Thursday was an eventful one. I can confirm that the National Security Agency’s site is still under construction. It was surprisingly easy to drive up and circle its parking lot. But if you take photos while there, it is — much like Hotel California – very hard to leave.

...

“Were you taking photos?” he asked. I said that I was. He responded, “You’re going to need to delete those.”

Can you imagine sitting in your car with the following thoughts?

We sat in the car some more, while they — I assume — ran background checks on us, Googled us, checked my Forbes credentials, poked around my Facebook page and called other supervisors, and perhaps a Public Information Officer to decide what to do about us. After maybe another 15 minutes, an aggressively chummy man with piercing blue eyes, wearing a sweater and slacks, came out to the car. He introduced himself as a special agent and asked us to explain why we were there, with an aside to Officer #1 that he wanted him to record everything. Dryer offered a lengthy explanation, including all of the classes I’d spoken to. Agent Federman responded with a direct question: “Did anyone send you to take those photos and do you plan to distribute them to enemies of the United States?”

The journalist had an hour where I am sure you all would say, "Duh, what did you think was going to happen when you got that close to the facility?"

It was an intimidating hour. While I’ve interviewed federal agents for stories, I’ve never been interrogated by them before. We may have been treated as gently as we were because I’m a mainstream journalist with a prominent platform and because I was accompanied by a lawyer. I was grateful that I could hold up “professional journalist” as my own badge; it felt protective.

Can you imagine if the journalist had been there by herself, with a telephoto lens pointed at the facility and without her lawyer friend? Big SUVs driving toward her at high speed from multiple directions.

SimCity's disaster going online, 92% resolved

SimCity chose to make its latest version online only, and EA has found out what it means to run a 24x7 service, with painful PR results and customer frustration.

I am sure for the data center crowd this is an entertaining event.  Here are some nuggets that have me questioning what they were thinking to start with.

In their latest update they sing the praises of resolving issues. Well, 92% of them.

I’m happy to report that the core problem with getting in and having a great SimCity experience is almost behind us. Our players have been able to connect to their cities in the game for nearly 8 million hours of gameplay time and we’ve reduced game crashes by 92% from day one.

In the third update they said they upgraded their servers. There were initially only 11. I think the EA folks gave the operations team a bit too little money.

First things first: we’ve been making great strides towards improving our servers. In addition to adding several new ones the past couple of days (including the addition of Antarctica today), we’ve been applying upgrades to 11 of our initial servers. What does this mean? We’ve beefed up these servers to allow for a larger capacity that will not only allow more players to get in, but will also help address a lot of the connection issues we’ve been addressing.
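Eleven servers for a global launch invites a quick capacity sanity check. All the numbers below are illustrative assumptions on my part; EA never published concurrency or per-server figures.

```python
import math

# Back-of-the-envelope capacity check. The player counts here are
# made-up illustrative numbers, not figures EA has published.

def servers_needed(peak_concurrent_players, players_per_server):
    """Minimum server count to cover a given concurrent-player peak."""
    return math.ceil(peak_concurrent_players / players_per_server)

# If launch week drew 200,000 concurrent players and each server
# handled 10,000, eleven servers would be nine short:
print(servers_needed(200_000, 10_000))  # 20
```

Whatever the real numbers were, the lesson is the same: for a launch spike you size for the peak, not the average, which is exactly where elastic cloud capacity beats a fixed fleet of physical servers.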

And, they figured out they needed a test server.

Also, we’ve released a new Test server today. As you may have guessed by the name, we’ll be using this server to test changes and new features before we deploy them to all of our other servers. Before you ask, yes, you can play on it. In fact, we’d be grateful if you did. Just note, because this is a test environment, you may experience some unstableness as we push new data to improve non-test servers.

Q: What is it for? 
A: This server is used by us the SimCity developers, and players to test changes and new features before they are released across all the other servers. This test server will improve our ability to deploy these updates as quickly and accurately as possible.

Bet there are a bunch of people inside the company saying, "We should have gone to the cloud instead of deploying our own physical servers." Especially if all you had to start with was 11.

US Gov't Data Centers to focus on being Efficient after closing down excess capacity

NextGov has a post on the next move by the US Federal Gov't: creating new metrics to assess the efficiency of the remaining data centers after closing many of them.

The Office of Management and Budget wants to focus less on simply closing federal data centers and more on making sure the government’s existing data center stock is operating as efficiently as possible, an OMB official said Thursday.

That’s why federal Chief Information Officer Steven VanRoekel’s office plans to roll its three-year-old data center consolidation initiative into a separate program called PortfolioStat, which audits agencies’ commodity information technology budgets to root out waste and inefficiencies, said OMB Portfolio Manager Scott Renda, who works in VanRoekel’s office.

The strategy is to focus on metrics.

“You’re going to see more focus on the right kind of metrics, efficiency metrics” Renda said. “[We’ll be] thinking about PUE [an energy measurement], thinking about storage, thinking about density measures that really talk to how efficient your infrastructure is. The goal with PortfolioStat is an efficient infrastructure that’s serving the mission of the agency. Consolidation is done to support that program and mission.”
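PUE, the energy measurement Renda mentions, is simply total facility energy divided by the energy delivered to the IT equipment; everything above 1.0 is overhead like cooling and power distribution. A minimal sketch, with made-up kWh figures:

```python
# Minimal sketch of the PUE calculation mentioned above.
# The kWh figures are illustrative, not from any agency report.

def pue(total_facility_kwh, it_equipment_kwh):
    """Power Usage Effectiveness: total facility energy / IT energy.
    1.0 is ideal; typical enterprise data centers run roughly 1.5-2.0."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_equipment_kwh

print(pue(1800.0, 1000.0))  # 1.8
```

Note what the metric captures and what it doesn't: a facility can post a good PUE while the IT load itself is doing useless work, which is part of my skepticism below.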

This is a strategy that probably got the consensus of many IT decision makers and the vendors who do a lot of consulting for them. Don't ever think a big move like this is done without a vendor helping to say, "Yeah, this is a great way to make you more efficient." Meanwhile those vendors are figuring out the initiatives that will replace the legacy systems and upgrade them to the latest technology.

Notice the idea is focused on metrics to make a more efficient infrastructure. 

If it were up to me, I would focus on the supply chain: who are the best-performing vendors, and where are the quality issues that cause huge waste? Then determine the metrics that support monitoring the quality of the system, including the quality of the vendors.

But, I don't do any federal gov't work and my partners have no desire to slow ourselves down by adding federal as a market segment.

The problem with metrics is that they can take you down the wrong path.

Much like the work John Boyd did fighting the Washington bureaucracy that kept building faster fighter jets, while he was telling people no, you want jets that are more maneuverable. The ability to change direction, to add speed and dump speed, is what wins. The trouble was there were no metrics for that. And the vendors liked the nice, easy approach of "if it goes faster with less fuel, then it is better."

Doesn't it kind of make sense that you want a more maneuverable infrastructure vs. an efficient metric driven machine?