RAW vs. JPG, 25% of DSLR images are now RAW

10 years ago at Microsoft, four of us had this idea that RAW imaging would be big.  I wrote a blog post with some of the history.

Story of Adobe & Apple High-Value Digital Image Applications, Adobe’s angst developing for the iPad, and how Microsoft missed this battle

MONDAY, MAY 17, 2010 AT 3:25AM

This is not a data center post, but one about competition and innovation.

If you are a high-end photographer, you use the RAW imaging format, a higher-quality image format than JPEG.

A camera raw image file contains minimally processed data from the image sensor of either a digital camera, image scanner, or motion picture film scanner. Raw files are so named because they are not yet processed and therefore are not ready to be printed or edited with a bitmap graphics editor.

Microsoft just released a RAW image codec for Windows 7 and Vista, and what is interesting is that the accompanying analysis says nearly 25% of the photos taken with DSLR cameras are RAW.

Photo Gallery now supports raw format

by Brad Weed

We all take a lot of photos. In fact, according to data provided by InfoTrends, more than 73 billion still images were shot in the US alone in 2010. If you’re lucky enough to own a DSLR (digital single lens reflex) camera, you’re likely to take two and a half times as many photos in a given month as your friends with point-and-shoot cameras. That’s a lot of photos. What’s more, nearly a quarter of those photos are taken in a raw image format.

The group of four that had the original RAW image idea 10 years ago is no longer at Microsoft.  One is a Google executive, one is an Adobe executive, one is an imaging consultant, and the fourth is me.

At least now we finally have data on how big the RAW imaging format is: about 25% of DSLR photos.  The market is now big enough that a product could be developed, but it is a little too late to try to come to market now.

Thinking of air-side economizer use, consider that Seattle has had only 351 minutes over 80 degrees this summer

I know this will not make you locate your data center in Seattle, but this is a fun piece of weather trivia.  As of July 24 there have been only 351 minutes of over-80-degree weather this summer.
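For a rough sense of how such a tally relates to free-cooling hours, here is a minimal sketch of counting minutes above a temperature threshold from periodic weather observations.  The sample readings and the five-minute sampling interval are made up for illustration; this is not the actual University of Washington sensor data.

```python
# Minimal sketch: tally minutes above a temperature threshold from
# periodic weather observations. The readings and the 5-minute sampling
# interval below are assumptions for illustration only.

SAMPLE_INTERVAL_MINUTES = 5
THRESHOLD_F = 80.0

# Hypothetical observations: (timestamp, temperature in Fahrenheit)
observations = [
    ("2011-07-24 13:00", 79.1),
    ("2011-07-24 13:05", 80.4),
    ("2011-07-24 13:10", 81.2),
    ("2011-07-24 13:15", 80.9),
    ("2011-07-24 13:20", 79.8),
]

minutes_over = sum(
    SAMPLE_INTERVAL_MINUTES
    for _, temp_f in observations
    if temp_f > THRESHOLD_F
)

print(f"Minutes over {THRESHOLD_F}F: {minutes_over}")  # 15 with the sample data
```

The fewer minutes the outside air spends above your supply-air threshold, the more of the year an air-side economizer can carry the cooling load.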

Seattle soaks up some summer -- 273 minutes of it to be exact

By Scott Sistek

Story Created: Jul 24, 2011 at 9:39 PM PDT

For a while, it seemed Seattle was about to cement its legacy as home of the 78-minute summer.

But no more.

With a nice warm, sunny Sunday, Seattle now has had its first extended "summer experience" which I had defined as 80 degrees or warmer at the University of Washington.

The total tally was 273 minutes Sunday (4 hours, 33 minutes) bringing our entire Seattle summer experience up to a whopping 351 minutes.

I have joked that if Global Warming happens we are going to see a migration to the Pacific Northwest.

And Texas is on the other end of the spectrum.

Meanwhile, in Waco, the streak continues for now. Thursday's high of 103 degrees marks the 29th straight triple-digit day and the 46th such day of 2011.

Nebula launches Hardware Appliance to run the cloud, but will users want the HW or SW?

The cloud is about virtualized environments.  So, it is a bit ironic that Nebula's first product is a physical hardware appliance when the solution could be downloaded bits.

Nebula Cloud Appliance

What they’re all working on is fairly fascinating: A hardware appliance pre-loaded with customized OpenStack software and Arista networking tools, designed to manage racks of commodity servers as a private cloud.

...

Kemp wasn’t planning to do an appliance, he admits, but initial investor Bechtolsheim convinced him it was the right approach. It lets Nebula provide a turnkey product for deploying OpenStack, Kemp explained, by optimizing and locking down some of the variables that might make deploying a private cloud more difficult.

Nebula's team didn't like the Eucalyptus product and chose OpenStack.

However, even with all the specialization, Nebula is very committed to building the core OpenStack code base. “OpenStack exists because Eucalyptus didn’t work at NASA,” Kemp acknowledged, so he understands the importance of solid, customizable, open-source code.

Ultimately, he said, a better OpenStack means a better Nebula, because Nebula can focus on filling in the gaps and not on reinventing the wheel. Much like Bechtolsheim was successful at Sun Microsystems by building atop Unix and at Arista by using standard hardware components.

Here is a question, given how Nebula describes the appliance:

Elastic Infrastructure

The Nebula appliance dynamically provisions and destroys virtual infrastructure and storage as workloads fluctuate.

Why wouldn't you run the Nebula SW on multiple Open Compute Servers in your cloud environment?  It seems like the Nebula appliances are single points of failure unless you have multiple instances running in your cloud environment, which should be easy if you buy a few more Open Compute Servers.

Nebula was announced at OSCON, but who would let their cloud environment be down while waiting for a FedEx, and ship their cloud data outside the company in their Nebula Appliance?

Nebula will supply the appliance. "If it fails, FedEx it back to us, and we'll send you another one," Kemp said. "Our little box has a 10 gigabit ethernet switch built into it. You can plug cheap commodity servers into the rack. You don't have to turn them on. It will do that. The interface is like Amazon Services." These servers are monitored by the appliance, including log files and flow data. "What we do is create interface points to all of the common CMDB tools, managing tools, security tools, like ArcSight or Splunk," said Kemp. "We will create integration points for those particular products."
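Since Nebula packages OpenStack and Kemp compares the interface to Amazon's, here is a hedged sketch of the kind of OpenStack Compute (Nova) REST calls that "dynamically provisions and destroys virtual infrastructure" implies.  The endpoint URL, token, tenant, image, and flavor values below are placeholders for illustration, not anything Nebula ships.

```python
# Hedged sketch of OpenStack Compute (Nova) REST calls behind elastic
# provisioning. Endpoint, token, tenant, image, and flavor values are
# placeholders, not Nebula-specific values.
import requests

COMPUTE_ENDPOINT = "http://cloud-controller.example.com:8774/v2/<tenant-id>"
HEADERS = {"X-Auth-Token": "<token-from-keystone>",
           "Content-Type": "application/json"}

def boot_server(name, image_ref, flavor_ref):
    """Ask the cloud controller to provision one virtual machine."""
    body = {"server": {"name": name,
                       "imageRef": image_ref,
                       "flavorRef": flavor_ref}}
    resp = requests.post(f"{COMPUTE_ENDPOINT}/servers", json=body, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["server"]["id"]

def delete_server(server_id):
    """Destroy the virtual machine when the workload shrinks."""
    resp = requests.delete(f"{COMPUTE_ENDPOINT}/servers/{server_id}", headers=HEADERS)
    resp.raise_for_status()

if __name__ == "__main__":
    # Example: scale a hypothetical web tier up and back down.
    server_id = boot_server("web-01", image_ref="<image-uuid>", flavor_ref="2")
    # ... workload drops ...
    delete_server(server_id)
```

From the client's point of view these calls look the same whether the controller is a sealed appliance or OpenStack software running on one of your own Open Compute Servers.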

I am sure Nebula has a high-availability architecture, but why buy multiple Nebula Appliances when the same hardware, the Open Compute Servers, is already in your environment?  Because the investor convinced the Nebula founders it was a better revenue model?

Kemp wasn’t planning to do an appliance, he admits, but initial investor Bechtolsheim convinced him it was the right approach.

Would you want an appliance or the software you can run on the Open Compute Server?

BTW, given the SW runs on the Open Compute Server, the Nebula software should run on any hardware, unless Nebula modified the software to be hardware-specific.

 

MapR, 1/2 the HW and faster performance for Apache Hadoop

MapR Technologies came out of stealth mode in June, and their solution is available for download.


MapR is the Next Generation for Apache Hadoop

Here is a presentation that you can watch to learn more.

The Design, Scale and Performance of MapR’s Distribution for Apache Hadoop

Posted on JULY 27, 2011 by JACK

Check out M.C. Srivas’ Hadoop Summit presentation. Srivas, the CTO and co-founder of MapR, outlines the architectural details behind MapR’s performance advantages. This technical discussion also describes the scale advantages of the MapR distributed NameNode and provides comparisons to HDFS.


Pretty cool that you can cut the hardware, and thus the power, for your Apache Hadoop system roughly in half.  Software can save a lot of power to support a green data center.
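To put "half the hardware" in power terms, here is a back-of-envelope sketch.  The cluster size, per-server wattage, PUE, and electricity price are assumptions for illustration, not MapR-published numbers.

```python
# Back-of-envelope sketch of what halving a Hadoop cluster could save.
# Server count, wattage, PUE, and electricity price are assumptions for
# illustration, not MapR-published figures.

servers_before = 100                   # existing Apache Hadoop cluster size
servers_after = servers_before // 2    # claim: roughly half the hardware
watts_per_server = 300                 # assumed average draw per server
pue = 1.6                              # assumed power usage effectiveness
price_per_kwh = 0.10                   # assumed electricity cost in USD

hours_per_year = 24 * 365
saved_servers = servers_before - servers_after

saved_kwh = saved_servers * watts_per_server * pue * hours_per_year / 1000.0
saved_dollars = saved_kwh * price_per_kwh

print(f"Energy saved: {saved_kwh:,.0f} kWh/year")   # ~210,240 kWh
print(f"Cost saved:  ${saved_dollars:,.0f}/year")   # ~$21,000
```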

It is funny to think about the reality of a lights-out data center, with a Dilbert cartoon as an example

Dilbert has a cartoon on lights-out data centers.

Dilbert.com

Human error in the data center is a reality, and the funny part is that one answer, just like in the cartoon, is to not allow employees in the data center.  Especially if the data center is self-aware.

What is potentially worse than employees are the vendor support staff who are not tracked.  Do you know exactly what the warranty service technician did during his service call in your data center?