The story of Adobe & Apple's high-value digital image applications, Adobe's angst developing for the iPad, and how Microsoft missed this battle

This is not a data center post, but one about competition and innovation.

If you are a high-end photographer, you use the RAW image format, a higher-quality image format than JPEG.

A camera raw image file contains minimally processed data from the image sensor of either a digital camera, image scanner, or motion picture film scanner. Raw files are so named because they are not yet processed and therefore are not ready to be printed or edited with a bitmap graphics editor. Normally, the image is processed by a raw converter in a wide-gamut internal colorspace where precise adjustments can be made before conversion to a "positive" file format such as TIFF or JPEG for storage, printing, or further manipulation, which often encodes the image in a device-dependent colorspace.
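To make that pipeline concrete, here is a minimal sketch of a RAW-to-positive conversion in Python using the open-source rawpy and imageio libraries. This is my illustration, not code from any of the products discussed, and the file name is hypothetical:

```python
# Minimal RAW-to-positive conversion sketch using rawpy (a LibRaw
# wrapper) and imageio. The file name is hypothetical; any camera
# RAW file (.CR2, .NEF, etc.) would work the same way.
import rawpy
import imageio

with rawpy.imread("IMG_0001.CR2") as raw:
    # Demosaic the sensor data and render it in a working colorspace,
    # applying the white balance the camera recorded at capture time.
    rgb = raw.postprocess(
        use_camera_wb=True,  # camera-recorded white balance
        output_bps=8,        # 8 bits per sample for JPEG output
    )

# Write the "positive" file. JPEG is lossy and device-dependent,
# which is why precise adjustments happen before this step.
imageio.imwrite("IMG_0001.jpg", rgb)
```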

The RAW imaging application market is dominated by Adobe Lightroom, Adobe Photoshop, and Apple Aperture, with Adobe in the leading position thanks to Photoshop Lightroom.

Digital camera raw file support

The camera raw functionality in Adobe® Photoshop® software provides fast and easy access to the raw image formats produced by many leading professional and midrange digital cameras. By working with these "digital negatives," you can achieve the results you want with greater artistic control and flexibility while still maintaining the original raw files.

The battle between Apple and Adobe is about Flash now, but it affects other Adobe products as well. As one of Adobe's product managers points out, the Photoshop Lightroom user base has requested an iPad version, but there is no guarantee Apple will approve a Lightroom application.

Adobe announces angst-laden iPad software effort

by Stephen Shankland

Adobe has begun a new effort to bring imaging software such as Lightroom to the iPad and other tablet computers--but the leader of the work also is fretting over the control Apple has over it.

"I love making great Mac software, and after eight years product-managing Photoshop, I've been asked to help lead the development of new Adobe applications, written from scratch for tablet computers. In many ways, the iPad is the computer I've been waiting for my whole life," Adobe's John Nack said in a blog post Thursday. "I want to build the most amazing iPad imaging apps the world has ever seen."

Adobe's John Nack continues in his blog post:

These aren't idle questions. When the iPad was introduced, I asked what apps you'd like to see Adobe build for it. Among the 300 or so replies were many, many requests for a mobile version of Lightroom. I think that such an app could be brilliant, and many photographers tell me that its existence would motivate them to buy iPads.

Would Apple let Lightroom for iPad ship? It's almost impossible to know. Sometimes they approve apps, then spontaneously remove them for "duplicat[ing] features that come with the iPhone." Other times they allow competitors (apps for Netflix, Kindle, etc.), or enable some apps (e.g. Playboy) while removing similar ones. Maybe they'd let Lightroom ship for a while, but if it started pulling too far ahead of Aperture--well, lights out.

I have been a RAW image user for the past ten years, starting with a Canon G1 I bought in 2000, so let me tell you the story of how Microsoft missed the RAW imaging opportunity and still doesn't have a RAW imaging application, even though Microsoft hired Adobe's Lightroom architect Mark Hamburg.

Canon G1 Review, Phil Askey, September 2000

Adobe Lightroom is the one application I use most often with photos.

Adobe® Photoshop® Lightroom® 2 offers powerful new and enhanced features across the entire program to help you streamline your digital photography workflow. Sort and find the photos you want faster, target specific photo areas for more precise adjustments, showcase your talent using more flexible printing templates, and more.

When you look at the history of Lightroom, you see mention of Mark Hamburg, whom Microsoft hired/poached in April 2008.

History

In 2002, veteran Photoshop developer Mark Hamburg began a new project, code-named "Shadowland". Hamburg reached out to Andrei Herasimchuk, former interface designer for the Adobe Creative Suite, to get the project off the ground. [1] The new project was a deliberate departure from many of Adobe's established conventions. 40% of Photoshop Lightroom is written using the Lua scripting language.

MARK HAMBURG LEAVES ADOBE

Posted by Martin Evening

News has been announced that Mark Hamburg has decided to leave Adobe after having worked at the company for over 17 years. Mark joined Adobe in the Fall of 1990, not long after Photoshop 1.0 was released, and was instrumental in devising many of the 'wow' features we have all come to love and rely on daily when we work with Photoshop.

Mark left the Photoshop team after Photoshop 7 shipped and went to work developing a new paradigm in image processing which would finally ship as the product named Adobe Photoshop Lightroom.

The irony of all this is that back in 2000 I was working with a team of people at Microsoft who had a vision for RAW imaging in Windows as a way to bring professional photography to Windows vs. the Mac.  And the person we had on our team was Mark Hamburg's boss's boss, who worked for me.  We had a bunch of other visionaries who understood that image quality was a huge opportunity vs. JPEG.  But it was hard to justify the market in 2000-2001.  Once Adobe and Apple shipped their RAW imaging applications, Lightroom and Aperture, there was finally data to show the size of the market.  So around 2006 Microsoft started trying to build a RAW imaging application group.

To make this more ironic, when Mark Hamburg joined Microsoft, the executives asked him who they should hire to add to their development team, and Mark named his previous boss's boss, adding that, by the way, he used to work for Microsoft and Adobe but works for Google now.  This is the same guy who worked for me on RAW imaging in 2000, and he likes to stay out of the limelight, so you can't find him in a Google search.  So Microsoft tried to hire the imaging expert away from Google, and there was a small group of us hoping he would make the move, but he said no, deciding Microsoft was not for him.  Shortly after, Mark Hamburg left Microsoft and went back to Adobe.

Adobe's John Nack proudly blogged about Mark Hamburg's return to Adobe.

Mark Hamburg returns to Adobe

Well, that didn't take so long, did it? :-)

After 17 years on the Photoshop & Lightroom teams, Mark Hamburg left Adobe last year to join Microsoft and work on improving the Windows user experience (as he found it "really annoying"). I'm happy to say that after that brief sojourn, he's returning to the Adobe Digital Imaging team. Welcome back, Mark! [Via]

Oh, and to ZDNet's Mary Jo Foley, who wrote at the time of Mark's departure:

Microsoft's competitor to Adobe Lightroom gets another champion... My bet is Hamburg will be instrumental in helping Microsoft bring to market its Photoshop Lightroom competitor.

Er, not so much.

Why did I write this post? 

Because it reminds me of the difficulties of being innovative when people look at you as if you are crazy.  "Where is the data and market research to support what you are proposing?"  The response I'd like to give is: "By the time the marketing data exists, you'll have the information to build an obsolete solution. Get out of the way."

Which reminds me: the biggest reasons we couldn't get RAW imaging applications going were the lack of an established market and other groups claiming they were the ones responsible for imaging applications.

Also, I should write a post on being innovative and lessons learned from friends like Gary Starkweather.

In 1969, Starkweather invented the laser printer at Xerox's Webster research center. He collaborated on the first fully functional laser printing system at Xerox PARC in 1971.[1][2]

At Apple Computer in the 1990s, Starkweather invented color management technology,[3] and led the development of Colorsync 1.0. Starkweather joined Microsoft Research in 1997, where he works on display technology.[4]


Will a Google Tablet be the iPad competitor or a Netbook? Maybe both - targeting Apple and Microsoft with one device

There is lots of news on Google's Tablet with Verizon.

Verizon, Google Developing iPad Rival

By NIRAJ SHETH

Verizon Wireless is working with Google Inc. on a tablet computer, the carrier's chief executive, Lowell McAdam, said Tuesday, as the company endeavors to catch up with iPad host AT&T Inc. in devices that connect to wireless networks.

The work is part of a deepening relationship between the largest U.S. wireless carrier by subscribers and Google, which has carved out a space in mobile devices with its Android operating system. Verizon Wireless last year heavily promoted the Motorola Droid, which runs Google's software.

"What do we think the next big wave of opportunities are?" Mr. McAdam said in an interview with The Wall Street Journal. "We're working on tablets together, for example. We're looking at all the things Google has in its archives that we could put on a tablet to make it a great experience."

These devices are all part of a shift toward consumer devices that use less energy themselves and rely on data centers for the heavy lifting.

I am amazed at the number of people who think they can get an iPad and leave their laptop at home.  Google sees in this an opportunity to create the always-connected laptop replacement.

The device may not be perfect, but no laptop is either.  What trade-offs will Google and Verizon make in the device?

Once someone gets the right device category defined, watch the growth of data centers continue as hyper-connected laptop replacements fuel new usage scenarios that play into Google's hands.


Microsoft writes humorous blog post to educate SW developers about the power costs of running their code

Microsoft has a blog post, "How Much Does Your Code Cost?", that injects humor into a typically dry topic.

The big difference is that with cloud computing, you’re renting computing power in a data center somewhere. As far as you’re concerned, it could be on Saturn. Except that the latency figures might be a bit excessive. If you’ve accidentally opened one of those magazines your network administrator takes with him to the bathroom, you might know that these data centers contain racks and racks of servers, all with lots of twinkling lights. If you’ve ever been to a data center, you’ll know that they can be very hot near the server fans, much colder around the cooling vents, and noisy everywhere. All this activity results from removing the heat that the servers produce. But that heat doesn’t get there all by itself – the servers create it from the electricity they use. What’s more, it requires even more electricity to remove that heat.

Consider sending this post on to those who are involved in SW decisions to get them thinking about the impact of their code.

When you’re up against deadlines to turn in a software project, you probably are focused on ensuring that you meet the functionality requirements set out in the design specification. If you have enough time, you might consider trying to maximize performance. You might also try to document your code thoroughly so that anyone taking over the project doesn’t need to run Windows for Telepaths to work out what your subroutines actually do. But there is probably one area that you don’t consider: the cost of your code.

You mean what it costs to write the code, right? No.

Er, how about what it costs to compile? You’re getting warmer...

What it costs to support? No, colder again.

OK, you win. What costs do you mean?

I mean what it costs to run your code. In the bad old days, when clouds were just white fluffy things in the sky and all applications ran on real hardware in a server room somewhere or on users’ PCs, then cost simply wasn’t a factor. Sure, you might generate more costs if your application needed beefier hardware to run, but that came out of the cable-pluggers’ capital budget, and we all know that computer hardware needs changing every other year, so the bean-counters didn’t twig. A survey by Avanade showed that 50% of IT departments don’t even budget for the cost of electricity to run their IT systems. For more information, see this Avanade News Release, at http://www.avanade.com/_uploaded/pdf/pressrelease/globalenergyfindingsrelease541750.pdf

Life would be so much easier in the data center if SW developers and others recognized the direct relationship between data center infrastructure costs and the code they write and how it is architected.

The good thing is that cloud computing is helping get SW developers to think about the cost of running their code.

If you deploy applications into the cloud, it is highly likely that your service provider will be charging you based on the energy that you use. Although you don't see electricity itemized as kWh, you are billed for CPU, RAM, storage, and network resources, all of which consume electricity. The more powerful processor with more memory costs more, not just because of the cost of the components, but because they consume more electricity. In many ways, this is an excellent business model, as you don't have to buy the hardware, maintain it, depreciate it, and finally, replace it. You simply pay for what you use. Or putting it another way, you pay for the resources you use. And this is the point at which you need to ask yourself: How much does my code cost? When power usage directly affects the cost of running your applications, a power-efficient program is more likely to be a profitable one.
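As a back-of-the-envelope illustration of that "pay for the resources you use" model, here is a small Python sketch. All of the rates are hypothetical, chosen only to show how CPU, RAM, storage, and network each contribute to the bill:

```python
# Back-of-the-envelope cloud cost estimate. All rates are hypothetical;
# real pricing varies by provider and region. The point is that each
# resource line item ultimately reflects electricity consumed.

RATE_CPU_HOUR = 0.10       # $ per CPU-hour (hypothetical)
RATE_RAM_GB_HOUR = 0.01    # $ per GB of RAM per hour (hypothetical)
RATE_STORAGE_GB = 0.05     # $ per GB-month of storage (hypothetical)
RATE_TRANSFER_GB = 0.08    # $ per GB of network egress (hypothetical)

def monthly_cost(cpu_hours, ram_gb_hours, storage_gb, transfer_gb):
    """Sum the per-resource charges for one month."""
    return (cpu_hours * RATE_CPU_HOUR
            + ram_gb_hours * RATE_RAM_GB_HOUR
            + storage_gb * RATE_STORAGE_GB
            + transfer_gb * RATE_TRANSFER_GB)

# One always-on instance (720 hours) with 4 GB of RAM (2,880 GB-hours):
print(monthly_cost(cpu_hours=720, ram_gb_hours=2880,
                   storage_gb=100, transfer_gb=50))  # -> 109.8
```

Code that burns fewer CPU-hours directly shrinks the largest line item on this bill, which is exactly the point the Microsoft post is making.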

The blog post references Visual Studio and Intel resources to help SW developers.

It is possible that future versions of Visual Studio will include options for checking your code for power usage. Until that time, following these recommendations should help minimize the running costs of your applications within a cloud-based environment.

  1. Reduce or eliminate accesses to the hard disk. Use buffering or batch up I/O requests.
  2. Do not use timers and polling to check for process completion. Each time the application polls, it wakes up the processor. Use event triggering to notify completion of a process instead (see the sketch after this list).
  3. Make intelligent use of multiple threads to reduce computation times, but do not generate threads that the application cannot use effectively.
  4. With multiple threads, ensure the threads are balanced and one is not taking all the resources.
  5. Monitor carefully for memory leaks and free up unused memory.
  6. Use additional tools to identify and profile power usage.
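To illustrate item 2, here is a minimal Python sketch of the polling-versus-event-triggering difference. It is my illustration of the principle, not code from the Microsoft post:

```python
# Polling vs. event triggering (item 2 in the list above).
# The principle is language-independent even though the Microsoft
# post is aimed at Visual Studio developers.
import threading
import time

done = threading.Event()

def worker():
    time.sleep(2)   # stand-in for real work
    done.set()      # signal completion exactly once, when finished

threading.Thread(target=worker).start()

# Wasteful: wakes the processor every 10 ms just to ask "done yet?"
#   while not done.is_set():
#       time.sleep(0.01)

# Efficient: the waiting thread blocks in the kernel until signaled,
# consuming no CPU cycles in the meantime.
done.wait()
print("work finished")
```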

For more ideas on how to reduce power usage, check out the following resources and tools:

Energy-Efficient Software Checklist, at http://software.intel.com/en-us/articles/energy-efficient-software-checklist/.

Creating Energy-Efficient Software, at http://software.intel.com/en-us/articles/creating-energy-efficient-software-part-1/.

Intel PowerInformer, at http://software.intel.com/en-us/articles/intel-powerinformer/.

Application Energy Toolkit, at http://software.intel.com/en-us/articles/application-energy-toolkit/.


Google's Eric Schmidt discusses Sharing and Mobile strategy

GigaOm analyzes this video of Eric Schmidt at Atmosphere.

And offers this summary of the key issues Eric presents.

Schmidt made two specific comments about resource allocation, saying that the hardest and most pressing engineering issues facing Google today are around sharing and mobile. He was talking to the enterprise execs present but his statements were so absolute I think it’s fair to apply them more broadly.

“Companies are about sharing,” Schmidt said. “One of the new things in the last five years about the web is that it enables sharing-sensitive apps.” He continued,

I think of calendars as incredibly boring, but I’m wrong, calendars are incredibly interesting because they’re incredibly shared. So from a computer science perspective, all of a sudden we have our top engineers who want to build calendars. I’m going, what’s wrong with you guys? But in fact it’s a very interesting example. Spreadsheets are similar, the most interesting spreadsheets are highly, highly interlinked, something I didn’t know, and was not possible with the previous technology — Microsoft technology made it very difficult because they were not built in that model.

Google's Don Dodge (recently laid off by Microsoft) adds his perspective on the threat to Office.

Erick Schonfeld at Techcrunch says; "Slowly but surely, Google keeps trying to chip away at Microsoft’s core Office productivity suite with Google Docs, its free online word processor, spreadsheet, and presentation software. Today, Google Drawing is being added to the mix and Google Docs and Spreadsheets is getting a major realtime update."

David Berlind at InformationWeek is much more aggressive. "Make no mistake about it. Google is going for Microsoft's jugular. The deathmatch is on and, at the very least, it's for bragging rights to what we at InformationWeek are calling the "collaborative backbone." It becomes a battle that's less about Google Docs versus Microsoft Office and much more about the collaborative infrastructure behind Google Apps versus Microsoft's SharePoint and Exchange."

And provides a graph to illustrate his point.

This competitive positioning chart illustrates where Google is coming from, and where it hopes to go in the future. It is the classic Innovator's Dilemma competitive curve. Time will tell how it shakes out. The move to the cloud seems to be pretty clear. Only the slope of the curve and speed seem to be in question.

[Figure: Google Docs competitive positioning chart]

And let's not forget the changes coming from mobile.

As the mobile Internet becomes central for both consumer and corporate users, the core product questions are interoperability, security and safety, Schmidt said. “What’s important is to get the mobile experience right, because mobility will ultimately be the way you provision most of your services,” he added, saying that Google considers phones, tablets and netbooks mobile experiences.

These are all things we are thinking about as we get the GreenM3 NPO rolling and decide how we will approach data center information sharing.  In some ways you could contrast the open and transparent approach to data center innovation we have in mind with the status quo.  It is close to the comparison between Microsoft's individual-authoring mindset and Google's team collaboration.


Google, Microsoft, Amazon, Nokia, Digital Realty Trust, Dupont Fabros vs. ASHRAE: standard 90.1's economizer requirement limits innovation - comment to be heard

Google's Public Policy blog has a post co-signed by some of the most innovative data center operators:

Chris Crosby, Senior Vice President, Digital Realty Trust
Hossein Fateh, President and Chief Executive Officer, Dupont Fabros Technology
James Hamilton, Vice President and Distinguished Engineer, Amazon
Urs Hoelzle, Senior Vice President, Operations and Google Fellow, Google
Mike Manos, Vice President, Service Operations, Nokia
Kevin Timmons, General Manager, Datacenter Services, Microsoft

This group, and probably many others, is appealing to ASHRAE to change the requirement for economizers.

Unfortunately, the proposed ASHRAE standard is far too prescriptive. Instead of setting a required level of efficiency for the cooling system as a whole, the standard dictates which types of cooling methods must be used. For example, the standard requires data centers to use economizers — systems that use ambient air for cooling. In many cases, economizers are a great way to cool a data center (in fact, many of our companies' data centers use them extensively), but simply requiring their use doesn’t guarantee an efficient system, and they may not be the best choice. Future cooling methods may achieve the same or better results without the use of economizers altogether. An efficiency standard should not prohibit such innovation.

I know many of the people above, and thanks to a friend who forwarded me the link to Google's blog post, I speculated on what drove the economizer requirement:

  1. Without talking to anyone, one assumption is that this group, who are active in ASHRAE, brought up the energy efficiency issue early on, and ASHRAE stakeholders, most likely vendors who make economizers, saw an opportunity to make specific equipment a requirement for energy-efficient data centers.  I could be wrong, but it would explain why an organization that sets standards would choose to specify equipment instead of performance.
  2. In many established data center organizations like financials, economizers are/were unacceptable in data centers a few years back.  So, is the move to mandate economizers a reaction to those who refused to use economizers for energy-efficient cooling?
  3. The ASHRAE consulting community sees a need for their services to meet ASHRAE's economizer requirement.  For example, if in a given area there are X number of hours a year available for running economizers, does the economizer need to be run for a specific percentage of them?  Hire an ASHRAE consultant to interpret the standard.  I sure can't.

The data center group above proposes the following as a better update to the ASHRAE standard.

Thus, we believe that an overall data center-level cooling system efficiency standard needs to replace the proposed prescriptive approach to allow data center innovation to continue. The standard should set an aggressive target for the maximum amount of energy used by a data center for overhead functions like cooling. In fact, a similar approach is already being adopted in the industry. In a recent statement, data center industry leaders agreed that Power Usage Effectiveness (PUE) is the preferred metric for measuring data center efficiency. And the EPA Energy Star program already uses this method for data centers. As leaders in the data center industry, we are committed to aggressive energy efficiency improvements, but we need standards that let us continue to innovate while meeting (and, hopefully, exceeding) a baseline efficiency requirement set by the ASHRAE standard.
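For readers unfamiliar with the metric, PUE is just a ratio: total facility energy divided by the energy delivered to the IT equipment, so a perfect facility would score 1.0. A quick sketch with hypothetical numbers:

```python
# PUE = total facility energy / IT equipment energy.
# 1.0 is the (unreachable) ideal: every watt goes to the servers,
# none to cooling, power conversion, or lighting.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness over some measurement period."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical month: 1.5M kWh into the building, 1.0M kWh to IT gear.
print(pue(1_500_000, 1_000_000))  # -> 1.5 (50% overhead for cooling etc.)
```

A performance standard written against a number like this leaves operators free to hit it with economizers, waterside cooling, or anything else that works.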

It doesn't make any sense that all data centers built to ASHRAE's standards have to use economizers. If you chose to build a waterfront data center and could use the body of water as a heat sink for your cooling, ASHRAE wouldn't allow it, or would they?

The public comment period is open until April 19.  If you disagree with ASHRAE's economizer requirement, comment on this blog or on Google's blog post.

I was able to talk to Google's Chris Malone on this topic after I wrote the above.  Google's main concern is that if you are trying to be innovative in energy efficiency, the last thing you want is a barrier saying you have to use a particular technology.

In other words, the standard should set the required efficiency without prescribing the specific technologies to accomplish that goal. That’s how many efficiency standards work; for example, fuel efficiency standards for cars specify how much gas a car can consume per mile of driving but not what engine to use.

Imagine if MPG numbers were mandated to be achieved with a hybrid, a diesel, or a turbocharger.  It is obvious that the most innovative MPG will come from those who have the freedom to use any technology.

You should soon see other data center bloggers write on this issue.  If you think the requirement is wrong, comment on the Google blog post or one of the others.
