Tricks of the Trade,"Turning Numbers into Knowledge", info for the new wave of Data Analysts

I just received Jonathan Koomey's book Turning Numbers into Knowledge from amazon.com, and one of the things that caught my eye is the Foreword, where John P. Holdren writes:

There is nothing else like this book out there.
Nobody who deals with problems where numbers matter—and everybody in today’s world really needs to—should be without it.
John P. Holdren
Woods Hole, MA, October 2007

In the Foreword, Professor Holdren also explains the "Tricks of the Trade" course he taught at UC Berkeley:

Berkeley’s guardians of academic respectability eventually made me change what they regarded as too frivolous a title for the course to “Professional Methods for Interdisciplinary Careers”, but the focus remained the same for the 15+ years that I taught it. It covered ways of thinking through complex problems; how to find and manage information; how to function in a committee; how to identify and avoid common pitfalls in the interpretation of data; how to present results clearly in words, graphs, and tables; how to manage one’s time; and even how to avoid jet lag.

Many students over the years suggested that I should write a book teaching the “Tricks of the Trade”. Notwithstanding my advice to others about time management, however, I never found the time to write it.

With the 2001 publication of the first edition of Jonathan Koomey's remarkable book, Turning Numbers into Knowledge, I realized that I no longer needed to try. Dr. Koomey, who had taken my course in the 1980s as a Berkeley graduate student, had plenty of ideas of his own about the need and how to fill it. And the book that he wrote surpassed what I would have done, if I had found the time, in every important respect.

The WSJ has an article on a new wave of business schools planning to educate students in data analysis.

Business Schools Plan Leap Into Data

By MELISSA KORN And SHARA TIBKEN

Faced with an increasing stream of data from the Web and other electronic sources, many companies are seeking managers who can make sense of the numbers through the growing practice of data analytics, also known as business intelligence. Finding qualified candidates has proven difficult, but business schools hope to fill the talent gap.

This fall several schools, including Fordham University's Graduate School of Business and Indiana University's Kelley School of Business, are unveiling analytics electives, certificates and degree programs; other courses and programs were launched in the previous school year.

Students thinking of getting into data analytics should consider Jonathan Koomey's book, Turning Numbers into Knowledge.

Turning Numbers into Knowledge: Mastering the Art of Problem Solving

One way to look at Green Data Center start-ups: are they founded by engineers and scientists, or by VCs?

Two of my cloud computing engineering friends and I are having a blast working on a technology solution that can be used in data centers as well as many other areas. I ran across Steve Blank's post on

How Scientists and Engineers Got It Right, and VC’s Got It Wrong

There are many parts of Steve's post that resonate with our team.

Startups are not smaller versions of large companies. Large companies execute known business models. In the real world a startup is about the search for a business model or more accurately, startups are a temporary organization designed to search for a scalable and repeatable business model.

...

Scientists and engineers as founders and startup CEOs is one of the least celebrated contributions of Silicon Valley.

It might be its most important.

We all worked in Silicon Valley, so we have a bunch of methods ingrained in our thinking.

Why It’s “Silicon” Valley
In 1956 entrepreneurship as we know it would change forever.  At the time it didn’t appear earthshaking or momentous. Shockley Semiconductor Laboratory, the first semiconductor company in the valley, set up shop in Mountain View. Fifteen months later eight of Shockley’s employees (three physicists, an electrical engineer, an industrial engineer, a mechanical engineer, a metallurgist and a physical chemist) founded Fairchild Semiconductor.  (Every chip company in Silicon Valley can trace their lineage from Fairchild.)

The history of Fairchild was one of applied experimentation. It wasn't pure research, but rather a culture of taking sufficient risks to get to market. It was learning, discovery, iteration and execution. The goal was commercial products, but as scientists and engineers the company's founders realized that at times the cost of experimentation was failure. And just as they don't punish failure in a research lab, they didn't fire scientists whose experiments didn't work. Instead the company built a culture where when you hit a wall, you backed up and tried a different path. (In 21st century parlance we say that innovation in the early semiconductor business was all about "pivoting" while aiming for salable products.)

The Fairchild approach would shape Silicon Valley’s entrepreneurial ethos: In startups, failure was treated as experience (until you ran out of money.)

Conveniently, our idea does not need VC money or MBAs.

Scientists and Engineers = Innovation and Entrepreneurship
Yet when venture capital got involved they brought all the processes to administer existing companies they learned in business school – how to write a business plan, accounting, organizational behavior, managerial skills, marketing, operations, etc. This set up a conflict with the learning, discovery and experimentation style of the original valley founders.

Yet because of the Golden Rule, the VC’s got to set how startups were built and managed (those who have the gold set the rules.)

I have been reading Steve Blank and following some of his ideas as he experiments with business models.

Earlier this year we developed a class in the Stanford Technology Ventures Program, (the entrepreneurship center at Stanford’s School of Engineering), to provide scientists and engineers just those tools – how to think about all the parts of building a business, not just the product. The Stanford class introduced the first management tools for entrepreneurs built around the business model / customer development / agile development solution stack. (You can read about the class here.)

Some of the best data center conversations I have are about new business models, not technology. Give it a try sometime. It is much more fun.

Open Data Map movement demonstrates innovation opportunity for Open Sourced Data Center Initiative

Tim Berners-Lee has a six-minute TED presentation on the year open data went worldwide.

Map and location services are top scenarios for mobile devices. Google and Microsoft have their maps. Nokia bought Navteq and MetaCarta. Apple bought PlaceBase. With all these companies creating services, volunteers collaborating through an open approach can still beat the proprietary services.

The Mercury News reports on OpenStreetMap.

Volunteers create new digital maps

By Mike Swift

mswift@mercurynews.com

Posted: 04/09/2010 09:08:55 PM PDT

Updated: 04/10/2010 01:36:26 PM PDT

[Photo caption: Ron Perez hikes by a waterfall while a portable GPS device records his tracks as... (Jim Gensheimer)]

When Brian "Beej" Hall first heard about an audacious volunteer effort to create an Internet map of every street and path in every city and village on the planet, he was hooked. At the time, the nascent effort had only a few American members, and the U.S. map was essentially a digital terra incognita.

Just a few years later, the Berkeley software engineer is editing digital maps so precise they include drinking fountains and benches in the Bay Area parks where he hikes, and the mapping community has swelled to more than 240,000 global members. The effort, OpenStreetMap, is a kind of grass-roots Wikipedia for maps that is transforming how map data is collected, shared and used — from the desktop to smartphones to car navigation.

The reporter observes how a nonprofit community can change the map business.

But increasingly, the nonprofit community collaboration model behind OpenStreetMap, which shares all the cartographic data in its maps for free, is also changing the business of mapping, just as Wikipedia changed the business of reference. More and more, the accuracy of searches on Google Maps or directions issued by your car's navigational device are based on data collected by volunteers like Hall and other members of OpenStreetMap's do-it-yourself army.

Part of the reason OpenStreetMap is popular is that the end users are creating the maps themselves.

OpenStreetMap users say that because their data is collected by people who actually live in a place, it is more likely to be accurate.

"It's the people's map," said Paul Jarrett, director of mapping for CloudMade.

If you are interested in the use of OpenStreetMap in Haiti, go here.

We chose to tell the story of 'OpenStreetMap - Project Haiti'.
We all followed the crisis that unfolded following the Haiti earthquake, many of us chose to donate money, a few were flown out and deployed as part of the relief effort. But what practical impact can many have without being there in Haiti itself? Well, during this crisis a remarkable story unfolded; of how people around the world could virtually collaborate and contribute to the on-the-ground operations.

OpenStreetMap - Project Haiti 1

With the little existing physical, political and social infrastructure  now destroyed or damaged, the situation was especially challenging for aid agencies arriving on the ground. Where are the areas most in need of assistance, how do we get there, where are people trapped under buildings, which roads are blocked? This information is important to the rescue agencies immediately after the event, and to the longer rebuilding process. In many developing countries, there is a lack of good mapping data and particularly after a crisis, when up-to-date information is critical to managing events as they evolve.
Enter OpenStreetMap, the wiki map of the world, CrisisMappers and an impromptu community of volunteers who collaborated to produce the most authoritative map of Haiti in existence. Within hours of the event people were adding detail to the map, but on January 14th high resolution satellite imagery of Haiti was made freely available and the Crisis Mapping community were able to trace roads, damaged buildings, and enter camps of displaced people into OpenStreetMap. This is the story of OpenStreetMap - Project Haiti:

There are many who think the Open Source Data Center Initiative will not work. There were a lot of people who thought OpenStreetMap wouldn't work, either.
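If you want to see just how open the underlying data is, here is a minimal sketch of pulling OpenStreetMap data yourself. It is my own illustration, not something from the article; it assumes Python with the requests library and the public Overpass API endpoint, and the Berkeley-area bounding box is only an example tied to the drinking fountains mentioned in the Mercury News piece.

```python
# Sketch: query OpenStreetMap's Overpass API for drinking-fountain nodes in a
# small Berkeley-area bounding box. Endpoint and bbox are illustrative choices.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Overpass QL query: every node tagged amenity=drinking_water inside the
# bounding box (south, west, north, east).
query = """
[out:json][timeout:25];
node["amenity"="drinking_water"](37.85,-122.30,37.92,-122.22);
out body;
"""

response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
response.raise_for_status()

# Each element is a node contributed by a volunteer mapper.
for element in response.json().get("elements", []):
    print(element["lat"], element["lon"], element.get("tags", {}))
```

The same few lines work for benches, trails, or any other tag the volunteers have mapped, which is the whole point of the open model.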


Next Advisor for GreenM3 NPO, Peter Horan, pushing the edge of the network to be close to customers

Our first industry advisor was Mike Manos; our next is Peter Horan. Peter is unknown to most of the data center audience, as he is an executive who has worked on the edge of innovation, not in the hub of data center activity. Peter does have data center experience, though, as the Sr. VP for InterActive Media at the time of Ask.com's data center construction in Moses Lake, WA. Chuck Geiger was CTO of Ask.com at the time, and stated:

“Moses Lake is an ideal location due to its cooperative business environment, access to low cost, renewable power and superior network connectivity,” said Chuck Geiger, Chief Technology Officer of Ask.com. “With these inherent benefits, Eastern Washington is the right choice for Ask.com as we expand our computing infrastructure to support our growth and expanded search services.”

Peter has had the executive's view of building a large data center, yet he also has some very innovative, forward-thinking ideas and a powerful network. This brings up a presentation Peter made discussing the "Edge of the Network."

[Image: slide from Peter Horan's "Edge of the Network" presentation]

I've known Peter for many years, including his time as Sr. VP/Publisher of ComputerWorld and CEO of DEVX.com, about.com, and allbusiness.com; he was an obvious candidate for the GreenM3 NPO.


Here is a video where Peter presents his ideas on getting closer to customers. In the same way that Peter encourages his audience to get close to customers, the goal of GreenM3 is to build a closer connection to customers using open source techniques.

[Video: Peter Horan on getting closer to customers]

One person in Peter's network we want to talk to is Chuck Geiger.

Chuck Geiger
Partner - Technology

Chuck has significant experience running some of the largest online transaction product organizations and most visited sites in the world, including as CTO of Ask.com, CTO of PayPal, VP Architecture of eBay, and executive positions at InterActive Corp., Gateway and Travelocity.


At InterActive Corp, Chuck was responsible for managing a consolidated data center strategy for IAC portfolio companies including Ask.com, Evite.com, CitySearch.com, Ticketmaster, and Match.com. Chuck also was responsible for the technology organization at Ask.com including Engineering, Architecture, Program Management, QA, IT, and Operations.


At PayPal, Chuck was responsible for the product development organization which includes Product Management, Design, Engineering, Architecture, Operations, IT, QA, Project Management, Content, and Localization, running a team of approximately 550 professionals.
At eBay, Chuck was responsible for the migration to the new generation system architecture and platform.

BTW, Peter's day job is Chairman of Goodmail.

About Goodmail Systems

Goodmail Systems is the creator of CertifiedEmail™, the industry’s standard class of email. CertifiedEmail provides a safe and reliable means for consumers to easily identify authentic email messages from legitimate commercial and nonprofit email senders. Each CertifiedEmail is sent with a cryptographically secure token that assures authenticity and is marked in the inbox with a unique blue ribbon envelope icon, enabling consumers to visually distinguish email messages which are real and sent from email senders with whom they have a pre-existing relationship.

We welcome Peter's passion for technical innovation and the environment.


An alternative to Google hiring a Renewable Energy Systems Modeling Engineer

I am spending more time researching Low Carbon Data Center ideas, and I ran across Google's job posting for a Renewable Energy System Modeling Engineer.

The role: Renewable Energy System Modeling Engineer - Mountain View

RE<C will require development of new utility-scale energy production systems. But design iteration times for large-scale physical systems are notoriously slow and expensive. You will use your expertise in computer simulation and modeling to accelerate the design iteration time for renewable energy systems. You will build software tools and models of optical, mechanical, electrical, and financial systems to allow the team to rapidly answer questions and explore the design space of utility-scale energy systems. You will draw from your broad systems knowledge and your deep expertise in software-based simulation. You will choose the right modeling environment for each problem, from simple spreadsheets to time-based simulators to custom software models you create in high-level languages. The models you create will be important software projects unto themselves. You will follow Google's world-class software development methodologies as you create, test, and maintain these models. You will build rigorous testing frameworks to verify that your models produce correct results. You will collaborate with other engineers to frame the modeling problem and interpret the results.
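Purely as an illustration of the kind of time-based model the posting describes, here is a toy sketch in Python. Every number in it (array size, battery capacity, load, the clear-sky profile) is a made-up assumption of mine, not anything from Google or the role.

```python
# Toy hourly simulation of a solar array plus battery serving a flat load.
# All parameters are illustrative assumptions.
import math

SOLAR_PEAK_KW = 50.0   # assumed array size
BATTERY_KWH = 120.0    # assumed usable storage
LOAD_KW = 20.0         # assumed constant load

def solar_output_kw(hour: int) -> float:
    """Crude clear-sky profile: a half sine between 6:00 and 18:00."""
    if 6 <= hour <= 18:
        return SOLAR_PEAK_KW * math.sin(math.pi * (hour - 6) / 12)
    return 0.0

state_of_charge = BATTERY_KWH / 2  # start the day half full
unmet_kwh = 0.0

for hour in range(24):
    net_kw = solar_output_kw(hour) - LOAD_KW              # surplus (+) or deficit (-)
    if net_kw >= 0:
        state_of_charge = min(BATTERY_KWH, state_of_charge + net_kw)  # charge
    else:
        draw = min(state_of_charge, -net_kw)              # discharge what we can
        state_of_charge -= draw
        unmet_kwh += (-net_kw) - draw                     # load the system couldn't serve
    print(f"hour {hour:02d}: state of charge {state_of_charge:6.1f} kWh")

print(f"unmet load over the day: {unmet_kwh:.1f} kWh")
```

The real job is obviously about far richer optical, mechanical, electrical, and financial models, but even a sketch like this shows why a reusable, tested modeling framework beats redoing the arithmetic in a spreadsheet every time a design parameter changes.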

It's great that Google sees the need for this person, but I was curious whether anyone else has done renewable energy system modeling. Guess what, there is, since 1993 in fact. NREL has this page on HOMER.

New Distribution Process for NREL's HOMER Model

Note! HOMER is now distributed and supported by HOMER Energy (www.homerenergy.com)

To meet the renewable energy industry's system analysis and optimization needs, NREL started developing HOMER in 1993. Since then it has been downloaded free of charge by more than 30,000 individuals, corporations, NGOs, government agencies, and universities worldwide.

HOMER is a computer model that simplifies the task of evaluating design options for both off-grid and grid-connected power systems for remote, stand-alone, and distributed generation (DG) applications. HOMER's optimization and sensitivity analysis algorithms allow the user to evaluate the economic and technical feasibility of a large number of technology options and to account for uncertainty in technology costs, energy resource availability, and other variables. HOMER models both conventional and renewable energy technologies:

[Image: the conventional and renewable energy technologies HOMER models]
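To give a feel for what HOMER-style sensitivity analysis means in practice, here is a rough sketch of my own. It is not HOMER and uses no NREL data; it just sweeps one uncertain input, a hypothetical diesel fuel price, and compares the approximate levelized cost of energy of two made-up design options.

```python
# Not HOMER: a toy sensitivity sweep over fuel price for two hypothetical
# off-grid designs. All costs and efficiencies are illustrative assumptions.
ANNUAL_LOAD_KWH = 100_000

def lcoe_diesel_only(fuel_price_per_litre: float) -> float:
    """Capital-light generator; cost dominated by fuel (assume 3.5 kWh/litre)."""
    annualized_capex = 2_000
    fuel_cost = (ANNUAL_LOAD_KWH / 3.5) * fuel_price_per_litre
    return (annualized_capex + fuel_cost) / ANNUAL_LOAD_KWH

def lcoe_pv_battery_hybrid(fuel_price_per_litre: float) -> float:
    """Capital-heavy PV + battery, with a generator covering 20% of the load."""
    annualized_capex = 18_000
    fuel_cost = (0.2 * ANNUAL_LOAD_KWH / 3.5) * fuel_price_per_litre
    return (annualized_capex + fuel_cost) / ANNUAL_LOAD_KWH

for price in (0.50, 0.75, 1.00, 1.25, 1.50):  # fuel price scenarios, $/litre
    d = lcoe_diesel_only(price)
    h = lcoe_pv_battery_hybrid(price)
    winner = "hybrid" if h < d else "diesel"
    print(f"fuel ${price:.2f}/L  diesel {d:.3f} $/kWh  hybrid {h:.3f} $/kWh  -> {winner}")
```

The point of this kind of sweep, which HOMER does across many more technologies and uncertain inputs at once, is that the best design flips as the assumptions move, so you want the comparison automated rather than baked into a single spreadsheet scenario.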

I signed up for the HOMER Energy site, which has 510 users, none of them apparently Google engineers.

[Image: screenshot from the HOMER Energy site]

I hope to make contact with the HOMER Energy team, as we are trying to put together a session on the Low Carbon Data Center at DataCenterDynamics Seattle.

Maybe Google doesn't have to hire the Renewable Energy System Modeling Engineer after all.  :-)
