    Thursday, Mar 11, 2010

    Google Warehouse-Scale Computing pattern harvested: solving current and future performance problems

    The Open Source Data Center Initiative is using a pattern-based approach.

    In software engineering, a design pattern is a general reusable solution to a commonly occurring problem in software design. A design pattern is not a finished design that can be transformed directly into code. It is a description or template for how to solve a problem that can be used in many different situations.
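
    As a quick, hypothetical illustration of "template, not finished code" (the names and numbers below are mine, not from any source in this post), here is a minimal Strategy-pattern sketch in Python:

        from abc import ABC, abstractmethod

        # Strategy pattern sketch: the pattern fixes the roles (a context that delegates
        # to an interchangeable strategy); the concrete classes are illustrative fill-ins.
        class CoolingStrategy(ABC):
            @abstractmethod
            def heat_removed(self, it_load_watts: float) -> float:
                ...

        class AirCooling(CoolingStrategy):
            def heat_removed(self, it_load_watts: float) -> float:
                return it_load_watts * 0.90   # made-up figure, illustration only

        class LiquidCooling(CoolingStrategy):
            def heat_removed(self, it_load_watts: float) -> float:
                return it_load_watts * 0.98   # made-up figure, illustration only

        class Rack:
            """The reusable part is this structure, not any particular class."""
            def __init__(self, cooling: CoolingStrategy):
                self.cooling = cooling

            def heat_removed(self, it_load_watts: float) -> float:
                return self.cooling.heat_removed(it_load_watts)

        print(Rack(AirCooling()).heat_removed(10_000))     # 9000.0
        print(Rack(LiquidCooling()).heat_removed(10_000))  # 9800.0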

    I was reading Google's Warehouse-Scale Computing document, which can be daunting with its 120 pages of dense topics.  One of the points it makes, which is an example of a design pattern, applies under the following conditions.

    Key pieces of Google’s services have release cycles on the order of a couple of weeks compared to months or years for desktop software products. Google’s front-end Web server binaries, for example, are released on a weekly cycle, with nearly a thousand independent code changes checked in by hundreds of developers— the core of Google’s search services has been reimplemented nearly from scratch every 2 to 3 years.

    This may not sound like your environment, but it is common in agile, dynamic SW development at Google, start-ups, and other leading-edge IT shops.

    Agile methods generally promote a disciplined project management process that encourages frequent inspection and adaptation, a leadership philosophy that encourages teamwork, self-organization and accountability, a set of engineering best practices intended to allow for rapid delivery of high-quality software, and a business approach that aligns development with customer needs and company goals.

    The old way of purchasing IT hardware to support an application's SLA is now a lower priority.  The new way is to add hardware capabilities that support rapid innovation in SW development.

    A beneficial side effect of this aggressive software deployment environment is that hardware architects are not necessarily burdened with having to provide good performance for immutable pieces of code. Instead, architects can consider the possibility of significant software rewrites to take advantage of new hardware capabilities or devices.

    BTW, here is what immutable means in SW, a property that applies to many legacy systems:

    In object-oriented and functional programming, an immutable object is an object whose state cannot be modified after it is created. This is in contrast to a mutable object, which can be modified after it is created.
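
    A short sketch in Python (illustrative only, not taken from the definition quoted above) makes the distinction concrete:

        from dataclasses import dataclass, FrozenInstanceError

        @dataclass(frozen=True)           # frozen=True makes instances immutable
        class ImmutablePoint:
            x: int
            y: int

        @dataclass                        # a regular dataclass instance is mutable
        class MutablePoint:
            x: int
            y: int

        q = MutablePoint(1, 2)
        q.x = 5                           # fine: a mutable object can change after creation

        p = ImmutablePoint(1, 2)
        try:
            p.x = 5                       # rejected: state cannot change after creation
        except FrozenInstanceError:
            print("immutable: modification refused")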

    Problem: How do you improve performance per watt in IT efficiency with data center infrastructure and hardware?

    Options:

    1. Improve data center efficiency, aka PUE.
    2. Buy more efficient IT HW.
    3. Improve HW utilization with virtualization and server consolidation.
    4. Add new hardware capabilities that support the future of software.

    Solution: even though options 1 - 3 are typical, the efficiencies from #4 could be sizably larger.  Some part of the data center and IT hardware should be designed for future applications rather than making future applications run on what past applications required. A rough back-of-the-envelope comparison is sketched below.
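
    Here is that back-of-the-envelope comparison; every number in it is an assumption of mine for illustration, not data from the Google paper:

        # Rough performance-per-watt gains for the four options above.
        # All figures are illustrative assumptions, not measurements.

        pue_gain           = 2.0 / 1.5     # option 1: facility PUE improved from 2.0 to 1.5 (~1.3x)
        hw_refresh_gain    = 1.5           # option 2: assume ~1.5x from a more efficient HW generation
        consolidation_gain = 0.45 / 0.15   # option 3: utilization raised from 15% to 45% via virtualization (~3x)
        rearchitect_gain   = 10.0          # option 4: assume ~10x for workloads rewritten for GPUs, flash, many-core

        options = [
            ("1. Improve PUE",      pue_gain),
            ("2. Efficient IT HW",  hw_refresh_gain),
            ("3. Consolidation",    consolidation_gain),
            ("4. Re-architecture",  rearchitect_gain),
        ]
        for name, gain in options:
            print(f"{name}: {gain:.1f}x performance per watt")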

    Examples of such technologies are NVIDIA's GPUs, solid-state memory, start-ups with new hardware designs like www.tilera.com, and complete re-architectures of the data center system.

    People are working on complete re-architectures of the data center system because the performance-per-watt gains are huge.

    How many data centers are designed for the current hardware vs the future? 50%, 75%, 90%, 95%, 98%

    Should data centers be designed for a 5-year lifespan rather than 20 - 30 years, to support more rapid innovation, and then be upgradable?


    Wednesday, Mar 10, 2010

    Navy SEALs adopt openness and transparency to improve the yield of candidates

    Many people question the approach of being open and transparent for our Open Data Center Initiative. One group that has been known for its secrecy, but has shifted to openness and transparency, is the Navy SEAL program.

    Here is a video by MSNBC.


    Here is the newly launched Navy SEAL web site, with Twitter, Facebook, YouTube, and forums to improve the social networking connection.


    And here is a written interview with the Navy SEAL command.

    "The new site was created to improve the information that is available online about Naval Special Warfare careers so that young men who might be interested in joining have as much information as possible to help them make informed choices and prepare," Navy SEAL Cpt. Adam Curtis, director of Naval Special Warfare Recruiting, told OhMyGov. "We want young men in our target audience to interact with us. Social networking sites provide a platform for that interaction. We can post information on Facebook, and our audience can let us know, in real time, what they think of it."

    Another benefit to social networking is the ability to ask questions and hear thoughts from not only recruiters, but people going through the same decision making process. This interaction keeps people "engaged" as Cpt. Curtis said, helping those interested to decide better if a career as a Navy SEAL is for them.

    Also included on the site is a PDF attachment of the SEALs Physical Training Guide, including nutrition information and strength guides for swimming, running, and injury prevention. Plus, you can try the SEALs PST calculator to see if you could really hack it through basic training.

    All of these Web features are designed to help young men understand more about the SEALs program, and how to prepare for the training better than ever before without knowing a SEAL personally.

    I am sure there are many ex-SEALs who question the new approach, but the Navy SEAL command recognized it needed to grow faster. The old way was too slow.


    Wednesday, Mar 10, 2010

    400 megawatt smorgasbord meal, 2,500 MW from nuclear, coal, and hydro plus renewable sources – all you can eat starting at $0.035/kWh

    The Ewing Industrial Park in Columbia, MO has a unique power capability few can match.  When I first visited the site, they said they had 80 megawatts of power. After seeing all the high-voltage transmission lines (one 345 kV and multiple 161 kV and 69 kV lines), I figured they must be able to get more power.  I told them to go back to all the sources and find out how much they could get with the transmission infrastructure. Why isn’t 80 megawatts enough for a data center?  Because, if we want people to understand the available power infrastructure, we need a bigger number.  A week later, they said they could get 400 megawatts.

    How can you get 400 megawatts?  Here is a summary of the power sources available.

    In summary, Ewing Business Park is within 50 miles of about 2,500 MW of redundant power generation capacity, including Thomas Hill coal (1,153 MW), Ameren Callaway nuclear at Reform, Missouri (1,159 MW), Ameren hydro at Bagnell Dam (215 MW), City of Columbia coal/biomass (39 MW), City of Columbia/Ameren natural gas (140 MW), and Associated natural gas (60 MW).  The Ewing site has numerous redundant feeds and suppliers to this power supply.
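
    A quick tally of the capacities listed above (a sketch using only the figures from the summary) puts the 400 MW request in context:

        # Generation capacity near Ewing Business Park, in MW, as listed in the summary.
        nearby_generation_mw = {
            "Thomas Hill (coal)":               1153,
            "Ameren Callaway (nuclear)":        1159,
            "Bagnell Dam (hydro)":               215,
            "City of Columbia coal/biomass":      39,
            "Columbia/Ameren natural gas":       140,
            "Associated natural gas (Chamois)":   60,
        }

        total_mw = sum(nearby_generation_mw.values())
        requested_mw = 400

        print(f"Nearby generation: {total_mw} MW")
        print(f"Requested load: {requested_mw} MW "
              f"({requested_mw / total_mw:.0%} of nearby capacity)")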

    Thomas Hill Coal power.

    Thomas Hill Energy Center key to providing low-cost energy

    Thomas Hill Power Plant

    Plant statistics

    Unit 1 - 1966 General Electric turbine
    Net capacity of 180 MW
    Coal burn rate of 2,325 tons/day

    Unit 2 - 1969 Westinghouse turbine
    Net capacity of 303 MW
    Coal burn rate of 3,476 tons/day

    Unit 3 - 1982 Westinghouse turbine
    Net capacity of 670 MW
    Coal burn rate of 8,660 tons/day

    The Thomas Hill Energy Center comprises three electrical generating units, built from 1966 to 1982 and totaling 1,153 megawatts, and a coal mine that is actively being reclaimed after closing in 1993.

    AECI employs about 260 people at the Thomas Hill Energy Center, which has received national recognition for its efficiency and successful conversion to low-sulfur coal that significantly reduced sulfur dioxide emissions.

    AECI also will achieve a system wide nitrogen oxides emission rate reduction of nearly 90 percent with completion in December 2008 of its $424 million environmental controls project at Thomas Hill to meet the Clean Air Interstate Rule.

    Ameren Callaway Nuclear

    Plant Profile

    Location

    The plant is located 10 miles southeast of Fulton, Missouri, in Callaway County; 25 miles northeast of Jefferson City, Missouri; 40 miles southeast of Columbia, Missouri; 100 miles west of St. Louis, Missouri; and 120 miles east of Kansas City, Missouri.

    Plant Design

    Standardized Nuclear Unit Power Plant System (SNUPPS), using a Westinghouse four-loop pressurized water reactor and a General Electric turbine-generator.

    Generating Capacity

    1,190 megawatts (net)

    Bagnell Dam


    Bagnell Dam impounds the Osage River in the U.S. state of Missouri, creating the Lake of the Ozarks. The 148-foot (45 m) tall concrete gravity dam was built by the Union Electric Company (now AmerenUE) for the purpose of hydroelectric power generation as its Osage Powerplant. It is 2,543 feet (775 m) long, including a 520-foot (160 m) long spillway and a 511-foot (156 m) long power station. The facility with eight generators has a maximum capacity of 215 megawatts.

    Here is more information about the site, provided by the Ewing Industrial Park engineering team.

    The background on the availability is actually quite simple. Ewing Business Park is served electrically by the City of Columbia, Missouri. The city is a member of MISO, the Midwest Independent Transmission System Operator, and has purchase/supply transmission agreements with Associated Electric and Ameren. The city also generates some of its own power. Currently, Ewing Business Park is bisected by a 345 kV line and served by numerous 161 kV and 69 kV lines. Ewing Business Park is directly adjacent to a large city-owned regional substation called the Bolstad Substation. The city has indicated that Bolstad could serve 100 MW to Ewing Park immediately with its own infrastructure and purchase arrangements, and up to 200 MW if planned. There are four other regional substations owned by the city and Central Electric, ranging from ½ mile to 4 miles away. Some of these substations are tapped into Ameren feeders.

    The Bolstad Substation is directly adjacent to a 140 MW natural-gas-fired power plant (1), referred to as the Columbia Energy Center, or CEC. It is currently operated as a peaking plant that can fire up to 90% capacity quickly. The plant is owned by the City of Columbia and Ameren Energy, and the city has recently taken steps to acquire the remaining ownership. The city also has a 39 MW coal/wood biomass-fired plant about 5 miles away. (2)

    Associated Electric has a large coal-fired power plant, a 1,153 MW facility, just 40 miles away. The Bolstad connection to this power plant is a direct 161 kV transmission line with no other taps. This line is owned 50% by the City of Columbia and 50% by Central Electric (the wholesale transmission provider for Associated). Todd Culley with Boone Electric and Ralph Schulte with Central Electric stated that Associated can serve “200 MW without a phone call to the City of Columbia Ewing site.” They said they could easily provide 400 MW with some notice. (3)

    Let me further explain

    In addition to this transmission line directly from Thomas Hill, there is another redundant 161 kV line that comes from the Kingdom City Substation 16 miles away, which is fed directly by the Thomas Hill 345 kV line. Beyond these two large independent transmission feeders, Thomas Hill has another independent 69 kV transmission line from the power plant that serves the city and Ewing from the large Prathersville Substation, 2 miles from Ewing. (4)  Associated has a natural-gas-fired power plant called the Chamois Plant (60 MW) about 40 miles away. It feeds Columbia by way of one 161 kV line and two 69 kV lines. All but one of these lines land at the Central/Columbia Boone Substation on the south side of Columbia. This substation is about 12 miles from Ewing, but the interesting thing is that the city has a 161 kV line and a 69 kV line that both run around the east side of town and come to the Bolstad Substation directly from this main transmission tap. (5) From the Chamois Plant, the 161 kV line reaches the same Boone Substation by an independent pathway from the east.

    Ameren UE has the 345 kV line that bisects Ewing.  It does not have a substation off of it at Ewing, but it lands on the west side of Columbia at the Overton Substation about 19 miles away.  There are 161 kV and 69 kV lines that then extend to Bolstad and are considered independent feeders.  Ameren would not state publicly its capacity to serve from this line, but it did say it could easily serve 200 MW from the 345 kV line. (6)  Ameren has the Bagnell Dam hydroelectric power plant 50 miles away, rated at 215 MW.  The main services from this plant are through Associated’s 69 kV line and Ameren’s 161 kV line, which also goes to the Overton Substation. (7) Ameren has a nuclear power plant, rated at 1,159 MW, 30 miles away at Fulton, MO.  Bolstad serves the Fulton area by a direct 69 kV line. (8)


    Wednesday, Mar 10, 2010

    Who is Monitoring Greenhouse Gases in the atmosphere? Top scientific minds or cash-strapped, well-intentioned individuals?

    Here is something that will leave you thinking.  Who and what measures and monitors the greenhouse gases in the atmosphere?

    The Orbiting Carbon Observatory (OCO) satellite developed by NASA/JPL was supposed to do this, but it crashed after launch on Feb 24, 2009.

    Scientists to NASA: We Need A Reliable Way to Track Global Emissions - 07.31.2009

    By Keith Johnson

    Forget all the haggling with China, India, and parts of the U.S. Congress—the real obstacle to a global climate-change treaty might be accurately measuring greenhouse-gas emissions in the first place.

    That’s the warning from the National Academy of Science’s National Research Council to the head of NASA. The upshot? Without a sophisticated satellite that can track global emissions, it will be hard to know what everybody is really up to: “[C]urrent methods for estimating greenhouse gas emissions have limitations for monitoring a climate treaty.”

    NASA had such a sophisticated satellite—the Orbiting Carbon Observatory—which failed to reach orbit in February. The space agency is considering trying again—thus the letter from the NAS pointing out just how useful such satellites can be.

    The monitoring instrument on OCO was simple.

    The satellite carried a single instrument that would have taken the most precise measurements of atmospheric carbon dioxide ever made from space. The instrument consisted of three parallel, high-resolution spectrometers, integrated into a common structure and fed by a common telescope. The spectrometers would have made simultaneous measurements of the carbon dioxide and molecular oxygen absorption of sunlight reflected off the same location on Earth’s surface when viewed in the near-infrared part of the electromagnetic spectrum, invisible to the human eye.

    Here is a video that gives you background on the OCO satellite

    The Economist discusses the issue of monitoring greenhouse gases at length.

    Monitoring greenhouse gases

    Highs and lows

    You might think that measuring the levels of greenhouse gases in the atmosphere would be a priority. If you did think that, though, you would be wrong

    Mar 4th 2010 | From The Economist print edition

    IN NEGOTIATIONS on nuclear weapons the preferred stance is “Trust but verify”. In negotiations on climate change there seems little opportunity for either. Trust, as anyone who attended last year’s summit in Copenhagen can attest, is in the shortest of supplies. So, too, is verification.

    Barack Obama was asked when he was in Copenhagen whether a provision by which countries could peek into each others’ assessment processes was strong enough to be sure there was no cheating. He answered reassuringly that “we can actually monitor a lot of what takes place through satellite imagery”. That statement conjured up thoughts of the sort of cold-war satellite system that America used to identify and count Russian missiles. But the president was being a bit previous; at the moment, no such system exists, because America’s Orbiting Carbon Observatory (OCO), a satellite that would have fulfilled the role, was lost on launch this time last year. The purpose of OCO was to work out the fate of carbon dioxide that is emitted by industrial processes but does not then stay in the atmosphere—about 60% of the total.

    The Economist author points out the problem with the system.

    America is planning to build a new OCO. In the meantime, however, a small group of scientists labours away on Earth, doing its best to monitor emissions at ground level. At the end of February a number of these researchers met at the Royal Society in London, to discuss what they were up to.

    Measuring gas levels day in, day out can look a little humdrum to outsiders, including those who hold the purse strings. They tend to prefer scientists to experiment and test hypotheses, not just tally things. But that attitude galls the greenhouse-gas measurers, and not only because it denies them money. It also ignores the fact that careful measurement is a way of discovering new things, not just of checking the status quo. Monitoring is not just a necessary handmaiden of science—it is the real thing.

    And, what people do in the short term.

    Indeed, for all the noise that is made about climate change, much of this research is done with next to no money. Asked how she paid for her monitoring of various greenhouse gases in Baden Württemberg, Ingeborg Levin of Heidelberg University replied “by stealing”—meaning not that she robs banks, but that the monitoring work is cross-subsidised by grants intended for other studies.

    How broken is the discussion on GHG when there is no worldwide GHG monitoring system?

    Let's hope the NASA budget gets approved for OCO 2.

    Proposed reflight

    Three days after the failed February 2009 launch, the OCO science team sent NASA headquarters a proposal to build and launch an OCO "carbon copy", with the replacement satellite to be launched by late 2011.[16] On February 1, 2010, the FY 2011 NASA budget request included US$170 million for NASA to develop and fly a replacement for the Orbiting Carbon Observatory.[17]


    Tuesday, Mar 9, 2010

    Who will be the winners of Mobile Computing?

    Don Dodge, let go from Microsoft and currently a Google employee, blogs on the platform shift to mobile.

    MARCH 04, 2010

    Platform shifts Mainframe to Mini to PC to Mobile. Why leaders fail to make the shift

    Platform shifts happen every decade or so in computing. The leaders of the previous generation are rarely successful in dominating the next generation platform. IBM dominated the mainframe business. They didn’t lose their dominance because another company built a better mainframe. They lost it because the market shifted to a new platform…Mini computers. Digital Equipment, Data General, and a few others dominated that market. Another platform shift is happening today, from PCs to Mobile devices, and another industry leader will be left behind. John Herlihy of Google Europe says “In three years time desktops will be irrelevant”

    The issues from The Innovator's Dilemma are referenced.

    Why do leaders fail to adapt? The Innovator's Dilemma, made famous by Clayton Christensen, clearly explains why market leaders fail to make the leap. Innovation usually happens at the low end of the market where the products are simple, prices are low, margins thin, and the market totally undefined. The industry leaders have great margins, high prices, and customers who want more features and are willing to pay for them. The industry leaders always move up market and leave the new emerging market to smaller innovators. The process usually follows these 6 steps:

    1. The disruptive technology is discovered, often by the market leading company.
    2. Marketing people seek reactions from customers and industry analysts.
    3. Established companies decide it is a better strategy to speed up the pace of sustaining technical advancement in their own product rather than go down market with the disruptive technology.
    4. Start-ups learn about the disruptive technology and see opportunity. They keep their cost structure low, build the technology, and find new markets through trial and error.
    5. The start-ups get some initial success and then move up market and eat away customers from the market leading company.
    6. The market-leading company finally jumps on the bandwagon reluctantly with a half-hearted attempt and fails. It is too late.

    Morgan Stanley's Mary Meeker is referenced.

    Platform shifts have 10X the number of devices and users. The move to Mobile is big and fast. Mary Meeker of Morgan Stanley says Mobile Internet usage is bigger than most people think, and it is exploding. Every platform shift has 10X the number of devices and users. There were about 1M mainframes, 10M mini-computers, 100M PCs, and 1 Billion cell phones. The next wave of mobile devices will be over 10B.

    10x platform shifts

    Mary Meeker's report has lots of good information in it.

    Note the fast growth of Mobile Internet compared to other technologies


    And she makes a point I have seen few others make, which is spot on: the growth of real-time wireless sensors.


    I was talking to a Google developer, brainstorming some mobile scenarios, and he laughed when I was going three steps beyond his ideas.  When we both worked at Microsoft, we were talking about GPS data with photos in 2001, and people thought we were crazy - "that's too expensive, and what would you do with GPS coordinates?"  Make money!!!!


    Google is all over this scenario.


    What is the new publishing and distribution network for Mobile?


    The mobile phone's calling capabilities are less important, and some young users see making phone calls as something for people who aren't with it.

