    Friday, Feb 05, 2010

    Data Center as a Thermodynamic System

    Thanks to a conversation with Stewart Young from OSIsoft, where we were bouncing around different ideas, and the latest Green Grid meeting, it hit me: as much as The Green Grid and the folks there discuss power and cooling systems, is there a different way to think about how to green the data center?

    By its nature, The Green Grid, with its vendor members, will organize itself around the things that sell the vendors' products.  PUE drives energy efficiency benchmarking, which gets people to upgrade equipment.  Energy-efficient servers do the same.

    So, let’s take a different, holistic view of the green data center problem: looking at the data center as a thermodynamic system.

    [Figure: System boundary.svg, a thermodynamic system separated from its surroundings by a boundary]

    In thermodynamics, a thermodynamic system, originally called a working substance, is defined as that part of the universe that is under consideration. Anything under consideration is called a system. A hypothetical boundary separates the system from the rest of the universe, which is referred to as the environment, surroundings, or reservoir.

    If the data center is the system, and the boundary is the set of physical connections to the surroundings, does this model work for data centers?

    A useful classification of thermodynamic systems is based on the nature of the boundary and the quantities flowing through it, such as matter, energy, work, heat, and entropy. A system can be anything, for example a piston, a solution in a test tube, a living organism, an electrical circuit, a planet, etc.

    Data Centers are closed systems.

    Closed systems are able to exchange energy (heat and work) but not matter with their environment. A greenhouse is an example of a closed system exchanging heat but not work with its environment. Whether a system exchanges heat, work or both is usually thought of as a property of its boundary.

    Data centers are actually composed of a bunch of thermodynamic systems, like:


    Psychrometry

    Psychrometry is the study of air and water vapor mixtures for air conditioning. For this application, air is taken to be a mixture of nitrogen and oxygen with the other gases being small enough so that they can be approximated by more of nitrogen and oxygen without much error. In this psychrometry section, vapor refers to water vapor. For air at normal (atmospheric) pressure, the saturation pressure of vapor is very low. Also, air is far away from its critical point in those conditions. Thus, the air vapor mixture behaves as an ideal gas mixture. If the partial pressure of the vapor is smaller than the saturation pressure for water for that temperature, the mixture is called unsaturated. The amount of moisture in the air vapor mixture is quantified by its humidity.
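The quantities in the passage above fit together in a short sketch. The Tetens correlation used for saturation pressure and the example numbers are illustrative assumptions; the 0.622 factor is the ratio of the molar masses of water and dry air in the ideal-gas mixture model.

```python
import math

def saturation_pressure_kpa(temp_c: float) -> float:
    """Approximate saturation pressure of water vapor (Tetens correlation,
    an illustrative assumption, reasonable roughly from 0 to 50 C)."""
    return 0.61078 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def relative_humidity(vapor_pressure_kpa: float, temp_c: float) -> float:
    """Ratio of the vapor's partial pressure to its saturation pressure.
    Below 1.0, the mixture is unsaturated, as the passage describes."""
    return vapor_pressure_kpa / saturation_pressure_kpa(temp_c)

def humidity_ratio(vapor_pressure_kpa: float,
                   total_pressure_kpa: float = 101.325) -> float:
    """kg of water vapor per kg of dry air, ideal-gas mixture model."""
    return 0.622 * vapor_pressure_kpa / (total_pressure_kpa - vapor_pressure_kpa)

# At 25 C the saturation pressure is only about 3.17 kPa, i.e. very low
# compared with atmospheric pressure, which is why the ideal-gas
# approximation works well for air conditioning calculations.
```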

    Diesel Cycle

    The Diesel cycle is the idealized cycle for compression ignition engines (ones that don't use a spark plug). The difference between the Diesel cycle and the Otto cycle is that heat is supplied at constant pressure.

    1. Heat is supplied reversibly at constant pressure in 1-2.
    2. Reversible adiabatic expansion during which work is done in 2-3.
    3. Heat is rejected reversibly at constant volume in 3-4.
    4. Gas is compressed reversibly and adiabatically in 4-1.
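The four steps above yield the standard air-standard efficiency of the ideal Diesel cycle. A minimal sketch, where `r` is the compression ratio, `rc` the cutoff ratio, and `gamma` the heat-capacity ratio of air (the names are mine, not from the post):

```python
def diesel_efficiency(r: float, rc: float, gamma: float = 1.4) -> float:
    """Air-standard Diesel cycle efficiency:
    eta = 1 - (1 / r**(gamma - 1)) * (rc**gamma - 1) / (gamma * (rc - 1))
    """
    return 1.0 - (1.0 / r ** (gamma - 1.0)) * (rc ** gamma - 1.0) / (gamma * (rc - 1.0))

def otto_efficiency(r: float, gamma: float = 1.4) -> float:
    """Otto cycle (constant-volume heat addition), for comparison."""
    return 1.0 - 1.0 / r ** (gamma - 1.0)
```

At the same compression ratio the Diesel cycle comes out less efficient than the Otto cycle, but compression-ignition engines can run much higher compression ratios in practice, which is where their efficiency advantage comes from.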

    An example of thermodynamic systems in data centers is the Syracuse University Data Center.  When you watch the video, it discusses many of the ideas you would use if you looked at the data center as a thermodynamic system.

    Maybe we need more thermodynamic engineers working on data centers?  How many data center design firms take a thermodynamic system approach?  Low PUE is the current topic, but what happens if you do as IBM did, bringing power generation on site, and change the boundary of the system?
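The boundary question can be made concrete with PUE itself, which is just total facility power over IT power. The numbers below are hypothetical, purely to show how moving the system boundary (for example, on-site generation whose waste heat helps drive cooling) changes the metric:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT load power."""
    return total_facility_kw / it_kw

# Hypothetical facility: 1,500 kW at the utility meter for a 1,000 kW IT load.
baseline = pue(1500.0, 1000.0)  # 1.5

# Move the boundary: if on-site generation's waste heat drives an
# absorption chiller, less cooling power crosses the facility boundary
# for the same IT load (again, hypothetical numbers).
with_onsite_generation = pue(1200.0, 1000.0)  # 1.2
```

The IT load did not change; only what flows across the boundary did, which is exactly the thermodynamic-system way of looking at it.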


    Friday, Feb 05, 2010

    Owning the Network is commoditized, as Apple & Google’s mobile browser devices with data centers drive customers

    Newsweek has an article on Apple and Google’s move to wrestle control away from wireless carriers, which commoditizes the network.  Verizon started the battle, touting its 3G network with videos like this.

    AT&T fires back with Apple’s help.

    But, how many people really like being locked into one carrier for 2 years?  Did all this advertising really work, or did it just piss off people more, since they knew their networks didn’t work as well as advertised?  Apple and Google are changing the game by having users buy devices, with the network a lower priority.  How many users who bought iPhones really wanted to be on the AT&T network?  We’ll see if Google’s strategy to be on all the networks works vs. Apple’s.

    Buh-Bye, Wireless Guys

    How Silicon Valley conquered the carriers.

    TECHNOLOGY
    Handheld History

    Remember life without your iPhone, Blackberry or Treo? From the Apple Newton to the newest Palm Pre, here's a look at the evolution of the personal digital assistant.

    The Best of Apple's Innovations

    By Daniel Lyons | NEWSWEEK

    Published Feb 4, 2010

    From the magazine issue dated Feb 15, 2010

    I like to imagine that it happened this way: One day the computer guys in Silicon Valley looked over at the mobile-phone industry and realized those carriers have figured out the ultimate racket. They sell you a phone, lock you into a two-year contract, and anything you want to buy for the phone—accessories, ringtones, games—you have to buy from them. They control the whole thing, from top to bottom, and instead of getting a one-time sale, they get a recurring revenue stream. "Wow!" the computer guys said. "Why aren't we doing that? "

    Well, now they are. Slowly but surely, companies like Apple and Google are wresting control away from the mobile carriers. Instead of a world where the companies that make the phones are just dumb hardware makers—silent partners who never get to touch the customer—Google and Apple are using the transition to smart phones as a way to flip the mobile-phone business model on its head.

    But, as the article mentions, this just shifts customer lock-in from the wireless carrier to the device maker.

    Eventually, this means that we'll all be able to buy a phone and run it on any network we want, which is what we should have been able to do all along. There's a risk, however, that we're fleeing one cage only to run straight into another, and the only thing that will change is the name of our jailer.

    Part of the reason Apple and Google have been able to do this is how well their devices work with the data centers the companies own, providing the richer experience that consumers want.

    RIM made the same mistake Windows Mobile did in having a browser that sucked.  Windows Mobile 7 will fix this problem, and RIM is joining as well.

    The latest info we’ve heard has the browser being completely re-developed from the ground up and based on Webkit — a far cry from the POS Java relic BB’s currently run. RIM is gunning to take it even further than “just a webkit” browser however. Previously leaked documents and other claims from various sources have RIM tightly integrating their BIS/BES services and server side technology into the mix, which (as any Opera Mini users should know) greatly speeds up browsing, improves rendering accuracy, and manages to drastically cut down on bandwidth.


    Thursday, Feb 04, 2010

    Are Cloud Computing Data Centers Green? IBM announces its greenest Cloud Computing DC in North Carolina

    I’ve been writing about cloud computing more, as cloud computing is more efficient, using fewer resources.  Here is IBM’s latest press release, which demonstrates that cloud computing is green.

    The data center uses advanced software virtualization technologies that enable access to information and services from any device with extremely high levels of availability and quality of experience.  The facility aggressively conserves energy resources; saving cost and speeding services deployment through a smart management approach that links equipment, building systems and data center operations.

    “I thank IBM for its continued commitment to North Carolina. This facility promises to be one of IBM's greenest data centers in the world, proving once again that green is gold for North Carolina,” Gov. Bev Perdue said. “Growing North Carolina’s green economy plays a critical role in my mission to create jobs and to ensure our state’s economy is poised to be globally competitive in the long term.”

    As with the ideas I’ve discussed about working with the University of Missouri, IBM has taken the same approach working with North Carolina universities.

    The data center is showcasing a cloud computing solution in partnership with North Carolina Central University (NCCU) and NC State University that enables Hillside New Tech High School students in Durham, NC to access educational materials and software applications for the classroom over the Internet from the high school’s computer lab, as well as from any networked device.  This means that the learning environment can be extended to nearly any place at any time without the restrictions many schools face such as limited support, hardware resources and lack of access. The Hillside outreach project with NCCU, using cloud computing as a vehicle in support of education, is one of several such K-12 projects that IBM supports.  The new data center also currently hosts IBM’s global web site, ibm.com, and the IT operations of strategic outsourcing clients such as the United States Golf Association (USGA).

    The green features are listed here.


    • Smarter data center management: Thousands of sensors, connecting IT equipment, data center and building automation systems, provide data that can be analyzed to plan future capacity, conserve energy and maintain operations in the event of a power outage.
    • Energy efficiency: The data center incurs half the energy cost to operate compared to data centers of similar size by taking advantage of free cooling – using the outside air to cool the data center.  Intelligent systems use sensors to continuously read temperature and relative humidity throughout the data center and dynamically adjust cooling in response to changes in demand.
    • Cloud computing capability:  Support for cloud computing workloads allows clients to use only the resources necessary to support their IT operations at any given moment - eliminating the need for up to 70 percent of the hardware resources that might have been previously needed to perform the same task. The data center also hosts recently announced “Smart Business” cloud computing offerings - each of these solutions can significantly reduce a client’s total cost of ownership by up to 40 percent.
    • Built for expansion: Due to an innovative modular design method, IBM will be able to add significant future capacity in nearly half the time it would take traditional data centers to expand.  This design/build method – called IBM Enterprise Modular Data Center  (IBM EMDC) – also enables IBM to rapidly scale capacity to meet demand by adding future space, power, and cooling to the data center with no disruption to existing operations.  This means up to 40 percent of capital costs and up to 50 percent of operational costs may be deferred until client demand necessitates expansion.  The new data center can also quickly and seamlessly expand its power and cooling capacity.
    • New building standards: IBM started building the data center in August 2008 and it began to support client operations within 15 months compared to the industry benchmark of 18-24 months.
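The free-cooling bullet above amounts to a mode decision made from sensor readings. A minimal sketch of such a controller follows; the setpoint, deadband, and humidity limit are illustrative assumptions of mine, not IBM's actual values:

```python
def cooling_mode(outside_temp_c: float, outside_rh_pct: float,
                 supply_setpoint_c: float = 24.0) -> str:
    """Pick a cooling mode from outside-air sensor readings."""
    if outside_temp_c <= supply_setpoint_c - 2.0 and outside_rh_pct <= 80.0:
        return "free-cooling"        # outside air alone holds the setpoint
    if outside_temp_c <= supply_setpoint_c:
        return "partial-economizer"  # blend outside air with mechanical cooling
    return "mechanical"              # too warm outside: chillers only
```

A real building automation system would also weigh dew point and enthalpy, but the shape of the loop is the same: read the sensors, then pick the cheapest mode that still holds the setpoint.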

    In constructing the new data center, IBM renovated an existing building on its Research Triangle Park campus by reusing 95 percent of the original building's shell, recycling 90 percent of the materials from the original building and ensuring that 20 percent of newly purchased material came from recycled products.  The result lowered costs and reduced the carbon footprint associated with construction by nearly 50 percent, allowing IBM to apply for Leadership in Energy and Environmental Design (LEED) Gold certification. LEED is a third-party certification program and the nationally accepted benchmark for the design, construction and operation of high performance green buildings.


    Thursday, Feb 04, 2010

    Air Force and IBM partner to prove Cloud Computing works for Defense and Intelligence services

    One of the top concerns about Cloud Computing is security of the data in the cloud.  IBM has a press announcement on the partnership here.

    U.S. Air Force Selects IBM to Design and Demonstrate Mission-Oriented Cloud Architecture for Cyber Security

    Cloud model will introduce advanced cyber security and analytics technologies capable of protecting sensitive national data

    ARMONK, N.Y. - 04 Feb 2010: The U.S. Air Force has awarded IBM (NYSE:IBM) a contract to design and demonstrate a secure cloud computing infrastructure capable of supporting defense and intelligence networks. The ten-month project will introduce advanced cyber security and analytics technologies developed by IBM Research into the cloud architecture.

    There are press articles too.

    CNet News

    Air Force taps IBM for secure cloud

    by Lance Whitney

    IBM has a tall order from the U.S. Air Force--create a cloud network that can protect national defense and military data.

    Big Blue announced Thursday a contract from the Air Force to design and demonstrate a cloud computing environment for the USAF's network of nine command centers, 100 military bases, and 700,000 personnel around the world.

    The challenge for IBM will be to develop a cloud that can not only support such a massive network, but also meet the strict security standards of the Air Force and the U.S. government. The project will call on the company to use advanced cybersecurity technologies that have been developed at IBM Research.

    and Government Computer News.

    What I find interesting is how few authors reference the IBM press release.  The goal of the project is a technical demonstration.

    "Our goal is to demonstrate how cloud computing can be a tool to enable our Air Force to manage, monitor and secure the information flowing through our network," said Lieutenant General William Lord, Chief Information Officer and Chief, Warfighting Integration, for the U.S. Air Force. "We examined the expertise of IBM's commercial performance in cloud computing and asked them to develop an architecture that could lead to improved performance within the Air Force environment to improve all operational, analytical and security capabilities."

    That quote is cut and pasted into the CNet news article as well.

    On the other hand, there are some good insights by Larry Dignan on his ZDnet blog.

    What’s in it for IBM? Cloud computing has a lot of interest, but security remains a worry for many IT buyers. If Big Blue can demonstrate cloud-based cyber security technologies that’s good enough for the military it would allay a lot of those worries.

    The advanced cyber security and analytics technologies that will be used in the Air Force project were developed by IBM Research (statement).

    According to IBM the project will show a cloud computing architecture that can support large networks and meet the government’s security guidelines. The Air Force network includes almost 100 bases and 700,000 active military personnel.

    and Larry continues on the key concepts of what will be shown.  Models!!! yea!

  • The model will include autonomic computing;
  • Dashboards will monitor the health of the network second-by-second;
  • If Air Force personnel don’t shift to a “prevention environment” during a cyber attack, the cloud will have automated services to lock the network down.
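The last bullet describes a simple policy: automated lockdown when an attack is detected and no operator has moved the network into prevention mode. A sketch of that decision, with metric names and thresholds that are entirely hypothetical:

```python
def network_action(metrics: dict, operator_mode: str) -> str:
    """Second-by-second decision from dashboard health metrics."""
    # Hypothetical attack signatures: auth-failure floods or abnormal egress.
    attack_detected = (metrics.get("failed_auth_per_s", 0) > 100
                       or metrics.get("egress_mbps", 0) > 5000)
    if attack_detected and operator_mode != "prevention":
        return "lockdown"  # automated services lock the network down
    return "normal"
```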

    Thursday, Feb 04, 2010

    Symbian Mobile OS goes open source, is data center design the next open source opportunity?

    Symbian OS went open source today.

    Symbian Is Open

    As of now, the Symbian platform is completely open source.  And it is Symbian^3, the latest version of the platform, which will soon be feature complete.

    Open sourcing a market-leading product in a dynamic, growing business sector is unprecedented.  Over 330 million Symbian devices have been shipped worldwide, and it is likely that a further 100 million will ship in 2010 with more than 200 million expected to ship annually from 2011 onwards.


    Now the platform is free for anyone to use and to contribute to.  It is not only a sophisticated software platform; it is also the focal point of a community. And a lot of the foundation’s effort going forward will be to ensure the community grows and is supported in bringing great innovations to the platform and future devices.

    PCWorld writes on the five benefits of open-sourcing Symbian.

    Five Benefits of an Open Source Symbian

    By Tony Bradley

    The Symbian mobile operating system is getting a second life as the Symbian Foundation makes the smartphone platform open source. The lifeline will revitalize the platform, and has benefits for Nokia, smartphone developers, Symbian handsets, and smartphone users.

    With open source hitting all aspects of IT, including mobile, when will data center designs go open source?  Don’t hold your breath: few data center designers are software people, so open source is still a foreign concept for many, as designs are protected and transparency about what goes on is heresy to their thinking and business models.

    But, maybe as Cloud Computing goes open source with companies like Eucalyptus, people will not see the value in much of how data centers have been built in the past.

    Eucalyptus open-sources the cloud (Q&A)

    It's reasonably clear that open source is the heart of cloud computing, with open-source components adding up to equal cloud services like Amazon Web Services. What's not yet clear is how much the cloud will wear that open source on its sleeve, as it were.

    Eucalyptus, an open-source platform that implements "infrastructure as a service" (IaaS) style cloud computing, aims to take open source front and center in the cloud-computing craze. The project, founded by academics at the University of California at Santa Barbara, is now a Benchmark-funded company with an ambitious goal: become the universal cloud platform that everyone from Amazon to Microsoft to Red Hat to VMware ties into.

    Or, rather, that customers stitch together their various cloud assets within Eucalyptus.

    Is open source a threat to data center design?  For some maybe, for others it is an opportunity.

    For compliance and regulatory issues, eventually cloud computing providers will need to provide some level of transparency on their data center infrastructure.  Enough to meet the needs of governments and other regulatory agencies.  Will this be a driving issue for opening more details on data center infrastructure?

    There are those who argue for security reasons, we are not transparent to reduce our risks.  But, open source software believers say the systems are more secure by being transparent and allowing peer review.
