Is Foxconn using robotics in its manufacturing push out of China?

Reuters has an article on Foxconn's plans for the use of robotics.

Foxconn to rely more on robots; could use 1 million in 3 years

Employees work inside a Foxconn factory in the township of Longhua in the southern Guangdong province in this May 26, 2010 file photo. REUTERS/Bobby Yip/Files

By Lee Chyen Yee and Clare Jim

HONG KONG/TAIPEI | Mon Aug 1, 2011 8:48am EDT

(Reuters) - Taiwan's Foxconn Technology Group, known for assembling Apple's iPhones and iPads in China, plans to use more robots, with one report saying the company will use one million of them in the next three years, to cope with rising labor costs.

Foxconn's move highlights an increasing trend toward automation among Chinese companies as labor issues such as high-profile strikes and workers' suicides plague firms in sectors from autos to technology.

One thing that caught my eye is Foxconn buying plants overseas.

Foxconn plans to buy a set-top plant in Mexico from Cisco Systems and is looking into investing more in Brazil, where it is already making mobile phone handsets.

It has bought LCD TV plants from Japan's Sony Corp in Mexico in 2009 and Slovakia in 2010 and is in cooperation talks with a number of top Japanese hi-tech firms, including Sharp, Canon and Hitachi.

Could server manufacturing be moving out of China as well in the future?

That's one way to get around the 100%+ tax on importing servers into Brazil.

Green Data Center Blog traffic almost 50,000 views a month

A few months ago I decided to change some of the ideas I was focusing on.  With any change like that, you would normally worry about how it affects your blog's traffic.  I purposely worry less about traffic than about writing on things I find interesting for my research, my clients, and plain curiosity.  Still, I frequently look at the traffic to see what people find interesting.

Yesterday I was watching my traffic, as the July 31 numbers looked like I would hit 50,000 views.  I got so close.  Note: I switched from TypePad to SquareSpace hosting in February, which is why the Sept - Jan numbers are at 0.

[image: monthly page view chart]

The exact numbers for Feb to July are:

[image: Feb - July traffic numbers]

I expect August traffic to be lower with summer holidays for Northern Hemisphere readers.  Here are the top 25 cities that hit my blog:

[image: top 25 cities by page views]

Thanks for visiting The Green (Low Carbon) Data Center blog.  I should hit over 50,000 views by the end of the year.

-Dave Ohara

Facebook updates Open Compute Project for the community, launches new look

If you go to OpenCompute.org you'll see a new look.

[image: the new OpenCompute.org home page]

Facebook's Yael Maguire discusses the changes.

WELCOME!

WEDNESDAY, JULY 27, 2011 | Posted by Yael Maguire at 16:08 PM

Welcome to the new opencompute.org! This revamp focuses the site on projects and the community. Please bear with us as we work out our kinks, but we have a new streamlined project browser with links to some projects on GitHub! Our original specifications were created in Word and converted to PDFs, not a code-friendly manner to do open hardware development. We decided to switch our V2 specifications to MultiMarkDown, a simple text format used for the Web that easily converts to HTML and PDF. With this switch we now have a process for making contributions:

  1. Sign up on the site (link through Facebook).
  2. Sign an individual Contributor License Agreement (CLA).
  3. Get the code on GitHub.
  4. Make a patch to a spec and submit it to us at https://github.com/facebook/opencompute/issues
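A plain-text spec format is what makes the patch workflow in step 4 possible: unlike Word or PDF, MultiMarkdown specs can be diffed line by line, and a unified diff is exactly what a GitHub patch carries. A minimal Python sketch using the standard library's difflib (the spec text here is hypothetical, not from an actual Open Compute spec):

```python
import difflib

# Hypothetical before/after versions of a plain-text (MultiMarkdown) spec.
spec_v1 = """# Server Chassis
Fan count: 4
Fan size: 60mm
""".splitlines(keepends=True)

spec_v2 = """# Server Chassis
Fan count: 6
Fan size: 60mm
""".splitlines(keepends=True)

# The unified diff shows only what changed, with context lines around it.
patch = "".join(difflib.unified_diff(spec_v1, spec_v2,
                                     fromfile="spec.mmd", tofile="spec.mmd"))
print(patch)
```

Try doing that with a binary .docx file and you can see why the switch matters for open hardware development.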

Facebook moves a Data Center Elephant, Dozens of Petabytes migrate to Prineville

Facebook has a post on migrating a huge Hadoop environment.  The post doesn't specifically call out the Prineville facility, but where else would they be moving to?

During the past two years, the number of shared items has grown exponentially, and the corresponding requirements for the analytics data warehouse have increased as well. As the majority of the analytics is performed with Hive, we store the data on HDFS — the Hadoop distributed file system.  In 2010, Facebook had the largest Hadoop cluster in the world, with over 20 PB of storage. By March 2011, the cluster had grown to 30 PB — that’s 3,000 times the size of the Library of Congress! At that point, we had run out of power and space to add more nodes, necessitating the move to a larger data center.
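The Library of Congress comparison checks out if you use the commonly cited ~10 TB estimate for the Library's digitized print collection (that estimate is my assumption, not a figure from Facebook's post):

```python
# Sanity-check of the "3,000 times the Library of Congress" comparison.
PB = 1000 ** 5  # decimal petabyte, in bytes
TB = 1000 ** 4  # decimal terabyte, in bytes

cluster_bytes = 30 * PB  # the Hadoop cluster as of March 2011
loc_bytes = 10 * TB      # assumed Library of Congress print-collection estimate

print(cluster_bytes // loc_bytes)  # → 3000
```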

For those of you not familiar with the scale of the data set Facebook would be moving:

By Paul Yang on Wednesday, July 27, 2011 at 9:19am

Users share billions of pieces of content daily on Facebook, and it’s the data infrastructure team's job to analyze that data so we can present it to those users and their friends in the quickest and most relevant manner. This requires a lot of infrastructure and supporting data, so much so that we need to move that data periodically to ever larger data centers. Just last month, the data infrastructure team finished our largest data migration ever – moving dozens of petabytes of data from one data center to another.

The post has lots of details and ends with a pitch to join the Facebook infrastructure team.

The next set of challenges for us include providing an ability to support a data warehouse that is distributed across multiple data centers. If you're interested in working on these and other "petascale" problems related to Hadoop, Hive, or just large systems, come join Facebook's data infrastructure team!

The data infrastructure team in the war room during the final switchover.

Curious, I went to see the current job postings on the tech operations team.

Open Positions

Production Operations: Systems, Network, Storage, Database (14)

Supply Chain, Program Management and Analysis (6)

Hardware Design and Data Center Operations (12)

RAW vs. JPG, 25% of images are now RAW

10 years ago at Microsoft, four of us had the idea that RAW imaging would be big.  I wrote a blog post with some of the history.

Story of Adobe & Apple High-Value Digital Image Applications, Adobe’s angst developing for the iPad, and how Microsoft missed this battle

MONDAY, MAY 17, 2010 AT 3:25AM

This is not a data center post, but one about competition and innovation.

If you are a high-end photographer, you use the RAW image format, a higher-quality format than JPEG.

A camera raw image file contains minimally processed data from the image sensor of either a digital camera, image scanner, or motion picture film scanner. Raw files are so named because they are not yet processed and therefore are not ready to be printed or edited with a bitmap graphics editor.

Microsoft just released a RAW image CODEC for Windows 7 and Vista, and the interesting part of the accompanying analysis is that 25% of images are RAW.

Photo Gallery now supports raw format

by Brad Weed

We all take a lot of photos. In fact, according to data provided by InfoTrends, more than 73 billion still images were shot in the US alone in 2010. If you’re lucky enough to own a DSLR (digital single lens reflex) camera, you’re likely to take two and a half times as many photos in a given month as your friends with point-and-shoot cameras. That’s a lot of photos. What’s more, nearly a quarter of those photos are taken in a raw image format.
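The post doesn't do the arithmetic, but the InfoTrends figures quoted above put the absolute number in the tens of billions:

```python
# Back-of-envelope using the InfoTrends numbers quoted above.
total_us_images_2010 = 73e9  # still images shot in the US in 2010
raw_share = 0.25             # "nearly a quarter" shot in a raw format

raw_images = total_us_images_2010 * raw_share
print(f"{raw_images / 1e9:.2f} billion raw images")  # → 18.25 billion
```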

The group of four who had the original RAW image idea 10 years ago are no longer at Microsoft.  One is a Google executive, one is an Adobe executive, one is an imaging consultant, and the fourth is me.

At least now we finally have the data to say how big the RAW imaging format is: 25% of the market.  The market is now big enough that a product could be developed, but it is a little too late to come to market.