Can Carbon Relay deliver AI-driven efficiency like what Google's data center group uses?

The media is covering Foxconn’s backing of Carbon Relay. Here is TechCrunch’s post.

Taiwanese technology giant Foxconn International is backing Carbon Relay, a Boston-based startup emerging from stealth today, that’s harnessing the algorithms used by companies like Facebook and Google for artificial intelligence to curb greenhouse gas emissions in the technology industry’s own backyard — the datacenter.

According to LinkedIn, founder Matt Provo has been with the company since August 2015.

Carbon Relay has a graph on its website showing how closely its model’s predictions match actual PUE.

[Screenshot: Carbon Relay’s predicted vs. actual PUE graph]

It looks a lot like the predictive accuracy Google shows in its paper.

[Screenshot: Google’s predicted vs. actual PUE graph from its paper]
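For readers unfamiliar with the metric: PUE (power usage effectiveness) is total facility energy divided by IT equipment energy, so a value near 1.0 means almost no overhead. The graphs above show models predicting PUE from operating conditions. Here is a minimal sketch of the idea, fitting a toy one-variable least-squares model to made-up readings (outside air temperature in °C vs. measured PUE); the data and the single-feature model are illustrative assumptions, not anyone’s actual system.

```python
# Illustrative only: predict PUE from one operating condition.
# Real models (Google's, presumably Carbon Relay's) use many more
# features (fan speeds, pump flows, wet-bulb temp, IT load, etc.).

# Hypothetical readings: (outside air temp C, measured PUE).
readings = [
    (10.0, 1.08), (15.0, 1.10), (20.0, 1.13),
    (25.0, 1.16), (30.0, 1.20), (35.0, 1.25),
]

n = len(readings)
mean_t = sum(t for t, _ in readings) / n
mean_p = sum(p for _, p in readings) / n

# Closed-form simple linear regression: slope = cov(t, p) / var(t).
slope = (sum((t - mean_t) * (p - mean_p) for t, p in readings)
         / sum((t - mean_t) ** 2 for t, _ in readings))
intercept = mean_p - slope * mean_t

def predict_pue(temp_c):
    """Predicted PUE at a given outside air temperature."""
    return intercept + slope * temp_c
```

The "predicted vs. actual" curves in both screenshots are essentially this idea scaled up: a model trained on historical sensor data, overlaid on the measured PUE to show how tightly it tracks.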

I don’t know Matt Provo or anyone at Carbon Relay. You can see the team on this page, which links to their LinkedIn profiles. From a quick look, I don’t see any mechanical engineers or data center operations people.

Google’s AI/ML energy efficiency project was headed up by Jim Gao, whom I do know. Jim is a mechanical engineer from UC Berkeley. Go Bears! I also have my engineering degree from Cal, but from long before Jim went there. Jim spent years working in Google’s data center group before starting down the path of machine learning, and he had one of the biggest sources of training data available: Google’s data centers. That may explain why Jim’s predictive models look more accurate than Carbon Relay’s.

Jim published his latest findings as part of his work at Alphabet’s DeepMind, where he is now a team lead.

[Screenshot: figure from Jim Gao’s DeepMind publication]

So can Carbon Relay’s 14 people deliver a solution as good as DeepMind’s Jim Gao? Jim has gone through the painstaking effort of getting clean, accurate data out of real systems. There are so many small details. I love the example where Jim ran the model to maximize energy efficiency, and it simply turned off all the systems, bringing energy use to zero. And Jim has overcome the resistance to change of a well-trained data center operations staff and gotten them to trust a computer model.

Looking at the size of Carbon Relay’s technical team, I am reminded that the first models Jim ran could be performed on one PC. Time will tell whether Carbon Relay can deliver on data center efficiency, but even with a working technical solution, getting clean data out of all the different BMS environments and getting a model actually used in operations is hard.

The paper Jim published lists Amanda Gasparik as a co-author. Curious, I looked her up on LinkedIn: she is a senior data center mechanical engineer who has been at Google for five years, spent eight years as a Nuclear Electronics Technician in the US Navy, and holds a master’s in systems engineering and a bachelor’s in mechanical engineering.

Add another DeepMind PhD research engineer and you have three people whose broad range of skills impresses me much more than Carbon Relay’s team does.