A dear friend David Schirmacher has passed

7x24 Exchange’s President David Schirmacher has passed.

It is with profound sadness that I share the news of the passing of the President of 7×24 Exchange International, David Schirmacher, on January 30th after a recent accident. We are deeply saddened by this tremendous loss. We extend our deepest sympathies to David’s wife Veronica, his brother Axel, his family, and his friends.

I have had so many hours of conversations with David that I don’t know what to write. Yet I have written so many times about conversations with David. Writing about the end of those conversations is so much harder.

One of the last conversations I had with David was about his new data center metrics. Without David’s thought leadership the metrics will fade, but my memories of David will not. I can hear his voice in my head. I can hear his laugh, his quick wit.

Hearing about David’s passing, I immediately thought of his wife Veronica, now in an empty house. David’s presence was always a pleasure, and I cannot imagine how Veronica feels in her house now.

Last night, when I first heard the news, I checked in with friends who I know were close to David, and this morning I checked in with a few others. If you want to send sympathy wishes, you can send them to Veronica.

Sympathy or Mass cards ONLY can be sent to:
Veronica Schirmacher
1573 Redwood Grove Terrace
Lake Mary, FL 32746

No flowers please.

When attending 7x24 I got in the habit of staying after the Wednesday presentations to have a burger with David by the pool. David was always busy during 7x24, and the time to chat with him was after the conference was over. It is hard to accept that I’ll never have another burger with David.

I wish I could write more, but it is so hard. It is hard because it hurts to think David is gone.

David will be missed by many.

May his soul rest in peace.

Can Carbon Relay deliver AI efficiency like what Google's Data Center group uses?

The media is covering Foxconn’s backing of Carbon Relay. Here is TechCrunch’s post.

Taiwanese technology giant Foxconn International is backing Carbon Relay, a Boston-based startup emerging from stealth today, that’s harnessing the algorithms used by companies like Facebook and Google for artificial intelligence to curb greenhouse gas emissions in the technology industry’s own backyard — the datacenter.

According to LinkedIn the founder Matt Provo has been with the company since Aug 2015.

Carbon Relay has on its website a graph that shows how its model matches the actual PUE.

[Image: Carbon Relay graph of its model’s predicted PUE vs. actual PUE]

This looks a lot like Google’s predictive accuracy, which is shown in this paper.

[Image: Google’s graph of predicted vs. actual PUE from its paper]

I don’t know Matt Provo or anyone at Carbon Relay. You can see the team on this page, which has their LinkedIn profiles. From a quick look I don’t see any mechanical engineers or data center operations people.

Google’s AI/ML energy efficiency project was headed up by Jim Gao, whom I do know. Jim is a mechanical engineer from UC Berkeley. Go Bears! I also have my engineering degree from Cal, but long before Jim went there. Jim had years of experience in Google’s data center group before he started down the path of machine learning, and he had one of the biggest sources of training data: Google’s data centers. Which may explain why Jim’s predictive models look more accurate than Carbon Relay’s.
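To make the idea concrete, here is a toy sketch of the kind of supervised PUE-prediction model these projects describe: fit a predictor of PUE from facility telemetry, then compare predictions to actuals. Everything here is a hypothetical stand-in — the feature names, the synthetic data, and the simple linear fit — not Carbon Relay’s or Google’s actual approach, which used neural networks over far more sensor inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic telemetry standing in for data center sensor feeds
# (hypothetical features; real models use dozens of inputs).
n = 1000
it_load_kw = rng.uniform(500, 2000, n)       # IT load
outside_temp_c = rng.uniform(5, 35, n)       # outside air temperature
chiller_setpoint = rng.uniform(16, 24, n)    # chilled water setpoint

# Hidden "true" relationship used to generate PUE labels, plus noise
pue = (1.1 + 0.004 * outside_temp_c - 0.0001 * (it_load_kw / 10)
       + 0.005 * chiller_setpoint + rng.normal(0, 0.01, n))

# Fit a linear model: PUE ~ X @ w
X = np.column_stack([np.ones(n), it_load_kw, outside_temp_c, chiller_setpoint])
w, *_ = np.linalg.lstsq(X, pue, rcond=None)

pred = X @ w
mae = np.abs(pred - pue).mean()
print(f"mean absolute error: {mae:.4f} PUE")
```

The hard part, as the rest of this post argues, is not the model — it is getting clean, trustworthy training data out of real facilities.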

Jim published his latest findings as part of his work at Alphabet’s DeepMind, where he is now a Team Lead.

[Image: graph from Jim Gao’s latest DeepMind findings]

So can Carbon Relay’s 14 people deliver a solution as good as DeepMind’s Jim Gao? Jim has gone through the painstaking effort to get clean, accurate data from systems. There are so many small details. I love the example where Jim ran the model to be the most energy efficient, and it turned off all the systems, bringing energy use to 0. And Jim has overcome the resistance to change from a well-trained data center operations staff to get them to trust a computer model.

Looking at the size of the technical team on the Carbon Relay project, I am reminded that the first models Jim ran could be performed on one PC. Time will tell if Carbon Relay can deliver on data center efficiency, but even if they have a technical solution, getting clean data from all the BMS environments and executing a model that actually gets used is so hard.

The paper that Jim published has Amanda Gasparik on it. I got curious and looked her up on LinkedIn: she is a senior data center mechanical engineer who has been at Google for 5 years, spent 8 years as a Nuclear Electronics Technician in the US Navy, and holds a master’s in systems engineering and a bachelor’s in mechanical engineering.

Add another DeepMind PhD Research Engineer and you have three people whose broad range of skills impresses me much more than Carbon Relay’s.

Women in Cloud event at Microsoft Campus building 33, Jan 26, 2019

There is a Women in Cloud event on Jan 26, 2019 at the Microsoft Campus in Redmond.

Tickets are $99-200 and you can find more information here.

It is great to see this event being put together for the first time.

Seven years ago I moderated a panel discussion with some top people in capacity planning and asset management who happened to be all women, and it has been great to see more and more organizations support women in the data center industry.

I hope this event is the first of many more to come. Best of luck to the event organizers.

Ever wonder why no one explains what all the data is in the 4 TB a day from autonomous cars?

It is widely covered that autonomous cars use TBs of data a day. I use a TB of data a month, and that data is spread across a wide range of videos, downloads, and web browsing; I can look at statistics on where that data is used and where it goes.

Stacey On IOT writes on the mobile edge and goes over the autonomous car use case and where the edge will be.

Moving 4 terabytes of data across 400 cars generates 1.6 petabytes of data a day, which is an obscene amount of data to transfer over mobile networks. Thus, edge processing is already taking place to produce insights required to drive the car, on the car itself.

In addition to much of that data being processed on the car, Vijay said Uber is also creating storage depots with fat connections that can handle the uploads of multiple cars. In areas where a cluster of self-driving cars may overwhelm the network, he suggested use of a mobile data center packed into what looked like a minivan to help process the data.

Part of me wonders if Uber and others talk about all the data for autonomous cars as a way to get others to waste their time. If they are generating 4TB a day now, they aren’t uploading all of those images now. It seems like they are collecting lots of data and running these cars in an instrumented mode.
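The fleet-level math from the article, plus a back-of-envelope upload time, shows why wholesale uploads over mobile networks are implausible. The 50 Mbps sustained uplink below is my own assumed, generous figure, not from the article:

```python
# Fleet totals from the article: 400 cars at 4 TB/day each
cars = 400
tb_per_car_per_day = 4
total_tb = cars * tb_per_car_per_day
print(f"fleet data: {total_tb} TB/day ({total_tb / 1000} PB/day)")

# Hypothetical sustained uplink per car (real-world LTE is typically lower)
uplink_mbps = 50
megabits_per_tb = 8e6                        # 1 TB (decimal) = 8,000,000 Mb
seconds_needed = tb_per_car_per_day * megabits_per_tb / uplink_mbps
hours_needed = seconds_needed / 3600
print(f"hours to upload 4 TB at {uplink_mbps} Mbps: {hours_needed:.1f}")
```

Even under that generous assumption, each day of driving takes roughly a week of continuous uploading, which is why the edge processing and "storage depots with fat connections" in the article make sense.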

Flight test instrumentation (FTI) is monitoring and recording equipment fitted to aircraft during flight test. It is mainly used on experimental aircraft, prototype aircraft and development aircraft - both military and civil, and can monitor various parameters from the temperatures of specific components to the speed of the engines. This may be displayed in the cockpit or cabin to allow the aircrew to monitor the aircraft in flight, and is usually recorded to allow the data to be analysed later.

A small modular data acquisition unit

FTI typically monitors between 10 and 120000 parameters - for example temperatures inside the cargo and pressure distribution along the wing. FTI sources could be temperature sensors, pressure probes, voltages, current transducers, potentiometers, strain gauges, aircraft data buses or cameras.[1] These sensors may be digitalized and acquired by the data acquisition system. A cockpit audio recording may also be included. A telemetry transmitter may be added to the FTI to allow real-time monitoring of the tests from a ground-station.

-https://en.wikipedia.org/wiki/Flight_test_instrumentation

But when planes get approved for production, they don’t fly in this heavily instrumented mode. They did need the instrumentation data, though, as part of certification to fly.

If we had details on what is in the 4TB of data, we would know how much is needed to drive the car vs. what is needed to certify the car.

5G’s lower latency: more improvement comes from the back end than from antenna transmission

Everyone says 5G will improve latency, but 4G and 5G both use electromagnetic waves that travel at the speed of light. So where is the latency improvement coming from? Nokia has a paper on low latency that goes into lots of detail.

Below is a table and graphic that show where the time goes in a latency calculation.

When you look at transmission and frame alignment, you get about 3 ms of time savings. Looking at the UE and BTS, the user equipment and base station, you get 5 ms of time savings.

So there is actually more time saved in the back-end equipment than in transmission.
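A toy tally makes the comparison explicit. Only the 3 ms and 5 ms savings come from the Nokia breakdown; the individual per-component values below are hypothetical numbers chosen to sum to those savings:

```python
# Illustrative one-way latency components in ms. The individual values are
# hypothetical; only the 3 ms (air interface) and 5 ms (UE + BTS) savings
# reflect the Nokia breakdown discussed above.
lte = {"ue_processing": 4.0, "transmission_and_alignment": 5.0,
       "bts_processing": 4.0, "core_and_transport": 2.0}
nr  = {"ue_processing": 1.5, "transmission_and_alignment": 2.0,
       "bts_processing": 1.5, "core_and_transport": 2.0}

savings = {k: lte[k] - nr[k] for k in lte}
air_interface_saving = savings["transmission_and_alignment"]
backend_saving = savings["ue_processing"] + savings["bts_processing"]

print(f"air-interface saving: {air_interface_saving} ms")
print(f"UE + BTS saving:      {backend_saving} ms")
print(f"total one-way: {sum(lte.values())} ms -> {sum(nr.values())} ms")
```

The point of the tally: the equipment-side processing savings outweigh the over-the-air savings.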

[Image: Nokia latency breakdown table and graphic]

OK, since I brought up transmission latency, I might as well keep going. If you want to look at where the latency savings in transmission come from: the current standards require a connection to be established before sending data, switching between idle and connected modes. In 5G a connection is established once and can be suspended and resumed, which is much quicker.

This applies to the same spectrum, with only the 4G vs. 5G infrastructure changing. So new spectrum is not the issue; it is the transmission, connection handling, and back end that improve the speeds.
