A New Option for Moving Out of AWS: Forsythe's Data Center in 1,000 sq ft Increments

Moving out of AWS can save a lot of money. Huh?  Yes, here is one public disclosure from Moz’s CEO.

Building our private cloud

We spent part of 2012 and all of 2013 building a private cloud in Virginia, Washington, and mostly Texas.

This was a big bet with over $4 million in capital lease obligations on the line, and the good news is that it's starting to pay off. On a cash basis, we spent $6.2 million at Amazon Web Services, and a mere $2.8 million on our own data centers. The business impact is profound. We're spending less and have improved reliability and efficiency.

Our gross profit margin had eroded to ~64%, and as of December, it's approaching 74%. We're shooting for 80+%, and I know we'll get there in 2014.
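
To put those margin numbers in context, here is a quick back-of-the-envelope in Python. The revenue figure is purely hypothetical, since the quote does not disclose it; only the gross-margin percentages come from Moz's post.

```python
# Back-of-the-envelope: what a gross-margin improvement is worth per year.
# The revenue figure is hypothetical; the margin percentages are from the quote above.
revenue = 30_000_000      # assumed annual revenue, USD (not disclosed in the quote)
margin_before = 0.64      # ~64% gross margin before the move
margin_after = 0.74       # approaching 74% after the move
margin_target = 0.80      # the 80%+ goal

def cost_of_revenue(revenue: float, gross_margin: float) -> float:
    """Cost of revenue implied by a given gross margin."""
    return revenue * (1.0 - gross_margin)

saved_so_far = cost_of_revenue(revenue, margin_before) - cost_of_revenue(revenue, margin_after)
saved_at_target = cost_of_revenue(revenue, margin_before) - cost_of_revenue(revenue, margin_target)

print(f"Annual savings so far (64% -> 74%): ${saved_so_far:,.0f}")
print(f"Annual savings at the 80% target:   ${saved_at_target:,.0f}")
```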

So you want to move out of AWS, but you dread the task of finding something the right size. A cage in a colocation facility? Seems too old school. A wholesale pod? Too big, and you aren't ready to jump into managing your own electrical and mechanical infrastructure.

How about 1,000 sq ft of data center space configured exactly the way you want? Need more? Get another 1,000 sq ft. This is what Forsythe Data Centers has announced with its latest data center, offering the solution in the middle of the table below.

[Image: table comparing retail colocation, Forsythe's 1,000 sq ft suites, and wholesale data center space]

“Forsythe’s facility offers the flexibility and agility of the retail data center market, in terms of size and shorter contract length, with the privacy, control and density of large-scale, wholesale data centers,” said Albert Weiss, president of Forsythe Data Centers, Inc., the new subsidiary managing the center. He is also Forsythe Technology’s executive vice president and chief financial officer.

I got a chance to talk to Steve Harris, and the flexibility for customers to have multiple suites designed exactly to support their gear is a dream come true for anyone who knows that one size fits all usually means you are wasting money somewhere. You could have one suite just for storage, tape backup, and other gear that is more sensitive to heat. The gear that can run at higher temperatures could sit in the suite right next door. You could give some equipment a higher level of redundancy in one suite and the rest less in another.

And just like the cloud, adding capacity is so much easier than "I need to move to a bigger cage." Just add another suite.
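
To make that concrete, here is a minimal sketch of what a mixed footprint of suites might look like and how it grows. The suite names, purposes, redundancy labels, and temperature limits are all assumptions for illustration, not Forsythe's actual configuration options.

```python
from dataclasses import dataclass

@dataclass
class Suite:
    """One 1,000 sq ft suite with its own design point (all fields illustrative)."""
    name: str
    purpose: str
    redundancy: str          # e.g. "N+1" or "2N"; labels assumed for illustration
    max_inlet_temp_c: float  # how warm this suite's gear is allowed to run

# Hypothetical footprint: a cool, highly redundant storage suite next to
# a compute suite that tolerates warmer air and needs less redundancy.
footprint = [
    Suite("suite-1", "storage and tape backup", redundancy="2N", max_inlet_temp_c=24.0),
    Suite("suite-2", "general compute", redundancy="N+1", max_inlet_temp_c=32.0),
]

# Growing the footprint is just adding another suite, not moving to a bigger cage.
footprint.append(Suite("suite-3", "general compute", redundancy="N+1", max_inlet_temp_c=32.0))

for s in footprint:
    print(f"{s.name}: {s.purpose} | {s.redundancy} | inlet up to {s.max_inlet_temp_c} C")
```

The point is that each suite carries its own design choices, and growth is an append, not a rebuild.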

How much power do you want per rack? What does a suite look like?

[Image: suite layout and power-per-rack options]
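
The answer is mostly multiplication. Here is a rough sizing sketch for a single 1,000 sq ft suite; the square-feet-per-rack and kW-per-rack figures are assumptions for illustration, not numbers from Forsythe.

```python
# Rough sizing of a 1,000 sq ft suite at different power densities.
# Every constant below is an assumption for illustration, not a vendor spec.
SUITE_SQFT = 1_000
SQFT_PER_RACK = 28   # assumed footprint per rack, including aisle space

racks = SUITE_SQFT // SQFT_PER_RACK

for kw_per_rack in (5, 10, 15, 20):
    total_kw = racks * kw_per_rack
    print(f"{racks} racks at {kw_per_rack:>2} kW/rack -> {total_kw:>3} kW critical load per suite")
```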

Oh yeah, and the data center is green too.

The facility is being designed to comply with U.S. Green Building Council LEED certification standards for data centers and to obtain Tier III certification from the Uptime Institute, a certification currently held by approximately 60 data centers in the U.S. and 360 worldwide, few of which are colocation facilities.