Finally, College Costs May Drop as the Supply of Freshmen Drops

For all the complaints about rising college costs, which seem beyond anyone's control, I find it rare that anyone looks at the problem as one of supply and demand.  Demand from parents and students for a college education has been rising.  Colleges see an inelastic demand curve: they raise prices and still have full enrollment.  If the supply of students drops and colleges have to compete for a limited pool of applicants, then prices should drop.
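To sketch the supply-and-demand argument with some hypothetical numbers (the tuition figures and class sizes below are illustrative assumptions, not data from any school): while applicants outnumber seats, a tuition hike costs the college no enrollment; once the applicant pool shrinks below the class size, revenue falls even at the higher price.

```python
# Hypothetical illustration of inelastic demand for college seats.
# All numbers are made up for the sketch.

def freshman_revenue(tuition, seats, applicants):
    """Tuition revenue from a freshman class: a college can only
    enroll as many students as actually apply."""
    return tuition * min(seats, applicants)

SEATS = 500  # hypothetical target freshman class size

# Boom years: 800 applicants chase 500 seats, so a 10% hike is painless.
before_hike = freshman_revenue(30_000, SEATS, 800)  # 15,000,000
after_hike = freshman_revenue(33_000, SEATS, 800)   # 16,500,000

# Demographic decline: only 400 applicants show up, so the class
# comes in short and revenue drops despite the higher sticker price.
shortfall = freshman_revenue(33_000, SEATS, 400)    # 13,200,000

print(before_hike, after_hike, shortfall)
```

In the boom case the hike raises revenue with no enrollment loss; in the decline case the binding constraint flips from seats to applicants, which is the pressure that should eventually push prices down.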

WSJ has an article on a student drought hitting smaller universities.

Student Drought Hits Smaller Universities

At Loyola, Freshman Class Size Plunges

As Loyola University New Orleans gears up for fall classes next month, the 101-year-old Jesuit university faces a crisis: There will be 25% fewer freshmen than the school had banked on.

"It was a pretty big hit," said Marc K. Manganaro, provost and vice president for academic affairs.

Getting a targeted number of accepted students to commit to a college's freshman class—known as the "yield"—has become more crucial for thousands of schools.

Enrollment rates for numerous smaller and lesser-known colleges and universities are falling this year, due to a decline in the U.S. college-age population, years of rising tuition, increasing popularity of Internet courses and a weak job market for recent graduates.

There is data showing that student enrollment is declining.

After decades of growth, college enrollment nationally dropped 2.3% this spring, compared with spring 2012, according to a report released by the National Student Clearinghouse Research Center. The decline is poised to continue. The number of U.S. high-school graduates peaked at 3.4 million in 2010-2011 and is projected to fall to 3.2 million by 2013-14, according to the Western Interstate Commission for Higher Education. The dip in graduates has been particularly pronounced in the Midwest and South.

I have seven more years before my first goes to college, so I would not be disappointed by the possibility of college costs retreating.

Counting Servers is Easy; There Are a Lot of Other Things That Are Much Harder

James Hamilton has a post saying that it is hard to count servers.

At the Microsoft World-Wide Partners Conference, Microsoft CEO Steve Ballmer announced that “We have something over a million servers in our data center infrastructure. Google is bigger than we are. Amazon is a little bit smaller. You get Yahoo! and Facebook, and then everybody else is 100,000 units probably or less.”

That’s a surprising data point for a variety of reasons. The most surprising is that the data point was released at all. Just about nobody at the top of the server world chooses to boast with the server count data point. Partly because it’s not all that useful a number but mostly because a single data point is open to a lot of misinterpretation by even skilled industry observers. Basically, it’s pretty hard to see the value of talking about server counts and it is very easy to see the many negative implications that follow from such a number.

What is hard is figuring out how many cores these servers have.  What is the age of the servers?  Is the oldest four years old, or three?  What is the rate at which new data center capacity is added, and how does that relate to the overall increase in cores and storage?

The one advantage Microsoft has in making a statement on server count is that the other companies will not speak up about what theirs is.

The first question when thinking about this number is where does the comparative data actually come from?  I know for sure that Amazon has never released server count data. Google hasn’t either although estimates of their server footprint abound. Interestingly the estimates of Google server counts 5 years ago was 1,000,000 servers whereas current estimates have them only in the 900k to 1m range.

We'll see if others speak up on server count or not.  

The US Census Bureau has for years conducted a study of manufacturing capacity.


Quarterly Survey of Plant Capacity Utilization (QPC)

The Survey of Plant Capacity Utilization provides statistics on the rates of capacity utilization for the U.S. manufacturing and publishing sectors.

  • The Federal Reserve Board (FRB) and The Department of Defense (DOD) co-fund the survey.
  • The survey collects data on actual, full, and emergency production levels.
  • Data are obtained from manufacturing and publishing establishments by means of a mailed questionnaire.
  • Respondents are asked to report actual production, an estimate of their full production capability, and an estimate of their national emergency production.
  • From these reported values, full and emergency utilization rates are calculated.
  • The survey produces full and emergency utilization rates for the manufacturing and publishing sectors defined by the North American Industry Classification System (NAICS).
  • Final utilization rates are based on information collected from survey respondents.
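My reading of the survey description above is that the two utilization rates fall straight out of the three reported values. The derivation below is a simplified sketch of that calculation with a hypothetical plant's numbers; the Census Bureau's actual aggregation and weighting across establishments is more involved.

```python
# Sketch of how the QPC's utilization rates are derived from the three
# values each respondent reports: actual production, full production
# capability, and national emergency production capability.
# The plant's numbers below are hypothetical.

def utilization_rates(actual, full_capability, emergency_capability):
    """Return (full, emergency) utilization rates as percentages."""
    full_rate = 100.0 * actual / full_capability
    emergency_rate = 100.0 * actual / emergency_capability
    return full_rate, emergency_rate

# A hypothetical plant producing 720 units against a full capability
# of 900 and an emergency capability of 1,200.
full_rate, emergency_rate = utilization_rates(720, 900, 1_200)
print(full_rate, emergency_rate)  # 80.0 60.0
```

The same two ratios would be just as easy to compute for data centers if actual and full capacity were reported, which is the point of the question below.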

Wouldn't it be useful for the FRB and DOD to understand data center capacity and utilization?  It is hard to assess, but that doesn't mean it shouldn't be done.

Something That is Missing from Facebook's Carbon Data Center Disclosure: an International Colocation Presence

I've been staring at the Facebook Carbon Disclosure for over two weeks.  You can see the 2012 numbers here.


One of the things I asked Facebook was, "Where are the international colocation centers?"  Facebook doesn't have any.  There are POPs, which are small and are counted in the office space carbon footprint.

It is pretty amazing that in 2012 the East Coast colocation presence was as large as Prineville and Forest City combined. In 2012 there was almost no data center capacity in Europe, so content requests had to run back to the East Coast, while the West Coast was taking care of Asia Pacific.

It will be interesting to see what Facebook releases for 2013.  My one request is that it not take six months, until June 2014, to release the 2013 disclosure.

How Many Funny Things are Done in Data Centers Because Risk Aversion is a Best Practice?

Outages are career killers in data centers and IT.  This leads to risk-averse behavior that becomes a best practice.

Risk aversion

From Wikipedia, the free encyclopedia
Risk aversion is a concept in psychology, economics, and finance, based on the behavior of humans (especially consumers and investors) while exposed to uncertainty to attempt to reduce that uncertainty.

But risk aversion can lead to an obsession with avoiding things.  A funny example is Ben Stiller in Along Came Polly.

Ben Stiller's character's risk aversion is resolved when he enters data about his safe ex-wife and the risky Polly.

Reuben is torn between the free-spirited Polly and the safe and familiar Lisa. To solve this issue he enters information about Polly and Lisa into a computer insurance program which measures risk. The computer tells him that, despite his numerous blunders with her, Polly is the best choice for him.

It is too common a best practice to manage risk by avoiding any path that may carry risk.  Risk leads to an outage.  An outage leads to job loss.  So don't do those things that increase risk.

Almost all data center innovators learn to live with risk.  Risk is everywhere.  But risk aversion can still exist in pockets of an organization when one individual finds comfort in steering clear of all risks they identify.  It's too bad you can't enter information about their situation and give them the best choice.

Aren't You Glad You Haven't Been Hit with a 40% Energy Surcharge Like Industries in West Texas?

WSJ has an article on how the power transmission system in West Texas is being strained by increased energy drilling and pumping, which is raising costs for many industries in the area.

While the largely rural region has enough power plants to supply the growing demand for electricity, the high-voltage transmission network hasn't kept pace. Beginning last summer, a shortage of transmission lines in some areas meant that grid operators couldn't automatically send the cheapest power to customers, but had to turn to more expensive power plants elsewhere in the state, where there was enough transmission capacity. Those higher costs were passed on as surcharges to many large customers.

Here are descriptions of some of the pain.

That isn't good news for executives at Tower Extrusions Ltd., which makes aluminum products like stadium seats and storm gutters in Olney, Texas, about 100 miles west of Fort Worth. The company says its power bills climbed 40% last year.

"The congestion charges are putting me at a huge disadvantage, compared to my competitors near Dallas or in other parts of the state," said Mark McClelland, Tower's general manager.

Even oil and gas companies are being hit by the charges. Kinder Morgan Inc., which produces oil in the Permian Basin, said it had to pay as much as $400,000 in congestion costs on a single day in 2012.

Apache Corp., one of the Permian Basin's top oil producers, said its average costs in the area this year are running about 15% higher, largely due to the power-line congestion costs.

Ouch.  Could you imagine if you ran a data center in this area?  There are probably some data center operators who are being hit by these costs.