Cloud Computing, Commercial Real Estate and that Room Full of Servers

Not too long ago I was visiting the offices of a company that provides a sophisticated inventory management software as a service (SaaS) product. They are a relatively small organization renting space in a downtown office building, and their server room, which measured maybe 13 by 17 feet if I were to guess, was the largest room they had.

I found it curious that the room was lined with racks holding what totaled up to around 30 tower and ‘big box’ computer systems of various ages. They had been in the process of evaluating their bandwidth requirements and costs in an underserved area (translation: bandwidth is expensive). Groups of systems were plugged into small off-the-shelf UPS devices for some short-term backup power, and a few consumer-grade network switches provided additional ports in the far corners of the room.

One of my immediate thoughts was that the property owner was probably paying more to power this company’s servers than they were collecting in rent on the computer room space. It also occurred to me that it must be a tremendously inefficient use of computing resources and, potentially, an unnecessary additional rent expense to house all of it.

So, let’s do a little math and see how each side might be better off.

The Landlord

At a relatively low $15 a square foot over 221 square feet (13 x 17), that works out to about $3,315 a year, or roughly $276 a month. The power bill on 30 servers (at 400 watts*) is probably in the $600 to $800 range each month. Given that the rest of this tenant’s offices probably don’t total much more than 1,000 square feet, it’s unlikely the property owner is actually making a profit from having this tenant in their building. Even if they raise their CAM (Common Area Maintenance) fees to compensate for the increased energy cost, they are just spreading this tenant’s excessive energy use across all of the tenants in the form of higher-than-normal CAM charges.
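If you want to check that math, here’s a quick back-of-the-envelope sketch in Python. The electricity rate of $0.07 to $0.09 per kWh is my own assumption, picked to reproduce the $600 to $800 range above; everything else comes straight from the figures in this example.

    # Back-of-the-envelope landlord math; the $/kWh rate is an assumption.
    room_sq_ft = 13 * 17                 # 221 square feet
    rent_per_sq_ft_year = 15.00          # $15 per square foot per year
    servers = 30
    watts_per_server = 400               # see the footnote on the IBM survey
    kwh_rate_low, kwh_rate_high = 0.07, 0.09   # assumed utility rate range, $/kWh

    monthly_rent = room_sq_ft * rent_per_sq_ft_year / 12
    kwh_per_month = servers * watts_per_server / 1000 * 24 * 30   # running 24/7

    print(f"Rent collected on the room: ${monthly_rent:,.0f} a month")       # ~$276
    print(f"Power bill: ${kwh_per_month * kwh_rate_low:,.0f} to "
          f"${kwh_per_month * kwh_rate_high:,.0f} a month")                  # ~$605 to $778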

The Tenant

Not only do they have to rent the additional floor space, they also have to pay a few hundred dollars a month for their bandwidth: not just fast bandwidth, but more reliable, symmetrical bandwidth than they might otherwise purchase. Let’s say the impact of having to put in a fiber-based, metro Ethernet Internet connection, as opposed to a business-class cable modem connection or low-cost DSL, is only a few hundred dollars a month (it’s usually more like several hundred).

Let’s also consider that they are likely paying, one way or another, at least a couple of thousand dollars each month in IT support just to keep 30 different Windows servers updated, maintained and operating. That doesn’t even count the cost of hard drive replacements, equipment acquisition and replacement, or software licensing.

If they’re purchasing (replacement or otherwise) five or six of those systems each year (they have around 30, remember!) and keeping their licenses current within a couple of revisions, that is easily another $12,000 or more per year, even assuming cheap hardware and minimal installation costs.

Start adding things up and you’ve got close to $4,000 a month in cost to maintain a less-than-optimal data center environment, while the landlord is losing, give or take, $500 each month on the computer room space. This isn’t optimal for either party.
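For anyone who’d like to see how that roughly $4,000 figure comes together, here’s a rough tally of the monthly line items discussed above. The individual numbers are this article’s estimates, rounded, and the $300 bandwidth premium is the low-end “few hundred dollars” assumption.

    # Rough monthly tally of the tenant's server-room costs (estimates from above).
    monthly_costs = {
        "rent on the 221 sq ft room": 276,
        "bandwidth premium (fiber vs. cable/DSL)": 300,     # "a few hundred" -- assumed
        "IT support for ~30 Windows servers": 2000,
        "hardware refresh and licensing ($12,000 a year)": 1000,
    }
    total = sum(monthly_costs.values())
    print(f"Approximate total: ${total:,} a month")   # about $3,600 -- call it close to $4,000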

Depending on LOTS of factors, $4,000 a month can buy an awful lot of cloud-based, virtual server capacity. Not only that, but with the right provider that money also buys massive gobs of Internet connectivity, a secure data center facility with power conditioning and generators, and enterprise-class infrastructure with high-end servers and high-throughput network-attached storage systems. A good provider could even set things up so that the customer could spin up new servers and operating environments on demand, in minutes, from a master image of their standard client deployment. Indeed, they could access infrastructure that would cost them hundreds of thousands of dollars to deploy themselves and many thousands of dollars a month to maintain.

So, even if the end client doesn’t manage to stumble across a service provider that can help them vastly improve their infrastructure while containing their costs, maybe a landlord with tenants like this one would do well to build a relationship with someone they can refer tenants to. All they’ve got to lose is part of their electric bill and some (however slightly) inflated CAM charges.

*note: A 2009 survey by IBM found that the average server used 425 watts, and that figure did not include the cost of cooling.