So you're a CIO or an IT manager in charge of your company's data centre, and your new mission is to make an environmentally efficient facility even "greener". Here are six imaginative strategies to improve your eco-credentials, direct from Microsoft's cutting-edge test data centre.
Picture: Server from Shutterstock
Hidden up in the hills of Redmond Ridge in Washington state is a 57,000 square foot data centre operated by Microsoft. It isn't mission critical, and it doesn't have an uptime rating. What it does have is 48 pods drawing 15kW per rack, for an overall PUE of 1.2. It's the test data centre Microsoft uses to run scenarios on everything from high-performance computing at low power through to highly virtualised servers for testing Azure deployments.
The manager of the data centre test group stationed at Redmond Ridge shared with us a few ideas on how Microsoft is re-architecting the way it thinks about power and sustainability.
Build Renewable Energy Somewhere Else
Lowering the overall PUE of your data centre is what you're after, right? It shows that you're using the power you draw more efficiently, and it's a good metric for judging whether you're doing your job as an IT manager.
According to Microsoft, once you get a PUE as low as 1.2, or even below 1.1, it can become shockingly expensive to find further efficiencies: almost to the point that the money spent isn't worth the gains.
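Checking your own numbers is straightforward: PUE is simply total facility power divided by the power drawn by the IT equipment alone. A quick sketch, using the article's 48-rack, 15kW-per-rack, PUE-of-1.2 figures to back out the facility draw (the helper function and the derived total are illustrative, not Microsoft's published numbers):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT load alone."""
    return total_facility_kw / it_equipment_kw

# 48 racks at 15kW each gives a 720kW IT load.
it_load = 48 * 15
# A PUE of 1.2 implies the whole facility draws 20 per cent more than the IT load.
facility_draw = it_load * 1.2  # 864kW, illustrative
print(round(pue(facility_draw, it_load), 2))  # prints 1.2
```

The closer that ratio gets to 1.0, the less power you're spending on anything other than the servers themselves.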
Instead, what Microsoft is doing at Redmond Ridge is measuring the overall power it requires, and building renewable energy facilities like wind farms to compensate for its draw on the grid.
Wind and solar farms aren't new, but it's where Microsoft is putting them that's different.
To maximise efficiency, Microsoft calculated where the best place for a wind farm would be -- mostly because it had no room for one at Redmond Ridge -- and built it off-site to feed into the Washington state power grid exactly the amount of energy per day that the facility uses. Every unit of power the centre draws from the grid is replaced by the wind farm, at no net cost to the environment.
Sure, you won't be getting the very same electrons into your data centre that you produce, but using the one in, one out theory means that you've offset your draw from the grid, greening it up and ultimately lowering demand in the process.
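The "one in, one out" arithmetic is as simple as it sounds; a minimal sketch (the daily-use figure is an assumption, not a number Microsoft disclosed):

```python
def net_grid_draw_mwh(facility_use_mwh: float, offsite_generation_mwh: float) -> float:
    """Net energy taken from the grid once off-site renewables feed back in."""
    return facility_use_mwh - offsite_generation_mwh

# Hypothetical: a wind farm sized to generate exactly what the facility uses per day.
daily_use = 20.0  # MWh/day, assumed
print(net_grid_draw_mwh(daily_use, 20.0))  # prints 0.0
```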
Overcome Transmission Loss
Look at this power transmission diagram.
See the problem? 99 per cent of the initial energy generated by a dirty coal-fired power plant is lost in conversion and transmission over the grid, into your data centre and ultimately to the CPUs in your server racks.
To cut power usage, you need to rethink how it gets where it's needed.
Ultimately, you're trying to put power into the power supplies of each server, right? So why do you need to pass power through absolutely everything in your data centre before it gets there?
It's experimental right now, but if you can re-engineer your server racks to contain a small fuel cell, you can switch the racks to DC power provided directly by that cell, improving the efficiency of delivery to your chips.
It reduces losses, infrastructure costs and ultimately, emissions.
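The reason fewer hops help is that conversion losses multiply: each stage passes on only a fraction of what it receives. A sketch with hypothetical stage efficiencies (none of these figures come from Microsoft) shows the effect:

```python
from math import prod

def delivered_fraction(stage_efficiencies: list[float]) -> float:
    """Fraction of source energy surviving a chain of conversion steps."""
    return prod(stage_efficiencies)

# Hypothetical grid path: generation, transmission, UPS, PDU, server PSU.
grid_path = [0.35, 0.93, 0.95, 0.97, 0.90]
# Hypothetical in-rack fuel cell: one conversion step, then a DC supply.
fuel_cell_path = [0.50, 0.95]

print(f"grid path delivers {delivered_fraction(grid_path):.0%} of source energy")
print(f"in-rack fuel cell delivers {delivered_fraction(fuel_cell_path):.0%}")
```

Cutting the number of stages matters more than tuning any single one, because every stage's loss compounds with all the others.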
Turn Waste Into Power
Did you know that cattle raised for human food produced over one per cent of the greenhouse gas emissions in the US last year? Almost two per cent came from landfills and sewage treatment plants.
It's a problem, and it quite literally stinks. The culprit is methane. You know. Fart gas.
Methane survives in the atmosphere for a shorter time than other greenhouse gases, but it traps far more heat while it's there, so it's still contributing to the problem.
A lot of Australian dumps are already using methane capture facilities to power their operations, so why shouldn't data centres try it? By using methane capture methods at waste treatment plants, dumps and landfills, as well as in cattle facilities, Microsoft is harnessing what it's calling biogas for fuel. It's a substitute for natural gas, and it's 100 per cent sustainable.
There are a few problems, though. The power lost in transmission, as discussed earlier, might mean you see no benefit from biogas once you run lines from the field into your facility. That's when you start considering capturing biogas to feed back into the grid, as discussed in option one.
Alternatively, if you only have a small data centre, you could consider co-opting some of the space used by, say, a waste treatment plant, and locating your facility next door so little to no power is wasted in transmission. The cost there is in relocating your facility and potentially in running a fibre link to a literal shit-hole.
Get Ready For A Close-Up
Wondering why your servers are slowing down despite an abundance of cooling? Get out your thermal camera and take a few shots around your hot aisle. If your techs are sloppy with their cables in large racks, you'll find your problem pretty fast.
Poorly organised cables can block the free-flow of air out of your racks and cabinets, meaning your cooling system has to work that much harder to fix the problem.
Rather than spend the money ramping up your cooling systems to fix the problem, try installing switches vertically in your cabinets and running your I/O and power cables in one straight line downwards.
By keeping everything out of the way, the air can move freely and you can keep your cooling bill down, thus being more efficient and saving money in the process.
Turn The Lights Off
Let's face it: there probably shouldn't be an army of people walking the halls of your data centre every minute of every day, so why are you keeping all the lights on in those areas?
Your servers don't care about how light it is, but your energy bill certainly does.
By doing something as simple as installing sensor lights in every row of your data centre, you can maintain -- for the most part -- a lights-off facility that will garner real savings and reduce your power and environmental footprint.
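The savings are easy to estimate: multiply fixtures by wattage by hours lit. A back-of-the-envelope sketch (the fixture count, wattage and sensor duty cycle are all assumptions, not figures from Microsoft):

```python
def annual_lighting_kwh(fixtures: int, watts_each: float, hours_per_day: float) -> float:
    """Yearly lighting energy for a data hall, in kWh."""
    return fixtures * watts_each * hours_per_day * 365 / 1000

# Hypothetical hall: 40 fixtures at 60W each.
always_on = annual_lighting_kwh(40, 60, 24)
with_sensors = annual_lighting_kwh(40, 60, 2)  # lit roughly 2 hours a day on average
print(round(always_on - with_sensors))  # prints 19272 (kWh saved per year)
```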
Don't Waste Cash On Rack Blanks
Having holes in your server racks isn't ideal. You want the heat staying in your hot aisle and the cooling coming from the other side. Leaving rack units not occupied by blades or servers open can screw up your airflow. To fix that, some people buy small plastic blanking panels to clip into the cabinet and keep everything tight and sealed.
Don't do that.
They cost around $3 to $8 per blank, and if you're running a large data centre that becomes pretty expensive at scale. Instead, just get some fire-rated plastic and attach it to the front of your cabinet with a few magnets around the sides to keep the air flowing correctly. It costs a whole lot less and uses less plastic, making it the more environmentally responsible option too.
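The price gap is easy to sanity-check; a quick comparison sketch (the open-slot count and the per-unit DIY cost are assumptions, and $5.50 is just the midpoint of the article's $3-to-$8 range):

```python
def blanking_cost(open_slots: int, cost_per_slot: float) -> float:
    """Total cost to seal the open rack units in a facility."""
    return open_slots * cost_per_slot

# Hypothetical facility: 48 racks with 10 open units each.
open_slots = 48 * 10
commercial = blanking_cost(open_slots, 5.50)  # mid-range commercial blanks
diy = blanking_cost(open_slots, 0.50)         # assumed fire-rated plastic plus magnets
print(f"commercial: ${commercial:,.0f}, DIY: ${diy:,.0f}")  # prints commercial: $2,640, DIY: $240
```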
Disclosure: Luke Hopewell travelled to Redmond as a guest of Microsoft.