Data Centre Design Mistakes To Avoid

Data centre design is becoming increasingly standardised, but that doesn't mean the design process is straightforward. Steer clear of these common data centre design mistakes to avoid costly errors.


IDC research manager Giorgio Nebuloni presented on data centre design issues on day 2 of Data Centre World in London, which I'm attending this week as part of Lifehacker's ongoing World Of Servers coverage. These are some of the central points he raised.

Understand your purpose. An overriding theme across Nebuloni's presentation was the importance of identifying the function of your data centre. Cloud systems aimed at the mass consumer market will have different requirements to those with B2B functionality, or those set up for private cloud provisioning.

Set suitable measurement metrics. Data centre function will also influence how you measure performance. In an enterprise data centre, cost per application is a key metric. Companies offering basic cloud and hosting services will measure on revenue per server, while more advanced cloud providers will be more concerned with revenue per kilowatt.
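
As a purely illustrative sketch of how those three metrics break down, here's a small Python calculation; the dollar amounts, server counts and power figures are invented for the example, not drawn from IDC's research.

```python
# Hypothetical back-of-envelope metric calculations -- every figure here is
# made up for illustration, not taken from IDC's data.

monthly_revenue = 250_000.0    # $ earned by the facility per month
monthly_it_cost = 180_000.0    # $ spent running the facility per month
servers = 400                  # physical servers in service
applications = 75              # business applications hosted
power_draw_kw = 320.0          # average IT power draw in kilowatts

cost_per_application = monthly_it_cost / applications     # enterprise view
revenue_per_server = monthly_revenue / servers            # basic hosting view
revenue_per_kilowatt = monthly_revenue / power_draw_kw    # advanced cloud view

print(f"Cost per application:  ${cost_per_application:,.0f}/month")
print(f"Revenue per server:    ${revenue_per_server:,.0f}/month")
print(f"Revenue per kilowatt:  ${revenue_per_kilowatt:,.0f}/month")
```

The point isn't the numbers themselves, but that the metric you optimise for changes depending on which of the three businesses you're in.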

Don't consider any architecture other than x86. Non-x86 systems have never represented more than a tiny percentage of the overall data centre market, and that small share has shrunk further over the last decade.

Choose the appropriate form factor. While rack-ready systems still represent more than half of the typical data centre, denser solutions such as blades are becoming increasingly common. "One server out of three in 2016 will be dense or ultra-dense," Nebuloni predicted.

Concentrate on the major challenges. IDC surveys suggest the biggest issues for data centre managers remain overall reliability, cooling, adequate power provisioning and ensuring sufficient capacity.
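
On the power-provisioning point, a rough back-of-envelope check can flag shortfalls before they become outages. The per-server wattage, rack counts and headroom factor below are assumptions for the sake of the example, not figures from the presentation.

```python
# Rough check that provisioned power covers the planned server load.
# All numbers here are illustrative assumptions.

servers_per_rack = 30
watts_per_server = 450      # assumed average draw under load, in watts
racks = 40
provisioned_kw = 600.0      # power the facility can actually deliver
safety_margin = 1.2         # headroom for cooling overhead, PDU losses, growth

required_kw = racks * servers_per_rack * watts_per_server * safety_margin / 1000

if required_kw > provisioned_kw:
    print(f"Short by {required_kw - provisioned_kw:.0f} kW -- revisit the design")
else:
    print(f"{provisioned_kw - required_kw:.0f} kW of headroom remaining")
```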

Work out where responsibility lies. IT workers are typically responsible for server design and selection, but power might be managed by the facilities division.

Decide whether modular makes sense. Modular and container data centres have become more popular, but they won't suit all environments. "Make sure you are clear on what you are trying to achieve before embarking on a modular data centre project," Nebuloni said.

Don't assume modular centres will be reused. One of the often-cited benefits of container data centres is that they can be relocated, but real-world examples of this happening are rare, Nebuloni argued. "This is a theoretical one at this stage. These options can, at least on paper, be redeployed. We haven't seen many people doing that but it could potentially happen. We haven't seen it as much as we might have expected three years ago."

Lifehacker's World Of Servers sees me travelling to conferences around Australia and around the globe in search of fresh insights into how server and infrastructure deployment is changing in the cloud era. This week, I'm in London for Data Centre World, paying particular attention to how to maximise efficiency and lower costs in the data centre.


Comments

    On a more practical note, make sure all power points are appropriately labelled.

    An Australian bank once had a major outage after a cleaner unplugged a server so they could plug in their vacuum.

      If a single server going down causes any disruption for a bank, they have a lot more to worry about than correct labelling.

    Nothing about redundancy? It is pretty important to calculate the total cost of the least amount of redundancy you can get away with (a rough sketch follows the questions below).

    Are two internet connections enough?
    Do I need to set up in two physical locations to reduce risk?
    How much redundant storage do I need?
    What mitigation plans do I have in place for cyber attacks?
    Do I have backup power?
    Do I need that much backup power if the lines will be down anyway?
    What is the sovereign risk when working in this country?
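
    A very rough way to frame that trade-off, with every figure below invented purely for illustration:

```python
# Back-of-envelope comparison of redundancy cost vs expected outage cost.
# Every figure here is invented for illustration.

annual_redundancy_cost = 80_000.0   # extra links, second site, spare storage ($/yr)
outage_probability = 0.05           # chance of a serious outage in a given year
expected_outage_hours = 12          # how long that outage would likely last
cost_per_outage_hour = 150_000.0    # lost revenue, SLA penalties, recovery ($)

expected_annual_loss = outage_probability * expected_outage_hours * cost_per_outage_hour

if annual_redundancy_cost < expected_annual_loss:
    print("The redundancy pays for itself on expected value alone")
else:
    print("You may be paying more for redundancy than the risk justifies")
```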

    Also, why are non-x86 systems unviable? If you are planning for economies of scale, it should be cheaper to run a single 64-bit physical server at the maximum memory limit than several 32-bit servers at maximum. There are also numerous processing advantages of 64-bit over 32-bit.

    The biggest choke point would be processing capability when running many virtual machines, but if your workload involves large and/or complex tasks, 64-bit makes far more sense.

    I think he was referring to the x86 architecture (and in turn x86_64) in preference to ARM, MIPS etc.

      Author has obviously never stepped inside a data centre.

      ARM's market share is growing fast.
      x86 covers everything from the 8086 to Xeon; Xeon's market share is the largest.

        The x86 numbers are based on IDC's research -- which, all things considered, I'll take over an anecdotal and insulting comment. (And I've set foot in dozens of data centres over the years, FWIW.)

