Not Everything Has To Run Faster

Management invariably wants IT to be faster and cheaper and more reliable. Delivering that trifecta is almost impossible, but it’s worth bearing in mind that improving raw performance doesn’t always have to be the goal. You may need speed, but you probably don’t need speed everywhere.

I was reminded of this during a presentation at NetApp’s Elevate customer event in Sydney yesterday. Adrian Jansz, the ICT manager for property information provider RP Data, was detailing how the company consolidated its data centre requirements down from nine locations to just two.

Storage infrastructure wasn’t something that RP Data wanted to skimp on. “Storage is one of the cornerstones for our information infrastructure,” Jansz said. “Having data in our name, it’s paramount to have a Tier 1 storage provider behind us.”

What RP Data didn’t necessarily need, but had been paying for, was high-speed fibre connections to enable near-instantaneous replication of data. That was an expensive and excessive choice.

“Everybody wants real-time replication and being able to restore straight away is super-important, but in reality we’re not saving lives,” Jansz said. “We’re not in an industry that requires the availability that other industries adhere to. So we worked out our own appropriate service levels, retired a dark fibre synchronous link and replaced it with something more economical, saving a bunch of money.”

“We don’t really require real-time replication for a lot of our data sets. Our data marts are built overnight.”

Conversely, RP Data switched to a flash array solution for some of its internal operations, because in that context improved performance did make a definite difference. “There was a business need to get far more throughput from our databases,” Jansz said. “The IT guys were saying ‘we can’t get to our data marts fast enough’.”

In that case, the switch made a useful and measurable difference. “We have multiple data feeds and data marts every day. Some of our principal data sets are roughly 700 million rows and 300GB of data.” The extract-transform-load process had taken up to 14 hours with the old solution, which dropped to 3.5 hours with the new version.
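For scale, those quoted figures work out to a 4x speedup. A quick back-of-envelope calculation (assuming the full 300GB data set is processed once per run, which the article doesn't specify) gives the effective throughput before and after:

```python
# Back-of-envelope throughput from the figures quoted above:
# ~700 million rows / ~300GB, run time 14 hours before, 3.5 hours after.
ROWS = 700_000_000
GIGABYTES = 300

def rate(hours):
    """Return (rows per second, MB per second) for a given ETL duration."""
    secs = hours * 3600
    return ROWS / secs, (GIGABYTES * 1024) / secs

old_rows, old_mb = rate(14)
new_rows, new_mb = rate(3.5)
print(f"old: {old_rows:,.0f} rows/s, {old_mb:.1f} MB/s")   # ~13,889 rows/s, ~6.1 MB/s
print(f"new: {new_rows:,.0f} rows/s, {new_mb:.1f} MB/s")   # ~55,556 rows/s, ~24.4 MB/s
print(f"speedup: {14 / 3.5:.0f}x")                          # 4x
```

Even the "after" figure is modest next to raw flash bandwidth, which is a reminder that ETL runs are usually bound by the whole pipeline, not just the storage layer.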

That also made the budget for the higher-priced flash solution easier to procure. “If you don’t get the business buy-in, you’re going to be pushing shit uphill,” Jansz noted.

Evolve is a weekly column at Lifehacker looking at trends and technologies IT workers need to know about to stay employed and improve their careers.