How Will Mainframes Survive In The Cloud Era?


Cloud computing promises us flexible and reliable service delivery based on charging for what we use, but that’s a model which mainframe computing has been using for decades. How will the use of mainframes evolve in the future?


The comparison between clouds and mainframes isn’t a new one. Sendmail inventor Eric Allman made the point at Linux.conf.au 2011: “Cloud computing is a return to centralised administration. You are handing the keys back to people in those glass rooms.” But not all the people in glass rooms ever left.

BMC Software’s recent global survey of mainframe users emphasises two key points: mainframes aren’t going anywhere soon, but they’re not generally being used for new tasks. Within ANZ, growing use of mainframes is largely driven by existing applications; 91 per cent of regional respondents to the survey said that this was the main reason for the growth in MIPS (millions of instructions per second, the standard measure of mainframe performance). Key priorities for change include reducing costs and improving disaster recovery.

We already know that reducing costs on mainframes is a priority. Beyond that, the big shift has been that data produced by mainframes is now often used to deliver information to consumers accessing (for example) bank accounts via smartphones, rather than simply feeding into corporate systems.

“On the one hand, it is true that large scale organisations are running applications they’ve been running for a fair period of time, but what’s really changed is they’re now saying ‘it’s all about the presentation’,” James Russell, vice president of Mainframe Service Management, Asia Pacific at BMC told Lifehacker. “The workload is still doing transaction processing, but people can be querying it at any point. For the mainframe, it’s normal application workload, but the user interface has completely changed. The model of how we interact with systems has changed so much that we’ve catered for everything to be delivered in a mechanism that’s all about the consumer. But does that change the workload? Not really.”

The bigger workload shift is that batch processing has become less common, and scheduled downtime is a rarity. “Transaction volumes mean there are no quiet times and there’s no outages to run batches. They’ll upgrade their OS and their database environment and they don’t go down while that happens.”

Russell suggests that mainframes will continue to play a role in cloud environments, especially where data confidentiality is an issue. “Security is the number one reason why companies don’t put their data into the external cloud, but they’ll create a mechanism whereby they have their private cloud access that information.”
