The History Of Computing Is Both Evolution And Revolution

This month marks the 60th anniversary of the first computer in an Australian university. The University of Melbourne took possession of the machine from CSIRO, and on June 14, 1956, the recommissioned CSIRAC was formally switched on. Six decades on, our series Computing turns 60 looks at how things have changed.

Justin Zobel is the head of the Department of Computing & Information Systems at the University of Melbourne.


It is a truism that computing continues to change our world. It shapes how objects are designed, what information we receive, how and where we work, and who we meet and do business with. And computing changes our understanding of the world around us and the universe beyond.

For example, while computers were initially used in weather forecasting as no more than an efficient way to assemble observations and do calculations, today our understanding of weather is almost entirely mediated by computational models.

Another example is biology. Where once research was done entirely in the lab (or in the wild) and then captured in a model, it often now begins in a predictive model, which then determines what might be explored in the real world.

The transformation that is due to computation is often described as digital disruption. But an aspect of this transformation that can easily be overlooked is that computing has been disrupting itself.

Evolution and revolution

Each wave of new computational technology has tended to lead to new kinds of systems, new ways of creating tools, new forms of data, and so on, which have often overturned their predecessors. What has seemed to be evolution is, in some ways, a series of revolutions.

But the development of computing technologies is more than a chain of innovation – a process that’s been a hallmark of the physical technologies that shape our world.

For example, there is a chain of inspiration from waterwheel, to steam engine, to internal combustion engine. Underlying this is a process of enablement. The industry of steam engine construction yielded the skills, materials and tools used in construction of the first internal combustion engines.

In computing, something richer is happening: new technologies emerge not only by replacing their predecessors, but also by enveloping them. Computing is creating platforms on which it reinvents itself, reaching up to the next platform.

Getting connected

Arguably, the most dramatic of these innovations is the web. During the 1970s and 1980s, there were independent advances in the availability of cheap, fast computing, of affordable disk storage and of networking.


Ron Bowles at the IBM 7044, a batch processing machine with magnetic tape storage, around 1969.

Compute and storage were taken up in personal computers, which at that stage were standalone and used almost entirely for gaming and word processing. At the same time, networking technologies became pervasive in university computer science departments, where they enabled, for the first time, the collaborative development of software.

This was the emergence of a culture of open-source development, in which widely spread communities not only used common operating systems, programming languages and tools, but collaboratively contributed to them.

As networks spread, tools developed in one place could be rapidly promoted, shared and deployed elsewhere. This dramatically changed the notion of software ownership, of how software was designed and created, and of who controlled the environments we use.

The networks themselves became more uniform and interlinked, creating the global internet, a digital traffic infrastructure. Increases in computing power meant there was spare capacity for providing services remotely.

The falling cost of disk storage meant that system administrators could set aside space to host repositories that could be accessed globally. The internet was thus used not just for email and discussion forums (known then as newsgroups) but, increasingly, as an exchange mechanism for data and code.

This was in strong contrast to the systems used in business at that time, which were customised, isolated, and rigid.

With hindsight, the confluence of networking, compute and storage at the start of the 1990s, coupled with the open-source culture of sharing, seems almost miraculous. It was an environment ready for something remarkable, but without even a hint of what that thing might be.

The ‘superhighway’

It was to enhance this environment that Al Gore, then a US senator and soon to be vice president, proposed the “information superhighway” in 1992, before any major commercial or social uses of the internet had appeared.


Tim Berners-Lee invented the world wide web as an essential tool for high-energy physics at CERN from 1989 to 1994. Flickr/ITU Pictures CC BY-NC-ND

Meanwhile, in 1990, researchers at CERN, including Tim Berners-Lee, created a system for storing documents and publishing them to the internet, which they called the world wide web.

As knowledge of this system spread on the internet (transmitted by the new model of open-source software systems), people began using it via increasingly sophisticated browsers. They also began to write documents specifically for online publication – that is, web pages.

As web pages became interactive and resources moved online, the web became a platform that has transformed society. But it also transformed computing.

With the emergence of the web came a decline in the importance of the standalone computer, dependent on local storage.

We all connect

The value of these systems is due to another confluence: the arrival on the web of vast numbers of users. For example, without behaviours to learn from, search engines would not work well, so human actions have become part of the system.

There are (contentious) narratives of ever-improving technology, but also an entirely unarguable narrative of computing itself being transformed by becoming so deeply embedded in our daily lives.

This is, in many ways, the essence of big data. Computing is being fed by human data streams: traffic data, airline trips, banking transactions, social media and so on.

The challenges of the discipline have been dramatically changed by this data, and also by the fact that the products of the data (such as traffic control and targeted marketing) have immediate impacts on people.

Software that runs robustly on a single computer is very different from software that interacts rapidly and constantly with the human world, giving rise to the need for new kinds of technologies and experts, in ways not even remotely anticipated by the researchers who created the technologies that led to this transformation.

Decisions that were once made by hand-coded algorithms are now made entirely by learning from data. Whole fields of study may become obsolete.

The discipline does indeed disrupt itself. And as the next wave of technology arrives (immersive environments? digital implants? aware homes?), it will happen again.

This article was originally published on The Conversation.

