How Quantum Memory Could Change Computing

In a hot tub in 2012, physicist Seth Lloyd pitched a quantum internet application to Google’s Sergey Brin and Larry Page. He called it Quoogle: a search engine that, using mathematics based on the physics of subatomic particles, returns results without ever actually knowing the query. Such an advance would require an entirely new kind of memory, called qRAM, or quantum random access memory.

Though intrigued, Brin and Page turned the idea down, Lloyd told Gizmodo. According to his story, they reminded him that their business model was based on knowing everything about everyone.

But qRAM as an idea hasn’t died. Today’s computers are quite good at remembering data represented by billions of bits, binary digits that can each equal either zero or one. RAM, or random access memory, stores that data short-term on silicon chips, assigning each piece of data a unique address so it can be retrieved randomly, in any order, whenever a program needs it later.

RAM makes computer processes much faster, letting your laptop or phone quickly grab frequently used program data from memory rather than from storage, which is much, much slower. But one day, computer processors might be supplanted or augmented by quantum processors, machines that would excel at searching through huge datasets, machine learning, and artificial intelligence applications.

Quantum computers are still a nascent technology, but if they’re ever going to be able to run these potentially lucrative algorithms, they’ll need to access RAM in a whole new way. They’ll require qRAM.

“[QRAM] would be an amazing application, and make the kind of quantum devices that Google and IBM make today instantaneously useful,” Lloyd told Gizmodo.

Classical computers, like ThinkPads, iPhones, and the best high-performance supercomputers, perform all their operations by translating data into one or many combinations of bit values, zeroes and ones. The bits interact, then the final result is another combination of ones and zeroes. Quantum computers also spit out a final result of ones and zeroes.

But while the calculation is occurring, their quantum bits, or qubits, communicate with one another in a new way, through the same rules of physics that govern electrons. Rather than just equaling one or zero, a single qubit could be a little bit of both during the calculation, as governed by a special mathematical equation that encodes the probability you’ll get a zero or a one when you actually measure the qubit’s value.

Multiple qubits have more complex equations that treat combinations of qubit values as single mathematical objects. The end result is one or several possible binary strings, with the final value given to the user determined by the probabilities encoded in the equations.
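
In standard textbook notation (nothing specific to Google’s or IBM’s machines, just the generic description of qubits), those equations look roughly like this:

    % A single qubit is a weighted blend of 0 and 1 until it is measured:
    \[
      |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
      \qquad |\alpha|^2 + |\beta|^2 = 1
    \]
    % Measuring it returns 0 with probability |\alpha|^2 and 1 with probability |\beta|^2.
    % Two qubits share one equation covering all four possible bit strings at once:
    \[
      |\Psi\rangle = c_{00}\,|00\rangle + c_{01}\,|01\rangle + c_{10}\,|10\rangle + c_{11}\,|11\rangle
    \]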

This weird qubits-are-equations-until-you-measure-them-and-then-they’re-like-bits-again-except-their-values-might-have-some-innate-randomness maths could be useful for problems that are traditionally hard for computers. One such difficult problem is factoring large numbers into primes, which would crack the algorithm used to encrypt much of our data, a development that could be “catastrophic” for cybersecurity.
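
To get a feel for why factoring is hard for ordinary machines, here is a deliberately naive Python sketch, plain trial division rather than anything quantum; real encryption keys are hundreds of digits long, far beyond what a loop like this could ever finish:

    def factor(n):
        """Naive trial division: fine for small numbers, hopeless for the
        huge semiprimes used in real-world encryption."""
        d, factors = 2, []
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(factor(15))          # [3, 5]
    print(factor(101 * 7919))  # [101, 7919]; the work balloons as the factors get longer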

The same quantum maths could also serve as a new way for computers to manipulate large datasets, like those you might see in machine learning problems (e.g., advanced facial recognition systems).

Quantum computers aren’t better than regular computers, yet. IBM offers researchers and businesses access to a functioning 20-qubit processor, and Rigetti offers a 19-qubit processor, while classical supercomputers can simulate quantum computers’ abilities up to about 50 qubits. Still, physicist John Preskill recently declared that the technology has entered a new era in which quantum computers could soon find uses beyond being interesting physics experiments.

The US government takes quantum tech seriously because of its cybersecurity implications, and plenty of physicists and computer programmers are hunting for new quantum applications.

But many researchers hope to find ways that quantum computers could advance the state of artificial intelligence and machine learning using quantum algorithms. Those algorithms are complex and require accessing significant amounts of data, which means they would need the quantum equivalent of RAM: qRAM.

Quantum RAM isn’t, like, billions of bits somehow stored in a few qubits. Instead, it’s a way for quantum computers to apply their quantum operations to the large lists of data you might see in machine learning problems. Ultimately, regular RAM consists of data stored for a program to use, and programs access that stored data by specifying the address of the bits, the way you can add two spreadsheet cells by typing “=A2+B2”, referring to the cells by address rather than retyping the numbers inside them every time.
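
As a toy illustration of the classical half of that picture (a Python sketch, not how real RAM hardware or spreadsheets are actually implemented):

    # Pretend this list is RAM: each position is an address, each entry is stored data.
    memory = [7, 42, 13, 99]

    def read(address):
        """A classical lookup: one address in, one value out."""
        return memory[address]

    # A program refers to the data by address rather than carrying the values around.
    total = read(1) + read(3)   # 42 + 99
    print(total)                # 141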

Quantum algorithms would need to be able to access regular RAM quantumly. At the most basic level, the computer could set up a superposition of the addresses A2 and B2 at the same time, then return either the value in A2 or the value in B2 when the calculation completes and the result is measured. There’s nothing quantum about the memory itself; the quantum part is how the memory is addressed and read.
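
In the notation typically used in the qRAM literature (where |j⟩ labels an address and m_j is whatever is stored there; the symbols are generic, not tied to any particular hardware), a single quantum memory call would do something like:

    \[
      \sum_j \alpha_j\,|j\rangle\,|0\rangle
      \;\longrightarrow\;
      \sum_j \alpha_j\,|j\rangle\,|m_j\rangle
    \]
    % For the spreadsheet example above: an equal superposition of the addresses
    % A2 and B2 goes in, and a superposition of (A2 with its contents) and
    % (B2 with its contents) comes out. Only when the final result is measured
    % does the computer settle on one concrete answer.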

Basically, if you had a lot of stored data — like the databases used in all those silly “trained a bot” stories — there might be a quantum algorithm that can do a better job than a regular computer at searching through the data or telling you something important about it. This could be lucrative for the financial industry or a company like Google, and once again, would need a quantum RAM.

Lloyd and his team’s decade-old qRAM paper proposed one way for a quantum computer to access only the memory addresses it needs, in superposition, using what they call a quantum bucket brigade. Essentially, since every address in the RAM is just a series of bits, you can lay the memory out as a branching tree, where each bit of the address is a command telling the computer to head left or right at the next fork.

That scheme works fine in a classical computer, but a straightforward quantum version would entangle every switch along the route with the address qubits, building up an enormously large and fragile quantum state that could easily fall apart into a non-quantum one. Lloyd and his collaborators instead envisioned a tree in which every fork sits in a “wait” state by default, so the machine only activates the left or right branches on the path to the memory it needs rather than entangling all the extra stuff.
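
The classical skeleton of that routing idea looks something like the sketch below; the quantum bucket brigade replaces the address bits with qubits and leaves every switch off the chosen path in its “wait” state, so take this only as the non-quantum analogue:

    # Route down a binary tree of memory cells using the address bits as
    # left/right instructions. With 3 address bits there are 2**3 = 8 leaves.
    memory = [5, 1, 4, 1, 5, 9, 2, 6]   # the leaves of the tree

    def bucket_brigade_lookup(address_bits, memory):
        """Follow each address bit to a leaf: 0 means go left, 1 means go right.
        Only the switches along this one path ever get activated."""
        index = 0
        for bit in address_bits:
            index = index * 2 + bit      # descend one level of the tree
        return memory[index]

    print(bucket_brigade_lookup([1, 0, 1], memory))   # address 101 leads to cell 5, value 9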

It’s rather technical, but the scheme is meant to vastly reduce the resources required to run these sorts of machine learning problems.

“Most of the algorithms that people are researching need some kind of quantum RAM,” Michele Mosca, a scientist at the University of Waterloo in Canada who also has researched quantum RAM, told Gizmodo. “Anything we can do to reduce the cost of practical quantum RAM can vastly reduce the timeline to useful quantum computers.”

But we are very, very early on in the days of quantum computing. Looking back at how classical computers remembered things in their infancy is almost laughable today. Early RAM consisted of tiny magnetic loops threaded with wires, where each loop held a single bit and the orientation of the magnetic field in the loop gave the bit’s value.

The first commercially produced American computer, the UNIVAC I, famously stored data by converting electrical pulses into sound waves that travelled through tubes of liquid mercury. This wasn’t random access memory; rather than being able to retrieve any stored data whenever you wanted, you could only read the data back in the order it was sent down the line. But it was considered cutting-edge.

“It was state of the art,” Chris Garcia, curator at the Computer History Museum, explained to Gizmodo. “They were throwing whatever would stick to the wall at the time,” but things like this were better than anything else that existed. Getting to the way computers store memory today, on microchips made from semiconductor materials, required advances in the underlying science as well as in the manufacturing processes that made silicon storage much cheaper than little rings of magnets.

What will quantum RAM actually look like? Probably not the way that Lloyd and his team envisioned. At a conference last year, physicists joked with me that quantum computing as a field could very well be developing more vats of liquid mercury.

There are most certainly technological and mathematical advances yet to be uncovered that will optimise the computers and how they eventually store data.

Lloyd agreed. “I would love if someone trounced our original idea,” he said. “If you could load classical information into quantum states, it would be a tremendous application for these near-term quantum computers.” After all, computers are more than just their ability to run fancy algorithms — they’re exciting for the way those algorithms can manipulate and abstract data in order to do something useful.

And hey, maybe there really will be a quantum Google one day.

