Quantum RAM: Modelling the big questions with the very small

Griffith’s Professor Geoff Pryde, who led the project, says that such processes could be simulated using a “quantum hard drive”, much smaller than the memory required for conventional simulations.

“Stephen Hawking once stated that the 21st century is the ‘century of complexity’, as many of today’s most pressing problems, such as understanding climate change or designing transportation systems, involve huge networks of interacting components,” he says.

“Their simulation is thus immensely challenging, requiring storage of unprecedented amounts of data. What our experiments demonstrate is that a solution may come from quantum theory, by encoding this data into a quantum system, such as the quantum states of light.”

Einstein once said that “God does not play dice with the universe,” voicing his disdain with the idea that quantum particles contain intrinsic randomness.

“But theoretical studies showed that this intrinsic randomness is just the right ingredient needed to reduce the memory cost for modelling partially random statistics,” says Dr Mile Gu, a member of the team who developed the initial theory.

In contrast with the usual binary storage system - the zeroes and ones of bits - quantum bits can be simultaneously 0 and 1, a phenomenon known as quantum superposition.

The researchers, in their paper published in Science Advances, say this freedom allows quantum computers to store many different states of the system being simulated in different superpositions, using less memory overall than in a classical computer.
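The memory argument can be made concrete with a toy sketch (this is an illustration of qubit superposition in general, not the researchers' actual experiment): a classical bit holds one definite value, while a qubit is described by two amplitudes, and a register of n qubits needs 2^n amplitudes to describe, which is why a small quantum memory can represent many states of a simulated system at once.

```python
import math

# A classical bit is exactly one of 0 or 1.
classical_bit = 1

# A qubit state a|0> + b|1> is a pair of amplitudes with |a|^2 + |b|^2 = 1.
# The equal superposition gives 0 or 1 with probability 1/2 each on measurement.
qubit = (1 / math.sqrt(2), 1 / math.sqrt(2))

prob_zero = abs(qubit[0]) ** 2
prob_one = abs(qubit[1]) ** 2
print(round(prob_zero, 2), round(prob_one, 2))  # 0.5 0.5

# Scaling: describing n qubits takes 2**n amplitudes, so a few qubits can
# hold a superposition over many configurations of the simulated system.
n = 3
amplitudes_needed = 2 ** n
print(amplitudes_needed)  # 8
```

The exponential count of amplitudes is the standard textbook picture of why quantum states are hard to simulate classically, and, turned around, why encoding a simulation's data into quantum states can compress it.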

The team constructed a proof-of-principle quantum simulator using a photon - a single particle of light - interacting with another photon.

They measured the memory requirements of this simulator, and compared it with the fundamental memory requirements of a classical simulator, when used to simulate specified partly random processes.

The data showed that the quantum system could complete the task with much less information stored than the classical computer - a factor of 20 improvement at the best point.

“Although the system was very small - even the ordinary simulation required only a single bit of memory - it proved that quantum advantages can be achieved,” Pryde says.

“Theoretically, large improvements can also be realized for much more complex simulations, and one of the goals of this research program is to advance the demonstrations to more complex problems.”

Griffith University

Folks who are new to the IT biz might be confused by many of the adjectives that are used to describe various software products. Here, we provide a brief glossary of several common descriptors and their definitions.

  • Powerful: gimmicky
  • Self-documenting: verbose
  • Encourages best practices: explodes if you look at it funny
  • Exposes low-level functionality: bad user interface
  • Industry standard: archaic
  • Community-driven: untested
  • Proprietary: undocumented
  • Certified: overpriced
  • Flexible: config file is larger than actual program
  • Interoperable: equally incompatible with everything
  • Drop-in replacement: cunning trap
  • User friendly: pain in the ass

DIGITAL FUN̷̢̛͝ FACT #7281/a: By the year 2025, all of human̵͔̦̟͔̖̣͇̘̬̱̜͘͟͞ͅkind will be caught in a recursive software iń̤̹̙stallation̯̺ loop wheņ̵̸́ the only n̸̷̨͙̯̱̼̗̦͎͙͎̼̰͇̠̹͉̥̰͓̖͢͞ew computer programs available are for in̯̟̗͢͡͞͞͞stallin̷̷̗̟̩̤͍̥͕̻̭̞͓̜̥̟̹ͯ͒ͥͫͪͦ͛ͣ̓ͫ͐̃͗̀g n̛͜͝͠ew computer programs to iǹ̕stall n̸̡͝ew computer programs ad n̶̥̪͓̣̺̤̭͖̯̞͉̝̍ͧ̇̄̿̓ͤ͑ͨ̐ͪ͢͢͡͡ͅauseam.