Google and NASA partner to explore quantum computing using D-Wave 2X technology.
Although the concept of quantum computing has been around since the early 1980s, the rise of Artificial Intelligence (AI) and the Internet of Things (IoT) is reinvigorating modern research and the computing industry.
According to recent reports, Google and NASA have been studying how quantum systems can advance artificial intelligence and machine learning, and solve difficult optimisation problems, using hardware from quantum firm D-Wave.
It’s the latter challenge, tackled through a technique called quantum annealing, that’s generating fresh excitement. The researchers are using the D-Wave 2X for this work, which could help bring quantum computing into the mainstream.
What is Quantum Computing?
Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level.
Development of a practical quantum computer would mark a leap forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with performance gains in the billion-fold realm and beyond.
Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.
Since its origin in the early 1980s, quantum computing has long challenged researchers pursuing fundamentally faster approaches to problems such as those in AI.
The researchers emphasized that quantum computing is still in the experimental stages and has yet to be commercialized. Rupak Biswas, deputy director of exploration technology at NASA Ames, likened the state of quantum computing to the early development of conventional computers during the 1930s and ’40s.
“This is not a commercial product,” he said. “This is not like you can go to your shopping mall and pick up one of these quantum computers so you can start texting and Facetime.”
How Quantum Computing Began
The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Laboratory, in 1981. He theorized a classical computer operating with some quantum mechanical principles.
But it is generally accepted that David Deutsch of Oxford University provided the critical analysis for quantum computing research.
In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later.
Historical Background on Quantum Theory
Quantum theory’s development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called “quanta”), as does matter.
Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.
The Essential Elements of Quantum Theory
* Energy, like matter, consists of discrete units rather than existing solely as a continuous wave.
* Elementary particles of both energy and matter, depending on the conditions, may behave like either particles or waves.
* The movement of elementary particles is inherently random and thus unpredictable.
* The simultaneous measurement of two complementary values, such as the position and momentum of an elementary particle, is inescapably flawed; the more precisely one value is measured, the more flawed will be the measurement of the other value.
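The last point is Heisenberg’s uncertainty principle. In standard notation, with Δx the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant, it reads:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

The more tightly one quantity is pinned down, the larger the other’s uncertainty must grow to keep the product above ħ/2.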
The D-Wave 2X and the Internet of Things (IoT)
The 2X is a quantum computer that, Google has reported, ran up to 100 million times faster than a conventional single-core computer on certain annealing problems. But using a D-Wave machine isn’t as simple as plugging it in and typing queries.
The 2X stands ten feet tall and must be kept near absolute zero, at about -273 degrees C. Aside from the supply of liquid helium needed to reach such temperatures, those hoping to use a quantum machine must work out how to write algorithms and programs that suit the system.
“Representing information in qubits allows the information to be processed in ways that have no equivalent in classical computing,” NASA noted. “We’ve built the machine; now we need to discover how it works.”
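As a rough illustration of what a qubit adds over a classical bit, the sketch below (plain Python; the helper names are invented for this example) models a single qubit as a pair of complex amplitudes, puts it in an equal superposition, and samples a measurement outcome:

```python
import math
import random

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Unlike a classical bit, it holds weight on
# both basis states at once until it is measured.

def make_superposition():
    # Equal superposition (|0> + |1>) / sqrt(2), as a Hadamard gate produces
    a = 1 / math.sqrt(2)
    return (complex(a, 0), complex(a, 0))

def probabilities(state):
    # Born rule: outcome probabilities are the squared amplitude magnitudes
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

def measure(state):
    # Measurement collapses the state, returning 0 or 1 at random
    p0, _ = probabilities(state)
    return 0 if random.random() < p0 else 1

state = make_superposition()
p0, p1 = probabilities(state)
print(p0, p1, measure(state))
```

A real quantum computer manipulates many such amplitudes at once with unitary gates; this toy only shows the state-and-measurement bookkeeping.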
If Google and NASA’s research delivers promising results, this branch of machine learning and AI will be ideal for optimisation, that is, figuring out the best way to do something.
For NASA, that could be selecting a flight path to Mars, but it could be applied to anything from drug modeling to cracking encryption, or simply revealing the most efficient train route through Europe for a summer excursion.
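Quantum annealing itself requires D-Wave hardware, but its classical cousin, simulated annealing, shows the shape of such optimisation problems. The sketch below (plain Python; the city coordinates are invented for illustration) searches for a short round trip through a handful of European cities by accepting occasional worse routes early on to escape local minima:

```python
import math
import random

# Illustrative coordinates only (roughly longitude, latitude)
CITIES = {
    "Paris": (2.35, 48.85), "Berlin": (13.40, 52.52),
    "Madrid": (-3.70, 40.42), "Rome": (12.50, 41.90),
    "Vienna": (16.37, 48.21),
}

def tour_length(tour):
    # Total straight-line length of the closed route
    total = 0.0
    for i in range(len(tour)):
        x1, y1 = CITIES[tour[i]]
        x2, y2 = CITIES[tour[(i + 1) % len(tour)]]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

def anneal(steps=20000, temp=10.0, cooling=0.9995, seed=0):
    rng = random.Random(seed)
    tour = list(CITIES)
    rng.shuffle(tour)
    best = tour[:]
    for _ in range(steps):
        # Propose swapping two cities in the route
        i, j = rng.sample(range(len(tour)), 2)
        cand = tour[:]
        cand[i], cand[j] = cand[j], cand[i]
        delta = tour_length(cand) - tour_length(tour)
        # Accept worse tours with probability exp(-delta/temp);
        # the shrinking temperature makes the search greedier over time
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            tour = cand
        if tour_length(tour) < tour_length(best):
            best = tour[:]
        temp *= cooling
    return best, tour_length(best)

route, dist = anneal()
print(route, round(dist, 2))
```

A quantum annealer attacks the same kind of energy-minimisation landscape, but uses quantum effects such as tunnelling rather than thermal jumps to move between candidate solutions.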