To advance Quantum computing, Google has now created a Quantum microchip called “Willow”. That chip was able to solve a mathematical problem in 5 minutes, a task that would take the most powerful supercomputer available right now 10 septillion years to accomplish. A septillion is 10 raised to the power of 24, or 1000 crore crore crore. Even the age of the known Universe, at about 1370 crore years, fades into insignificance before this figure.
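To put the two figures side by side, here is a small back-of-the-envelope calculation in Python. It uses only the numbers quoted above (the 10-septillion-year estimate and the roughly 1370 crore year age of the Universe), so treat it as an illustration of scale rather than anything official:

```python
# Back-of-the-envelope scale check using only the figures quoted above.
septillion = 10 ** 24                # 1 septillion = 10^24 = 1000 crore crore crore (10^3 x 10^7 x 10^7 x 10^7)
task_years = 10 * septillion         # the quoted 10-septillion-year supercomputer estimate
universe_age_years = 1370 * 10 ** 7  # ~1370 crore years, the age of the known Universe

print(f"Task time / age of Universe: {task_years / universe_age_years:.1e}")
# prints roughly 7.3e+14, i.e. about 730 lakh crore times the age of the Universe
```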
Earlier, the main problem with Quantum computing was the number of errors. But errors decrease as the qubits (the bits of ordinary computers become qubits in Quantum computers) increase, and in this chip they have been largely overcome thanks to its much higher speed and other protection systems.
Quantum computers also have to be maintained at temperatures just above absolute zero to keep errors in check.
This chip is truly mind-boggling and pushes the frontiers of computing to an unbelievable extent. It makes one wonder where the human race is heading. Is there any limit to such inventions?
But then, all said and done, Quantum computing has certain limitations. Quantum computers operate through Quantum entanglement, the very thing that gives them their speed, yet it is found that they can generate only a limited amount of entanglement before they are disturbed by noise. Quantum computers are inherently noisy: without error correction technologies, roughly one qubit in a 1000-qubit Quantum computer fails, whereas only about 1 in a billion billion bits fails in a conventional computer. The challenge is to build quantum computers that are less error prone.
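To appreciate why that gap matters, the following Python sketch compares the two failure rates. The per-operation error rates (about 1 in 1,000 for an uncorrected qubit and 1 in a billion billion for a conventional bit) are the rough figures mentioned above, and errors are assumed to be independent purely for illustration:

```python
import math

# Illustrative only: per-operation failure rates suggested by the figures above.
QUBIT_ERROR_RATE = 1e-3   # roughly 1 error in 1,000 qubit operations, without error correction
BIT_ERROR_RATE = 1e-18    # roughly 1 error in a billion billion conventional bit operations

def prob_at_least_one_error(per_op_error_rate: float, num_operations: int) -> float:
    """Chance that at least one error occurs in num_operations independent operations."""
    # Equals 1 - (1 - p)^n, written in a numerically stable form for tiny p.
    return -math.expm1(num_operations * math.log1p(-per_op_error_rate))

for ops in (1_000, 1_000_000):
    print(f"{ops:>9,} operations -> "
          f"qubits: {prob_at_least_one_error(QUBIT_ERROR_RATE, ops):.3f}, "
          f"bits: {prob_at_least_one_error(BIT_ERROR_RATE, ops):.1e}")
```

Even a modest run of a thousand operations is more likely than not to go wrong on uncorrected qubits, while the same workload on conventional bits is essentially error free; that is why error correction is the central challenge.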
There are some other problems to be addressed as well, but that does not mean they cannot be overcome. It is only a matter of time before Quantum computers are fully exploited and an entirely new world is enabled by them, perhaps in another 10 years’ time.
Google claims that its Willow QPU (Quantum Processing Unit) is the first in the world to achieve results “below threshold”, the point beyond which a QPU becomes sufficiently error free to be useful and trustworthy for computing, as outlined by a computer scientist in a research paper published in 1995.
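The “below threshold” idea can be illustrated with the standard surface-code scaling, in which the logical error rate falls roughly as p_L ≈ A·(p/p_th)^((d+1)/2) once the physical error rate p drops below the threshold p_th. The Python sketch below uses assumed values for the threshold, the prefactor A and the physical error rates, not Willow’s published figures:

```python
# A minimal sketch of the "below threshold" idea, using the standard
# surface-code scaling p_logical ~ A * (p / p_th) ** ((d + 1) / 2).
# Threshold, prefactor and physical error rates are illustrative assumptions.

P_THRESHOLD = 1e-2   # assumed threshold error rate
A = 0.1              # assumed prefactor

def logical_error_rate(p_physical: float, distance: int) -> float:
    """Approximate logical error rate of a distance-d surface code."""
    return A * (p_physical / P_THRESHOLD) ** ((distance + 1) / 2)

for p in (2e-2, 5e-3, 1e-3):   # above, slightly below, and well below threshold
    rates = [logical_error_rate(p, d) for d in (3, 5, 7)]
    trend = "worse" if rates[-1] > rates[0] else "better"
    print(f"p = {p:.0e}: d=3,5,7 -> {rates[0]:.1e}, {rates[1]:.1e}, {rates[2]:.1e} "
          f"({trend} with more qubits)")
```

Below the threshold, adding more qubits (a larger code distance) makes the logical error rate fall rapidly; above it, larger codes only make matters worse, which is why crossing this line is treated as such a milestone.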