Here is a good video by a woman with a PhD in quantum computing, packed with good information on the subject:
As with so many on-the-cusp technologies these days, there is a lot of unnecessary overstatement of current capabilities, which leads to confusion in the general population.
People end up assuming way too much about these technologies, to the point that they become more of a meme.
Quantum computing is one of those overstated technologies.
Surprisingly, the biggest hurdle for quantum computing isn't the hardware; it's the software. The core problem is that the "observer effect" looms large over quantum computing's output. While a register of qubits can compute over a massive number of states in parallel, the act of reading that massively parallel output collapses it to a single answer, when the real power of the parallel computation is all of the parallel answers.
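You can see the shape of the problem with a classical simulation. This is a minimal numpy sketch (simulating amplitudes on an ordinary CPU, not real quantum hardware): a 3-qubit register tracks 8 amplitudes at once, but measuring it returns exactly one basis state.

```python
import numpy as np

# A 3-qubit register holds 2^3 = 8 complex amplitudes simultaneously.
n_qubits = 3
dim = 2 ** n_qubits

# Uniform superposition: every basis state is equally weighted.
state = np.ones(dim, dtype=complex) / np.sqrt(dim)

# "Reading" the register: measurement collapses all 8 amplitudes
# to a single basis state, sampled with probability |amplitude|^2.
probs = np.abs(state) ** 2
outcome = int(np.random.choice(dim, p=probs))

print(f"Register tracked {dim} amplitudes; measurement yielded only one: {outcome:0{n_qubits}b}")
```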
As I've heard it described, you could feed 500 separate inputs into a quantum computer and it can instantly run its algorithm to generate 500 answers, but you will only be able to extract one of those outputs, and you won't really know which of the 500 inputs that answer applies to. This isn't a problem for tasks like cracking encryption, since the encryption algorithm only cares that you handed it one of the real answers; it doesn't care what input you used to generate it.
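That's also why code-breaking is a natural fit: the one answer you do extract is cheap to verify classically. A sketch of that verification step, with a made-up modulus and a hypothetical candidate standing in for a quantum run's output:

```python
from math import gcd

# Suppose a quantum run (e.g., Shor's algorithm) spits out one candidate
# related to the RSA-style modulus N = 3233 (= 61 * 53). We don't know
# which input branch produced it, and we don't need to: checking it
# classically is trivial. The candidate below is hypothetical.
N = 3233
candidate = 61  # stand-in for a quantum computer's single extracted output

if 1 < gcd(candidate, N) < N:
    print(f"{gcd(candidate, N)} is a nontrivial factor of {N} -- the one answer we needed.")
else:
    print("Useless output; just run the quantum computer again.")
```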
This is such a big hurdle that in the whole history of quantum computing there are really only two working algorithms (essentially the full extent of quantum computing software) that can generate useful outputs, with a third theoretical algorithm published just last year.
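The post doesn't name them, but the two usually cited are Shor's factoring algorithm and Grover's search. To make "working algorithm" concrete, here is a classical numpy simulation of Grover's search over 16 items (an illustrative sketch, not hardware code): each oracle-plus-diffusion round pumps probability onto the marked item, so the single answer a measurement yields is very likely the right one.

```python
import numpy as np

# Grover's search over N = 16 items (4 qubits), simulated classically.
N = 16
marked = 11                      # the item we're searching for

# Start in the uniform superposition.
state = np.ones(N) / np.sqrt(N)

# Optimal iteration count is about (pi/4) * sqrt(N).
iterations = int(round(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    # Oracle: flip the sign of the marked item's amplitude.
    state[marked] *= -1
    # Diffusion: reflect every amplitude about the mean amplitude.
    state = 2 * state.mean() - state

# Measurement now returns the marked item with high probability (~0.96 here).
probs = state ** 2
print(f"P(measure {marked}) = {probs[marked]:.3f} after {iterations} iterations")
```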
So as it stands right now, the issue with quantum computing isn't the hardware. Building systems with more qubits is more an engineering task than theoretical science, and we humans are good at brute-forcing engineering solutions. What we can do with that hardware is driven by the algorithms, which is where the field is actually stalled.
I've toyed with the potential of a future application where we run AI on a quantum computer (which would arguably best simulate an organic brain), but the reality is that there is a chasm between quantum computing and general-purpose software that may never actually be bridged. Asking quantum computing to take on AI might, in the end, make as much sense as tasking a power drill with cooking you dinner.
IN THEORY, some of quantum computing's limitations can be overcome by running more qubits. The more qubits you dedicate to a task, the more differentiated outputs you can derive from that massively parallel computation, but it won't really change the kinds of tasks quantum computing is uniquely positioned to answer.
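For a rough sense of that scaling (a back-of-the-envelope snippet, not a claim about any particular machine): every added qubit doubles the number of amplitudes a register can track, even though each measurement still yields just one outcome.

```python
# State space grows as 2^n: each added qubit doubles the number of
# amplitudes tracked, but a measurement still returns a single outcome.
for n in (10, 50, 100, 300):
    print(f"{n:>3} qubits -> 2^{n} = {2 ** n:.3e} simultaneous amplitudes")
```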
In the end, the most significant short-term contribution of quantum computing will likely be in materials research, where a quantum computer could make it much easier to discover chemistry's holy grails, like room-temperature superconductors, which would in turn dramatically boost traditional computing power (and quantum computer qubit counts, for that matter).