Ask HN: Will quantum computing offset the power grid needs of AI?
What do you think? If successful, will quantum computing reduce the power needs of AI and datacenters or will we come up with new ideas and problems that will continue to increase power needs?
Relevant link: https://www.nature.com/articles/s43588-023-00459-6
Currently most qubits require cryogenic cooling, so without a breakthrough in room-temperature superconductors etc., we could easily end up in a situation where quantum computing is used for specialized tasks where it is faster than conventional computing, but the end result is more energy expenditure overall.
Edit: clarified the wording
It's hard to be sure, but quantum computing is useful only for a few specific problems, for example computing the Fourier transform, because it's easy to parallelize and each part needs very similar operations. With some clever tricks (Shor's algorithm), this can be used to factorize numbers. But it's not clear how to use it to solve arbitrary problems, so I'd guess a quantum AI is very difficult.
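For concreteness: the quantum Fourier transform is mathematically just the discrete Fourier transform applied to a state's amplitude vector. A minimal classical simulation in NumPy (illustrative only; it builds the full N x N matrix, whereas the quantum circuit achieves this with O((log N)^2) gates, which is where the speedup comes from):

```python
import numpy as np

def qft_matrix(n):
    """Quantum Fourier Transform as a unitary matrix on n qubits (N = 2**n)."""
    N = 2 ** n
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

# A random normalized 3-qubit state (8 complex amplitudes)
rng = np.random.default_rng(0)
state = rng.normal(size=8) + 1j * rng.normal(size=8)
state /= np.linalg.norm(state)

U = qft_matrix(3)
# The QFT agrees with the (inverse-convention) discrete Fourier transform,
# up to normalization:
assert np.allclose(U @ state, np.fft.ifft(state) * np.sqrt(8))
# U is unitary, so the transformed state is still a valid quantum state:
assert np.isclose(np.linalg.norm(U @ state), 1.0)
```

The catch, as the comment notes, is that very few problems reduce to this structure; period finding in factoring happens to be one of them.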