Speaking from experience, as somebody who works in quantum computing.
* Theoretical physicists make hypotheses and find abstract circuits to accomplish a goal; experimental physicists work closely with them to validate those circuits -- a cycle of experimentation, hypothesis testing, and lots of statistics (aka science).
* Circuit designers find concrete implementations of those circuits and work closely with the fab to produce them. The fab employs chemists and physicists to test, characterize, and refine their processes through experimentation, hypothesis testing, and lots of statistics (aka science).
* When experiment-laden chips come back from the fab, the circuit designers and theoretical physicists work with experimental physicists and lab technicians to characterize and validate the implementation through experimentation, hypothesis testing, and lots of statistics (aka science).
* Individual devices are assembled into processors by an architect, who has a high-level understanding of the circuits, fab processes, and algorithms that use the processors -- coordinating a larger feedback loop with algorithm researchers involving experimentation, hypothesis testing, and lots of statistics (aka science).
Scienceless design is a myth worthy of quashing. Our process isn't much different from those employed in the classical computing realm. I'd claim that quantum computing isn't even groundbreaking in this regard: there's probably a billion person-hour trail of scientific studies that took us from the Antikythera device to the 7nm node (cough cough, or 10nm; who's counting?) and the architectures built thereupon.
If you're doing "design" without "science" as a subroutine, you're probably just following a recipe.
It isn't a science. It doesn't delve into natural laws. Quantum computing, like computer science, is mathematics. The applied versions of quantum computing and computer science are engineering. One of my professor's biggest pet peeves was the misnomer "computer science". He always said that he was a mathematician and not a scientist. Turing was a mathematician, not a scientist.
> Outside the US mathematics is a science. (Because deduction is part of the scientific method.)
That's a reason for math to be recognized as an important tool for science (and a certain level of it a prerequisite for certain science work), but lacking an empirical component, it makes no sense to consider it a science.
Whether math is a science isn't a matter of location. It's a matter of definition. Math is not a science by definition anywhere.
Deduction is used in philosophy as well. Doesn't make philosophy or logic a science. And the major underpinning of the scientific method is empirical, not deductive.
Can you point to where the scientific method is used in mathematics? It isn't. It's why math has axioms, theorems, and proofs while science has hypotheses, theories, and empirical tests.
How can I empirically test that the square root of 2 is irrational? I can't, because math is not a science. And I don't need to, because it has been proven that the square root of 2 is irrational. In math, once proven, it's always true. In science, theories are true until they've been proven to be false. In science, theories are falsifiable. In math, theorems are not. They are true forever.
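The irrationality of the square root of 2 is a good illustration of the contrast: the classical proof is purely deductive, with no empirical step anywhere. A sketch:

```latex
\textbf{Claim.} $\sqrt{2}$ is irrational.

\textbf{Proof.} Suppose $\sqrt{2} = p/q$ with $p, q$ coprime integers.
Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2k$.
Substituting gives $4k^2 = 2q^2$, so $q^2 = 2k^2$ and $q$ is even too,
contradicting the assumption that $p$ and $q$ are coprime. $\blacksquare$
```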
I tend to view "science" as quintessentially a "success term." We call a discipline a science if we have a high degree of confidence in its results. In this sense, math is a science, and whether psychology or economics are scientific has less to do with their subject matter than whether we think they produce real knowledge.
I like this notion of science because it helps to account for the broader sense of "science" you see in earlier periods of history, before it became more narrowly associated with experimental science. More importantly, I think it gets at what actually interests us in the designation of "science."
On the other hand, whether or not philosophy is a science is also up for question; if we go by what you're saying, we can approximate "real knowledge" to be "progress", but whether or not philosophy makes progress is hotly debated in metaphilosophy. There are some interesting arguments on this site[0] about it. In particular, as you note, the notion that science is purely empirical/"experimental" is relatively new, and outside of the Anglosphere, at least, it was common to see philosophy take up the name of science through words such as Wissenschaft - both Kant and Hegel published works which had their titles translated as the "Science of Logic", for instance.
But most people seem to agree that the crudest formulation of science as that which is falsifiable (a la Popper) is inadequate; as the SEP notes, astrology is falsifiable and would thus qualify as a science! Furthermore, research has shown that what gets published as "science" in the journal Nature usually doesn't start from a falsifiable hypothesis; the research is exploratory in the first place.
Defining science is tricky, and I think we should reconsider how much weight we assign to the term as a proxy for concepts like rigor, progress, replicability, and the ability to provide new knowledge or confirm old knowledge.
> astrology is falsifiable and would thus qualify as a science
To qualify as science it should also not have been falsified. There is simply no problem with astrology being a candidate here; it's not rejected because of its form, but because of its contents.
Not at all: the matter is one of falsifying hypotheses, and just as one can falsify hypotheses in physics, one can also falsify them in astrology. Popper said:
>“statements or systems of statements, in order to be ranked as scientific, must be capable of conflicting with possible, or conceivable observations”
But this excludes the possibility of a pseudoscientific claim that is refutable. One such claim from astrology might be that the positions of the planets decide our fate, e.g., to die on a particular day. But this can be, and has been, falsified. According to Popper, if it can't be tested then it's not science; but if it can be tested and shown to be correct or incorrect, then it is science. Popper confirms this interpretation later:
>“A sentence (or a theory) is empirical-scientific if and only if it is falsifiable.”
Testing and refining the "theories" of astrology would qualify as scientific inquiry, but what practitioners of astrology actually do is far from scientific, so the subject itself really shouldn't be considered a science.
But this process of testing and refining theories, i.e., applying the scientific method to the claims of astrology, isn't any different from the way it's done in physics, nor does it rule out a possible hypothesis being put forward which is successful. Practitioners of astrology put forward at least some falsifiable hypotheses (or at least, we can interpret them as hypotheses), just as physicists do.
If the criterion for science is simply whether or not the hypotheses can be falsified, this very much includes astrology and can exclude (as I mentioned in the case of Nature) legitimate science. Saying that astrology is far from scientific simply reopens the question: what counts as science?
We both agree that astrology isn't science, or it shouldn't be considered science, but my point is simply that the criterion of falsifiability is not sufficient to delineate science from non-science. Maybe we should say that the majority of claims made in the field should be (i) falsifiable (ii) shown to be correct, but that still leaves several cases open.
I agree that science is more than just making falsifiable claims, but I don't think that's a fair characterization of Popper. From what I understand of Popper's philosophy (not having read his work first-hand), science is the development and refinement of theories that produce falsifiable claims. That already disqualifies astrology, and we could certainly add more criteria. But I don't think "shown to be correct" should be one because it's impossible in general, which is what inspired the falsifiability criterion in the first place.
The popular conception of astrology as a mechanistic and predeterministic way to evaluate someone's life is trivially falsifiable, like any mechanistic and predetermined way to evaluate someone's life.
But that's straw-manning it to the max.
The more realistic conception of astrology is as a prehistoric psychological counseling tool, which cannot be falsified. Some anxious, panic-ridden pregnant woman asks an astrologer if her newborn will be healthy or not; the astrologer sees that anxiety will only make the mom and soon-to-be newborn more unhealthy, and calms mom down by explaining the baby will be fine and she can trust him because he's so smart that he knows the constellation Pisces was in the sky when she was born (in the spring, because he's old and remembers her). He knows so much about nature and biology that he can mathematically, and mostly correctly, predict star positions; not bad for 10000 BC. She's like "oh wow, this dude is smart, for a prehistoric dude anyway" and calms down, and the lower stress hormones mean she and the baby survive delivery. Holy cow, astrology works? That's a cut-and-dried example. A more modern example: nothing in medicine is 0% or 100%, and most times anything that makes the patient feel better results in a better outcome. Again: astrology as an auxiliary side therapy in concert with modern medicine works?
Astrology makes definite claims, and those claims have been shown to be false. Check out Shawn Carlson's 1985 piece "A Double-Blind Test of Astrology" in Nature.
I agree! But the popular conception of science as merely something falsifiable doesn't include that condition - my whole point here (which for some reason people are downvoting me for...?) is that Popper's definition isn't sufficient, because it both includes pseudoscience (astrology) and excludes legitimate (exploratory) science.
Perhaps I am misunderstanding the author or simply not fully grasping the concept of "design science"[0], but it seems as if they are simply trying to reclassify engineering with a term that sounds more "science-y".
Quantum computing boils down to unitary matrix operations over complex space. It's very straightforward. But the way those pieces behave in more complicated systems and the emergent properties they give rise to can be studied... Scientifically.
Some of it will be encompassed by computer science, in that algorithms will still have the same metrics, but there may be other metrics by which to study quantum algorithms.
Those probability distributions are typically represented as complex vectors and quantum gates as unitary matrix operations on those vectors.
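A minimal sketch of that representation (using numpy; a toy two-line example I'm adding for illustration, not the simulator's actual code): a qubit state is a complex unit vector, a gate is a unitary matrix, and measurement probabilities come from the squared magnitudes of the amplitudes.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate (unitary)

psi = H @ ket0                  # applying a gate is just a matrix-vector product
probs = np.abs(psi) ** 2        # Born rule: probabilities of measuring 0 or 1
# probs is [0.5, 0.5]: an equal superposition
```

Unitarity (H† H = I) is what guarantees the probabilities always sum to 1.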
Here's a quantum computer simulator [1] I wrote years ago that allows you to construct quantum circuits using the most common quantum gates found in the literature. It all works by the application of complex matrices. On the right is the probability distribution that taking a measurement would yield.
Edit: It's worth noting that we can study quantum computing without a quantum computer, just as we can study computing without a classical computer.
Edit 2: Here's Grover's Algorithm in my simulator where you can see that the probability of measuring the correct "hidden" oracle value is 96% after only three applications of the diffusion operator: https://qcsimulator.github.io/?example=Grover%27s%20Algorith...
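That 96% figure can be reproduced with a plain statevector sketch (numpy; I'm assuming 4 qubits and the textbook oracle/diffusion iteration here, not reproducing the simulator's internals):

```python
import numpy as np

n_qubits, target = 4, 11           # 16 basis states; 11 plays the "hidden" oracle value
N = 2 ** n_qubits

psi = np.full(N, 1 / np.sqrt(N))   # uniform superposition (Hadamard on every qubit)

for _ in range(3):                 # three Grover iterations
    psi[target] *= -1              # oracle: flip the sign of the marked amplitude
    psi = 2 * psi.mean() - psi     # diffusion operator: inversion about the mean

p_target = abs(psi[target]) ** 2   # ~0.961 after three iterations
```

The success probability is sin²((2k+1)θ) with sin θ = 1/√N, which for N = 16 and k = 3 gives about 96%, matching the simulator's readout.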
That isn't thinking about it very deeply. Boolean algebra was useful in either case. But people working in quantum computing are overwhelmingly motivated by the speedups only possible with real quantum computers; if one never materializes, not many people are going to be excited about algorithms for pretend quantum computers that run no faster than classical machines.
Furthermore, the bifurcation of computer science and electrical engineering hasn't been all that successful as an academic practice. UC Berkeley recently decided to leave the ABET accreditation process for EECS because they felt they could innovate a better synthesis of the two fields for their students.
They're related, but computer science has nothing to do with how electrons interact in different materials. Not that it isn't useful for people to know how the lower-level components actually work. For that matter, people building the lower-level components will also be more effective knowing how they're used at the abstract level.
And similarly, studying QC in the theoretical form can also lead to discovery and understanding of quantum properties. Even if we have no fully-featured quantum computers.
Computers are a procedure, not an object. You can build a computer using people, paper, pencils, and a list of procedures to follow when particular instructions are given.
Computer science and the construction of computers are virtually the same thing. Making a computer was never a challenge; making a faster computer always is.
The fundamentals of quantum computing are absolutely a science. The hypothesis, discovery and characterization of useful quantum systems is done in Physics labs all over the world. Beyond that, is Computer Science not considered a fundamental science?
Also: when MIT moved from Scheme to Python, it was not to dumb down the CS but to put more emphasis on combining and programming complex systems on the fly (robotics, etc.).
It's a branch of mathematics which wouldn't be considered a single branch without the practical concerns of programming driving it. You have everything from computability, sure, but you also have queuing theory (networking), formal grammars (parsers, compilers), relational algebra (database queries), and whatever else I'm forgetting, all under one banner, because all of them are useful to the same practical field.
>Up to now, physics has for the most part not been a design science. But my guess is that’s going to change in the coming decades. There are more and more examples where design seems the right way to think: topological quantum computers; new designer phases of matter; the Alcubierre warp drive and other designer spacetimes; constructor theory and universal constructors; programmable matter and utility fog.
I'm not sure if the author realizes what the difference between physics and engineering really is. In the 1800s, concepts like entropy and pressure (and the foundations of mechanics) were considered to be exciting fields of research, right alongside the construction of the first steam engines. Although the inventors building the first steam engines were exploiting the cutting-edge physics of the day, they were not in doing so acting as physicists. Engineering is not "the study of heat transfer and classical mechanics," engineering is "the application of physics (or whatever other science is helpful) to practical design problems." If a physicist were to construct or design a physical system to meet some requirements, they would be by definition engineering, whether they were a mechanical engineer, an electrical engineer, or a quantum field engineer. Likewise, if an engineer were to decide that the laws of the system they were working with were not fully understood, they might spend some time working out what they were, and in doing so they would be acting as a scientist.
> Engineering is not "the study of heat transfer and classical mechanics," engineering is "the application of physics (or whatever other science is helpful) to practical design problems."
I basically agree, but there's an important human-institutional aspect to this: an alternate useful definition of a physicist (engineer) is "a person who works in a physics department/lab (engineering department/shops)". And despite common misconceptions, the thing that binds departments together is not the goal of the work (learning basic laws vs. building useful things) but rather the collection of techniques/tools/ideas that people learn and apply in those places.
In other words, even though the goals of "materials physics" have for decades been much closer to engineering and design, there is a reason people continue to call it physics: because, at least for the time being, the sort of training you go through to become a materials physicist is mostly done in a physics department, and the methods by which materials physicists evaluate each other are closer to physics than to engineering.
My dad was an aeronautical engineer and my father-in-law is an electrical engineer. I would say their work was not really like my wife's (when she was a scientist).
The scientists in my life have seemed more like chefs coming up with the recipes and the engineers more like line cooks making the operation of the restaurant actually happen. That's probably a really terrible analogy.
I don't mean at all to say that the engineers don't understand the science (I have no idea! and it probably varies from engineer to engineer), but the actual work I hear or heard them talk about over the course of decades seems very different, so "scientist with thumbs" doesn't really seem accurate.
Although maybe that is more true now than in the past? For the last 10 years or so, my f-i-l's main complaint is that newly graduated engineers only understand theory and models and have zero practical experience coming out of a degree program. Though, he ... let's just say, he exaggerates a lot, so I don't know how accurate that is.
My engineer-for-45-years father seems very little like a scientist.
There's also the "theoretical physicist" vs. the "experimental physicist"; the experimental physicist is closer to the engineer, and validates the results.
Yes, but we either validate what was already proven, or something else for which the theory doesn't yet exist. Both are completely unhelpful for scientific progress, in my opinion.
The author never used the phrase "glass bead game", which pretty well describes quantum computing's relation to anything you could consider science or engineering. I guess he does use the words "Alcubierre warp drive", which is about right, except we don't have 1000s of Alcubierre warp drive experts running around publishing papers and pretending like they're not playing make believe.
You do realize you can sign up for D-Wave Leap and execute programs on a quantum annealer ten minutes from now, right?
It's far from a glass bead game. Just because you and I don't know how to make an annealer do something useful doesn't mean others aren't already getting exciting and useful results from it.
FWIW, I use annealing all the time in non-gradient optimization problems. My X220 ThinkPad from 2012 is faster than D-Wave's thing.
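For readers unfamiliar with the technique: classical simulated annealing of the kind described here fits in a few lines. This is a toy sketch on a 1-D function; the cooling schedule and neighbor distribution are arbitrary illustrative choices, not anyone's production setup.

```python
import math
import random

def anneal(f, x0, steps=5000, t0=1.0, seed=0):
    """Minimize f by random perturbation, accepting worse moves with prob exp(-dE/T)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-9      # linear cooling toward T ~ 0
        cand = x + rng.gauss(0, 0.5)         # random neighbor of the current point
        fc = f(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

best, fbest = anneal(lambda x: (x - 3) ** 2, x0=-10.0)  # minimum is at x = 3
```

The acceptance of occasional uphill moves early on (high T) is what lets it escape local minima; no gradients are needed, which is why it suits non-gradient problems.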
The only interesting quantum computer is one which achieves actual quantum supremacy. There's a really obvious reason why this will never happen (well really obvious to me). It's the same reason you can't solve NP-hard problems using soap bubbles.
Excuse my ignorance, but what is "the reason you can't solve NP-hard problems using soap bubbles"? That seems an obvious category error, but that can't be the reason you're dismissing the feasibility of quantum computing. What am I missing?
Show me an actual quantum error corrected qubit in the physical world. There is no time limit, unless you die first, in which case "time's up!"
FWIW, there are 0 error-corrected qubits in the history of the human race so far. Might happen some day! Kinda funny how regular digital computers didn't have such problems as not existing in the world, even in the very early days.
I didn't/can't downvote; I only made fun of the term LARP -- did you mean lark?
The article you link to doesn't mention D-Wave.
It links to a more detailed summary[1] which does reference D-Wave, but it happens in a curious way.
> While various strategies for building quantum computers are now being explored, an approach that many people consider the most promising, initially undertaken by the Canadian company D-Wave Systems and now being pursued by IBM, Google, Microsoft, and others...
> On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work.
Very curious indeed; they mention D-Wave in passing, but don't mention the pesky 2000-qubit machine that Google and NASA have bought and analyzed in peer-reviewed journals. They go on to list some vaporware, presenting a narrative that the entire industry is vaporware. D-Wave doesn't fit the narrative, and it was omitted.
Even curiouser, Rigetti has made their own quantum computer, and they beat D-Wave to a public cloud offering -- doesn't fit the vaporware narrative, didn't make the list. I'm accustomed to seeing bias towards gate-model in discussion of QC efforts, but Rigetti's erasure is very curious indeed.
Of course, I'm accustomed to getting downvoted whenever I speak of D-Wave or adiabatic QC. Since the computational models are equivalent[2], I might go so far as to call that a religious stance. But there's one issue with the gate model: it needs error correction to scale, and NOBODY implements it. IBM has a cute proof that shallow circuits can show advantage on certain problems, but the problems are just as "real-world" as Google's billion-times-speedup using D-Wave hardware: there's no evidence you'd see that performance solving a problem you actually need an answer to.
So my question for you, a person with experience in simulated annealing algorithms who claims equivalence with D-Wave's hardware: do you know the difference between adiabatic evolution[3] and the annealing that you're simulating?
No, I mean LARP. Like "live action role playing." That's what they're doing. They've been around for 20 years for heavens sake!
I know about Rigetti; same adjective. I've been yacking with Dyakonov and other such people, as I'm writing a long, long thing on the topic.
Yeah, OK, the National Academy of Sciences has put together a panel of quantum computing guys who came to the conclusion that building actual quantum computers in the near future is unlikely: must be a conspiracy!
It's the same as with exobiology: we can make models compatible with our current physics/chemistry/etc. and study those models, write papers, have conferences, get grants... i.e., do science. One day it may happen to be useful, upon encountering life on another planet matching the models.
[0] http://www.philosophyisscience.com/
Astrologers... refuse to do this, to put it politely.
[0]: https://en.wikipedia.org/wiki/Design_science#cite_note-cross...
[1] https://github.com/qcsimulator/qcsimulator.github.io
These artificial categories are fuzzy and not very useful.
Don't sweat their opinion, Industry drives the world.
I'm not sure if the author realizes what the difference between physics and engineering really is. In the 1800s, concepts like entropy and pressure (and the foundations of mechanics) were considered to be exciting fields of research, right alongside the construction of the first steam engines. Although the inventors building the first steam engines were exploiting the cutting-edge physics of the day, they were not in doing so acting as physicists. Engineering is not "the study of heat transfer and classical mechanics," engineering is "the application of physics (or whatever other science is helpful) to practical design problems." If a physicist were to construct or design a physical system to meet some requirements, they would be by definition engineering, whether they were a mechanical engineer, an electrical engineer, or a quantum field engineer. Likewise, if an engineer were to decide that the laws of the system they were working with were not fully understood, they might spend some time working out what they were, and in doing so they would be acting as a scientist.
I basically agree, but there's an important human-institutional aspect to this: An alternate useful definition of physicist (engineer) is "a person who work in a physics department/lab (engineering department/shops)". And despite common misconceptions, the sort of thing that binds departments together is not the goal of the work (learning basic laws vs. building useful things) but rather the collection of techniques/tools/ideas that people learn and apply in those places.
In other words, even though the goals of "materials physics" have for decades been much closer to engineering and design, there is a reason people continue to call it physics: because, at least for the time being, the training you go through to become a materials physicist is mostly done in a physics department, and the methods by which materials physicists evaluate each other are closer to physics than to engineering.
The scientists in my life have seemed more like chefs coming up with the recipes and the engineers more like line cooks making the operation of the restaurant actually happen. That's probably a really terrible analogy.
I don't mean at all to say that the engineers don't understand the science (I have no idea! and it probably varies from eng to eng), but the actual work I hear or heard them talk about over the course of decades seems very different, whereby "scientist with thumbs" doesn't really seem accurate.
Although maybe that is more true now than in the past? For the last 10 years or so, my father-in-law's main complaint is that newly graduated engineers only understand theory and models and have zero practical experience coming out of a degree program. Though, he ... let's just say, he exaggerates a lot, so I don't know how accurate that is.
My engineer-for-45-years father seems very little like a scientist.
Our boss was president of the mechanical engineers, and the boss of CIT was president of the civils'.
It's far from a glass bead game. Just because you and I don't know how to make an annealer do something useful doesn't mean others aren't already getting exciting and useful results from it.
FWIW, I use annealing all the time in non-gradient optimization problems. My X220 ThinkPad from 2012 is faster than D-Wave's thing.
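For readers unfamiliar with the classical technique being compared to D-Wave's hardware, here is a minimal simulated-annealing sketch for gradient-free minimization. The cost function, neighbor step size, and cooling schedule are illustrative choices, not anything from the comment above:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, steps=10_000):
    """Gradient-free minimization: always accept downhill moves,
    accept uphill moves with probability exp(-dE/T), and cool T geometrically."""
    x, e = x0, cost(x0)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        cand = neighbor(x)
        ce = cost(cand)
        if ce - e <= 0 or random.random() < math.exp(-(ce - e) / t):
            x, e = cand, ce
            if e < best_e:
                best, best_e = x, e
        t *= cooling  # geometric cooling schedule
    return best, best_e

# Usage: minimize a bumpy 1-D function where gradients are unavailable.
random.seed(0)
f = lambda x: x * x + 3 * math.sin(5 * x)
xmin, fmin = simulated_annealing(f, lambda x: x + random.gauss(0, 0.5), x0=5.0)
```

The high-temperature phase lets the walker hop over local barriers from the `sin` term; as `t` decays the search becomes effectively greedy and settles into a low basin.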
The only interesting quantum computer is one which achieves actual quantum supremacy. There's a really obvious reason why this will never happen (well really obvious to me). It's the same reason you can't solve NP-hard problems using soap bubbles.
FWIW, there are 0 error-corrected qubits in the history of the human race so far. Might happen some day! Kinda funny how regular digital computers didn't have such problems as not existing in the world, even in the very early days.
https://spectrum.ieee.org/tech-talk/computing/hardware/the-u...
The article you link to doesn't mention D-Wave.
It links to a more detailed summary[1] which does reference D-Wave, but it happens in a curious way.
> While various strategies for building quantum computers are now being explored, an approach that many people consider the most promising, initially undertaken by the Canadian company D-Wave Systems and now being pursued by IBM, Google, Microsoft, and others...
> On the hardware front, advanced research is under way, with a 49-qubit chip (Intel), a 50-qubit chip (IBM), and a 72-qubit chip (Google) having recently been fabricated and studied. The eventual outcome of this activity is not entirely clear, especially because these companies have not revealed the details of their work.
Very curious indeed; they mention D-Wave in passing, but don't mention the pesky 2000 qubit machine that Google & NASA have bought and analyzed in peer-reviewed journals. They go on to list some vaporware, presenting a narrative that the entire industry is vaporware. D-Wave doesn't fit the narrative, and it was omitted.
Even curiouser, Rigetti has made their own quantum computer, and they beat D-Wave to a public cloud offering -- doesn't fit the vaporware narrative, didn't make the list. I'm accustomed to seeing bias towards gate-model in discussion of QC efforts, but Rigetti's erasure is very curious indeed.
Of course, I'm accustomed to getting downvoted whenever I speak of D-Wave or adiabatic QC. Since the computational models are equivalent[2], I might go so far as to call that a religious stance. But there's one issue with the gate model: it needs error correction to scale, and NOBODY implements it. IBM has a cute proof that shallow circuits can show advantage on certain problems, but the problems are just as "real-world" as Google's billion-times-speedup using D-Wave hardware: there's no evidence you'd see that performance solving a problem you need an answer to.
So my question for you, a person with experience in simulated annealing algorithms who claims equivalence with D-Wave's hardware: do you know the difference between adiabatic evolution[3] and the annealing that you're simulating?
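For context on the distinction the question is driving at: classical simulated annealing escapes local minima thermally, by lowering a temperature, while adiabatic evolution interpolates between Hamiltonians slowly enough that the system remains in its instantaneous ground state. A standard (textbook, not D-Wave-specific) form of the interpolation is:

```latex
% Adiabatic/quantum-annealing interpolation, s = t/T running from 0 to 1:
%   begin in the easy ground state of the transverse-field term H_B,
%   end in the ground state of the problem Hamiltonian H_P.
H(s) = (1 - s)\, H_B + s\, H_P, \qquad H_B = -\sum_i \sigma^x_i
```

The adiabatic theorem[3] then requires the total evolution time $T$ to scale roughly with $1/\Delta_{\min}^2$, where $\Delta_{\min}$ is the minimum spectral gap along the path, which is where quantum tunneling through barriers (rather than thermal hopping over them) enters.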
[1] https://spectrum.ieee.org/computing/hardware/the-case-agains...
[2] https://physicsworld.com/a/quantum-adiabatic-and-quantum-cir... (and... their adiabatic model is just as possible as error-corrected gates right now)
[3] https://en.wikipedia.org/wiki/Adiabatic_theorem
I know about Rigetti; same adjective. I've been yakking with Dyakonov and other such people, as I'm writing a long, long thing on the topic.
Yeah, OK, the National Academy of Sciences has put together a panel of quantum computing guys who came to the conclusion that building actual quantum computers in the near future is unlikely: must be a conspiracy!
Can normal people sign up and immediately get QPU time, API access, etc? The D-Wave offering is much more complete, I think.