This is indeed very promising, but not necessarily a breakthrough yet. There's still the issue of operating temperature. The researcher said the next step is to try to get this to work with fancier superconductors that operate at 77 K, which would "only" require liquid nitrogen cooling. If that is achieved, then I see this having real applications, but niche ones. More revolutionary applications would require room-temperature, normal-pressure superconductors, which AFAIK haven't yet been found, and may not be possible at all (it's an open question). Even if they were found, there's of course also the question of whether the cost of those materials can be brought down enough to be worth the gains in efficiency etc.
One thing I'm wondering: superconductors are supposed to be more efficient because they waste no power as heat. But doesn't it take power to maintain such cold temperatures, even for "high temperature" liquid nitrogen temperatures? How does this compare to the energy we save from the lack of resistance in the superconductor?
It depends on how much current you want. I worked on a 1.2 m major radius, 1 T copper stellarator that used 10 MW of power for the confinement coils. The coils are water cooled, but even then the shots are only ~1 second long to avoid overheating (and to save on the engineering cost of the power system).
Any real fusion reactor would need 5+ T confinement fields and have 2+ times the radius of this machine. You would be losing many GW to ohmic heating in copper confinement coils for hundreds of MW of fusion power: it simply could never work. A cryoplant for an LTS fusion reactor would use on the order of 50 MW, a bit less for HTS.
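To see roughly why copper can't scale: for geometrically similar coils at a fixed conductor current density, ohmic power goes roughly as B² times the linear size. A back-of-envelope sketch (the scaling law and the reactor parameters here are my illustrative assumptions, not numbers from the machine above):

```python
# Back-of-envelope: scale the 1.2 m, 1 T, 10 MW copper machine up to
# reactor-like parameters. Assumes ohmic power ~ B^2 * R for
# geometrically similar copper coils -- a rough illustration only.
P_REF, B_REF, R_REF = 10e6, 1.0, 1.2  # W, T, m (machine in the comment)

def copper_coil_power(b_tesla, r_meters):
    """Estimated steady-state ohmic coil power (W) under the B^2 * R scaling."""
    return P_REF * (b_tesla / B_REF) ** 2 * (r_meters / R_REF)

p_reactor = copper_coil_power(b_tesla=5.0, r_meters=2.4)  # 5 T, twice the radius
print(f"~{p_reactor / 1e6:.0f} MW just in coil ohmic losses")  # ~500 MW
```

Even this crude scaling lands at hundreds of MW of continuous dissipation before any engineering margins, and real reactor designs (bigger, higher field, steady state) push it into the GW range, versus tens of MW for a cryoplant.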
So for the big magnet application superconductors are an incredible win.
>But doesn't it take power to maintain such cold temperatures, even for "high temperature" liquid nitrogen temperatures? How does this compare to the energy we save from the lack of resistance in the superconductor?
Not a thermals guy, but bog-standard consumer air conditioners tend to have efficiencies of 300% or so. I.e., for every joule of electrical energy consumed, they'll reduce the thermal energy in the room by 3 joules.
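One catch worth noting: that ~300% figure is for refrigeration near room temperature. The ideal (Carnot) coefficient of performance collapses as the cold side gets colder, and real cryocoolers only achieve a fraction of Carnot on top of that. A quick sketch (the temperatures are my example values):

```python
def carnot_cop(t_cold, t_hot=300.0):
    """Ideal (Carnot) coefficient of performance for a refrigerator:
    joules of heat removed per joule of electrical work input."""
    return t_cold / (t_hot - t_cold)

# room-temperature AC, liquid nitrogen, liquid helium (all in kelvin)
for t in (280.0, 77.0, 4.2):
    print(f"{t:6.1f} K  ->  Carnot COP ~ {carnot_cop(t):.3f}")
```

So even in the ideal limit, removing one watt of heat at 77 K costs a few watts of electricity, and tens to hundreds of watts at liquid helium temperatures, which is why minimizing the heat load matters so much.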
Regardless of this, having zero heat output in the wafer itself could be far more interesting than just increased power efficiency. For example, without the issue of thermals one could imagine silicon wafers with thousands of layers and 3-D interconnects.
It's a question of what you're spending power on. When you cool something there's the action of pumping heat away from it, and then the issue of how easily heat can migrate back in.
So you're only really fighting the inefficiency of your insulation, and insulation can be very, very good. At the extreme end you could magnetically suspend something in a vacuum chamber and then chill it; it'll take a very long time to heat up via radiative emission from the walls. (Double-wall insulated cups use this property, though my take is they're just gas filled. Same idea, though.)
Essentially the ongoing cost of keeping something cool is purely a function of the heat load making it through the insulation, and in all cases that's going to be much lower than the heat you'd otherwise be generating in the system. The problem is that, engineering- and complexity-wise, it's non-trivial, and the failure scenarios can be bad: an SC magnet quench can be very nasty since, among other things, you might instantly boil your coolant. This happened at the LHC during commissioning.
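To put a number on how small the heat leak can get for the vacuum-chamber case: the load on a suspended cold object is set by the Stefan-Boltzmann law. A sketch with plausible but made-up numbers (the area, emissivity, and temperatures are my assumptions):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiative_load(area_m2, t_hot, t_cold, emissivity):
    """Net radiative heat flow (W) from warm chamber walls onto a cold object."""
    return emissivity * SIGMA * area_m2 * (t_hot**4 - t_cold**4)

# 0.1 m^2 object at 77 K inside 300 K walls, with shiny low-emissivity surfaces
q = radiative_load(area_m2=0.1, t_hot=300.0, t_cold=77.0, emissivity=0.05)
print(f"heat leak ~ {q:.1f} W")  # ~2.3 W
```

A couple of watts of leak versus megawatts of avoided ohmic heating is the asymmetry being described here; the hard part is the engineering and the failure modes, not the steady-state power bill.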
The superconducting transition temperature (Tc) of many superconducting materials increases as the pressure applied to the material is increased, so I guess the idea is to make the ambient pressure high enough that Tc rises above the ambient temperature.
My instinctual reaction (as a physicist studying superconductors, but with no expert knowledge of geophysics) is that this won't work for most (possibly all) materials, because (a) the maximum Tc under high pressure is still generally less than ~ room temperature, (b) the ambient temperature increases with depth below the earth's surface.
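Point (b) can be made semi-quantitative with a toy model of the crust: lithostatic pressure grows with depth, but so does temperature. A sketch (the density and the ~25 K/km continental geothermal gradient are textbook ballpark figures, and the linear model is my simplification):

```python
RHO = 2700.0      # typical crustal rock density, kg/m^3 (assumed)
G = 9.8           # gravitational acceleration, m/s^2
T_SURFACE = 288.0 # surface temperature, K
GEOTHERM = 25.0   # K per km, ballpark continental gradient (assumed)

def conditions_at_depth(depth_km):
    """Toy model: (lithostatic pressure in MPa, temperature in K) at depth."""
    pressure_mpa = RHO * G * depth_km * 1000.0 / 1e6
    temperature = T_SURFACE + GEOTHERM * depth_km
    return pressure_mpa, temperature

for z in (1, 10, 40):
    p, t = conditions_at_depth(z)
    print(f"{z:3d} km: ~{p:6.0f} MPa, ~{t:4.0f} K")
```

The high-Tc hydrides need pressures of order 100 GPa, and even reaching ~1 GPa this way (around 40 km down) already puts you above 1000 K, far beyond any known Tc. The geotherm wins long before the pressure helps.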
Not if climate change is any concern. What we need in Antarctica is a factory that makes it colder there, not one that dumps heat there. What you actually want is heatsinks that radiate heat into the cold vacuum of space.
This is exciting, don't get me wrong, but I'm always skeptical of these. "In the lab" in silicon is somewhat equivalent to "in mice" in biology. I was very excited about memristors back in '08 when they were first actually synthesized, but here we are, ~14 years later, and still no commercial viability. Producing something in a lab and mass-producing something in a foundry/factory are very different things.
Super interesting result, but some of the things mentioned in the article are just plain wrong or grossly misrepresented. Not sure, for example, what they mean by "IBM mentions that without non-reciprocal superconductivity, a computer running on superconductors is impossible". RSFQ (rapid single flux quantum) logic is based on Josephson junctions and works just fine up to many GHz; Prof. Likharev's group at SUNY Stony Brook developed these together with IBM, and RSFQ circuits are still being used in niche applications as well as in quantum computing. The reason they never replaced semiconductor-based computers is simply that they couldn't keep up with the rapid progress of those. In the 80s and 90s it was impressive that RSFQ devices could (theoretically) run at several tens to hundreds of GHz, but fast, efficient MOSFETs have much better characteristics and don't need costly cooling. For high-frequency applications there are HEMTs (high electron mobility transistors), which are often also easier to operate and manufacture than superconducting circuits. Also, no one has managed to really scale RSFQ logic beyond a few hundred thousand junctions.
No, but we can expect a lot of angry Apple customers at some point bitching that Apple's QMJJs stop working after two years because the coolant leaks into their trendy graphene-and-gold logic gates, throwing them out of sync with the laser, catching fire, and turning their advertised 350-million-times-faster computer into a superfund site. But having massive renders sitting there for your review before you've even actually submitted them to render might be worth it.