

source link: https://scottlocklin.wordpress.com/2021/10/27/fun-with-complexity-theory/

Fun with complexity theory
The algorithms that reproduce the behavior of matter are NP-hard. There are lots of examples: soap bubbles on Steiner trees, Ising models (of dilute magnetic materials), chaotic dynamics, protein folding; any sort of simulation of biological matter or of chemistry in general. Most of physics involves minimization of action. Many of the heuristic approaches used to solve NP-hard problems are taken from physics and nature in general. And nature is known to “solve” its problems in linear time. Computer scientists always pipe up here and invoke some permutation on Church-Turing and the idea that nature doesn’t precisely “solve” for the minimum-surface soap bubble on a Steiner tree, just one which is somehow close to the best minimum surface. I agree of course, but I also assert that this is all that matters: the world of matter is more important than the world of Turing machines; this is a branch of math getting far, far above its station (math is a branch of physics; one where experiments are cheap).
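To make the “heuristics taken from physics” point concrete: simulated annealing is the canonical example, a Metropolis Monte Carlo simulation of a fictitious thermal system that you slowly cool. A minimal sketch on a small random Ising instance (the size, couplings and cooling schedule are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Ising instance: N spins, symmetric couplings J, local fields h.
N = 30
J = rng.normal(size=(N, N))
J = np.triu(J, 1)
J = J + J.T
h = rng.normal(size=N)

def energy(s):
    """Ising energy E(s) = -1/2 s.J.s - h.s for spins s in {-1,+1}."""
    return -0.5 * s @ J @ s - h @ s

# Simulated annealing: single-spin-flip Metropolis moves at a slowly decreasing
# temperature.
s = rng.choice([-1, 1], size=N)
E = energy(s)
for T in np.geomspace(5.0, 0.01, 20000):       # arbitrary cooling schedule
    i = rng.integers(N)
    dE = 2 * s[i] * (J[i] @ s + h[i])          # energy change from flipping spin i
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]
        E += dE

print("best-found energy:", energy(s))
```

Finding the true ground state of a general instance like this is NP-hard, but the annealer reliably lands somewhere close, which is the sense in which nature “solves” these problems well enough to matter.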

For that matter, there are still people who think we can build analog computers based on dynamical systems theory that solve interestingly large NP-hard and other difficult problems. This is an old field; to my mind it has produced more interesting results than anything in “quantum information theory.” I had a couple of papers brewing on applying what I know about dynamical systems theory to quantum algorithms back in the early 00s; then I got an honest job and gave it up (dynamics applied to time series pays better). The nice thing about dynamical systems theory applied to other things is that it is a grown-up branch of physics, based on simple geometric and topological concepts, and it has tremendous explanatory power. Most of classical physics is just dynamics and counting, and that’s most of the physics you see around you. But still, we have ideas from complexity theory that get mapped onto problems in physics. I’ve mentioned the Ford Paradox before, but there are others.
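A cartoon of the analog, dynamical-systems version of the same idea: a continuous flow that relaxes toward low-energy states of an Ising instance, the sort of thing you could in principle wire up out of op-amps instead of integrating on a digital machine. A Hopfield-style mean-field sketch, again with made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Small random Ising instance, same form as above: couplings J, fields h.
N = 30
J = rng.normal(size=(N, N))
J = np.triu(J, 1)
J = J + J.T
h = rng.normal(size=N)

# Continuous "analog" relaxation: dx/dt = -x + tanh(beta * (J x + h)).
# Fixed points satisfy the mean-field self-consistency equations, i.e. they sit
# near low-energy spin configurations.
x = rng.uniform(-0.1, 0.1, size=N)
beta, dt = 2.0, 0.05
for _ in range(5000):                          # crude Euler integration of the flow
    x += dt * (-x + np.tanh(beta * (J @ x + h)))

s = np.sign(x)                                 # round the analog state to spins
print("energy of analog solution:", -0.5 * s @ J @ s - h @ s)
```

Whether such a gadget actually beats digital heuristics on interestingly large instances is exactly the open question in that field.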
An interesting example of complexity theory being used in physics: back in 2014, Arkady Bolotin pointed out that computational complexity might explain… Schroedinger’s cat. The idea being that you get a computational explosion when you try to solve Schroedinger for a macroscopic object, a la NP-hardness. Arkady leaves out QMA, BQP and the other quantum complexity classes, for explicit lack of real-world quantum computers, which is exactly as it should be.
The argument goes something like this: you can map quantum problems to the Ising model, whose ground state problem is NP-complete. It’s possible that somehow a quantum calculation doesn’t map cleanly onto an Ising model (he says as much), but certainly it sometimes does, and that’s probably enough to make the argument hold together. So, solving the Schroedinger equation for a macroscopic object can’t be done in polynomial time, otherwise P = NP. Furthermore, if you try to solve Schroedinger over the course of a year on something like an Avogadro’s number worth of objects, each one mapping to a Hilbert space, you’d have to do each elementary computation in something like $1\,\text{year}/2^{N_A} \sim 10^{-10^{23}}$ seconds, which is, as things have it, pretty small. Ridonculously smaller than the Planck time. It’s a hand-wavey argument, essentially saying macroscopic quantum objects are impossible, because if they weren’t, we’d have a proof that P = NP. Also that Schroedinger on macroscopic or otherwise complex objects (including quantum computers; objects which have exponential numbers of internal quantum states) is meaningless. This paper is only cited by 5 papers, and is really engaged with by none of them. Mind you, it is somewhat hand-wavey, but then, you have entire fields (aka “quantum information theory”) which are nothing but hand waving.
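Just to get a feel for that number: assuming the cost really is exponential in the number of constituents, the per-operation time budget is absurd. A back-of-envelope sketch (my arithmetic, not Bolotin’s exact figures):

```python
import math

N_A = 6.022e23             # number of constituents, ~Avogadro's number
seconds_per_year = 3.15e7
planck_time = 5.4e-44      # seconds

# If solving Schroedinger costs ~2^N elementary operations, the time allowed per
# operation to finish within a year is (1 year) / 2^N. Work in log10 to avoid
# overflow.
log10_ops = N_A * math.log10(2)                        # ~1.8e23 digits worth of operations
log10_t_per_op = math.log10(seconds_per_year) - log10_ops
print("log10(seconds per operation) ~", log10_t_per_op)           # ~ -1.8e23
print("log10(Planck time in seconds) ~", math.log10(planck_time)) # ~ -43.3
```

In other words, each operation would need to finish in roughly $10^{-10^{23}}$ seconds, against a Planck time of about $10^{-44}$ seconds.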
Bolotin has a couple more papers dealing with this idea in broad strokes. Another paper starts off with one of those imaginary quantum computers and works backwards to “well, what if Nature says they don’t work.” Blasphemy, I know, what with all those amazing quantum computing startups breaking public key cryptography… any day now. Like many blasphemies, taking it seriously for a few moments is probably worth the time.
Bolotin sets up a situation where we treat a Stern-Gerlach experiment, modeling both the detector and the funny, obviously quantum stuff together. Then we think about actually doing this calculation using some kind of computer. Because of the many-body nature of such a simulation involving a macroscopic detector, we’re back to something that is NP-hard to simulate. But nature obviously gives us an answer when Stern and Gerlach perform their experiment; it doesn’t look NP-hard when nature does this, which would imply P=NP, which will make everyone sad. This all goes back to the measurement problem in quantum mechanics: we never try to model the detector and the Stern-Gerlach experiment together using the Schroedinger equation or other formulations of quantum mechanics. We model the obviously quantum thing by itself, and assume that the detector system is somehow non-quantum a la Copenhagen.
Stern-Gerlach: you need to model the detector along with the silver atoms in a fully quantum world
In another paper he uses the run-time complexity of solving the Schroedinger equation as an algorithm to justify the quantum/classical divide. The argument is much as the above: mapping into Ising and saying “look at this snarl.” This one is more satisfying, as it gives a sort of natural way of justifying the Copenhagen interpretation, which is the philosophical approach that simply assumes the classical limit rather than demonstrating from quantum principles that there must be one. It would be more satisfying still if there were a first-principles way of pointing to the level of internal degrees of freedom where the Schroedinger equation turns into Hamilton-Jacobi… because of complexity theory. Maybe this is possible to do, but that step hasn’t been taken yet. Certainly there must be some such limit: we observe quantum mechanical behavior all the time in small and simple systems. Yet we never observe macroscopic objects doing quantum things, or even objects with large scale entanglement (aka something like a quantum computer or other mesoscopic system).
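For reference, the standard (non-complexity) way of getting Hamilton-Jacobi out of Schroedinger is the WKB substitution. None of this is in Bolotin’s papers; it’s just the textbook limit his argument would want to re-derive by counting internal degrees of freedom instead of sending ħ to zero:

```latex
% WKB ansatz: write the wavefunction in amplitude-phase form
\psi(x,t) \;=\; A(x,t)\, e^{\, i S(x,t)/\hbar}

% Substituting into  i\hbar\,\partial_t \psi = -\tfrac{\hbar^2}{2m}\nabla^2\psi + V\psi
% and keeping the real part gives
\frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V
  \;=\; \frac{\hbar^2}{2m}\,\frac{\nabla^2 A}{A}

% As \hbar \to 0 the right-hand side drops out and S obeys the classical
% Hamilton-Jacobi equation; a complexity-theoretic criterion would have to say
% when the right-hand side becomes irrelevant for a big, messy system.
```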
If we assume this overall argument is true, which it probably is, there must be something like a “quantum break-time” for complex objects. To refresh your memory, the quantum break-time in dynamically complex systems (kicked rotors, double pendulums) is the observation that simple quantum objects whose classical counterparts are chaotic behave classically for short timescales. It’s almost as if the universe itself can’t present us with the quantum solution to the classically chaotic Schroedinger equation. But eventually the universe catches up, and the spectral density looks like that of a random matrix, as it should. It’s a creepy effect; like the universe “smells” the classical solution for small periods of time, then smears it out into a quantum system when it thinks about it for long enough.
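If you want to see a quantum break-time with your own eyes, the kicked rotor is the standard toy: the quantum kinetic energy tracks the classical diffusive growth for a while, then saturates (dynamical localization). A rough numerical sketch, with parameter values picked purely for illustration:

```python
import numpy as np

# Quantum kicked rotor: one period = kick exp(-i K cos(theta)/hbar) followed by
# free rotation exp(-i hbar m^2 / 2). Parameters below are illustrative only.
hbar = 0.5       # effective Planck constant
K = 5.0          # kick strength (classically chaotic regime)
N = 4096         # basis size
steps = 500      # number of kicks

theta = 2 * np.pi * np.arange(N) / N
m = np.fft.fftfreq(N, d=1.0 / N)               # integer angular-momentum quantum numbers

kick = np.exp(-1j * K * np.cos(theta) / hbar)  # applied in the angle basis
free = np.exp(-1j * hbar * m**2 / 2)           # applied in the momentum basis

psi = np.ones(N, dtype=complex) / np.sqrt(N)   # start at p = 0 (uniform in angle)

quantum_E = []
for t in range(steps):
    psi = kick * psi                 # kick
    c = np.fft.fft(psi)              # go to the momentum basis
    c *= free                        # free rotation between kicks
    psi = np.fft.ifft(c)             # back to the angle basis
    prob = np.abs(c)**2
    prob /= prob.sum()
    quantum_E.append(0.5 * hbar**2 * np.sum(m**2 * prob))   # <p^2>/2 after t+1 kicks

classical_E = 0.25 * K**2 * np.arange(1, steps + 1)  # rough diffusive estimate, D ~ K^2/4

# quantum_E follows classical_E early on, then flattens out after the break-time.
print(quantum_E[10], classical_E[10])
print(quantum_E[-1], classical_E[-1])
```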
Bolotin’s ideas lead one to the thought that for sufficiently internally complex systems that seem like they should be quantum, the quantum break-time is infinite. Basically the universe can never present the result of a quantum simulation of a very complex system: the proverbial Pentium of the Universe running the calculation is insufficient. If you have something too complex and messy you can’t get the quantum answer, even if you wait around for a long time. Since such a system is computationally very complex, if the universe can be thought of as a sort of analog computer, the quantum calculation can’t be done before the end of the universe. The universe just chumps out and gives you the simple classical mechanics solution which any schoolboy could present you with from fiddling with Lagrangian mechanics. If the complex system can be thought of as quantum at all, it can only be thought of as quantum on some timescale that is longer than the age of the universe.

One would have to get more strict with definitions of “complexity” to run this idea to ground, but entanglement may be sufficient. We’ve done all manner of experiments with two entangled photons, or a couple of entangled nuclear spin states. I’m not aware of people having done such things with hundreds of entangled nuclear spin states, or thousands. You can entangle them so it looks like a couple of collective spin states interacting with the magnetic field, but not the sort of all-to-all entanglement which would be required for something like a quantum computer. Quantum computing researchers will eventually have to make very entangled machines which do all kinds of impossible things; there may be a quantum optics or NMR experiment which establishes a quantum break-time as a function of the number of entangled, otherwise independent systems. Joe Ford suggested building smaller and smaller double pendulums and seeing what happened. I guess I’m adding the suggestion of more and more entangled things. Maybe there is some interesting thought experiment on doing it with systems where we know how to make entanglement. Photons are the simplest one: your high energy photon is split by a KDP crystal into two entangled photons; do it again and again until maybe it looks like a boring thermalized microwave wave front or something. I dunno; this isn’t my day job. Just some thoughts on reading the papers.
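To put a number on why all-to-all entanglement is the expensive part: a generic entangled state of N two-level systems takes 2^N complex amplitudes to write down, while a collective-spin (symmetric, Dicke-type) description only takes N+1. Trivial bookkeeping:

```python
# Bytes needed to store the state vector, generic versus collective-spin
# (symmetric subspace) description, at 16 bytes per complex amplitude.
for N in (2, 10, 50, 300):
    generic = 16.0 * 2.0**N      # full Hilbert space of N spin-1/2 systems
    collective = 16 * (N + 1)    # symmetric (Dicke) subspace only
    print(f"N={N:4d}  generic={generic:.3e} bytes  collective={collective} bytes")
```

That 2^N is the same explosion that drives the whole argument above.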
This set of Bolotin’s ideas is pretty clever, and it’s a shame more people haven’t looked into it. I mean, I am a mere laborer in the quantitative fields, a twiddler of bits; I might have missed something important here, but it sure seems viable and interesting to me. At least viable and interesting enough to respond to, which seems to have never happened. When I consider the number of shitty ideas popularized by people who employ PR firms to get their “science” in front of eyeballs, it makes me a little sad. Anyway, maybe someone will tell me why this is wrong.