source link: https://51posts.com/2021/09/29/22/45/07/6391/
Baby Steps

You don’t have to see the whole staircase, just take the first step—Martin Luther King, Jr.

Maryna Viazovska was the first person to prove an exact bound on sphere packing in a dimension higher than 3. She achieved this in 2016 by making an improvement of 0.00001 over a previous paper by Henry Cohn and Noam Elkies. This also led to the solution in dimension 24, jointly with Cohn, Abhinav Kumar, Stephen Miller, and Danylo Radchenko.

Today we talk about partial progress—baby steps—and its relation to solving conjectures.

This post is a continuation of our recent thoughts about how those who claim to have solved a major problem almost always claim to have solved the whole thing in one large leap. We’ve thought about declaring a rule that any claim on a major open problem, at least about complexity lower bounds, must first be a partial result. We will give concrete suggestions in that direction at the end. But on reflection, we’ll curb the dogmatics a little.

In her 2016 Quanta article on Viazovska’s breakthrough, Erica Klarreich quotes Peter Sarnak:

“It’s stunningly simple, as all great things are. You just start reading the paper and you know this is correct.”

We hasten to add that the paper’s correctness is evident amid the context that others had established over the previous two decades, going back at least to Thomas Hales’s voluminous proof of Johannes Kepler’s conjecture for dimension 3. To quote Klarreich:

Researchers have known for more than a decade what the missing ingredient in the proof [of optimality for dimensions 8 and 24] should be—an “auxiliary” function that can calculate the largest allowable sphere density—but they couldn’t find the right function. … [F]inding the right modular form allowed Viazovska to prove [the case of 8] in a mere 23 pages.

Thus Viazovska brought in a new tool to the problem, modular forms, which had already proved their merit in resolving the Fermat conjecture. But she still needed to choose the right modular form among many candidates. Klarreich quotes her as saying it is difficult even just to explain how she knew which one to use. This brings us back to the challenge of explaining—or debunking—the flash of insight that claimers claim to have. At least in this case, per Sarnak’s quote, the proof was in the pudding.

From Cohn’s 2017 AMS Notices article

To complete the mixing of our original message, the examples we chose happen to involve improvements by tiny amounts. We’ve even understated Viazovska’s above: reckoned against a later paper by Cohn and Elkies, her improvement was

\displaystyle  0.000000000000000000000000000001.

The recent post where we discussed the phrase “the proof is in the pudding” involves a number with six more {0}s than that. These are not what we mean by “baby steps.” But let us tell our next story, which also involves Cohn.

Steps Toward Matrix Multiplication

We have covered other work on the exponent {\omega} of matrix multiplication. It is the infimum of all {e} such that any two {n \times n} matrices can be multiplied in {O(n^e)} unit operations. In 1990, Don Coppersmith and Shmuel Winograd brought {\omega} down below {2.375477}. The current best bound {\omega < 2.37286} is from a SODA 2021 paper by Josh Alman and Virginia Williams; its abstract highlights the improvement of {0.00001} over the previous best.
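As a baseline for {\omega}, here is a minimal Python sketch (ours, not from the papers above) of the schoolbook algorithm, which uses exactly {n^3} scalar multiplications and so witnesses {\omega \le 3}:

```python
def naive_matmul(A, B):
    """Multiply two n x n matrices by the schoolbook algorithm.

    Returns the product and a count of scalar multiplications,
    which is exactly n^3 -- the baseline that fast matrix
    multiplication algorithms improve on.
    """
    n = len(A)
    mults = 0
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
                mults += 1
    return C, mults

n = 4
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
A = [[i * n + j for j in range(n)] for i in range(n)]
C, mults = naive_matmul(A, I)
assert C == A           # multiplying by the identity returns A
assert mults == n ** 3  # the schoolbook method uses n^3 multiplications
```

Strassen-type algorithms save by trading multiplications for additions on blocks; the counter above measures only the former.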

In 2005, however, Cohn wrote a paper with Robert Kleinberg, Balazs Szegedy, and Christopher Umans on ideas for taking {\omega} all the way down to {2} in one jump. This diverges from both the reading of “baby step” as “tiny improvement” and our idea meaning limited-but-definite partial progress. Their strategy for the full jump has come in and out of clouds since then. But their paper has motivated the subsequent partial progress in several ways. We have mentioned it several times, notably here.

The objective need not be {2} versus {2.372...}, however. There is a notable milepost just above {2.30}. It comes from a STOC 2015 paper by Andris Ambainis, Yuval Filmus, and François Le Gall. It shows that a wide class of techniques of the kind used since 1990 cannot prove any bound better than {\omega < 2.3078}.

Other recent work by Alman and Williams identifies a direction along which the road down to {2} may still be open, while proving limits in other directions. Even their new {0.00001} improvement contains ideas that may be exploited more generally. This is a feature of partial progress that contrasts with what we said recently about not learning much from whole-jump attempts: partial progress always gives something to learn from.

Some Partial Progress Objectives

Cohn also has a page of informal thoughts on how to perform research, including advice for those who claim to have solved some open problem. We want to add this advice:

Make sure ahead of time that your advance really covers the intermediate ground it jumps through. Even better, walk back your claim a little so that it proves something only a step or so beyond previous knowledge.

Here are a few examples of such things to prove in complexity theory:

  • {\mathsf{P \neq NP}}: The claim is that SAT requires super-polynomial time.
  • A Baby Step: Prove SAT cannot be done in linear time. Or that it cannot be done in {n^2} time.
  • Circuit Lower Bounds: The claim is that some concrete problem in {\mathsf{NP}} requires super-polynomial size general circuits.
  • A Baby Step: Prove that some concrete problem cannot be done in a linear-size boolean circuit, or one of size {6n}.
  • Factoring: The claim is that factoring a general number {N} requires super-polynomial time in the bit-length of {N}.
  • A Baby Step: Prove that factoring cannot be done in linear time. Or not in quadratic time. See this.
  • Graph Isomorphism: The claim is that there is a polynomial-time algorithm to tell whether two graphs are isomorphic.
  • A Baby Step: Prove the case where the graphs have degree bounded by a fixed constant.

This last baby step has already been taken. Note that it was not an easy step: it is a famous result of Eugene Luks—see his 1982 paper.
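To make the exponential baseline in the SAT item concrete, here is a brute-force satisfiability checker (a sketch of ours, not anyone's proposed algorithm) that tries all {2^n} assignments—the running time that the baby-step lower bounds would start to chip away at:

```python
from itertools import product

def brute_force_sat(clauses, n):
    """Decide satisfiability of a CNF over variables 1..n by trying
    all 2^n assignments. A clause is a list of nonzero ints: positive
    k means variable k, negative k means its negation."""
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

# (x1 or x2) and (not x1 or x2) and (not x2) -> unsatisfiable
assert not brute_force_sat([[1, 2], [-1, 2], [-2]], 2)
# (x1 or x2) alone -> satisfiable
assert brute_force_sat([[1, 2]], 2)
```

Proving that no algorithm beats even {n^2} time for SAT—let alone this {2^n} baseline—is exactly the kind of step that remains open.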

We could go deeper into known computational complexity issues to find more. The idea applies to problems from mathematics in general. Here is one:

  • Riemann Hypothesis: The claim is of course that

    \displaystyle  \zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^s} = 0

    only happens (for the analytic continuation of {\zeta}, since the series itself converges only when the real part of {s} exceeds {1}) at the negative even integers and at {s} with real part {1/2}. The latter are the zeroes on the half line.

  • A Baby Step: Prove that it is impossible for just two zeroes to be off the half line. Or that only a finite number can be off the half line.
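Where the series does converge, it can be checked numerically. Here is a small sketch (ours) of partial sums of the Dirichlet series at {s = 2}, which approach the known value {\zeta(2) = \pi^2/6}:

```python
import math

def zeta_partial(s, terms):
    """Partial sum of the Dirichlet series for zeta(s). The series
    converges only for real part of s greater than 1; the zeroes in
    the Riemann Hypothesis live on the analytic continuation."""
    return sum(1.0 / n ** s for n in range(1, terms + 1))

approx = zeta_partial(2.0, 100_000)
# The tail after N terms is about 1/N, so 100,000 terms give ~1e-5 error.
assert abs(approx - math.pi ** 2 / 6) < 1e-4  # zeta(2) = pi^2/6
```

Locating zeroes off the line of convergence is an entirely different matter, which is why the baby steps above concern the continuation.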

We invite readers to add more favorite math examples in comments, or others from complexity theory.

Open Problems

What are some additional first steps?

Have we also, unintentionally but effectively, made a case for the value of tiny improvements? At least when they are conceptual improvements?

Here is the answer to the puzzle in the previous post: Barbara wins. She forces the sum of every row to be {0}: whenever Alan makes an entry {x}, she makes an entry {-x} in the same row. Because 1986 is even, she always has a free cell in that row to reply in. The final matrix is singular, since the matrix times the all-{1}s vector is {0}.
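Barbara's pairing strategy can be simulated directly. Here is a sketch (function names are ours) with an arbitrary random Alan on a small even board:

```python
import random

def pairing_game(n, rng):
    """Play out Barbara's strategy on an n x n board, n even.

    Alan fills an arbitrary free cell with an arbitrary value;
    Barbara answers with the negated value in the same row. Each
    row always holds an even number of filled cells after her move,
    so she can always reply, and every row ends up summing to 0.
    """
    assert n % 2 == 0
    M = [[None] * n for _ in range(n)]
    remaining = n * n
    while remaining:
        # Alan's move: any free cell, any value
        i, j = rng.choice([(r, c) for r in range(n) for c in range(n)
                           if M[r][c] is None])
        M[i][j] = rng.randint(-9, 9)
        # Barbara's reply: a free cell in the same row, negated value
        k = next(c for c in range(n) if M[i][c] is None)
        M[i][k] = -M[i][j]
        remaining -= 2
    return M

M = pairing_game(4, random.Random(0))
assert all(sum(row) == 0 for row in M)
# Every row sums to 0, so M times the all-1s vector is 0: M is singular.
```

Since 1986 is even, the same invariant holds on the full 1986 x 1986 board.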

