At the Heidelberg Laureate Forum three years ago, I took lots of selfies with Fields medallists, Abel prizewinners and Turing laureates. This included having dinner in a castle with Leonard Adleman, pioneer of asymmetric cryptography:
…and Endre Szemerédi of regularity lemma fame…
…and Louis Nirenberg…
…and, last but certainly not least, enjoyed sparkling Riesling in a Bavarian brewery with Michael Atiyah:
He proceeded to summon several of us into a room, wherein he posed a rather interesting problem and offered a reward for its solution:
Consider n distinct points x₁, …, xₙ in the three-dimensional unit ball. For each pair of indices i ≠ j, let the ray (half-line) from xᵢ through xⱼ meet the boundary of the ball at tᵢⱼ, viewed as a complex number on the Riemann sphere. For each i, we define the monic polynomial of degree n − 1 whose roots are the n − 1 projections tᵢⱼ of the remaining points onto the sphere.
Prove that these n polynomials are linearly independent.
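The construction is straightforward to play with numerically. Below is a minimal sketch, under assumptions of my own choosing: I use stereographic projection from the north pole (any other choice differs only by a Möbius transformation), and all function names are my invention, not anything from Atiyah's statement.

```python
import numpy as np

def exit_point(a, b):
    """Point where the ray (half-line) from a through b leaves the unit sphere."""
    d = b - a
    A, B, C = d @ d, 2.0 * (a @ d), a @ a - 1.0
    s = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)  # the positive root
    return a + s * d

def stereo(p):
    """View a point of the unit sphere as a complex number on the Riemann sphere."""
    return (p[0] + 1j * p[1]) / (1.0 - p[2])

def coefficient_matrix(points):
    """Row i: coefficients (constant term first) of the monic polynomial
    whose roots are the projections t_ij, j != i."""
    n = len(points)
    M = np.empty((n, n), dtype=complex)
    for i in range(n):
        M[i] = np.polynomial.polynomial.polyfromroots(
            [stereo(exit_point(points[i], points[j]))
             for j in range(n) if j != i])
    return M

def random_ball_points(n, rng, r2=0.95):
    """n points sampled uniformly from the ball of squared radius r2."""
    pts = []
    while len(pts) < n:
        p = rng.uniform(-1.0, 1.0, 3)
        if p @ p < r2:
            pts.append(p)
    return pts

rng = np.random.default_rng(2018)
M = coefficient_matrix(random_ball_points(5, rng))
print(np.linalg.matrix_rank(M))  # full rank would confirm linear independence
```

In exact arithmetic, full rank is precisely the linear-independence claim; numerically, of course, this only provides evidence for one random configuration at a time.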
If we consider the determinant of the matrix M formed by the coefficients of these polynomials, we get a degree-½n(n−1) homogeneous polynomial in the n(n−1) roots tᵢⱼ. This determinant can be seen to be invariant under adding a constant to all roots, but it is not scale-invariant because the degree is nonzero. This can be amended by dividing by a normalising constant of the same degree, namely the product of tⱼᵢ − tᵢⱼ over all pairs i < j, yielding a rational function δ.
Note that δ is not only scale- and translation-invariant, but also invariant under simultaneously replacing all roots by their reciprocals. This means that δ is invariant under the entire Möbius group, which corresponds naturally to the group of orientation-preserving projective transformations fixing the unit ball. Since δ is dimensionless, it is reasonable to pose the following stronger problem:
Prove that |δ| ≥ 1.
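This is easy to probe on random configurations. The sketch below is hedged by the same assumptions as before: the function names are my own, I project stereographically from the north pole, and I take the normalising constant to be the product of t_ji − t_ij over pairs i < j.

```python
import numpy as np

def exit_point(a, b):
    """Point where the ray from a through b leaves the unit sphere."""
    d = b - a
    A, B, C = d @ d, 2.0 * (a @ d), a @ a - 1.0
    s = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)  # the positive root
    return a + s * d

def stereo(p):
    """View a point of the unit sphere as a complex number on the Riemann sphere."""
    return (p[0] + 1j * p[1]) / (1.0 - p[2])

def delta(points):
    """det of the coefficient matrix, divided by prod(t_ji - t_ij) over i < j."""
    n = len(points)
    t = {(i, j): stereo(exit_point(points[i], points[j]))
         for i in range(n) for j in range(n) if i != j}
    M = np.empty((n, n), dtype=complex)
    for i in range(n):
        M[i] = np.polynomial.polynomial.polyfromroots(
            [t[i, j] for j in range(n) if j != i])
    norm = np.prod([t[j, i] - t[i, j]
                    for i in range(n) for j in range(i + 1, n)])
    return np.linalg.det(M) / norm

def random_ball_points(n, rng, r2=0.95):
    pts = []
    while len(pts) < n:
        p = rng.uniform(-1.0, 1.0, 3)
        if p @ p < r2:
            pts.append(p)
    return pts

rng = np.random.default_rng(1)
for n in range(2, 7):
    print(n, abs(delta(random_ball_points(n, rng))))  # these appear to be >= 1
```

For n = 2 the two polynomials are z − t₁₂ and z − t₂₁, so the determinant equals the normalising constant and δ = 1 exactly, in agreement with the boundary case of the conjecture.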
Apparently an acquaintance of Atiyah proved this for up to 4 points by symbolic manipulation in a computer algebra package, and experimentally verified that it appears to hold for much larger numbers of points.
Interestingly, if one of the points is on the boundary of the unit ball, it can be seen that deleting it does not alter the value of δ. (Hint: since we have so much invariance, it suffices to check this when the boundary point projects to the complex number 0, in which case 0 is a root of every other polynomial.) This allowed Atiyah to strengthen the problem even further:
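The deletion property can also be spot-checked numerically: place one point exactly on the sphere and compare δ with and without it. As in the earlier sketches, everything here is my own convention (north-pole stereographic projection, normalising constant ∏(tⱼᵢ − tᵢⱼ) over pairs i < j, invented function names).

```python
import numpy as np

def exit_point(a, b):
    """Point where the ray from a through b leaves the unit sphere."""
    d = b - a
    A, B, C = d @ d, 2.0 * (a @ d), a @ a - 1.0
    s = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)  # the positive root
    return a + s * d

def stereo(p):
    """View a point of the unit sphere as a complex number on the Riemann sphere."""
    return (p[0] + 1j * p[1]) / (1.0 - p[2])

def delta(points):
    """det of the coefficient matrix, divided by prod(t_ji - t_ij) over i < j."""
    n = len(points)
    t = {(i, j): stereo(exit_point(points[i], points[j]))
         for i in range(n) for j in range(n) if i != j}
    M = np.empty((n, n), dtype=complex)
    for i in range(n):
        M[i] = np.polynomial.polynomial.polyfromroots(
            [t[i, j] for j in range(n) if j != i])
    norm = np.prod([t[j, i] - t[i, j]
                    for i in range(n) for j in range(i + 1, n)])
    return np.linalg.det(M) / norm

def random_ball_points(n, rng, r2=0.95):
    pts = []
    while len(pts) < n:
        p = rng.uniform(-1.0, 1.0, 3)
        if p @ p < r2:
            pts.append(p)
    return pts

rng = np.random.default_rng(4)
interior = random_ball_points(3, rng)
v = rng.uniform(-1.0, 1.0, 3)
boundary = v / np.linalg.norm(v)       # a point exactly on the sphere

with_pt = delta(interior + [boundary])
without_pt = delta(interior)
print(abs(with_pt - without_pt))       # should be mere floating-point noise
```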
Prove that, if we leave the points in place and gradually shrink the ball until one of the points lies on the boundary, the value of |δ| does not increase.
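Since δ is scale-invariant, shrinking the ball to radius r while keeping the points fixed is equivalent to dividing every point by r and keeping the unit ball, so the shrinking process is easy to watch. The sketch below again uses my own hypothetical conventions (north-pole stereographic projection, normalising constant ∏(tⱼᵢ − tᵢⱼ) over pairs i < j, invented function names).

```python
import numpy as np

def exit_point(a, b):
    """Point where the ray from a through b leaves the unit sphere."""
    d = b - a
    A, B, C = d @ d, 2.0 * (a @ d), a @ a - 1.0
    s = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)  # the positive root
    return a + s * d

def stereo(p):
    """View a point of the unit sphere as a complex number on the Riemann sphere."""
    return (p[0] + 1j * p[1]) / (1.0 - p[2])

def delta(points):
    """det of the coefficient matrix, divided by prod(t_ji - t_ij) over i < j."""
    n = len(points)
    t = {(i, j): stereo(exit_point(points[i], points[j]))
         for i in range(n) for j in range(n) if i != j}
    M = np.empty((n, n), dtype=complex)
    for i in range(n):
        M[i] = np.polynomial.polynomial.polyfromroots(
            [t[i, j] for j in range(n) if j != i])
    norm = np.prod([t[j, i] - t[i, j]
                    for i in range(n) for j in range(i + 1, n)])
    return np.linalg.det(M) / norm

def random_ball_points(n, rng, r2=0.95):
    pts = []
    while len(pts) < n:
        p = rng.uniform(-1.0, 1.0, 3)
        if p @ p < r2:
            pts.append(p)
    return pts

rng = np.random.default_rng(3)
pts = random_ball_points(4, rng, r2=0.81)        # all within radius 0.9
pts[0] = 0.9 * pts[0] / np.linalg.norm(pts[0])   # farthest point at radius 0.9

for r in np.linspace(1.0, 0.9, 6):
    # conjecturally non-increasing as the ball shrinks:
    print(round(r, 2), abs(delta([p / r for p in pts])))
```

At r = 0.9 the first point lies exactly on the boundary, so the final printed value should coincide with |δ| of the remaining three points in that ball.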
Atiyah circulated this problem to as many mathematicians as he could, offering a bottle of champagne and an invitation to the next HLF as a reward for anyone who could solve it. I was perplexed that Atiyah, who is a ‘theory-builder’ rather than a ‘problem-solver’ such as Erdős, would be interested in a problem that, whilst elegant, seemingly bears no connection to serious research mathematics. I wondered whether he was following in the footsteps of Littlewood, who used to take disguised versions of the Riemann hypothesis and give them to PhD students as research problems.
Of course, I didn’t know at the time which great problem Atiyah had reduced to this lemma. Last year, however, he gave a talk at Cambridge presenting a proof of this geometrical inequality. I wasn’t at the talk, but apparently it involved expressing the logarithm of |δ| (possibly negated) as the von Neumann entropy of some system, and proving the strongest version of the conjecture as a corollary of entropy being non-decreasing.
On Monday morning, however, Atiyah will be presenting a proof of the Riemann hypothesis in a 45-minute talk at the Heidelberg Laureate Forum, three years after he presented this problem to us. The abstract of the forthcoming talk mentions that it builds upon work by von Neumann, which is tantalisingly consistent with my prediction that his ‘points in a ball’ conjecture was merely the remaining lemma required to solve a huge unsolved problem!
Anyway, in 60 hours’ time, number theory will be revolutionised. Let’s hope that his proof generalises easily to GRH as well, so that we can enjoy a deterministic primality test faster than AKS.