The polymath blog

September 20, 2013

Polymath8 – A Success!

Filed under: news — Gil Kalai @ 5:58 pm

The main objectives of the polymath8 project, initiated by Terry Tao  back in June, were “to understand the recent breakthrough paper of Yitang Zhang establishing an infinite number of prime gaps bounded by a fixed constant {H}, and then to lower that value of {H} as much as possible.”

Polymath8 was a remarkable success! Within two months, the best value of H, which was 70,000,000 in Zhang’s proof, was reduced to 5,414. Moreover, the polymath setting looked advantageous for this project compared to traditional ways of doing mathematics. (I have written a post with some more details and thoughts about it, viewed from a distance.)

August 9, 2013

Polymath7 research thread 5: the hot spots conjecture

Filed under: hot spots,research — Terence Tao @ 7:22 pm

This post is the new research thread for the Polymath7 project to solve the hot spots conjecture for acute-angled triangles, superseding the previous thread; this project had experienced a period of low activity for many months, but has recently picked up again, due both to renewed discussion of the numerical approach to the problem, and also some theoretical advances due to Miyamoto and Siudeja.

On the numerical side, we have decided to focus first on the problem of obtaining validated upper and lower bounds for the second Neumann eigenvalue {\mu_2} of a triangle {\Omega=ABC}. Good upper bounds are relatively easy to obtain, simply by computing the Rayleigh quotient of numerically obtained approximate eigenfunctions, but lower bounds are trickier. This paper of Liu and Oishi has some promising approaches.
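
As a toy illustration of the upper-bound side (my own sketch, not code from the project), the following Python snippet bounds {\mu_2} from above by the Rayleigh quotient of a zero-mean linear trial function: any v with \int_\Omega v = 0 satisfies \mu_2 \le \int |\nabla v|^2 / \int v^2, a linear v centred at the centroid automatically has zero mean, and its Rayleigh quotient can be evaluated exactly with the edge-midpoint quadrature rule. The function name and the restriction to linear trial functions are just for illustration.

```python
import numpy as np

def neumann_mu2_upper_bound(A, B, C, n_dirs=360):
    """Crude upper bound for the second Neumann eigenvalue mu_2 of the
    triangle ABC, via Rayleigh quotients of linear trial functions
    v(x) = d.(x - centroid), which have zero mean over the triangle."""
    A, B, C = map(np.asarray, (A, B, C))
    area = 0.5 * abs((B - A)[0] * (C - A)[1] - (B - A)[1] * (C - A)[0])
    centroid = (A + B + C) / 3.0
    # v^2 is quadratic, so the three-edge-midpoint rule integrates it exactly.
    mids = [(A + B) / 2.0, (B + C) / 2.0, (C + A) / 2.0]
    best = np.inf
    for theta in np.linspace(0.0, np.pi, n_dirs, endpoint=False):
        d = np.array([np.cos(theta), np.sin(theta)])
        num = area                          # |grad v|^2 = |d|^2 = 1
        den = (area / 3.0) * sum(float(d @ (m - centroid)) ** 2 for m in mids)
        best = min(best, num / den)
    return best

if __name__ == "__main__":
    # Unit right isosceles triangle: the exact value is mu_2 = pi^2 ~ 9.87.
    print(neumann_mu2_upper_bound((0, 0), (1, 0), (0, 1)))
```

For the unit right isosceles triangle this prints (essentially) 12, a valid if weak bound above the exact value \pi^2 \approx 9.87; the project would of course use Rayleigh quotients of much better, numerically computed approximate eigenfunctions.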

After we get good bounds on the eigenvalue, the next step is to get good control on the eigenfunction; some approaches are summarised in this note of Lior Silberman, mainly based on gluing together exact solutions to the eigenfunction equation in various sectors or disks. Some recent papers of Kwasnicki-Kulczycki, Melenk-Babuska, and Driscoll employ similar methods and may be worth studying further. However, in view of the theoretical advances, the precise control on the eigenfunction that we need may be different from what we had previously been contemplating.
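
To make the “exact solutions in sectors” idea concrete, here is a minimal sketch (assuming SciPy, and not taken from the note above) of one such local solution: on a sector of opening angle \alpha with Neumann conditions on its two straight edges, the functions J_{n\pi/\alpha}(\sqrt{\mu}\,r)\cos(n\pi\theta/\alpha) solve \Delta u = -\mu u exactly, and truncated combinations of them are the kind of building blocks one would glue together near a vertex of angle \alpha. The coefficients in the example are made up.

```python
import numpy as np
from scipy.special import jv  # Bessel function of the first kind, real order

def neumann_sector_mode(r, theta, mu, alpha, n):
    """Exact solution of Delta u = -mu u on the sector 0 < theta < alpha with
    du/dtheta = 0 on both edges: J_{n pi/alpha}(sqrt(mu) r) cos(n pi theta/alpha)."""
    nu = n * np.pi / alpha
    return jv(nu, np.sqrt(mu) * r) * np.cos(nu * theta)

if __name__ == "__main__":
    # Evaluate a crude truncated expansion near a vertex of angle pi/3,
    # with hypothetical coefficients, just to show the shape of such a local solution.
    alpha, mu = np.pi / 3, 10.0
    coeffs = [1.0, -0.3, 0.05]          # hypothetical coefficients a_0, a_1, a_2
    r, theta = 0.2, alpha / 4
    print(sum(a * neumann_sector_mode(r, theta, mu, alpha, n)
              for n, a in enumerate(coeffs)))
```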

These two papers of Miyamoto introduced a promising new method to theoretically control the behaviour of the second Neumann eigenfunction {u_2}, by taking linear combinations of that eigenfunction with other, more explicit, solutions to the eigenfunction equation {\Delta u = - \mu_2 u}, restricting that combination to nodal domains, and then computing the Dirichlet energy on each domain. Among other things, these methods can be used to exclude critical points occurring anywhere in the interior or on the edges of the triangle except for those points that are close to one of the vertices; and in this recent preprint of Siudeja, two further partial results on the hot spots conjecture are obtained by a variant of the method:

  • The hot spots conjecture is established unconditionally for any acute-angled triangle which has one angle less than or equal to {\pi/6} (actually a slightly larger region than this is obtained). In particular, the case of very narrow triangles has been resolved (the dark green region in the figure below).
  • The hot spots conjecture is also established for any acute-angled triangle with the property that the second eigenfunction {u_2} has no critical points on two of the three edges (excluding vertices).

So if we can develop more techniques to rule out critical points occurring on edges (i.e. to keep eigenfunctions monotone on the edges on which they change sign), we may be able to establish the hot spots conjecture for a further range of triangles. In particular, some hybrid of the Miyamoto method and the numerical techniques we are beginning to discuss may be a promising approach to fully resolve the conjecture. (For instance, the Miyamoto method relies on upper bounds on {\mu_2}, and these can be obtained numerically.)

The arguments of Miyamoto also allow one to rule out critical points occurring at most of the interior points of a given triangle; it is only the points that are very close to one of the three vertices which we cannot yet rule out by Miyamoto’s methods. (But perhaps they can be ruled out by the numerical methods we are also developing, thus giving a hybrid solution to the conjecture.)

Below the fold I’ll describe some of the theoretical tools used in the above arguments.

June 4, 2013

Polymath proposal: bounded gaps between primes

Filed under: planning,polymath proposals — Terence Tao @ 4:31 am

Two weeks ago, Yitang Zhang announced his result establishing that bounded gaps between primes occur infinitely often, with the explicit upper bound of 70,000,000 given for this gap.  Since then there has been a flurry of activity in reducing this bound, with the current record being 4,802,222 (but likely to improve at least by a little bit in the near future).

It seems that this naturally suggests a Polymath project with two interrelated goals:

  1. Further improving the numerical upper bound on gaps between primes; and
  2. Understanding and clarifying Zhang’s argument (and other related literature, e.g. the work of Bombieri, Fouvry, Friedlander, and Iwaniec on variants of the Elliott-Halberstam conjecture).

Part 1 of this project splits off into somewhat independent sub-projects:

  1. Finding narrow prime admissible tuples of a given cardinality (or, dually, finding large prime admissible tuples in a given interval).  This part of the project would be relatively elementary in nature, relying on combinatorics, elementary number theory, computer search, and perhaps some clever algorithm design.  (Scott Morrison has already been hosting a de facto project of this form at this page, and is happy to continue doing so.)  (A toy admissibility check is sketched after this list.)
  2. Solving a calculus of variations problem associated with the Goldston-Yildirim-Pintz argument (discussed at this blog post, or in this older survey of Soundararajan) [in particular, this could lead to an improvement of a certain key parameter k_0, currently at 341,640, even without any improvement in the parameter \varpi mentioned in part 3. below.]
  3. Delving through the “hard” part of Zhang’s paper in order to improve the value of a certain key parameter \varpi (which Zhang sets at 1/1168, but is likely to be enlargeable).
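
For readers new to sub-project 1.1, here is a small self-contained sketch (illustrative only, not part of Scott Morrison’s page) of the basic admissibility test: a tuple of k integers is admissible if, for every prime p, its elements avoid at least one residue class mod p, and only primes p \le k need to be checked, since k residues cannot cover all p classes when p > k.

```python
def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [p for p in range(n + 1) if sieve[p]]

def is_admissible(h_tuple):
    """True if the integers h_1,...,h_k avoid at least one residue class
    modulo every prime p <= k (and hence modulo every prime)."""
    k = len(h_tuple)
    for p in primes_up_to(k):
        if len({h % p for h in h_tuple}) == p:
            return False
    return True

if __name__ == "__main__":
    print(is_admissible([0, 2, 6, 8, 12]))   # True: a narrowest admissible 5-tuple
    print(is_admissible([0, 2, 4]))          # False: covers every class mod 3
```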

Part 2 of this project could be run as an online reading seminar, similar to the online reading seminar of the Furstenberg-Katznelson paper that was part of the Polymath1 project.  It would likely focus on the second half of Zhang’s paper and would fit well with part 1.3.  I could run this on my blog, and this existing blog post of mine could be used for part 1.2.

As with other polymath projects, it is conceivable that enough results will be obtained to justify publishing one or more articles (which, traditionally, we would publish under the D.H.J. Polymath pseudonym).  But it is perhaps premature to discuss this possibility at this early stage of the process.

Anyway, I would be interested to gauge the level of interest and likely participation in these projects, together with any suggestions for improving the proposal or other feedback.

March 2, 2013

Polymath proposal (Tim Gowers): Randomized Parallel Sorting Algorithm

Filed under: polymath proposals — Gil Kalai @ 4:41 pm

From Holroyd’s sorting networks picture gallery

A celebrated theorem of Ajtai, Komlós and Szemerédi describes a sorting network for $n$ numbers of depth $O(\log n)$: there are $O(\log n)$ rounds, and in each round $n/2$ comparisons are performed in parallel. Tim Gowers proposes to collectively find a randomized parallel sorting algorithm with the same properties.
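
To make the setting concrete, here is a small Python sketch (purely illustrative, not a proposed solution) of the framework: each parallel round applies $n/2$ disjoint compare-exchange operations given by a matching of the positions. The open problem is to choose the (random) matchings so that $O(\log n)$ rounds provably suffice; the uniformly random matchings used below are not known to achieve this.

```python
import random

def compare_exchange_round(a, matching):
    """One parallel round: n/2 disjoint compare-exchange operations.  For each
    matched pair of positions, the smaller value goes to the smaller index."""
    for i, j in matching:
        if i > j:
            i, j = j, i
        if a[i] > a[j]:
            a[i], a[j] = a[j], a[i]

def random_matching(n, rng):
    """A uniformly random perfect matching of the n positions (n even)."""
    perm = list(range(n))
    rng.shuffle(perm)
    return [(perm[2 * t], perm[2 * t + 1]) for t in range(n // 2)]

def randomized_sort(a, rounds, rng=None):
    """Run the given number of rounds of random compare-exchange matchings.
    This is only a toy for experimenting with the framework of the proposal;
    it makes no claim about the number of rounds actually needed."""
    rng = rng or random.Random(0)
    n = len(a)
    for _ in range(rounds):
        compare_exchange_round(a, random_matching(n, rng))
    return a

if __name__ == "__main__":
    data = list(range(1024))
    random.Random(1).shuffle(data)
    out = randomized_sort(data, rounds=200)
    print("sorted after 200 rounds:", out == sorted(out))
```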

February 14, 2013

Next Polymath Project(s): What, When, Where?

Filed under: polymath proposals — Gil Kalai @ 3:26 pm

Let us have a little discussion about it.

We may also discuss general and specific open mathematical research projects that have a different flavor or different rules.

Proposals for polymath projects appeared on this blog,  in this post on Gowers’s blog, and in several other places.

September 10, 2012

Polymath7 research threads 4: the Hot Spots Conjecture

Filed under: hot spots,research — Terence Tao @ 7:28 pm

It’s time for another rollover of the  Polymath7 “Hot Spots” conjecture, as the previous research thread has again become full.

Activity has now focused on a numerical strategy to solve the hot spots conjecture for all acute-angled triangles ABC.  In broad terms, the strategy (also outlined in this document) is as follows.  (I’ll focus here on the problem of estimating the eigenfunction; one also needs to simultaneously obtain control on the eigenvalue, but this seems to be a somewhat more tractable problem.)

  1. First, observe that as the conjecture is scale invariant, the only relevant parameters for the triangle ABC are the angles \alpha,\beta,\gamma, which of course lie between 0 and \pi/2 and add up to \pi.  We can also order \alpha \leq \beta \leq \gamma, giving a parameter space which is a triangle between the values (\alpha,\beta,\gamma) = (0,\pi/2,\pi/2), (\pi/4,\pi/4,\pi/2), (\pi/3,\pi/3,\pi/3).
  2. The triangles that are too close to the degenerate isosceles triangle (0,\pi/2,\pi/2) or the equilateral triangle (\pi/3,\pi/3,\pi/3) need to be handled by analytic arguments.  (Preliminary versions of these arguments can be found here and in Section 6 of these notes respectively, but the constants need to be made explicit (and as strong as possible).)
  3. For the remaining parameter space, we will use a sufficiently fine discrete mesh of angles (\alpha,\beta,\gamma); the optimal spacing of this mesh is yet to be determined.  (A toy enumeration of such a mesh is sketched after this list.)
  4. For each triplet of angles in this mesh, we partition the triangle ABC (possibly after rescaling it to a reference triangle \hat \Omega, such as the unit right-angled triangle) into smaller subtriangles, and approximate the second eigenfunction w (or the rescaled eigenfunction \hat w) by the eigenfunction w_h (or \hat w_h) for a finite element restriction of the eigenvalue problem, in which the function is continuous and piecewise polynomial of low degree (probably linear or quadratic) in each subtriangle; see Section 2.2 of these notes.  With respect to a suitable basis, w_h can be represented by a finite vector u_h.
  5. Using numerical linear algebra methods (such as Lanczos iteration) with interval arithmetic, obtain an approximation \tilde u to u_h, with rigorous bounds on the error between the two.  This gives an approximation to w_h or \hat w_h with rigorous error bounds (initially of L^2 type, but presumably upgradable).
  6. After (somehow) obtaining a rigorous error bound between w and w_h (or \hat w and \hat w_h), conclude that w stays far from its extremum when one is sufficiently far away from the vertices A,B,C of the triangle.
  7. Using L^\infty stability theory of eigenfunctions (see Section 5 of these notes), conclude that w stays far from its extremum even when (\alpha,\beta,\gamma) is not at a mesh point.  Thus, the hot spots conjecture is not violated away from the vertices.  (This argument should also handle the vertex that is neither the maximum nor minimum value for the eigenfunction, leaving only the neighbourhoods of the two extremising vertices to deal with.)
  8. Finally, use an analytic argument (perhaps based on these calculations) to show that the hot spots conjecture is also not violated near an extremising vertex.
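
As a toy illustration of steps 1 and 3 (with an arbitrary spacing parameter, not the spacing the project would actually need), the discrete mesh of ordered angle triples can be enumerated as follows.

```python
import numpy as np

def acute_angle_mesh(h):
    """Angle triples (alpha, beta, gamma) with spacing roughly h, covering the
    acute triangles with alpha <= beta <= gamma < pi/2 and alpha+beta+gamma = pi.
    The spacing h is a placeholder; the spacing needed for the rigorous
    argument is yet to be determined."""
    triples = []
    for alpha in np.arange(h, np.pi / 3 + 1e-12, h):
        for beta in np.arange(alpha, (np.pi - alpha) / 2 + 1e-12, h):
            gamma = np.pi - alpha - beta
            if beta <= gamma < np.pi / 2:
                triples.append((alpha, beta, gamma))
    return triples

if __name__ == "__main__":
    mesh = acute_angle_mesh(np.pi / 60)   # 3-degree spacing, say
    print(len(mesh), "mesh points; first:", mesh[0])
```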

This all looks like it should work in principle, but it is a substantial amount of effort; there is probably still some scope to try to simplify the scheme before we really push for implementing it.
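
To give a concrete (and entirely non-rigorous) flavour of steps 4 and 5, here is a short Python sketch, assuming NumPy and SciPy, that assembles the piecewise-linear finite element stiffness and mass matrices on a uniform subdivision of a given triangle and solves the generalized eigenproblem K u = \lambda M u for the second Neumann eigenpair. It works in floating point with no interval arithmetic and no error bounds, so it is only a starting point for the validated computation described above.

```python
import numpy as np
from scipy.linalg import eigh

def second_neumann_eigenpair(A, B, C, m=40):
    """P1 finite element approximation of the second Neumann eigenvalue and
    eigenfunction of triangle ABC, on a uniform mesh of m^2 subtriangles."""
    A, B, C = map(np.asarray, (A, B, C))
    # Nodes P_{i,j} = A + (i/m)(B-A) + (j/m)(C-A) with i + j <= m.
    index, nodes = {}, []
    for i in range(m + 1):
        for j in range(m + 1 - i):
            index[(i, j)] = len(nodes)
            nodes.append(A + (i / m) * (B - A) + (j / m) * (C - A))
    nodes = np.array(nodes, dtype=float)
    elements = []
    for i in range(m):
        for j in range(m - i):
            elements.append((index[(i, j)], index[(i + 1, j)], index[(i, j + 1)]))
            if i + j + 2 <= m:
                elements.append((index[(i + 1, j)], index[(i + 1, j + 1)],
                                 index[(i, j + 1)]))
    n = len(nodes)
    K, M = np.zeros((n, n)), np.zeros((n, n))     # stiffness and mass matrices
    local_mass = np.array([[2, 1, 1], [1, 2, 1], [1, 1, 2]]) / 12.0
    for tri in elements:
        p = nodes[list(tri)]
        area = 0.5 * abs((p[1, 0] - p[0, 0]) * (p[2, 1] - p[0, 1])
                         - (p[1, 1] - p[0, 1]) * (p[2, 0] - p[0, 0]))
        # Gradients of the three hat (barycentric) basis functions.
        grads = np.array([[p[1, 1] - p[2, 1], p[2, 0] - p[1, 0]],
                          [p[2, 1] - p[0, 1], p[0, 0] - p[2, 0]],
                          [p[0, 1] - p[1, 1], p[1, 0] - p[0, 0]]]) / (2 * area)
        K[np.ix_(tri, tri)] += area * grads @ grads.T
        M[np.ix_(tri, tri)] += area * local_mass
    evals, evecs = eigh(K, M)          # ascending; evals[0] ~ 0 (constants)
    return evals[1], evecs[:, 1], nodes

if __name__ == "__main__":
    # Sanity check on the unit right isosceles triangle, where the exact second
    # Neumann eigenvalue is pi^2 and the extrema sit at the two acute vertices.
    mu2, u2, nodes = second_neumann_eigenpair((0, 0), (1, 0), (0, 1))
    print("approximate mu_2:", mu2, "(exact: pi^2 =", np.pi ** 2, ")")
    print("location of max |u_2|:", nodes[np.argmax(np.abs(u2))])
```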

July 13, 2012

Minipolymath4 project, second research thread

Filed under: polymath proposals — Terence Tao @ 7:49 pm

It’s been almost 24 hours since the mini-polymath4 project was opened in order to collaboratively solve Q3 from the 2012 IMO.  In that time, the first part of the question was solved, but the second part remains open.  In other words, it remains to show that for any sufficiently large k, there is some n \geq (1.99)^k such that the second player B in the liar’s guessing game cannot guarantee a win, no matter how many questions he asks.

As the previous research thread is getting quite lengthy (and is mostly full of attacks on the first part of the problem, which is now solved), I am rolling over to a fresh thread (as is the standard practice with the polymath projects).  Now would be a good time to summarise some of the observations from the previous thread which are still relevant here.  For instance, here are some relevant statements made in previous comments:

  1. Without loss of generality we may take N=n+1; if B can’t win in this case, then he certainly can’t win for any larger value of N (since A could simply restrict her number x to a number up to n+1), and if B can win in this case (i.e. he can eliminate one of the n+1 possibilities) he can also perform elimination for any larger value of N by partitioning the possible answers into n+1 disjoint sets and running the N=n+1 strategy, and then one can iterate one’s way down to N=n+1.
  2. In order to show that B cannot force a win, one needs to show that for any possible sequence of questions B asks in the N=n+1 case, it is possible to construct a set of responses by A in which none of the n+1 possibilities of x are eliminated; that is, each candidate x belongs to at least one of the sets indicated by A’s answers in each block of k+1 consecutive answers.
  3. It may be useful to look at small examples, e.g. can one show that B cannot win when k=1, n=1? Or when k=2, n=3? (A brute-force search along these lines is sketched below.)
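
Here is a brute-force sketch (my own illustration, not code from the discussion) for exploring such small cases, using the N=n+1 reduction and the elimination criterion from points 1 and 2 above. The relevant state is the vector of current consecutive-lie counts for the n+1 candidates (a candidate is eliminated once its count exceeds k), and whether B can force an elimination is then a finite reachability game, solvable by a least-fixed-point iteration.

```python
from itertools import product

def b_can_force_elimination(k, n):
    """Decide whether B can force the elimination of at least one of the
    N = n+1 candidate values; by the reduction above this is equivalent to B
    being able to guarantee a win.  Exponential in k and n: small cases only."""
    N = n + 1
    # Only membership of the N candidates matters, so questions are subsets.
    questions = [frozenset(c for c in range(N) if (mask >> c) & 1)
                 for mask in range(2 ** N)]
    states = list(product(range(k + 1), repeat=N))   # consecutive-lie counters

    def step(state, S, answer_yes):
        """Counters after B asks S and A answers; None if some candidate's
        run of consecutive lies exceeds k (i.e. it is eliminated)."""
        new = []
        for x, c in enumerate(state):
            truthful = (x in S) == answer_yes
            c = 0 if truthful else c + 1
            if c > k:
                return None
            new.append(c)
        return tuple(new)

    winning = set()   # states from which B can force an elimination
    changed = True
    while changed:
        changed = False
        for s in states:
            if s in winning:
                continue
            for S in questions:
                if all(t is None or t in winning
                       for t in (step(s, S, True), step(s, S, False))):
                    winning.add(s)
                    changed = True
                    break
    return (0,) * N in winning

if __name__ == "__main__":
    for k, n in [(1, 1), (2, 3)]:
        print(f"k={k}, n={n}, N={n + 1}: B can force an elimination:",
              b_can_force_elimination(k, n))
```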

It seems that some of the readers of this blog have already obtained a solution to this problem from other sources, or from working separately on the problem, so I would ask that they refrain from giving spoilers for this question until at least one solution has been arrived at collaboratively.

Also, participants are encouraged to edit the wiki as appropriate with new developments and ideas, and participate in the discussion thread for any meta-discussion about the polymath project.

July 12, 2012

Minipolymath4 project: IMO 2012 Q3

Filed under: research — Terence Tao @ 10:00 pm

This post marks the official opening of the mini-polymath4 project to solve a problem from the 2012 IMO.  This time, I have selected Q3, which has an interesting game-theoretic flavour to it.

Problem 3.   The liar’s guessing game is a game played between two players A and B.  The rules of the game depend on two positive integers k and n which are known to both players.

At the start of the game, A chooses two integers x and N with 1 \leq x \leq N.  Player A keeps x secret, and truthfully tells N to player B.  Player B now tries to obtain information about x by asking player A questions as follows.  Each question consists of B specifying an arbitrary set S of positive integers (possibly one specified in a previous question), and asking A whether x belongs to S.  Player B may ask as many such questions as he wishes.  After each question, player A must immediately answer it with yes or no, but is allowed to lie as many times as she wishes; the only restriction is that, among any k+1 consecutive answers, at least one answer must be truthful.

After B has asked as many questions as he wants, he must specify a set X of at most n positive integers.  If x belongs to X, then B wins; otherwise, he loses.  Prove that:

  1. If n \geq 2^k, then B can guarantee a win.
  2. For all sufficiently large k, there exists an integer n \geq 1.99^k such that B cannot guarantee a win.

The comments to this post shall serve as the research thread for the project, in which participants are encouraged to post their thoughts and comments on the problem, even if (or especially if) they are only partially conclusive.  Participants are also encouraged to visit the discussion thread for this project, and also to visit and work on the wiki page to organise the progress made so far.

This project will follow the general polymath rules.  In particular:
  1. All are welcome. Everyone (regardless of mathematical level) is welcome to participate.  Even very simple or “obvious” comments, or comments that help clarify a previous observation, can be valuable.
  2. No spoilers! It is inevitable that solutions to this problem will become available on the internet very shortly.  If you are intending to participate in this project, I ask that you refrain from looking up these solutions, and that those of you who have already seen a solution to the problem refrain from giving out spoilers, until at least one solution has already been obtained organically from the project.
  3. Not a race. This is not intended to be a race between individuals; the purpose of the polymath experiment is to solve problems collaboratively rather than individually, by proceeding via a multitude of small observations and steps shared between all participants.   If you find yourself tempted to work out the entire problem by yourself in isolation, I would request that you refrain from revealing any solutions you obtain in this manner until after the main project has reached at least one solution on its own.
  4. Update the wiki. Once the number of comments here becomes too large to easily digest at once, participants are encouraged to work on the wiki page to summarise the progress made so far, to help others get up to speed on the status of the project.
  5. Metacomments go in the discussion thread. Any non-research discussions regarding the project (e.g. organisational suggestions, or commentary on the current progress) should be made at the discussion thread.
  6. Be polite and constructive, and make your comments as easy to understand as possible. Bear in mind that the mathematical level and background of participants may vary widely.

Have fun!

June 24, 2012

Polymath7 research threads 3: the Hot Spots Conjecture

Filed under: hot spots,research — Terence Tao @ 7:21 pm

It’s once again time to roll over the research thread for the Polymath7 “Hot Spots” conjecture, as the previous research thread has again become full.

As the project moves into a more mature stage, with most of the “low-hanging fruit” already collected, progress is now a bit less hectic, but our understanding of the problem is improving.  For instance, in the previous thread, the relationship between two different types of arguments to obtain monotonicity of eigenfunctions – namely the coupled Brownian motion methods of Banuelos and Burdzy, and an alternate argument based on vector-valued maximum principles – was extensively discussed, and it is now fairly clear that the two methods yield a more or less equivalent set of results (e.g. monotonicity for obtuse triangles with Neumann conditions, or acute triangles with two Neumann and one Dirichlet side).  Unfortunately, for scalene triangles it was observed that the behaviour of eigenfunctions near all three vertices basically precludes any reasonable monotonicity property from taking place (in particular, the conjecture in the preceding thread in this regard was false).  This is something of a setback, but perhaps there is some other monotonicity-like property which could still hold for scalene acute triangles, and which would imply the hot spots conjecture for these triangles.  We do now have a number of accurate numerical representations of eigenfunctions, as well as some theoretical understanding of their behaviour, especially near vertices or near better-understood triangles (such as the equilateral or isosceles right-angled triangle), so perhaps they could be used to explore some of these properties.

Another result claimed in previous threads – namely, a theoretical proof of the simplicity of the second eigenvalue – has now been completed, with fairly good bounds on the spectral gap, which looks like a useful thing to have.

It was realised that a better understanding of the geometry of the nodal line would be quite helpful – in particular its convexity, which would yield the hot spots conjecture on one side of the nodal line at least.   We did establish one partial result in this direction, namely that the nodal line cannot hit the same edge of the triangle twice, but must instead straddle two edges of the triangle (or a vertex and an opposing edge, though presumably this case only occurs for isosceles triangles).  Unfortunately, more control on the nodal line is needed.

In the absence of a definitive theoretical approach to the problem, the other main approach is via rigorous numerics – to obtain, for a sufficiently dense mesh of test triangles, a collection of numerical approximations to second eigenfunctions which are provably close (in a suitable norm) to a true second eigenfunction, and whose extrema (or near-extrema) only occur at vertices (or near-vertices).  In principle, this sort of information would be good enough to rigorously establish the hot spots conjecture for such a test triangle as well as nearby perturbations of that triangle.  The details of this approach, though, are still being worked out.  (And given that they could be a bit messy, it may well be a good idea to not proceed too prematurely with the numerical approach, in case some better approach is discovered in the near future).  One proposal is to focus on a single typical triangle (e.g. the 40-60-80 triangle) as a test case in order to fix parameters.

There was also some further exploration of whether reflection methods could be pushed further.  It was pointed out that even in the very simple case of the unit interval [0,1], it is not obvious (even heuristically) from reflection arguments why the hot spots conjecture should be true.  Reflecting around a vertex whose angle does not go evenly into \pi has created a number of technical difficulties which have so far not been satisfactorily addressed (but perhaps getting an alternate proof of hot spots in the model cases where reflection does work, i.e. the 30-60-90, 45-45-90, and 60-60-60 triangles, would be worthwhile).

June 15, 2012

Polymath7 research threads 2: the Hot Spots Conjecture

Filed under: hot spots,research — Terence Tao @ 9:48 pm

The previous research thread for the Polymath7 “Hot Spots Conjecture” project has once again become quite full, so it is again time to roll it over and summarise the progress so far.

Firstly, we can update the map of parameter space from the previous thread to incorporate all the recent progress (including some that has not quite yet been completed):

This map reflects the following new progress:

  1. We now have (or will soon have) a rigorous proof of the simplicity of the second Neumann eigenvalue for all non-equilateral acute triangles (this is the dotted region), thus finishing off the first part of the hot spots conjecture in all cases.  The main idea here is to combine upper bounds on the second eigenvalue \lambda_2 (obtained by carefully choosing trial functions for the Rayleigh quotient), with lower bounds on the sum \lambda_2+\lambda_3 of the second and third eigenvalues, obtained by using a variety of lower bounds coming from reference triangles such as the equilateral or isosceles right triangle.  This writeup contains a treatment of those triangles close to the equilateral triangle, and it is expected that the other cases can be handled similarly.
  2. For super-equilateral triangles (the yellow edges) it is now known that the extreme points of the second eigenfunction occur at the vertices of the base, by cutting the triangle in half to obtain a mixed Dirichlet-Neumann eigenvalue problem, and then using the synchronous Brownian motion coupling method of Banuelos and Burdzy to show that certain monotonicity properties of solutions to the heat equation are preserved.  This fact can also be established via a vector-valued maximum principle.  Details are on the wiki.
  3. Using stability of eigenfunctions and eigenvalues with respect to small perturbations (at least when there is a spectral gap), one can extend the known results for right-angled and non-equilateral triangles to small perturbations of these triangles (the orange region).  For instance, the stability results of Banuelos and Pang already give control of perturbed eigenfunctions in the uniform norm; since for right-angled triangles and non-equilateral triangles, the extrema only occur at vertices, and from Bessel expansion and uniform C^2 bounds we know that for any perturbed eigenfunction, the vertices will still be local extrema at least (with a uniform lower bound on the region in which they are extremisers), we conclude that the global extrema will still only occur at vertices for perturbations.
  4. Some variant of this argument should also work for perturbations of the equilateral triangle (the dark blue region).  The idea here is that the second eigenfunction of a perturbed equilateral triangle should still be close (in, say, the uniform norm) to some second eigenfunction of the equilateral triangle.  Understanding the behaviour of eigenfunctions of nearly equilateral triangles more precisely seems to be a useful short-term goal to pursue next in this project.

But there is also some progress that is not easily representable on the above map.  It appears that the nodal line \{u=0\} of the second eigenfunction u may play a key role.  By using reflection arguments and known comparison inequalities between Dirichlet and Neumann eigenvalues, it was shown that the nodal line cannot hit the same edge twice, and thus must straddle two distinct edges (or a vertex and an opposing edge).   (The argument is sketched on the wiki.) If we can show some convexity of the nodal line, this should imply that the vertex straddled by the nodal line is a global extremum by the coupled Brownian motion arguments, and the only extremum on this side of the nodal line, leaving only the other side of the nodal line (with two vertices rather than one) to consider.

We’re now also getting some numerical data on eigenvalues, eigenfunctions, and the spectral gap.  The spectral gap looks reasonably large once one is away from the degenerate triangles and the equilateral triangle, which bodes well for an attempt to resolve the conjecture for acute-angled triangles by rigorous numerics and perturbation theory.  The eigenfunctions also look reassuringly monotone in various directions, which perhaps suggests some conjectures to make in this regard (e.g. are eigenfunctions always monotone along the direction parallel to the longest side?).

This isn’t a complete summary of the discussion thus far – other participants are encouraged to summarise anything else that happened that bears repeating here.
