114

When I was a graduate student in math (mid-to-late eighties and early nineties) the arena was dominated by a few grand projects: for instance, Misha Gromov's hyperbolic groups, which spread into many seemingly heterogeneous domains (combinatorial group theory, to name just one), and Bill Thurston's classification of low-dimensional manifolds.

A few years have passed, and I must admit that, aside from my pet domains, such as categorical algebra and applied logic, I have not kept up with innovations.

I am pretty sure, though, that new trends, generated by some core new ideas, are leading contemporary math research, or substantial portions thereof. I already know of one such idea: Voevodsky's homotopical foundations of logic, which brings together abstract homotopy theory and type theory.

What else?

PS I am not interested (at least not directly) in new solutions to old problems, unless these solutions are the germs of (and conducive to) new directions.

Rather, I would like to hear about new research projects which unify previously disparate fields, or create entirely new ones, or shed light on old domains in a radically new way (in other words, I am interested in what Thomas Kuhn called paradigm shifts). The "or"s are of course not mutually exclusive.

Addendum: It looks like there is an ongoing debate on whether this question should be kept open or not. As I have already said below, I submit entirely and with no reservations to the policy of this respectable forum, whatever the outcome. As a dweller of MO, though, I am entitled to my personal opinion, and it is to keep my question in the air. Nevertheless, I am well aware of the risk of it either turning into "what I like best in math right now" or attracting generic answers that provide no meat for this community. Therefore, allow me to add a clarification:

My wish is the following: to learn from informed individuals which new paradigm-shifting trends are currently under way.

To this effect I appreciate all the answers so far, but I invite everybody to provide some level of detail as to why they have mentioned a specific idea and/or research project (some folks have already done so). For instance, one answer was tropical math, which indeed seems to have risen to prominence in recent years. It would be nice to know which areas it impacts and in which way, which core ideas it is founded on, which threads it brings together, etc. The same applies of course to all other proposals.

  • 6
    This could be quite an interesting discussion, which I personally could likely enjoy and might even contribute to. That being said, I firmly believe that this is too subjective and possibly argumentative for MO. Voted to close as such. –  Dec 30 '12 at 21:18
  • 25
    dear quid, I have no issues if you folks decide to close it down, and I do understand your misgivings about possible arguments on what is important and what not.

    But the fact is, there have always been and there will always be leading trends in math and the sciences, and the only thing I am after here is a set of honest answers: why not let working mathematicians talk about what is important right now in their great field? If people feel differently about what is relevant, so be it. All the best, Mirco

    – Mirco A. Mannucci Dec 30 '12 at 21:41
  • 4
    As I said it could be quite interesting. Yet, opinions could be quite mixed. Not sure this goes over so well (but so far my vote is the only one, so). For a substitute, around 2000 a couple of books in this direction were published; this is also already more than a decade ago but then also quite a bit more recent than the 80s. Visions in mathematics: GAFA 2000 Special Volume, in particular, is quite interesting, containing a transcript of a discussion around this more or less. –  Dec 30 '12 at 22:00
  • 23
    This is a nice question. Of course it is subjective; most of mathematics is subjective! We are faced with subjective criteria every time we referee or submit a paper, every time we choose a research problem, every time we propose a topic for a thesis.

    I would like to offer another criterion to those who are in haste to close questions like this one: will the average reader of this forum learn something from the question, its answers, and/or the discussion? If yes, let it run; if not, close it if you must.

    – alvarezpaiva Dec 31 '12 at 01:26
  • 2
    @alvarezpaiva: I fear it is always possible to learn things which are false, and then we get people following Brian the Messiah and marvelling at his shoe. – Yemon Choi Dec 31 '12 at 03:39
  • 3
    @Yemon Yes, it is. The gullible are also at fault, though. Banning information too much or forbidding discussions too quickly because it's potentially dangerous isn't going to fly in a democratic community, in my opinion. – Yuichiro Fujiwara Dec 31 '12 at 09:21
  • 2
    The new universe of math opened by Mochizuki while proving the ABC conjecture certainly fits the bill, but I lack the expertise to write a nice answer built around that... – Suvrit Dec 31 '12 at 09:29
  • 2
    @Yemon: We're mostly professional mathematicians here so there is little danger of following Brian. We have other Messiahs. Moderators do not guarantee the exactness of questions or answers, and the decision to close a non-homework (or totally stupid) question is often more subjective than the question itself. – alvarezpaiva Dec 31 '12 at 11:47
  • 13
    Meta thread created http://tea.mathoverflow.net/discussion/1505/ Please post further contributions on the appropriateness of the question there; for those not yet meta active: there is an extra sign-up for meta (do not try to use your MO login) but it is simple and instant (despite the wording 'apply for membership') –  Dec 31 '12 at 14:16
  • I couldn't get myself to read the text in all CAPS---I guess I might not be the only one with this difficulty---perhaps the OP might want to clean up the question, now that it has already garnered so much attention.... – Suvrit Jan 02 '13 at 13:58
  • @Suvrit: as there is clear evidence you are not alone (on a deleted answer and on meta) I changed the format, though personally I have not much problem with this aspect of the question. (Only, I wonder a bit why nobody else did it so far if this is such an issue...) –  Jan 02 '13 at 14:24
  • @quid: because this is a question that has garnered a lot of interest, I did not want to impinge upon the OP's prerogative by editing the question myself...but thanks for making it look less abrasive :-) – Suvrit Jan 02 '13 at 16:33
  • @suvrit Well, the caps were meant to be noticed, and sure enough they were! :) I agree that it was a bit rough, but it provoked the intended effect, namely to push people to be more informative in their answers. I am glad though that quid polished it up. – Mirco A. Mannucci Jan 05 '13 at 00:03
  • 1
    Motivic homotopy theory – tttbase May 12 '16 at 03:08

30 Answers

76

The Langlands program. It goes back to the sixties, but in recent years, with the proof of the fundamental lemma by Ngô Bảo Châu and with several results in the local case, it has become one of the most active areas in number theory, and I think there is no hope of finishing the job in the next, say, 50 years.

Ricky
  • 3,674
  • 7
    "there is *now hope", I hope? – Kevin Casto Jan 02 '13 at 00:57
  • Actually, to me it makes more sense with 'no' than with 'now' (and I believe(d) this was meant). ('the last years' are not so few after all, since the main local results I am aware of are twenty (or even thirty) [pos char] and ten [char 0] years old, respectively. And, about the same for the function field results [n=2, general].) –  Jan 02 '13 at 10:59
  • 15
    I really meant no. This is of course just a personal opinion (and I am not an expert!). But I think there are very important recent results in the local case, for example the local-global compatibility proved by Emerton. – Ricky Jan 02 '13 at 11:36
    @Ricky: Thank you for clarifying this. Also, I should perhaps add that my comment should not be read as some critical evaluation of more recent results (of which I am simply essentially unaware; not that I am overly familiar with the ones I mentioned). All I meant to point out is that it seems to me there was/is already (continual) progress over a long period of time and thus it seems reasonable to expect that it will take still more time to reach some 'end' (whatever this even means precisely for this program), as indeed you meant, too. –  Jan 02 '13 at 12:04
  • 5
    I agree with Kevin. It is not true anymore that there is no hope of finishing it in the next 50 years. There has been tremendous progress in the last 20 years (basically since Wiles proved FLT). The announcement last year (is there a preprint yet?) by Thorne, Li, Harris and Taylor of the construction of a Galois representation attached to an arbitrary regular automorphic form for Gl_n is great progress. The importance of the general proof of the fundamental lemma by NGO can hardly be overestimated, and could lead him and others to new progress in functoriality. Laurent Lafforgue's ... – Joël Jan 04 '13 at 12:02
  • 4
    project on base change without the twisted trace formula has not come to fruition yet, but it may well do so soon, and open new ways. The progress in modularity results for Galois representations is also very impressive. And someone unknown now may very well come tomorrow with a great new idea. So, sure, several very important breakthroughs, plus a tremendous amount of technical work, are still needed to "finish" the Langlands program -- but there is a hope now that this may happen much sooner than we expected 20 years ago. – Joël Jan 04 '13 at 12:07
  • Dear Joël, thank you for the comment. I would like to stress once again that my sentence was just a personal opinion (based on my, very limited, knowledge on the subject) and was not intended to be taken very seriously. In any case I am very happy to know about the results you cited! – Ricky Jan 04 '13 at 12:40
  • @Joël: Could you make precise what you mean by 'the Langlands program'. This seems important to me for any claim in this direction. It is not like (in my perception) 'the Langlands program' has a universally agreed upon meaning. –  Jan 04 '13 at 20:15
  • 2
    Dear Quid, you are right, and this is why I have put "finish" between quotation marks. The Langlands program is pretty much open-ended. Let me be more precise by giving definitions of the Langlands Program. A certainly narrow interpretation, one that Langlands would certainly reject as too narrow, but within which many people are working now, is the existence of a natural bijection, satisfying some properties too long to be stated here, between algebraic cuspidal automorphic forms for $Gl_n$ over a number field $K$, and geometric $l$-adic Galois representations of $G_K$. This program... – Joël Jan 06 '13 at 20:51
  • 1
    certainly seemed completely inaccessible 20 years ago, but in my opinion does not seem so inaccessible right now. Sure, a few enormous breakthroughs are still needed (for treating algebraic Maass forms and other algebraic non-regular forms in higher rank, and also for the case of arbitrary fields instead of CM fields), plus a lot of work to improve the "Galois to automorphic" direction, but there have been many equally impressive breakthroughs in the recent past, and there is a reasonable "hope" that progress will continue at the same pace. Assuming this, one could expect this program...
    – Joël Jan 06 '13 at 20:56
    to be finished in the next 10 years. A larger version of the Langlands program contains the above plus functoriality for general automorphic representations, corresponding to any morphism of L-groups. This is, I believe, what most people mean when they say Langlands Program. This is much harder, but even for this larger program, in view of the pace of recent progress, it does not seem absurd to hope for a complete solution well before 50 years. And (III), there is Langlands' vision of a Tannakian structure on the category of automorphic reps. for $Gl_n$ for n variable; for this, I don't know – Joël Jan 06 '13 at 21:02
  • 1
    @Ricky: Sure, it is your opinion, and it is shared by many people in the field. But I wanted to stress that there is a basis for a slightly more optimistic opinion as well. – Joël Jan 06 '13 at 21:04
  • Dear Joël, thank you for the interesting and detailed elaboration of your point of view. –  Jan 07 '13 at 13:40
64

I think it is certainly appropriate to denote as a "grand project" the remarkable new progress in the area sometimes called additive combinatorics or additive number theory, though the subject has expanded to the point that neither of these is a good name any longer, if they ever were. I am talking about the strain of thought whose modern form starts with Gowers and continues with the work of Tao, Green, Helfgott, Breuillard, Ziegler, Pyber-Szabo, and many, many others: loosely speaking, all this work centers around the idea that "things that are approximately structured approximate a structure" -- so that if I am a subset of a group and I am approximately closed under multiplication, I must be close to some literal subgroup; or if I am a subset of Z which contains too many arithmetic progressions, I must actually have big intersection with some infinite arithmetic progression; or if I am a subset of R^2 such that lines containing two points of my set are overly likely to contain a third, then I must look something like a subgroup of a real elliptic curve....

Another way to put it is that this field is concerned with structural dichotomies -- subsets either obey the laws that completely random subsets do, OR they are "structured" in some appropriate way; there is no in between.

To me this perfectly meets the definition of a grand program -- like Gromov's take on group theory (with which it shares both some content and some philosophical affinity!) it provides a really new paradigm for "how things are," and at the same time it has given us real progress in a multitude of areas (analytic number theory, harmonic analysis, combinatorial geometry, etc.).

JSE
  • 19,081
60

There is the derived algebraic geometry program of Jacob Lurie, starting with his thesis in 2007 and building on the work of ..., Simpson, Toën-Vezzosi, etc. In the words of Lurie, this is basically "jazzing up" the foundations of algebraic geometry with homotopy theory. Lurie has written two 1000-page books called Higher Topos Theory and Higher Algebra, and a series of papers called DAG. He has already had success in applying the theory to the classical topic of elliptic cohomology.

See also this blog post by Tim Gowers and a relevant MO discussion.

(Of course, this can be viewed as an extension of Grothendieck's program from the 80's, see e.g. this MO discussion.)

AAK
  • 5,841
  • Yes Adeel. This is perhaps the only answer I know something about. Lurie's program is in fact an extension and continuation of Grothendieck's ideas, but it brings in new fire, and also unifies a large amount of research. A grand effort indeed! – Mirco A. Mannucci Jan 01 '13 at 13:25
  • 11
    What is the grand project here? I was on the ICM topology committee that chose him to speak. One of the main reasons we chose him is that he had constructed a spectrum for equivariant K-theory (I'm not exactly sure what that means, so I might have misstated it). He also constructed various other exotic cohomology theories like tmf and TMF. There was some talk that these will be useful for classifying the stable homotopy groups of spheres, for example (I think Behrens is working on one aspect of this). But I'm not really sure what kind of grand project you have in mind? – Ian Agol Jan 01 '13 at 19:38
  • 14
    I think homological algebra in its various guises is one of the universally acknowledged tools of modern mathematics. This deals with LINEAR settings, such as vector spaces, modules over rings and other abelian categories. I think one way to understand this grand project, which goes back at least to Quillen, is that one expects an even more universal and powerful tool in the nonlinear setting, HOMOTOPICAL algebra. This is a tool that helps us "correct" or derive operations with nonlinear objects such as spaces, rings, schemes, categories, etc. The cited books greatly develop [cont.] – David Ben-Zvi Jan 02 '13 at 00:01
  • 13
    [cont.] homotopical algebra in a new setting, making it a powerful and streamlined tool that's (from the POV of an end user) easier to take "off the shelf" than previous ones. There are already many many applications (including some of the most spectacular by Lurie himself - in particular the Cobordism Hypothesis), though as should be clear the "grand project" goes much further back and is much broader. Algebraic geometry and representation theory are some of the areas where homological algebra is most deeply embedded and where the much broader nonlinear version will make a tremendous impact. – David Ben-Zvi Jan 02 '13 at 00:04
  • 9
    (Let me re-reiterate, no one here is ascribing homotopical algebra to Lurie, and as has been often claimed this is not the format to debate relative merits of individual mathematician's contributions, but I think it's incontrovertible that Lurie is a leader in this grand project, and that great progress has been made in recent years.) – David Ben-Zvi Jan 02 '13 at 00:11
  • 2
    Thanks David, I suppose that helps some, although I'm still not sure what the project is (i.e. what is expected to be done?). I hadn't realized that his work on the cobordism hypothesis used the techniques from his books - I know they use $(\infty,n)$-categories, and work of Madsen-Weiss-Tillmann-Galatius, but I didn't know there was any "derived algebraic geometry". A concrete project stemming from Lurie's work related to the cobordism hypothesis is clear, namely to construct fully dualizable objects in $(\infty,n)$-categories to classify fully extended TFTs. – Ian Agol Jan 02 '13 at 02:56
  • 3
    Ian - The cobordism hypothesis doesn't use any "DAG" per se, but then the books are not about DAG anyway (the first is on the foundations of homotopical category theory, the second on homotopical algebra). I think of the cobordism hypothesis as a vast generalization of Hochschild homology theory and a kind of universal version of all of the "diagrammatic calculi" for category theory, explaining why it is that we can draw pictures of categories and algebraic operations, and thus fits well here, and its proof certainly uses categorical and algebraic technology from the books (but AFAIK is now independent of GMTW). – David Ben-Zvi Jan 02 '13 at 03:31
  • 7
    It's an interesting question "what is expected to be done" (or "what's DAG's version of the Weil Conjectures"..) - I don't know. One example of a (not overly facetious) answer is, "the geometric Langlands program". I also expect the classical (especially the p-adic) Langlands program will benefit greatly. One can say the same for lots of specifics in algebraic geometry (Donaldson-Thomas theory, homological mirror symmetry) and maybe someone with a better understanding of the motivic world could say something useful there. – David Ben-Zvi Jan 02 '13 at 03:44
46

The field with one element $\mathbb{F}_1$ (a.k.a. F-un).

Having a precise notion of such a field would allow us to further exploit the analogy between number fields and function fields (much like discrete valuation rings, Dedekind domains, schemes, etc. have done in the past). In particular, with a suitable notion of $\mathbb{F}_1$ we should be able to find a proof of the Riemann hypothesis based on Weil’s proof of the Riemann hypothesis for curves over finite fields.

The integers are not an algebra over any field in the classical sense, which makes it a priori impossible to adapt Weil's argument to this case. For this analogy to work, a good definition of $\mathbb{F}_1$ should have the property that $\mathbb{Z}$ is an $\mathbb{F}_1$-algebra.

Furthermore, there are lots of connections with “q-deformations”. I don't know much about this, but John Baez has some nice stuff written at This Week's Finds in Mathematical Physics, weeks 183-187.

An example of the cool unification we can do with $\mathbb{F}_1$ at our disposal: we can make sense of statements like

Combinatorics is linear algebra over $\mathbb{F}_1$.

(This idea is originally due to Jacques Tits, if I recall correctly.)
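A small computation (my own illustration, not from the original post) makes the slogan concrete: the Gaussian binomial coefficient counts the $k$-dimensional subspaces of $\mathbb{F}_q^n$, and setting $q = 1$ yields the ordinary binomial coefficient, as if $k$-element subsets of an $n$-set were "subspaces over $\mathbb{F}_1$":

```python
# Gaussian binomials: subspace counts over F_q that degenerate to subset
# counts at q = 1, one concrete face of the "F_1" slogan above.
from math import comb

def q_int(m, q):
    # q-analogue [m]_q = 1 + q + ... + q^(m-1); equals m when q = 1
    return sum(q**i for i in range(m))

def q_factorial(n, q):
    out = 1
    for m in range(1, n + 1):
        out *= q_int(m, q)
    return out

def gaussian_binomial(n, k, q):
    # for a prime power q, the number of k-dimensional subspaces of F_q^n
    return q_factorial(n, q) // (q_factorial(k, q) * q_factorial(n - k, q))

print(gaussian_binomial(4, 2, 2))              # 35 planes in F_2^4
print(gaussian_binomial(4, 2, 1), comb(4, 2))  # 6 6: subsets, "the q = 1 case"
```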

For a possible way of rigorously developing such a theory, see Nikolai Durov's paper New Approach to Arakelov Geometry.

  • 5
    It should be noted that Durov's definition of $\mathbb{F}_1$ is surprisingly simple and that $\mathsf{Mod}(\mathbb{F}_1)$ is just the category of pointed sets (which supports the claim about combinatorics; note that $\mathsf{Mod}(\mathbb{F}_{\emptyset})$ is the category of all sets). But his approach is used for some problems in Arakelov geometry and doesn't qualify for the solution of the Riemann hypothesis. – Martin Brandenburg Jan 02 '13 at 10:42
36

A grand programme in mathematics with some enthusiastic supporters is that of computerized proofs of mathematical theorems.

Another related (but different) programme is that of automatic verification of (humanly created) mathematical proofs.

Gil Kalai
  • 24,218
34

Is quantum computing too far away from pure math to qualify? Perhaps it has not shaken up mathematics so much, but it has brought together theoretical physics, computer science, and mathematics in a way unseen before.

Sam Hopkins
  • 22,785
33
  1. One grand project which has generated much work is the quest for mathematical understanding of mirror symmetry (via homological mirror symmetry, or the Strominger-Yau-Zaslow/Gross-Siebert picture). Attempts to formulate and prove the conjecture have led to interesting new ideas in symplectic geometry (like the work of Paul Seidel) and attempts to confirm enumerative predictions from string theory have led to new techniques in algebraic geometry. While this grand project has been around since the early 90s (e.g. Kontsevich's ICM talk which introduced homological mirror symmetry was in 1994) it is still going strong and much progress has been made.

  2. Another very active program in geometry was initiated by the paper of Donaldson-Thomas (see also the more recent paper of Donaldson-Segal) and is an effort to define instanton counting/Floer-theoretic invariants in the context of higher-dimensional gauge theory and exceptional geometry.

  3. The search for constant scalar curvature Kaehler metrics (see Donaldson's lecture from the Fields Medallists' Lectures Volume or Tian's book "Canonical metrics in Kaehler geometry") and the related Donaldson-Tian-Yau conjecture on existence of Kaehler-Einstein metrics on Fano varieties was recently resolved after nearly twenty years' work by many of the world's leading geometric analysts.

  4. The 2000 paper "Introduction to Symplectic Field Theory" by Eliashberg-Givental-Hofer certainly counts as the initiation of a grand project: the systematic study of punctured pseudoholomorphic curves in certain non-compact symplectic manifolds. The theory has many applications, and a by-product is the new foundational polyfold approach to elliptic moduli problems.

Jonny Evans
  • 6,935
  • 1
    just a minor comment about 3: Donaldson's article was actually written in 1996, and although it is supposed to be a reworking of his 1986 Fields Medallist lecture, the material on constant scalar curvature Kahler metrics (including the conjectures) is from 1996, which is roughly when Donaldson started thinking about those matters. – YangMills Jan 05 '13 at 21:07
32

Tropical mathematics

The algebraic geometry, analysis and other mathematics over the tropical semiring instead of the real numbers.
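For readers who have not seen it, here is a minimal sketch (my own illustration, not part of the original answer) of the min-plus semiring that underlies the subject: tropical "addition" is min, tropical "multiplication" is ordinary addition, and tropical polynomials are piecewise-linear functions.

```python
# The tropical (min-plus) semiring: "plus" is min, "times" is ordinary +.
import math

TROP_ZERO = math.inf    # additive identity: min(x, inf) = x
TROP_ONE = 0.0          # multiplicative identity: x + 0 = x

def trop_add(x, y):
    return min(x, y)

def trop_mul(x, y):
    return x + y

# The tropical analogue of the polynomial x^2 + 3x + 1 is the
# piecewise-linear function min(2x, x + 3, 1).
def trop_poly(x):
    return trop_add(trop_add(trop_mul(x, x), trop_mul(3, x)), 1)

print(trop_poly(5))     # min(10, 8, 1) = 1
print(trop_poly(-2))    # min(-4, 1, 1) = -4
```

The "classical limit" perspective mentioned in the comment below corresponds to the fact that $\min(a, b)$ is the limit of $-h \log(e^{-a/h} + e^{-b/h})$ as $h \to 0$.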

user30304
  • 291
  • 4
    this is a MO question on tropical math which also addresses why this new field is very relevant (I have found especially fascinating the perspective of looking at the tropical semi-ring as a "classical limit"):

    http://mathoverflow.net/questions/83624/why-tropical-geometry

    – Mirco A. Mannucci Jan 01 '13 at 21:21
32

Does compressed sensing count as math? If it does, here is a blog post from the horse's mouth.

Edit: For those who would like a popular article, here is a good one in Wired (by JSE if I'm not mistaken). Also, I encourage you to read the highly upvoted comment by JSE below.

Because I don't think I can ever explain it better than Terence Tao's brilliant blog post does, nor do I think I'm qualified to, I'll just refer to the blog, and here simply mention how I, as someone working on combinatorial design theory, personally stumbled on it as an interaction between fields (please read the following only when you have nothing better to do). I hope experts edit and improve this post.

I had heard good things about compressed sensing before, but the first paper I read was about its application to error correction by Candes, Rudelson, Tao, and Vershynin. I don't know if it's comparable to other recent truly remarkable progress in coding/information theory (e.g., polar coding, which could be a candidate for the answer to the OP's question), but it was a refreshing read for me, who dabbles in coding theory. It's in one sense similar to normal linear codes in that the goal is to recover a vector $f \in R^n$ from $y = Af + e$, where $A$ is an $m \times n$ matrix and $e \in R^m$ is the error vector. But the paper studies when $f$ is uniquely determined by $l_1$-minimization à la compressed sensing. Then I learned that some combinatorial design theorists I follow were applying design theory to compressed sensing, in a very rough sense, to give a nice deterministic method for explicitly providing an ideal $A$. And when I checked what was up in quantum information these days (I also dabble in quantum information), I ran into this paper by Gross, Liu, Flammia, Becker, and Eisert, where compressed sensing is applied to quantum state tomography, a method for determining the quantum state of a system. And this is the one-paragraph version of how I wound up with an endless to-read backlog of papers spanning multiple fields.
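To make the recovery problem tangible, here is a toy sketch of my own (the sizes, the seed, and the Gaussian sensing matrix are illustrative choices, not taken from any of the cited papers): $l_1$-minimization recast as a linear program, recovering a sparse vector from fewer measurements than unknowns.

```python
# A toy l1-minimization recovery, recast as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, m, k = 30, 20, 3                        # ambient dim, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = 3.0 * rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                             # m < n noiseless measurements

# Minimize sum(t) subject to -t <= x <= t and A x = y, variables z = [x; t]:
# the optimal x is the minimum-l1-norm solution of A x = y.
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.block([[I, -I], [-I, -I]])       # encodes x - t <= 0 and -x - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * (2 * n))
x_hat = res.x[:n]
print(np.linalg.norm(x_hat - x_true))      # near zero: the sparse vector is recovered
```

For a 3-sparse vector in dimension 30, 20 random Gaussian measurements are comfortably above the $l_1$-recovery threshold, so recovery here is exact up to solver tolerance (with high probability over the random matrix); the compressed sensing literature quantifies precisely how few measurements suffice.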

  • If you can have single-pixel cameras then can you create a giant telescope just by using a small telescope? – user30304 Dec 30 '12 at 22:26
  • I don't want to spread wrong information or get owned by experts. But the linked blog post by Terence Tao mentions applications to astronomy. You can find a bunch of papers mentioning telescopes and whatnot by casual googling too, e.g., http://dx.doi.org/10.1109/JSTSP.2008.2005337 – Yuichiro Fujiwara Dec 31 '12 at 10:08
  • I apologize for the multiple edits to this post in a short span of time, which might have unnecessarily pushed up this thread to the top of the front page. I promise I'll be more careful next time. – Yuichiro Fujiwara Dec 31 '12 at 17:21
  • 1
    Thank you for including some more specialised and personalised information. –  Dec 31 '12 at 17:51
  • 1
    A great resource for more information is the repository for compressed sensing at Rice University (http://dsp.rice.edu/cs). And I do think CS is very important ... it is great that you brought it up! – Kevin R. Vixie Dec 31 '12 at 20:28
  • This is the kind of answer I was hoping for: something I knew absolutely nothing about and also very informative! Fascinating new field...I will most definitely check it out. Thanks Yuichiro!

    PS also thanks to the commentators, who brought in new material.

    Kudos to all

    – Mirco A. Mannucci Jan 01 '13 at 13:27
  • 8
    I would say that the grand project is better described as "sparse inference," where we try to reconstruct data that is known or expected to be sparse in some basis (or low-rank, or in some other way restricted to a low-dimensional but badly nonconvex subspace of parameter space.) This includes compressed sensing but also a much bigger circle of ideas (L^1 minimization, convex relaxation more generally, hierarchical clustering, manifold learning, etc.) I have learned a ton from talking to people about this stuff and I hope more pure mathematicians will get in on it! – JSE Jan 02 '13 at 03:08
  • And yes, it counts as math! – JSE Jan 02 '13 at 03:18
  • I agree with JSE about "sparse inference" (and it sounds cool!). Also, for those who would like a popular article before reading more technical stuff, here's a good one in Wired (by JSE, I think): http://www.wired.com/magazine/2010/02/ff_algorithm/all/1 – Yuichiro Fujiwara Jan 03 '13 at 23:59
  • I'm a fan of compressed sensing myself, but I wouldn't describe it as a "grand project." It's rather an extremely versatile idea that has a seemingly infinite variety of manifestations and applications. To use Gowers's distinction between theory-building and problem-solving, "grand project" to me connotes theory-building, while I regard compressed sensing as being firmly in the problem-solving camp. – Timothy Chow Jan 04 '13 at 15:26
  • 5
    But e.g. the paper of Candes and Recht that just won the Lagrange Prize places compressed sensing within a bigger and more conceptual theoretical framework. That's what I mean by pushing back on "compressed sensing" as a name for the whole field. By the way, thanks to YF for linking to my Wired piece -- but for anybody reading MathOverflow, Terry's blog post is going to offer you much more than my magazine article, which is very simplified! – JSE Jan 04 '13 at 15:50
  • @Timothy That's a good point. Actually, I caught you posting graph minor as an answer yesterday and insta-upvoted it. Then again, while the term "grand project" does carry a definite connotation of theory building, maybe it's not too huge a stretch of imagination to call a tidal wave of manifestations and applications of a brilliant idea a grand project in contemporary math. Well, obviously I'm biased in favor of the problem solving/comp sci camp. – Yuichiro Fujiwara Jan 04 '13 at 16:49
26

Optimal transport. Both its study (generalizations, the Monge problem, regularity issues, and geometric properties, to cite the part I work in) and its applications (to geometry, notably with the work of Sturm and Lott-Villani, to image processing and recognition, etc.) have developed hugely since the 90's.

25

Graph minor theory. The first success of this theory was the proof of Wagner's conjecture that in every infinite set of finite graphs, one is a minor of another. However, the theory developed over the course of twenty-odd papers by Robertson and Seymour has been enormously fruitful and its consequences are still being actively explored, with no end in sight. The proof of the strong perfect graph conjecture was the next spectacular success, and then came applications to the structure of claw-free graphs. Graph minor theory almost singlehandedly transformed people's perception of graphs as structureless combinatorial gadgets about which only countless ad hoc theorems could be proved, into a realization that graphs are highly structured objects about which general theories can be developed.

Timothy Chow
  • 78,129
  • Thanks Timothy, this is really spectacular stuff! I confess (shame on me!) that up to 10 minutes ago I did not even know what a minor of a graph is.... But after your great answer, I checked the wiki and found a whole fascinating universe there. For instance, to someone like me who cringes at the word infinity, the Friedman-Robertson-Seymour theorem, which is the finitistic version of the above, comes as a happy surprise. I have the feeling that this is exactly one of those paradigm-shifting events I was looking for, and that we will see much more along similar lines – Mirco A. Mannucci Jan 04 '13 at 23:54
  • 1
    Mirco: The original graph minor theorem can be phrased finitistically as follows: For any hereditary property, there is a finite set $S$ of finite graphs such that having the property is equivalent to not having any graph in $S$ as a minor. I am not sure exactly where you stand philosophically, but the work of Friedman, Robertson, and Seymour may discomfit finitists of some stripes because it shows that the graph minor theorem cannot be proven except by using stronger induction axioms than are allowed in first-order Peano arithmetic. – Timothy Chow Jan 06 '13 at 02:07
  • 1
    Tim, thanks for this add-on. This result by HF not only does not bother me, but actually is music to my ears! My philosophical position could be summarized in two main tenets: 1) all math objects are configurations of syntactic games (including the "natural numbers"); 2) the apparent dichotomy finite-infinite is contextual. Now, 1) allows me to avoid restrictions of any sort as far as which axiomatic system I can play with (of course, just like ordinary games, someone likes bridge and someone likes poker). 2) tells me basically this: there is infinitely (pardon the pun) more in the so-called – Mirco A. Mannucci Jan 11 '13 at 11:31
  • finite realm than what we ever dreamed so far. I reiterate that your answer is especially dear to my heart, because I am certain that in the years to come we will see much more of that. The complexity (and beauty) of the finite realm (particularly finite graphs, finite categories, etc) will literally astound us – Mirco A. Mannucci Jan 11 '13 at 11:37
  • 2
    Dear Timothy, I have a small bone to pick with this answer. While graph minor theory is indeed a grand project, the proof of the strong perfect graph conjecture and the characterization of the structure of claw-free graphs are not part of graph minor theory. Both are concerned with forbidden induced subgraphs, rather than forbidden minors. The tools used in studying forbidden induced subgraphs are rather different, as witnessed by the fact that the paper containing the proof of the strong perfect graph conjecture does not reference a single paper from the graph minors sequence. – Louigi Addario-Berry Mar 11 '13 at 13:23
  • Louigi: You are correct, of course. I'm now scratching my head as to what I was thinking when I wrote what I wrote! I think that in my mind I was confusing the polynomiality of recognizing hereditary properties, which does follow from graph minor theory, with the polynomiality of recognizing perfect graphs, which as you say does not rely on graph minor theory. – Timothy Chow Mar 11 '13 at 17:56
24

Manjul Bhargava's new field of arithmetic invariant theory is a perfect example of a new grand project. It began with Manjul's doctoral thesis, in which he presented a completely new view of Gauss's composition law for binary quadratic forms in a way that led to generalizations. This led to a series of papers on generalizations to cubic forms and beyond, with a general framework coming from representation theory.

In addition to being intrinsically interesting, this has led to new results on counting quadratic rings, cubic rings, etc., whose crowning achievement is an important new result on the Birch and Swinnerton-Dyer conjecture (cf. Bhargava and Shankar). There is much more to research in this theory, and numerous people (including some of his students) have found manifold connections with other areas of math, such as knot theory and algebraic geometry.
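To get a feel for the raw objects at the root of this story, Gauss's reduction theory of binary quadratic forms $ax^2+bxy+cy^2$ is easy to play with: each equivalence class of primitive forms of negative discriminant contains exactly one reduced form, so counting reduced forms computes the class number. A sketch using the standard reduction conditions (nothing Bhargava-specific):

```python
from math import gcd, isqrt

def class_number(D):
    """Count reduced primitive forms a x^2 + b xy + c y^2 with
    b^2 - 4ac = D < 0: the conditions -a < b <= a <= c (and b >= 0
    when a == c) pick out exactly one form per equivalence class."""
    assert D < 0 and D % 4 in (0, 1)
    h = 0
    for a in range(1, isqrt(-D // 3) + 1):   # reduction forces 3a^2 <= -D
        for b in range(-a + 1, a + 1):
            num = b * b - D
            if num % (4 * a):
                continue
            c = num // (4 * a)
            if c < a or (a == c and b < 0):
                continue
            if gcd(gcd(a, b), c) == 1:       # primitive forms only
                h += 1
    return h

assert class_number(-23) == 3
assert class_number(-163) == 1    # the famous class-number-one discriminant
```

Bhargava's thesis reinterprets and generalizes the composition law on these classes; the counting results mentioned above refine this kind of enumeration asymptotically.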

See here for an overview of Manjul's work.

See here for some notes from a seminar at Princeton that shows the vast reach of this theory.

David Corwin
  • 15,078
  • 1
    I welcome others, especially his students, to improve my answer. – David Corwin Dec 30 '12 at 23:18
  • 10
    Is this a "theory", or a set of striking applications of the geometry of numbers (of which Bhargava is a master) via insightful use of the representation theory, geometry, and arithmetic of algebraic groups? For an algebraic group acting on an "arithmetic" scheme, the set $S$ of integral points of an "orbit" injects into the degree-1 cohomology of the stabilizer. If the stabilizer is commutative then the cohomology set is a group, and when those group operations preserve $S$ we get a "composition law" on $S$. Making such laws explicit is a subtle art, but do any not arise in this way? – user29720 Dec 31 '12 at 01:18
  • 2
    I asked Arul Shankar this and he said, if I interpreted him correctly, that some don't. For instance, binary cubic forms up to $GL_2$ action have a $S_3$ stabilizer, and they have a composition law. – Will Sawin Dec 31 '12 at 04:24
  • 4
    Thanks, Will. For connected ss groups over local and (non-real) global fields, the degree-1 cohomology injects into an ${\rm{H}}^2$ with coefficients in the Cartier dual of the commutative "fundamental group" (due to vanishing of degree-1 cohomology of simply connected ss groups), so it raises the question: is there a central extension of $S_3$ for which those degree-1 cohomologies with $S_3$-coefficients inject into an ${\rm{H}}^2$ with commutative "coefficients" recovering the composition law? That is, is there no known "cohomological" explanation of that composition law? – user29720 Dec 31 '12 at 05:09
  • 5
    Don't binary cubics simply parametrize cubic rings (without any extra line bundles, unlike say ternary cubics)? If so, what is the composition law on cubic rings? – anon Dec 31 '12 at 05:26
  • 6
    (Forgive me if any of this is wrong, I'm rushing.) For binary cubic forms, you have a composition law only among those pairs with the same quadratic resolvent; the law is actually coming cohomologically from the SL_2 action, whose stabilizer is a form of Z/3Z whose H^1 reads off (more or less) the 3-torsion in the class group of an appropriate quadratic ring. So this one fits kreck's formulation. The only one I know which doesn't is the case studied by Gross and Lucianovic (continued next comment) – JSE Jan 02 '13 at 03:16
  • 5
    where the stabilizer is a form of SO_3 -- but now the group law is coming from the map H^1(K,SO_3) -> H^2(K,+-1) and there is a cohomological explanation in kreck's other sense. As far as I know, "every composition law is a cohomological law" among those observed so far. – JSE Jan 02 '13 at 03:17
  • @JSE: I had a question about the resolvent (whose construction I don't know). If B/A is a "cubic" ring (let's take this to mean a finite etale A-algebra of constant rank 3 for the moment, and ignore ramification), then it's classified by an element of H^1(Spec(A), S_3). Is the resolvent the image of this element in H^1(Spec(A),Z/2) defined by the determinant S_3 ----> Z/2? And if so, is there a similarly cohomological way of accounting for ramification? – anon Jan 04 '13 at 06:21
23

I have three answers. The first two involve mathematics with an applied flavor, with strong connections to mathematics of a purer flavor. The last one is purer in origin, but full of potential for applications.


$\color{red}{\large\text{I)}}$ $\color{blue}{\large\text{Image Analysis}}$

This is clearly an applied area. But it has strong connections to purer areas like harmonic analysis, PDE, geometric measure theory, and variational analysis.

The mathematical branch of image analysis heated up a great deal in the 1990's as a cumulative result of S. Geman and D. Geman (1984). Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images (Google citations = 13672), Mumford and Shah (1989) Optimal approximations by piecewise smooth functions and associated variational problems (Google Citations = 3140), and Rudin, Osher and Fatemi (1992) Nonlinear total variation based noise removal algorithms (Google citations = 5079). Innovations in applied harmonic analysis - wavelets! - also had a large impact.

Interesting mathematics could be applied to problems that could be seen, literally! This inspired many applied and (pure) mathematicians to explore and contribute. This was strengthened by the formation of the SIAM activity group in about 2001, as well as by the fact that it had strong, influential participants like Guillermo Sapiro, Andrea Bertozzi, Don Geman, Stan Osher, Tony Chan, David Mumford ... to name only a very few. Another reason image analysis became interesting to mathematically serious folk (as well as many dabblers) was that the ideas went both ways -- cool mathematics could be applied, but also, applications generated exciting, new mathematical problems.

$\color{blue}{\text{Examples}}$

The Mumford-Shah functional: This variational functional, introduced by David Mumford and Jayant Shah to solve segmentation problems, became an object of study attracting intense scrutiny from the likes of E. De Giorgi, L. Ambrosio, G. David and others. And as far as I know, the structure theory is still not complete.

ROF functional -- TV Denoising: This functional and its variants generated a huge amount of interest. In fact, that interest has not died off, especially if one looks at the endless variations that have been generated. Interesting algorithms, as well as purer investigations using the tools of geometric measure theory, have generated new ideas, even in geometric measure theory itself. Example: Allard's 2007 paper, Total variation regularization for image denoising, I. Geometric theory, uses geometric measure theory tools to definitively expose the nature of TV regularized image functional minimizers.
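To make the ROF idea concrete, here is a hypothetical one-dimensional sketch: minimize $\frac{1}{2}\|u-f\|^2 + \lambda \, TV(u)$ by gradient descent on a smoothed total variation. (Real implementations use dual or splitting formulations, not this naive scheme; all parameters below are illustrative.)

```python
import numpy as np

def tv_denoise_1d(f, lam=0.3, step=0.05, iters=3000, eps=1e-2):
    """Minimise (1/2)||u - f||^2 + lam * TV(u) by gradient descent on a
    smoothed total variation: TV(u) ~ sum_i sqrt((u_{i+1}-u_i)^2 + eps)."""
    u = f.copy()
    for _ in range(iters):
        d = np.diff(u)
        w = d / np.sqrt(d * d + eps)     # smoothed sign of the jumps
        grad_tv = np.zeros_like(u)
        grad_tv[:-1] -= w                # grad_tv[j] = w[j-1] - w[j]
        grad_tv[1:] += w
        u -= step * ((u - f) + lam * grad_tv)
    return u

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0], 50)        # a sharp edge, as in an image
noisy = clean + 0.1 * rng.standard_normal(100)
denoised = tv_denoise_1d(noisy)
assert np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean)
```

The TV penalty prefers piecewise-constant signals, so it removes noise while (mostly) preserving the jump -- the property that made ROF so attractive for images.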

Geman and Geman: as is clear from the Google citations, it has had an enormous influence in applications. I know the least about this subject, so I am not aware of the details of its impact on mathematics.

The area is stronger than ever and is characterized by a constant influx of fresh ideas, some of which generate very interesting and rich innovations in mathematics. For example, the compressed sensing topic brought up by Yuichiro has a big intersection with mathematical image analysis.

$\color{blue}{\text{Discussion:}}$ Is this a grand challenge? I would argue that it is, but it is much more of a grassroots effort, not dominated by one personality but rather driven by a large number of ingenious people and real-world problems. So it is different from the Grothendieck or Lurie or Thurston programs. It is more chaotic, more accessible, yet rich with motivations and inspirations that lead very deeply as well. It feels to me like something at the intersection of mathematics and physics.

$\color{red}{\large\text{II)}}$ $\color{blue}{\large\text{Mathematics for and from the Data Deluge}}$

It is not news that massive overloads of data are being generated, nor is it a new idea that old analysis tools are not enough. Those who know something of both the current data challenges and available mathematical technology realize that:

  1. Those mathematical tools are largely unexplored for their potential applications to data, and
  2. data problems are powerful sources of new ideas in those (purer) mathematical areas.

This is definitely another grand challenge which in fact subsumes the previous grand challenge of mathematical image analysis. It is of course driven by real world applications, but this in no way lessens the mathematical challenges. But it does broaden them tremendously.

$\color{blue}{\text{What is the nature of the mathematics involved in this challenge?}}$ It is very wide ranging, from geometric measure theory, harmonic analysis and PDE to graph theory, probability and statistics. Real problems are agnostic as to where insights might come from!

$\color{blue}{\text{What are the big questions?}}$ How do we extract information from very high dimensional data? How do we characterize streaming data on the fly? How do we find the proverbial needle in the haystack? etc. etc. etc.

How does this translate into mathematical programs of research? In tremendously varied ways. One has to look at research in mathematics, electrical engineering and computer science (at least) to get a grasp on the large scale of the intellectual energy devoted to these problems.

$\color{red}{\large\text{III)}}$ $\color{blue}{\large\text{Analysis in Metric Spaces}}$

I am a neophyte here, but this area is both rather hot and very intriguing. Currents in metric spaces by L Ambrosio, B Kirchheim (2000), Differentiability of Lipschitz functions on metric measure spaces by J Cheeger (1999), and monographs like Heinonen's Lectures on Analysis in Metric Spaces (2001) as well as various papers on analysis in sub-Riemannian spaces are examples and starting points for exploration. (The Helsinki school in analysis seems to me one major driving force here.)

There appear to me to be huge opportunities for progress here. Lots of exciting questions!

I also believe that the potential of this area for use in understanding and modeling data in metric spaces is just beginning to be realized. Data often comes with some notion of distance, but no natural embedding in a vector space. As the number of mathematicians working simultaneously in both pure and applied modes grows, I believe areas like analysis in metric spaces will be exploited for their power to illuminate applied problems.

  • 2
    very interesting and well presented answer Kevin, KUDOS!

    somehow I feel that one of the next breakthroughs will be in new math tools to manage and get insights from large data sets, so I look forward especially to digging into your number 2 above

    – Mirco A. Mannucci Jan 02 '13 at 11:39
  • 3
    Dear Mirco, I think you might enjoy "three examples of applied & computational homology": http://www.math.upenn.edu/~ghrist/preprints/nieuwarchief.pdf – M.B. Jan 03 '13 at 22:27
  • 2
    I want to second the suggestion of Magnus' ... computational topology is a another great example of what I am talking about. (In fact, a piece of what I have done with the flat norm has been spun off by a couple of my collaborators into things similar to the bar codes from the computational topology folk.) – Kevin R. Vixie Jan 04 '13 at 19:04
19

Large Networks and Graph Limits - new AMS book by László Lovász

"The theory has rich connections with other approaches to the study of large networks, such as ``property testing'' in computer science and regularity partition in graph theory. It has several applications in extremal graph theory, including the exact formulations and partial answers to very general questions, such as which problems in extremal graph theory are decidable. It also has less obvious connections with other parts of mathematics (classical and non-classical, like probability theory, measure theory, tensor algebras, and semidefinite optimization)."

"This book explains many of these connections, first at an informal level to emphasize the need to apply more advanced mathematical methods, and then gives an exact development of the theory of the algebraic theory of graph homomorphisms and of the analytic theory of graph limits. This is an amazing book: readable, deep, and lively. It sets out this emerging area, makes connections between old classical graph theory and graph limits, and charts the course of the future." --Persi Diaconis, Stanford University

"This book is a comprehensive study of the active topic of graph limits and an updated account of its present status. It is a beautiful volume written by an outstanding mathematician who is also a great expositor." --Noga Alon, Tel Aviv University, Israel

"Modern combinatorics is by no means an isolated subject in mathematics, but has many rich and interesting connections to almost every area of mathematics and computer science. The research presented in Lovasz's book exemplifies this phenomenon. This book presents a wonderful opportunity for a student in combinatorics to explore other fields of mathematics, or conversely for experts in other areas of mathematics to become acquainted with some aspects of graph theory." --Terence Tao, University of California, Los Angeles, CA

"Laszlo Lovasz has written an admirable treatise on the exciting new theory of graph limits and graph homomorphisms, an area of great importance in the study of large networks. It is an authoritative, masterful text that reflects Lovasz's position as the main architect of this rapidly developing theory. The book is a must for combinatorialists, network theorists, and theoretical computer scientists alike." --Bela Bollobas, Cambridge University, UK

user30304
  • 291
19

Universality phenomena for determinantal point processes and relatives.

After the deep results obtained by many great researchers concerning independent random variables, a lot of attention has recently been paid to a certain kind of interacting random variables, arising from several (a priori unrelated) fields of mathematics, which behave in the same way as the number of such random variables goes to infinity (appearance of the sine kernel, the Tracy-Widom distribution, ...); the so-called universality phenomenon. This class of interacting random variables has not yet been fully identified but includes

  • the eigenvalues of many random matrix models

  • the lengths of the rows of Young diagrams distributed according to the Plancherel measure

  • models from statistical physics like (T)ASEP, polynuclear growth models, random tilings of geometric shapes, ...

  • the zeros of the Riemann zeta function, assuming RH

and many others.

For further information, see e.g. the nice (although not exhaustive) overview of Deift http://arxiv.org/abs/math-ph/0603038

Because of the diversity of the mathematics involved, a huge community, including a few Fields medalists, is now working on a better understanding of this class of random variables.
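A hypothetical numerical sketch of the simplest instance: the eigenvalues of a GUE random matrix, suitably scaled, fill out Wigner's semicircle on $[-2,2]$, while the largest eigenvalue sits near the edge $2$ and fluctuates on the Tracy-Widom scale $n^{-2/3}$:

```python
import numpy as np

def gue_eigenvalues(n, seed=0):
    """Eigenvalues of an n x n GUE matrix, scaled so the limiting
    spectral density is the semicircle supported on [-2, 2]."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    h = (a + a.conj().T) / 2              # Hermitian; GUE up to normalization
    return np.linalg.eigvalsh(h) / np.sqrt(n)

ev = gue_eigenvalues(300)
assert np.abs(ev).max() < 2.3             # spectrum essentially inside [-2, 2]
assert ev.max() > 1.7                     # and it fills the support to the edge
```

The universality claim is that the *local* statistics (spacings, edge fluctuations) seen here reappear, after rescaling, in all the seemingly unrelated models listed above.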

Adrien Hardy
  • 2,085
18

Work on the mathematical foundations of quantum field theory. See for instance the recent review by Michael Douglas: "Foundations of quantum field theory" in Proc. Symp. Pure Math. vol 85, 2012. See also this recent Stony Brook conference on the subject.

17

Ricci flow. It solved the Poincaré conjecture and the $1/4$-pinching conjecture, but has also become an object of study in its own right. More generally, it has launched a large amount of work on geometric flows (mean curvature flow and others), notably with the idea that other problems can be solved by designing an ad hoc flow.

  • 1
    Didn't curvature flow (for curves, for example) predate Ricci flow? But you're right in that the idea that Riemannian metrics or submanifolds of Riemannian spaces can "flow" in naturally defined ways has been a driving force in Riemannian geometry. I wonder if there are any examples of this philosophy in other geometries? I only know of the work of Stancu and others in affine convex geometry. – alvarezpaiva Dec 31 '12 at 10:14
  • 3
    @alvarezpaiva: As far as I know the history, the first geometric flow was the harmonic map flow introduced by Eells-Sampson in 1964. – Robert Haslhofer Dec 31 '12 at 15:09
  • See also: Mullins, W. W. Two-dimensional motion of idealized grain boundaries. J. Appl. Phys. 27 (1956), 900--904, for the curve shortening flow in the plane. –  Oct 14 '13 at 06:52
17

The theory of computing, and more specifically computational complexity and the P vs. NP problem, represents an important new paradigm in mathematics. It offers a new look at classical issues like the study of algorithms, optimization, and randomness, and provides an interesting lens for many areas/examples/results of math.

Among other things, computational complexity describes several novel notions of "proof", among them the possibility of proving a mathematical statement only "beyond a reasonable doubt."
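The simplest instance of "beyond a reasonable doubt" is probabilistic verification. Freivalds' algorithm checks a claimed matrix product $AB = C$ in $O(n^2)$ time per round (versus $O(n^3)$ for naive recomputation), with each round at least halving the probability of accepting a false claim:

```python
import numpy as np

def freivalds(A, B, C, rounds=20, seed=0):
    """Accept iff A @ B == C 'beyond a reasonable doubt': a wrong C
    survives each round with probability <= 1/2, hence survives all
    rounds with probability <= 2**-rounds."""
    rng = np.random.default_rng(seed)
    for _ in range(rounds):
        r = rng.integers(0, 2, size=C.shape[1])      # random 0/1 vector
        if not np.array_equal(A @ (B @ r), C @ r):   # two O(n^2) products
            return False
    return True

rng = np.random.default_rng(1)
A = rng.integers(0, 10, (50, 50))
B = rng.integers(0, 10, (50, 50))
assert freivalds(A, B, A @ B)
wrong = A @ B
wrong[0, 0] += 1                 # a single corrupted entry is caught
assert not freivalds(A, B, wrong)
```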

As already mentioned in another answer, quantum information/computation and mathematics related to it represent a major new programme/paradigm.

Gil Kalai
  • 24,218
15

This is a question rather than an answer. Shmuel Weinberger wrote a (to me, amazing) book about a decade ago, entitled

"Computers, rigidity, and moduli. The large-scale fractal geometry of Riemannian moduli space" (Princeton link)

Has the "grand project" promised by this book flourished, or at least evolved, in the last decade? I don't mean this question to be critical, I am just curious where his line of thought (e.g., "logical complexity engenders geometric complexity") stands ten years after.

Joseph O'Rourke
  • 149,182
  • 34
  • 342
  • 933
  • 1
    I love that book! Recent work of Seidel (http://arxiv.org/abs/0704.2055) and McLean (http://arxiv.org/abs/1109.4466) explores such questions in the context of symplectic and contact topology. – Jonny Evans Jan 02 '13 at 12:28
14

A successful grand project is the classification of finite simple groups. It was completed but several exciting follow-up projects are ongoing.

Gil Kalai
  • 24,218
  • 4
    What are followups? – Alexander Chervov Jan 02 '13 at 06:44
  • 1
    Well: understanding the structure of finite simple groups (e.g. maximal subgroups); understanding the way general groups are built from finite simple groups (formally this includes the theory of p-groups, which is a separate thing, but probably you can study this "modulo p-groups"); classifying representations of finite simple groups; understanding primitive permutation groups; and finding other proofs/approaches to the classification itself. – Gil Kalai Jan 02 '13 at 15:40
  • 4
    Regarding the representations, a truly grand and, I think, underappreciated achievement is Lusztig's construction of all irreducible complex characters of all finite groups of Lie type (including the vast majority of the finite simple groups). However, the modular representation theory of finite groups of Lie type is, I believe, still a wide open and exciting grand project. – David Ben-Zvi Jan 02 '13 at 16:31
  • 4
    A follow-up in a different direction would be Aschbacher's work on fusion systems. Aschbacher's aim is to extend (!!) CFSG to give a complete classification of all finite fusion systems (of a particular type). This is different to the other follow-ups mentioned above in that Aschbacher is not using CFSG as such, rather he is applying the techniques developed in the course of proving CFSG to the study of a wider class of object. – Nick Gill Jan 07 '13 at 12:53
12

Within the realm of finite permutation group theory there is a series of projects that could be collectively entitled The classification of finite combinatorial objects subject to transitivity assumptions. These kinds of classifications have, of course, been around a long time (for instance, the Greeks' interest in the Platonic solids is a particular instance), but the nature of this work changed very dramatically with the completion of the Classification of Finite Simple Groups.

Particular threads of this grand project include:

  • The classification of distance-transitive graphs (cf. work of Saxl, Van Bon, Inglis and others);
  • The classification of flag-transitive designs (cf. the paper of Buekenhout, Delandtsheer, Doyen, Kleidman, Liebeck and Saxl which gives an almost-complete classification). More recently the flag-transitivity condition has been relaxed, and progress has been made on classifying designs which are, for instance, line-transitive or point-primitive (cf. work by many authors!)
  • The classification of finite projective planes subject to various assumptions. This is a special case of the previous item. In 1959 Ostrom & Wagner gave a full classification of projective planes admitting 2-transitive automorphism groups; in 1987, and using CFSG, Kantor gave an almost-classification of projective planes admitting point-primitive automorphism groups; results have appeared subsequently dealing with the weaker situation of point-transitivity.
  • The classification of generalized polygons subject to various assumptions. The previous item is a special case of this. (I know of recent work on generalized quadrangles due to Bamberg, Giudici, Morris, Royle and Spiga; not sure about hexagons and octagons.)
  • The classification of `special geometries'. This is work initiated (I believe) by Francis Buekenhout in an attempt to understand the sporadic groups (see the earlier answer by J. McKay). The idea is to find geometries on which the sporadic groups act, analogously to the way the groups of Lie type act on Tits buildings.
  • The classification of regular maps (i.e. graphs embedded nicely on topological surfaces and admitting an automorphism group that is regular on flags/directed edges). This is the thread that involves the Platonic solids; more recently there is a wealth of work by people like Conder, Siran, Tucker, Jones, Singerman, and many others.

There are many others but these give a flavour (skewed to my own interests).

In many of the threads just mentioned (but not all) a crucial first step in classifying objects is to use the Aschbacher-O'Nan-Scott theorem which describes the maximal subgroups of $S_n$. One then often needs information about maximal subgroups of the almost simple groups and so another famous theorem of Aschbacher comes into play (along with results by Kleidman, Liebeck, and others). These theorems are closely related to the answer given by Gil Kalai - the production of results of this ilk (facts about the finite simple groups) is, in itself, a grand project!

Nick Gill
  • 11,161
  • 39
  • 70
11

In numerical mathematics there is a recent new grand theme called randomized numerical linear algebra (RandNLA). One example (probably even a paradigm) is the "randomized range finder" from the paper "Finding structure with randomness". In a nutshell, you hit a (probably very large) matrix $A$ from the right with a random matrix $\Omega$ (which should have only a few columns) and then use a traditional numerical algorithm to find an orthonormal basis of the range of $A\Omega$. By hitting the matrix from the right, one reduces the number of columns and hence the potential computational effort. On the other hand, one loses some dimensions of the range of the matrix, but one hopes that, due to the randomization, the most important dimensions are kept.
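A minimal sketch of that recipe (following Halko, Martinsson and Tropp; the sizes here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# A low-rank matrix in disguise: 2000 x 2000, but (numerical) rank 10.
U = rng.standard_normal((2000, 10))
V = rng.standard_normal((10, 2000))
A = U @ V

# Randomized range finder: hit A from the right with a thin random matrix,
# then orthonormalize the result with ordinary QR.
k = 15                                   # sketch size, a bit above the rank
Omega = rng.standard_normal((2000, k))
Q, _ = np.linalg.qr(A @ Omega)           # orthonormal basis for range(A Omega)

# Q captures the range of A: the projection residual is tiny.
residual = np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A)
assert residual < 1e-8
```

Because $A$ has exact rank 10 and the sketch size exceeds it, the residual here is at machine-precision level; for matrices with slowly decaying spectra one oversamples and adds power iterations.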

The general idea is that in most cases the interesting quantities are not as "high-dimensional" as they look at first glance. In the case of a matrix $A$ with high-dimensional range, it may be that the "usual element" $Ax$ lives in a space of lower dimension.

Interesting questions in this area include: What guarantees can be given for the output of a randomized algorithm? To what extent does the distribution from which the random object in the algorithm is drawn influence the quality of the output? Under what circumstances does randomization pay off (e.g. in terms of computational effort or storage)?

J.J. Green
  • 2,497
Dirk
  • 12,325
11

The concentration of measure phenomenon and (related to it) the modern developments related to isoperimetric relations can be regarded as a large mathematical programme which involves analysis, geometry, various applications to other disciplines, combinatorics and algebra.
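A tiny numerical illustration of the phenomenon: the Euclidean norm is a 1-Lipschitz function of a Gaussian vector, so $\|g\|/\sqrt{n}$ clusters ever more tightly around 1 as the dimension grows (the sample sizes below are arbitrary):

```python
import numpy as np

def norm_fluctuation(n, samples=1000, seed=0):
    """Empirical standard deviation of ||g|| / sqrt(n),
    for g a standard Gaussian vector in R^n."""
    rng = np.random.default_rng(seed)
    vals = np.linalg.norm(rng.standard_normal((samples, n)), axis=1) / np.sqrt(n)
    return vals.std()

for n in (10, 1000, 10000):
    print(n, norm_fluctuation(n))   # shrinks roughly like 1/sqrt(n)
```

This is the same mechanism behind, e.g., Lévy's lemma on the sphere and the Johnson-Lindenstrauss phenomenon.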

Gil Kalai
  • 24,218
10

The simultaneous study of a space $X$ and its observables $F(X)$ (real, complex, or operator-valued functions on $X$) is an old topic, but with quantum groups and non-commutative geometry it has been the source of much modern mathematics. The introductory paper by Connes does a really nice job at explaining this.

In the paper Connes underlines the pioneering work of I.M. Gelfand in this area. However he misses one little thing. Gelfand's work on integral geometry was also motivated by this philosophy. The idea is to consider the incidence relation as a special type of multivalued map between the two spaces and to consider how functions, forms, densities, and other functional objects correspond under the map.

alvarezpaiva
  • 13,238
10

In information theory (error-correcting codes) the grand achievements of the 1990s were turbo codes and LDPC codes. A more recent discovery (2009) that has become the hottest topic is polar codes.

It is tempting to say that the paradigm shift that came with turbo and LDPC codes, replacing the earlier popular approaches (convolutional codes, Reed-Solomon codes, BCH codes, et al.), is a shift from algebra to probability, from order to chaos. I mean that the earlier constructions were largely dominated by algebraic considerations, e.g. non-recursive convolutional codes are just the ideals in the ring $F_2[x]\oplus ... \oplus F_2[x]$, while turbo and LDPC codes are constructed and decoded with methods heavily influenced by probabilistic and randomized considerations: roughly speaking, good LDPC codes can be constructed from a sufficiently sparse random matrix. The decoding method used for LDPC codes, belief propagation, naturally belongs to probability and machine learning rather than algebra.
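To see the flavor in miniature, here is a Gallager-style bit-flipping decoder (a hard-decision cousin of belief propagation) driven only by a parity-check matrix; the tiny Hamming(7,4) matrix below is just a stand-in for a large sparse random LDPC matrix:

```python
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],    # parity-check matrix of the
              [0, 1, 1, 0, 0, 1, 1],    # Hamming(7,4) code, standing in
              [0, 0, 0, 1, 1, 1, 1]])   # for a large sparse LDPC matrix

def bit_flip_decode(y, H, max_iters=20):
    """Gallager-style bit flipping: repeatedly flip the bit that sits in
    the most unsatisfied parity checks."""
    y = y.copy()
    for _ in range(max_iters):
        syndrome = H @ y % 2
        if not syndrome.any():
            return y                     # all checks satisfied: a codeword
        votes = H.T @ syndrome           # unsatisfied checks touching each bit
        y[np.argmax(votes)] ^= 1
    return y

codeword = np.zeros(7, dtype=int)        # the all-zero word is a codeword
for i in range(7):                       # any single bit error is corrected
    received = codeword.copy()
    received[i] = 1
    assert (bit_flip_decode(received, H) == codeword).all()
```

For real LDPC codes, $H$ is sparse and random, and the soft-decision version of this message passing (belief propagation) approaches channel capacity.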

Actually, a turbo code is almost the same as a convolutional code, modulo one "small" detail: the interleaver. The interleaver is a "randomizer" added to the algebra-flavored convolutional code, and it is the crucial ingredient that makes everything work. That concerns the encoder. The decoder of turbo codes "resembles" a turbine, hence the name "turbo" code, and it is crucially based on probabilistic techniques in coding theory. Admittedly, the key technique, the BCJR algorithm, was developed much earlier, so the division into old and new paradigms is not very precise, but nevertheless there seems to be something behind it.

These ideas have found rich practical applications. If someone is reading this with the help of a smartphone, say thanks to turbo codes: they are working there.

The new discovery, polar codes, can probably be characterized as algebra's strike back: they seem to be of a quite algebraic nature; sorry, I cannot say much more for the moment.

8

Hyperfields

"Krasner, Marshall, Connes and Consani and the author came to hyperfields for different reasons, motivated by different mathematical problems, but we came to the same conclusion: the hyperrings and hyperfields are great, very useful and very underdeveloped in the mathematical literature. Probably, the main obstacle for hyperfields to become a mainstream notion is that a multivalued operation does not fit to the tradition of set-theoretic terminology, which forces to avoid multivalued maps at any cost. I believe the taboo on multivalued maps has no real ground, and eventually will be removed. Hyperfields, as well as multigroups, hyperrings and multirings, are legitimate algebraic objects related in many ways to the classical core of mathematics. They provide elegant terminological and conceptual opportunities. In this paper I try to present new evidences for this."

Hyperfields For Tropical Geometry I. Hyperfields And Dequantization, Oleg Viro, http://arxiv.org/abs/1006.3034

The hyperring of adèle classes, Alain Connes, Caterina Consani, http://arxiv.org/abs/1001.4260
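Since Viro's motivation is tropical geometry, here is a two-line taste of the dequantized arithmetic he discusses: in the tropical (min-plus) semiring, "addition" is min and "multiplication" is +, and tropical matrix powers of an edge-weight matrix compute shortest paths. (The small weight matrix below is just an illustration.)

```python
INF = float('inf')

def trop_matmul(A, B):
    """Matrix product over the tropical (min-plus) semiring:
    'addition' is min, 'multiplication' is +."""
    n, m, p = len(A), len(B), len(B[0])
    return [[min(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Weighted directed graph on 3 vertices (INF = no edge):
W = [[0,   3,  INF],
     [INF, 0,   1],
     [1,  INF,  0]]
W2 = trop_matmul(W, W)     # shortest paths using at most 2 edges
assert W2[0][2] == 4       # 0 -> 1 -> 2 costs 3 + 1
```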

user30304
  • 291
6

The CFSG (classification of finite simple groups) yields L: the finite groups of Lie type, and S: the non-Lie groups, i.e. the 26 sporadic simple groups. We do not know how natural this taxonomy is.

One approach is that of (categorical) 2-groups. Another is that of BIRS Banff 12frg158 which is an attempt to tame the sporadics using integrable systems, symplectic geometry, characteristic classes, and mathematical physics. This may lead to flourishing of new interconnections between many fields.

John McKay
  • 251
  • 1
  • 3
  • here's the link to the Banff workshop page: http://www.birs.ca/events/2012/5-day-workshops/12frg158 it's a pity that the talks are not available online... – Christian Nassau Jan 06 '13 at 14:23
  • see also Gil Kalai's answer below: http://mathoverflow.net/questions/117668/new-grand-projects-in-contemporary-math/117838#117838

    It would be better to edit his answer and delete this.

    – András Bátkai Jan 07 '13 at 09:59
5

The theory of nonlinear dispersive equations, hyperbolic conservation laws, etc.; see

Terence Tao's book on the subject, Jean Bourgain's book, or Helge Holden's co-authored monographs.

2

I might say that algebraic stacks and their development might be a "grand project" (although, admittedly, this is something that I do not know a whole lot about). In particular, there is the "Stacks Project", found here: stacks.math.columbia.edu/

Johannas
  • 255
  • 6
    I don't really know if this fits to the question; it only fits to the title. After all, algebraic stacks were already studied in the mid-60s and the idea of moduli spaces/stacks is even older. – Martin Brandenburg Jan 02 '13 at 10:45
-1

Closely related to the theory of hyperbolic conservation laws already mentioned by András Bátkai is the study of dispersionless (i.e., rewritable as first-order quasilinear homogeneous) systems integrable in the sense of soliton theory, and of their relations to geometry, see e.g. this paper, this one and this one and references therein.