27

Why does mathematics seem to have a polarity bias, i.e., why are products more common than coproducts, algebras more common than coalgebras, limits more common than colimits, monads more common than comonads, topoi more common than cotopoi, etc., despite each pair being in a formal duality?

David White
  • 29,779
  • 18
    Presentability is clearly also a big asymmetry - from this perspective, one might argue that colimits are "more common" than limits :) But my guess is that this is precisely where the polarity comes from: the category of sets is presentable, and simply not co-presentable. This ultimately comes down to elements (as opposed to co-elements) – Maxime Ramzi Jan 07 '24 at 14:03
  • 45
    This may be an unsatisfying answer, but to some extent this is likely due to the order in which humans discover things. If one side of any pair in a formal duality is going to show up more often in easy contexts, then that side will get named first. So that one gets named as the main object class and the other gets the "co" attached to it. – JoshuaZ Jan 07 '24 at 14:04
  • 4
    I like to think of products as constructions, and coproducts as deconstructions. I think that as humans we have a bias towards wanting to build new things, and only bother to deconstruct things after they have already been built. – Sean Sanford Jan 07 '24 at 14:44
  • 22
    $\mathrm{Mor}(X, \lim Y_i) \cong \lim \mathrm{Mor}(X, Y_i)$ and $\mathrm{Mor}(\operatorname{colim} X_i, Y) \cong \lim \mathrm{Mor}(X_i, Y)$. Limits win 2-1 against the colimit. – Jochen Wengenroth Jan 07 '24 at 15:15
  • 39
    but colimits are more cocommon – Pietro Majer Jan 07 '24 at 16:10
  • 8
    @PietroMajer I thought mmons were more colimital? – Noah Schweber Jan 07 '24 at 16:22
  • 1
    Should this be community wiki? It feels like answers will be primarily opinion based, and it's hard to imagine any one answer being objectively "more correct" than any other. – David White Jan 07 '24 at 17:05
  • 2
    I suppose that most of us learn how to take the product of two groups, for example, before we learn how to take their coproduct, or free product. Some constructions are just simpler to describe. – Donu Arapura Jan 07 '24 at 18:12
  • 19
    Because the things without the "co-" were the things discovered first, and therefore were named first. When the dual-type objects arose, they got the co- prefix. – Daniel Asimov Jan 07 '24 at 18:23
  • 4
    Without thinking about all the particular examples already mentioned: one source of asymmetry might be that we privilege certain "relations" and call them "functions" (so we have a bias towards working in $\textsf{Set}$ and $\textsf{Grp}$ rather than $\textsf{Set}^{\rm op}$ and $\textsf{Grp}^{\rm op}$) – Yemon Choi Jan 07 '24 at 18:31
  • 1
    I could have sworn we had almost exactly this question, with a different title, about 10 years ago. Do any other old-timers remember it? One related question, that I think isn't the one I mean: Is the dual notion of a presheaf useful? @PietroMajer, re, surely colimits are more mmon? – LSpice Jan 07 '24 at 19:02
  • It may also be, in some particularly symmetric cases, that in the absence of other reasons one is led to prefer the shorter name. E.g. I would rather use "sec" and "tan" than "cosec" and "cotan". – Pietro Majer Jan 07 '24 at 19:13
  • 1
  • As others have said, there is usually a clear asymmetry. The simplest examples are a vector space versus its dual and a linear map versus its transpose. In fact, many of the examples cited are special cases or variants of this. – Deane Yang Jan 07 '24 at 19:28
  • 1
    Back in the olden days, before the co-ification of mathematics, there was just the odd co-thing here and there, even as category theory was slowly growing. There were certainly cocycles and coboundaries and a few which we think of as having the modern meaning of co, but then there were others like cosets which were just tolerated, maybe the oddballs at the party, but then it wasn't a very big party... yet. And then, all of a sudden, it caught on! Everything that hadn't been co-ed already suddenly got co-ed! And here we are. – Lee Mosher Jan 08 '24 at 02:02
  • 1
    I learned a lot about coalgebras (of functors) before I ever thought much about algebras of functors, so I tend to think of algebras as co-coalgebras. (And presheaves as co-copresheaves.) – N. Virgo Jan 08 '24 at 08:14
  • 1
    @LeeMosher, re, to the contrary, according to the OED, co-ed has been around since 1886. – LSpice Jan 08 '24 at 14:47
  • It seems that cohomology is more common/important than homology. – sds Jan 09 '24 at 19:08
  • 1
    One thing I picked up from Vicious Circles by Barwise and Moss is that sometimes standard Set Theory nudges you towards "things" rather than "cothings". For example, coalgebras often lead to operations that unpack things out of something (e.g. the opposite of a monoid, which takes two things and packs them down to one; see the sketch after this comment thread). An obvious way to represent this unpacking is to identify an object with the set of things it unpacks to (or similar schemes). But that leads to non-wellfounded sets. – Dan Piponi Jan 09 '24 at 22:15
  • 2
    Yes, after thinking about the different examples I am familiar with, I concur that to the extent that this is a real technical phenomenon, it is generally inherited from sets having elements rather than co-elements. – Cameron Zwarich Jan 09 '24 at 22:21
  • 2
    I have pondered the fact that a group object is one for which the morphisms into it have a group structure, while a cogroup object is one for which the morphisms out of it have a group structure. Both are defined in terms of groups. – Jeff Strom Jan 09 '24 at 23:10
  • Take $x$. Subtract and add $1$ and then multiply. I am still very confused by the fact that it gives $x^2-1$ and not $x^2$. – Stabilo Mar 18 '24 at 12:40
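
A minimal Haskell sketch of the "unpacking" reading of coalgebras in Dan Piponi's comment above (the names `StreamF`, `count`, and `observe` are illustrative, not from the comment): a coalgebra for a functor `f` is a map `s -> f s` that unpacks a state into one layer of observations plus successor states.

```haskell
-- A coalgebra for a functor f unpacks a seed s into an f-shaped layer.
type Coalgebra f s = s -> f s

-- One layer of a stream: a head element and a tail seed.
data StreamF a s = StreamF a s

-- Example coalgebra: unpack an Int into its value and its successor.
count :: Coalgebra (StreamF Int) Int
count n = StreamF n (n + 1)

-- Repeatedly unpacking yields everything the seed unfolds to; identifying
-- the state with this unfolding is the move that, set-theoretically,
-- leads to non-wellfounded sets.
observe :: Coalgebra (StreamF a) s -> s -> [a]
observe step s0 = let StreamF a s1 = step s0 in a : observe step s1
```

For example, `observe count 0` is the infinite list `[0, 1, 2, ...]`.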

5 Answers

25

It is difficult to answer this question, because the answer is probably different for each of the examples the OP mentions. I think for each instance of "co" there are several possibilities:

  • A. One thing was discovered because it's very natural, then people realized you can dualize and create the co-thing, which has fewer applications but is still interesting. Examples include topos vs cotopos, homotopy vs cohomotopy, monad vs comonad, group vs cogroup, etc. This was also pointed out in the comments.

  • B. The co-thing is strictly more complicated than the thing, and to understand it requires that you first understand the thing. Mathematical concepts that are easier and less technical often are more common in the literature than more complicated concepts. Examples include homology vs cohomology. Also, in many cases, a coalgebra starts with algebra structure, then adds more.

  • C. The original thing is based on our intuition from early math, where learning is probably driven by what's useful to a wide segment of society, and hence feels more intuitive than the co-thing. This was also mentioned in the comments. This is probably the case with algebras vs coalgebras, or products vs coproducts if you mean in the algebraic sense.

  • D. The thing is actually not "more common" than the co-thing. They are both extremely natural and you could learn them in either order. This is how I think about limit vs colimit, or product vs coproduct in the categorical sense.

Let me write a bit more about (A). I don't think a cogroup is any harder to understand than a group, but we learn about groups first because they are ubiquitous. The same goes for representable vs corepresentable functors. Of course, this choice in the education system becomes a self-fulfilling prophecy: every year more work is done on the thing than on the co-thing because people understand the thing better or feel it's more natural. Once you've learned about the "thing", thinking about the co-thing is harder because it requires reversing all arrows, hence there are fewer papers about the co-thing. But if the order of discovery (or the applications) had pushed us to invent the co-thing first, then the "thing" would be harder for our brains. Neither is intrinsically more natural than the other.

This might be a good place to point out that in cases where two notions are both very natural, sometimes the "co" is on the wrong one for historical reasons, as with covariant functors, which actually do NOT involve reversing the direction of the arrows. This isn't an example of a thing vs a co-thing, because there are no "variant functors"; functors that reverse arrows are instead called contravariant. But if these notions were being invented today, what we know as covariant functors would be the thing, and contravariant functors would be the co-thing.
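
To make the variance point concrete, here is a small Haskell sketch (the types `Box` and `Predicate` are just for illustration): a covariant functor pushes an arrow forward in the same direction, while a contravariant functor reverses it.

```haskell
-- Covariant: an arrow a -> b is pushed forward to Box a -> Box b.
newtype Box a = Box a

instance Functor Box where
  fmap f (Box a) = Box (f a)

-- Contravariant: an arrow a -> b is pulled back to Predicate b -> Predicate a.
newtype Predicate a = Predicate (a -> Bool)

contramap :: (a -> b) -> Predicate b -> Predicate a
contramap f (Predicate p) = Predicate (p . f)
```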

Let me write a bit more about (B). Sometimes the co-thing implies the existence of the thing. For example, in algebra, it's common to fix a monoid $A$ and an $A$-coring $B$, then look at $B$-comodules in $A$-modules. In this case, a comodule is already a module. We consider this setting in Section 6.6 of this recent paper of mine with Donald Yau, which also cites other places where this kind of thing is studied in algebra. Something similar happens with comodules over a Hopf algebroid, which we recall in 6.9 and 7.4 of that paper.
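
For readers who haven't met corings, here is a hedged sketch of the standard definitions (the notation is generic and may differ from the paper cited above):

```latex
% Hedged sketch; conventions may differ from the cited paper.
An $A$-coring $B$ is an $(A,A)$-bimodule equipped with a coassociative
comultiplication and a counit
\[
  \Delta \colon B \to B \otimes_A B, \qquad \varepsilon \colon B \to A .
\]
A (right) $B$-comodule is an $A$-module $M$ together with a coaction
\[
  \rho \colon M \to M \otimes_A B
\]
satisfying coassociativity and counitality. In particular, $M$ carries
an $A$-module structure before any co-structure is imposed, which is
the sense in which a comodule "is already a module."
```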

Let me write a bit more about (C). It might be a special case of (A), but I put it separately because it could also be driven by applications more than by order of discovery or by which thing is more natural (if you can even quantify that). So, while you could think of the natural numbers as either a monoid or a comonoid, one way involves addition, which is a fundamental application of math, so we bias in that direction. I didn't start thinking of them as a comonoid until I encountered logic programming in college. I spent many years writing papers about algebra structure, and recently wrote one about coalgebra structure, and while doing the literature review I was struck by how common coalgebras are after all. In many cases they are actually even more common than algebras: e.g., in a cartesian category, every object is a comonoid with respect to the diagonal map.

Another example is that when you study closed symmetric monoidal categories, you could start with either the tensoring or the cotensoring (meaning, the internal hom). There are categories that are closed but not monoidal, and categories that are monoidal but not closed. I've always felt the monoidal structure was more natural because it feels like things I've learned before, but I know plenty of category theorists and homotopy theorists who would argue that the co-thing (the closed structure) is actually the more natural one to consider. One place this comes up is that there are two ways to check the pushout product axiom or the SM7 axiom in model categories, and a non-trivial proportion of researchers use the co-thing approach via hom objects.
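
Returning to the diagonal example above: in a language with cartesian products, this comonoid structure can be written in two lines. A minimal Haskell sketch, where the product is `(,)` and the terminal object is `()`:

```haskell
-- Every type is a comonoid for the cartesian product:
comult :: a -> (a, a)   -- comultiplication: the diagonal map
comult x = (x, x)

counit :: a -> ()       -- counit: the unique map to the terminal object
counit _ = ()
-- Coassociativity and the counit laws hold definitionally.
```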

Lastly, let me write more about (D). In my work, I see colimits more than limits, and I work with cofibrations more than with fibrations. I think both limits and colimits encode extremely common human ways of thinking. I think about the colimit as what I can build from all the pieces in front of me, the global behavior trying to emerge from the local behavior, or "where things are going." I think of limits as where things came from, since the limit of the diagram maps to each object in it. I previously wrote an answer, based on David Spivak's book Category theory for the sciences, that got into the human ways of thinking that concepts in category theory are encoding.

Note that my way of thinking about limits and colimits is the opposite of the comment about creating vs destroying. I think that comment is more correct for algebra vs coalgebra, e.g., creating new numbers by adding up old numbers, vs breaking a number down into its constituent pieces. I'd argue that both goals are really fundamental to the thinking process of humans, and neither is "more common" or "more natural" than the other.

David White
  • 29,779
  • For A), isn't it true though that each example can be translated into a co-example? The problem seems to be that we find the original examples more interesting and natural than the co-examples, not that there are more of one than the other. – provocateur Jan 07 '24 at 20:11
  • 1
    I think that the comment about creating vs. destroying that you reference is @SeanSanford's. – LSpice Jan 07 '24 at 20:11
  • 1
    @provocateur Yes. I don't think I wrote that there were "more examples" of the thing than the co-thing in A, but indeed the goal was to think about "more natural" or "more interesting." Sorry if the answer gave the wrong impression. However, for (B) there really could be "more examples" since to be a coalgebra implies being an algebra, plus more, for example. – David White Jan 07 '24 at 20:45
  • 4
    I'd say that the extent to which homology/cohomology is an example of B depends substantially on your approach. (And also there are cases like de Rham theory where there is a cohomology theory but no homology theory.) – Alison Miller Jan 08 '24 at 01:12
  • 1
    I don't understand your comment about monoidal structures and internal homs. Either one can be characterized uniquely as an adjoint to the other, but may not exist: you can have a non-closed monoidal category or a non-monoidal closed category. – Mike Shulman Jan 08 '24 at 08:19
  • 3
    Also a "covariant functor" is not the dual of a "variant functor" -- it is actually the non-co concept, with a "co-covariant functor" being popularly known as a "contravariant functor". – Mike Shulman Jan 08 '24 at 08:20
  • @MikeShulman I edited to address your comments. I wouldn't want to mislead any future readers, so added clarity about what I was trying to say. – David White Jan 08 '24 at 13:54
  • 1
    I hope it's ok if this answer gets more votes than any of the coanswers. – Todd Wilcox Jan 09 '24 at 04:50
  • 1
    I still think the comment about monoidal structures is misleading. I wouldn't say that you get a monoidal structure "for free" from a closed structure, since you can have a closed category that is not monoidal. Moreover, this is a bad example of (B), since the symmetry between monoidal structures and closed structures is fairly complete; either suffices to characterize the other, but neither suffices to construct the other. And I am at least doubtful of your claim about "most category theorists". – Mike Shulman Jan 09 '24 at 05:17
  • I edited again to address Mike's most recent comment. – David White Jan 09 '24 at 18:55
  • As Will points out below, coproducts of sets are not less common. I've thought about the algebra/ coalgebra question and think of the answer here as algorithmic. To specify an algebra means giving a rule to produce something from two things. To specify a coalgebra means creating a "lookup" function - e.g. find all occurrences of 36 in the multiplication table. I'm no computer scientist, but I conjecture that if this can be made well-posed the lookup question will be more intensive. – Dev Sinha Jan 09 '24 at 19:00
  • @DevSinha I have "coproduct vs product" in D, which is exactly "not more common." However, the example of coalgebras over a coring is one where every coalgebra is also an algebra, but not vice versa, so I put it in B. I guess your comment is about C, where I've got that products seem more intuitive. It's true that addition corresponds to coproduct, but I don't think students think about it that way till much later. – David White Jan 09 '24 at 19:12
  • 1
    I wrote about missing the chance to use the term "variant functor" 10 years ago in a comment on the page https://mathoverflow.net/questions/175833/what-recent-programmes-to-alter-highly-entrenched-mathematical-terminology-have – KConrad Mar 18 '24 at 03:26
17

Coproducts of sets are introduced earlier in mathematical education than products of sets, under the name "union" or "disjoint union". Also, addition is of course introduced earlier than multiplication.

The reason for the naming choice in that case seems to be that products in a wide variety of concrete categories (e.g. algebras for a variety in the sense of universal algebra, or various sorts of manifolds) correspond to products of the underlying sets with the "obvious" structure, while coproducts correspond to coproducts of the underlying sets less often (only in the manifolds case, not the algebras). So it makes sense to name categorical products by analogy with the more familiar construction and coproducts by duality with products, rather than vice versa.
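
A hedged Haskell illustration of this asymmetry (the `FreeProduct` model below is deliberately crude): the product of two monoids lives on the product of the underlying sets, with the componentwise operation, while the coproduct does not live on the disjoint union.

```haskell
import Data.Monoid (Product (..), Sum (..))

-- Product of monoids: the carrier is the product of the carriers and
-- the operation is componentwise (this (a, b) instance ships with base).
pairExample :: (Sum Int, Product Int)
pairExample = (Sum 2, Product 3) <> (Sum 5, Product 7)  -- (Sum 7, Product 21)

-- Coproduct of monoids: not Either a b. A crude carrier is words in the
-- generators; a faithful model would still need to quotient by the
-- relations holding in each factor.
type FreeProduct a b = [Either a b]

inl :: a -> FreeProduct a b
inl x = [Left x]

inr :: b -> FreeProduct a b
inr y = [Right y]
```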

Will Sawin
  • 135,926
8

I'm not sure that what you are saying is even true. Take, e.g., "limits/products are more common than colimits/coproducts": what does "common" even really mean?

In a literal sense the statement is false because of opposite categories: e.g., a cone is a limit iff the corresponding cocone in the opposite category is a colimit. This might seem silly to point out, but remember that mathematicians have a choice of orientation of any category, so when given the choice they will pick the orientation that is most familiar to them. I think this entirely explains why, e.g., cotopoi are not a common notion: any time someone would work with a category whose opposite is a topos, they would just work with the opposite category to re-use their intuition about toposes. For example, the Schanuel topos is often defined as sheaves on the opposite category of finite sets and injections (https://ncatlab.org/nlab/show/Schanuel+topos) rather than as the cotopos of co-sheaves on the category of finite sets and injections.

If we acknowledge that mathematicians have this degree of freedom, then the question reduces to the simpler one of why common categories are generally asymmetric in which universal properties they have, and here there are fundamental mathematical obstructions. For instance, any category that is both cartesian closed and whose opposite is cartesian closed is a preorder, as originally noted by Joyal.
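
Here is a hedged sketch of one standard form of the argument behind Joyal's observation (details worth double-checking against a reference):

```latex
% Hedged proof sketch. Suppose both $C$ and $C^{\mathrm{op}}$ are
% cartesian closed.
Since $(-) \times A$ is a left adjoint, it preserves colimits; the dual
hypothesis says that $(-) + A$ preserves limits in $C$, so
$1 + A \cong 1$. Taking $A = 1$ gives $1 + 1 \cong 1$, so the two
coprojections $1 \rightrightarrows 1 + 1$ are maps into a terminal
object and hence equal; composing with copairings, any two maps
$1 \to X$ coincide. But cartesian closedness gives
\[
  \mathrm{Hom}(A, B) \;\cong\; \mathrm{Hom}(1 \times A, B)
  \;\cong\; \mathrm{Hom}(1, B^{A}),
\]
so every hom-set has at most one element, i.e.\ $C$ is a preorder.
```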

Max New
  • 827
6

I don't have any deep insights to offer, but I'd like to suggest that there are two separate questions being mixed together here.

The first question is why there often appears to be an asymmetry between "primal" and "dual" in mathematical structures that arise in practice.

The second question is a linguistic one, having to do with which structure we deem to be the "primal" one and which one we deem to be the "dual" one.

That structures we care about are often not perfectly symmetric between primal and dual should not be too surprising, since category theory captures only some features of whatever mathematical thing we're studying.

The linguistic question has many possible explanations, as others have pointed out, and is perhaps less interesting since it doesn't necessarily imply anything of mathematical substance.

There is potentially a third question that could be asked, which is whether seemingly unrelated mathematical structures $S_1$ and $S_2$ are nevertheless asymmetric in the same way. But I'm not sure that this question actually makes sense. If $S_1$ and $S_2$ are truly unrelated, then in what sense can we claim that we've chosen the primal and dual labels "in the same way"? Conversely, if $S_1$ and $S_2$ are related, then that presumably explains why their labels are correlated (by analogy, physical magnets all have a north pole and a south pole, and we can label them all consistently because north poles all repel each other, even though which one we call north and which one we call south is arbitrary).

Timothy Chow
  • 78,129
2

$\newcommand\Set{\mathrm{Set}}$Rather than asking what we use more or less, I've wondered why, even though category theory is a completely symmetric theory, many dual things feel very different from each other in the actual math we use category theory for. Limits in concrete categories are generally just limits of underlying sets, whereas colimits are often much more complicated. Relatedly, forgetful functors are usually quite simple and are right adjoints, while free functors are often quite complicated and are left adjoints.
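
A concrete Haskell sketch of the simple-right-adjoint vs complicated-left-adjoint point (the names `unit` and `extend` are mine): the free monoid on `a` is `[a]`, and its universal property is essentially `foldMap`.

```haskell
import Data.Monoid (Sum (..))

-- Unit of the free -| forgetful adjunction: a generator as a one-letter word.
unit :: a -> [a]
unit x = [x]

-- Universal property: any map of generators a -> m into a monoid m
-- extends uniquely to a monoid homomorphism [a] -> m.
extend :: Monoid m => (a -> m) -> ([a] -> m)
extend = foldMap

total :: Sum Int
total = extend Sum [1, 2, 3]  -- Sum {getSum = 6}
```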

I think that a whole bunch of this asymmetry stems from the asymmetry of the category of sets. Most notable (imo) is that it is cartesian closed, and not cocartesian coclosed. So much of math is built on top of $\Set$ that it's not surprising there ends up being so much asymmetry, given the asymmetry of $\Set$.

But why the asymmetry of $\Set$? I always found it curious that even in classical set theory, $\Set$ is a cartesian closed category, which through categorical semantics is associated with intuitionistic propositional logic.

(Please forgive me as I'm rusty on the technical details. I believe many of these arguments can be made completely formal through the duality of values and continuations and through some of the machinery around linear logic. The duality of abstraction has a pretty clear exposition of some of this and references a lot of the literature that I don't have access to.)

I believe that this asymmetry of $\Set$ is because we like to work forwards rather than backwards [1]. We have values $1 \to A$ and want to achieve a result, so we work forward, applying functions to get to that result. Dually, we could start with a request $A \to P(1)$ or a continuation (think of a predicate, as in $\mathrm{CABA} = \Set^\text{op}$, or a linear form) and work backwards, precomposing or "coapplying" functions, until we end up at some starting point. With the former we get a cartesian closed structure, whereas with the latter we get the less familiar cocartesian coclosed structure. Importantly, you can't have both at once, as that collapses your category into a preorder by a simple argument given in the above paper.
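
A small Haskell sketch of the two orientations (the names `Consumer` and `coapply` are mine, not standard): working forward applies functions to values, while working backward precomposes them onto consumers/continuations.

```haskell
-- Forward, value-oriented: start from a value and apply functions.
forward :: Int
forward = (+ 1) (2 * 3)  -- build 6, then consume it to get 7

-- Backward, continuation-oriented: a consumer of a is a map a -> r.
-- For r = Bool this is a predicate, as in CABA = Set^op.
type Consumer r a = a -> r

-- "Coapplying" f : a -> b turns a consumer of b into a consumer of a;
-- note the reversed arrow.
coapply :: (a -> b) -> Consumer r b -> Consumer r a
coapply f k = k . f
```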

$\Set$ is based on taking the forward, value-oriented side of the duality, so it fundamentally breaks the symmetry at the start, before we define our concrete categories, our groups and rings and topological spaces and presheaves, on top of it. I believe linear logic and related machinery can provide a more symmetrical starting point, for whatever that is worth.

[1] And, I think, because we like to think in terms of "things" rather than whatever the "right" way is to think of continuations or covalues. Requests, results, a hole waiting for a block of the right shape? Predicates $A \to P(1)$ (like elements of a complete atomic Boolean algebra), or something like linear forms, the duals of values/vectors in vector spaces?

LSpice
  • 11,423
dan l
  • 21