I am trying to compile a list of notations and definitions that have become ingrained in mathematical folklore, yet remain, on some objective scale, unsatisfactory. I offer two starting examples.
For a polynomial $f \in \mathbb{Z}[x_1, \ldots, x_n]$, we say that $f$ is irreducible if there is no factorization $f = gh$ with $g, h \in \mathbb{Z}[x_1, \ldots, x_n]$ both of degree at least one. We say that $f$ is absolutely irreducible if the same holds when the factors are allowed to take coefficients in $\overline{\mathbb{Q}}$, the field of algebraic numbers. Thus we come to the following unfortunate theorem:
Suppose $f(x,y) \in \mathbb{Z}[x,y]$ is an irreducible homogeneous polynomial of degree at least two. Then $f$ is absolutely completely reducible.
(This is because $f(x,y) = y^d f(x/y, 1)$, where $d = \deg(f)$: since $f$ is irreducible of degree at least two, $y \nmid f$, so $f(t,1)$ has degree $d$, and over $\overline{\mathbb{Q}}$ it splits into a product of linear factors by the fundamental theorem of algebra; homogenizing each factor gives a factorization of $f(x,y)$ into linear forms.)
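For a concrete instance of the "theorem": $f(x,y) = x^2 + y^2$ is irreducible in $\mathbb{Z}[x,y]$, yet over $\overline{\mathbb{Q}}$ it splits completely into linear forms,
$$x^2 + y^2 = (x + iy)(x - iy),$$
so the irreducible $f$ is, in the terminology above, absolutely completely reducible.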
A second example is the canonical notation for anything involving the Riemann zeta function or $L$-functions, where a complex variable is always written $s = \sigma + it$. It would seem more natural to pair $s$ with $t$, since they are consecutive letters of the same alphabet; nonetheless, $t$ is perpetually paired with the Greek $\sigma$ instead, because the use of $t$ goes back to Riemann's original paper and the notation has stuck around ever since.
And, of course, there is the oft-debated and contested question of the importance of $\pi$, where some believe that $2\pi$ (representing one full revolution) should be the standard circle constant, rather than $\pi$.
Are there other examples of 'bad' notation and definitions?