9

Studying some representation theory, I came up with the following problem.

We work over a field of characteristic $0$. Let $V$ be the standard representation of $\mathrm{GL}_n$ and let $W$ be the representation $(\mathrm{Sym}^2(V))^{\otimes n}$.

Is it possible to describe the weights of the irreducible components of $W$ (after choosing the usual Borel and so on)? In particular I would like to know if the representation $\det(V)^{\otimes 2}$ appears in $W$.

John
  • 91

3 Answers

11

By Pieri's formula, the irreducible representation of $\mathrm{GL}_n$ corresponding to a partition of $2n$ with at most $n$ rows occurs in this representation with multiplicity equal to the number of ways of obtaining that partition from the empty partition by adding two boxes at a time, $n$ times, with no two boxes added in the same step lying in the same column.

The determinant squared corresponds to the partition with $2$ columns of length $n$, i.e. $(2,2,\ldots,2)$, and it occurs with multiplicity exactly one: the only such chain adds a complete row of two boxes at each step.
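For small $n$ this count is easy to carry out by brute force. Here is a minimal pure-Python sketch (my own addition, not part of the original answer; the function names are made up) that builds all chains of partitions obtained by adding a horizontal strip of two boxes at each step, and in particular confirms that $(2,2,\ldots,2)$, i.e. $\det(V)^{\otimes 2}$, shows up exactly once:

```python
# Brute-force count of Pieri chains: the multiplicity of the irreducible with
# (row) partition lambda in (Sym^2 V)^{tensor n} is the number of chains
#   emptyset = mu_0 -> mu_1 -> ... -> mu_n = lambda,
# where each step adds a horizontal strip of two boxes (no two in one column).
from collections import Counter
from itertools import combinations_with_replacement

def add_horizontal_two_strips(mu, max_rows):
    """All partitions (with at most max_rows rows) obtained from mu by adding
    two boxes, no two of them in the same column."""
    mu = list(mu) + [0] * (max_rows - len(mu))
    out = set()
    for i, j in combinations_with_replacement(range(max_rows), 2):
        nu = mu[:]
        nu[i] += 1
        nu[j] += 1
        # nu must be weakly decreasing (a partition) ...
        if any(nu[k] < nu[k + 1] for k in range(max_rows - 1)):
            continue
        # ... and nu/mu must be a horizontal strip: nu_{k+1} <= mu_k
        if any(nu[k] > mu[k - 1] for k in range(1, max_rows)):
            continue
        out.add(tuple(nu))
    return out

def decompose_sym2_power(n):
    """Multiplicities of the GL_n irreducibles in (Sym^2 V)^{tensor n}."""
    mults = Counter({(0,) * n: 1})
    for _ in range(n):
        new = Counter()
        for mu, c in mults.items():
            for nu in add_horizontal_two_strips(mu, n):
                new[nu] += c
        mults = new
    return mults

if __name__ == "__main__":
    n = 4
    mults = decompose_sym2_power(n)
    print(mults[(2,) * n])  # multiplicity of det(V)^{tensor 2}; prints 1
```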

Will Sawin
  • 135,926
  • This completely answers the original question, by describing the irreducible components and their multiplicities, and the weights can be found via semistandard Young tableaux. Note that the convention used here is the transpose of the one more common in representation theory: a simple ${\rm GL}_n$-module with highest weight $\lambda=(\lambda_1,\ldots,\lambda_n)$ is usually represented by the partition $\lambda$, but here it is represented by the dual partition $\lambda^t$. Thus ${\rm Sym}^2 V$, with highest weight $(2,0,\ldots,0)$, is column $(1^2)$ here instead of the more common row $(2)$. – Victor Protsak Jun 23 '18 at 05:52
  • @VictorProtsak Fixed the convention. Actually, although you can't tell from what I wrote, I was viewing representations as partitions correctly, but writing my partitions sideways, with each part a column instead of a row. – Will Sawin Jun 23 '18 at 06:48
5

The answer is yes.

Let $e_1,\ldots, e_n$ be the standard basis of $V$. Consider the morphism $$ f \colon \det(V) \to V^{\otimes n} $$ given by $$ f(e_1 \wedge \cdots \wedge e_n) = \sum_{\sigma \in S_n}(-1)^{\varepsilon(\sigma)}e_{\sigma(1)} \otimes \cdots \otimes e_{\sigma(n)}, $$ where $S_n$ is the symmetric group on $n$ letters and $\varepsilon(\sigma)$ is the parity of $\sigma$. This gives a morphism $$ f^{\otimes 2} \colon \det(V)^{\otimes 2} \to V^{\otimes 2n}=(V^{\otimes 2})^{\otimes n}. $$ Using the natural projection $V^{\otimes 2} \to \mathrm{Sym}^2(V)$, we get a morphism $$ g \colon \det(V)^{\otimes 2} \to (\mathrm{Sym}^2(V))^{\otimes n}, $$ which is easily checked to be injective: since $\det(V)^{\otimes 2}$ is one-dimensional, it suffices to check that $g$ is nonzero.
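For small $n$ the nonvanishing of $g$ (which, as noted above, is all that injectivity requires) can be checked directly. The following is a minimal pure-Python sketch of such a check, my own illustration with made-up function names, encoding a monomial basis vector of $(\mathrm{Sym}^2(V))^{\otimes n}$ as a tuple of $n$ sorted index pairs:

```python
# Compute the image g(e_1 ^ ... ^ e_n  (x)  e_1 ^ ... ^ e_n) in (Sym^2 V)^{(x) n}
# and check that it is nonzero.  A monomial basis vector of (Sym^2 V)^{(x) n} is
# encoded as a tuple of n sorted pairs (i, j), meaning (e_i . e_j) (x) ... .
from collections import defaultdict
from itertools import permutations

def sign(perm):
    """Sign of a permutation given as a tuple of 0-based images."""
    s, seen = 1, [False] * len(perm)
    for start in range(len(perm)):
        if seen[start]:
            continue
        j, cycle_len = start, 0
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            cycle_len += 1
        s *= (-1) ** (cycle_len - 1)   # a cycle of length L has sign (-1)^(L-1)
    return s

def image_of_det_squared(n):
    """Nonzero coefficients of g(e_1^...^e_n (x) e_1^...^e_n)."""
    vec = defaultdict(int)
    for sigma in permutations(range(n)):
        for tau in permutations(range(n)):
            key = tuple(tuple(sorted((sigma[a], tau[a]))) for a in range(n))
            vec[key] += sign(sigma) * sign(tau)
    return {k: c for k, c in vec.items() if c != 0}

if __name__ == "__main__":
    for n in range(1, 6):
        print(n, "g is nonzero" if image_of_det_squared(n) else "g vanishes")
```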

I think that moreover $\det(V)^{\otimes 2}$ appears with multiplicity $1$ (I checked this using a computer up to $n=10$), but I didn't try to prove it.

Ricky
  • 3,674
  • Nice job. I'm wondering if this generalizes: Given any vector space $V$ and any nonnegative integers $n$ and $k$ (not necessarily having $n = \dim V$), we can consider the composition of the canonical maps $\left(\Lambda^n V\right)^{\otimes k} \overset{\text{inclusion}}{\to} \left(V^{\otimes n}\right)^{\otimes k} \overset{\cong}{\to} \left(V^{\otimes k}\right)^{\otimes n} \overset{\text{projection}}{\to} \left(\operatorname{Sym}^k V\right)^{\otimes n}$. Is this composition injective? At least it is easy to see that $\left< e_n^k, h_k^n \right> = 1$ in symmetric functions. – darij grinberg Jun 22 '18 at 15:02
  • Ah, of course it is not generally injective -- after all, $\left(\Lambda^n V\right)^{\otimes k}$ will generally have more than one irreducible Schur functor in it, so if the map was injective, then $\left< e_n^k, h_k^n \right>$ would be greater than $1$. But the map is injective when $n = \dim V$; this generalizes your result. – darij grinberg Jun 22 '18 at 15:04
  • (Why is it easy to see that $\left< e_n^k, h_k^n \right> = 1$ in symmetric functions? Well, recall that the complete homogeneous symmetric functions are orthogonal to the monomial symmetric functions; thus, $\left< e_n^k , h_k^n \right>$ is the coefficient of the monomial symmetric function $m_{\underbrace{\left(k,k,\ldots,k\right)}_{n \text{ entries}}}$ in $e_n^k$ (in the monomial basis). But the latter coefficient is clearly the coefficient of the monomial $x_1^k x_2^k \cdots x_n^k$ in $e_n^k$. Finally, the latter coefficient is $1$, for simple reasons.) – darij grinberg Jun 22 '18 at 15:06
  • @darijgrinberg: Yes. The argument I gave also applies to joint multilinear $SL_n$-invariants of $n$ forms of degree $k$ in $n$ variables, although the last step of specializing everything to the same form does not work if $k$ is odd. – Abdelmalek Abdesselam Jun 22 '18 at 15:13
  • For $k$ odd (and also $k$ even $>2$), it is better to specialize to the forms $x_1^k,\ldots,x_n^k$ in order to show that the invariant is nonzero. – Abdelmalek Abdesselam Jun 22 '18 at 15:20
3

Just an addendum to Ricky's answer: the multiplicity is indeed $1$, which can be proved as follows.

An occurrence of ${\rm det}(V)^{\otimes 2}$ inside $({\rm Sym}^2(V))^{\otimes n}$ is the same thing as a nonzero joint multilinear ${\rm SL}_n$-invariant of $n$ quadratic forms $Q^{(1)},\ldots,Q^{(n)}$ in $n$ variables. By the first fundamental theorem of classical invariant theory, such an invariant must be a linear combination of expressions (after a choice of coordinates) of the form $$ \sum_{i_1,\ldots, i_{2n}=1}^{n} \epsilon_{i_1,\ldots,i_n}\ \epsilon_{i_{n+1},\ldots,i_{2n}} \ Q_{i_{\sigma(1)},i_{\sigma(2)}}^{(1)} \ Q_{i_{\sigma(3)},i_{\sigma(4)}}^{(2)} \cdots \ Q_{i_{\sigma(2n-1)},i_{\sigma(2n)}}^{(n)} $$ where $\sigma$ is a permutation of $\{1,\ldots,2n\}$.

Here the $Q_{i,j}^{(a)}$ denote the matrix elements of the quadratic forms and $\epsilon_{i_1,\ldots,i_n}$ is completely antisymmetric with the normalization $\epsilon_{1,\ldots,n}=1$.
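To make the indices concrete, here is the case $n=2$ (a small worked example added for illustration, not in the original answer). Taking the permutation $\sigma$ with $\sigma(1)=1$, $\sigma(2)=3$, $\sigma(3)=2$, $\sigma(4)=4$, which is the $n=2$ case of the permutation singled out below, the expression above becomes $$ \sum_{i_1,i_2,i_3,i_4=1}^{2} \epsilon_{i_1,i_2}\ \epsilon_{i_3,i_4}\ Q^{(1)}_{i_1,i_3}\ Q^{(2)}_{i_2,i_4} = Q^{(1)}_{1,1}Q^{(2)}_{2,2}+Q^{(1)}_{2,2}Q^{(2)}_{1,1}-Q^{(1)}_{1,2}Q^{(2)}_{2,1}-Q^{(1)}_{2,1}Q^{(2)}_{1,2}, $$ which equals $2\,{\rm det}(Q)$ when $Q^{(1)}=Q^{(2)}=Q$.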

If you take a symmetric matrix $A$ and a skew-symmetric matrix $B$, then $\sum_{i,j}A_{ij}B_{ij}=0$, because you are contracting two symmetric indices with two antisymmetric ones. The same is true if $A$ and $B$ are tensors with additional indices that are held fixed. Thus the above expression vanishes for every permutation $\sigma$ which sends the two elements of some block of the partition $\{\{1,2\},\{3,4\},\ldots,\{2n-1,2n\}\}$ into the same block of the partition $\{\{1,\ldots,n\},\{n+1,\ldots,2n\}\}$, i.e., whenever some $Q^{(a)}$ carries two indices from the same $\epsilon$. It is then easy to see that all the remaining expressions are multiples of the one corresponding to, say, the permutation $\sigma$ defined by $$ \sigma(2i-1)=i\ \ ,\ \ \sigma(2i)=n+i $$ for $1\le i\le n$. Moreover, this invariant is not zero: specializing all the quadratic forms to the same form $Q$ gives the polynomial $n!\ {\rm det}(Q)$, which does not vanish identically.
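As a quick sanity check of the last identity, here is a plain-Python sketch of my own (the function names are made up) that contracts the two $\epsilon$ tensors against a random symmetric matrix $Q$ for the permutation $\sigma$ above and compares the result with $n!\,{\rm det}(Q)$:

```python
# Numerically verify  sum  eps_{i_1..i_n} eps_{i_{n+1}..i_{2n}} prod_a Q_{i_a, i_{n+a}}
#                    = n! * det(Q)
# for the surviving permutation sigma (sigma(2a-1)=a, sigma(2a)=n+a) and a symmetric Q.
import itertools
import math
import random

def levi_civita(idx):
    """epsilon_{idx}: 0 if an index repeats, otherwise the sign of idx as a permutation."""
    if len(set(idx)) < len(idx):
        return 0
    sign = 1
    for a in range(len(idx)):
        for b in range(a + 1, len(idx)):
            if idx[a] > idx[b]:
                sign = -sign
    return sign

def det(Q):
    """Leibniz-formula determinant."""
    n = len(Q)
    return sum(levi_civita(p) * math.prod(Q[a][p[a]] for a in range(n))
               for p in itertools.permutations(range(n)))

def surviving_invariant(Q):
    """The epsilon-epsilon contraction in which Q^{(a)} carries indices i_a and i_{n+a}."""
    n = len(Q)
    total = 0
    for I in itertools.product(range(n), repeat=n):       # i_1, ..., i_n
        eps_I = levi_civita(I)
        if eps_I == 0:
            continue
        for J in itertools.product(range(n), repeat=n):   # i_{n+1}, ..., i_{2n}
            eps_J = levi_civita(J)
            if eps_J == 0:
                continue
            total += eps_I * eps_J * math.prod(Q[I[a]][J[a]] for a in range(n))
    return total

if __name__ == "__main__":
    n = 3
    A = [[random.randint(-3, 3) for _ in range(n)] for _ in range(n)]
    Q = [[A[i][j] + A[j][i] for j in range(n)] for i in range(n)]   # random symmetric matrix
    print(surviving_invariant(Q), math.factorial(n) * det(Q))       # the two numbers agree
```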

Note that the above permutation $\sigma$ is not the only one that works. There are $2^n\ n!^2$ permutations which satisfy the combinatorial requirement I mentioned, but the corresponding invariants differ only by a factor of $\pm 1$.

Finally, as remarked by Darij, this easily generalizes to occurrences of ${\rm det}(V)^{\otimes k}$ inside $({\rm Sym}^k(V))^{\otimes n}$.

  • What exact "first fundamental theorem" are you using? (I know it for vectors, not for quadratic forms.) – darij grinberg Jun 22 '18 at 15:27
  • The one for quadratic forms (or any system of tensors you want) is a trivial consequence of the one for vectors and covectors. That's one of the uses of the classical symbolic method. – Abdelmalek Abdesselam Jun 22 '18 at 15:32
  • Ah! I forgot about vectors and covectors (again). – darij grinberg Jun 22 '18 at 15:41
  • For instance, in the present situation by specializing the quadratics to $L_1^2,\ldots,L_n^2$ you get an invariant of $n$ covectors which is of degree 2 in each. You can recover the original invariant by acting with $\prod_{a} Q^{(a)}(\partial L_a)$ where the "vector" $\partial L_a$ is that of partial derivative operators with respect to the coefficients of the linear form $L_a$. – Abdelmalek Abdesselam Jun 22 '18 at 15:41
  • From the point of view of generalizing the FFT, there is no need to specialize to $Q=L^2$. One can also do $Q=AB$ for two linear forms $A$ and $B$. This allows one to treat invariants of antisymmetric tensors. – Abdelmalek Abdesselam Jun 22 '18 at 15:44
  • I should also say that FFT for vectors and covectors follows from vectors only since by Cramer's rule you can replace a covector by $n-1$ vectors. This is how I proved the FFT/Schur-Weyl in your previous MO question https://mathoverflow.net/questions/255492/how-to-constructively-combinatorially-prove-schur-weyl-duality – Abdelmalek Abdesselam Jun 22 '18 at 15:47