
In "Modern Quantum Mechanics" by Sakurai J.J. he gives an example of a Cartesian tensor of rank $2$ which is a dyadic formed out of two vectors $\mathbf{U}$ and $\mathbf{V}$, i.e. $T_{ij} \equiv U_i V_j$. It obviously has $3^2=9$ components because each index can run from $1$ to $3$. However then they show that this dyadic can be decomposed into the following sum:

$$U_i V_j = \frac{\mathbf{U}\cdot\mathbf{V}}{3} \delta_{ij} + \frac{U_iV_j - U_j V_i }{2} + \left( \frac{U_i V_j + U_j V_i}{2} - \frac{\mathbf{U}\cdot\mathbf{V}}{3} \delta_{ij}\right)$$

It is then claimed that the first term is a rank-0 tensor (one component, a scalar, invariant under rotation), the second term a rank-1 tensor (three components), and the third term a traceless rank-2 tensor (five components).
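A quick numerical sanity check (my own, in NumPy; not from Sakurai) confirms that the three pieces do sum back to $U_i V_j$ and have the advertised symmetry and trace properties:

```python
import numpy as np

rng = np.random.default_rng(0)
U, V = rng.normal(size=3), rng.normal(size=3)

T = np.outer(U, V)                        # T_ij = U_i V_j, 9 components

trace_part = (U @ V) / 3 * np.eye(3)      # (U.V)/3 delta_ij      -> 1 number
antisym    = (T - T.T) / 2                # (U_i V_j - U_j V_i)/2 -> 3 numbers
sym_tl     = (T + T.T) / 2 - trace_part   # symmetric traceless   -> 5 numbers

assert np.allclose(T, trace_part + antisym + sym_tl)  # pieces sum to T
assert np.allclose(antisym, -antisym.T)               # antisymmetric
assert np.allclose(sym_tl, sym_tl.T)                  # symmetric...
assert np.isclose(np.trace(sym_tl), 0.0)              # ...and traceless: 6 - 1 = 5
```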

A couple of questions (I'm new to tensors, so pardon me if the questions are trivial):

  1. Why is the second term a tensor of rank 1? Sure, since the only difference between $U_i V_j - U_j V_i$ and $U_j V_i - U_i V_j$ is the sign, one can claim that it has only three independent components (the binomial coefficient $\binom{3}{2} = 3$ doesn't care about the order). However, that's not how the rank of a tensor is usually defined. Why is it wrong to say that $(U_i V_j - U_j V_i)$ has 9 components? Three of them are zero, and the remaining six are pairs of numbers with opposite signs. Or maybe 6 components, if we neglect the zeros?

  2. Similarly, why is the last term a rank-2 tensor? First of all, from what I've learned we can only add/subtract tensors of the same rank. But from what I see, $(U_i V_j + U_j V_i)$ has at least six components ($(1,1), (1,2), (1,3), (2, 3), (2, 2), (3, 3)$), whereas $\frac{\mathbf{U}\cdot\mathbf{V}}{3} \delta_{ij}$ is the same scalar (rank-0 tensor) that appears at the beginning of the decomposition.

  3. The same question applies to the decomposition as a whole: if these are tensors of different rank, isn't it meaningless to add them? Maybe in reality each term should be multiplied (as in an outer product) by some sort of unit tensor (which increases the rank/order of the tensor) to make the decomposition mathematically precise?


The same terminology is used in the book by A. Messiah. He writes:

$\mathbf{U} \otimes \mathbf{V}$ is reducible. The nine-dimensional space in which it is defined is the direct sum of three irreducible invariant subspaces (with respect to rotations), having respectively 1, 3 and 5 dimensions. The projections of $\mathbf{U} \otimes \mathbf{V}$ onto each of these subspaces are therefore irreducible tensors; they are, to within a constant, the scalar product $\mathbf{U} \cdot \mathbf{V}$, the vector $\mathbf{U} \times \mathbf{V}$ and the irreducible 5-component tensor which transforms under rotation like the harmonic polynomials of second degree.

Notice that he says "5-component" tensor. I assume that in reality the tensor has 9 components (i.e. it's a rank-2 tensor) but only 5 of them are independent (which actually has nothing to do with the rank of the tensor).
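To see what Messiah means by invariant subspaces, here is another sanity check of my own (not from the book): under a rotation $T'_{ij} = R_{ia} R_{jb} T_{ab}$, each piece of the decomposition is mapped to the corresponding piece of the rotated tensor, so rotations never mix the 1-, 3- and 5-dimensional subspaces:

```python
import numpy as np

def decompose(T):
    """Trace, antisymmetric, and symmetric-traceless parts of a 3x3 tensor."""
    tr = np.trace(T) / 3 * np.eye(3)
    return tr, (T - T.T) / 2, (T + T.T) / 2 - tr

a = 0.7  # any rotation will do; here, a rotation by a about the z axis
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])

rng = np.random.default_rng(1)
T = np.outer(rng.normal(size=3), rng.normal(size=3))

rotate = lambda M: R @ M @ R.T            # T'_ij = R_ia R_jb T_ab

# rotating each piece gives the corresponding piece of the rotated tensor,
# i.e. each of the three subspaces is invariant under rotations
for rotated_piece, piece_of_rotated in zip(map(rotate, decompose(T)),
                                           decompose(rotate(T))):
    assert np.allclose(rotated_piece, piece_of_rotated)
```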

grjj3
  • The third term is symmetric and traceless. And the second is antisymmetric. – G. Smith Dec 16 '19 at 17:57
  • @G.Smith - I understand that it's symmetric and traceless. The question is why it has only 5 components and why we can add tensors of different rank – grjj3 Dec 16 '19 at 18:00
  • A symmetric tensor has $6$ independent components. Tracelessness imposes $1$ relation on them. $6-1=5$. An antisymmetric tensor has $3$ components. – G. Smith Dec 16 '19 at 18:03
  • @G.Smith - I agree. But from what I know, the rank of a tensor is not defined by the number of independent components. And again, how can we add tensors of different rank? Let's concentrate on the second term of the decomposition: why is it a rank-1 tensor (3 components) as opposed to a rank-2 tensor (9 components)? (Whether or not the components are dependent is a different question.) – grjj3 Dec 16 '19 at 18:07
  • I think it’s just terminology. I’d call all three terms tensors of rank 2 because they all have 2 indices. And I’d say that the first one transforms according to the $\mathbf 1$ representation ($\ell=0$), the second according to the $\mathbf 3$ representation ($\ell=1$), and third according to the $\mathbf 5$ representation ($\ell=2$). But your book isn’t using “rank” to mean “number of indices”. – G. Smith Dec 16 '19 at 18:14
  • @G.Smith - so just to clarify: all the three terms in the decomposition are actually rank 2 tensors since they have two indices (which indeed is consistent with the common definition of tensor rank), however the main point here is that they have different number of independent components? And by "independent component" we mean any element which (1) isn't identically zero (2) isn't intrinsically dependent on another component (otherwise any change in it is accompanied by a change in another component). Correct? – grjj3 Dec 16 '19 at 18:36
  • I agree with what you’ve written except I’m not willing to say “actually”. People use “rank” to mean two different things and you can’t say one is right and one is wrong. I prefer Messiah’s terminology of a “$(2\ell+1)$-component” tensor. – G. Smith Dec 16 '19 at 18:40
  • @G.Smith - Good to know. I wasn't aware of the fact that there are different definitions of "rank". Every book I've come across defines "rank" as the "the total number of contravariant and covariant indices". Thank you! – grjj3 Dec 16 '19 at 18:57

1 Answer


This is just different terminology: it refers to transformation properties under rotation, rather than to the total degree (or the number of indices).

Basically $$ x^2+y^2+z^2 $$ is called a scalar, or rank-0 spherical tensor, even though it is quadratic in the components. In this way the nomenclature for spherical tensors differs slightly from that of general Cartesian tensors.

Thus a vector is a collection of 3 (non-zero) objects that transform under conjugation as $L=1$ states do under rotations.
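Here is a concrete classical illustration (my own sketch; phase conventions for $\boldsymbol{\mathcal D}^1$ vary between books): for a rotation by an angle $a$ about the $z$ axis, the three spherical components $q = +1, 0, -1$ of a vector mix only among themselves (they just pick up phases $e^{iqa}$), while the scalar $x^2+y^2+z^2$ is left untouched.

```python
import numpy as np

a = 0.4  # rotation angle about z
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])

def spherical(v):
    """Spherical components (q = +1, 0, -1) of a Cartesian vector."""
    return np.array([-(v[0] + 1j * v[1]) / np.sqrt(2),
                     v[2],
                     (v[0] - 1j * v[1]) / np.sqrt(2)])

v = np.array([1.0, 2.0, 3.0])

# the scalar x^2 + y^2 + z^2 is invariant (one component, rank 0)
assert np.isclose(v @ v, (R @ v) @ (R @ v))

# the spherical components transform with D^1, which for a rotation
# about z is just diag(e^{ia}, 1, e^{-ia}) in this convention
phases = np.exp(1j * a * np.array([+1, 0, -1]))
assert np.allclose(spherical(R @ v), phases * spherical(v))
```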

Quoting Wigner in his Group Theory book:

...an irreducible tensor operator of the $\omega$th degree... is defined by the condition that its $2\omega+1$ components $\textbf{T}^{(\rho)}$ transform under a rotation of the axes as follows: $$ \textbf{O}_R^{-1} \textbf{T}^{(\rho)} \textbf{O}_R =\sum_{\sigma=-\omega}^{\omega} \boldsymbol{{\cal D}}^{\omega}(R)_{\rho\sigma} \textbf{T}^{(\sigma)}$$

where clearly Wigner uses $\rho$ and $\sigma$ to index components.

ZeroTheHero
  • So in this terminology a vector would be a "rank 3" tensor, meaning a 3-component tensor, correct? – grjj3 Dec 16 '19 at 20:18
  • @grjj3 actually a rank $1$ tensor means angular momentum $1$, so that a rank $L$ tensor will have $2L+1$ components. – ZeroTheHero Dec 16 '19 at 20:25