12

This might be something basic, but it is unclear to me. So I am used to working with representations of groups as matrices. These matrices represent the structure of the Lie algebra by satisfying the commutation relations:

$$ [T_i,T_j]=f_{ijk}T_k $$
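To make concrete what I mean, here is a minimal numerical check of these relations (my own sketch, using the su(2) generators $T_i=\sigma_i/2$, for which $f_{ijk}=i\epsilon_{ijk}$ in this convention):

```python
import numpy as np

# su(2) generators T_i = sigma_i / 2; in this convention
# [T_i, T_j] = i eps_{ijk} T_k, i.e. f_{ijk} = i * eps_{ijk}.
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]
T = [s / 2 for s in sigma]

# Levi-Civita symbol: +1 on even permutations, -1 on odd
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[j, i, k] = 1, -1

for i in range(3):
    for j in range(3):
        lhs = T[i] @ T[j] - T[j] @ T[i]
        rhs = sum(1j * eps[i, j, k] * T[k] for k in range(3))
        assert np.allclose(lhs, rhs)
print("commutation relations verified")
```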

In the particle physics texts I read, however, the vectors upon which the matrices act, and not the matrices themselves, are referred to as representations. For example, for the SU(3) group, after we find all the weights of a representation we say that "we found the representation", even though we did not find the generator matrices. My question is: in what sense do the vectors represent the group? The vectors look like passive elements on which the group matrices act, and do not contain the structure of the group. I hope my question is clear.

EDIT: As the title states, are these vector representations, together with the weights, isomorphic to the group generators?

Lonkar
  • 121
  • Related: http://physics.stackexchange.com/q/52417/2451 and http://physics.stackexchange.com/q/41424/2451. – Qmechanic Oct 21 '14 at 05:56
  • 1
    Personally, I maintain that this usage of "representation" to refer to the vectors themselves does not make sense and I try pretty hard not to use it. – David Z Oct 21 '14 at 06:19
  • Comment on question (v3): A group one considers in physics may not be a Lie group, and even when it is, the matrices that represent group elements don't typically satisfy the structure relations of the Lie algebra, but representations of Lie algebra elements do. One motivation for identifying the vector space with the representation is that if, for example, a representation decomposes as a direct sum of irreps, then the vector space can be thought of as decomposing in a corresponding way into a direct sum of subspaces. In this sense, the vector space isn't entirely "passive." – joshphysics Oct 21 '14 at 06:57
  • 2
    Also, it's not only physicists that use this terminology, mathematicians abuse the terminology often as well. In fact, if you look on the first page of Fulton and Harris' book on representation theory (a hardcore math book to be sure) you'll find the sentence "When there is little ambiguity about the map $\rho$ (and, we're afraid, even sometimes when there is) we sometimes call $V$ itself a representation of $G$..." – joshphysics Oct 21 '14 at 07:04
  • @joshphysics thanks, I was talking about commutators of Lie algebra representations. Besides the decomposition of vectors, do they reveal any of the "multiplication" structure of the Lie algebra manifested in the commutators? – Lonkar Oct 22 '14 at 15:21
  • @Lonkar Not that I can tell. Since for a given vector space there are in general many representations of many groups or algebras acting on that space, it's not possible to reconstruct group structure directly from the vector space. For this reason, as Fulton and Harris remark, it's ultimately an abuse of terminology that requires context to alleviate. – joshphysics Oct 22 '14 at 16:59
  • Representations cannot, in general, encode all the structure of a group because representations are not, in general, faithful. However, it is fair to say that finding the weights of an SU(3) rep. is the same as "finding the representation", because the structure of SU(3) makes it possible to deduce all the matrices just from the weights. But you are using the group structure to learn about the rep., not the other way around. – Luke Pritchett Mar 05 '16 at 21:09

3 Answers

3

I think your confusion (like mine) is simply over technical English usage. As you rightly state "vectors look like passive elements on which the group matrices act, and do not contain the structure of the group".

To my mind, a representation of a group is a triple $(\mathfrak{G},\,V,\,\rho:\mathfrak{G}\to GL(V))$: the group $\mathfrak{G}$ being represented, the vector space $V$ being acted on, and the homomorphism $\rho:\mathfrak{G}\to GL(V)$ into the group $GL(V)$ of invertible linear transformations of $V$.
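A throwaway sketch of this triple (my own toy example, nothing specific to physics): take $\mathfrak{G}=\mathbb{Z}_3$ acting on $V=\mathbb{R}^2$ by rotations; the whole content of "representation" lives in the homomorphism property $\rho(g\,h)=\rho(g)\,\rho(h)$, not in the vectors:

```python
import numpy as np

# Toy representation of the cyclic group Z_3 on V = R^2:
# rho(n) = rotation by 2*pi*n/3. The group structure lives in rho,
# not in the vectors of V that the matrices act on.
def rho(n):
    theta = 2 * np.pi * (n % 3) / 3
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Homomorphism property: rho(g + h) = rho(g) rho(h) (the group law of Z_3 is addition mod 3)
for g in range(3):
    for h in range(3):
        assert np.allclose(rho((g + h) % 3), rho(g) @ rho(h))
print("rho is a homomorphism Z_3 -> GL(2, R)")
```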

As long as it is clear from the context, it is OK to think of the vector space as the representation, and this seems to be the standard physicist's usage of the word. Indeed, it is a particularly physical point of view: in the standpoint I first cited, you're watching the "system" from afar; in the physicist's viewpoint, you are, like any good experimentalist or careful observer, getting as near as you can to the action (i.e. sitting at the business end of the arrow $\rho$) and looking at what the actors who come out of it (the matrices in $GL(V)$) do through their actions on their playthings (the vectors in $V$). My mind picture, in this context, is literally of someone in a white coat, seated right at the end of the pipe $\rho$ (which for me somehow is always a bronze colour) and carefully writing down their observations on what's happening at the end of the pipe $\rho$ when the studied creatures come out!

In physics, the vectors in $V$ are often quantum states in a quantum state space $V$ (these are the most wonted to me), and we sometimes seek linear unitary transformations on them that are "compatible with" (i.e. homomorphic images of) the group $\mathfrak{G}$ of "physical happenings" (I'm thinking of $\mathfrak{G}$ as the Poincaré group, or $SO(1,\,3)$, or the latter's double cover $SL(2,\,\mathbb{C})$). The vector space contains the physical things (the quantum states), so it's natural to think of them as the "representation".

  • "...the vectors in $V$ are almost always quantum states..." If by "almost always" you mean anything like "almost everywhere" in math, then I think this rather severe hyperbole. What about group representations in classical mechanics, GR, any classical field theory for that matter, and even quantum field theory where the vectors are elements of the target spaces of fields? – joshphysics Oct 21 '14 at 06:41
  • @WetSavannaAnimal thanks. I understand that the vectors are important in their meaning as quantum states but don't understand how they reveal, by themselves, the group structure (for example, the generator matrices can be described by their "multiplication tables", which are the commutators; this way they reveal the group structure). Also, once we know the weights, is there an isomorphism between weight vectors and generators? – Lonkar Oct 22 '14 at 00:47
  • @Lonkar I've got it now: I didn't before (as is likely clear from my answer). I am thinking some more, but my gut feeling is that no, you don't in general have a unique definition of the group from its weight spaces. Your question is a deeper one than terminology (and a very good one), and you should take away the "terminology" tag and make "once we know the weights, is there an isomorphism between weight vectors and generators?" your key question. This may then be a better question for Maths SE – Selene Routley Oct 22 '14 at 01:22
  • @Lonkar I'd almost edit your question for you as I'd like to see a good answer but (1) this would be presumptuous and (2) you are clearly sharper than I about this and probably other matters as well. – Selene Routley Oct 22 '14 at 01:25
  • @WetSavannaAnimal thanks for the kind response. I am not an expert, but I am trying to learn. I changed the title and put an Edit in the question, thanks for the suggestions. I will wait to see more responses – Lonkar Oct 22 '14 at 04:44
2

The vectors (kets) upon which the matrices act should be referred to as the carrier space of the representation. As you said, the matrices are the representation of the abstract generators; it is just lazy talk to refer to the vector space as the representation. A highest-weight vector labels the irreducible representation and tells you the dimension of the carrier space.

It is easy to label an orthogonal set of basis vectors for the carrier space using Gelfand patterns. A Gelfand pattern is an upside-down triangle of numbers, with the weight-vector numbers along the top row and the rest of the integers in the triangle filled in using the betweenness rule. The matrices of the representation may also be calculated from the highest weight, though that is more difficult.
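A minimal sketch of that counting (assuming the standard SU(3) Gelfand-Tsetlin conventions: the top row $(m_{13}, m_{23}, m_{33})$ is the highest weight, and the betweenness rule constrains the rows below):

```python
# Enumerate Gelfand-Tsetlin patterns for SU(3). Betweenness rule:
# m13 >= m12 >= m23 >= m22 >= m33 and m12 >= m11 >= m22.
# The number of patterns equals the dimension of the carrier space.
def gelfand_patterns_su3(m13, m23, m33):
    patterns = []
    for m12 in range(m23, m13 + 1):
        for m22 in range(m33, m23 + 1):
            for m11 in range(m22, m12 + 1):
                patterns.append(((m13, m23, m33), (m12, m22), (m11,)))
    return patterns

print(len(gelfand_patterns_su3(2, 1, 0)))  # 8  (the octet)
print(len(gelfand_patterns_su3(3, 0, 0)))  # 10 (the decuplet)
```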

There is one exception: at least for GL(N), the abstract generators themselves can serve as the carrier-space vectors for one of the irreps. That is when the group acts on the generators by conjugation. Examples are the O(3) generators $\vec{J}$ transforming like a 3-vector under 3x3 matrices, or the SU(3) generators transforming like an octet under 8x8 matrices.
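A sketch of that conjugation action for SU(2), with the generators $T_i=\sigma_i/2$ as the carrier space; the 3x3 matrix R is extracted via the trace inner product $\mathrm{tr}(T_i T_j)=\delta_{ij}/2$:

```python
import numpy as np
from scipy.linalg import expm

# The su(2) generators as the carrier space of the adjoint rep:
# conjugation U T_j U^dagger = sum_k R_kj T_k packages the action
# into a 3x3 matrix R, extracted here via tr(T_i T_j) = delta_ij / 2.
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]
T = [s / 2 for s in sigma]

theta = 0.7
U = expm(-1j * theta * T[2])   # a rotation about the 3-axis

R = np.array([[2 * np.trace(T[k] @ U @ T[j] @ U.conj().T).real
               for j in range(3)] for k in range(3)])

# R is the familiar SO(3) rotation about z (the sign of theta is convention)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
assert np.allclose(R, Rz)
assert np.allclose(R @ R.T, np.eye(3)) and np.isclose(np.linalg.det(R), 1)
print("the generators transform like a 3-vector under conjugation")
```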

Gary Godfrey
  • 3,354
1

No, this is not an ambiguous terminology issue. I suspect seeking an "isomorphism" might be too hidebound... you might as well seek a "functor"! The basic answer is that, yes, possession of the generator matrices T of dimension d×d is basically equivalent to characterization of the weights of the states v in the d-dimensional vector space on which such matrices act, except that the latter normally gets you more directly to what you want to know in QM, by dint of the relevant labels/roots/weights.

Think of SU(2) for simplicity, but you might choose to generalize to SU(3) once the game is evident. To rotate an arbitrary d-dim vector v by an angle θ, you operate on it by v → exp(iθJ)v, as in classical physics. By a change of basis of the 3 generators J to their raising and lowering ladder versions $J_+, J_-$ and $J_0$, and by the marvelous eigenvalue equations their Lie algebra commutators satisfy, you may organize these rotations much more usefully in QM, and also computationally; this is of course how the higher-dimensional matrices in the SU(2) WP-article were found in the first place!

That is, once you have eigenvectors of J·J with eigenvalues j(j+1), up to normalization, you have characterized the dimensionality of the vector v by 2j+1 and its components by the eigenvalues m of $J_0$ on them, while you know how the raising and lowering Js will send entries to their neighboring slots. So writing the states in the $|j,m\rangle$ convention is tantamount to posing them ready for rotations by simple shifts of their m and multiplications by numbers. For small angles θ, this amounts to transitioning to v + iθJv for J any linear combination of these 3 ladder operators.

Conversely, the structure of these operators specifies the Lie algebra of the matrices you started with, uniquely (Cartan). Here, $J^2 |j,m\rangle = j(j + 1) |j,m\rangle$, $J_0 |j,m\rangle = m |j,m\rangle$, and $J_\pm |j,m\rangle = \sqrt{j(j+1)-m(m\pm 1)}\, |j,m\pm 1 \rangle$: acting on these with arbitrary bras $\langle j\,m'|$ on the left yields the matrix elements of the matrices in question, guaranteed to satisfy the SU(2) algebra.
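A minimal sketch of this reconstruction (nothing here beyond the ladder formulas above): build the (2j+1)-dimensional matrices from the $|j,m\rangle$ matrix elements and check they close into $[J_0,J_\pm]=\pm J_\pm$, $[J_+,J_-]=2J_0$:

```python
import numpy as np

# "Weights -> matrices": build the (2j+1)-dim spin-j matrices directly
# from the ladder formulas above, then check the su(2) algebra.
def spin_matrices(j):
    d = int(2 * j + 1)
    m = np.array([j - k for k in range(d)])        # m = j, j-1, ..., -j
    J0 = np.diag(m)
    Jp = np.zeros((d, d))
    for k in range(1, d):                          # <j, m+1| J_+ |j, m>
        mm = m[k]
        Jp[k - 1, k] = np.sqrt(j * (j + 1) - mm * (mm + 1))
    Jm = Jp.T
    return J0, Jp, Jm

for j in (1/2, 1, 3/2, 5):
    J0, Jp, Jm = spin_matrices(j)
    assert np.allclose(J0 @ Jp - Jp @ J0, Jp)       # [J0, J+] = +J+
    assert np.allclose(J0 @ Jm - Jm @ J0, -Jm)      # [J0, J-] = -J-
    assert np.allclose(Jp @ Jm - Jm @ Jp, 2 * J0)   # [J+, J-] = 2 J0
print("matrices reconstructed from the weights satisfy the su(2) algebra")
```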

But this language is simpler than daft matrix multiplication for the purpose of transitioning between states by QM operators, the Wigner-Eckart theorem, etc. The transition is just a linear-algebraic change of language.

For SU(3), there are more such eigenvalues: not just the analog of j (isospin), but also the hypercharge, the eigenvalue of Y = B + S, related to the strangeness quantum number. And, indeed, more ladder operators, V+ or U+ (e.g. U-spin interchanges d and s quarks), move you among components of the vectors in suitable ways: the states are labelled in planar patterns with triangular symmetry, rather than on lines as for plain rotations. Again, each dot on these weight diagrams for the octet, decuplet, etc. corresponds to an entry of the d-dim v, and you know exactly, virtually by inspection, how these are going to respond to an exp(iθT) rotation, by the clever way they were labelled; so, really, yes! It is equivalent to matrix multiplication. The moment you have drawn the downwards-pointing triangular weight diagram for the baryon decuplet, you have implicitly specified the 10x10 generator matrices of SU(3).
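A quick sketch of that last weight pattern (generating the decuplet's triangular weight diagram from symmetric three-quark content, with the conventional $(I_3, Y)$ assignments for u, d, s assumed):

```python
import numpy as np
from itertools import combinations_with_replacement

# The decuplet weights as (I_3, Y) of symmetric 3-quark states.
# Assumed quark weights: u = (1/2, 1/3), d = (-1/2, 1/3), s = (0, -2/3).
quark = {'u': np.array([0.5, 1/3]),
         'd': np.array([-0.5, 1/3]),
         's': np.array([0.0, -2/3])}

weights = sorted((tuple(sum(quark[q] for q in combo)), ''.join(combo))
                 for combo in combinations_with_replacement('uds', 3))
for (i3, y), content in weights:
    print(f"I3 = {i3:+.1f}, Y = {y:+.2f}   ({content})")
print(len(weights), "weights: the triangular decuplet pattern")
```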

The preponderance in physics of this second language over just writing monster d×d matrices tells you something about its compactness and utility.

Cosmas Zachos
  • 62,595