We know that rotations are performed via real orthogonal matrices, $O^{T}=O^{-1}$; the proper rotations are those with unit determinant. We can write $O$ as

$$O = \exp(A),$$

where $A^{T}=-A$. In three space dimensions a real antisymmetric matrix has three independent entries, so we can write

$$A_{ab} = \epsilon_{abc}\theta_c.$$

This has the form $\vec{\theta}\cdot\vec{M}$, where the matrix elements of the generators $M_c$ are given by

$$(M_c)_{ab}=\epsilon_{abc}.$$
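As a quick numerical illustration of the setup so far (a minimal numpy sketch; the series-based `expm` helper and the parameter values $\vec\theta$ are my own choices, not part of the derivation):

```python
import numpy as np

def expm(A, terms=40):
    """Matrix exponential via truncated power series (adequate for small matrices)."""
    out, term = np.eye(A.shape[0]), np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

# Levi-Civita symbol eps[a, b, c]
eps = np.zeros((3, 3, 3))
for a, b, c in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[a, b, c], eps[a, c, b] = 1.0, -1.0

# Generators (M_c)_{ab} = eps_{abc}
M = [eps[:, :, c] for c in range(3)]

theta = np.array([0.3, -0.7, 1.1])          # arbitrary rotation parameters
A = sum(t * Mc for t, Mc in zip(theta, M))  # A_{ab} = eps_{abc} theta_c

O = expm(A)
assert np.allclose(A.T, -A)                 # A is antisymmetric
assert np.allclose(O.T @ O, np.eye(3))      # O is orthogonal
assert np.isclose(np.linalg.det(O), 1.0)    # and a proper rotation
```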

Let us remember:

$$[t_a,t_b]= i\,\,f_{abc}t_c.$$

where the $f_{abc}$ are the structure constants. We know that the $f$'s must satisfy the Jacobi identity,

$$f_{bcm}f_{amh} + f_{cam}f_{bmh} + f_{abm}f_{cmh} = 0.$$
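For the rotation-group example above, where $f_{abc}=\epsilon_{abc}$, the Jacobi identity can be checked numerically; a minimal numpy sketch (the index labels in the `einsum` strings are my own):

```python
import numpy as np

# Structure constants of so(3): f_{abc} = eps_{abc}
f = np.zeros((3, 3, 3))
for a, b, c in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    f[a, b, c], f[a, c, b] = 1.0, -1.0

# Jacobi identity: f_{bcm} f_{amh} + f_{cam} f_{bmh} + f_{abm} f_{cmh} = 0
jac = (np.einsum('bcm,amh->abch', f, f)
       + np.einsum('cam,bmh->abch', f, f)
       + np.einsum('abm,cmh->abch', f, f))
assert np.allclose(jac, 0.0)   # holds for every choice of free indices a, b, c, h
```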

We also know that:

$$t_a\,t_b - t_b\,t_a - i\,f_{abc}t_c = 0.$$

The question is: How can I find $t_a$ individually in terms of $f$'s? Do you have a suggestion?

1 Answer

In general you can't find a unique Lie algebra basis $X_j$ from the structure constants because there are in general many Lie groups with the same Lie algebra: $SO(3)$ and $SU(2)$, for example, have the same Lie algebra and hence the same structure constants. They do however exponentiate differently to define different global topologies, as I discuss in detail here.

Furthermore, even within the same Lie group, if the $X_j$ all undergo the same similarity transformation $X_j\mapsto \sigma\,X_j\,\sigma^{-1}$ then the structure constants are unchanged. This is a literal matrix similarity transformation in a matrix Lie algebra and, by Ado's theorem, every Lie algebra can be faithfully represented as a matrix Lie algebra, although this representation is not unique, as the $SO(3)$, $SU(2)$ example above shows.

Although your problem looks as though it should be easy (one feels that one should be able to read the Lie algebra members straight off the structure constants), the general solution to your problem is surprisingly tricky and is in fact the subject matter of Ado's theorem. The best discussion of this I have seen is on Terence Tao's blog here, but it is not constructive. I can't recall whether there is a constructive proof in Bourbaki but in any case the only constructive proof I have ever seen left me with severe bruising around the head and utterly defeated me, so I'm not qualified to talk about it.

However, you can get cunning in your $O(N)$ case by taking heed of the properties of the adjoint representation, on which my take is discussed here. $O(N)$ is, as far as the Lie algebra is concerned, the same as $SO(N)$: when we relax the constraint $\det U = 1$ in $SO(N)$, there are only two discrete possibilities for the determinant, because $U^T U = 1$ with $U$ real forces $\det U = \pm 1$. So $O(N)$ is simply two disjoint copies of $SO(N)$, and the identity-connected component of $O(N)$ is $SO(N)$. This is unlike the unitary group, where $\det U = e^{i\,\phi},\,\phi\in\mathbb{R}$, and $U(N) \cong (SU(N)\times U(1))/\mathbb{Z}_N$.

Now, $SO(N)$ is simple (in the sense of having no nontrivial connected normal Lie subgroups) aside from the "fiddly case" of $SO(4)$, whose universal cover is $\tilde{SO}(3) \times \tilde{SO}(3)$ with $\tilde{SO}(3)\cong SU(2)$; in that case $SO(4)$ is merely semisimple. In particular, $SO(N)$ for $N\geq 3$ has no continuous centre. Since $\ker(\mathrm{Ad})$ is the centre of a group, the big-A adjoint representation annihilates the centre. Therefore, in such a lucky case where there is no continuous centre, the image of $SO(N)$ under the big-A adjoint representation is either isomorphic to $SO(N)$ itself, or is the quotient group $SO(N)/\mathbb{Z}_m$, where a discrete centre has been modded out.

All of this is a long winded way of saying that in your lucky case, the Lie algebra of the group's image under the adjoint representation is the same as that of the group.

So now let $[X_j,\,X_k] = i\,f_{j,\,k,\,\ell}\,X_\ell$ in your notation. This tells you that when you choose a basis for the Lie algebra aligned to the $X_j$, we have:

$$\mathrm{ad}(X_j) X_k = i\,\sum\limits_{\ell=1}^N f_{j,\,k,\,\ell}\,X_\ell$$

or

$$\mathrm{ad}(X_j)_{\ell,\,k} = i\, f_{j,\,k,\,\ell}$$

Then, the "braiding" relationship $\mathrm{Ad}(e^{X\,t}) = \exp(\mathrm{ad}(X)\,t)$ tells you that these matrices are a basis for the Lie algebra $\mathrm{Lie}(\mathrm{Ad}(O(N)))$ of $\mathrm{Ad}(O(N))$. As we have seen,

$$\mathrm{Lie}(\mathrm{Ad}(O(N))) = \mathrm{Lie}(O(N))$$

so you can take the $\mathrm{ad}(X_j)$ as your "generators", i.e. in your notation:

$$(t_a)_{b,\,c} = \mathrm{ad}(X_a)_{b,\,c} = i\, f_{a,\,c,\,b}$$

will give you a faithful matrix representation of your generators, of size $\dim\mathfrak{g}\times\dim\mathfrak{g}$ (for $SO(N)$ that is $\frac{N(N-1)}{2}\times\frac{N(N-1)}{2}$; for $SO(3)$ this is again $3\times 3$). Note that any similarity transformation can be applied to these generators and you still have a perfectly good set.
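As a sanity check for the $SO(3)$ case of the question, where $f_{abc}=\epsilon_{abc}$, the prescription $(t_a)_{b,\,c} = i\,f_{a,\,c,\,b}$ can be verified numerically; a minimal numpy sketch:

```python
import numpy as np

# so(3) structure constants: f_{abc} = eps_{abc}
f = np.zeros((3, 3, 3))
for a, b, c in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    f[a, b, c], f[a, c, b] = 1.0, -1.0

# Adjoint generators: (t_a)_{b,c} = i f_{a,c,b}
t = 1j * np.transpose(f, (0, 2, 1))   # t[a][b, c] = i f[a, c, b]

# Check the commutation relations [t_a, t_b] = i f_{abc} t_c
for a in range(3):
    for b in range(3):
        comm = t[a] @ t[b] - t[b] @ t[a]
        rhs = 1j * np.einsum('c,cij->ij', f[a, b], t)
        assert np.allclose(comm, rhs)
```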

So this answers your question, but a lingering thought is: where does the Jacobi identity come into it, and why apparently haven't we used it? The answer is that it has been used: the Jacobi identity is nothing more than a disguised version of the following general fact: the Lie group homomorphism $\mathrm{Ad}: G \to \mathrm{Aut}(\mathfrak{g})$, from the group to the automorphisms of its Lie algebra, induces a homomorphism $\mathrm{ad}$ of Lie algebras that respects Lie brackets:

$$\mathrm{ad}([X,\,Y]) = \mathrm{ad}(X)\,\mathrm{ad}(Y)-\mathrm{ad}(Y)\,\mathrm{ad}(X) = [\mathrm{ad}(X),\,\mathrm{ad}(Y)]$$

and this last relationship is in fact equivalent to the Jacobi identity.
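This bracket-preservation property can likewise be checked numerically for the $\epsilon_{abc}$ structure constants; a minimal numpy sketch with arbitrary real coefficient vectors $x$, $y$ (my own choices) for $X = x_a X_a$ and $Y = y_a X_a$:

```python
import numpy as np

rng = np.random.default_rng(0)

# so(3) structure constants: f_{abc} = eps_{abc}
f = np.zeros((3, 3, 3))
for a, b, c in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    f[a, b, c], f[a, c, b] = 1.0, -1.0

# ad of the basis elements: (ad X_a)_{h,b} = i f_{a,b,h}
A = 1j * np.transpose(f, (0, 2, 1))

# ad is linear, so ad(X) = x_a ad(X_a) for X = x_a X_a, and likewise for Y
x, y = rng.standard_normal(3), rng.standard_normal(3)
adX = np.einsum('a,aij->ij', x, A)
adY = np.einsum('a,aij->ij', y, A)

# [X, Y] = i f_{abc} x_a y_b X_c, hence ad([X, Y]) = i f_{abc} x_a y_b ad(X_c)
adXY = 1j * np.einsum('abc,a,b,cij->ij', f, x, y, A)

# ad([X, Y]) = [ad(X), ad(Y)]
assert np.allclose(adXY, adX @ adY - adY @ adX)
```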