
Consider for simplicity only real $n$-by-$n$ Hamiltonian matrices, $H_n$. Can the unitary transformations, $S$, diagonalizing such Hamiltonians, $$ S^T H S = \Lambda, $$ always be represented as rotations? We clearly can do this for $n=2$: $$ S=\begin{bmatrix} \cos\phi & \sin\phi \\ -\sin\phi &\cos\phi\end{bmatrix}, $$ and for $n=3$ (since the corresponding rotation matrices in 2d and 3d are the general orthogonal matrices). However, I am not sure how to describe a rotation in an arbitrary number of dimensions. Also, what exactly is a rotation in $n$ dimensions? Can we always represent it in terms of an infinitesimal generator as a rotation about an axis, $$ R=e^{\theta K}, $$ where $K^T=-K$?
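For concreteness, the $n=2$ case can be checked numerically; here is a minimal sketch (assuming NumPy; the test matrix $H$ is an arbitrary illustrative choice) that finds the angle $\phi$ zeroing the off-diagonal element of $S^T H S$:

```python
import numpy as np

# An arbitrary real symmetric 2x2 "Hamiltonian" (illustrative values only).
H = np.array([[1.0, 2.0],
              [2.0, -1.0]])

# With the convention S = [[cos, sin], [-sin, cos]], the off-diagonal
# element of S^T H S is (H00 - H11)/2 * sin(2*phi) + H01 * cos(2*phi),
# which vanishes when tan(2*phi) = 2*H01 / (H11 - H00).
phi = 0.5 * np.arctan2(2.0 * H[0, 1], H[1, 1] - H[0, 0])
S = np.array([[np.cos(phi), np.sin(phi)],
              [-np.sin(phi), np.cos(phi)]])

print(np.round(S.T @ H @ S, 12))  # diagonal: the eigenvalues of H
```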

Remark The answers might be obvious to people with a strong background in group theory. However, my background is not in high-energy physics, and I have never studied continuous groups. Thus, I would appreciate down-to-earth explanations, or references where I could find them.

Qmechanic
  • 201,751
Roger V.
  • 58,522
  • Because your $H$ is an observable, the restriction that it be Hermitian reduces to it being a symmetric real matrix. Now, it is a fundamental result of finite-dimensional linear algebra that every real, symmetric endomorphism can be diagonalised by means of orthogonal matrices, $$ \Lambda = R^T H R, \qquad R\in O\left(n\right). $$ So my guess would be no: you cannot guarantee that this $R$ is part of $SO\left( n \right)$, the component connected to the identity. Give me some time to find a counterexample (or maybe someone abler than me will), and maybe we'll have a satisfactory answer. – joigus Mar 08 '21 at 11:45
  • @joigus Do I understand you correctly: $R\in O(n)$ is already by definition a rotation, but it cannot always be represented as a composition of infinitesimal rotations and/or a rotation about a single axis? – Roger V. Mar 08 '21 at 12:42
  • An element of $O\left( n \right)$ is not necessarily an element of $SO\left( n \right)$; the two differ by the sign of the determinant. Look at point 1 in @Qmechanic's answer; I think it clinches the case. It's true that you can always find this $O\left(3 \right)$ element, but a simple re-arrangement of the basis order gives you what OP wanted. – joigus Mar 08 '21 at 13:27

2 Answers

  1. It is well-known that real symmetric $n\times n$ matrices are diagonalizable by orthogonal matrices $\in O(n)$. In fact, they are diagonalizable by special orthogonal matrices (=rotations) $\in SO(n)$, because we can always include/remove a reflection with negative determinant; see the numerical sketch after this list.

  2. The Lie group $SO(n)$ of $n$-dimensional rotations is classified in e.g. this & this Phys.SE post.

  3. OP's last question is answered affirmatively by the fact that the exponential map $\exp: so(n)\to SO(n)$ is surjective, cf. e.g. this Math.SE post.
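A quick numerical sanity check of points 1 and 3 (a sketch assuming NumPy/SciPy; the symmetric test matrix is random):

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = A + A.T  # an arbitrary real symmetric matrix

# Point 1: eigh returns an orthogonal eigenvector matrix R, R^T H R diagonal.
eigvals, R = np.linalg.eigh(H)
if np.linalg.det(R) < 0:
    R[:, 0] *= -1.0  # compose with a reflection to land in SO(n)

assert np.allclose(R.T @ R, np.eye(4))
assert np.allclose(R.T @ H @ R, np.diag(eigvals))
assert np.isclose(np.linalg.det(R), 1.0)  # R is a rotation, R in SO(4)

# Point 3: exp: so(n) -> SO(n) is surjective, so R has an antisymmetric
# logarithm K with R = exp(K). (For generic R, logm returns a real matrix.)
K = logm(R)
assert np.allclose(K, -K.T, atol=1e-8)  # K is antisymmetric
assert np.allclose(expm(K), R)
```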

Qmechanic
  • 201,751
  • Say your $H$ is made diagonal by $R^{T}HR$, but $\det R=-1$, with $$ \Lambda=\left(\begin{array}{ccc} E_{1} & 0 & 0\\ 0 & E_{2} & 0\\ 0 & 0 & E_{3} \end{array}\right). $$ You may not care what the ordering is; since the component disconnected from the identity can be obtained as $O^{-}\left(3\right)=P\times SO\left(3\right)$, with $P$ any inversion, you can set $\Lambda$ to $$ \Lambda=\left(\begin{array}{ccc} E_{2} & 0 & 0\\ 0 & E_{1} & 0\\ 0 & 0 & E_{3} \end{array}\right). $$ Now $R \in SO\left(3\right)$. But there's a price to pay, as the ordering changes. – joigus Mar 08 '21 at 14:11

Given a vector space $V$ equipped with an inner product $\langle \cdot,\cdot \rangle:V\times V \rightarrow \mathbb R$ and an ordered basis $\{\hat e_n\}$, a rotation is defined as a linear transformation $R:V\rightarrow V$ which preserves the inner product (i.e. $\langle R(x),R(y)\rangle = \langle x,y\rangle$) and the orientation of the space. Such transformations are represented as matrices $r$ such that $r^\mathrm T r = \mathbb I$ and $\mathrm{det}(r) = +1$; the set of such matrices is called the special orthogonal group $\mathrm{SO}(n)$.

If you would like to remove the requirement that the transformation preserve orientation, then you are talking about the orthogonal group $\mathrm{O}(n)$ which preserves inner products. This group includes inversions as well as proper rotations.

More concretely, we can define an orthogonal transformation as a map which takes an orthonormal basis $\{\hat e_n\}$ to another orthonormal basis $\{\hat g_n\}$, and a rotation as an orthogonal transformation which also preserves the orientation of the basis (i.e. an orthogonal transformation with determinant $+1$). It can be shown$^\ddagger$ that any orthogonal transformation can be written as a rotation $r$ composed with an odd permutation matrix $i$; in other words, one can get from any orthonormal basis $\{\hat e_n\}$ to any other orthonormal basis $\{\hat g_n\}$ by performing a rotation and then, if necessary, an odd permutation of the result.

A symmetric $n\times n$ matrix $M$ can be diagonalized by an orthogonal transformation which maps an arbitrary basis $\{\hat e_n\}$ to an eigenbasis $\{\hat g_n\}$ of $M$. That is, one finds the transformation $t$ which maps $\hat e_n$ to $\hat g_n$; from there,

$$(t^\mathrm T M t) \hat e_n = (t^\mathrm TM)\hat g_n = t^\mathrm T \lambda_n \hat g_n = \lambda_n t^\mathrm T \hat g_n = \lambda_n \hat e_n$$

and thus $t^\mathrm T M t$ is diagonal in the basis $\{\hat e_n\}$. As argued above, such a transformation is orthogonal (it is just a map from one orthonormal basis to another); but we can restrict our attention to special orthogonal transformations by noting that if $t$ has determinant $-1$, then we can compose it with an inversion $i$: the transformation $i\circ t$ then also diagonalizes $M$ (an exercise for the reader; see the sketch below), and has determinant $+1$.

As a result, for any $n\times n$ symmetric matrix $M$ there exists a special orthogonal transformation (i.e. a rotation) which diagonalizes it.
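The "exercise for the reader" is easy to check numerically; in this sketch (assuming NumPy; the symmetric test matrix is random), $i$ is taken to be the odd permutation swapping the first two basis vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
M = A + A.T  # an arbitrary real symmetric matrix

lam, t = np.linalg.eigh(M)  # columns of t form the eigenbasis {g_n}
assert np.allclose(t.T @ M @ t, np.diag(lam))

# i: odd permutation (swap of the first two basis vectors), det(i) = -1.
i = np.eye(3)[:, [1, 0, 2]]

# i o t still diagonalizes M; only the eigenvalue ordering changes,
# while the determinant flips sign.
ti = t @ i
assert np.allclose(ti.T @ M @ ti, np.diag(lam[[1, 0, 2]]))
assert np.isclose(np.linalg.det(ti), -np.linalg.det(t))
```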


Can we always represent it in terms of an infinitesimal generator as a rotation about an axis $R=e^{\theta K}$, where $K^\mathrm T = -K$?

We can always write a rotation in that form, as per Qmechanic's answer. The proof is not trivial, but for intuition one can note that any rotation can be broken up into $N$ smaller rotations, and as $N\rightarrow \infty$ the rotations become infinitesimally close to the identity transformation $R(\theta/N) \approx \mathbb I + \frac{\theta}{N} K$ for some antisymmetric $K$.
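This limit is easy to see numerically; a sketch (assuming NumPy/SciPy, with an arbitrary antisymmetric $K$):

```python
import numpy as np
from scipy.linalg import expm

# An arbitrary antisymmetric generator and a finite rotation angle.
K = np.array([[0.0, -1.0, 0.5],
              [1.0, 0.0, -2.0],
              [-0.5, 2.0, 0.0]])
theta = 0.7

R_exact = expm(theta * K)
for N in (10, 100, 1000):
    step = np.eye(3) + (theta / N) * K      # one near-identity step
    R_N = np.linalg.matrix_power(step, N)   # compose N small steps
    print(N, np.linalg.norm(R_N - R_exact))  # error shrinks roughly like 1/N
```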

However, note that rotations are generically not about any single axis, in the sense that e.g. a rotation in $d=4$ dimensions may leave two axes fixed (so specifying a single axis and an angle is not sufficient to define the rotation). Instead, one should think of a rotation as occurring in a plane, see e.g. this wiki article.
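To make this concrete, here is a sketch (assuming NumPy/SciPy) of a $d=4$ rotation in the $e_1$-$e_2$ plane, which fixes two axes, and of a "double rotation" that fixes no axis at all:

```python
import numpy as np
from scipy.linalg import expm

# Generator rotating in the e1-e2 plane of R^4: it fixes e3 and e4,
# so a whole plane (not a single axis) is left untouched.
K12 = np.zeros((4, 4))
K12[0, 1], K12[1, 0] = -1.0, 1.0
R = expm(0.5 * K12)
assert np.allclose(R @ np.eye(4)[:, 2], np.eye(4)[:, 2])  # e3 fixed
assert np.allclose(R @ np.eye(4)[:, 3], np.eye(4)[:, 3])  # e4 fixed

# A "double rotation": simultaneous rotations in the e1-e2 and e3-e4
# planes with different angles leave no nonzero vector fixed.
K34 = np.zeros((4, 4))
K34[2, 3], K34[3, 2] = -1.0, 1.0
R2 = expm(0.5 * K12 + 0.9 * K34)
# 1 is not an eigenvalue of R2, so no axis is fixed:
assert not np.any(np.isclose(np.linalg.eigvals(R2), 1.0))
```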

Another way to understand this is that a rotation can be defined by writing down an antisymmetric matrix $K$ and then letting $R=e^K$; this exponential map is surjective, as mentioned above. The antisymmetric matrices constitute a vector space (the Lie algebra corresponding to the Lie group $\mathrm{SO}(n)$) of dimension $n(n-1)/2$. In 3 dimensions, $3(3-1)/2=3$, and so the space of antisymmetric matrices is isomorphic to the underlying vector space itself. This means that we can associate each antisymmetric matrix to a vector in $\mathbb R^3$, whose magnitude defines an angle and whose direction defines an axis of rotation. But clearly this construction is restricted to the very special case $n=3$; in any other dimension, we cannot specify a rotation simply via an axis and an angle, because the former requires a specification of $n(n-1)/2$ parameters and the latter provides only $n$ of them.
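For $n=3$ this correspondence is the familiar axis-angle description; a sketch (assuming NumPy/SciPy, with an arbitrary axis and angle):

```python
import numpy as np
from scipy.linalg import expm

def hat(v):
    """Antisymmetric matrix of v in R^3, so that hat(v) @ x = cross(v, x)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

axis = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)  # unit vector: the axis
theta = 0.3                                      # magnitude: the angle

R = expm(theta * hat(axis))          # rotation about `axis` by `theta`
assert np.isclose(np.linalg.det(R), 1.0)
assert np.allclose(R @ axis, axis)   # the axis is left invariant
```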


As a final note, if we move to complex vector spaces with conjugate-linear inner products, then replacing symmetric $\leftrightarrow$ Hermitian and orthogonal $\leftrightarrow$ unitary leads to the immediate generalization from $\mathbb R$ to $\mathbb C$.
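The complex analogue can be checked the same way (a sketch assuming NumPy; the Hermitian test matrix is random):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = A + A.conj().T  # an arbitrary Hermitian matrix

eigvals, U = np.linalg.eigh(H)  # eigenvalues are real, U is unitary
assert np.allclose(U.conj().T @ U, np.eye(3))
assert np.allclose(U.conj().T @ H @ U, np.diag(eigvals))
```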


$^\ddagger$If you're familiar with wedge products, then this can be reformulated as follows: an orthogonal transformation $t:\hat e_n \mapsto \hat g_n$ is such that $$\hat g_1 \wedge \ldots \wedge \hat g_n = C \hat e_1 \wedge \ldots \wedge \hat e_n$$ with $C=\pm 1$. $C=+1$ corresponds to an orientation-preserving orthogonal transformation, i.e. a rotation. From this standpoint, it is obvious that any orthogonal transformation can be expressed as a rotation composed with an inversion; if $C=-1$, we can simply add an odd permutation $\hat g_1 \leftrightarrow \hat g_2$ to make $C=+1$.

J. Murray
  • 69,036
  • Thank you. I have also read elsewhere about rotation in a plane in 4D, which was one of my underlying doubts. Could you recommend a good book? Does this really go in the direction of Lie groups, or is it simply about a deeper understanding of Hilbert spaces? – Roger V. Mar 08 '21 at 14:28
  • @Vadim I've updated my answer with some additional insight regarding the axis/plane of rotation issue in higher dimensions. The correspondence between $SO(n)$ and the set of $n\times n$ antisymmetric matrices should be understood as the correspondence between a Lie group and its Lie algebra; for a good reference on the subject, see e.g. these notes by Brian Hall. The facts that orthogonal/unitary transformations simply constitute a shuffling of the basis and that symmetric/Hermitian matrices can be diagonalized by such transformations are crucial[...] – J. Murray Mar 08 '21 at 15:19
  • [...] for understanding Hilbert spaces, and can be found in any text on Linear Algebra. – J. Murray Mar 08 '21 at 15:20
  • I think it is really the knowledge of Lie groups that I lack. Thanks again. – Roger V. Mar 08 '21 at 15:24