5

I am trying to connect the concepts I learned from special relativity to those of general relativity. Take a look at this example from Wikipedia. They find a transformation matrix from the contravariant components of a vector to the covariant components.

Now let's move to general relativity. I know that in flat space, the metric tensor is just the Minkowski metric $\eta_{\mu\nu}$, and I know that in order to change a vector to a covector, you simply contract the metric with the vector.

But if I were to take a vector $V^{\mu}$ and lower the index to a covector $V_{\mu}$ in flat space, it most certainly would not be the complicated change of basis matrix shown in the example. Am I missing something here? When you lower an index, are you finding an entirely different entity? Or are you finding the covariant components of the same vector?

I hope this makes sense.

Qmechanic
user41178
  • You have two different answers because there really are two ways of setting the formalism up. Alfred's is the modern one. I believe Steil's way is still used in engineering classes, but it's clunkier when you have complicated metrics, like in GR. – knzhou Oct 21 '16 at 23:10
  • @knzhou All my experience on the topic comes from GR. What do you mean by "clunkier"? I mean, Alfred Centauri gave one correct formula, and I gave another one and a half. The question may remain how to put them into the right context, yes, but I think mathematically there should be one sound theory/formalism behind it. (Or not?) – N0va Oct 21 '16 at 23:40
  • @M.J.Steil All my experience comes from GR, too. The most popular current books (Carroll, Wald, MTW) all distinguish vectors and covectors as different objects. I've only ever seen the two equated by engineers. – knzhou Oct 21 '16 at 23:47
  • I say "clunky" because saying a vector has "covariant/contravariant components" totally wipes out the geometric interpretations of vectors and covectors, shown in Alfred's answer; it makes it harder to see why a covector is a linear function from vectors to $\mathbb{R}$, for instance. – knzhou Oct 21 '16 at 23:47
  • @knzhou I looked at the first sections of Wald and I now get the point made by you and Alfred Centauri. There is a difference one should be aware of. I never looked into details in those early sections in those modern books and I was taught GR with lecture notes, where the words "dual" or "cotangent" do not occur. – N0va Oct 22 '16 at 00:24

3 Answers

10

You're dealing with different geometric objects: Tangent vectors, which can be realized as equivalence classes of curves, and cotangent vectors, which can be realized as equivalence classes of real-valued functions (think differentials).

There's a natural linear pairing operation between these objects: Compose a curve and a function, and you get a map $\mathbb R\to\mathbb R$. Take its derivative at the point in question, et voilà. This pairing operation allows us to consider the spaces as 'dual', and in particular to identify the cotangent space with the space of linear functionals on the tangent space.

Given a coordinate system on a manifold, the coordinate lines are curves, yielding a basis of the tangent space, whereas the components of the coordinate chart are functions, yielding a basis of the cotangent space. It's easy to show that these bases are algebraically dual, i.e. their pairing yields the Kronecker delta.

On (pseudo-)Riemannian manifolds, there's additionally a metric tensor $g$, a non-degenerate bilinear form. This tensor induces an isomorphism $g^\flat:v\mapsto g(v,\cdot)$ from the tangent to the cotangent space ('lowering the index'), with an inverse map $g^\sharp$ ('raising the index').

The map $g^\sharp$ can be used to pull back our basis of the cotangent space onto the tangent space, yielding the reciprocal basis. The components of a vector $v$ relative to the reciprocal basis of the tangent space are the same as the components of the covector $g^\flat v$ relative to the dual basis of the cotangent space. This makes it possible to conflate vectors and covectors, but that's considered a mostly bad idea nowadays.
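To illustrate the two maps, here is a minimal numerical sketch (a hypothetical 2D metric and made-up components, using numpy; none of these numbers come from the question). The metric matrix is deliberately non-diagonal, so lowering an index visibly mixes components:

```python
import numpy as np

# Hypothetical metric for a skewed 2D basis (unit basis vectors at 60 degrees):
# g_{mu nu} = e_mu . e_nu
g = np.array([[1.0, 0.5],
              [0.5, 1.0]])
g_inv = np.linalg.inv(g)           # g^{mu nu}, used for 'raising'

V_up = np.array([2.0, 3.0])        # contravariant components V^mu (made up)

# g-flat, 'lowering the index': V_mu = g_{mu nu} V^nu
V_down = g @ V_up                  # -> [3.5, 4.0]

# g-sharp, 'raising the index', inverts the map exactly
assert np.allclose(g_inv @ V_down, V_up)
```

The numbers in `V_down` are simultaneously the components of the covector $g^\flat v$ relative to the dual basis and the components of $v$ itself relative to the reciprocal basis, which is exactly the conflation described above.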

Having said all that, now on to your actual question:

But if I were to take a vector $V^{\mu}$ and lower the index to a covector $V_{\mu}$ in flat space, it most certainly would not be the complicated change of basis matrix shown in the example. Am I missing something here?

The Minkowski metric is that 'complicated change of basis matrix' - it's just that you're dealing with an orthonormal basis, which makes it simple.
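Concretely, in an orthonormal basis the 'matrix' doing the lowering is just $\eta_{\mu\nu}$, which merely flips the sign of the time component. A tiny sketch (made-up components; the $(-,+,+,+)$ signature is a convention choice for illustration):

```python
import numpy as np

# Minkowski metric in an orthonormal basis, signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

V_up = np.array([1.0, 2.0, 3.0, 4.0])   # V^mu (hypothetical numbers)
V_down = eta @ V_up                      # V_mu = eta_{mu nu} V^nu

print(V_down)   # [-1.  2.  3.  4.]  -- only the time component changes sign
```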

Christoph
  • silly me, originally, I had $\flat$ and $\sharp$ switched; should be ok now... – Christoph Oct 22 '16 at 02:12
  • Oh! I think I get it! So I should get the identity matrix (since it is 2D space) on that change-of-basis matrix shown in the example if I convert from the coordinate basis to an orthonormal basis. How would I go about doing this? (This would GREATLY help me understand). I remember reading about this in my GR book by Sean Carroll, but I am still very ignorant in the whole thing. – user41178 Oct 24 '16 at 22:26
  • @user41178: if you're just looking for an orthonormal basis, there's the Gram-Schmidt process; if you're looking for an orthonormal coordinate basis, you have to find Riemannian normal coordinates – Christoph Oct 24 '16 at 23:35
7

When you lower an index, are you finding an entirely different entity?

Yes, it's a different entity. $V^\mu$ are the components of the vector $\vec{V}$ while $V_\mu$ are the components of the one-form $\tilde{V}$ dual to $\vec{V}$ with the fundamental relationship

$$\langle\tilde{V}, \vec{V}\rangle = V_\mu V^\mu = g_{\mu \nu}V^\nu V^\mu= V^2$$

In summary $\vec{V}$ and $\tilde{V}$ are not the same entity since they belong to different vector spaces but they are related via the metric.
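A quick numerical check of that pairing (made-up components, flat-space metric with signature $(-,+,+,+)$, using numpy):

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])     # g_{mu nu} in flat space
V_up = np.array([5.0, 1.0, 2.0, 3.0])    # V^mu (hypothetical components)
V_down = eta @ V_up                       # V_mu, components of the one-form

# <V~, V> = V_mu V^mu = g_{mu nu} V^nu V^mu = V^2
V_squared = V_down @ V_up
print(V_squared)   # -11.0  (= -25 + 1 + 4 + 9)
```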

Update: To emphasize that vectors and one-forms are different geometric objects, consider the following image and caption from the Wikipedia article "One-form"


Linear functionals (1-forms) $\alpha$, $\beta$ and their sum $\sigma$, and vectors $\mathbf{u}, \mathbf{v}, \mathbf{w}$, in 3D Euclidean space. The number of (1-form) hyperplanes intersected by a vector equals the inner product.

  • So then, what is all that business in SR about a vector having contravariant and covariant components?? Is it just a matter of the same nomenclature?

    The two different components come from the fact that in a skewed coordinate system, there are two possible ways of representing a vector. Dropping a perpendicular to the axes, or dropping a parallel to the axes, both of these converge to be the same in unskewed space. I don't get why things are so different in GR.

    – user41178 Oct 24 '16 at 22:09
5

I would not see it the way @Alfred Centauri has described it. That might be me misunderstanding the answer, or not understanding the mathematical meaning of "different entities" here, but I will come back to it after giving my take on the topic.

There is a physical vector $\mathbf{V}$ and one can express this vector with respect to the co- or contravariant basis: $$\mathbf{V}=V_\mu\mathbf{e}^\mu=V^\mu\mathbf{e}_\mu.$$

$\{ \mathbf{e}_\mu \}$ and $\{ \mathbf{e}^\mu \}$ are just different bases, which are related by $\mathbf{e}_\mu\cdot\mathbf{e}^\nu=\delta_\mu^{~~\nu}$. The reciprocal basis is not independent of $\{ \mathbf{e}_\mu \}$, and neither are the resulting components, since they are related by $V_\mu=g_{\mu\nu}V^\nu$. Let me cite the Wikipedia page the OP linked on that point:

In a vector space $V$ over a field $K$ with a bilinear form $g : V × V → K$ (which may be referred to as the metric tensor), there is little distinction between covariant and contravariant vectors, because the bilinear form allows covectors to be identified with vectors. That is, a vector $v$ uniquely determines a covector $\alpha$ via $$\alpha (w)=g(v,w)$$ for all vectors w. Conversely, each covector $\alpha$ determines a unique vector $v$ by this equation. Because of this identification of vectors with covectors, one may speak of the covariant components or contravariant components of a vector, that is, they are just representations of the same vector using reciprocal bases.
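The identification described in this quote can be sketched numerically (a hypothetical non-degenerate bilinear form and made-up vectors, using numpy):

```python
import numpy as np

# A hypothetical non-degenerate (symmetric) bilinear form g on a 2D space
g = np.array([[2.0, 0.3],
              [0.3, 1.0]])
v = np.array([1.0, -2.0])

# The covector alpha determined by v: alpha(w) = g(v, w) for all w
alpha = g @ v                              # components in the dual basis

w = np.array([4.0, 5.0])
assert np.isclose(alpha @ w, v @ g @ w)    # alpha(w) agrees with g(v, w)

# Conversely, alpha determines v uniquely because g is invertible
assert np.allclose(np.linalg.inv(g) @ alpha, v)
```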

I agree with @Alfred Centauri that co- and contravariant vectors and components are not the same, but I am not sure about calling them different entities. This might be my mistake, because I do not really know what to make of "entities" in a mathematical context, but to me it sounds like too big a difference between two so closely related objects.

EDIT: after some points made by @knzhou in the comments, and after some additional reading in a modern textbook on GR (Wald), which differs a bit from the "old school" GR lecture notes I was taught from:

I think the modern point of view is (as @Alfred Centauri pointed out) to really distinguish between vectors (contravariant) and dual vectors (cotangent, covariant). The equation and points I made above do not distinguish between vectors and dual vectors, and I chose an (arbitrary) basis/metric to make my point. The quote actually describes the "intimate" relation between both objects, but at a fundamental, basis/metric-independent level they are mathematically and geometrically different. There is a relation between them, but they are different objects/different entities.

But if one introduces a basis/metric one can use it to

... establish a one-to-one correspondence between vectors and dual vectors. Indeed, given a metric $g$ we could use this correspondence to entirely circumvent the necessity of introducing dual vectors. Normally this is done and accounts for why the concept of dual vectors is not more familiar to most physicists. However, in general relativity we shall be solving for the metric of the spacetime; since the metric is not known from the start, it is essential that we keep the distinction between vectors and dual vectors completely clear. [R.M. Wald, 1984, General Relativity, p. 23]

N0va
  • There are vector spaces without metrics, and there the difference between one-forms and vectors is fundamental. – Javier Oct 22 '16 at 00:51
  • I strongly disagree with this answer. $V_\mu e^\mu$ and $V^\mu e_\mu$ are completely different objects, i.e. a vector and a linear functional, resp., and setting them as equal is only sowing the seeds of further confusion in people unaware of the difference. The vector space $W$ and its dual $W^*$ are isomorphic only because they have the same dimension, and there is provably no natural isomorphism (roughly speaking, basis-independent) between them. Only in the presence of a metric is there a canonical isomorphism. A well-earned -1 from me. – Emilio Pisanty Oct 22 '16 at 11:12
  • @EmilioPisanty: depends on whether $e^\mu$ denotes the dual or the reciprocal basis; you should be able to develop the theory just fine in terms of the latter, but I agree that it's not something I'd recommend – Christoph Oct 22 '16 at 11:19
  • @Christoph Yes, you can switch the notation around, but the difference remains - that's a trivial transformation and it doesn't change the structure. – Emilio Pisanty Oct 22 '16 at 11:42
  • @EmilioPisanty: of course the difference between tangent and cotangent vectors remains; but given a metric, you can phrase everything in terms of the reciprocal basis and never need to talk about covectors; all formulae written in classical tensor notation should remain valid as vector components relative to the reciprocal basis are identical to the components of the corresponding covector relative to the dual basis; given $e_\mu=\partial/\partial x^\mu$, just define $e^\mu=g^{\mu\nu}\partial/\partial x^\nu$ instead of $e^\mu=\mathrm dx^\mu$; then, we have indeed $V_\mu e^\mu\equiv V^\mu e_\mu$ – Christoph Oct 22 '16 at 11:54
  • @Christoph That's precisely the point. "Given a metric" obscures all the geometrical and linear algebraic structure that's there before the metric, and it whitewashes the fact that a change in the metric will also change the canonical identification between the vector space and its dual. This is precisely the sort of muddle that creates confusions like the OP's, which is why it's so important to keep things separate. – Emilio Pisanty Oct 22 '16 at 12:48
  • @EmilioPisanty: we're in violent agreement here, except that I'd stress that framing things in terms of reciprocal bases is misguided rather than outright wrong – Christoph Oct 22 '16 at 12:56