
I learned about covariant and contravariant vectors in the context of vector and tensor analysis, and now I'm learning about them in the context of linear vector spaces in Dirac ket notation. I'm having difficulty relating the two to each other.


We know that any vector can be expanded in terms of a basis set $\{e_i\}$: $$\vec{A}=\sum_iA^ie_i$$ To find the components of the vector, one defines the reciprocal basis $\{e^i\}$ by $$e_i\cdot e^k=\delta_i^k$$ Now we can find the components by simply taking the dot product with the reciprocal basis: $$\vec{A}\cdot e^{i}=\sum_jA^je_j\cdot e^i=A^i$$ We can also expand the same vector in terms of the reciprocal basis as $$\vec{A}=\sum_iA_ie^{i}$$ We call the components $A^i$ contravariant and the $A_i$ covariant components of the vector.
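To make this concrete, here is a minimal numpy sketch (the basis and the vector are made-up numbers, purely for illustration): the reciprocal basis vectors are the rows of the inverse transpose of the basis matrix, since that is exactly the condition $e_i\cdot e^k=\delta_i^k$.

```python
import numpy as np

# Rows of E are the basis vectors e_1, e_2, e_3 (deliberately non-orthogonal).
E = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 2.0]])

# Rows of the inverse transpose are the reciprocal basis vectors e^k:
# this is exactly the condition e_i . e^k = delta_i^k.
E_recip = np.linalg.inv(E).T
assert np.allclose(E @ E_recip.T, np.eye(3))

A = np.array([2.0, -1.0, 3.0])           # a vector in Cartesian components
A_contra = E_recip @ A                   # A^i = A . e^i
assert np.allclose(A_contra @ E, A)      # A = sum_i A^i e_i

A_cov = E @ A                            # A_i = A . e_i
assert np.allclose(A_cov @ E_recip, A)   # A = sum_i A_i e^i
```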


Now, in the context of linear vector spaces, it says that any vector can be expanded as $$|a\rangle =\sum_i a^i|i\rangle $$ and the components are defined directly as $$a_i=\sum_j \bar{a}^j\langle j|i\rangle $$ where the object on the right-hand side is just $\langle a|i\rangle $. I don't see any relation between the two. How does the reciprocal basis look in ket notation? Can anyone help me relate the two?
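For comparison, here is the ket-side definition in the same numerical style (again with made-up numbers): the Gram matrix of a non-orthonormal basis plays the role of $\langle j|i\rangle$, and $a_i=\sum_j\bar{a}^j\langle j|i\rangle$ indeed comes out equal to $\langle a|i\rangle$.

```python
import numpy as np

# Columns of B are the basis kets |1>, |2>, written in some orthonormal frame
# (the basis itself is deliberately non-orthonormal).
B = np.array([[1.0, 1.0],
              [0.0, 1.0j]])

a_contra = np.array([2.0, 1.0 - 1.0j])   # contravariant components a^i
ket_a = B @ a_contra                      # |a> = sum_i a^i |i>

G = B.conj().T @ B                        # Gram matrix, G[j, i] = <j|i>

a_cov = a_contra.conj() @ G               # a_i = sum_j conj(a^j) <j|i>
assert np.allclose(a_cov, ket_a.conj() @ B)   # ... which is <a|i> for each i
```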

    "we can also expand the same vector in terms of reciprocal basis". ABSOLUTELY NOT. A vector space and its dual are completely different objects, and if you decide to mix them up, you're completely setting yourself up for all sorts of confusion. Also, we're not taking the dot product of $\vec{A}$ and $e^i$. $e^i$ is a linear map which eats vectors and spits out numbers, so the relationship is $A^i=e^i(\vec{A})$. – peek-a-boo Sep 04 '21 at 06:52
  • @peek-a-boo You are absolutely correct; many texts simply gloss over this confusion without much thought, forgetting that this can only be done in Euclidean space with Cartesian coordinates, where vectors and covectors are canonically identified. – Vincent Thacker Sep 04 '21 at 13:19
  • @VincentThacker right, and I would say that ESPECIALLY in $\Bbb{R}^n$, one should be extra cautious, because $\Bbb{R}^n$ having so much familiar structure (a group, vector space, smooth manifold, inner product space, Riemannian manifold etc etc) is a double-edged sword. The familiarity to the untrained student may seem like a good thing, when in fact not being consciously aware of where exactly we're using all this structure is just a huge recipe for disaster. I would go so far as to say even in $\Bbb{R}^n$ we shouldn't unnecessarily dualize things with the inner product. – peek-a-boo Sep 04 '21 at 13:35
  • @peek-a-boo Look, I write what I have read. What should I do to make things clear? Can you suggest any reference? – Young Kindaichi Sep 06 '21 at 13:16

1 Answer


Take any two vectors $|a\rangle$ and $|b\rangle$ and expand them in the basis $\{ |i \rangle \}$.

Write $$|a\rangle=\sum_i a^i |i\rangle$$ and $$|b\rangle=\sum_j b^j |j\rangle$$ where $a^i$ and $b^i$ are the contravariant components of the vectors in the stated basis.

By the definition of a bra, $$\langle b |=\sum_j \bar{b}^j \langle j|$$

Taking the inner product of $|b\rangle$ with $|a\rangle$, we get:

$$\langle b|a\rangle=\sum_j\sum_i \bar{b}^j a^i \langle j|i\rangle$$

Rearranging the summation, we get:

$$\langle b|a\rangle=\sum_i\Big\{\sum_j \bar{b}^j \langle j|i\rangle \Big\} a^i$$

The term inside the curly brackets is known as the covariant component of the vector $|b\rangle$ with respect to the basis vector $|i\rangle$ and is written as $b_i$.

This makes the inner product much less cumbersome:

$$\langle b | a \rangle= \sum_i b_i a^i$$

without even caring about whether the basis is orthonormal. If the basis is orthonormal, then $b_i=\bar{b}^i$ and you get the ubiquitous relation:

$$\langle b | a \rangle =\sum_i \bar{b}^i a^i$$
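If it helps, here is a quick numerical sanity check of both identities (a random, purely hypothetical basis in numpy): the general formula holds for any basis, and the conjugate form appears once the basis is made orthonormal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Columns of B: n linearly independent, non-orthonormal complex basis kets.
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

a_contra = rng.normal(size=n) + 1j * rng.normal(size=n)
b_contra = rng.normal(size=n) + 1j * rng.normal(size=n)
ket_a, ket_b = B @ a_contra, B @ b_contra

G = B.conj().T @ B                  # G[j, i] = <j|i>
b_cov = b_contra.conj() @ G         # b_i = sum_j conj(b^j) <j|i>

# <b|a> = sum_i b_i a^i, with no orthonormality assumption on the basis
assert np.allclose(ket_b.conj() @ ket_a, b_cov @ a_contra)

# Orthonormal special case: with a unitary Q, b_i = conj(b^i), so
# <b|a> = sum_i conj(b^i) a^i.
Q, _ = np.linalg.qr(B)
assert np.allclose((Q @ b_contra).conj() @ (Q @ a_contra),
                   b_contra.conj() @ a_contra)
```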

Does that solve your problem?

Physiker