
In Physics there's one very clear intuition about what a vector $\mathbf{v}$ is: vectors represent things with direction and magnitude (although when no metric is available there's no clear concept of norm). Moreover, nearly every construction with vectors used in Physics carries a physical or geometrical meaning; divergence and curl of vector fields are two examples.

When we move to tensors, things become a little more complicated to understand from an intuitive point of view. The rigorous definition of a type $(k,l)$ tensor is of course unambiguous: it is a multilinear map taking $k$ vectors and $l$ linear functionals to the real numbers. The physical intuition, however, is not so simple to grasp at first.

Now, to keep the present question focused, I want to consider just $(0,2)$ tensors. A $(0,1)$ tensor is a vector and a $(0,2)$ tensor is a linear combination of objects of the type $\mathbf{v}\otimes \mathbf{w}$.

The rigorous definition is that these objects are bilinear functions of linear functionals: they take dual vectors and give numbers in a bilinear fashion. Now, what is the physical intuition behind this?

How can we physically understand such objects? Is there a straightforward way to intuitively understand them like we do with vectors and the constructions from vector calculus we use in Physics?

To make it clear, I do understand the rigorous construction where we identify a vector space with its double dual $V\simeq V^{\ast \ast}$, consider $\mathbf{v}$ as the function $\mathbf{v} : V^\ast\to \mathbb{R}$ given by $\mathbf{v}(f) = f(\mathbf{v})$, and then define $\mathbf{v}\otimes \mathbf{w}(f,g) = \mathbf{v}(f)\mathbf{w}(g)$. The point is: how can such an object be understood from a physical standpoint?

Gold
  • Have you ever thought about it in terms of quantum mechanics? There we use the tensor product to stick together subsystems to make a bigger system. Sure $v\otimes w$ is a bilinear map from the dual space to the reals, but it can also be thought of as two vectors glued together to give you another vector in a bigger space. – Ruvi Lecamwasam Apr 12 '15 at 02:02
  • I've found this helpful though YMMV: https://books.google.com/books?id=w4Gigq3tY1kC&pg=PA105#v=onepage&q&f=false – Alfred Centauri Apr 12 '15 at 02:41
  • Look into Geometric Algebra. There's a large body of work on geometric applications of bilinear products and related objects. A place to start might be this paper by Hestenes. – garyp Apr 12 '15 at 03:22
  • The bilinear functions of linear functionals is sort of a double dual. You take the duals of the two vectors, and then put them together in a sort of bidual fashion. – Christopher King Apr 12 '15 at 03:28
  • Comment to the question (v1): $\mathbf{v}$ appears to be a (1,0) vector in some parts of the question formulation and a (0,1) covector in other parts. – Qmechanic Apr 12 '15 at 08:28
  • Related: http://physics.stackexchange.com/q/83743/2451 and links therein. – Qmechanic Apr 12 '15 at 08:31

4 Answers


There is another definition. Let's say $v$ and $w$ belong to the vector spaces $V$ and $W$. Now let's consider bilinear maps $f$ from $V$ and $W$ to some vector space $Z$. Bilinear means that

  • $ f(v_1+v_2, w) = f(v_1, w) + f(v_2, w)$
  • $ f(v, w_1 + w_2) = f(v, w_1) + f(v, w_2)$
  • $f(cv, w) = cf(v, w) = f(v, cw)$

These maps are important for many reasons. Is there a way to represent them as regular linear maps? Well, maybe we could represent them by maps $V \otimes W \to Z$, where $V \otimes W$ is some sort of combination of $V$ and $W$. What would we want it to be? Well, there should be an obvious bilinear map from $V$ and $W$ to it. Just to be confusing, let's use the same symbol $- \otimes -$ both for combining the vector spaces themselves and for combining individual vectors. Also, for any bilinear map $f$ from $V$ and $W$ to $Z$, we should have a linear map $f'$ from $V \otimes W$ to $Z$ such that $f(v, w) = f'(v \otimes w)$. Now we see that the elements $v \otimes w$ represent inputs to bilinear maps.
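In finite dimensions this universal property can be sketched concretely (a numeric illustration assuming numpy; the names `f`, `f_prime`, and `B` are mine, not from the answer): model $V \otimes W$ as matrices, with $v \otimes w$ given by the outer product, and any bilinear $f$ factors through a linear $f'$ on matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
v, w = rng.standard_normal(3), rng.standard_normal(4)

# an arbitrary bilinear map f: R^3 x R^4 -> R, written as f(v, w) = v^T B w
B = rng.standard_normal((3, 4))
f = lambda v, w: v @ B @ w

# the corresponding linear map f' on the tensor product (matrices):
# the Frobenius inner product <B, M>
f_prime = lambda M: np.sum(B * M)

outer = np.outer(v, w)              # v ⊗ w realized as a matrix
assert np.isclose(f(v, w), f_prime(outer))   # f(v, w) = f'(v ⊗ w)

# bilinearity of ⊗: (c v) ⊗ w == v ⊗ (c w) == c (v ⊗ w)
c = 2.5
assert np.allclose(np.outer(c * v, w), c * outer)
assert np.allclose(np.outer(v, c * w), c * outer)
```

The key point the sketch makes visible: every bilinear map is recovered from a *linear* map once you package the two inputs as $v \otimes w$.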

Now, does such a $V \otimes W$ exist, and if so, is it unique? Well, if it exists, it's unique up to canonical isomorphism via abstract nonsense. We also give an example of it existing here*. Your definition as bilinear functions of linear functionals is also a legit definition*, implying that the two definitions are isomorphic (your definition is sort of a double dual: you take the duals of the two vectors, then put them together in a bidual fashion). The great thing about the universal property is that it lets us know what we are looking for before we start looking.

In summary, $v \otimes w$s are inputs to bilinear maps from $V$ and $W$ to other vector spaces.

It's worth noting that tensors of all types can be constructed via similar universal properties, sometimes using duals.

*Someone can edit this answer to put in proofs of these claims, since that would in turn prove that the two definitions are equivalent.


That is only one definition, my friend, but there is another. To understand $v \otimes w$, you must first understand $V \otimes W$ (where $V$ and $W$ are the vector spaces of $v$ and $w$ respectively). First consider, for all vectors $v$ and $w$, the abstract symbols $v \otimes w$, and use them as a basis for a free vector space. Now, using equivalence classes, we declare the following to be equal:

  • $(v_1+v_2) \otimes w = v_1 \otimes w + v_2 \otimes w$
  • $ v \otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2$
  • $(cv) \otimes w = c(v \otimes w) = v \otimes (cw)$

It's linear in each part individually. What does this mean?

It means that the magnitudes can be moved between the different parts of the vector. It basically means that you are keeping the direction aspects separate while merging the magnitude aspects.
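A small numeric check of this "magnitudes merge, directions stay separate" picture (an illustration assuming numpy, not part of the answer): rescaling $v$ by $c$ and $w$ by $1/c$ leaves $v \otimes w$ unchanged, so only the product of the two magnitudes survives, while each direction is still recoverable from the outer product.

```python
import numpy as np

v = np.array([1.0, 2.0, 0.0])
w = np.array([0.0, 3.0, 4.0])
c = 5.0

# (c v) ⊗ (w / c) == v ⊗ w: magnitude slides freely between the two factors
assert np.allclose(np.outer(c * v, w / c), np.outer(v, w))

# the two directions remain separate: v ⊗ w is a rank-one matrix whose
# column space is spanned by v and whose row space is spanned by w
assert np.linalg.matrix_rank(np.outer(v, w)) == 1
```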

It's worth noting that tensors of all types can be obtained by taking tensor products of vector spaces with themselves and their duals.


I'll just throw out two simple examples of tensors that you already know:

1) The dot product itself is a tensor: it takes two vectors as input and spits out a number.

2) Now, consider a general fluid. It might have viscosity, shear, and everything. It's completely general. Now, consider yourself to be something living in this fluid. You move a little bit along some vector ${\vec v}$. In an ordinary fluid, you'd expect some backward force directly opposing ${\vec v}$, but because this is a complicated fluid that might have twisting properties, you will in general get some other force ${\vec F}$ out. One simple way of expressing this is with a matrix $\sigma^{a}{}_{b}$ (I switch to components with Einstein summation now), saying that $F^{a} = \sigma^{a}{}_{b}v^{b}$. You have discovered the classical stress tensor, which encodes all of the pressures in a fluid.
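Both examples can be sketched numerically (assuming numpy; the particular matrix `sigma` is made up purely for illustration): the dot product is the identity-matrix tensor $g_{ab}$ eating two vectors, and a stress tensor eats a direction and returns a force that need not be parallel to it.

```python
import numpy as np

# example 1: the dot product as a tensor g_ab with two vector slots
g = np.eye(3)                        # Euclidean metric
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 0.0, 1.0])
assert np.isclose(np.einsum('ab,a,b->', g, u, v), u @ v)   # g(u, v) = u · v

# example 2: a (made-up) stress tensor; feed in a direction, get out a force
sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.0],
                  [0.0, 0.0, 3.0]])
F = np.einsum('ab,b->a', sigma, v)   # F^a = sigma^a_b v^b
# in a general fluid the resulting force is not parallel to the motion
assert not np.allclose(np.cross(F, v), 0.0)
```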

So, what do these two examples have in common? They both involve situations where you need an answer, and the answer depends on one or more vectors. So, you get an object that takes in one or more vectors as input, and spits out zero or more vectors as output. If the thing is multilinear in its input(s), then, whammo, you have a tensor.

Zo the Relativist

Some tensors have more intuitive pictures than others.

For instance, simple antisymmetric tensors can actually represent oriented subspaces with a magnitude. So those are the clear inheritors of the oriented-1-dimensional-subspace-with-magnitude picture.

And then you can imagine little oriented planes, little oriented 3-volumes in 4d (for relativistic physics) and so forth.

But even tensors that aren't antisymmetric can have a slightly more intuitive feel, even as functions. Take a bilinear function: it has two slots for inputs. If you fill in just one slot, you effectively have a linear function.

So you can think of your rank-two tensor as a function that takes a covector and gives you a vector (since a vector is a linear function that takes a covector).
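This slot-filling picture can be sketched in components (assuming numpy; the names `T`, `f`, `g` are mine): feed $v \otimes w$ one covector and what remains is a vector, i.e. something still waiting for the second covector.

```python
import numpy as np

T = np.outer(np.array([1.0, 2.0]), np.array([3.0, 4.0]))  # v ⊗ w as a matrix
f = np.array([1.0, 0.0])     # a covector (row vector)
g = np.array([0.0, 1.0])     # another covector

partially_applied = f @ T    # fill only the first slot: a vector remains
assert partially_applied.shape == (2,)

# feeding it the second covector finishes the bilinear evaluation
assert np.isclose(partially_applied @ g, f @ T @ g)
```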

I think of vectors as column vectors (since I learned mathematics in the US), and covectors as row vectors and there is a natural sense in which they are linear functions of each other (use regular matrix multiplication).

So then I think of a single vector as something that waits for a row vector and gives a number. And I can think of a covector as something that waits for a column vector and gives a number. Each is just what it is (row or column), waiting for the other and then giving a number. And now I can imagine that a rank-two tensor could be something that waits for a column vector and then produces something that is still waiting for a second column vector.

Since a regular square matrix is something that takes a column vector and gives you a column vector, it isn't crazy to imagine something that takes a column vector and gives you a row vector. So I imagine the tensor symbol as doing exactly that, queueing up the basic objects to allow it. Basically, since two row vectors next to each other look like a longer row vector, I put tensor symbols between them to tell myself to multiply in order; when I get the number from a multiplication, I can distribute it either to the front or across the whole next row vector. We can represent a mixed tensor of type $(1,1)$ as a matrix even when it is not a rank-one operator (in the linear algebra sense), but to my knowledge there isn't a nice layout for other types. The tensor symbol does work, though, and you can use abstract index notation (basically saying what happens on basis vectors and basis covectors).
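The "multiply in order, then distribute the number" recipe can be checked numerically (assuming numpy; the vectors are made up): evaluating $(f \otimes g)(v, w)$ by letting each row vector eat its column vector in turn gives the same answer as contracting the full tensors.

```python
import numpy as np

f = np.array([1.0, 2.0])     # row vectors (covectors), queued with ⊗
g = np.array([3.0, -1.0])
v = np.array([0.5, 1.0])     # column vectors fed in one at a time
w = np.array([2.0, 2.0])

# multiply in order: f eats v, the resulting number is distributed,
# then g eats w — this equals the full contraction of f ⊗ g with v ⊗ w
lhs = (f @ v) * (g @ w)
rhs = np.outer(f, g).flatten() @ np.outer(v, w).flatten()
assert np.isclose(lhs, rhs)
```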

Timaeus