
Before flagging this as a duplicate: this post explains the computation of tensors nicely, but I still have questions about why they are needed. Also, what makes a tensor actually a tensor? The way it transforms? Why wouldn't everything transform like a tensor?

Habouz
  • A tensor (I assume you mean in the physics sense) is just a way of attaching linear algebra information to every point in space in a way that transforms the right way when we change coordinates. Why would we want to do that? I see two reasons: 1) we understand linear algebra pretty well, so it's usually the first thing we try when trying to describe new phenomena, 2) it works. – Charles Hudgins Jun 26 '22 at 22:06

2 Answers


I'll flip the order of questions.

What makes a tensor a tensor?

A tensor is indeed something that transforms as a tensor. No further definition is needed. An example might help. The metric tensor is a tensor defined as:

$$g_{ij}=\vec{e_i}\cdot\vec{e_j}$$

That is, the dot product of the basis elements (note that some people call them basis vectors, but it is important to know that they do not transform like vectors; I'll explain this later). This expression is written in index notation, which means that, for this example, you feed in two indices and a number comes out. For $i=1, j=1$: $$g_{11}=\vec{e_1}\cdot\vec{e_1}$$

which, if we are working in the usual orthonormal system, would be equal to 1. We know this is a tensor because it transforms like one. It carries 2 lower indices, and such a tensor transforms the way a basis transforms, but twice. The metric tensor does this. When you scale the entire basis by $a$, the metric scales by $a^2$: $$g'_{ij}=\vec{e'_i}\cdot\vec{e'_j}=a\vec{e_i}\cdot a\vec{e_j}=a^2 g_{ij}$$
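This scaling rule is easy to check numerically. A minimal sketch in Python/NumPy, where the basis and the scale factor $a$ are my own illustrative choices:

```python
import numpy as np

# Rows are the basis elements (here: the standard orthonormal basis of the plane).
e = np.array([[1.0, 0.0],
              [0.0, 1.0]])

# Metric components: g_ij = e_i . e_j, i.e. all pairwise dot products.
g = e @ e.T

# Scale the entire basis by a, as in the text.
a = 3.0
e_prime = a * e
g_prime = e_prime @ e_prime.T

# The metric picks up a factor of a**2 -- one factor of a per lower index.
print(np.allclose(g_prime, a**2 * g))  # True
```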

Why wouldn't everything transform like a tensor?

This question is best answered by thinking of counter-examples. In fact, a simpler question would be: "Why wouldn't every array of numbers transform like a vector?"

The source of this confusion is that we picture an array of numbers as an arrow, and arrows transform like vectors!

Firstly, a vector is something that transforms oppositely to how a basis transforms. So:

$$v^1 \vec{e_1} + v^2 \vec{e_2}=\frac{v^1}{a} a\vec{e_1} + \frac{v^2}{a} a \vec{e_2}= v'^1 \vec{e'_1} + v'^2 \vec{e'_2}$$

(I equated the two representations (primed and unprimed) because the vector itself should not change under the transformation.)

(Another note: the superscript does not mean exponentiation; it is a label, written as a superscript to distinguish vector components from basis elements. Actual exponentiation is written out explicitly, as with the scaling factor.)

Since $a\vec{e_i}=\vec{e'_i}$, this leaves us with $v'^i=\frac{v^i}{a}$

Thus a vector transforms oppositely to a basis. For example, take the position vector $(1000, 2000)$, meaning 1000 m east and 2000 m north. If I switch to kilometers, I multiply the basis by 1000, since 1 new unit now covers 1000 m, but I divide the components by 1000, so that in the new units the vector is $(1, 2)$: 1 km east, 2 km north.
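The unit change above can be sketched in a few lines (the position and scale factor are taken from the example in the text):

```python
import numpy as np

# Position in meters: 1000 m east, 2000 m north.
v_m = np.array([1000.0, 2000.0])

# Switching from meters to kilometers scales the basis by a = 1000
# (one new unit covers 1000 old units)...
a = 1000.0

# ...so the components transform oppositely: divide by a.
v_km = v_m / a

print(v_km)  # [1. 2.] -- 1 km east, 2 km north
```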

Not every array of numbers transforms like this. Say my array is $(n,m)$, where $n$ is the price of a house I bought and $m$ is its area. It is convenient to collect them in an array, since I can then add such arrays to get the total price and area. However, if I switch from meters to kilometers, nothing changes, so this array is not a vector in position space, which is the space physicists use.

What about tensors? Similarly to vectors, I can easily create a 2-D array of numbers for some useful purpose, but it would not transform like a tensor.

Why are they needed?

As you've seen above, the transformation rules for tensors are different from those for vectors. Try making the metric tensor a vector: you can't, because they transform differently.

Finally.

What is a Tensor, intuitively?

A tensor is an object. This object transforms according to certain rules. You cannot replace it with a vector or anything else, because they transform differently. Not every object that looks like a tensor transforms like one; I can easily create an array of numbers that does not transform the way a tensor does. Examples of things that need tensors to describe them:

Metric tensor, which describes how dot products work. Transforms like a tensor.

Curvature tensor, which describes how a space is curved. Under a change of coordinates, it transforms in a certain way which fits the criteria of a tensor.

Ricci tensor, which describes how a region changes its area as you move it through space. Transforms like a tensor.

Habouz

Before we can understand a tensor, we need to understand what a co-vector is. A co-vector is essentially a linear map from vectors to numbers. A simple example is the gradient co-vector. For instance, suppose you consider a point in 3-D space and want to know by how much a physical quantity like temperature would change if you took a small step in a certain direction; you can find out by evaluating the gradient co-vector field of the temperature on the direction you chose. Mathematically,

$$dT_v = \nabla_{v} T$$
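This can be sketched numerically. Below, the temperature field, the point, and the direction are my own illustrative choices, and the gradient is approximated by central differences:

```python
import numpy as np

# A hypothetical temperature field on 3-D space (chosen only for illustration).
def T(p):
    x, y, z = p
    return x**2 + 3*y + np.sin(z)

# Numerical gradient co-vector of f at point p, via central differences.
def grad(f, p, h=1e-6):
    p = np.asarray(p, dtype=float)
    g = np.zeros_like(p)
    for i in range(len(p)):
        dp = np.zeros_like(p)
        dp[i] = h
        g[i] = (f(p + dp) - f(p - dp)) / (2 * h)
    return g

# The gradient co-vector eats a direction vector v and returns a number:
# the rate of change of T along v.
p = np.array([1.0, 2.0, 0.0])
v = np.array([1.0, 0.0, 1.0])
dT_v = grad(T, p) @ v
print(dT_v)  # analytically: 2*1 + 3*0 + cos(0)*1 = 3.0
```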

Now, there is another viewpoint you can take: vectors are functions which send co-vectors to numbers. For example, suppose you take a small step in a certain direction from a certain point in space; you can then ask how different physical fields such as pressure, density, and temperature would change along that specific direction.

This leads to the viewpoint that vectors can be thought of as functions which take co-vectors to numbers.

Now, you may still have the question: what is a tensor? Well, a (p,q)-tensor is a function which takes p vectors and q co-vectors linearly and spits out a number.
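As a minimal sketch of this definition, here is a tensor taking one vector and one co-vector in components: a matrix that linearly eats a co-vector and a vector and spits out a number (the matrix and arguments are my own illustrative choices):

```python
import numpy as np

# Components of the tensor: a 2x2 matrix A. Fed a co-vector w (row of
# components) and a vector v (column of components), it returns w A v.
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])

def tensor(w, v):
    return w @ A @ v

w = np.array([1.0, 1.0])   # co-vector components
v = np.array([1.0, 2.0])   # vector components

print(tensor(w, v))        # 1*(2*1+0*2) + 1*(1*1+3*2) = 9.0

# Linearity in each slot, as the definition requires:
print(tensor(w, 2 * v) == 2 * tensor(w, v))  # True
```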

Why would this be useful? Let me give a concrete example. Suppose you are doing multi-variable calculus. When evaluating surface integrals, we do it by projecting the area element on the surface down onto the x-y plane. Since the area element is defined through two tangent vectors, we can define something known as a two-form, which eats these two tangent vectors and gives us the projection of the area in the tangent plane onto the x-y plane.
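A minimal sketch of such a two-form, $dx \wedge dy$, acting on two tangent vectors (the vectors are my own illustrative choices):

```python
import numpy as np

# dx^dy eats two tangent vectors in 3-D and returns the signed area of
# their parallelogram projected onto the x-y plane.
def dx_wedge_dy(u, v):
    return u[0] * v[1] - u[1] * v[0]

# Two tangent vectors to some surface, chosen for illustration.
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 5.0])

print(dx_wedge_dy(u, v))   # 1.0: the projection is the unit square
print(dx_wedge_dy(v, u))   # -1.0: swapping arguments flips the sign
```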

Notice that I did not pick a basis or use actual numbers to describe any of these ideas. The beauty of tensors is that, if I were in fact to choose a basis for describing the above ideas, the components of the tensors with respect to that basis would transform in such a way that no matter in which coordinates we actually evaluate the tensor on vectors and co-vectors, we get the same number.