Warning: You might be wondering why this isn’t on Math Stack Exchange. In fact, it is: I asked the same question there a few days ago but got no answer, and since I think this question is more directed towards the “physicist POV” of tensors and is related to the Einstein summation convention, I’m also asking it here.
Original Question Below
I'm trying to self-study tensor calculus. I was trying to derive the notation for the covariant and contravariant indices of a linear transformation matrix (a $(1,1)$-type tensor).
So I did the following: work out the $2 \times 2$ case, and then try to find a pattern.
For covectors (covariant) $x_i$: (I shall first assume that both indices of the matrix are up, i.e. contravariant, just for simplicity of notation. I'll later "correct" this according to what I have found.)
We have:
$$ \begin{bmatrix} x_1' & x_2' \end{bmatrix} = \begin{bmatrix} x_1 & x_2 \end{bmatrix} \begin{bmatrix} a^{11} & a^{12} \\ a^{21} & a^{22} \end{bmatrix}$$ And so, for $x_j'$, I'll have: $$x_j' = a^{ij}x_i$$ (summation convention here). Here, I am summing over the first index of the matrix. For the contravariant case, I have: $$x'^i = a^{ij}x^j$$ hence summing over the second index of the matrix. So, I thought about writing $a$ as $a_{i}^j$, calling $i$ the covariant index and $j$ the contravariant index. Does this make any sense? The first problem I see is that this goes against the summation convention, which states that repeated indices are summed only when they appear in different positions (e.g. $a^i v_i$ would mean a summation over $i$, but $a_i v_i$ would not).
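To convince myself that the two summation patterns above really are different, I checked them numerically. This is just a quick NumPy sketch with made-up numbers (all names here are illustrative):

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])      # the 2x2 transformation matrix a^{ij}
x_cov = np.array([5.0, 6.0])    # covariant components x_i
x_con = np.array([5.0, 6.0])    # contravariant components x^j

# Covariant case: x'_j = a^{ij} x_i, i.e. sum over the FIRST index of a.
x_cov_prime = np.einsum('ij,i->j', a, x_cov)
assert np.allclose(x_cov_prime, x_cov @ a)   # same as row-vector times matrix

# Contravariant case: x'^i = a^{ij} x^j, i.e. sum over the SECOND index of a.
x_con_prime = np.einsum('ij,j->i', a, x_con)
assert np.allclose(x_con_prime, a @ x_con)   # same as matrix times column-vector

print(x_cov_prime, x_con_prime)
```

So the row-vector convention sums over the first index, and the column-vector convention sums over the second index, which is exactly the asymmetry that led me to the index placement below.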
This construction I've made would be equivalent to: $$a = a_i{}^j\, \mathbf{e}^i \otimes \mathbf{e}_j$$
My other question is whether it is possible to arrive at the following using matrix algebra:
$$a = a^i{}_j\, \mathbf{e}_i \otimes \mathbf{e}^j$$
Is that possible?
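To make concrete what I mean by the $\mathbf{e}_i \otimes \mathbf{e}^j$ construction, here is a small numerical illustration (NumPy again; note that in the standard basis the arrays representing $\mathbf{e}_i$ and $\mathbf{e}^j$ look identical, so this only shows the outer-product expansion, not the index placement itself):

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])

e = np.eye(2)  # e[i] stands for the i-th basis (co)vector

# Rebuild the matrix from outer products: a = sum_{i,j} a^i_j  e_i (x) e^j
rebuilt = sum(a[i, j] * np.outer(e[i], e[j])
              for i in range(2) for j in range(2))

assert np.allclose(rebuilt, a)
```

My question is whether this kind of decomposition, with the indices placed as above, can actually be *derived* from the matrix manipulations in the first part, rather than just verified.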
I'm really confused. I've seen 2nd order tensors being written as $a^i_j$ and as $a_i^j$. What is the difference between them? Do they act the same way on vectors?