I'm studying special relativity and I'm having some difficulty with tensor indices.
Take for example the Lorentz matrix, whose elements are written as $\Lambda^\mu{}_\nu$.
$\Lambda^\mu{}_\nu(v) = \begin{bmatrix} \gamma & -\gamma \beta & 0 & 0 \\ -\gamma \beta & \gamma & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad \Lambda_\mu{}^\nu(v) = \begin{bmatrix} \gamma & \gamma \beta & 0 & 0 \\ \gamma \beta & \gamma & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$
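(Here I'm using the usual $\beta = v/c$ and $\gamma = 1/\sqrt{1-\beta^2}$.)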
Now I know that $\mu$ is the index associated with rows. That part is fine, and I understand how the multiplication of a matrix with a vector is written this way.
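To spell out what I mean by this convention: I read the transformation of a four-vector,
$$x'^\mu = \Lambda^\mu{}_\nu x^\nu,$$
as the ordinary product of the matrix $\Lambda$ with the column vector $x$, where $\mu$ labels the row of $\Lambda$ (and of the result $x'$) and $\nu$ labels the column, which is summed over.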
But then I have seen, for example, this equation:
$g_\alpha{}^\beta = \Lambda^\mu{}_\alpha\Lambda_\mu{}^\beta$
where $g$ is the identity matrix. I see that both $\mu$'s represent rows, so this is not a usual matrix multiplication. How can one tell that $\alpha$ represents the row and $\beta$ the column in $g$? (OK, here $\Lambda$ is symmetric, but if we take a non-symmetric matrix I don't know how to decide.)
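To make concrete what I mean by "not a usual matrix multiplication", here is a small NumPy sketch I wrote myself (not from any textbook): it contracts over $\mu$, the row index of both factors, with `einsum`, and compares the result with an ordinary matrix product. The matrices are just the ones above written out numerically, with an arbitrary $\beta = 0.6$.

```python
import numpy as np

beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)

# Lambda^mu_nu (first index = row), as written above
L_ud = np.array([
    [gamma,       -gamma*beta, 0, 0],
    [-gamma*beta,  gamma,      0, 0],
    [0,            0,          1, 0],
    [0,            0,          0, 1],
])

# Lambda_mu^nu (first index = row), with the off-diagonal signs flipped
L_du = np.array([
    [gamma,       gamma*beta, 0, 0],
    [gamma*beta,  gamma,      0, 0],
    [0,           0,          1, 0],
    [0,           0,          0, 1],
])

# g_alpha^beta = Lambda^mu_alpha Lambda_mu^beta:
# sum over mu, which is the ROW index of BOTH factors
g = np.einsum('ma,mb->ab', L_ud, L_du)
print(np.round(g, 12))                  # prints the identity matrix

# In matrix language this contraction is a transpose on the first factor,
# not a plain row-by-column product
print(np.allclose(g, L_ud.T @ L_du))    # True
```

Numerically the contraction does give the identity, and it agrees with $\Lambda^T$ times the second matrix rather than a plain matrix product, which is exactly why I don't see how to tell which index of $g$ is the row and which is the column.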