I'm really confused by the notation of raising and lowering indices on tensors when it is mixed with Einstein summation notation and references to the metric tensor. I need help separating several conflicting concepts and notations into a single framework that I can use when reading the literature.
Suppose $A$ is a first-order "tensor", that is, a vector. Abstractly, $A$ can be viewed as a list of numbers, which we shall call $\alpha_i$, indexed by $i$. Now the tensor can come in many forms:
My understanding is as follows:
First Model: Index Location Transpose
$$ A^{i} = \begin{bmatrix} \alpha_{0} \\ \alpha_1 \\ \vdots \\ \alpha_n\end{bmatrix}$$
$$ A_{i} = \begin{bmatrix} \alpha_{0} & \alpha_1 & \dots & \alpha_n\end{bmatrix}$$
Naturally, we then have the dot product, or inner product:
$$ A_iA^i = \alpha_{0}^2 +\alpha_1^2 + \dots + \alpha_n^2 $$
And we have the tensor product or "outer product":
$$ A^iA_i = \begin{bmatrix} \alpha_0^2 & \alpha_0 \alpha_1 & \dots & \alpha_0 \alpha_n \\ \alpha_0 \alpha_1 & \alpha_1^2 & \dots & \alpha_1 \alpha_n \\ \vdots & \vdots & \ddots & \vdots \\ \alpha_0\alpha_n & \alpha_1 \alpha_n & \dots & \alpha_n^2 \end{bmatrix} $$ So far all is well.
In this model $A^{i}A^{i}$ is undefined, and the same goes for $A_{i}A_{i}$: a column times a column (or a row times a row) is not a valid matrix product, so these expressions cannot even be written down.
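To make this model concrete, here is a minimal numpy sketch (the vector $(1, 2, 3)$ is just a made-up example): upper index as a column, lower index as a row, with the two ill-formed products failing on a shape mismatch.

```python
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])  # the underlying list of numbers

A_upper = alpha.reshape(-1, 1)     # A^i as a column vector, shape (3, 1)
A_lower = alpha.reshape(1, -1)     # A_i as a row vector, shape (1, 3)

inner = A_lower @ A_upper          # row @ column: a 1x1 matrix, the dot product
outer = A_upper @ A_lower          # column @ row: the 3x3 outer product

print(inner)        # [[14.]]  = 1^2 + 2^2 + 3^2
print(outer.shape)  # (3, 3)

# A^i A^i and A_i A_i are shape mismatches in this model:
# A_upper @ A_upper  -> ValueError
# A_lower @ A_lower  -> ValueError
```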
Second Model: Einstein Summation, ALL Repeated Indices are Summed Over
Here $A^{i}$ cannot intrinsically be viewed as a column or row vector; instead $A^{i} = \alpha_i$ with no "matrix-like behavior". What can be said, though, is that if an index is repeated in an expression, then we sum over that index; and once we want to concretely evaluate $A^{j}$ versus $A_{j}$, we just access the $j^{\text{th}}$ element of the list, as per the answer here.
Thus:
$$ A_{i} A_{i} = A_{i}A^{i} = A^{i}A^{i} = A^{i}A_{i} = \alpha_0^2 + \alpha_1^2 + \dots + \alpha_n^2 $$
This is weird to me, because now it's not clear at ALL what the difference between $A_{i}$ and $A^{i}$ is, and this is in DIRECT conflict with the first model, which can't even evaluate two of the sums listed above.
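For concreteness, here is a minimal sketch of this pure summation convention using np.einsum, which implements exactly "sum over repeated indices" with no row/column structure (same made-up example vector as above):

```python
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])  # A^i and A_i are both just this flat list

# Every repeated index is summed over, so all four expressions
# A_i A_i, A_i A^i, A^i A^i, A^i A_i collapse to the same number:
s = np.einsum('i,i->', alpha, alpha)
print(s)  # 14.0 = 1^2 + 2^2 + 3^2
```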
Third Model: The Metric Tensor Model + Einstein Convention
I still don't have a rigorous definition of what $A^{i}$ or $A_{i}$ mean, much like in the Einstein summation model, BUT we do know the following: given a metric tensor (a rank-2 tensor) $g_{ij}$, we have
$$ g_{ij}A^{i} = A_{j}$$ $$ g^{ij}A_{j} = A^{i} $$
This third model is the most interesting; we have, literally, that
$$ g_{ij}A^{i} = \sum_{i=0}^{n}g_{ij}A^{i} = g_{0j}\alpha_0 + g_{1j}\alpha_1 + \dots + g_{nj}\alpha_n$$
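Concretely, this contraction is just a matrix-vector product against $g$. A minimal sketch, assuming a made-up symmetric metric and the same example vector:

```python
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])        # the components A^i
g = np.array([[2.0, 0.5, 0.0],           # a made-up symmetric metric g_ij
              [0.5, 1.0, 0.0],
              [0.0, 0.0, 3.0]])

A_lower = np.einsum('ij,i->j', g, alpha)  # A_j = g_ij A^i
print(A_lower)                            # [3.  2.5 9. ]

# Raising uses the inverse metric g^{ij}, recovering the original components:
g_inv = np.linalg.inv(g)
print(np.einsum('ij,j->i', g_inv, A_lower))  # [1. 2. 3.]
```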
But it's clear this is a different model as well, because of the following:
If we interpret $A^{i}$ as a column vector, then $A_{j} = g_{ij}A^{i}$ is a matrix times a column, so by definition $A_{j}$ is also a column vector.
If we interpret $A^{i}$ as a row vector, then by the same reasoning (contracting against $g_{ij}$ preserves the shape) $A_{j}$ is also a row vector.
If we assume NO matrix structure on $A^{i}$, that is, it is just a one-dimensional array of numbers and nothing more, then $A_{j}$ is also JUST a one-dimensional array of numbers and nothing more.
Qualitatively this is in DIRECT conflict with the first model, although it may not be in direct conflict with the second model, DEPENDING on your choice of metric tensor $g_{ij}$: if we let our metric tensor stray from the Euclidean metric (all 1's on the diagonal and zeros elsewhere), then this does not lead to the same results as the Einstein summation convention, I think.
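That last claim is easy to check numerically. Here is a sketch with a Minkowski-style metric $\operatorname{diag}(-1, 1, 1)$ (a made-up low-dimensional example): the metric contraction $g_{ij}A^{i}A^{j}$ differs from the naive sum of squares that the pure summation convention gives.

```python
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])   # components A^i
g = np.diag([-1.0, 1.0, 1.0])       # a non-Euclidean (Minkowski-style) metric

naive  = np.einsum('i,i->', alpha, alpha)        # second model: plain sum of squares
metric = np.einsum('ij,i,j->', g, alpha, alpha)  # third model: g_ij A^i A^j

print(naive)   # 14.0
print(metric)  # 12.0 = -1 + 4 + 9  -- the two models disagree
```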