
Let $M(n)$ be an $n\times n$ matrix in the variables $x_1,\dots,x_n$ with entries $$M_{i,j}(n)=\frac{x_{\max(i,j)}}{x_{\min(i,j)}}, \qquad 1\leq i,j\leq n.$$ I'm interested in the following:

Questions.

(1) Is there a neat or "closed form" evaluation for the determinant $\det M(n)$?

(2) Is there an explicit formula for the inverse of $M(n)$?
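For concreteness, $M(n)$ can be generated exactly over the rationals; here is a minimal Python sketch using only the stdlib `fractions` module (0-indexed lists; the helper name is my own):

```python
from fractions import Fraction

def build_M(xs):
    """M(n) for variables xs, with M[i][j] = x_max(i,j) / x_min(i,j) (0-indexed)."""
    n = len(xs)
    return [[xs[max(i, j)] / xs[min(i, j)] for j in range(n)] for i in range(n)]

# Sample specialization x = (3, 5, 7, 11):
xs = [Fraction(v) for v in (3, 5, 7, 11)]
M = build_M(xs)
assert all(M[i][i] == 1 for i in range(4))                          # unit diagonal
assert all(M[i][j] == M[j][i] for i in range(4) for j in range(4))  # symmetric
```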

Thank you.

T. Amdeberhan

2 Answers


Let us write $$a_r=\frac{x_{r+1}}{x_r}$$ for $r=1,\dots,n-1$.

We can then write the matrix $M(n)$ in the form $$\begin{pmatrix} 1 & a_1 & a_1a_2& \cdots & a_1a_2\cdots a_{n-1} \\ a_1 & 1 & a_2& \cdots & a_2\cdots a_{n-1}\\ \vdots & \vdots & \vdots &\ddots & \vdots\\ a_1a_2\cdots a_{n-1}& a_2\cdots a_{n-1} &\cdots & a_{n-1} & 1\end{pmatrix} $$ We now apply Gaussian elimination, subtracting $a_{n-1}$ times the $(n-1)$-st row from the $n$-th row. The last row becomes $0, 0, \ldots, 0, 1-a_{n-1}^2$. But this means that $\det M(n)=\det M(n-1)\,(1-a_{n-1}^2)$, and by induction $$\det M(n) = \prod_{r=1}^{n-1} (1-a_r^2)=\prod_{r=1}^{n-1}\left(1-\frac{x_{r+1}^2}{x_r^2}\right).$$ By applying this Gaussian elimination inductively, one can also obtain the inverse of $M(n)$.
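The product formula can be sanity-checked exactly over the rationals; a minimal Python sketch with the stdlib `fractions` module (function names are mine):

```python
from fractions import Fraction

def build_M(xs):
    """M(n) with M[i][j] = x_max(i,j) / x_min(i,j), 0-indexed."""
    n = len(xs)
    return [[xs[max(i, j)] / xs[min(i, j)] for j in range(n)] for i in range(n)]

def det(A):
    """Exact determinant by Gaussian elimination over Fraction."""
    A = [row[:] for row in A]
    n, d = len(A), Fraction(1)
    for c in range(n):
        piv = next((r for r in range(c, n) if A[r][c] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != c:
            A[c], A[piv] = A[piv], A[c]
            d = -d
        d *= A[c][c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for k in range(c, n):
                A[r][k] -= f * A[c][k]
    return d

# Sample specialization x = (2, 3, 5, 7, 11, 13), so n = 6:
xs = [Fraction(v) for v in (2, 3, 5, 7, 11, 13)]
lhs = det(build_M(xs))
rhs = Fraction(1)
for r in range(len(xs) - 1):
    rhs *= 1 - (xs[r + 1] / xs[r]) ** 2   # the factors (1 - a_r^2)
assert lhs == rhs and lhs != 0
```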

Ehud Meir
  • Thanks. What is the inverse, explicitly? – T. Amdeberhan Jan 04 '17 at 16:04
  • Nice. The last product should also contain squares, right? – Dirk Jan 04 '17 at 16:09
  • I didn't do the exact calculations for the inverse matrix. What I can say is the following: We can use Cramer's Rule. Then all the $(n-1)\times(n-1)$ minors will either have zero determinant, or will be something of the form $a_i \cdot M(n-1)$ for some $i$ (and $M(n-1)$ stands here for some reordering of the indices). You will then get quite an explicit formula. – Ehud Meir Jan 04 '17 at 16:45
  • I understand, it makes sense. But, I anticipate a neat answer for the inverse - just like the determinant. – T. Amdeberhan Jan 04 '17 at 16:50
  • Regarding $a_i$ as an indeterminate in the matrix of Ehud's answer, the matrix is the Varchenko matrix of the arrangement of $n-1$ parallel lines in the plane. See for instance Section 6.5 of http://www.cis.upenn.edu/~cis610/sp06stanley.pdf. Theorem 6.25 of this reference therefore gives a vast generalization of $\det M(n)$. The inverse of $M(n)$ is a tridiagonal matrix with terms $a_i/(a_i^2-1)$ on the diagonals directly above and below the main diagonal, and with $(1,1)$-entry $1/(1-a_1^2)$, $(n,n)$-entry $1/(1-a_{n-1}^2)$, and other diagonal entries $(1-a_i^2a_{i+1}^2)/\big((1-a_i^2)(1-a_{i+1}^2)\big)$. – Richard Stanley Jan 04 '17 at 17:37
  • Solving (2) for $M^{-1}$ (thanks to Stanley) actually gives away (1) more readily, because computing the determinant of the tridiagonal $M^{-1}$ is much easier (thus we get $\det M$ immediately). – T. Amdeberhan Jan 04 '17 at 19:19
  • A slick way to evaluate the determinant of Ehud's matrix is to notice that two rows become equal if we set $a_i=1$, and become negatives of each other if we set $a_i=-1$. A simple degree argument shows that there are no other factors, and setting each $a_i=0$ gives the correct scaling factor (i.e., the polynomial has constant term 1). – Richard Stanley Jan 04 '17 at 22:21
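Stanley's tridiagonal description of $M(n)^{-1}$ in the comments above can be verified exactly with rational arithmetic; a minimal Python sketch using the stdlib `fractions` module (function names and the sample values $a_i$, chosen with $a_i^2\neq 1$, are mine):

```python
from fractions import Fraction

def M_from_a(a):
    """Ehud's matrix from a = [a_1, ..., a_{n-1}]: entry (i,j) with i < j is a_i a_{i+1} ... a_{j-1} (1-indexed)."""
    n = len(a) + 1
    M = [[Fraction(1)] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            p = Fraction(1)
            for k in range(min(i, j), max(i, j)):   # 0-indexed product of a's
                p *= a[k]
            M[i][j] = p
    return M

def inv_tridiag(a):
    """Stanley's tridiagonal candidate for the inverse of M(n)."""
    n = len(a) + 1
    B = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n - 1):                          # super-/sub-diagonal: a_i / (a_i^2 - 1)
        B[i][i + 1] = B[i + 1][i] = a[i] / (a[i] ** 2 - 1)
    B[0][0] = 1 / (1 - a[0] ** 2)                   # (1,1)-entry
    B[n - 1][n - 1] = 1 / (1 - a[-1] ** 2)          # (n,n)-entry
    for i in range(1, n - 1):                       # interior diagonal entries
        B[i][i] = (1 - a[i - 1] ** 2 * a[i] ** 2) / ((1 - a[i - 1] ** 2) * (1 - a[i] ** 2))
    return B

a = [Fraction(p, q) for p, q in ((2, 3), (3, 5), (1, 7), (5, 2))]   # sample, each a_i^2 != 1
n = len(a) + 1
M, B = M_from_a(a), inv_tridiag(a)
prod = [[sum(M[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
assert all(prod[i][j] == (1 if i == j else 0) for i in range(n) for j in range(n))  # M B = I
```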

I wish to advertise the Method of Condensation for proving determinant evaluations. The key is to guess the answer first, which here is thanks to Ehud Meir. Then, generalize it a bit. Let $x_1, x_2,\dots$ be an infinite set of variables and modify the original matrix (by shifting indices) to $M^{a,b}(n)$ so that $$M_{i,j}^{a,b}(n):=\frac{x_{\max(i+a,j+b)}}{x_{\min(i+a,j+b)}}.$$ Convention: $M^{0,0}(n)=M(n)$.

Claim. If $a\neq b$ then $\det M^{a,b}(n)=0$, and if $a=b$ then $$\det M^{a,a}(n)=\prod_{k=2}^n\frac{x_{k-1+a}^2-x_{k+a}^2}{x_{k-1+a}^2}.\tag1$$ Proof. The case $a\neq b$ is easy: simply factor out a variable from the $n$-th column/row and another variable from the $(n-1)$-st column/row; the resulting columns/rows are identical.

Inductive proofs work neatly with Dodgson's recursive relation $$\det Z^{0,0}(n)=\frac{\det Z^{0,0}(n-1)\det Z^{1,1}(n-1)-\det Z^{0,1}(n-1)\det Z^{1,0}(n-1)}{\det Z^{1,1}(n-2)},$$ which is satisfied by any matrix (so long as the denominators do not vanish). In particular, it holds for $\det M^{a,b}(n)$. So, it remains to prove that the (explicit) formula on the RHS of (1) satisfies the same equation. This is quite a routine simplification (preferably done with symbolic software). The proof follows. $\square$
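The condensation step can likewise be checked exactly over the rationals. A minimal Python sketch (stdlib `fractions`; function names and the sample specialization are mine) verifies Dodgson's relation for the shifted matrices $M^{a,b}$, together with the vanishing $\det M^{0,1}(n-1)=0$ from the Claim:

```python
from fractions import Fraction

def M_ab(xs, n, a, b):
    """Shifted matrix M^{a,b}(n): entry (i,j) = x_{max(i+a, j+b)} / x_{min(i+a, j+b)}, 1-indexed."""
    return [[xs[max(i + a, j + b) - 1] / xs[min(i + a, j + b) - 1]
             for j in range(1, n + 1)] for i in range(1, n + 1)]

def det(A):
    """Exact determinant by Gaussian elimination over Fraction."""
    A = [row[:] for row in A]
    n, d = len(A), Fraction(1)
    for c in range(n):
        piv = next((r for r in range(c, n) if A[r][c] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != c:
            A[c], A[piv] = A[piv], A[c]
            d = -d
        d *= A[c][c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for k in range(c, n):
                A[r][k] -= f * A[c][k]
    return d

xs = [Fraction(v) for v in (2, 3, 5, 7, 11, 13)]   # sample rational specialization
n = 5
assert det(M_ab(xs, n - 1, 0, 1)) == 0             # the a != b case of the Claim
# Dodgson's relation, cross-multiplied to avoid division:
lhs = det(M_ab(xs, n, 0, 0)) * det(M_ab(xs, n - 2, 1, 1))
rhs = (det(M_ab(xs, n - 1, 0, 0)) * det(M_ab(xs, n - 1, 1, 1))
       - det(M_ab(xs, n - 1, 0, 1)) * det(M_ab(xs, n - 1, 1, 0)))
assert lhs == rhs
```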

T. Amdeberhan