Answer to Question #103244 in Linear Algebra for michael

Question #103244
Let A = (a_{ij}) be an n \times n matrix, where n \ge 3, with a_{ij} = i - j for each 1 \le i \le n and 1 \le j \le n. Prove that \det(A) = 0.
Expert's answer
2020-02-20T09:00:45-0500

First, let's understand what the matrix looks like. Writing it out in general form gives

A = "\\begin{bmatrix}\n 0 & -1 & -2 &...&1-n \\\\\n 1 & 0 & -1 &...&2-n \\\\\n 2 & 1 & 0 & ...&3-n\\\\\n ...&...&...&...&...\\\\\nn-2&n-1&n&...&-1\\\\\nn-1&n-2&n-3&...&0\n\\end{bmatrix}". Now we can see that if we subtract from the second row the first one, we get the equivalent matrix (it means determinant doesn't change) and in the second row there are only ones. Do the same thing with two last rows (subtract from the last row (row index = n) the previous one (row index = n-1)), then the last row also is full of ones.

Rows 2 and n are now identical, so subtracting the n-th row from the second row produces a row consisting entirely of zeros; this operation, again, leaves the determinant unchanged.

A matrix with a zero row has determinant zero (expand the determinant along that row), so \det(A) = 0.
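As an optional sanity check (not a substitute for the proof), the result can be verified numerically for small n. The sketch below assumes NumPy; the helper name build_A is my own choice:

import numpy as np

def build_A(n):
    # n x n matrix with entries a_ij = i - j (1-based indices)
    i = np.arange(1, n + 1).reshape(-1, 1)  # column vector of row indices
    j = np.arange(1, n + 1).reshape(1, -1)  # row vector of column indices
    return (i - j).astype(float)

for n in range(3, 8):
    # np.linalg.det returns a float; the values are ~0 up to rounding error
    print(n, np.linalg.det(build_A(n)))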

