Question #3459

Let β = {u1, u2, ..., un} be a subset of F^n containing n distinct vectors, and let B be the n×n matrix over F having uj as column j.
Prove that β is a basis for F^n if and only if det(B) ≠ 0.
For one direction of the proof, this is something that I found:
Since β consists of n vectors, β is a basis if and only if these vectors are linearly independent, which is equivalent to the map LB being one-to-one. Since the matrix B is square, this is in turn equivalent to B being invertible, hence having a nonzero determinant.
However, I do not understand the transition from the vectors being linearly independent to the map LB being one-to-one. Why is this true? Is there a better way to prove this? Also, how do I prove the reverse direction?

Expert's answer

Necessity:

We have that β = {u1, u2, ..., un} is a basis of F^n; in particular, the vectors u1, ..., un are linearly independent.

Suppose we have an equation:

c1·u1 + c2·u2 + ... + cn·un = 0.

Written out component by component, this is a system of n linear equations in the unknowns c1, ..., cn, and the matrix of this system is B; that is, the equation reads Bc = 0 for c = (c1, ..., cn). By linear independence, we know that c1 = c2 = ... = cn = 0 is the only solution.

So det(B) ≠ 0: if det(B) were 0, the homogeneous system Bc = 0 would have a nontrivial solution (by the theory of square linear systems behind Cramer's rule, a unique solution exists exactly when det(B) ≠ 0), contradicting the uniqueness above.
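As a quick numerical sanity check of this direction (an illustration, not part of the proof), here is a sketch over F = R with n = 3; the specific vectors are made up for the example:

```python
# Numerical illustration: three linearly independent columns give det(B) != 0,
# and the homogeneous system Bc = 0 then has only the trivial solution.
import numpy as np

# hypothetical basis vectors u1, u2, u3 of R^3, stacked as the columns of B
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

print(np.linalg.det(B))              # nonzero (2, up to roundoff)
c = np.linalg.solve(B, np.zeros(3))  # the unique solution of Bc = 0
print(c)                             # [0. 0. 0.]
```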

Sufficiency:

We have that det(B) ≠ 0. Let's prove that β = {u1, u2, ..., un} is a basis of F^n.

Suppose, to the contrary, that some uk is a linear combination of the remaining vectors.

Let ci (for i ≠ k) be the coefficients of that combination:

uk = ∑_{i≠k} ci·ui.

Substituting this expression into column k of B and expanding the determinant by linearity in that column gives

det(B) = ∑_{i≠k} ci·det(Bi) = 0,

where Bi denotes B with column k replaced by ui, because each of the determinants det(Bi) = 0 (each Bi has 2 equal columns: columns i and k both equal ui).

But det(B) must be non-zero. So our assumption was wrong, and no uk is a linear combination of the other vectors; that is, u1, ..., un are linearly independent.

As we have n linearly independent vectors in F^n, β is a basis of F^n.
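The column-replacement argument can also be checked numerically. Below is a sketch over F = R with n = 3, using made-up vectors and coefficients c1, c2: if u3 = c1·u1 + c2·u2, then expanding by linearity in column 3 gives a sum of determinants, each with a repeated column, so everything vanishes.

```python
# Illustration of the expansion by linearity in column 3 (not part of the proof).
import numpy as np

u1 = np.array([1.0, 0.0, 2.0])
u2 = np.array([0.0, 1.0, 1.0])
c1, c2 = 3.0, -2.0        # hypothetical coefficients, chosen for illustration
u3 = c1 * u1 + c2 * u2    # u3 is a linear combination of u1 and u2

B = np.column_stack([u1, u2, u3])
print(np.linalg.det(B))   # 0, up to floating-point roundoff

# The two terms of the expansion: each matrix has two equal columns,
# so each determinant vanishes, and so does their weighted sum.
B1 = np.column_stack([u1, u2, u1])
B2 = np.column_stack([u1, u2, u2])
print(c1 * np.linalg.det(B1) + c2 * np.linalg.det(B2))  # 0 as well
```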
