Show that the determinant of a matrix $A$ is equal to the product of its eigenvalues $\lambda_i$.


So I'm having a tough time figuring this one out. I know that I have to work with the characteristic polynomial of the matrix, $\det(A-\lambda I)$. But, when considering an $n \times n$ matrix, I do not know how to work out the proof. Should I simply use the determinant formula for a general $n \times n$ matrix? I'm guessing not, since that is fairly complicated. Any insights would be great.



Suppose that $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$. Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.

$$\begin{array}{rcl} \det (A-\lambda I)=p(\lambda)&=&(-1)^n (\lambda - \lambda_1 )(\lambda - \lambda_2)\cdots (\lambda - \lambda_n) \\ &=&(-1) (\lambda - \lambda_1 )(-1)(\lambda - \lambda_2)\cdots (-1)(\lambda - \lambda_n) \\ &=&(\lambda_1 - \lambda )(\lambda_2 - \lambda)\cdots (\lambda_n - \lambda)\end{array}$$

The first equality follows from the factorization of a polynomial given its roots; the leading (highest degree) coefficient $(-1)^n$ can be obtained by expanding the determinant along the diagonal.
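(To see this factorization concretely, here is a small SymPy sketch of my own, not part of the original answer; the $3 \times 3$ matrix is arbitrary and chosen only because its eigenvalues come out as nice integers.)

```python
import sympy as sp

lam = sp.symbols('lambda')

# Arbitrary 3x3 example matrix (eigenvalues 1, 2, 4), for illustration only
A = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 2]])

p = (A - lam * sp.eye(3)).det()   # characteristic polynomial det(A - lambda*I)
print(sp.expand(p))               # leading term is (-1)^3 * lambda^3
print(sp.factor(p))               # -(lambda - 1)*(lambda - 2)*(lambda - 4)
```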

Now, by setting $\lambda$ to zero (simply because it is a variable) we obtain on the left side $\det(A)$, and on the right side $\lambda_1 \lambda_2\cdots\lambda_n$, that is, we indeed obtain the desired result

$$ \det(A) = \lambda_1 \lambda_2\cdots\lambda_n$$

So the determinant of the matrix is equal to the product of its eigenvalues.
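(As a quick numerical sanity check, not part of the original answer, one can compare $\det(A)$ with the product of the eigenvalues of a random matrix in NumPy; the size and seed below are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))     # arbitrary real 5x5 matrix

det_A = np.linalg.det(A)
eigvals = np.linalg.eigvals(A)      # generally complex, in conjugate pairs
prod_eig = np.prod(eigvals)         # product is real up to rounding error

print(np.isclose(det_A, prod_eig.real))   # True
print(np.isclose(prod_eig.imag, 0.0))     # True
```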


answered Sep 28 '13 at 7:42 by onimoni, edited Sep 13 '20 at 10:23 by Gaurang Tandon

I am a beginning Linear Algebra learner and this is just my humble opinion.

One idea presented above is the following:

Suppose that $\lambda_1,\ldots, \lambda_n$ are the eigenvalues of $A$.

Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.

$$\det(A-\lambda I)=(\lambda_1-\lambda)(\lambda_2-\lambda)\cdots(\lambda_n-\lambda).$$

Now, by setting $\lambda$ to zero (simply because it is a variable) we get on the left side $\det(A)$, and on the right side $\lambda_1\lambda_2\cdots \lambda_n$, that is, we indeed obtain the desired result

$$\det(A)=\lambda_1\lambda_2\cdots \lambda_n.$$

I don't think that this works in general but only for the case when $\det(A) = 0$.

Because, when we write down the characteristic equation, we use the relation $\det(A - \lambda I) = 0$. Following the same logic, the only case where $\det(A - \lambda I) = \det(A) = 0$ is when $\lambda = 0$. The relation $\det(A - \lambda I) = 0$ must also be obeyed for the special case $\lambda = 0$, which implies $\det(A) = 0$.

UPDATED POST

Here I propose a way to prove the theorem for the 2 by 2 case. Let $A$ be a 2 by 2 matrix.

$$ A = \begin{pmatrix} a_{11} & a_{12}\\ a_{21} & a_{22} \end{pmatrix}$$

The idea is to use a particular property of determinants, namely that the determinant is additive in a single column (or row):

$$ \begin{vmatrix} a_{11} + b_{11} & a_{12} \\ a_{21} + b_{21} & a_{22} \end{vmatrix} = \begin{vmatrix} a_{11} & a_{12}\\ a_{21} & a_{22} \end{vmatrix} + \begin{vmatrix} b_{11} & a_{12}\\ b_{21} & a_{22} \end{vmatrix}$$
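(Here is a small numerical check of this additivity-in-one-column property, added by me with arbitrary numbers; it is not part of the original argument.)

```python
import numpy as np

a = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b_col = np.array([5.0, 6.0])        # the column (b11, b21) being added

lhs = np.linalg.det(np.column_stack((a[:, 0] + b_col, a[:, 1])))
rhs = np.linalg.det(a) + np.linalg.det(np.column_stack((b_col, a[:, 1])))
print(np.isclose(lhs, rhs))         # True: det is additive in each column
```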

Let $\lambda_1$ and $\lambda_2$ be the two eigenvalues of the matrix $A$. (The eigenvalues can be distinct or repeated, real or complex; it doesn't matter.)

The two eigenvalues $\lambda_1$ and $\lambda_2$ must satisfy the following condition:

$$\det (A - \lambda I) = 0$$ where $\lambda$ is an eigenvalue of $A$.


Therefore, $$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda \end{vmatrix} = 0 $$

Therefore, using the property of determinants noted above, I will try to decompose the determinant into parts.

$$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda \end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} - \lambda \end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda \end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda \end{vmatrix}$$

The final determinant can be further reduced.

$$\begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda \end{vmatrix} = \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \end{vmatrix} - \begin{vmatrix} \lambda & 0\\ 0 & \lambda \end{vmatrix}$$

Substituting the final determinant, we will have

$$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda \end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \end{vmatrix} + \begin{vmatrix} \lambda & 0\\ 0 & \lambda \end{vmatrix} = 0 $$
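(This decomposition can be verified symbolically; the following SymPy check is my own addition and not part of the original post.)

```python
import sympy as sp

lam, a11, a12, a21, a22 = sp.symbols('lambda a11 a12 a21 a22')

lhs = sp.Matrix([[a11 - lam, a12], [a21, a22 - lam]]).det()
rhs = (sp.Matrix([[a11, a12], [a21, a22]]).det()
       - sp.Matrix([[a11, a12], [0, lam]]).det()
       - sp.Matrix([[lam, 0], [a21, a22]]).det()
       + sp.Matrix([[lam, 0], [0, lam]]).det())

print(sp.simplify(lhs - rhs))       # 0, so the decomposition is an identity
```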

In a polynomial $$ a_n\lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_1\lambda + a_0\lambda^0 = 0$$ the product of the roots is $(-1)^n a_0 / a_n$ by Vieta's formulas; here the characteristic polynomial of a 2 by 2 matrix is a monic quadratic, so the product of the roots is exactly the coefficient of the term with the 0th power, $a_0$.

From the decomposed determinant, the only term which doesn't involve $\lambda$ is the first term:

$$\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} = \det (A)$$

Therefore, the product of the roots, i.e. the product of the eigenvalues of $A$, is equal to the determinant of $A$.
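(To connect this with the coefficient argument above, here is a SymPy sketch, my own addition, that collects the 2 by 2 characteristic polynomial in powers of $\lambda$ and shows that its constant term is $\det(A)$.)

```python
import sympy as sp

lam, a11, a12, a21, a22 = sp.symbols('lambda a11 a12 a21 a22')

A = sp.Matrix([[a11, a12], [a21, a22]])
p = sp.expand((A - lam * sp.eye(2)).det())
print(sp.collect(p, lam))
# The collected polynomial is lambda**2 - (a11 + a22)*lambda + (a11*a22 - a12*a21):
# a monic quadratic whose constant term, the product of its two roots, is det(A).
```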

I am having difficulty generalizing this idea of proof to the $n$ by $n$ case though, as it is complex and time consuming for me.
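(Since I could not write out the general proof, here is at least a numerical way, my own addition and not a proof, to convince oneself for larger $n$: the constant term of $\det(A - \lambda I)$ is $p(0) = \det(A)$, and by Vieta's formulas it equals the product of the eigenvalues, since the leading coefficient is $(-1)^n$. The matrix size and seed below are arbitrary.)

```python
import sympy as sp

lam = sp.symbols('lambda')
n = 4
A = sp.randMatrix(n, n, min=-5, max=5, seed=1)   # arbitrary integer test matrix

p = sp.Poly((A - lam * sp.eye(n)).det(), lam)
# The constant term of det(A - lambda*I) is p(0) = det(A); by Vieta's formulas it is
# (-1)^n * (leading coefficient) * (product of the roots) = product of the eigenvalues.
print(p.coeff_monomial(1) == A.det())            # True
```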