M**v** = **v**λ
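As a quick numerical check of this definition, here is a minimal NumPy sketch (the matrix and eigenpair are illustrative, not from the post):

```python
import numpy as np

# An illustrative matrix with a known eigenpair (assumption: my own example).
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])  # eigenvector of M with eigenvalue 2
lam = 2.0

# M v = v lambda
print(np.allclose(M @ v, v * lam))  # True
```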
We can extend this concept to an **eigenmatrix** by combining all the n eigenvectors into an n x n matrix that we will call V, and replacing λ with a diagonal matrix D. These two matrices, and the original matrix M, obey the **eigenmatrix equation**:

MV = VD
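This can be verified numerically. A minimal sketch with NumPy (the example matrix is my own, chosen to have real eigenvalues):

```python
import numpy as np

# An illustrative symmetric matrix (assumption: not from the post).
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix V whose
# columns are the corresponding eigenvectors.
eigenvalues, V = np.linalg.eig(M)
D = np.diag(eigenvalues)  # the diagonal eigenvalue matrix

# The eigenmatrix equation: M V = V D.
print(np.allclose(M @ V, V @ D))  # True

# Multiplying out the right-hand side shows why D sits on the right:
# column j of V D is lambda_j times column j of V.
print(np.allclose((V @ D)[:, 0], eigenvalues[0] * V[:, 0]))  # True
```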

It took me some time to understand why V has to alternate left and right in this equation, but it becomes apparent if you matrix-multiply out the right hand side. Doing so recovers the eigenvalue/eigenvector definition we started with, i.e. λ_0 only multiplies the first column of V, etc.

If V is **non-singular** (i.e. det(V) ≠ 0) this leads us to a **very** useful set of operations called **diagonal decomposition** and **conjugation** (in the matrix sense, not the complex number sense). We can rearrange the eigenmatrix equation two ways:
M = VDV^{-1} and V^{-1}MV = D

*Edit: Turns out there's still a lot for me to learn. For matrices where V is singular, you can still usually decompose M into CDC^{-1}, where D is an upper (or lower) triangular matrix and C is different from the matrix of eigenvectors V. Using this decomposition, it's still possible to prove lots of things involving the eigenvalues, determinant and trace.*
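The first rearrangement is easy to check numerically. A sketch with NumPy, using an illustrative matrix of my own with distinct eigenvalues (so V is guaranteed non-singular):

```python
import numpy as np

# Illustrative matrix (assumption: my own example), eigenvalues 5 and 2.
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, V = np.linalg.eig(M)
D = np.diag(eigenvalues)

# Diagonal decomposition: M = V D V^{-1} (requires V non-singular).
print(np.allclose(M, V @ D @ np.linalg.inv(V)))  # True
```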

I.e., the first equation states that a square matrix M can be rewritten as a diagonal matrix D, pre-multiplied by V and post-multiplied by the inverse of V.

This is referred to as the **diagonal decomposition** of M. Usually we're more interested in the value of D alone, but the full equation will be of value in future posts.
The second equation states that any matrix M can be **converted to diagonal form** by pre- and post-multiplication by the inverse of V, and V itself.
This operation is referred to as **conjugation** and is used to prove several important matrix properties.
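As a sketch of conjugation in action (again using my own illustrative matrix), conjugating M by V recovers D, and two of the properties it helps prove, that the trace is the sum of the eigenvalues and the determinant is their product, can be checked directly:

```python
import numpy as np

# Illustrative matrix (assumption: my own example), eigenvalues 5 and 2.
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues, V = np.linalg.eig(M)

# Conjugation: V^{-1} M V = D recovers the diagonal eigenvalue matrix.
D = np.linalg.inv(V) @ M @ V
print(np.allclose(D, np.diag(eigenvalues)))  # True

# Trace = sum of eigenvalues; determinant = product of eigenvalues.
print(np.isclose(np.trace(M), eigenvalues.sum()))        # True
print(np.isclose(np.linalg.det(M), eigenvalues.prod()))  # True
```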