Math, asked by anilvnaik4411, 1 year ago

Algorithm to calculate eigenvectors of a matrix

Answers

Answered by virk51
In numerical analysis, one of the most important problems is designing efficient and stable algorithms for finding the eigenvalues of a matrix. These eigenvalue algorithms may also find eigenvectors.

Eigenvalues and eigenvectors

Main articles: Eigenvalues and eigenvectors, and Generalized eigenvector

Given an n × n square matrix A of real or complex numbers, an eigenvalue λ and its associated generalized eigenvector v are a pair obeying the relation[1]

(A − λI)^k v = 0,
where v is a nonzero n × 1 column vector, I is the n × n identity matrix, k is a positive integer, and both λ and v are allowed to be complex even when A is real. When k = 1, the vector is called simply an eigenvector, and the pair is called an eigenpair. In this case, Av = λv. Any eigenvalue λ of A has ordinary[note 1] eigenvectors associated to it, for if k is the smallest integer such that (A − λI)^k v = 0 for a generalized eigenvector v, then (A − λI)^(k−1) v is an ordinary eigenvector. The value k can always be taken as less than or equal to n. In particular, (A − λI)^n v = 0 for all generalized eigenvectors v associated with λ.
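As a quick illustration of the k = 1 case, here is a minimal NumPy sketch (the 2 × 2 matrix is made up purely for demonstration) that computes eigenpairs and checks the defining relation Av = λv numerically:

    import numpy as np

    # Example matrix, chosen arbitrarily for illustration.
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # np.linalg.eig returns the eigenvalues and a matrix whose columns
    # are the corresponding (ordinary) eigenvectors.
    eigenvalues, eigenvectors = np.linalg.eig(A)

    for i, lam in enumerate(eigenvalues):
        v = eigenvectors[:, i]
        # Check A v = lambda v up to floating-point rounding error.
        print(lam, np.allclose(A @ v, lam * v))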

For each eigenvalue λ of A, the kernel ker(A − λI) consists of all eigenvectors associated with λ (along with 0), called the eigenspace of λ, while the vector space ker((A − λI)^n) consists of all generalized eigenvectors, and is called the generalized eigenspace. The geometric multiplicity of λ is the dimension of its eigenspace. The algebraic multiplicity of λ is the dimension of its generalized eigenspace. The latter terminology is justified by the equation

pA(z) = det(zI − A) = (z − λ1)^α1 · (z − λ2)^α2 · … · (z − λk)^αk,

where det is the determinant function, the λi are all the distinct eigenvalues of A and the αi are the corresponding algebraic multiplicities. The function pA(z) is the characteristic polynomial of A. So the algebraic multiplicity is the multiplicity of the eigenvalue as a zero of the characteristic polynomial. Since any eigenvector is also a generalized eigenvector, the geometric multiplicity is less than or equal to the algebraic multiplicity. The algebraic multiplicities sum up to n, the degree of the characteristic polynomial. The equation pA(z) = 0 is called the characteristic equation, as its roots are exactly the eigenvalues of A. By the Cayley–Hamilton theorem, A itself obeys the same equation: pA(A) = 0.[note 2] As a consequence, the columns of the matrix ∏i≠j (A − λiI)^αi must be either 0 or generalized eigenvectors of the eigenvalue λj, since they are annihilated by (A − λjI)^αj. In fact, the column space is the generalized eigenspace of λj.
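As a rough numerical sketch of these definitions (the 3 × 3 matrix below is an arbitrary example with a repeated eigenvalue), the roots of the characteristic polynomial can be compared with the eigenvalues, and the geometric multiplicity of an eigenvalue λ computed as n − rank(A − λI):

    import numpy as np

    # Arbitrary example: the eigenvalue 2 has algebraic multiplicity 2.
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 2.0, 0.0],
                  [0.0, 0.0, 3.0]])
    n = A.shape[0]

    # np.poly(A) gives the coefficients of the characteristic polynomial
    # pA(z) = det(zI - A), so its roots are the eigenvalues, repeated
    # according to algebraic multiplicity.
    coeffs = np.poly(A)
    print(np.roots(coeffs))            # roots of pA(z)
    print(np.linalg.eigvals(A))        # eigenvalues of A (same values)

    # Geometric multiplicity of lambda = 2: dim ker(A - 2I) = n - rank(A - 2I).
    lam = 2.0
    print(n - np.linalg.matrix_rank(A - lam * np.eye(n)))   # 1, which is < 2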

Any collection of generalized eigenvectors of distinct eigenvalues is linearly independent, so a basis for all of C^n can be chosen consisting of generalized eigenvectors. More particularly, this basis {v1, v2, ..., vn} can be chosen and organized so that

if vi and vj have the same eigenvalue, then so does vk for each k between i and j, and

if vi is not an ordinary eigenvector, and if λi is its eigenvalue, then (A − λiI)vi = vi−1 (in particular, v1 must be an ordinary eigenvector).

If these basis vectors are placed as the column vectors of a matrix V = [ v1 v2 ... vn ], then V can be used to convert A to its Jordan normal form:

V^(−1)AV = J,

where the λi are the eigenvalues, appearing on the diagonal of J, the βi appear on its superdiagonal, and all other entries of J are zero; βi = 1 if (A − λi+1I)vi+1 = vi and βi = 0 otherwise.
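For small exact matrices this conversion can be carried out symbolically; the sketch below assumes SymPy is available and uses its Matrix.jordan_form method on an arbitrary example that is not diagonalizable:

    from sympy import Matrix

    # Arbitrary example with a repeated eigenvalue and a 2x2 Jordan block.
    A = Matrix([[2, 1, 0],
                [0, 2, 0],
                [0, 0, 3]])

    # jordan_form returns P and J with A = P * J * P**-1;
    # the columns of P are (generalized) eigenvectors of A.
    P, J = A.jordan_form()

    print(J)                      # the Jordan normal form of A
    print(P**-1 * A * P == J)     # True: P converts A to J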

More generally, if W is any invertible matrix, and λ is an eigenvalue of A with generalized eigenvector v, then (W^(−1)AW − λI)^k W^(−1)v = 0. Thus λ is an eigenvalue of W^(−1)AW with generalized eigenvector W^(−1)v. That is, similar matrices have the same eigenvalues.
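A quick numerical sanity check of this similarity invariance, with A and W chosen arbitrarily (assuming NumPy):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    W = np.array([[2.0, 1.0],
                  [1.0, 1.0]])    # invertible (determinant 1)

    # The similar matrix W^-1 A W has the same eigenvalues as A
    # (up to ordering and floating-point rounding).
    B = np.linalg.inv(W) @ A @ W
    print(np.sort(np.linalg.eigvals(A)))
    print(np.sort(np.linalg.eigvals(B)))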

Normal, Hermitian, and real-symmetric matrices

Main articles: Adjoint matrix, Normal matrix, and Hermitian matrix

The adjoint M* of a complex matrix M is the transpose of the conjugate of M: M* = (M̄)^T. A square matrix A is called normal if it commutes with its adjoint: A*A = AA*. It is called hermitian if it is equal to its adjoint: A* = A. All hermitian matrices are normal. If A has only real elements, then the adjoint is just the transpose, and A is hermitian if and only if it is symmetric. When applied to column vectors, the adjoint can be used to define the canonical inner product on C^n: w · v = w* v.[note 3] Normal, hermitian, and real-symmetric matrices have several useful properties:

Every generalized eigenvector of a normal matrix is an ordinary eigenvector.

Any normal matrix is similar to a diagonal matrix, since its Jordan normal form is diagonal.

Eigenvectors of distinct eigenvalues of a normal matrix are orthogonal.

The null space and the image (or column space) of a normal matrix are orthogonal to each other.

For any normal matrix A, C^n has an orthonormal basis consisting of eigenvectors of A. The corresponding matrix of eigenvectors is unitary.

The eigenvalues of a hermitian matrix are real, since (λ̄ − λ)v = (A* − A)v = (A − A)v = 0 for a non-zero eigenvector v.

If A is real, there is an orthonormal basis for R^n consisting of eigenvectors of A if and only if A is symmetric.
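These are the properties exploited by symmetric/hermitian eigensolvers in numerical libraries; the following small sketch (an arbitrary real symmetric matrix, assuming NumPy) checks that np.linalg.eigh returns real eigenvalues and an orthonormal set of eigenvectors:

    import numpy as np

    # Arbitrary real symmetric matrix.
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 1.0],
                  [0.0, 1.0, 2.0]])

    # np.linalg.eigh is intended for hermitian / real-symmetric matrices;
    # it returns real eigenvalues (in ascending order) and orthonormal eigenvectors.
    eigenvalues, Q = np.linalg.eigh(A)

    print(eigenvalues)                                       # all real
    print(np.allclose(Q.T @ Q, np.eye(3)))                   # True: orthonormal columns
    print(np.allclose(A, Q @ np.diag(eigenvalues) @ Q.T))    # True: A = Q diag(λ) Q^T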

