what are eigen value problems?
In a way, an eigenvalue problem is a problem that looks as if it should have continuous answers, but instead only has discrete ones. The problem is to find the numbers, called eigenvalues, and their matching vectors, called eigenvectors. This is extremely general—it is used in differential equations (because solutions to linear differential equations form linear spaces!) and described in detail in linear algebra.
The basic idea for linear equations is this: Ax = b has a unique solution if A is invertible, and has either many or no solutions if not. If it were basic algebra, we would divide both sides of the equation by A, but division is not defined for matrices. So we work around it; if your calculator’s division button were stuck, and you needed to divide a number by 2, what would you do? You could multiply by 0.5, and get the answer. 0.5 is the (multiplicative) inverse of 2. So we solve by finding the matrix A^(-1) that we can multiply both sides of Ax = b by, giving A^(-1) A x = I x = x = A^(-1) b.
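To make that concrete, here is a small sketch in NumPy with a made-up 2x2 system (the matrix and right-hand side are my own example, not from the question). Note that in practice you call a solver rather than forming A^(-1) explicitly:

```python
import numpy as np

# Hypothetical example system A x = b
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# Conceptually x = A^(-1) b; np.linalg.solve does this without
# explicitly computing the inverse, which is more accurate and faster.
x = np.linalg.solve(A, b)
print(x)                      # the solution vector
print(np.allclose(A @ x, b))  # True: A x reproduces b
```

If A were singular (not invertible), `np.linalg.solve` would raise an error, matching the "many or no solutions" case above.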
The basic idea for eigenvalues is this: Does Ax = Lx hold for some nonzero vector x and some number L? (Usually the Greek letter lambda is used for L.)
Imagine a 3x3 matrix A that transforms vectors in 3D into other vectors in 3D. The eigenvalue question is this: what vectors are not rotated by A? And if they aren’t rotated, are they reversed, smaller, larger, or the same magnitude? What number L is the length of eigenvector x multiplied by when A acts on it?
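You can check Ax = Lx numerically. This sketch uses a small symmetric matrix of my own choosing, whose eigenvalues work out to 1 and 3:

```python
import numpy as np

# Illustrative example matrix (not from the question above)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues L and eigenvectors x (as columns)
vals, vecs = np.linalg.eig(A)
print(vals)  # 1 and 3 for this matrix (order is not guaranteed)

# Verify A x = L x for the first eigenpair
x = vecs[:, 0]
L = vals[0]
print(np.allclose(A @ x, L * x))  # True
```

Each eigenvector is a direction A doesn’t rotate; the matching eigenvalue is the stretch factor along that direction.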
Some matrices don’t have real eigenvalues or eigenvectors — a rotation matrix in 2D, for example, rotates everything. (The zero vector 0 doesn’t count, because that always works trivially.) Some matrices have one or more; some have a “full set” of n eigenvectors in n dimensions. Those matrices we call “diagonalizable.” This is important in quantum mechanics, because being able to diagonalize the matrix for a measurement means the measurement has a definite set of possible answers.
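The rotation example is easy to verify: a 90-degree rotation matrix leaves no real direction fixed, so its eigenvalues come out complex (approximately ±i):

```python
import numpy as np

theta = np.pi / 2  # 90-degree rotation, as an example angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals, vecs = np.linalg.eig(R)
print(vals)  # roughly [0+1j, 0-1j]: no real eigenvalue, so no real
             # direction is left un-rotated by R
```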