Physics, asked by alfiyanahas, 9 months ago

If operators A and B are Hermitian, then show that
i[A, B] is Hermitian. What relation must exist
between operators A and B in order that AB is
Hermitian?

Answers

Answered by mvm9363

Answer:

In 18.06, we mainly worry about matrices and column vectors: finite-dimensional linear algebra. But into the syllabus pops an odd topic: Fourier series. What do these have to do with linear algebra? Where do their interesting properties, like orthogonality, come from?

In these notes, written to accompany 18.06 lectures in Fall 2007, we discuss these mysteries: Fourier series come from taking concepts like eigenvalues and eigenvectors and Hermitian matrices and applying them to functions instead of finite column vectors. In this way, we see that important properties like the orthogonality of the Fourier series arise not by accident, but as a special case of a much more general fact, analogous to the fact that Hermitian matrices have orthogonal eigenvectors.

This material is important in at least two other ways. First, it shows you that the things you learn in 18.06 are not limited to matrices: they are tremendously more general than that. Second, in practice most large linear-algebra problems in science and engineering come from differential operators on functions, and the best way to analyze these problems in many cases is to apply the same linear-algebra concepts to the underlying function spaces.

2 Review: Finite-dimensional linear algebra

Most of 18.06 deals with finite-dimensional linear algebra. In particular, let's focus on the portion of the course having to do with square matrices and eigenproblems. There, we have:

• Vectors x: column vectors in ℝⁿ (real) or ℂⁿ (complex).

• Dot products x·y = xᴴy. These have the key properties: x·x = ‖x‖² > 0 for x ≠ 0; x·y = (y·x)* (the complex conjugate); x·(αy + βz) = α x·y + β x·z.

• n×n matrices A. The key fact is that we can multiply A by a vector to get a new vector, and matrix-vector multiplication is linear: A(αx + βy) = αAx + βAy.


• Transposes Aᵀ and adjoints Aᴴ = (Aᵀ)* (the conjugate transpose). The key property here is that x·(Ay) = (Aᴴx)·y … the whole reason that adjoints show up is to move matrices from one side to the other in dot products.

• Hermitian matrices A = Aᴴ, for which x·(Ay) = (Ax)·y. Hermitian matrices have three key consequences for their eigenvalues/vectors: the eigenvalues λ are real; the eigenvectors are orthogonal; and the matrix is diagonalizable (in fact, the eigenvectors can be chosen in the form of an orthonormal basis).
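As a concrete check (a minimal NumPy sketch, not part of the original notes; the matrix and vectors here are arbitrary illustrative examples), the adjoint rule and the Hermitian eigenvalue/eigenvector facts above can be verified numerically:

```python
import numpy as np

# Build a random Hermitian matrix A = A^H (illustrative 4x4 example).
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2          # symmetrizing makes A equal its own adjoint

# Adjoint rule: x . (A y) = (A^H x) . y, with the dot product x . y = x^H y.
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)
lhs = np.vdot(x, A @ y)           # np.vdot conjugates its first argument: x^H (A y)
rhs = np.vdot(A.conj().T @ x, y)  # (A^H x)^H y
assert np.isclose(lhs, rhs)

# Eigenvalues are real; eigenvectors can be chosen as an orthonormal basis.
eigvals, eigvecs = np.linalg.eigh(A)
assert np.allclose(np.imag(eigvals), 0.0)
assert np.allclose(eigvecs.conj().T @ eigvecs, np.eye(4))
print("Hermitian-matrix properties verified")
```

`np.linalg.eigh` is used (rather than `np.linalg.eig`) because it exploits the Hermitian structure and returns real eigenvalues with orthonormal eigenvector columns directly.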

Now, we wish to carry over these concepts to functions instead of column vectors, and we will see that we arrive at Fourier series and many more remarkable things.
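For completeness, the question posed at the top can be answered directly from the adjoint rules reviewed above, namely (AB)† = B†A† and (cA)† = c̄A†. A short sketch:

```latex
% Given A^\dagger = A and B^\dagger = B:
%
% (1) i[A,B] is Hermitian:
\bigl(i[A,B]\bigr)^\dagger
  = -i\,(AB - BA)^\dagger
  = -i\,\bigl(B^\dagger A^\dagger - A^\dagger B^\dagger\bigr)
  = -i\,(BA - AB)
  = i\,(AB - BA)
  = i[A,B].
%
% (2) When is AB Hermitian?
(AB)^\dagger = B^\dagger A^\dagger = BA
  \;\Longrightarrow\;
  AB \text{ is Hermitian iff } AB = BA, \text{ i.e. } [A,B] = 0.
```

So i[A,B] is always Hermitian for Hermitian A and B, while the product AB is Hermitian exactly when A and B commute.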

