Math, asked by avinsmallo9664, 1 year ago

Suppose that S1 is the set of symmetric 2×2 matrices and that S2 is the set of skew-symmetric 2×2 matrices. Prove that span(S1 ∪ S2)

Answers

Answered by rjgolu

If we consider

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix},$$

then

$$A^{t} = \begin{pmatrix} a & c \\ b & d \end{pmatrix}.$$

So for a matrix to be in our subspace (that is, to satisfy $-A = A^{t}$), we would have

$$\begin{pmatrix} -a & -b \\ -c & -d \end{pmatrix} = \begin{pmatrix} a & c \\ b & d \end{pmatrix},$$

which implies that $-a = a$ and $-d = d$, so $a$ and $d$ must both be $0$, and that $b = -c$.
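
To spell out the step for the diagonal entries (working over the reals):

$$-a = a \;\Longrightarrow\; 2a = 0 \;\Longrightarrow\; a = 0,$$

and the same argument gives $d = 0$.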

Therefore the resulting set of matrices will look like

$$\begin{pmatrix} 0 & -c \\ c & 0 \end{pmatrix}.$$

This is where I run into trouble: I'm not sure if the dimension of this subspace is 1 or 2. If I factor the $c$ out, then I have the matrix

$$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$$

and by taking scalar multiples of this matrix I can create any matrix in the subspace of 2×2 skew-symmetric matrices. It's also linearly independent, so it should be a basis.
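
Concretely, every matrix in the subspace is a scalar multiple of it:

$$\begin{pmatrix} 0 & -c \\ c & 0 \end{pmatrix} = c \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$$

and the set containing just this one matrix is linearly independent, since the multiple is the zero matrix only when $c = 0$.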

My confusion arises in that I could also decompose the matrix further into the two matrices

$$\begin{pmatrix} 0 & -c \\ 0 & 0 \end{pmatrix}$$

and

$$\begin{pmatrix} 0 & 0 \\ c & 0 \end{pmatrix}.$$

These two matrices are linearly independent and they sum to the matrix above, so I was wondering if that means the dimension of the subspace should be 2. But since the single choice of $c$ affects both matrices (you can't choose two different weights for them), I wonder whether this decomposition fails to give a basis, and the dimension of the subspace must in fact be 1.
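
One way to make this precise is to put separate weights on the two pieces and check which combinations are still skew-symmetric:

$$\alpha \begin{pmatrix} 0 & -1 \\ 0 & 0 \end{pmatrix} + \beta \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 0 & -\alpha \\ \beta & 0 \end{pmatrix},$$

and this equals the negative of its own transpose only when $\alpha = \beta$. Neither piece on its own is skew-symmetric, so only one free parameter survives inside the subspace.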

Sorry that was a long post, and maybe not the cleanest; I'm still getting used to LaTeX. Thanks for any help.
