Given that vector A + vector B + vector C = 0, and of the three vectors two are equal in magnitude while the magnitude of the third is √2 times that of either of the two equal ones, find the angles between the vectors.
Answers
Aloha user
a+b+c=0
=> a+b = -c
and so now
|a+b| = |c|
which means the resultant of the vector sum of the two (a & b) is equal to -c.
Now
let
|a|=|b|=a
given
|c|=Sqrt[2] a and is equal to |a+b|
Now the magnitude of the resultant of a & b is
|a+b| = Sqrt[ |a|^2 + |b|^2 + 2 |a| |b| Cos(angle between a & b) ]
= Sqrt[ 2 a^2 + 2 a^2 Cos(angle between a & b) ]
=> Sqrt[2] a = Sqrt[2] a Sqrt[1+Cos(angle between a & b) ]
=> Sqrt[1+Cos(angle between a & b) ] = 1
=> 1+Cos(angle between a & b) = 1
=> Cos(angle between a & b) = 0
=> angle between a & b = 90 degrees
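As a quick numerical check of this step (a minimal Python sketch, assuming |a| = |b| = 1 so that |c| = Sqrt[2]; the variable names are just for illustration):

import math

# Assume |a| = |b| = 1, so |c| = sqrt(2) (any common magnitude works the same way).
a_mag = b_mag = 1.0
c_mag = math.sqrt(2) * a_mag

# Rearrange |c|^2 = |a|^2 + |b|^2 + 2 |a| |b| Cos(theta) to solve for Cos(theta).
cos_theta = (c_mag**2 - a_mag**2 - b_mag**2) / (2 * a_mag * b_mag)
print(math.degrees(math.acos(cos_theta)))  # 90.0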
and since a & b are equal in magnitude and " -c " is the resultant of their vector sum, -c lies exactly midway between a & b; that is, " -c " makes a 45 degree angle with each of a & b, and hence "c" makes a (180 - 45) degree angle with each of a & b,
so
angle between a & b = 90 degrees
angle between a & c = angle between b & c = 180 - 45 degrees
= 135 degrees
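For completeness, a small verification sketch (again assuming |a| = |b| = 1, placing a along x and b along y since they are 90 degrees apart, and taking c = -(a + b)):

import math

a = (1.0, 0.0)
b = (0.0, 1.0)
c = (-(a[0] + b[0]), -(a[1] + b[1]))

def mag(v):
    return math.hypot(v[0], v[1])

def angle_deg(u, v):
    # Angle between two vectors from the dot-product formula.
    dot = u[0] * v[0] + u[1] * v[1]
    return math.degrees(math.acos(dot / (mag(u) * mag(v))))

print(mag(c))           # 1.414... = sqrt(2) * |a|
print(angle_deg(a, b))  # 90.0
print(angle_deg(a, c))  # 135.0
print(angle_deg(b, c))  # 135.0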