Math, asked by shivam667, 1 year ago

Can anyone help me out with this?

Attachments:

Answers

Answered by Neotros

 \sqrt{2} + \frac{3}{\sqrt{2}}
 = \frac{\sqrt{2} \cdot \sqrt{2} + 3}{\sqrt{2}}
 = \frac{2 + 3}{\sqrt{2}}
 = \frac{5}{\sqrt{2}}
Now, rationalising the denominator by multiplying both the numerator and the denominator by \sqrt{2}:
 = \frac{5\sqrt{2}}{2}
 = 2.5 \times 1.41421...
 = 3.53553...
The decimal expansion is non-terminating and non-repeating, so the number cannot be written as a ratio of two integers.
Hence \sqrt{2} + \frac{3}{\sqrt{2}} is an irrational number.
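A tighter justification of this last step, given here only as a sketch and assuming the standard fact that \sqrt{2} is itself irrational (p and q are placeholder integers introduced just for the argument): suppose the value were rational, say

 \frac{5}{\sqrt{2}} = \frac{p}{q} , \quad p, q \text{ integers}, \; q \neq 0
 \implies \sqrt{2} = \frac{5q}{p}

(note p \neq 0, since the value is non-zero). This would express \sqrt{2} as a ratio of integers, contradicting its irrationality. Hence \frac{5}{\sqrt{2}}, and therefore \sqrt{2} + \frac{3}{\sqrt{2}}, is irrational.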
