Physics, asked by suhan03, 3 months ago

An enemy plane is at a distance of 500 km from a radar. In how much time will the radar be able to detect the plane? Take the velocity of radio waves as 3 × 10⁸ m/s.

Answers

Answered by Anonymous

Velocity of radio waves = 3 × 10⁸ m/s

= (3 × 10⁸ ÷ 10³) km/s

= 3 × 10⁵ km/s

Total distance travelled by the signal = 2 × 500 km = 1000 km [the signal is reflected back from the plane]

Let the time taken to detect the plane be t

We know,

s = v × t

⟹ t = s ÷ v

⟹ t = (1000 km)/(3 × 10⁵ km/s)

⟹ t = 1/(3 × 10²) s

t = 0.333 × 10⁻² s ≈ 3.33 × 10⁻³ s

∴ The radar will take about 3.33 × 10⁻³ seconds to detect the plane.

Answered by tennetiraj86

Explanation:

The distance between the plane and the radar = 500 km

We know that 1 km = 1000 m

Distance = 500 × 1000 = 500000 m = 5 × 10⁵ m

The radio waves must cover this distance twice (to the plane and back) before the radar detects the plane,

so total distance = 2 × 500000 = 1000000 m = 1 × 10⁶ m

Velocity of radio waves = 3 × 10⁸ m/s

Time taken to detect = Distance ÷ Speed

= (1 × 10⁶)/(3 × 10⁸) s

=> (1/3) × 10⁶⁻⁸ s

=> (1/3) × 10⁻² s (or)

=> 0.33 × 10⁻² s ≈ 3.33 × 10⁻³ s
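
As a quick check, here is a minimal Python sketch of the same round-trip calculation (the variable names are just illustrative):

    # Round-trip time for the radar signal (to the plane and back)
    distance_km = 500            # one-way distance to the plane
    speed_m_per_s = 3e8          # speed of radio waves in m/s

    round_trip_m = 2 * distance_km * 1000   # total path length in metres
    time_s = round_trip_m / speed_m_per_s   # time = distance / speed

    print(time_s)                # ≈ 0.00333 s, i.e. about 3.33 × 10⁻³ s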
