Science, asked by Mansi2712, 1 year ago

If a cheetah spots its prey at a distance of 100 m, what is the minimum time it would take to reach its prey if the average velocity attained by it is 90 km/h?

Answers

Answered by amitsingh995819

Given,

Distance = 100 m

Speed = 90 km/h

Converting km/h to m/s:

Speed (m/s) = (90 × 1000) / (60 × 60)

=> 25 m/s

We can also use a shortcut to convert km/h to m/s:

=> Speed (m/s) = 90 ÷ 3.6

=> 25 m/s

Now,

Time = Distance ÷ Speed

Time = 100 ÷ 25

Time = 4 seconds
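The two steps above (unit conversion, then time = distance ÷ speed) can be checked with a short Python sketch; the variable names are mine, not part of the original answer:

```python
# Convert km/h to m/s, then compute the time to cover a given distance.
distance_m = 100   # distance to the prey, in metres
speed_kmh = 90     # average speed, in km/h

# 1 km = 1000 m and 1 h = 3600 s, so multiply by 1000/3600.
speed_ms = speed_kmh * 1000 / 3600

# Time = Distance / Speed
time_s = distance_m / speed_ms

print(speed_ms)  # 25.0 (m/s)
print(time_s)    # 4.0 (seconds)
```

Note that 1000/3600 simplifies to 1/3.6, which is exactly the shortcut used above.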

Answered by soniatiwari214

Concept:

  • One dimensional motion
  • Kinematic equations
  • Speed is the distance covered per unit of time
  • The required time can be determined from the known values of distance and velocity.

Given:

  • Distance between the cheetah and the prey, s = 100 m
  • Average velocity, v = 90 km/h = 90 × 1000/3600 m/s = 25 m/s

Find:

  • The minimum time the cheetah would take to get its prey

Solution:

The distance between the cheetah and the prey, s = 100 m.

Speed is the distance covered per unit of time:

speed = distance/time

v = s/t

v = 25 m/s

t = s/v

t = 100/25 = 4 s

The minimum time for the cheetah to reach its prey is 4 s.
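The relation t = s/v used in this solution generalises to any distance and speed. A minimal sketch (the helper name `time_to_cover` is my own, not from the answer):

```python
def time_to_cover(distance_m: float, speed_kmh: float) -> float:
    """Return the time in seconds to cover distance_m at speed_kmh."""
    speed_ms = speed_kmh / 3.6  # dividing km/h by 3.6 gives m/s
    return distance_m / speed_ms  # t = s / v

print(time_to_cover(100, 90))  # 4.0
```

The same function answers any variant of this question, e.g. a 200 m chase at the same speed would take 8 s.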

