Science, asked by rimanayjaijazz, 1 year ago

A child drops a ball from a height of 10 m. Assume that its velocity increases uniformly at the rate of 10 m per second squared. Find the velocity with which the ball strikes the ground and the time taken by the ball to reach the ground.

Answers

Answered by DIVINEREALM

ANSWER:

Height (h) = 10 m

Acceleration (a) = g = 10 m/s²

Initial velocity (u) = 0 m/s (the ball is dropped, not thrown)

Let the final velocity be v m/s.

Now, using the equation of motion:

v² = u² + 2gh

v = √(2gh)        (since u = 0)

  = √(2 × 10 × 10)

  = √200

  = 10√2 m/s ≈ 14.14 m/s
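A quick numeric check of this result (a minimal Python sketch using the values given above, g = 10 m/s² and h = 10 m):

import math

g = 10.0   # acceleration due to gravity, m/s^2 (as given in the question)
h = 10.0   # drop height, m
u = 0.0    # initial velocity, m/s (the ball is dropped from rest)

# v^2 = u^2 + 2gh  ->  v = sqrt(u^2 + 2gh)
v = math.sqrt(u**2 + 2 * g * h)
print(f"Velocity on striking the ground: {v:.2f} m/s")  # ~14.14 m/s, i.e. 10*sqrt(2)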

Now, let the time taken be t seconds.

Again, using the equation of motion:

h = ut + (1/2)gt²

h = (1/2)gt²        (since u = 0)

t = √(2h/g)

  = √(2 × 10 / 10)

  = √2 s ≈ 1.41 s

∴ Velocity with which the ball strikes the ground = 10√2 m/s ≈ 14.14 m/s

∴ Time taken to reach the ground = √2 s ≈ 1.41 s
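The time of fall can be checked the same way (again a sketch, using the same assumed values g = 10 m/s² and h = 10 m):

import math

g = 10.0   # acceleration, m/s^2
h = 10.0   # drop height, m

# h = (1/2) g t^2  ->  t = sqrt(2h/g)
t = math.sqrt(2 * h / g)
print(f"Time to reach the ground: {t:.3f} s")  # ~1.414 s, i.e. sqrt(2)

Both checks agree with the worked answer: 10√2 m/s ≈ 14.14 m/s and √2 s ≈ 1.41 s.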

Answered by lakshya1290

Answer:

v = 14.14 m/s (10√2 m/s)

t = 1.414 s (√2 s)


Explanation:

Please see the attached image for the working.

