A ball, initially at rest, is dropped from a certain height. Near the surface of the earth its velocity becomes 19.6 m/s, and its acceleration is 10 m/s². Calculate the time taken by the ball to reach the surface of the earth.
Answers
Answered by
Given: u = 0 m/s
v = 19.6 m/s
a = 10 m/s²
To find t, use
v = u + at
t = (v - u)/a
t = (19.6 - 0)/10
t = 1.96 s
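For a quick numeric check, here is a short Python sketch (an addition, not part of the original answer; the variable names are just illustrative) that reproduces the same calculation, t = (v - u)/a.

# Check t = (v - u) / a for a ball dropped from rest.
u = 0.0    # initial velocity in m/s (dropped from rest)
v = 19.6   # velocity near the ground in m/s
a = 10.0   # acceleration given in the problem in m/s^2

t = (v - u) / a        # rearranged from v = u + a*t
print(f"t = {t} s")    # prints: t = 1.96 s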
Answered by
Hi Vandana, please check whether this answer is right or wrong.
According to the formula
v = u + at
Given:
v = 19.6 m/s
u = 0 m/s
a = 10 m/s²
t = ?
Now put the values into the formula:
v = u + at
19.6 = 0 + 10t
19.6 = 10t
t = 19.6/10
t = 1.96 s
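As an extra cross-check (an addition, not in the original answer), the same result can be verified through the drop height: s = ut + ½at² gives the height fallen, and v² = u² + 2as should return the same 19.6 m/s.

import math

u, a, t = 0.0, 10.0, 1.96          # values from the answer above
s = u * t + 0.5 * a * t ** 2       # height fallen, about 19.2 m
v = math.sqrt(u ** 2 + 2 * a * s)  # recovers 19.6 m/s
print(f"height fallen: {s:.3f} m, final velocity: {v:.1f} m/s")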
Thanks
Regards
Amankhan993653:
@vandana, if it is right, please mark it Brainliest. I will be very thankful.