An object is thrown downward with an initial velocity of 19 feet per second. The distance d (in feet) it travels in time t (in seconds) is given by the equation d = 19t + 15t². How long does it take the object to fall 50 feet?
Answers
Explanation:
Consider the given relationship,
d = 19t + 15t², where d = 50 ft in this case.
Hence,
50 = 19t + 15t²
⇒ 15t² + 19t − 50 = 0
Applying the quadratic formula, t = (−19 ± √(19² + 4·15·50)) / (2·15) = (−19 ± √3361) / 30
∴ t ≈ 1.30 s or t ≈ −2.57 s
The only physically significant answer here is t ≈ 1.30 s.
Note that you are given a direct relationship between distance and time; it does not require working with velocity or acceleration separately. You can simply substitute the distance into the given relationship and solve the resulting quadratic for time, keeping the positive root. A quick numerical check is sketched below.
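If you want to verify the arithmetic, here is a minimal Python sketch (the variable names are just illustrative, not part of the original problem) that solves 15t² + 19t − 50 = 0 with the standard quadratic formula:

import math

# coefficients of 15t^2 + 19t - 50 = 0, from d = 19t + 15t^2 with d = 50 ft
a, b, c = 15.0, 19.0, -50.0

disc = b**2 - 4*a*c                      # discriminant: 19^2 + 4*15*50 = 3361
t_pos = (-b + math.sqrt(disc)) / (2*a)   # about 1.30 s
t_neg = (-b - math.sqrt(disc)) / (2*a)   # about -2.57 s (not physical)

print(t_pos, t_neg)

Only the positive root makes sense physically, so the object takes roughly 1.3 seconds to fall 50 feet.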