Math, asked by sopp9526, 10 days ago

A radar gun measured the speed of a baseball at 103 miles per hour. If the baseball was actually going 102.8 miles per hour, what was the percent error in this measurement?

Answers

Answered by sanchitay868

Answer:

The percent error is approximately 0.19%.

Step-by-step explanation:

Percent error measures how far the measured value is from the actual value, relative to the actual value:

percent error = |measured − actual| / actual × 100

Substituting the measured speed (103 mph) and the actual speed (102.8 mph):

percent error = |103 − 102.8| / 102.8 × 100 = 0.2 / 102.8 × 100 ≈ 0.1946% ≈ 0.19%
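As a quick check, here is a minimal Python sketch of the same calculation (the function name percent_error is just for illustration, not from the original problem):

def percent_error(measured, actual):
    # Percent error: |measured - actual| / actual * 100
    return abs(measured - actual) / actual * 100

print(percent_error(103, 102.8))  # prints ~0.1946, i.e. about 0.19%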
