Math, asked by zoesplace6, 6 months ago

A radar gun measured the speed of a baseball at 103 miles per hour. If the baseball was actually going 102.8 miles per hour, what was the percent error in this measurement?

0.19%

0.20%

0.21%

0.22%

Answers

Answered by akshayrajv2006

Answer:

The radar speed gun was invented by John L. Barker Sr. and Ben Midlock, who developed radar for the military while working for the Automatic Signal Company (later the Automatic Signal Division of LFE Corporation) in Norwalk, CT, during World War II.

Step-by-step explanation:

A radar speed gun (also radar gun or speed gun) is a device used to measure the speed of moving objects. It is used in law enforcement to measure the speed of moving vehicles, and often in professional spectator sports, for things such as measuring bowling speeds in cricket, the speed of pitched baseballs, athletes, and tennis serves.


A radar speed gun is a Doppler radar unit that may be hand-held, vehicle-mounted, or static. It measures the speed of the objects at which it is pointed by detecting the change in frequency of the returned radar signal caused by the Doppler effect: the frequency of the returned signal is raised in proportion to the object's speed if the object is approaching, and lowered if it is receding. Such devices are frequently used for speed-limit enforcement, although more modern LIDAR speed guns, which use pulsed laser light instead of radar, began to replace radar guns during the first decade of the twenty-first century because of limitations associated with small radar systems.
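
As a rough illustration of the Doppler principle described above, the sketch below recovers a target's speed from the measured frequency shift using the low-speed, two-way approximation delta_f ≈ 2 * v * f0 / c. The 34.7 GHz carrier is an assumed, illustrative value (Ka-band traffic radars operate near that frequency); it is not specified anywhere in this question.

```python
# Sketch: target speed from the Doppler shift of a reflected radar signal.
# Assumes the low-speed, two-way approximation delta_f ~= 2 * v * f0 / c.

C = 299_792_458.0        # speed of light, m/s
F0 = 34.7e9              # assumed Ka-band carrier frequency, Hz
MPS_TO_MPH = 2.236936    # metres per second -> miles per hour

def speed_from_doppler_shift(delta_f_hz: float) -> float:
    """Return the target speed in mph for a measured frequency shift in Hz."""
    v_mps = delta_f_hz * C / (2.0 * F0)   # invert delta_f = 2 * v * f0 / c
    return v_mps * MPS_TO_MPH

# A ~103 mph pitch (about 46.0 m/s) shifts the return by roughly 10.7 kHz:
print(f"{speed_from_doppler_shift(10_659):.1f} mph")  # ~103.0 mph
```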

Answered by tejasvinisinhaps23

The percent error in this measurement is approximately 0.19%.

Step-by-step explanation: Given: a radar gun measured the speed of a baseball at 103 miles per hour, while the actual speed was 102.8 miles per hour.

Percent error = |measured − actual| / actual × 100
= |103 − 102.8| / 102.8 × 100
= 0.2 / 102.8 × 100
≈ 0.1946%

Rounded to two decimal places, that is 0.19%, so the first option is correct.
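
For anyone who wants to check the arithmetic, here is a minimal sketch of the percent-error formula in Python (the variable names are illustrative, not from the question):

```python
# Percent error = |measured - actual| / actual * 100

measured = 103.0   # radar reading, mph
actual = 102.8     # true speed, mph

percent_error = abs(measured - actual) / actual * 100
print(f"{percent_error:.4f}%")  # prints 0.1946%, which rounds to 0.19%
```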

