
A radar gun measured the speed of a baseball at 92 miles per hour.

If the baseball was actually going 90.3 miles per hour, what was the percent error in this measurement?


Answer:

Subtract the actual speed from the measured speed to find the error: 92 - 90.3 = 1.7 miles per hour. Then divide by the actual speed and multiply by 100: 1.7 / 90.3 × 100 ≈ 1.88%.

You can calculate the error and then find what percentage of the real value that error represents.

The percent error in this measurement is approximately 1.88%.

How to calculate the percent error?

Suppose the actual value and the estimated (measured) value are both known. Then:

Error = Actual value - Estimated value

To calculate the percent error, we express the error as a percentage of the actual value:

[tex]\text{Percent error} = \left|\dfrac{\text{Error}}{\text{Actual value}}\right| \times 100 = \left|\dfrac{\text{Actual value} - \text{Estimated value}}{\text{Actual value}}\right| \times 100[/tex]
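
For reference, here is a minimal Python sketch of this formula (the function name percent_error is just an illustrative helper, not part of the original answer):

    def percent_error(actual, estimated):
        # Percent error = |actual - estimated| / |actual| * 100
        return abs(actual - estimated) / abs(actual) * 100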

Since in the given condition we have:

  • Actual measurement = 90.3 miles per hour
  • Estimated measurement = 92 miles per hour

Thus, we have:

Error = 90.3 - 92 = -1.7 miles per hour

Thus, the percent error is calculated as:

[tex]\text{Percent error} = \left|\dfrac{-1.7}{90.3}\right| \times 100 = \dfrac{1.7 \times 100}{90.3} = \dfrac{170}{90.3} \approx 1.88\%[/tex]

Thus, the percent error in this measurement is approximately 1.88%.
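
As a quick check, here is the same arithmetic as a small Python sketch, using the values from this problem:

    actual = 90.3       # actual speed in miles per hour
    estimated = 92.0    # radar-gun reading in miles per hour
    error = actual - estimated                 # -1.7 mph
    percent_error = abs(error) / actual * 100  # about 1.8826
    print(round(percent_error, 2))             # prints 1.88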

Learn more about percent error here:

https://brainly.com/question/3105259