Math, asked by ekonyrev, 9 months ago

Judy thinks there will be 325 people at the county fair on Friday, while Atticus thinks there will be 600 people. On Friday, 452 people attend the fair. Who is closer in their estimate? What is the difference between the percent errors?
PLEASE ANSWER BOTH QUESTIONS!

Answers

Answered by Anonymous

Taking the mean of Judy's and Atticus's estimates:

(325 + 600)/2

= 925/2

= 462.5

The actual attendance, 452, is less than this midpoint of 462.5, so the estimate on the lower side of the midpoint is the closer one.

Therefore, Judy's estimate is closer to the actual number. (Directly: |452 - 325| = 127, while |600 - 452| = 148.)
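For illustration only, here is a minimal Python sketch of both checks (the variable names are my own, not from the problem):

```python
judy, atticus, actual = 325, 600, 452

# Midpoint rule for two estimates: the actual value is closer to
# whichever estimate lies on its side of the midpoint.
midpoint = (judy + atticus) / 2                     # 462.5
print("Judy" if actual < midpoint else "Atticus")   # Judy

# Direct check with absolute errors.
print(abs(actual - judy), abs(actual - atticus))    # 127 148
```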

Difference = % error of Atticus - % error of Judy

% error = |estimate - actual| / actual × 100

% error for Judy = 127/452 × 100 ≈ 28.1% (since 452 - 325 = 127)

% error for Atticus = 148/452 × 100 ≈ 32.7% (since 600 - 452 = 148)

Difference ≈ 32.7% - 28.1% = 4.6%
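To verify the arithmetic, a small Python sketch (the helper name pct_error is my own):

```python
judy, atticus, actual = 325, 600, 452

def pct_error(estimate, actual):
    # Percent error as defined above: |estimate - actual| / actual * 100
    return abs(estimate - actual) / actual * 100

e_judy = pct_error(judy, actual)        # 127/452 * 100 ≈ 28.1
e_atticus = pct_error(atticus, actual)  # 148/452 * 100 ≈ 32.7
print(round(e_judy, 1), round(e_atticus, 1))  # 28.1 32.7
print(round(e_atticus - e_judy, 1))           # 4.6
```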

Hope this helps!

Cheers, mate!

Answered by saivivek16

Step-by-step explanation:

Aloha!

Given the two estimates, their mean is

(600 + 325)/2

= 925/2

= 462.5

Percent error for Judy = 127/452 × 100 ≈ 28.1%

Percent error for Atticus = 148/452 × 100 ≈ 32.7%

Now, the difference is

32.7% - 28.1%

= 4.6%

Thank you

@ Twilight Astro ✌️☺️♥️
