Math, asked by anynikolayeva, 4 months ago

7.21. According to Nielsen Media Research, the average number of hours of TV viewing by adults (18 and over) per week in the United States is 36.07 hours. Suppose the standard deviation is 8.4 hours and a random sample of 42 adults is taken.

a. What is the probability that the sample average is more than 38 hours?
b. What is the probability that the sample average is less than 33.5 hours?
c. What is the probability that the sample average is less than 26 hours? If the sample average actually is less than 26 hours, what would it mean in terms of the Nielsen Media Research figures?
d. Suppose the population standard deviation is unknown. If 71% of all sample means are greater than 35 hours and the population mean is still 36.07 hours, what is the value of the population standard deviation?
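As a rough check on parts a–d, here is a minimal sketch of the sampling-distribution calculations using the central limit theorem, z = (x̄ − μ) / (σ / √n), with only the Python standard library; the variable names are illustrative and not from the original problem:

from math import sqrt
from statistics import NormalDist

mu, sigma, n = 36.07, 8.4, 42
se = sigma / sqrt(n)          # standard error of the sample mean, about 1.296
z = NormalDist()              # standard normal distribution

# a. P(x_bar > 38): z = (38 - 36.07) / se ≈ 1.49, so roughly 0.068
p_more_38 = 1 - z.cdf((38 - mu) / se)

# b. P(x_bar < 33.5): z ≈ -1.98, so roughly 0.024
p_less_33_5 = z.cdf((33.5 - mu) / se)

# c. P(x_bar < 26): z ≈ -7.8, effectively 0; observing such a sample mean
#    would cast serious doubt on the reported 36.07-hour population figure
p_less_26 = z.cdf((26 - mu) / se)

# d. If 71% of sample means exceed 35 hours, then 35 sits at the 29th
#    percentile (z ≈ -0.55); solve 35 = mu + z * sigma / sqrt(n) for sigma
z_29 = z.inv_cdf(0.29)
sigma_unknown = (35 - mu) * sqrt(n) / z_29    # about 12.5 hours

print(p_more_38, p_less_33_5, p_less_26, sigma_unknown)

Under these assumptions the last part implies a population standard deviation of roughly 12.5 hours.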

Answers

Answered by mramchandramishra198

Answer:

Enter first number: 15
Enter second number: 20
Press 1. For SUM
Press 2. For PRODUCT
Press 3. For DIFFERENCE
Press 4. For QUOTIENT
Enter your choice (1-4): 2
The Product = 300

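The lines above read as the console transcript of a simple menu-driven calculator; a hypothetical sketch of a program that would produce such output (the prompts and choice numbering are taken from the transcript, the structure is assumed):

# Hypothetical menu-driven calculator matching the transcript above
a = int(input("Enter first number: "))
b = int(input("Enter second number: "))
print("Press 1. For SUM")
print("Press 2. For PRODUCT")
print("Press 3. For DIFFERENCE")
print("Press 4. For QUOTIENT")
choice = int(input("Enter your choice (1-4): "))
if choice == 1:
    print("The Sum =", a + b)
elif choice == 2:
    print("The Product =", a * b)     # 15 * 20 gives 300, as in the transcript
elif choice == 3:
    print("The Difference =", a - b)
elif choice == 4:
    print("The Quotient =", a / b)
else:
    print("Invalid choice")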
