Math, asked by kanchanyadav3696, 5 months ago

Water drips from a faucet at a rate of 41 drops/minute. Assuming there are 15,000 drops in a gallon, how many minutes would it take for the dripping faucet to fill a 1‑gallon bucket? Round your answer to the nearest whole number.

Answers

Answered by subhashmkg667

Answer: 370 minutes.

Explanation:

You can calculate the time by using the formula of rate:

         rate = amount / time = number of drops / time

From which you clear the time:

        time = number of drops / rate

The rate and the number of drops are given: 41 drops/minute and 15,000 drops (since that is the number of drops in 1.00 gallon).

Therefore, time = 15,000 drops ÷ (41 drops/minute) ≈ 365.85 minutes.

You have to use the correct number of significant figures, which is the least number of significant figures among the factors. 41 has two significant figures, 15,000 also has two significant figures, while 1.00 has three significant figures.

So you must round 365.85 minutes to two significant figures.

That is 370 (365.85 rounds up to 370).
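The arithmetic above can be checked with a short Python snippet (the variable names are my own, not from the problem):

```python
# Values given in the problem.
drops_per_minute = 41
drops_per_gallon = 15_000

# time = number of drops / rate
time_minutes = drops_per_gallon / drops_per_minute
print(round(time_minutes, 2))  # 365.85

# Round to two significant figures, as the explanation suggests.
# For a value in the hundreds, that means rounding to the nearest ten.
print(round(time_minutes / 10) * 10)  # 370
```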

Answered by Anonymous

Given - Rate of dropping : 41 drops/minute

Number of drops in gallon : 15000

Find - Minutes to fill bucket

Solution - Using the unitary method to find the number of minutes required to fill a 1-gallon bucket.

41 drops fall in 1 minute.

So 15000 drops will fall in (1/41) × 15000 minutes.

Hence, 15000 drops will fall in 365.85 minutes.

Rounding off to the nearest whole number, we get 366 minutes.

Therefore, a 1-gallon bucket fills in 366 minutes.
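The unitary-method steps above can be sketched in Python (rounding to the nearest whole number, as the question asks):

```python
# 41 drops fall in 1 minute, so each drop takes 1/41 of a minute.
minutes_per_drop = 1 / 41

# 15000 drops therefore take (1/41) * 15000 minutes.
total_minutes = minutes_per_drop * 15_000
print(round(total_minutes))  # 366
```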
