Math, asked by gururajaheri31891, 3 days ago

A clock was reading the time accurately on Tuesday at noon. On Thursday at 3pm the clock was running late by 612 seconds. On average, how many seconds did the clock skip every 30 minutes?

6
7
8
9

Answers

Answered by steffiaspinno

The clock skipped 6 seconds on average per 30 minutes.

Step-by-step explanation:

  • The starting time, when the clock was accurate = 12 noon on Tuesday
  • The final time, when the clock was late by 612 seconds = 3 pm on Thursday
  • The number of hours between the final time and the initial time = 2 days and 3 hours = (2 × 24) + 3 = 48 + 3 = 51 hours
  • Therefore, the number of 30-minute intervals in 51 hours = 51 × 2 = 102
  • Total late time = 612 seconds

Hence, the average number of seconds the clock skipped every 30 minutes

= Total late time / Number of 30-minute intervals

= 612/102

= 6

Thus, the clock skipped 6 seconds on average per 30 minutes.
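The arithmetic above can be checked with a short script (variable names are my own, just for illustration):

```python
# Tuesday noon to Thursday 3 pm = 2 full days plus 3 hours
elapsed_hours = 2 * 24 + 3              # 51 hours

# Each hour contains two 30-minute intervals
half_hour_intervals = elapsed_hours * 2  # 102 intervals

total_lag_seconds = 612

# Average seconds lost per 30-minute interval
average_lag = total_lag_seconds / half_hour_intervals
print(average_lag)  # 6.0
```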

Answered by bjm8jpp2zk

Answer: 20.4/20 seconds

Step-by-step explanation: divided the amount of seconds
