On a trip of 200 miles, a man travels at the rate of 40 miles per hour for the first 100 miles. For the rest of the trip, he travels at the rate of 55 miles per hour. What was his average speed, in miles per hour, for the total trip?
A: 400/9  B: 200/9  C: 90  D: 50
Can any math expert solve this?
Answers
Answer:
None of A, B, C, or D!
Hope this helps.
Step-by-step explanation:
The time for the first 100 miles is
t₁ = distance / speed = 100 / 40.
The time for the remaining 100 miles is
t₂ = distance / speed = 100 / 55.
The total time is
t = t₁ + t₂ = 100/40 + 100/55 = 100 × ( 1/40 + 1/55 )
The average speed is then
distance / time
= 200 / t
= 200 / ( 100 × ( 1/40 + 1/55 ) )
= 2 / ( 1/40 + 1/55 ) ..... (*)
= 880/19
≈ 46.3 miles per hour
(*) So the average speed is the harmonic mean of the speeds involved.
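As a quick check of this arithmetic, here is a minimal Python sketch (my addition, not part of the original answer), using exact fractions:

from fractions import Fraction

# time = distance / speed for each 100-mile leg
t1 = Fraction(100, 40)   # first leg at 40 mph -> 5/2 hours
t2 = Fraction(100, 55)   # second leg at 55 mph -> 20/11 hours

avg_speed = Fraction(200) / (t1 + t2)   # total distance / total time
print(avg_speed)          # 880/19
print(float(avg_speed))   # 46.315... mph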
Answer:
880/19 miles/hour (≈ 46.3 miles/hour)
Step-by-step explanation:
Formulas needed:
Time taken = Distance ÷ Speed
Speed = Distance ÷ Time taken
STEP 1: Find the time taken to travel the first 100 miles
Distance = 100 miles
Speed = 40 miles/hour
Time taken = Distance ÷ Speed
Time taken = 100 ÷ 40 = 5/2 hours
STEP 2: Find the time taken to travel the second 100 miles
Distance = 100 miles
Speed = 55 miles/hour
Time taken = Distance ÷ Speed
Time taken = 100 ÷ 55 = 20/11 hours
STEP 3: Find the total time taken
Total = 5/2 + 20/11 = 55/22 + 40/22 = 95/22 hours
STEP 4: Find the average speed
Total distance = 200 miles
Total time taken = 95/22 hour
Speed = Distance ÷ Time taken
Speed = 200 ÷ (95/22) = 200 × 22/95 = 4400/95 = 880/19 miles/hour
Answer: 880/19 miles/hour ≈ 46.3 miles/hour
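To verify the harmonic-mean shortcut noted in the first answer, here is a one-line Python check (my addition, not from either answer): for two legs of equal distance, the average speed equals the harmonic mean of the two leg speeds.

from statistics import harmonic_mean

# harmonic mean of 40 mph and 55 mph = 2 / (1/40 + 1/55)
print(harmonic_mean([40, 55]))   # 46.315... (= 880/19 mph)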