Math, asked by MMM09, 1 year ago

Mike travels a distance of 2760 miles in an airplane. For one half of the distance, the airplane flies at a speed of 560 miles per hour, and for the rest of the distance it flies at 820 mph. How long does the trip take?

Answers

Answered by TPS
for first half of the distance,
d = 2760/2 = 1380 miles
speed = 560 mph
time(t₁) = 1380/560 ≈ 2.464 hours

for second half of the distance,
d = 2760/2 = 1380 miles
speed = 820 mph
time(t₂) = 1380/820 ≈ 1.683 hours

total time = t₁ + t₂ = 2.464 + 1.683 = 4.147 hours (about 4 hours 9 minutes)
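The two-leg calculation above can be sketched in a few lines of Python (variable names are my own, not from the answer):

```python
# Each leg covers half the total distance; time = distance / speed.
total_distance = 2760          # miles
leg = total_distance / 2       # 1380 miles per leg

t1 = leg / 560                 # first leg at 560 mph
t2 = leg / 820                 # second leg at 820 mph

total_time = t1 + t2
print(round(total_time, 3))    # prints 4.147 (hours)
```

Note that averaging the two speeds would be wrong here: the plane spends more time on the slower leg, so the correct approach is to add the two leg times, as done above.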