Physics, asked by kalyan2895, 10 months ago

Two sound waves from a point source on the ground travel through the ground to a detector. The speed of one wave is 7.5 km s⁻¹, the speed of the other wave is 5.0 km s⁻¹. The waves arrive at the detector 15 s apart. What is the distance from the point source to the detector?

Answers

Answered by sawakkincsem

Answer:

x = 225 km

Explanation:

Given in the question,

speed of one wave = 7.5 km/s

speed of the other wave = 5.0 km/s

difference between their arrival times = 15 s

Formula to use

time = distance/speed

Since the arrival-time difference is 15 s, t1 - t2 = 15, where t1 is the travel time of the slower wave and t2 that of the faster wave.

x/5 - x/7.5 = 15

Multiplying both sides by (5)(7.5): 7.5x - 5x = 15(5)(7.5)

2.5x = 562.5

x = 225 km

So, the distance from the point source to the detector = 225 km
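
A quick numerical check of the working above (a minimal Python sketch; the variable names are illustrative, not from the original answer):

v_fast = 7.5   # km/s, speed of the faster wave
v_slow = 5.0   # km/s, speed of the slower wave
dt = 15.0      # s, difference between arrival times

# t_slow - t_fast = x/v_slow - x/v_fast = dt  =>  x = dt / (1/v_slow - 1/v_fast)
x = dt / (1.0 / v_slow - 1.0 / v_fast)
print(x)  # prints 225.0 (km)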

Answered by protegelin

Answer:

Explanation: t1 = d/7.5

t2 = d/5

t2 - t1 = 15

d(1/5 - 1/7.5) = 15

d = 225 km
