English · Asked by deepanshudeepanshu83

What happens during the winter?

Answers

Answered by kangezzkang

Answer:

Your skin gets dry, and you feel cold.

Answered by nakshatravaradaraju

Answer:

During winter, your hemisphere of the Earth is tilted away from the Sun. Because of this tilt, the Sun sits lower in the sky, so its rays strike the ground at a shallow angle and pass through more of the atmosphere before reaching the surface. The same amount of sunlight is spread over a larger area, the hours of daylight are shorter, and the temperature is colder.
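To make the tilt argument concrete, here is a minimal sketch. It assumes a latitude of 40°N (an illustrative value, not from the question) and the simple noon-elevation formula elevation ≈ 90° − latitude + solar declination, ignoring atmospheric refraction. It shows how low the winter Sun sits and how much more ground area the same beam of sunlight is spread over.

```python
import math

# Approximate noon solar elevation: elevation ≈ 90° − latitude + solar declination.
# The declination swings between about −23.5° (winter solstice) and +23.5°
# (summer solstice) in the Northern Hemisphere because of Earth's axial tilt.

def noon_elevation_deg(latitude_deg: float, declination_deg: float) -> float:
    """Rough noon Sun elevation above the horizon, ignoring refraction."""
    return 90.0 - latitude_deg + declination_deg

latitude = 40.0  # illustrative mid-northern latitude

winter = noon_elevation_deg(latitude, -23.5)   # ≈ 26.5°: the Sun stays low
summer = noon_elevation_deg(latitude, +23.5)   # ≈ 73.5°: the Sun climbs high

# A lower Sun spreads the same beam of light over more ground,
# roughly in proportion to 1 / sin(elevation).
spread_winter = 1.0 / math.sin(math.radians(winter))
spread_summer = 1.0 / math.sin(math.radians(summer))

print(f"Winter noon elevation: {winter:.1f}°, relative area lit: {spread_winter:.2f}x")
print(f"Summer noon elevation: {summer:.1f}°, relative area lit: {spread_summer:.2f}x")
```

Under these assumptions, the winter beam is spread over roughly twice as much ground as the summer beam, so each patch of ground receives less energy and stays colder.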
