What happens during winter?
Answers
Answer:
Your skin gets dry and you feel cold.
Answer:
During winter, your hemisphere of the Earth is tilted away from the Sun. Because of this tilt, the Sun sits lower in the sky, so its rays strike the ground at a shallow angle and pass through more of the atmosphere, spreading the same amount of energy over a larger area. The Sun also stays above the horizon for fewer hours each day, so daylight is shorter and temperatures are colder.
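To see why the low Sun angle matters, here is a minimal sketch in Python (not part of the original answer): by the Lambert cosine law, the sunlight intensity a level surface receives is proportional to the cosine of the Sun's zenith angle. The angles used below are illustrative assumptions, not measured values.

import math

# A minimal sketch: sunlight intensity on a level surface is
# proportional to the cosine of the Sun's zenith angle (its angle
# measured from straight overhead).
def relative_intensity(zenith_deg: float) -> float:
    """Fraction of overhead-Sun intensity hitting a level surface."""
    return max(0.0, math.cos(math.radians(zenith_deg)))

# Illustrative angles (assumed): a high summer noon Sun versus a
# low winter noon Sun at a mid-latitude location.
print(relative_intensity(20))  # ~0.94: energy concentrated on a small area
print(relative_intensity(70))  # ~0.34: same energy spread over a larger area

A surface facing a low winter Sun receives only about a third of the intensity it would get from a nearly overhead summer Sun, which is why the same sunlight warms the ground so much less in winter.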