What happens during the winter?
Answers
Answer 1:
Your skin gets dry and you feel cold.
Answer 2:
During winter, your hemisphere of the Earth is tilted away from the Sun. Because of this tilt, sunlight strikes the ground at a lower angle, so its energy is spread over a larger area and it has to pass through more of the atmosphere, which weakens it. The tilt also keeps the Sun above the horizon for fewer hours each day. With weaker sunlight and shorter days, temperatures are colder.
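As a rough illustration (a sketch, not part of the original answer): the energy a flat patch of ground receives scales roughly with the sine of the Sun's elevation angle, so a low winter Sun delivers much less energy per square metre than a high summer Sun. The elevation angles below are made-up example values, not measurements for any real place.

    import math

    def relative_insolation(sun_elevation_deg):
        """Fraction of overhead-sun energy hitting a flat surface,
        using the simple sine-of-elevation (Lambert cosine) rule."""
        return math.sin(math.radians(sun_elevation_deg))

    # Hypothetical midday Sun elevations for one location (illustrative only)
    summer_elevation = 60   # degrees above the horizon
    winter_elevation = 20

    print(relative_insolation(summer_elevation))  # ~0.87
    print(relative_insolation(winter_elevation))  # ~0.34, roughly 2.5x weaker

This ignores the extra atmospheric absorption at low angles, which makes the winter Sun weaker still.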