Social Sciences, asked by MOHDAKHIF, 9 months ago

What changes happened in the world after the Treaty of Versailles?

Answers

Answered by Anonymous

Answer:


Explanation:

The Treaty of Versailles forced Germany to give up territory to Belgium, Czechoslovakia, and Poland, return Alsace-Lorraine to France, and cede all of its overseas colonies in China, the Pacific, and Africa to the Allied nations.


Answered by nehacute

Explanation:

End of World War I

The fighting ended on November 11, 1918, but the formal end to one of the deadliest conflicts in human history did not come until the Treaty of Versailles was signed on June 28, 1919.
