History, asked by diya569, 7 months ago

What did Germany lose after the First World War?

Answers

Answered by ghatejanhavi74

Answer:

Germany lost World War I. In the 1919 Treaty of Versailles, the victorious powers (the United States, Great Britain, France, and other allied states) imposed punitive territorial, military, and economic provisions on defeated Germany. In the west, Germany returned Alsace-Lorraine to France.

Answered by dy212842


Most European countries had made alliances with one another, which drew Germany and its allies, Austria-Hungary, Bulgaria, and the Ottoman Empire, into the war. By the end of the war, Germany's economy was bankrupt. Germany and its allies lost the war, and Germany formally accepted defeat by signing the Treaty of Versailles on June 28, 1919.
