Social Sciences, asked by ravindrasingh88283, 7 months ago

What changes came about in Germany after the end of World War I?

Answers

Answered by dollypatel16

Answer:

In Germany, there was a socialist revolution that led to the brief establishment of a number of communist political systems in (mainly urban) parts of the country, the abdication of Kaiser Wilhelm II, and the creation of the Weimar Republic.

Explanation:

At the end of World War I, Germans could hardly recognize their country. Up to 3 million Germans, including about 15 percent of the country's men, had been killed. Germany had been forced to become a republic instead of a monarchy, and its citizens were humiliated by their nation's bitter defeat.
