History, asked by krishna300notout, 11 months ago

Describe what happened to Germany after its defeat in the First World War.

Answers

Answered by deeprajj
86
The things that happened after the defeat of Germany in the First World War were:
1] there was a scarcity of food
2] food prices increased
3] there was widespread economic hardship
4] Germany had to sign a treaty under which it had to give up its land to other countries
5] it also had to give up its resources

Answered by silentkiller000
17


As in most nations, the economic conditions of the time played a significant role in determining how society behaved. Germany was economically devastated after its draining defeat in World War I. Under the Treaty of Versailles, Germany was forced to pay enormous reparations to France and Great Britain.


Hope this helps!