What happened to Germany after the Second World War?
Answers
Answered by
1
Germany developed under Hitler and fought World War 2.
Answered by
3
HEY MATE, HERE IS YOUR ANSWER--
The things which happened after the defeat of Germany in the world war are:-
1] There was a scarcity of food.
2] The prices of food increased.
3] There was economic hardship.
4] Germany had to sign a treaty under which it had to give its land to other countries.
5] It also had to give up its resources.
Anonymous:
But who gave you this answer?