History, asked by pooja7721, 1 year ago

Describe what happened to Germany after its defeat in the First World War.

Answers

Answered by harshkaushik09
Germany lost to the Allies in WW1 and had to sign the Treaty of Versailles. Under its terms it suffered huge territorial losses, giving up land and population to Poland, Russia, France, Belgium and Denmark.