Political Science, asked by Avani2004, 1 year ago

What do you mean by the term Weimar Republic?

Answers

Answered by adityachaturvedi123

The Weimar Republic was the democratic government founded in Germany following Kaiser Wilhelm II's abdication near the end of World War I. It continued in name until 1945, but effectively ended with Hitler's seizure of dictatorial powers in 1933. From the beginning, the Weimar Republic suffered from the financial and psychological effects of the Treaty of Versailles, including the requirement to pay reparations to the Allied powers, several military occupations such as the occupation of the Ruhr, and its famously crippling hyperinflation. Additionally, controversy surrounding the treaty and the manner of Germany's defeat in World War I led to increased support for extremist nationalist groups, such as the Nazi Party, which became the largest party in parliament in 1932 and soon thereafter established a dictatorship.


Avani2004: thnx