History, asked by Anonymous, 1 year ago

Define the Weimar Republic.

Answers

Answered by ashwini88
The Weimar Republic was the democratic government founded in Germany following Kaiser Wilhelm II's abdication near the end of World War I. It continued in name until 1945, but effectively ended with Hitler's seizure of dictatorial powers in 1933.
Answered by Anonymous

A national assembly met at Weimar and prepared a democratic constitution that established a federal structure in Germany.
