Define the Weimar Republic.
Answers
Answered by
The Weimar Republic was the democratic government founded in Germany following Kaiser Wilhelm II's abdication near the end of World War I. It continued in name until 1945, but effectively ended with Hitler's seizure of dictatorial powers in 1933.
Answered by
A national assembly met at Weimar and prepared a democratic constitution that established a federal structure in Germany.