History, asked by puskarkray, 3 months ago

After the Treaty of Versailles, the people of Germany lost their faith in democracy. (True/False)

Answers

Answered by academymindbenders

Answer:

True.

Explanation:

The Treaty of Versailles (French: Traité de Versailles) was the most important of the peace treaties that brought World War I to an end. The Treaty ended the state of war between Germany and the Allied Powers. It was signed on 28 June 1919 in Versailles, exactly five years after the assassination of Archduke Franz Ferdinand, which had directly led to the war. The other Central Powers on the German side signed separate treaties. Although the armistice, signed on 11 November 1918, ended the actual fighting, it took six months of Allied negotiations at the Paris Peace Conference to conclude the peace treaty. The treaty was registered by the Secretariat of the League of Nations on 21 October 1919. Because the new democratic Weimar Republic was the government that signed the treaty, many Germans blamed democracy itself for its harsh terms, such as the war-guilt clause and heavy reparations, and so lost faith in democratic rule.