History, asked by katarmalraj58, 1 month ago

Which social changes took place after the First World War?

Answers

Answered by krantibakoriya81

Answer:

Even before the guns fell silent on the Western Front, the long-term social consequences of World War One were being felt back home. Women had a stronger voice; education, health, and housing appeared on the government's radar; and the old politics were swept away. (Source: BBC News, "How World War One heralded social reforms")

Answered by SriraamV

Answer:

The aftermath of World War I brought drastic political, cultural, economic, and social change across Eurasia and Africa, and even in areas not directly involved in the conflict. Four empires collapsed as a result of the war, old countries were abolished, new ones were formed, boundaries were redrawn, international organizations were established, and many new and old ideologies took a firm hold in people's minds. The war also transformed most of the principal belligerents into electoral democracies by bringing near-universal suffrage for the first time in history, as in Germany (1919 German federal election), Great Britain (1918 United Kingdom general election), and Turkey (1923 Turkish general election).

