History, asked by Sipundash6663, 4 days ago

How did World War I change the US?
a. It caused the United States to forever end American isolationism.
b. It brought the US out of a recession and into a period of prosperity.
c. It put the US into a recession that later led to the Great Depression.
d. It resulted in a new amendment to define American citizenship.

Answers

Answered by s164892

Answer:

It caused the United States to forever end American isolationism.

Answered by ashutoshmishra3065

Answer:

It caused the United States to forever end American isolationism.

Explanation:

Everything changed after World War I.

The war produced more than 37 million casualties worldwide, and millions more endured its aftereffects.

Empires were destroyed, borders were redrawn, and economies were decimated.

World War I claimed more American lives than the Korean and Vietnam Wars combined.

The war both divided and united the United States, paving the way for new definitions of American citizenship and of the country's place in world politics.

More than a million American soldiers were dispatched to Europe, where they encountered a conflict unlike any other, fought in trenches and in the air and marked by military innovations such as the tank, the field telephone, and poison gas.

The war also left its mark on American culture at home. After an armistice ended the fighting on November 11, 1918, the postwar years saw a wave of advocacy for equal rights for African Americans, the ratification of an amendment safeguarding women's right to vote, and a larger American role in foreign affairs.
