History, asked by IvotedforTRUMP, 7 months ago

HELP ASAP, I'm being timed!

How did World War I change women’s roles in the United States?

A. Women received greater educational opportunities.
B. Women fought alongside men in the military.
C. Women replaced men in the workforce.
D. Women earned more money than men.

Answers

Answered by Anonymous

Answer:


Women received greater educational opportunities.

Answered by prabhkaur06

C is the correct answer: women replaced men in the workforce.
