Identify the role of women in American society before, during, and after World War I.
Answer:
Women were given increased political rights. ... They were applauded for their roles as nurses in army hospitals, supported their husbands by cooking in camp and bringing water, and were mostly relied on to raise money and morale for the war effort.