What do you understand by gender roles?
Answers
Answered by
Gender roles can be defined as the behaviors, values, and attitudes that a society considers appropriate for males and females. ... Traditionally, men and women had completely opposing roles: men were seen as the providers for the family, while women were seen as the caretakers of both the home and the family.