History, asked by ranjanarathi42, 10 months ago

What, according to you, does the term feminism mean? Do you think women should continue to get special privileges in our society? Give reasons.

Answers

Answered by disha2509

Answer:

Feminism is a range of social movements, political movements, and ideologies that aim to define, establish, and achieve the political, economic, personal, and social equality of the sexes.

I think yes. Empowering women does not mean belittling or punishing men. Men, too, suffer from gender-role assumptions that place expectations on them to live and act a certain way. Feminists believe each person should be viewed based on their individual strengths and capabilities as a human being, not on the strengths and capabilities assumed of their gender. They believe every person should be treated equally, not because of their gender but regardless of it.
