Social Sciences, asked by lathasajumon, 4 months ago

Are the roles men and women play, or the work they do, valued equally or not equally?

Answers

Answered by Anonymous

Is there anything more fundamental to our collective well-being than the quality of relationships between men and women? Last year, our society stumbled into unmapped territory in gender relations - as women told story after story of horrific abuse and predation at the hands of men in their lives, we witnessed the birth of a long-overdue movement. We heard millions of voices rise in unison to say "#MeToo," and our country will never be the same.

But the real work is just beginning. Men and women need to figure out how to navigate this new territory together, and there is no use pretending it will be easy or straightforward. There is a lot of anger in our country right now: anger among women who have been abused, coerced, and dismissed by their male colleagues, as well as anger among men who feel besieged and demonized when many have done nothing to deserve it.

Answered by shrawani321

Answer:

I think yes, because men and women each have their own responsibilities, and each should carry them out.
