Social Sciences, asked by farhan0784, 7 months ago

Define feminism.


Answer correctly.

Don't spam.

Don't copy.

Answers

Answered by srakesh20498

Answer:

Feminism is the belief in the social, economic, and political equality of the sexes. ... Although it largely originated in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.


Answered by karansaw14366

Feminism is a range of social movements, political movements, and ideologies that aim to define and establish the political, economic, personal, and social equality of the sexes. Feminism incorporates the position that societies prioritize the male point of view and that women are treated unjustly within those societies. Efforts to change that include fighting against gender stereotypes and establishing educational, professional, and interpersonal opportunities and outcomes for women that are equal to those for men.

