English, asked by shiny6145, 7 months ago

A report on Feminism

Answers

Answered by sahithi052412

Answer:

Feminism is the belief in the social, economic, and political equality of the sexes. Although it largely originated in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.
