History, asked by farhan0784, 7 months ago

Define feminism, in short.

Answers

Answered by Anonymous
Answer:

Feminism means awareness of women's rights and interests, based on the belief in the social, economic, and political equality of the genders.

Answered by abhishekpatel59259

Answer:

Feminism is the belief in the social, economic, and political equality of the sexes. ... Although largely originating in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.

