Political Science, asked by nandi7833, 8 months ago

Write a note on feminism.

Answers

Answered by HarshitaGoel

Answer:

Feminism is the belief in the social, economic, and political equality of the sexes. Although it originated largely in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.

Hope it helps!

Answered by sapnalimbu4867

Feminism is a philosophy advocating equal economic, political, and social rights and opportunities for women. The term has been used for close to a century in the United States: even before winning the right to vote in 1920, women who sought women's rights called themselves feminists.
