Write a note on feminism.
Answers
Answered by
Answer:
Feminism is the belief in the social, economic, and political equality of the sexes. Although it largely originated in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.
Explanation:
Hope it helps!
Answered by
Feminism is a philosophy advocating equal economic, political, and social rights and opportunities for women. The term has been used for close to a century in the United States: even before winning the right to vote in 1920, women who sought women's rights called themselves feminists.