What is feminism?
Only a definition, please.
Answers
Answer:
Feminism is the belief in the social, economic, and political equality of the sexes. Although it largely originated in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.
Answer:
Feminism is the movement to raise the power of women in a male-dominated society. People who take part in this movement are called feminists.