English, asked by ndbsvhx, 4 months ago

What is feminism?
Only a definition, please.

Answers

Answered by Anonymous

Answer:

Feminism is the belief in the social, economic, and political equality of the sexes. Although it largely originated in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.

Answered by swastika9413

Answer:

Feminism is the movement to raise the status and power of women in a male-dominated society. People who take part in this movement are called feminists.
