What do you mean by Women Empowerment?
Share your ideas in this group
Answers
Answer:
Women empowerment is a pivotal part of any society, state, or country. A woman plays a dominant role in the early life of a child, and women form an important section of our society. Education, as a means of empowering women, can bring about a positive attitudinal change.