Social Sciences, asked by mdshakil9, 10 months ago

How is the life of women in France?

Answers

Answered by Rishabhgtam

Answer:

Explanation:

The roles of women in France have changed throughout history. In 1944, French women gained the right to vote. As in other Western countries, the role of women underwent many social and legal changes in the 1960s and 1970s. French feminism, which has its origins in the French Revolution, was quite influential in the 20th century, especially in its abstract ideology through the writings of Simone de Beauvoir.
