How do films help to improve our society?
Answers
Ans: The film industry is arguably one of the most impactful sectors in modern society. Sitcoms and comedy shows make us laugh, psychological thrillers challenge us to see the world from new perspectives, and historical films help us understand where we've come from as a people.