Unless you have your head buried in the sand, I’m sure you’ve heard that there is a lack of roles for women in television and movies, with the latter being far worse. In fact, it has come out in recent months that lead actresses struggle to get paid as much as their male counterparts, even when they are the main star. It makes you think that Hollywood doesn’t value women, whether as audiences or as actors. Why aren’t there more female-led projects? You might excuse the Hollywood establishment by saying that female-led television shows and movies don’t do well. Wrong.