Hollywood has a Conservative Bias
Claims of a “Liberal Hollywood bias” and smears against the “Liberal Hollywood Elites” are a propaganda campaign by the right, designed both to motivate their base and to work the refs so that Hollywood films and TV shows end up more conservative than liberal. The campaign has been so successful that even many liberal-minded people falsely believe Hollywood is liberal – but this is not the case!
We are putting together a guide tracking the many conservative biases prevalent in film & TV, starting with the biggest one: sexism. We will also point out that some conservative biases are inherent to a genre and so will never change – for instance, there is an entire genre where crime is rampant, called...yes, you guessed it, “Crime.” We enjoy movies and TV shows and understand they are just that.