The only reason Hollywood ever skews liberal is that part of what we make in Hollywood involves writers, actors, directors, musicians, set designers, and photographers. In general, people like that are going to be more progressive, more open-minded, a little more altruistic.
Hollywood has to appeal to the broadest possible audience, and when it comes to most social and economic issues, America is progressive. Because of that, the messages in Hollywood movies tend to be, for instance, pro-environment.