Hollywood has to appeal to the broadest audience, and when it comes to most social and economic issues, America is progressive. Because of that, the messages in Hollywood movies tend to be, for instance, pro-environment.
The only reason Hollywood ever skews liberal is that what we make in Hollywood involves writers, actors, directors, musicians, set designers, and photographers. In general, people like that are going to be more progressive, more open-minded, a little more altruistic.
I grew up with a single mom who was a waitress. We were on food stamps. My mom then got Pell Grants, put herself through college to get a degree to get a better job. Because we were broke, I then had to go to a state school. I went to Temple University, and had to get loans. So I grew up in a world where I saw the government helping individuals pull themselves up, and saw it work very successfully.
We punch mirrors and we explore our darker selves. No, it's just an amalgam of all the newscasters we grew up with. Sort of like before there was cable, when these people were like gods.
The easiest time to be funny is during a fairly serious situation. That way, you can break the ice. It's crazy, but even at funerals, people will get huge laughs.
You could feel America starting to ease up a little bit on racism against blacks in certain pockets, and then suddenly The Cosby Show bubbled up and it was the right time for it.
I have certain beliefs about how people should treat employees and how companies should be run, but I was really surprised through this process to learn that those beliefs are actually good business.