Hollywood is for-profit; that's what Hollywood is. All the studios are owned by big megacorporations that are the furthest thing from liberal you can possibly imagine.
You could feel America starting to ease up a little bit on racism against blacks in certain pockets, and then suddenly The Cosby Show bubbled up, and it was the right time for it.
I think American culture had just become so disengaged from the process of government, and we'd been so fuzzed out by the pop culture around us, that people didn't really see this guy for what he was.