Hollywood is run by businessmen, who are rich and therefore mostly conservative, and who in any case care more about the bottom line than about politics; they generally won't care what message a film sends as long as it makes them richer. The whole "Hollywood is run by liberals" line is just another right-wing boogeyman. Sure, plenty of celebrities in Hollywood are liberal (and we've got to outgrow the childish notion that that's a bad word), but plenty of others are conservative, and either way it's a mistake to confuse celebrity with power or authority.