Hollywood is run by businessmen, who are rich and therefore mostly conservative, and who in any case care more about the bottom line than politics: they generally won't care what message a film sends as long as it makes them richer. The whole "Hollywood is run by liberals" thing is just another right-wing boogeyman. Sure, there are plenty of celebrities in Hollywood who are liberal (and we've got to outgrow the childish notion that "liberal" is a bad word), but there are plenty of others who are conservative, and in either case it's a mistake to confuse celebrity with power or authority.
The irony of conservatives dismissing Hollywood folks as liberals is that when a Hollywood conservative steps forward, whether he can string words together into sentences or not, he is immediately deified as some kind of larger-than-life figure. (Fred Thompson? Ronald Reagan? Ahhhhnold?)
In any case, the answer to the original question is no: there is no liberal conspiracy. And I'm a conservative.