It seems that in the Hollywood "wonderland" everyone knew everything, but people thought it was okay to tolerate predators.
And mind you, this is only making a fuss because celebrities are involved; I don't want to imagine what is going on in business or in schools and colleges.
I don't know about academia, but why would you think the business world is worse than Hollywood? The business world has been dealing with this crap for years, which is why we have had sexual harassment training for decades. Hollywood has had a pass for so long because it is such a small, tight-knit community run by a few power players, and people are willing to stay silent for fear of having their careers ruined.
The business world is so large that it is easy to change jobs and melt away somewhere else. While there is always fear and pressure not to come forward, there are many more ways to come forward in business. I do think this generation will be more aware of sexual harassment, and more issues will come to light in every sector, not just Hollywood. But I really think Hollywood was an anachronistic holdout that escaped this for so long, which is why we are hearing about it now.
Hollywood is in desperate need of more high-profile female directors, producers, and studio heads; for an industry so full of liberals, it has really fallen behind in that area.