"Hollywood" is a deeply conservative state of mind. Commercial products, about the only kind available, will reflect the political situtation, not in a narrowly partisan sense necessarily, but the basic world view. That basic world view says that there are monsters out there out to get us and "we" have to be remorselessly violent to save ourselves and win. And this apocalyptic struggle comes out of no policy or decision, it just happens. That's the basic world view. Hence these movies and TV shows.
But why would I change my mind when there are no arguments to the contrary? I mean, defending it is one thing, but denying it is another. It's a major cultural trend that's been worsening for more than 25 years; it's not exactly subtle.