

For sure Hollywood has played a huge role.
- The desensitisation to war and violence.
- Celebrity worship. Politicians are treated like celebrities and when they do evil things, it’s treated like an episode of Entertainment Tonight.
- The fiction of the American saviour (e.g. superheroes and all those damn apocalypse movies where the USA saves the world).

Religion is a tool for that class. It's also an excuse for them to commit crimes and absolve themselves of guilt by asking for forgiveness - if they even bother to do that.