Hollywood thinks it is the master of dictating a moral compass.
The industry lectures Americans on how they should feel ethically, yet its own figures are embroiled in sex scandals and star in violent films.