The dark side of Hollywood

It's no secret that Hollywood has a darker side. The film industry has entertained people around the world for over a century, but beneath all its glamour and beauty lies an uglier underbelly. That side is no longer hidden from the public: too many stories about the harm the industry has done to people have surfaced for filmgoers to remain blissfully unaware.