Hollywood needs to "pick a side" in the Culture Wars, according to the media. They're mad the Oscars weren't MORE political. Isn't that what got them into the Culture Wars in the first place?
Hollywood CAN'T Be Apolitical…