Hollywood's been ruined for me for a while already. I usually have the TV on when I go to bed, but it's almost always stuff like Ancient Aliens, Mysteries of the Abandoned, What On Earth, etc.
I know even those shows exist to push a false truth... but if you take the narrators as just giving their opinion, there's still some neat stuff to see.
As far as actual movies and series go, it's all a woke shitshow anymore. Hell, in the beginning even The Simpsons had values to teach.