The End of America! Even Sociology Thinks We're Doomed
Is it possible that Western culture is doomed? Are we on a downward trajectory that ends with America being wiped off the map? Nuclear war, maybe? Economic collapse? Something worse? Don't believe …