Trust me, Americans know our image is already being ruined. It makes me sick to see all these other countries get to move on while we are stuck in this miserable shit hole. It is depressing and disappointing. Listening to the UK talk about how the Tories will make them 'like America', as if that's a terrible thing to be, makes me want to cry. I want us to be better, not get even worse.
I’m Canadian, and while people tend to like most Americans as individuals, pretty much no one but right-wingers likes the country as a whole. When I lived in Japan, the attitude there was kind of "we want to go there to see Disneyland, but fuck living there".