We all know what that is. I've been guilty of it while visiting other countries, and I regret it. Ok, so for a little history. To Great Britain: while you were in power for several centuries, was there ever such a thing as an "ugly Englishman"? How about the French and Spaniards? When your respective nations were the major players on the world stage, how did other nations feel about you? Is there anything from the literature of that period to give you an idea of how your countrymen were perceived around the globe? For some reason, my trick knee is telling me that this isn't the first time in history that a nation and her citizens have been criticized (and in many cases, rightly so) while that country was a world power. I'm not suggesting that Americans are beyond reproach, only that to a certain extent, when you're on top of the heap, everybody throws rocks at you. I just doubt that this is the first time in history it's happened. In fact, back when we weren't a world power, we probably took part in some rock throwing of our own.