the Ugly American
- Pejorative term for Americans traveling or living abroad who remain ignorant of local culture and judge everything by American standards. The term comes from the title of the novel The Ugly American by William Lederer and Eugene Burdick.