Much of the world sees Americans (those living in the USA) as ignorant, arrogant, brash, loud, warmongering, and materialistic, not to mention megalomaniacal. The following video is evidence: http://www.YouTube.com/watch?v=QZgDVx2gZUI If you're still in denial, also see: http://www.youtube.com/watch?v=6lwbSdL2s00&feature=PlayList&p=B5272E78FF3DCACA&index=0&playnext=1 Hope your question has been answered to your satisfaction.
western states
They both hated Germany.
No.
No, they hated them.
It's not the popularity of the soldiers, but the conflict they are in. A great example was Vietnam: most people hated the war itself, and why we were over there, not the soldiers.
Soccer.
Andrew Jackson
They hated us.
This is quite a broad question to be asking, and I wouldn't say that all of us are hated. In terms of stereotypes, though, men are hated because of how women see us with regard to cheating, lying, etc.
Because he hated it.
The US doesn't hate any country, but the US is hated by many countries.
No
Because the people from Britain sucked and hated us.