Most of the world sees Americans (those living in the USA) as ignorant, arrogant, brash, loud, war mongering, and materialistic, not to mention megalomaniacs. The following video is evidence: http://www.YouTube.com/watch?v=QZgDVx2gZUI If still in denial, also see: http://www.youtube.com/watch?v=6lwbSdL2s00&feature=PlayList&p=B5272E78FF3DCACA&index=0&playnext=1 Hope your question has been answered to your satisfaction.
Its shoot-from-the-hip policies, made without consultation with the world community.
Western states.
No.
They both hated Germany.
No, they hated them.
It's not the popularity of the soldiers, but the conflict they are in. A great example was Vietnam: most hated the war itself, and the reasons we were over there, not the soldiers.
Andrew Jackson
Soccer.
They hated us.
This is quite a broad question to be asking; I wouldn't say that all of us are hated. In terms of stereotypes, though, men are hated because of how women see us with regard to cheating, lying, etc.
Because he hated it.
The US doesn't hate any country, but the US is hated by many countries.
No.
Because the people from Britain resented and hated us.