The Vietnam War was a war that changed the United States in more ways than one. It transformed US policymaking by making the country deeply reluctant to intervene in later conflicts on the Asian mainland.
