Why is Vietnam important to American history?

The U.S. participated in the Vietnam War as an ally of the South Vietnamese. Vietnam won its independence from France and then split into North and South, and we aided the South during the war. If the young are finally given the correct history of the Vietnam War (I'd rather place my bet on a Vietnam War Vet speaking to the kids than on any book), then that's magic, because the U.S. government not only lost that war, it underestimated its enemy.

For the first time in American history, many American citizens were torn over a war and didn't give the Vietnam Vets who made it home the honor they deserved. The U.S. government gave little aid to the Vets, many of whom suffered psychological trauma and drug addiction. The government tried to hide its embarrassment over this war and took it out on the very men who served their country.

If anything should be learned from this war, it's that countries shouldn't be so hasty to go to war before learning more and getting to know their real enemy. War simply makes money! Some wars have to be fought, but the Vietnam War was a mistake, and isn't that a lovely label to put on the Vietnam Vets? If you really want to learn about the Vietnam War, go online and talk to Vietnam Vets!