"USA"
The US was not founded by the French.
None. French cities are in France, not the US.
There have never been any French colonies in the US. There were large areas of North America that were French colonies and later became part of the US.
Yes, in WW1 and WW2 the French were allied with the US.
The French sold Louisiana to the US in 1803.
All but the US were discovered by French explorers.
French people.
There is no French dollar; France's currency is the euro (formerly the franc).
No, the US assisted the French.
Generally, the French and US governments do get along quite well.
They gave us French food, French words, and haute couture fashion.
John Adams was the US President when the French Revolution ended in 1799.