Anonymous
Certainly not. France is a sovereign European country with a long history. Much of the present-day United States was once a territory of France.
Wiki User
France sold the Louisiana Territory to the US.
Louisiana
France
They purchased it from France.
No, it was from Britain.
France sold the Louisiana Territory to the USA.
France and the British.