
Throughout most of the many wars England fought with France, the territory that became the U.S.A. was a British colony. The only time before WW1 that Britain, France, and the U.S.A. were all involved in the same conflict was the American War of Independence, when the French supplied the U.S. with money, supplies, and men with which to fight the British.

Apart from this, the U.S.A. has had nothing to do with any war between Britain and France.


Wiki User

15y ago
