This would depend upon which war you are talking about. The US fought against England in two wars, the Revolutionary War and the War of 1812, and it fought as an ally of England in two other wars, WWI and WWII.
England
So they could help out Britain and not get taken over.
The United Kingdom officially declared war on Germany on August 4, 1914, thereby entering World War I.
England did not officially help the South during the Civil War; Britain declared neutrality and never recognized the Confederacy.
Yes, and that is why England passed the Sugar Act, the Stamp Act, etc.: to impose higher taxes to help pay for the French and Indian War.
England and the US were allies in WWII against Germany's invasions of Europe and North Africa.
During the Civil War, the South hoped to ally itself with France and England.
They'd already taken that part of America and called it New England. They lost New England along with the other American colonies during the American War for Independence and did not try to reclaim any areas after that.
The French and Indian War.