I would say relations were normalized after the American Civil War, during the great wave of immigration and the Industrial Age. Before the Civil War, Britain and the United States were engaged in several border disputes involving British North America (the Aroostook War, the Pig War, etc.). During the Civil War, relations were frosty, with Britain coming close to siding with the Confederacy. A genuine alliance was not evident until World War One, when the U.S. formally joined the Allied Powers against the Central Powers. World War Two and the decades since have reaffirmed the two countries' commitment to the "special relationship."
