The main effect of the War of 1812 was that it completed the American Revolution. The fighting of the Revolution effectively ended with the British surrender to Washington at Yorktown, and the war was formally concluded by the Treaty of Paris in 1783, but until the War of 1812 the newly formed United States, despite its newfound political independence, still had essentially the same economic relationship with England as it had when the states were colonies. Real independence, in the sense of economic separation from England, did not come until the end of the War of 1812.

Justin Kodner


Wiki User

15y ago
