Until late in the war, most people in Britain paid little attention to it. However, when France allied with the United States, the alliance drew attention to the colonies' struggle, which ultimately swayed public opinion in favor of the colonies and helped force the British government to end the war.

from Claycrazy:

I asked around, and someone told me that, yes, the war was popular in England.
