Did England win the Wars of the Roses?

The Wars of the Roses were a series of English civil wars in which rival factions, the Houses of Lancaster and York, fought for control of the crown. Since both sides were English, you could say England won the Wars of the Roses, but by the same token England also lost them.
