The roots of WWI lay in the deep hostility between France and Germany. When Germany unified into a single country in 1870-1871, France went to war to try to stop it (the Franco-Prussian War). Germany won, and France wanted revenge. Both sides began forming alliances, and when war finally broke out, many countries were drawn in. It was a tragic moment.

Wiki User · 14y ago