In Europe. Mostly France.
Mostly in France
The Legion of Doom
Germany
France and Belgium
France
Yes, Americans were in France in WW1.
No, France gave in to Germany, and Germany then took over France. We did not help them — not in World War I, the war the question asks about. Britain joined France in its fight against Germany during the First World War.
Probably Russia, then France
Germany, France, Britain, Russia
The Triple Entente (Britain, France and Russia)
Russia to the East, France to the West