Germany and France were hostile to each other long before World War II; the enmity dated back to the Franco-Prussian War and World War I, and Germany's successful invasion of France in 1940 occurred during World War II, not before it. Since the post-war period, relations between France and Germany have been embodied in a cooperation known as the Franco-German friendship.
Germany wanted an empire like those of Britain and France. Unfortunately for Germany, Britain and France between them controlled over half of the world, and Britain also had the world's best navy, so Germany would have had to both enlarge its own navy and fight Britain and France for colonial possessions. This damaged relations among the powers.
U.S., Germany, Japan, France, and U.K.
There is a lot of controversy over this. The possible answers are Germany, Russia, the UK, or France. Germany has great economic power. Russia has a large military. France also has a large economy, good relations with the world, and a number of territorial claims. The UK likewise has a large economy and good relations with the world, in part through the Commonwealth, a legacy of its past empire.
Relations between the two countries sharply declined. France and Germany would not be on good terms again until the post-World War II period of European history.
Germany invaded France during World War II.
No, Germany was France's enemy.
No. France gave in to Germany, and Germany then took over France, but that happened in World War II, not in World War I, the war the question asks about. During the First World War, Britain joined France in its fight against Germany.
France was a member of the Triple Entente, while Italy was a member of the Triple Alliance, with Germany and Austria-Hungary.
France wanted to weaken Germany so that it would not threaten France again.
France is well known for its wine production.