After World War II, western Germany was under the control of France, Great Britain, and the United States, which merged their occupation zones to create West Germany. Eastern Germany was under the control and influence of the Soviet Union and became East Germany.
