No, Germany did not take over Japan. Near the end of World War 2, it was the United States, not Germany, that dropped two atomic bombs on Japan. Germany was in no position to take over anyone's land after the war; it had already surrendered and Adolf Hitler was dead.
Well, in World War II, Japan was allied with Germany. Germany wanted to take over the world at that time and hold absolute power over everybody, but it needed help to do so, so Germany allied with Japan.
Germany, Japan, and the countries they took over, such as Poland.
The Axis: Germany, Japan and Italy
No. Germany and Japan have never exercised political or military control over one another. Germany was instrumental, though, in helping Japan to modernize, and the two countries were strong allies in World War II.
During WWII, Germany could have won the war.
The territory of their neighbors. Nazi Germany, Italy & Japan were consumed with the desire to conquer & enslave their neighboring nations.
After World War One, Germany was made to pay reparations, its army was limited to 100,000 men, its airplanes, submarines, and tanks were taken away, and it could not keep any land it had taken over during the war. That obviously upset the Germans. During World War Two, concentration camps killed millions of Jews across Germany and Poland. Germany and Japan eventually surrendered.
Both countries became highly militaristic and built powerful armies.
Both countries invaded neighboring countries to expand their territory and influence.