No. Germany and Japan have never exercised political or military control over one another. Germany was instrumental, though, in helping Japan to modernize and the two countries were strong allies in World War II.
Germany, Italy, and Japan
Location gave Germany more strategic value to the other Allied countries than Japan had. Japan was already completely dominated by the United States, which retained control of it.
I thought it was France.
Against the Axis powers, of which Japan was a part. (The atomic bombs were dropped on Hiroshima and Nagasaki, Japan.) The emperor of Japan at the time was Hirohito, and Hitler was in control of Germany and much of Europe.
Both Germany and Japan enslaved the peoples of the countries they conquered, primarily for forced labor and to control the population.
Japan=Emperor Hirohito, Germany=Hitler.
Japan didn't invade Germany. During World War II, Japan actually joined Germany. It was part of the Axis which included Germany, Italy and Japan. After the United States declared war on Japan, Italy and Germany, as Japan's allies, declared war on the United States.
Japan did not attack Germany. Japan had an alliance with Germany and Italy. Japan fought the USA (after Pearl Harbour), Britain and her commonwealth (which included Australia).
JIG: Japan, Italy, Germany
No. Germany and Japan are two different countries, a long way from each other.
After Japan attacked Pearl Harbor, we declared war on Japan and Germany.
Japan, Germany, and Italy