
The unprovoked attack by the Japanese on Pearl Harbor


Wiki User

10y ago


Related Questions

Why was the US upset with Japan?

The U.S. was angered at Japan in WW2 because the Japanese attacked Pearl Harbor, a U.S. naval base, without provocation, which led the American government to declare war on Japan.


Sinking of the Lusitania led to what?

The sinking of the Lusitania in 1915 turned American public opinion against Imperial Germany and contributed to the United States declaring war on Germany in 1917 and entering the First World War.


What led to the US's full involvement in World War 2?

The immediate cause was Japan's attack on Pearl Harbor. This caused the US to declare war on Japan, which led to Germany and Italy declaring war on the US.


What led to America declaring war?

Post your question again, please, and tell us which war you are thinking of.


What event in North America led to the Enlightenment?

The Enlightenment originated in Europe and preceded the American Revolution; its ideas helped inspire the Revolution rather than the other way around.


What two men led America in the war against Japan in WW2?

Presidents Franklin D. Roosevelt and Harry S. Truman led the United States during the war with Japan; in the Pacific theater, the campaigns were directed chiefly by General Douglas MacArthur and Admiral Chester Nimitz.


Which event led to the Seven Years War in America?

The formation of the Ohio Company, whose claims in the Ohio Valley brought British and French colonial interests into conflict.


What event directly led to America's entry into World War 2?

The attack on Pearl Harbor.


What event directly led to the independence movement in Latin America?

The Peninsular War, when France under Napoleon Bonaparte invaded Spain and Portugal.


Did America attack Japan after Japan attacked Pearl Harbor?

Yes. The United States declared war on Japan the day after the attack and fought the Pacific War against Japan until its surrender in 1945.


What event occurred on dec 7th 1941?

On December 7, 1941, Japan launched a surprise military attack on the United States naval base at Pearl Harbor, Hawaii. This assault led to the destruction of numerous ships and aircraft and resulted in the loss of over 2,400 American lives. The attack prompted the United States to formally enter World War II, declaring war on Japan the following day. This event significantly altered the course of the war and U.S. foreign policy.


Which groups converged to form a new nation on the North American continent?

The thirteen British colonies in North America converged to form a new nation by declaring independence from Great Britain in 1776. This led to the creation of the United States of America.