AMERICA AFTER WWII
At the end of the war, America had the strongest economy and the strongest military in the world.
After World War II, Americans became much more involved in the world's affairs. With the Great Depression over, families were encouraged to have children, and the resulting baby boom created demand for more jobs and more schools.
During the war, food and gasoline had been rationed for families, and metal, tin, and rubber were collected to help the war effort.
World War 1 happened first and World War 2 happened second.
The Korean War happened after WW2.
World War 2 and the Holocaust.
They were modified or put in museums.
It was the Great Depression.
There will be no World War 2.
The US entered World War 2 in 1941 and the war ended in 1945.
Nobody knows.
What happened to the nations that lost World War 2?
The industries were probably destroyed by the Russians.
They won, but they lost a lot of soldiers along the way.
Japan bombed us, I think. :( - SpyAdventurer
Japan's attack on Pearl Harbor.
The Japanese attacked Pearl Harbor, and that led the US to join World War Two and seek revenge.
It is pointless to surmise what MIGHT have happened if such and such had happened, or hadn't happened. History is what it is. What if Adolf Hitler had been killed fighting in World War I? See what I mean?