World War 2 changed a lot of things in America. The lives of many people, including Black Americans and other minorities, changed as a result of the war. World War 2 also improved America's standing, because the country gained respect from other countries.
It ended World War 2.
World War 2 changed the world forever by teaching us lessons about atomic warfare and the treatment of prisoners of war.
World War 2 killed many men, which reduced the male population and affected postwar birth rates in many countries.