The time period from around 1400 CE to just past the mid-1800s. After that, most of the American continents fought back and won their freedom from the European countries that had previously conquered and exploited them.
America never took over Europe.
Because winds tend to blow from North America towards Europe.
eastern Europe
It was not Africans or Americans who took over Europe; Asia and all those other countries were battling Europe's attempts to take over their world.
Black bears are found across much of North America and Asia: the American black bear in the former, the Asiatic black bear in the latter.
No, but they were already making plans to take over California and the Pacific coast of North America anyway.
Badgers are found in North America, Europe, Africa, and Asia.
It's not that they owned Florida; America took it over.
America did not want Germany to take Europe for many reasons. First and foremost, America was afraid that if Germany were able to take Europe, there would be no stopping them. Another reason, which wasn't widely known until the war's end, was that Germany was exterminating not only the Jews but anyone the Nazi Party deemed undesirable. It didn't matter if you were an ordinary person; what mattered at the time was that everyone had to do what was "best" for the country, in this case Germany. Germany was practicing a very extreme version of eugenics, a pseudoscience that originated in Britain and gained momentum in North America before influencing Europe. America practiced eugenics too, but not to the extent Germany did. Hope this helps, but I would suggest further research.
Some diseases brought over from Europe to the Americas included smallpox and measles.
Yes.
most of Europe