It became the biggest country in Africa.
Yes, Germany is in fact a country in Europe. It borders Denmark, Poland, the Czech Republic, Austria, Switzerland, France, Luxembourg, Belgium, and the Netherlands.
Some cities in Germany are: Berlin, Bremen, Hannover, Köln, Hamburg, Leipzig, Dresden, and Bonn.
Well, it was Germany, because of the Nazi Party. Hitler wanted revenge on the French and the British, so Germany invaded Poland in 1939, and Britain and France declared war on Germany. Skipping ahead, the last thing I will mention is that Germany invaded Russia in 1941.
He became Chancellor of Germany and reduced some of the unemployment.
Germany, just like every other country, has many different cultures, depending on which part or state of Germany you refer to. To get an idea of the differences, read more here: http://travel1000places.com/destination/visit/show.aspx?country=Germany.
The country of Germany itself does not sell products, but German companies do. Some counterfeit goods turn up there, just as in every other country.
Greece
No. Europe is not a country; it is a continent. And yes, some parts of Germany and France had pagan religions.
There are many things to see and do in Germany. Germany has many beautiful castles to visit, great food, wonderful cities to see, the Black Forest, mountain areas, and wonderful little villages.
Technically, yes. I only say technically because not all of them were strictly German. Some of them simply moved to Germany and then became Nazis.
Germany is a country. Some of them obtained passports in Berlin. A detailed report is available here: http://www.9-11commission.gov/staff_statements/911_TerrTrav_Monograph.pdf