Best Answer

They are not too different from the rest of the world, but mostly Christian.

Answer

If your question means to ask "what religions are practiced in Germany," the answer is: nearly all the same religions that are practiced in the United States. Traditionally, Germans were Protestant in the north and Catholic in the south, with a large Jewish population, especially in the big cities. Since the Holocaust, Judaism is much less practiced in Germany, but there are still vibrant and growing Jewish communities throughout the country. In addition, a large Turkish minority practices Islam, and a large Greek minority practices Orthodox Christianity. Nearly every other major world religion, from Hinduism and Buddhism to Scientology, is practiced in Germany today, and many Germans are agnostic or atheist as well, just as in the United States.

Wiki User

14y ago
Q: What do Germans believe in?
Related questions

What do Germans believe?

Germans believe in God, much as Americans do.


What did Hitler believe was the Master Race?

The Germans.


What did Hitler believe that Germans were?

He believed they were the "perfect race."


Did Hitler believe Germans need lebensraum?

Yes.


Do Germans believe in ghosts?

Fifteen of my friends are German, and they believe in ghosts. You can't really say whether Germans in general believe in ghosts, because there are believers and, well... non-believers.


Did Hitler tell the Germans what they wanted to believe?

Yes, and they were all lies.


Why did the Franks and other German Jews believe they were true Germans?

Because they were...


Who did Hitler believe were true Germans?

Anyone blonde and blue-eyed.


Pennsylvania Germans believe a White Christmas indicated what color of Easter?

Green.


Where in Virginia did Germans and Scots-Irish settle?

I believe they settled mostly in Jamestown.


How did people feel about the Germans at first?

I believe the English hated the Germans at first.


What did Hitler do to make the Germans believe him?

He became Chancellor of Germany and reduced some of the unemployment.