I'm not sure what they think about life on earth, but they believe that when you die, you will enter the gates of Heaven and live an eternal life with peace and no worries. Or something along those lines.

Well, that's not entirely right. We believe that life on Earth is sacred from "the womb to the tomb." We believe that abortion is wrong and that the death sentence is wrong (for the most part); we believe in pretty much anything that promotes life. As for the afterlife, we believe that God will judge us at the moment of our death. It will be like a slide show of our life in which He will judge our rights and wrongs and send us either to Heaven or Hell. Hell is eternal separation from God, while Heaven is eternal union with God. In Heaven, we believe there is no sin, so it is completely pure and peaceful, like the person above me said. Life in either Heaven or Hell will be eternal. If God feels that you're still worthy of going to Heaven but not completely ready, He will send you to a place called Purgatory. Purgatory is like a halfway mark where you will be purified until you are ready to go to Heaven. Every soul that goes to Purgatory will eventually make it to Heaven; none will go to Hell. Hopefully that summed it up for you!

Wiki User

14y ago


Related Questions

When do Christians believe life starts?

Christians believe that 'life starts' at conception.


Who is Jesus and why is he important to Christians?

Christians believe that Jesus is the Son of God who came to Earth to save humanity from sin. He is important to Christians because they believe he is the savior who offers forgiveness, eternal life, and a path to God.


How did life appear on earth?

Christians and scientists have different ideas about this. Christians believe, quite simply, that God created everything on the Earth, from the smallest microorganisms to the gigantic redwood trees and blue whales. Christians reject the idea that life could have occurred just by chance. Scientists have different theories about the origins of life: the "big bang" theory describes the origin of the universe itself, while most scientific accounts of life's beginnings involve some form of abiogenesis, in which life arises from non-living matter.


Do Christians believe in the after life?

Yes, Christians all believe that the soul will live on after the body has died.


Is the pope the best person to decide what Christians believe?

The pope is the best person to decide what Catholics should believe, but not necessarily what ALL Christians should believe. It really is a matter of personal opinion. The Pope acts as a connection between all Catholics on earth and God, helping them lead a good and fulfilled life, so he knows what he is talking about.


What did Christians say created the earth?

Christians say and believe that the creation account in Genesis is true. God created the heavens and the earth.


Do Christians believe in life after death?

Yes, Christians believe in life after death. The Bible (God's word) teaches that.


Do Christians believe in life on other planets?

Some do, some don't. Some Christians claim that if we discover life on other planets, Christianity is proved wrong. But that is not so, because the Bible does not say whether earth is the only planet that can support life.


Is the bible needed to guide life?

Christians believe this.


What does it mean for Christians to believe in God as creator?

Christians believe that God is the creator because he created the whole earth in six days, along with Adam and Eve and every single living thing on, in, and coming to this earth.


What do Christians believe happens when you die?

Christians believe that when you die, your soul goes to either heaven or hell based on your faith and actions in life.


What do Christians believe is the purpose to life?

To serve and glorify God.