Christianity teaches us how to live life.
Christianity teaches us that there is more to life than what we live here on earth: there is life after death. It teaches us that there is a spiritual world.
It also teaches us that life as we know it was no accident: life was created by a Creator, and we were created to have fellowship with our Creator. But because the first man and woman rebelled against the Creator, all people born since that time are spiritually dead.
The New Testament is the Holy Book that records the life of Jesus Christ and the lives of the early Christians.
To be saved by believing in Jesus.
Christians today use Jesus' teachings as a guide for their own lives in the hope of attaining salvation.
NO!! It is important to ALL who want to learn the truth.
Virtually all Christians believe that Jesus was God in human form and that He rose from the dead on Easter Sunday. YES. The life, death, and resurrection of Jesus Christ will always be important for all Christians, everywhere and for all time.
Christians believe that by His death, Jesus made it possible for us to have eternal life, that He died that our sins might be forgiven. Therefore, believing this, Christians acknowledge Jesus as their Saviour.
The Bible
Christians who are willing to give up their lives for Jesus, go to another country, and tell others about Jesus.
They do. If they did not, there would be no more Christians. If they do not, it is because they want to devote their lives to the Lord Jesus.
The Bible symbolizes holiness to Christians. In it we learn about the work of God and His Son, Jesus.
No! The Sanhedrin didn't believe Jesus had ever come back to life; that's why they demanded that the Romans persecute the Christians, because the Christians told everyone that Jesus had risen from the dead.
Christians believe that Jesus is the Son of God who came to Earth to save humanity from sin. He is important to Christians because they believe he is the savior who offers forgiveness, eternal life, and a path to God.