
Hollywood changed the concept of Voodoo, or rather invented its own concept, by making a mockery of a legitimate West African religion that was brought to the Caribbean by slaves. Filmmakers stripped Voodoo of its religious character and turned it into a dangerous entity, using, for instance, the walking dead to kill non-believers. Voodoo is still viewed as a weird belief because Hollywood makes money off such nonsense.

Another answer: I am a Christian and believe the Bible's answers to such questions as you pose.

If you care to know what the Bible teaches about sorcery and the dark arts, there are sermons by Ivor Myers at this link: http://www.audioverse.org/search/?query=ivor+myers

They cover different topics, starting from March 5, 2007 onward. Hollywood is also eventually addressed in this context in one of those topics.

I am listening to them again and will send you the exact date. All the topics are unbelievably intriguing and deal with the sinister practices that God asks His followers not to involve themselves in.


Wiki User

Answered 17 years ago
