A dentist is a doctor. I believe you mean, though, a doctor in the general sense: someone who treats the rest of the body rather than the teeth. Here are some differences, if that is what you are asking:

Dentists work on teeth: they inspect them, clean them, and treat them. Their job covers only the teeth and gums, not the rest of the mouth. A doctor's job is to treat the other parts of the mouth, as well as other body parts and bones.

Wiki User

15y ago

What else can I help you with?