Outside the medical field, the title "doctor" generally indicates that an individual has earned a doctoral degree (such as a Ph.D.) in their field and has often conducted research.


Wiki User · 11y ago
