Doctors are not ignorant. They have many years of education, residency, and internship behind them before they ever practice on their own. Doctors are also required to complete continuing education and to stay current in their particular medical field.

From AllHorses101: Some doctors can seem dismissive when you describe your symptoms and they do not listen. However, most doctors are well trained and know what they are doing; otherwise they would not be doctors. :D


Wiki User

12y ago
