Yes, doctors are better than teachers.
Because teachers teach you the important things in life... and doctors know what's important for life.
No, of course not. Doctors are way more important; they save lives. OK, yeah, teachers are responsible for your education so you have a better life, but come on: what would you rather have, an education or be dead? Think about it. If there were no doctors, then all the teachers would get sick and possibly die. Actually, if there were no teachers, there would be no doctors, so they are actually both important.
Is your brain more important than your liver? Try living without one. This is not really a question; teachers and doctors are absolutely vital to a healthy and enlightened society. We cannot do without either group.
Doctors might be considered more important than teachers because they are the ones we go to when we are ill or injured, and they can save our physical lives. Actually, though, which one is more important depends on what your needs are. If you are healthy and not injured, but want to learn a new profession, a foreign language, or even how to live a healthier life, a doctor would not be important, but a teacher with expertise in the subject you want to learn would be. So, doctors are not more important than teachers. Doctors and teachers are equally important; they are just important in different ways. That is questionable, though: without teachers, there would be no doctors. It is not so much a matter of which is more important; they both serve very different functions.
As a doctor, I think teachers are more important. Without teachers, there wouldn't be any doctors. However, without doctors, teachers would still exist. They may not be as healthy as they could be, but they would exist, as they had for thousands of years.
Because when a teacher taught someone, he/she became a doctor. The previous statement is incorrect: teachers don't teach doctors, doctors teach doctors. But in reality, neither is more important than the other. Both are integral to how our society functions.
I would believe doctors are better, but they both do "good things." Doctors treat us when we are sick, while teachers teach us about things we don't actually need to know most of the time. They could be equal, but doctors are better in my opinion.
Yes, farmers are more important than doctors. Also, teachers are more important than accountants, and travel agents are more important than life coaches. Take notes; there will be a test.
We need both farmers and doctors to remain healthy. There are also quite a few other professions we need to sustain a healthy society, like teachers, police officers, businesspeople, etc. Discussions about which profession is more important seem to me quite idle and will only tend to cause conflicts.
Most probably. If you think about it, there are 50 teachers in each of millions of schools.
Is it because teachers can change a person, while doctors can only save a life?