Why do schools require students to study biology if it is not their major?

I'm in no way an expert on the academic climate in America, but I believe the purpose of taking any class outside one's major is to make that person more well-rounded. It's hard to say, "I'm going into this field, so I ONLY need to know these things" -- you never know what else a job may require you to know.

Every educated person needs a basic understanding of certain subjects, regardless of his or her major or future profession. Taking introductory courses in bio, chem, and physics is, or should be, part of every well-rounded person's education. (And you can add the "dismal science" as well: economics.) Although you may never apply a single fact or concept you learn in science to your work or profession, you need a rudimentary understanding of the natural sciences to follow the issues and ethical dilemmas that face all citizens and are the subject of myriad debates and conflicts in this country -- and the world; e.g., (in no particular order) genetic engineering, evolution, abortion, stem-cell research, medicine, nutrition, global warming (or, if you prefer, climate change), the environment, agriculture, food additives, smoking, drug use, sexually transmitted diseases, vaccinations, cancer clusters, pesticides, energy and its alternative forms, etc. There are others, of course, and the list would be longer if I had included economic issues. (There may be even more ignorance about economics than about the natural sciences, which is really disturbing when you think about it.)