Best Answer

The study of the materials that make up Earth and the processes that form and change those materials.

Wiki User

12y ago
More answers
Wiki User

13y ago

The study of Earth as a scientific body, including the origin and history of its rocks, soil, and other spheres; it also involves the study of life on Earth.


Wiki User

10y ago

According to an American dictionary, the word 'geology' means the science that deals with the Earth's physical structure and substance. It can also refer to the geological features of an area.

Q: According to an American dictionary what does the word geology mean?