Actually, social studies has everything to do with society. Social studies is defined as the study of social relationships and the functioning of society, and it is usually made up of courses in history, government, economics, civics, sociology, geography, and anthropology. So it has everything to do with being social.

Wiki User

13y ago
