Yes, the USA is a capitalist country. The goal of most Americans and most American companies is to make money or to increase the capital they hold.