Is American society improving

Anonymous

14y ago
Updated: 8/20/2019

It depends on your political views. I would say yes, things are getting better, slowly but surely. Gay marriage is being legalized state by state (the governor of Washington state will soon sign a bill legalizing gay marriage, making it the seventh state to do so), and Americans are beginning to see the 'light': they are starting to recognize the problems in government, as movements like Occupy show. Things are getting better, though if you are conservative, it may not seem that way.
