Best Answer

It began dying out over a long period of time. Racism has been on a downward trend ever since the Civil War. After Martin Luther King Jr. gave his famous "I Have a Dream" speech, people increasingly came to see African Americans as equals.

Wiki User

10y ago
More answers
Wiki User

13y ago

No, most certainly not. There was still an uphill battle for equality long after slavery, just as there is still an uphill battle in the US to educate the ignorant and teach the intolerant.

Wiki User

11y ago

It hasn't.

Q: What has happened to racism in America?