Health care reform is the idea that an existing health care system should be changed because it is ineffective, wasteful, or unfair.
In the United States, there is an ongoing debate about what form health care reform should take.
On March 23, 2010, the United States enacted the Patient Protection and Affordable Care Act, which expands access to health care and strengthens oversight of insurers, drawing on approaches piloted by individual states.