The New Deal, the series of federal (and some state) programs and initiatives launched during the 1930s, was indeed 'liberal.' In contrast to 'conservative' understandings of the responsibilities and limits of the federal government, the New Deal greatly expanded the federal government's role in American economic and social planning beyond anything it had previously undertaken. Fundamentally, it treated the federal government as responsible for ensuring stability and prosperity in domestic affairs.
