Job benefits are the extras an employer offers its employees beyond wages. In the US, that usually means some form of health insurance and a retirement savings plan at a bare minimum. Many employers also offer things such as life insurance, legal insurance, and discounts on various goods and services (like cell phone discounts).

Wiki User

15y ago
