Yes, they can. Under federal law, an employer can require you to pay for a mandatory drug test. As long as having the employee pay does not have the effect of discouraging minority job applicants or lowering the employee's effective wage below the federal minimum, the employer can charge you for the test.
Billing your health insurance is a form of billing you, even if your health insurance is from your employer.
A prospective employer may be interested in your health because many employers pay a portion of their employees' health insurance premiums, and premiums may be higher if you are in poor health or a regular smoker. However, a prospective employer is not legally allowed to ask questions about your health during an interview.
Mandatory Insurance
How you get your insurance depends on where you are. In the United States, your employer either offers health insurance or does not. Some musicians work for an employer that provides health insurance; many do not.
Yes, unless your health insurance is provided by your employer, in which case the employer determines what coverage you are allowed to have, because it is based on a company package purchased as a whole. Otherwise, you are now the master of your own health insurance, thanks to the Affordable Care Act (Obamacare).
Health insurance coverage is mandatory in Massachusetts for anyone over 18 who can find affordable insurance. Those with low income may be eligible for insurance at no cost.
Yes, the employer can pay for health insurance, but is not required to by law. Employers are encouraged to do so to improve their employees' benefits.
Can you drop your health insurance coverage at any time from your employer?
Under current United States law, it is not mandatory for individuals to carry health insurance. I don't think it's smart to go without health insurance, but it's simply not affordable for some people.
In the United States, health insurance is not mandated by the federal government. However, certain employers and educational institutions do require it. If you work as a medical resident, for example, health insurance is mandated.
Assuming you are asking about your employer's health plan post-termination, the employer has that responsibility.