Hostage to Your Employer: How a WWII Policy Locked U.S. Health Care to Jobs
If you lose your job in America, you often lose your health care. This stark reality baffles people in many other countries, where medical coverage doesn’t vanish with a pink slip. In the United States, however, your ability to see a doctor is frequently tied to your employer, a system born of historical accident and now a source of heated debate. Why do Americans get health insurance through their jobs, unlike virtually every other developed nation?