Hostage to Your Employer: How a WWII Policy Locked U.S. Health Care to Jobs

If you lose your job in America, you often lose your health care. This stark reality baffles people in many other countries, where medical coverage doesn’t vanish with a pink slip. In the United States, however, your ability to see a doctor is frequently tied to your employer – a system born of historical accident and now a source of heated debate. Why do Americans get health insurance through their jobs, unlike virtually every other developed nation? The answer stretches back to World War II and has created a legacy that affects inequality, innovation, and even life-and-death decisions about care.

Most working-age Americans (about 60%) receive health insurance through an employer, either their own or a family member’s. This employment-based coverage dominates U.S. health care for those under 65. It might feel “normal” to Americans, but it’s an oddity internationally: in other rich countries, the government usually guarantees health coverage as a right of citizenship, decoupled from any particular job. The American approach raises troubling questions. What happens if you’re laid off or your employer doesn’t offer benefits? Why should a CEO have platinum health benefits while a part-time retail worker has none? And can a system built for the 1940s survive the needs of a 21st-century workforce?

In this in-depth exploration, we’ll uncover how wartime policies and political battles created the employer-based health insurance system. We’ll compare the U.S. path with those of the UK and China – two very different nations that both chose to make health care more universal. And we’ll examine the consequences: a patchwork of coverage that leaves millions uninsured or “job-locked,” clinging to jobs they might otherwise leave for fear of losing health care. It’s a journalistic journey into history’s echo in today’s headlines, asking whether Americans will remain hostages to their employers for health care – or finally break free.

Wartime Wage Freezes and the Birth of Job-Based Benefits

To understand why American health insurance is tied to employment, we must time-travel to the 1940s. World War II was in full swing, and the U.S. government imposed strict wage controls to prevent inflation. Employers, desperate to attract workers in a booming wartime economy, couldn’t lure talent with higher pay – so they got creative. They began offering health insurance as a perk on top of frozen wages.

It was an ingenious workaround: companies like Kaiser shipyards on the California coast started providing medical care to their workers to keep them healthy and productive. In Richmond, California, industrialist Henry J. Kaiser opened company clinics and even a field hospital for his shipyard workers, who paid 50¢ a week for care. For many laborers, this was the first time they had regular medical access, and it quickly became a prized benefit. As one historian noted, health insurance became “the most valued benefit in the workplace” as companies used it to compete for scarce labor during WWII.

Critically, the government reinforced this workaround. In 1943, the War Labor Board – which enforced wage controls – ruled that fringe benefits like health insurance would not count as wages, so offering insurance did not violate the wage freeze. Soon after, the IRS determined that employer health contributions were tax-free. In effect, Washington blessed and subsidized the idea: a company could spend a dollar on an employee’s health plan and it was worth a full dollar to the worker, whereas a dollar of wages would be whittled down by taxes. This tax quirk supercharged the trend. During the war, employers could offer fringe benefits worth up to 5% of wages without running afoul of the wage freeze. What began as a temporary wartime expedient became permanent policy in 1954, when the tax exemption for employer-sponsored insurance was written into the Internal Revenue Code.

Thus, by the late 1940s and 1950s, America’s unique path was set. Rather than a government-run health system or a universal insurance program, the private sector and workplaces became the main source of coverage. After the war, powerful labor unions made health benefits a standard demand in contracts, expanding coverage to millions of unionized workers. Non-union companies followed suit to stay competitive in recruiting. Health insurance as a job benefit was here to stay, snowballing through the postwar economic boom.

This historical accident – wage freezes leading to fringe benefits – explains why the U.S. never adopted national health insurance when other countries did. In the 1940s, President Harry Truman did propose a national health program, envisioning coverage for all Americans. But he was stymied by fierce opposition, notably from the American Medical Association (AMA). The doctors’ lobby portrayed government health insurance as a menace; in 1948 the AMA rallied physicians against Truman’s plan, warning that it was “socialized medicine” that could lead to socialism in all aspects of American life. Even earlier, as far back as 1918, U.S. voters had been frightened away from government health insurance. In California that year, a referendum for state-run health coverage lost badly after doctors campaigned against it, calling compulsory health insurance “a dangerous device invented in Germany, announced by the German Emperor from the throne in the same year he started plotting to conquer the world.” In the fevered patriotism of two world wars, linking public health care to the German Kaiser or to the Red Scare was a winning scare tactic for opponents.

As a result, while Europe and other regions were moving toward universal health systems, the U.S. tacked in a different direction. By the mid-20th century, the majority of American middle-class families had health security – but only as long as the breadwinner had a job with benefits. Those outside the workforce or in low-wage jobs were left in the cold. It wasn’t until 1965 that the U.S. patched some of the biggest holes by creating Medicare (government insurance for the elderly) and Medicaid (coverage for the very poor). These programs acknowledged what other countries had long assumed: older and low-income citizens needed guaranteed health coverage beyond the employer system. Still, for working-age Americans and their dependents, employment-based private insurance remained the default – a vast, tax-subsidized system of patchwork private plans, administered by thousands of companies.

One irony in this history is that America’s reliance on private, job-tied insurance was not born from a careful plan, but from improvisation under pressure. A wartime measure to prevent inflation ended up embedding itself in the fabric of American life. Decades later, we live with the consequences: health care in the U.S. is a job perk first, and a human right second – essentially an accident of history turned into an institution.

The Global Outlier: Other Countries Chose Universal Coverage

To grasp how unusual the U.S. path is, consider two very different countries: the United Kingdom and China. Both are economic and geopolitical peers of the U.S. in different ways – one a long-established democracy, the other a fast-growing, one-party state that has embraced market economics. Neither ties health insurance to employment the way America does. Their choices cast America’s system in sharp relief, highlighting alternatives that might seem radical to Americans but are standard elsewhere.

In the UK, health care became a right for all citizens in the aftermath of WWII. With their economy in ruins from the war, the British did something astonishing: they built the National Health Service (NHS), a publicly funded, universal health care system. Starting in 1948, every UK resident “automatically [became] entitled to free public health care through the NHS, including hospital, physician, and mental health care”. This bold move was inspired by the wartime Beveridge Report of 1942, which envisioned a cradle-to-grave welfare state to slay the “five giants” of want, disease, ignorance, squalor, and idleness. Health care was to be “comprehensive and free,” replacing the patchwork of voluntary insurance and charity care that existed before. Despite Britain’s economic struggles post-war, there was a political consensus that health care should be provided to all as a basic right, funded by taxes. The NHS model meant employers play little role in basic health coverage – aside from paying taxes like everyone else. Whether you’re employed, unemployed, a child, or retired in the UK, you can go to the doctor or hospital without ever worrying about an insurance network or a bill (barring small fees like prescription co-pays). Private insurance exists only as a supplemental option for quicker access or extras, carried by perhaps 10% of Britons. To most Brits, the idea that your job would determine your health care is perplexing; the NHS turned health care into a public service, as fundamental as policing or public education.

China, on the other hand, took a different road to (near) universal coverage, one that mixes employment-based insurance with government programs. In the Mao era (1950s–1970s), China had rudimentary collective health care (think “barefoot doctors” in villages and state-run clinics in cities). That system eroded during the market reforms of the 1980s, leaving many Chinese to pay out-of-pocket. Faced with public outcry over health costs, China in the early 21st century built a new system that by 2011 covered 95% of its 1.3 billion people – arguably the largest expansion of health insurance in human history. How did they do it? By creating three main insurance schemes: one employment-based program for urban workers, and two government-run social insurance programs for those outside formal employment.

Under this model, formal sector urban employees in China are required to enroll in the Urban Employee Basic Medical Insurance, a program funded by payroll taxes from employers and employees. In essence, if you work for a company in a city, part of your paycheck (and your employer’s payroll) goes into a health insurance fund. But crucially, if you don’t work for a formal employer, China has you covered too: rural residents and urban unemployed (including children and the elderly) can join the Urban-Rural Resident Basic Medical Insurance, which is heavily subsidized by the government. The government pours funding in, and individuals pay only small premiums. By mandating employer contributions for those who do have jobs and using tax revenues for those who don’t, China managed to extend at least basic coverage to virtually everyone. The quality and depth of coverage can still vary – Chinese insurance often has significant co-pays and caps, and wealthier citizens sometimes buy private supplemental insurance. But the core principle is clear: no one is left completely uninsured because they don’t have a job. Even the relatively few permanent foreign residents in China are entitled to the same basic coverage as citizens, a stark contrast to the U.S. where immigrants’ access can be very limited.

These two examples underscore a key point: there’s nothing inevitable about tying health care to employment. The UK chose a publicly financed route – essentially socialized medicine. China chose a social insurance route – a mix of employment-based and government-funded pools – to reach universal coverage. Many other countries have their own models: Germany and Japan use nonprofit “sickness funds” often linked to occupations but regulated and subsidized by government; Canada uses provincial single-payer insurance; etc. In virtually all advanced economies, and even many developing ones, health coverage is portable and guaranteed, regardless of your job status.

By contrast, the U.S. stands out for neither providing universal public insurance nor legally mandating employers to cover everyone (except for certain employer mandates added recently under the Affordable Care Act). The American system might be described as “voluntary” employer-based insurance, propped up by large tax subsidies and supplemented by public programs for some groups. For decades, this made the U.S. an outlier: the only rich nation without universal health coverage. Even after the Affordable Care Act (ACA) reforms of 2010, which expanded coverage considerably, about 25 to 30 million Americans remain uninsured in 2025. That’s roughly 9% of the population – a figure unthinkable in London or Beijing.

To be fair, the ACA did push the U.S. slightly closer to its peers by filling some gaps. It created marketplaces for individuals to buy insurance (with subsidies) independent of employment, and it expanded Medicaid for low-income adults in many states. It also imposed an employer mandate on larger firms: companies with 50+ full-time employees must offer affordable health insurance or pay a penalty. These measures nudged the U.S. system toward broader coverage, but they did not fundamentally sever the link between jobs and insurance. Instead, they layered new components atop the old structure. The result today is a sprawling hybrid: about 164 million non-elderly Americans get employer-sponsored insurance, 89 million get government coverage (Medicaid or Medicare under 65, etc.), and the rest buy private plans or go uninsured. It’s a far cry from the simplicity of the UK’s “everyone is covered” or China’s near-universal enrollment.

For Americans, these international contrasts can be jarring. They illustrate that the way U.S. health care is organized is not a natural law but a policy choice – or series of choices and accidents – that diverged from the path others took. Seeing that alternatives exist raises an uncomfortable question: What are the consequences of America’s employer-based system, and are Americans truly better off with this approach?

Inequality, “Job Lock,” and the Human Cost of Tying Health to Work

The employer-based system has managed to cover a majority of working Americans and their families since the mid-20th century, which is no small feat. Many people are indeed satisfied with their job-provided insurance, especially if their employer is generous. However, the flip side of this model is glaring inequality and insecurity. Essentially, it creates a class system of health haves and have-nots – often mirroring the disparities in the labor market.

If you land a good job with benefits, you likely have access to comprehensive health insurance for you and your dependents. Big, profitable companies and government agencies typically offer robust health plans. In 2023, virtually all large firms (98% of those with 200+ workers) offered health benefits to at least some employees. These plans often cover families, and larger employers usually pay a significant share of the premium. Workers at such companies might only contribute 17% of the premium for single coverage, on average, and about 29% for family coverage, with the company covering the rest. That can be a valuable subsidy worth tens of thousands of dollars per year. The federal tax break sweetens the deal further – employees don’t pay income or payroll tax on the value of those health benefits. For middle- and upper-income workers, this is a huge hidden benefit. For instance, a worker in the 22% tax bracket would need nearly $27,500 in extra salary to buy a $20,000 family policy on their own (after taxes), whereas through an employer they get the $20,000 policy tax-free. As a result, higher earners in high tax brackets effectively get more tax savings from employer insurance (each health insurance dollar is untaxed at, say, 37% for a top earner, versus 10% for a low earner). Over time, this has amounted to what economists call an upside-down subsidy – roughly $273 billion in federal tax revenue was foregone in 2019 due to the exclusion of employer health benefits from taxation, a benefit skewed toward richer workers with pricier plans.
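To make the gross-up arithmetic behind that comparison concrete, here is a rough sketch. The combined marginal rate used below is an assumption (roughly 27%, i.e. the 22% federal bracket plus a few points of state or payroll tax); the figure quoted above doesn’t spell out exactly which taxes are counted, so treat this as illustrative rather than exact:

```latex
% Pre-tax salary S needed so that, after tax at combined marginal rate t,
% a worker is left with enough to buy a policy costing P out of pocket:
%   S(1 - t) = P   =>   S = P / (1 - t)
% Illustration with assumed rate t = 0.27 (22% federal bracket plus a few
% points of state or payroll tax -- an assumption, not a figure from the text):
\[
  S \;=\; \frac{P}{1 - t} \;\approx\; \frac{\$20{,}000}{1 - 0.27} \;\approx\; \$27{,}400
\]
% By contrast, the same $20,000 routed through an employer plan is excluded
% from income and payroll tax, so its full value reaches the worker.
```

The higher a worker’s marginal rate, the larger the gap between the two routes, which is why the exclusion delivers its biggest savings to the highest earners.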

But what if you have a low-wage job, or work part-time, or for a small business? The picture can be starkly different. Smaller firms and low-paying employers are far less likely to offer health insurance at all. Only 39% of businesses with fewer than 10 workers offer health benefits. In 2023, just over half (53%) of firms with 3 or more employees offered any health plan to any of their staff – meaning nearly half do not. These non-offering employers tend to be concentrated in certain sectors: think of many restaurants, retail shops, gig economy platforms, farms, and independent contractors. Even when smaller employers do offer insurance, the coverage often isn’t as generous. Workers at companies with many low-wage employees face higher premium costs on average, and small firms sometimes require employees to pay a greater share of the premium (for instance, a higher percentage of the cost of family coverage). Part-time and gig workers are at an especially high risk of being uncovered. Only a minority of employers extend health benefits to part-timers – in 2023, about 26% of large firms and just 13% of small firms that offer health benefits included part-time employees in those plans. Many employers deliberately limit workers’ hours or classify them as contractors to avoid benefit obligations.

The outcome of these practices is that millions of working Americans have no insurance at all, or pay dearly for individual plans. It’s telling that among adults aged 18-64, one in four people with a job isn’t covered through their employer – either because their employer doesn’t offer a plan, or they’re not eligible for it, or they can’t afford the premiums. These tend to be folks in the lower rungs of the income ladder. Data shows that workers with low incomes are significantly less likely than higher-income workers to even be eligible for employer coverage. Small businesses – often hailed as “the engine of job creation” – are the very employers who often “can’t afford to offer workplace insurance,” as economist Melissa Thomasson points out. So we end up with a paradox: the people who might need the security of health coverage the most (those earning modest wages or in unstable jobs) are least likely to have it.

Before the ACA’s coverage expansions, this meant tens of millions of working Americans were uninsured, essentially locked out of the healthcare system unless they paid full price or qualified as “poor enough” for Medicaid. The ACA has somewhat alleviated this by offering subsidized individual plans and expanding Medicaid in many states, and indeed the U.S. uninsured rate hit historic lows around 9% in 2023. Still, gaps remain. In states that didn’t expand Medicaid, an adult working a low-wage job can earn too much to qualify for Medicaid but not enough to afford a private plan – falling into a “coverage gap.” And when individuals do buy insurance on their own, it’s often costly. Many uninsured say the main reason is that coverage is too expensive. It’s not hard to see why: outside of an employer group, premiums are high, and even with ACA subsidies some people find it unaffordable or complicated to enroll.

Beyond the bare numbers of who is covered and who isn’t, the employer-based system impacts quality of life and economic choices in subtle but profound ways. One notorious phenomenon is known as “job lock.” Economists use this term to describe how workers who get health insurance from their job become reluctant to leave that job, lest they lose their benefits. In a society where health coverage isn’t guaranteed, your current job’s health plan can feel like a lifeline – golden handcuffs that keep you from pursuing new opportunities. Surveys back this up: around 40% of Americans say that health insurance considerations have affected their career decisions, whether it’s not switching jobs, not reducing hours, or avoiding self-employment, because they fear losing or interrupting their coverage. Think of the talented professional who stays in a stagnating position because her child has a chronic illness and she can’t risk a gap in insurance; or the would-be entrepreneur who sticks with a corporate job for the health plan, shelving a startup idea. Job lock doesn’t just affect individuals – it ripples through the economy. By one analysis, the U.S. has a noticeably smaller small-business sector and lower self-employment rates than other wealthy nations, and job lock is cited as one contributing factor. When people are “locked” into jobs for benefits, labor mobility suffers and the economy may lose some dynamism and innovation. In a real sense, tying health care to jobs can stifle the American Dream of striking out on one’s own.

There are also health consequences to the current system’s inequalities. People without job-based insurance – whether uninsured or juggling inferior plans – tend to delay or skip care due to cost. Uninsured Americans are far more likely to forgo doctor visits, prescriptions, and preventive screenings, resulting in worse health outcomes. A medical emergency or a serious diagnosis can spell financial ruin for those without adequate coverage. Paradoxically, even those with employer insurance can face high out-of-pocket costs (via deductibles and co-pays), which have been rising over the years. But the uninsured or underinsured have it worst: medical bills contribute to a significant share of personal bankruptcies in the U.S. and cause immense stress. In a society where losing your job can mean losing your health coverage, any economic downturn or layoff becomes a public health crisis as well as an economic one. This was vividly demonstrated during the COVID-19 pandemic in 2020. As millions of Americans were laid off in the sudden recession, many also lost employer health insurance overnight. Researchers estimated that by June 2020, roughly 7.7 million workers lost jobs that came with health insurance, affecting coverage for an estimated 6.9 million of their dependents as well. While emergency measures and ACA safety nets (like special enrollment periods and Medicaid expansion) helped many of these folks find alternate coverage, it highlighted a frightening fragility: at the very moment people most needed health care – during a public health emergency – coverage was contingent on something as unrelated as their employment status.

Finally, there’s an underlying philosophical question of fairness and social solidarity. The employer-based system tends to pool healthy and sick together within a company, but it segments society at large. If you work for a firm of 1,000 people, you share risk among that relatively small group; if that firm has younger, healthier employees, your premiums stay low, but if it has older/sicker ones, premiums rise or benefits might get cut. People working for generous employers enjoy peace of mind, while those working for stingy or struggling employers do not. It’s health care by lottery of where you work. Contrast that with national systems (like the NHS or Canadian Medicare) where the entire population shares the risk and cost, and everyone has the same basic coverage. The U.S. approach, as some critics argue, violates a sense of egalitarian justice: do we really believe a janitor working for a large corporation deserves better health care than a janitor working for a tiny mom-and-pop business? Or that a person should fall into medical debt because they happen to be self-employed or between jobs? These uncomfortable questions grow sharper as awareness increases that most other nations long ago answered them by decoupling health care from employment status.

Rethinking an Unintentional System

The story of why U.S. health care became employer-based is, at its core, a tale of unintended consequences. A wartime labor policy and a tax decision sprouted an entire health insurance ecosystem that no one consciously designed from scratch. Now, more than 75 years later, Americans are still navigating the maze that grew from those seeds. Is this system still fit for purpose, or is it time to rethink the fundamental link between work and health care?

There are signs of change – or at least, of dissatisfaction – on the horizon. Polls have often shown that a majority of Americans support the idea of universal health coverage in principle, and debates about proposals like “Medicare for All” or a public option routinely flare up in election seasons. The very existence of the ACA’s marketplaces and Medicaid expansion reflects an acknowledgement that the old system was leaving too many people out. Yet, political hurdles to a more radical overhaul remain high. Powerful stakeholders (insurance companies, some employers, and even segments of the public satisfied with their plans) resist sweeping changes. The employer-based system has inertia on its side; it’s deeply embedded in how American employers compensate their workers, and how Americans expect to obtain health insurance.

Interestingly, some large employers themselves have voiced concern about the sustainability of being the primary providers of health care. The cost of employer-sponsored insurance continues to climb, eating into paychecks and company profits alike. In 2023, the average annual premium for a family plan through work was about $24,000 – roughly equivalent to the price of a new car every year. Employers foot most of that bill, but ultimately it comes out of workers’ total compensation. Over the past decades, wage growth has been sluggish in part because more compensation has been diverted to health benefits instead of pay raises. Corporations in global competition sometimes complain that funding employees’ health insurance puts them at a disadvantage against foreign firms whose workers are covered by national health systems. Small businesses, as noted, struggle even more. In fact, the smaller the firm, the heavier the burden: not only are small employers less likely to offer coverage, those that do often pay higher premiums due to having smaller risk pools and less bargaining power.

Recognizing these issues, ideas like “portable benefits” are gaining traction. This concept means benefits (health insurance, retirement plans, etc.) would be attached to the individual, not the job, allowing people to carry their health coverage with them from gig to gig. It’s essentially an attempt to simulate for private insurance what Medicare or national insurance does in other countries – make it universal and continuous, irrespective of employer. Some envision a system of personal insurance accounts or a federal program one can buy into, which employers (if they want) could contribute to without being solely responsible for managing a health plan. Others push for expanding public programs (for example, lowering the Medicare eligibility age, or adding a public insurance option for anyone to join) as a way to erode the rigid boundary between those with cushy job insurance and those without.

Yet, incremental tweaks aside, the central dilemma remains political and cultural. Americans historically have been wary of government-run health care, a sentiment rooted in the same decades that gave us employer-based insurance. The legacy of terms like “socialized medicine” still looms large in political discourse, even as programs like Medicare (which is a single-payer government insurance for seniors) became beloved and untouchable. There’s an element of “the devil we know” – people with good employer plans often fear losing what they have under any reform, while those without tend to lack the political clout to force change.

However, the terrain is shifting slowly. Each generation of Americans has fewer people in traditional long-term, full-time jobs with benefits. The rise of the freelance and gig economy means more people lack that default job insurance safety net. Additionally, the COVID-19 pandemic was a wake-up call that tying health care to jobs can be, frankly, dangerous: it’s hard to justify a system where a sudden virus-induced recession can swell the ranks of the uninsured overnight. Public opinion after the pandemic has shown increased openness to bold health reforms, according to some polls, as health security feels more urgent.

In the end, the question of “Why is U.S. health care employer-based?” can’t be separated from “Should it continue to be?” The historical reasons – wage controls, tax breaks, union negotiations – are artifacts of a bygone era. They answered the needs of the 1940s and 1950s, when the workforce was mostly male, often stayed with one company for life, and government solutions were politically untenable. Today’s America looks very different. The workforce is more mobile and diverse; people change jobs frequently; women, gig workers, and self-employed entrepreneurs form a huge part of the economy. Medical costs are vastly higher now, and an untreated illness can be financially catastrophic. In this context, the employer-based system often seems like a rusty machine, kept running through patchwork fixes and costly subsidies, rather than a sleek model fit for current purposes.

Americans are effectively paying a price for that historical accident – in money, yes, but also in freedom and peace of mind. The price is paid by the worker who hesitates to start a business, by the family that goes uninsured because their small employer can’t help, by the patient who delays care until an ER visit because their coverage lapsed with their job. It’s also paid in a less obvious way by everyone: the tax exclusion for employer insurance is, as noted, the government’s largest health subsidy, meaning taxpayers collectively underwrite this system to the tune of hundreds of billions of dollars. One could argue that Americans already pay for universal health care – they just do it in a convoluted, unequal way that doesn’t cover everyone.

There is a growing realization among some policymakers and business leaders that the status quo isn’t just inequitable, it’s also inefficient. Even Republican Senator John Thune’s recent comments – framing health insurance as something that “a lot of times… comes with a job” and suggesting the goal is to get people into jobs with benefits – were fact-checked as overly optimistic, because simply having a job is no guarantee of health coverage in America. Thune’s remark unintentionally underscored the assumption baked into American thinking: that the proper path to health care is through employment. But as experts responded, that’s a reality full of caveats and holes. About 25% of working adults aged 18–64 have no access to job-based insurance at all, and even those who do may find the cost too high to enroll. The fact-check highlighted that getting a job “increases your chances” of insurance, but doesn’t ensure it. For a rich nation, “hope you get a good job” is a rather flimsy health policy.

As the U.S. moves further into the 21st century, with all the challenges of pandemics, an aging population, and economic disruptions, the pressure to untangle health care from employment may grow. Whether change comes in the form of a single-payer system, a public option, or a patchwork of portable private plans, the goal would be to finally make health coverage universal and stable – something Americans can count on regardless of their work life. That would mark a paradigm shift, correcting course from the unique historical detour that began in WWII.

Yet history also shows that American policy shifts often come slow and with fierce resistance. It’s sobering to note that nearly every attempt at national health insurance over the last century (Teddy Roosevelt in 1912, FDR in the 1930s, Truman in the 1940s, Clinton in the 1990s) failed politically, until a modest step like the ACA barely squeaked through in 2010. The employer-based system has survived not just because of inertia, but because it does work well for many people – and those people fear losing what they have. Any path forward will have to contend with the reality that for roughly 160 million Americans, employer coverage is the only system they’ve ever known. Change can be unsettling, even if it promises broader security.

As it stands, the U.S. seems likely to maintain its employer-centric model in the near term, while incrementally bolstering the safety nets around it (through Medicaid, ACA subsidies, etc.) to catch those it fails. But the discontent and debates signal that the conversation is far from over. Every time an American faces a health care nightmare because of a job loss or a gig economy job, the question is raised anew: Is health care a basic right, or a job perk?

That question cuts to the heart of the national ethos. For a country that celebrates individual freedom, it’s paradoxical to have a system that can bind people so tightly to employers for something as vital as medical care. Perhaps the coming decades will see Americans reclaim some of that freedom – freeing health care from the HR department and making it a societal guarantee. Until then, understanding the why of the current system is crucial to imagining how it could be different. The story of wartime wage freezes and tax codes may seem like dusty history, but it lives on in every doctor’s appointment scheduled around work hours, every COBRA form filled out after a layoff, and every late-night worry about what happens to your family’s health care if you quit or get fired.

America’s employer-based health insurance was an ingenious solution for its time, but a solution to a problem that no longer exists (wartime inflation and labor shortages). In solving that problem, it created new ones: inequity in access, complexity, and vulnerability in a changing economy. As we’ve seen, other nations made different choices when faced with the fundamental question of how to keep their people healthy. The U.S., for a variety of reasons, took a singular path. The challenges now are to address the shortcomings of that path and to decide whether to stay the course or forge a new one.

Change may yet come, driven by economic necessity or public demand. It could be sudden or gradual. But even if the U.S. eventually untethers health care from employment, the lessons of how we got here will remain instructive. They remind us that policies have long shadows: a decision in 1942 still affects whether your 2025 employer covers your biopsy or your insulin. They also remind us that what once seemed unthinkable (employers providing health insurance was novel in the 1940s; universal government coverage was radical in 1918) can, over time, become the norm – and vice versa.

As America grapples with these issues, it’s worth reflecting on that history and asking bold questions. Is it acceptable that losing a job can mean losing health care, in a nation as wealthy as ours? Can we imagine a system where health security is divorced from the ups and downs of the job market? The past and the global experience suggest that it’s possible – that employer-based insurance is one model, not destiny. Ultimately, confronting these questions directly is the first step toward an answer. The status quo may have been born by accident, but the future of American health care will be decided by choice.

