William Freedman
Viewed from the outside, the most telling feature of the American healthcare sector is its stark contrast between quality of care and access to care. If you google “country with the best doctors,” you’ll get a litany of surveys of varying degrees of academic rigor. It’s hard to quantify “best,” but they’ll all tell you the same thing: The United States has earned a reputation for hosting the most distinguished doctors in the world.
Now try getting an appointment with one. The quality of American physicians is uncontested, but the quantity is another story. We have, according to the World Health Organization, only 2.5 doctors for every 1,000 residents, or about half Mexico’s ratio. The U.S. ranks 61st in the world by that metric, right between Saudi Arabia and Moldova.
So we’ve got the best, but they’re in short supply. What a Doctor of Economics might tell you about doctors of medicine, then, is that they’re bound to be money magnets. The 18 highest-paid occupations in America all require an M.D. or the specialized equivalent, according to the Labor Department. The 19th is CEO. But now that you’ve got your appointment with a $353,970-a-year cardiologist, how are you going to pay for their services?
In the beginning, American doctors were kindly old men who stepped out of a Saturday Evening Post cover illustration to make house calls on the sick and infirm. If their patients couldn’t afford the fee, they’d accept payment in chickens or goats.
Or probably not, but that’s the sepia-tinged view we have of old-timey medical care. Still, it was relatively affordable and accessible.
Then it all fell apart during the Great Depression of the 1930s. That’s when hospital administrators started looking for ways to guarantee that they got paid. On delivery. In cash. And so, according to the American College of Healthcare Executives, health insurance was born. Interestingly, doctors would have none of it at first. The earliest health plans covered hospitalization only.
The Depression passed just in time for America’s next great national trial, the Second World War. A new set of challenges required a new set of responses. During the Depression, there were far too many people for far too few jobs. The war economy, though, had the opposite effect. Suddenly, all able-bodied men were in the military, but somebody still had to build the weapons and provision the troops. Even with women entering the work force in unprecedented numbers, there was simply too much to get done. The competition for skilled labor was brutal.
A wage freeze starting in 1942 forced employers to find other means of recruiting and retaining workers. Building on the workers’ compensation plans that had recently been mandated, employers or their union counterparts started offering insurance to cover both hospital and doctor visits.
Of course, the wage freeze ended soon after the war did. But the tax code and the courts soon clarified that employer-sponsored health insurance was non-taxable. Objectively, it made little sense that people were continuing to get their health coverage through their jobs. Still, no politician was going to tell voters that their taxes would go up just so a more logical coverage model could be pursued.
Medicare, a government-sponsored health plan for retirees, debuted in 1965. Over the coming decades, it would’ve made sense to lower the eligibility age incrementally, maybe a year at a time, until everyone was covered. That’s pretty much what many other wealthy nations have done. But Americans as a whole have always been suspicious of government intrusion, so this idea went nowhere.
Fast-forward to 2010. Medicare for all was a non-starter, and even the more modest “public option” – a government-run plan that would merely compete with private insurers – didn’t survive the debate. Apparently, there were millions of people who liked the healthcare plans they got from their bosses. Or maybe they didn’t. But at least they understood them and didn’t want to roll the dice with something new. And yet it was clear that employer-sponsored plans were vestiges of another time. They made sense when people stayed with the same company for their entire careers but, as both job-hopping and layoffs became more prevalent, plans tied to the job were becoming obsolete.
Thus the Affordable Care Act was proposed by Barack Obama’s White House and squeaked past Congress and the Supreme Court by the narrowest of margins. The ACA introduced an individual mandate requiring everyone, regardless of their job status, to have health insurance. It set up an array of government-sponsored online exchanges where individuals could buy coverage. It also provided advance premium tax credits to defray the cost to consumers.
But it didn’t ignore the fact that most people were already getting health insurance through work, and that a significant proportion of them didn’t want to change that status quo. So the ACA also required employers with 50 or more full-time equivalent employees to provide health coverage to at least 95% of them. The law, nicknamed Obamacare by supporters and detractors, set a minimum baseline of coverage and affordability. The penalty for an employer that offers inadequate or unaffordable coverage can never be greater than the penalty for not offering coverage at all.
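That penalty structure boils down to simple arithmetic, and a short sketch makes the cap concrete. This is a hedged illustration only: the function names and dollar amounts below are hypothetical stand-ins (the real per-employee amounts are set by statute and indexed annually), but the logic mirrors the rule that the inadequate-coverage penalty can never exceed the no-coverage penalty.

```python
# Illustrative sketch of the ACA employer "shared responsibility" penalty
# logic. Dollar amounts are hypothetical stand-ins, not the statutory
# (annually indexed) figures.

def no_offer_penalty(full_time_employees, per_employee=2000, exemption=30):
    """Penalty when a large employer offers no coverage at all:
    assessed on every full-time employee past the first `exemption`."""
    return max(full_time_employees - exemption, 0) * per_employee

def inadequate_offer_penalty(full_time_employees, subsidized_employees,
                             per_subsidized=3000):
    """Penalty when coverage is offered but inadequate or unaffordable:
    assessed only on employees who obtain subsidized exchange coverage,
    and capped at what the no-offer penalty would have been."""
    uncapped = subsidized_employees * per_subsidized
    return min(uncapped, no_offer_penalty(full_time_employees))

# A 100-employee firm whose plan fails affordability for 10 workers
# owes less under these assumptions than one offering nothing at all.
print(no_offer_penalty(100))              # 140000 with these figures
print(inadequate_offer_penalty(100, 10))  # 30000 with these figures
```

Because of the `min()` cap, even a firm whose plan fails for every employee never pays more than it would for offering no plan, which is exactly the incentive structure the law intended.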
It's important to note that Obama didn’t dream all this up himself. He’s on the record as saying that, if America were starting from scratch, he’d have preferred a single-payer public system like Canada has. But creating a new American healthcare substrate in the 21st century meant accounting for the presence of a powerful private health insurance industry.
The model for Obamacare was the healthcare reform package that went into effect in Massachusetts in 2006. While the Bay State is accurately reputed to be a liberal bastion, the initial proposal was made by then-Governor Mitt Romney, a Republican who now serves as a senator from Utah. (He was also Obama’s opponent in the 2012 presidential election, campaigning to repeal on the federal level the bill he championed on the state level.)
The Obama administration was neither the first nor the last to champion new ways to provide healthcare coverage to a wider swath of Americans.
The first attempts to harmonize U.S. healthcare delivery with that of the other developed economies came just five years after Medicare and its twin Medicaid – which assists low-income Americans – came into being. Two separate bills were introduced in 1970 alone. A bill that would have made all Americans eligible for Medicare seemed on the surface to be the one most likely to pass, especially because it had bipartisan support. But the chairman of the relevant Senate panel had his own bill in mind, so that’s the one that got through the committee. It effectively said that all Americans were entitled to the kind of health benefits enjoyed by the United Auto Workers Union or AFL-CIO – for free. But shortly after Sen. Edward Kennedy began hearings on his bill in early 1971, a competing proposal came from an unexpected source: Richard Nixon’s White House.
President Nixon’s approach, in retrospect, had some commonalities with what Obamacare turned out to be. There was the employer mandate, for example, and an expansion of Medicaid. It favored healthcare delivery via health maintenance organizations (HMOs), which was a novel idea at the time. HMOs, which offer managed care within a tight network of healthcare providers, are descended from the prepaid health plans which flourished briefly in the 1910s and 1920s. They were first conceived in their current form around 1970 by Dr. Paul M. Ellwood, Jr. In 1973, a law was passed to require large companies to give their employees an HMO option as well as a traditional health insurance option.
But that was always intended to be ancillary to Nixon’s more ambitious proposal, which got even closer to what we have now after it wallowed in the swamp for a while. When Nixon reintroduced the proposal in 1974, it featured state-run health insurance plans as a substitute for Medicaid – not a far cry from the tax credit-fueled state-run exchanges of today.
Of course, Nixon had other things to worry about in 1974: inflation, recession, a nation just beginning to heal from its first lost war – and his looming impeachment. His successor Gerald Ford tried to keep the proposal moving forward, but to no avail.
But this raises a good question: If the Republican president and the Democratic Senate majority both see the same problem and have competing but not irreconcilable proposals to address it, why wasn’t there some kind of compromise? What major issue divided the two parties?
It was a matter of funding. The Democrats wanted to pay for universal health coverage through the U.S. Treasury’s general fund, acknowledging that Congress would have to raise taxes to pay for it. The Republicans wanted it to pay for itself by charging participants insurance premiums, which would have been, in effect, a new tax. It’s hard to believe that’s where it all fell apart, but it’s also hard to believe that Matthew McConaughey won an Oscar, and both of those things are true.
Next up, Ronald Reagan signed the Consolidated Omnibus Budget Reconciliation Act (COBRA) in 1985. COBRA enabled laid-off workers to hold onto their health insurance – provided that they pay 100% of the premium, which had been wholly or at least partly subsidized by their erstwhile employer. The thing is, if you’re out of work you probably need that subsidy more than ever, so COBRA’s main provision can read like a sick joke. A 2006 Commonwealth Fund survey found that only 9% of people eligible for COBRA coverage actually signed up for it.
The COBRA law had a section, though, that was only tangentially related. The Emergency Medical Treatment and Active Labor Act (EMTALA), which was incorporated into COBRA, required all emergency medical facilities that take Medicare – that is, all of them – to treat patients irrespective of their insurance status or ability to pay. As Forbes staff writer Avik Roy wrote during the Obamacare debate, EMTALA has come to overshadow the rest of the COBRA law in its influence on American healthcare policy. We’ll return to that point soon.
So it wasn’t until the 1990s that Washington saw another serious attempt at healthcare reform. Bill Clinton’s first order of business as president was to establish a new healthcare plan. Up until then, First Ladies generally focused their attention on such non-controversial causes as preventing drug abuse (Nancy Reagan) or promoting literacy (Barbara Bush). That’s when America first met Hillary Rodham Clinton.
For the first time, the First Lady took on the role of heavy-lifting policy advisor to the president, becoming the White House point person on universal health care. Her proposal mandated that everyone be enrolled in a health coverage plan and that subsidies be provided to those who couldn’t afford it. Companies with 5,000 or more employees would have had to provide such coverage. There’s nothing too controversial about that, but the Clintons didn’t leave it there. Their plan centralized decision-making in Washington, with a “National Health Board” overseeing quality assurance, training physicians, guaranteeing abortion coverage and running both long-term care facilities and rural health systems.
The insurance lobbyists had a field day with that. The famous “Harry and Louise” ads portrayed a generic American couple having tense conversations in their breakfast nook about how the federal government was going to come between them and their doctor. By the 1994 midterms, any chance of universal healthcare in America had died, along with a 40-year run of Democratic majorities in the House of Representatives.
Near-universal healthcare would have to wait another generation.
It’s important to understand the common threads connecting the different models considered over the past half century. It’s interesting to note that, as much as healthcare reform has been mostly a Democratic issue, its intellectual underpinnings were mainly generated by Republicans.
The Nixon-era ideas centered around expanding benefits – whether through Medicare or workplace-sponsored HMOs – until everyone was covered. Since then, much intellectual heft was brought to bear to flesh out new approaches.
The two seminal works on the topic are “Assuring Affordable Health Care for All Americans,” a 1989 white paper written for The Heritage Foundation by renowned healthcare policy analyst Stuart M. Butler, and “The Jackson Hole initiatives for a twenty-first century American health care system,” an academic article appearing in Health Economics in 1992.
Butler’s paper introduced the individual mandate – that is, the requirement that every household buy health insurance. This was a direct response to a pitfall many policy analysts identified in EMTALA. If anyone could get treated for free in the emergency room, then everyone could get treated for free. This dilemma came to be known as the “free rider” problem. Whether or not it became a substantive burden on emergency medicine providers is still a matter of debate, but the conservative-leaning Heritage Foundation didn’t want the same risk to propagate throughout the entire healthcare system. So Butler suggested that everyone subscribe to a plan and, if they truly couldn’t afford to do so, that their premiums be subsidized, not their direct care.
The Jackson Hole paper’s big idea was managed competition. It might not surprise you to learn that Ellwood, considered the father of HMOs, was the lead author of that paper.
“The market for health care services in the United States has clearly failed to contain expenditure growth to acceptable levels or to produce an equitable distribution of coverage and services,” the Jackson Hole group contended. They recommended a solution in which large, sponsoring organizations – suggesting an employer mandate – push cost-effective managed care solutions rather than traditional fee-for-service plans, which they considered less efficient.
When Romney first proposed universal health care in Massachusetts, he favored the individual mandate of Butler’s approach. The Democratic legislature, though, wanted the individual mandate plus the employer mandate inherent in the Jackson Hole prescription. When the plan passed the state legislature, Romney actually vetoed the employer mandate section. The legislature, though, overrode his veto and Massachusetts ended up with a plan that included both individual and employer mandates, and that’s the format that went on to inform Obamacare.
Despite an onslaught of court challenges, Obamacare remains the law of the land. For a while, Republican congressional candidates ran on a “repeal-and-replace” platform plank, but even when they were in the majority there was little legislative action to do either.
Still, Obamacare is not the last word in American healthcare reform.
Since then, there have been two important improvements to Health Reimbursement Arrangements, through which companies pay employees back for out-of-pocket medical-related expenses. HRAs had been evolving informally since at least the 1960s, but were first addressed by the Internal Revenue Service in 2002.
Not much more happened on that front until Obama’s lame-duck period. In December 2016, he signed the bipartisan 21st Century Cures Act, mainly a funding bill supporting the National Institutes of Health as it addressed the opioid crisis. But, just as the right to free emergency room treatment was nested in the larger COBRA law, the legal framework of Qualified Small Employer Health Reimbursement Arrangements was tucked away in a corner of the Cures Act. QSEHRAs, available only to companies with fewer than 50 full-timers, allow firms to let their employees pick their own insurance coverage off the Obamacare exchanges. The firms then pay the employees back for some or all of the cost of those premiums. The employees become ineligible for the premium tax credit provided by the ACA, but a well-constructed QSEHRA will meet or exceed the value of that subsidy.
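That trade-off lends itself to a quick back-of-the-envelope comparison. In the sketch below, all dollar figures and function names are hypothetical; the only point is that, once the tax credit is forfeited, the QSEHRA reimbursement has to do the credit’s work on its own.

```python
# Hypothetical comparison of an employee's net annual premium cost under a
# QSEHRA versus under the ACA's premium tax credit. All figures are
# illustrative; real premiums, credits and QSEHRA reimbursement caps vary
# by year, household size and income.

def net_cost_with_tax_credit(annual_premium, premium_tax_credit):
    # Without a QSEHRA, the exchange shopper keeps the ACA credit.
    return annual_premium - premium_tax_credit

def net_cost_with_qsehra(annual_premium, qsehra_reimbursement):
    # Accepting a QSEHRA forfeits the premium tax credit, so the
    # reimbursement must match or beat the credit on its own.
    return annual_premium - qsehra_reimbursement

premium = 6000        # hypothetical annual exchange-plan premium
credit = 2500         # hypothetical advance premium tax credit
reimbursement = 3000  # a "well-constructed" QSEHRA meets or exceeds the credit

print(net_cost_with_tax_credit(premium, credit))     # 3500 with these figures
print(net_cost_with_qsehra(premium, reimbursement))  # 3000 with these figures
```

Under these assumed numbers the QSEHRA leaves the employee better off; had the reimbursement been set below the forfeited credit, the comparison would flip, which is why the reimbursement level is the whole game.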
And that brings us to one last innovation, which basically expands QSEHRA-like treatment to those companies that have, or aspire to have, more than 50 employees.
Individual Coverage Health Reimbursement Arrangements, or ICHRAs, were established by a 2019 IRS rule approved by Cabinet-level authorities of the Trump administration. ICHRAs allow firms of any size to offer employees tax-free contributions to cover up to 100% of their individual health insurance premiums as well as other eligible medical expenses.
Instead of offering insurance policies directly, companies advise their employees to shop on a government-sponsored exchange and select the plan which best suits their needs. Premiums are then reduced by employer reimbursement rather than an advance premium tax credit. And, because these plans are already ACA-compliant, there’s no risk to the employer that they won’t meet coverage or affordability standards.
And, let’s face it: We’re never going back to the mid-20th century model of lifetime employment at one company. We’re not even going back to the expectation of full-time employment that lasted through the early 21st century’s Obamacare debate. Now, with remote employees and gig workers characterizing the work force, the portability of an ICHRA provides some consistency for those who expect to be independent contractors for their entire careers. Simultaneously, it provides a way for bootstrap-phase startups to offer the dignity of health coverage to their Day One associates.
We're on a mission to provide better healthcare for companies of all sizes. Schedule a demo to learn more.