Given the central role that health insurance plays in the American healthcare system, it is remarkable how short a time it has been with us. Many Americans alive today were born before modern health insurance became available in the United States around 1930. Although brief, the history of health insurance in the United States is sharply contested.
The history of health insurance in the United States is often presented as a narrative of missteps and missed opportunities. Indeed, two contending narratives of policy failure dominate much of the literature describing this history.
The predominant narrative, both in the length of its tradition and in the volume of scholarship it has produced, emphasizes the failure of the United States to join other developed nations in embracing universal health coverage. Time and again, during the progressive period, in the New Deal, during the Truman Administration, in the 1960s and 1970s, and during the Clinton Administration, efforts to establish a universal national health insurance program came to naught. There were certainly victories along the way – most notably, the enactment of Medicare and Medicaid in 1965. But repeatedly, national health insurance proposals went down in defeat.
There is, however, also an alternative narrative of failure, favored by opponents of government intervention in healthcare finance. According to this narrative, repeated unwarranted government intervention in our healthcare system through regulation and subsidies has resulted in excessive cost, inadequate quality, and limitations on choice. Our biggest policy failure, on this account, has been our refusal to unshackle the free market to work its magic on our healthcare system.
This article recounts both narratives. It will then, however, offer yet another narrative – a story of ‘muddling through’ and of modest success. In fact, throughout the second half of the twentieth century, the vast majority of Americans were insured. The number of Americans covered by employment-based health insurance expanded very rapidly during the 1940s and 1950s, and the scope and depth of that coverage continued to expand until the 1980s. Medicare and Medicaid, beginning in the 1960s, and the State Children’s Health Insurance Program, beginning in the 1990s, filled the most serious gaps in private coverage. Noninsurance ‘safety net’ programs and the Emergency Medical Treatment and Active Labor Act, which requires hospitals to provide emergency treatment regardless of ability to pay (although not for free), filled yet other gaps. Only with the contraction of private coverage in the 1990s, which accelerated greatly in the 2000s, did this patchwork of insurance coverage become truly unsustainable.
The article concludes with an analysis of the Patient Protection and Affordable Care Act of 2010, which attempts to build on the United States’ unique mix of private and public health insurance to fill the growing gaps in coverage that became apparent at the beginning of the twenty-first century. Whether this fix will in fact succeed remains to be seen.
The dominant account of the history of health insurance in the United States focuses on failed attempts to create universal health coverage. The first attempt to establish universal health coverage in the United States was led by the progressive movement in the late 1910s. Germany had inaugurated a social health insurance program in 1883, followed by a number of other European countries in the 1890s and early 1900s. The success of the efforts of the progressives to expand social welfare programs at the state level led the American Association for Labor Legislation (AALL) to believe that a national sickness insurance program might also succeed. The AALL marshaled a coalition of progressive academics and enlightened business leaders, who pushed for reform based largely on the German model. By 1917, the AALL’s standard health insurance bill was being considered in 15 state legislatures.
Then everything fell apart. Some labor leaders opposed the government taking over the provision of welfare benefits to workers, a role they coveted for themselves. Business leaders consolidated their opposition to the legislation. After a brief initial period of openness to change, organized medicine retreated to a stance of obdurate and highly effective opposition, a stance it maintained toward public health insurance for decades thereafter. Insurance companies, which as yet sold little health insurance but had developed a substantial market for industrial life insurance policies, opposed the proposal, which would have offered burial policies as part of the sickness benefit. Finally, as America was drawn into the First World War, enthusiasm for things German quickly waned. Compulsory health insurance legislation was defeated in California and New York, and by 1918, social health insurance was no longer on the table.
The possibility of a national health insurance program flickered to life again briefly during the 1930s. The severe economic dislocation of the Great Depression quickly overwhelmed state, local, and private relief efforts. The Social Security Act of 1935 created a national social insurance retirement income program for the elderly and offered federal subsidies for state cash assistance programs for the poor elderly, dependent children, and the blind. Although there was considerable support for a federal program that would provide health benefits, fervent opposition led by organized medicine threatened to bring down the entire social insurance program if health insurance were part of it. President Roosevelt ultimately abandoned social health insurance.
Repeated attempts to create universal social health insurance in the aftermath of the Second World War also proved unsuccessful. Although President Truman campaigned for a national health insurance program more vigorously than Roosevelt had before him, the United States turned rightward following the war, electing a Republican congressional majority. The most important parts of Truman’s program that survived congressional debate were the Hill-Burton hospital construction program (which between 1947 and 1971 disbursed US$3.7 billion in federal funds for hospital construction, contributing to up to 30% of all hospital projects during the period) and a heavy federal commitment to healthcare research. There was also quiet expansion of healthcare for the poor. The Social Security Act Amendments of 1950 for the first time committed the federal government to match, to a very limited extent, state expenditures for in-kind medical services through the matching fund provisions of the federal/state public assistance programs for the elderly, blind, and disabled, as well as families with dependent children. Federal assistance for state indigent healthcare programs was further expanded by the Social Security Act Amendments of 1960, which created the Kerr-Mills program to provide federal matching funds for care of the medically needy elderly.
The anticommunism of the late 1940s and early 1950s and continued opposition from organized medicine put a hold on further attempts to create a national health insurance program. Nevertheless, pressure for national health insurance was quietly building among organized labor and the elderly, who adapted strategically by scaling back their expectations: they limited their immediate goal to social health insurance covering only Social Security beneficiaries and, by 1960, to coverage of hospital care alone.
With the election of President Kennedy in 1960, efforts to provide healthcare for the elderly were redoubled. The landslide election of President Johnson and of a liberal Democratic Congress following the assassination of Kennedy finally made health reform inevitable. In 1965, Congress created the Medicare program to insure hospital and medical services for the elderly as well as the Medicaid program to pay for healthcare services for public assistance recipients and the medically needy.
Social insurance advocates had hoped that the enactment of Medicare and Medicaid would be followed up by expansion of public insurance to the entire population. The 1970s, however, brought little progress as Democrats in Congress failed to reach agreement with the Nixon administration regarding the way forward, and the Carter administration focused (largely unsuccessfully) on cost control rather than on coverage expansion. Medicare coverage was expanded to the disabled, but no further.
Although the 1980s saw expansion of the Medicaid program, universal health coverage was off the agenda during the Reagan administration. The election of Bill Clinton in 1992, who campaigned for healthcare reform in light of a growing number of uninsured Americans and rapidly increasing healthcare costs, brought new hope to reform advocates. The Clinton administration, however, stumbled politically. It took a year and a half to craft a reform plan in secret, giving interest groups and political opponents time to rally opposition, and ultimately devised a plan that was too complex to gain traction. The late 1990s saw the creation of the State Children’s Health Insurance Program, followed by the expansion of Medicare to cover outpatient prescription drugs in 2003, but beyond these incremental gains, nearly two decades were lost in the quest for universal coverage.
Analysts offer a variety of explanations for America’s inability to adopt universal coverage. These include a national ideological aversion to strong government, powerful interest groups that benefit from the status quo, the absence of a strong political left, political institutions that make it far easier to block change than to accomplish it, and path dependency. Each explains part of the story, although the salience of any particular explanation varies from decade to decade.
The ‘failed attempts to adopt universal coverage’ narrative is an accurate description of the history of health insurance coverage in the United States as far as it goes, but it does not fully acknowledge the remarkable expansion of private health insurance, which has played a more central role in the United States than in most other developed nations (Switzerland and, more recently, the Netherlands being the main exceptions). It is to that story that this article will shortly turn. The article will also consider whether the adoption of the 2010 Affordable Care Act provides a happy ending to the narrative of failure. But first, the free market advocates’ alternative ‘history of failure’ narrative must be considered.
Although the failed-universal-coverage narrative focuses on the plight of uninsured and underinsured Americans, the government interference narrative contends that Americans are ‘overinsured.’ On this account, Americans have too much insurance because government policies have encouraged private insurance for routine as well as catastrophic medical costs, resulting in severe moral hazard; there is also, the narrative contends, too much public health insurance and too much government regulation.
The history of American overinsurance begins, according to this narrative, with the exemption of fringe benefits from wage and price controls during World War II, which stimulated their growth. Also dating from the 1940s are tax subsidies for employment-related insurance that have encouraged the provision of excessive health insurance coverage for most Americans. Because insurance premiums have largely been paid by employers, the true cost of health insurance has been concealed from Americans. Because the predominant forms of health insurance have imposed little cost-sharing, the true cost of healthcare has been concealed as well. Finally, the Medicare and Medicaid programs have driven up healthcare prices and utilization, limited choices for the elderly, and discouraged provider innovation. Repeated attempts by the government to fix health insurance market failures have only worsened the situation.
There is some truth in this narrative, although it offers only a partial picture of American developments. Labor was indeed scarce during World War II, and excess profits taxes were very high, up to 85%. The Stabilization Act of 1942 did allow the National War Labor Board (NWLB) to exclude a ‘reasonable amount’ of insurance benefits from wage controls. An IRS administrative ruling of 1943 also allowed businesses to deduct payments toward health and welfare funds as business expenses, holding that these benefits would not be taxable to employees.
Yet there is reason to be skeptical of the oft-repeated claim that wage policy was the primary reason for the expansion of health insurance coverage during the war. First and foremost, health insurance as an employee benefit was already well established and rapidly growing before the war began, as described below. Second, most of the growth in wartime employment and health insurance coverage took place before the NWLB policies came into effect in 1943. American industry had been gearing up for the looming war since 1939, and although the number of American employees insured through commercial plans (the plans most likely to be paid for in part by the employer) increased from 960 000 in 1939 to 4.3 million in 1943, it grew by only another 71 000 between 1943 and 1945. Employment-related insurance coverage again increased rapidly after wage and price stabilization controls expired in 1946, suggesting once more that expansion was not driven primarily by wage stabilization policy. The wage stabilization policy was in any event routinely circumvented, as it allowed wage increases in conjunction with promotions, which quickly became common. Finally, throughout the war, Blue Cross coverage, the most common form of health insurance, continued to be paid for largely by employees rather than employers. By the end of the war, only 7.6% of Blue Cross enrollees were participants in groups to which employers contributed.
There is more reason to credit the employee benefit tax exclusion and deduction for the increase in health insurance coverage in the United States. The most rapid growth in health insurance coverage, however, took place in the late 1940s and early 1950s, before the tax subsidies were enshrined in the 1954 Tax Code, and probably had more to do with aggressive collective bargaining by the unions than with tax subsidies. Tax subsidies nevertheless undoubtedly contributed to the expansion of the scope and depth of health benefits well into the 1990s.
It is also very likely that the expansion of benefits has contributed to the growth in healthcare costs. Free market advocates assert that the Rand Health Insurance Experiment (HIE) conclusively demonstrated that more comprehensive health insurance coverage leads to higher healthcare spending. Although the meaning of the HIE and its relevance to contemporary health policy continue to be debated, the correlation between broader insurance coverage and increased healthcare spending seems plausible. It is also clear that the creation of Medicare and Medicaid resulted in higher healthcare spending, if only because more people became insured.
Market advocates generally argue for the removal of tax incentives for private health insurance coverage and for scaling back government healthcare programs through the use of vouchers to pay for private health insurance. Their most significant legislative victory has been the Medicare Modernization Act of 2003, which provided tax subsidies for health savings accounts coupled with high-deductible health plans. High-deductible health insurance spread rapidly during the early 2000s and now dominates the individual market. This has resulted in increased financial difficulty for insured families and reduced access to healthcare. However, increased cost-sharing has also arguably had a restraining effect on healthcare costs.
There is yet a third narrative of the history of health insurance in the United States that is somewhat more sanguine. Health insurance came into existence in the United States in the first half of the twentieth century, as advances in medicine made healthcare genuinely valuable and rising costs made it increasingly unaffordable for those with serious medical problems. The prestigious Committee on the Costs of Medical Care concluded in its 1932 final report that the high cost of medical care for those most in need necessitated the provision of either private or public insurance, but by that time private insurance was already in use.
Describing the early history of health insurance is complicated by the fact that the term ‘health insurance’ meant something different before the mid-twentieth century. The late nineteenth and early twentieth centuries saw the rapid growth of what was then called health insurance or sickness insurance, which insured against wages lost due to illness. After a short waiting period, an insured individual could collect a fixed amount per week until he (or, rarely, she) was able to return to work or until the benefit was exhausted. This insurance was offered by employer-funded ‘establishment funds’, labor organization funds, and commercial insurers, as well as by ethnic, religious, and community-based fraternal organizations. Some of these insurers and funds also provided life or burial insurance. Although a few offered insurance to cover medical costs, most did not. Not only was the value of most medical care questionable, but fund members were also apparently concerned that a doctor paid by the fund might be too eager to certify a member healthy enough to return to work.
Other precursors of health insurance also emerged during the late nineteenth and early twentieth centuries. Some fraternal organizations hired physicians to provide care to their members – the much-maligned ‘lodge practice’; some even built their own hospitals. Employers in remote areas, such as railroad, mining, and logging companies, also provided medical services through company doctors or industrial medical plans.
Modern health insurance was born in 1929, when the first ‘hospital service plan’ was started by Baylor Hospital in Dallas. Baylor entered into a contract under which white public school teachers paid 50 cents a month into a prepaid hospital services plan with the assurance that they would receive up to 21 days of hospital care each year and a one-third discount for the remaining 344 days.
Hospital service plans spread quickly during the 1930s. In 1936, the American Hospital Association established the Commission on Hospital Services, which ultimately became the Blue Cross Association. This commission encouraged and supported the spread of state and regional Blue Cross plans. By 1937, Blue Cross plans had 894 000 members; by 1943, membership reached almost 12 million.
Blue Cross plan members paid a fixed sum every month for the assurance that their needs would be covered if they had to be hospitalized. Blue Cross plans were available on a community-rated basis, that is, all members paid the same rate, regardless of health status. The plans negotiated ‘service benefit’ contracts with the hospitals under which the plans would cover up to a fixed number of days of hospitalization for a per diem fee established in the contract. Blue Cross plans also provided either service benefit or indemnity coverage (under which insureds would pay medical providers in cash and then file a claim with the insurer for an indemnity payment) for ‘extras’ such as emergency and operating room charges, or laboratory tests.
As it became increasingly clear in the late 1930s that there was a substantial market for hospital benefits, commercial insurers also entered the group insurance market. Whereas only 300 000 Americans were covered by commercial hospitalization policies in 1938, nearly six million had such coverage in 1946. Unlike the Blue Cross plans, commercial insurers offered surgical coverage in addition to hospital coverage; by 1945, over five million Americans had commercial surgical coverage. Commercial plans even began to cover medical costs (nonsurgical physicians’ services) in the hospital, and by the late 1950s, home and office visits also began to be covered, especially under individual policies. Commercial health insurance was sold on an indemnity basis: payments were fixed sums per service, set forth in advance in the insurance contract.
The success of the hospitals in offering prepaid benefits was soon noticed by physicians. In 1939, the first of the physician service benefit plans that came to be known as Blue Shield plans appeared. Blue Shield plans initially covered in-hospital surgical benefits, later expanding to cover in-hospital medical and eventually ambulatory medical benefits.
Blue Shield plans combined the Blue Cross and commercial insurance approaches for providing benefits. Although some plans offered only service benefits or only indemnity coverage, most of them offered both. Doctors agreed to accept negotiated payments from the plans as payment in full for patients whose income fell below a specified level. Members with incomes above such levels, however, received indemnity payments and had to pay the difference between the doctor’s charge and the indemnity amount. Blue Shield plans were initially community-rated, but over time moved to experience rating like the commercial insurers.
The year 1929 saw the birth of other models of health insurance as well. In that year, the first consumers’ cooperative providing prepaid medical care was created in Elk City, Oklahoma, while the Ross-Loos Clinic, a physicians’ cooperative, began offering a prepayment plan to an employment-related group in Los Angeles. During the 1930s and 1940s, other models of healthcare coverage based on comprehensive prepayment appeared. Some of these, such as the Kaiser plan, were initially industry-sponsored, whereas others, like the Washington Group Health Insurance Plan, grew out of consumer-sponsored plans. The Farm Security Administration encouraged consumer cooperatives, which covered 725 000 persons by the early 1940s but largely disappeared when government support ended. Industry-sponsored plans, which covered approximately a million people in 1930, also continued to exist.
These precursors of modern staff-model health maintenance organizations (HMOs) were vigorously opposed by organized medicine. Organized medicine preferred cash-and-carry medicine (as it does today) but was willing to tolerate insurance that did not subject doctors to lay control. Lay control of medical practice was unacceptable, and health plans that employed doctors were fought vigorously by the American Medical Association (AMA) through much of the twentieth century, resulting in a criminal antitrust conviction of the AMA in the 1940s. These efforts by the AMA kept prepaid medical practice marginal until the final quarter of the twentieth century.
Initially, Blue Cross, Blue Shield, and commercial plans were sold primarily to groups, as it was much less expensive to market health insurance to groups than to individuals. Insuring employment-related groups in particular helped address the problem of adverse selection. Blue Cross plans sold insurance to groups of various types but contracted primarily with employment-related groups. Employers permitted the sale of group policies to their employees, facilitated the formation of groups, and often deducted the premiums from paychecks through a payroll check-off system.
At the outset, employers themselves rarely contributed to premiums for the Blue Cross plans. As late as 1950, only 12.2% of Blue Cross plan participants received employer contributions. Employer contributions were more common with commercial plans: by 1950, employers contributed approximately half of the ‘gross cost’ of health insurance for employees and 30% of the cost of dependent coverage. Because employers commonly received rebates from insurers, their actual ‘net cost’ was in fact much lower, approximately 38.5% for employees and 20% for dependents. A major focus of collective bargaining was to shift more of the cost of the premium to the employer.
In the booming American economy following World War II, health insurance coverage expanded dramatically. By 1950, nearly 76.6 million Americans, constituting half the American population, had hospitalization insurance; 54.2 million had surgical benefits, and 21.6 million had medical benefits. By 1965, when Medicare and Medicaid were adopted, private hospital insurance covered 138.7 million Americans, approximately 71% of the population.
As coverage expanded, it also became more comprehensive. In the early 1950s, commercial insurers began to offer major medical coverage that provided catastrophic coverage for hospital and medical care. Major medical policies usually supplemented basic hospital and surgical-medical coverage. Comprehensive coverage followed soon on its heels, bundling basic and major medical coverage into a package to provide the most complete coverage available. During the 1950s, Blue Cross and Blue Shield plans began to combine forces to offer similarly comprehensive coverage. Finally, during the 1960s and 1970s, insurance coverage began to expand to cover dental care and pharmaceuticals, with improved coverage for maternity care, mental health, and some preventive services within basic coverage.
Another important trend after the war was increased employer responsibility for employee health benefits. During the late 1940s and early 1950s, employer contributions to collectively bargained plans grew rapidly. By 1959, employers paid the entire premium for hospital insurance for virtually all unionized employees in multiemployer plans and for 37% of employees subject to collective bargaining agreements in single-employer plans.
Employer contributions to premiums in nonunionized places of employment increased more slowly. By 1964, however, approximately 48% of employees had the total cost of their health insurance covered by their employer. Employer contributions to health insurance expanded even more quickly during the 1970s and 1980s. By 1988, employers covered 90% of the cost of individual coverage and 75% of the cost of family coverage.
Among the several reasons for the impressive postwar expansion in the number of workers covered, the benefits provided, and the level of employer contributions in the third quarter of the twentieth century, the most important was probably pressure from the labor unions. Unions were at the peak of their strength in the mid-twentieth century, and improved fringe benefits were a high priority for them. The National Labor Relations Board clarified in 1949 that employee benefits were included within the ‘terms and conditions of employment’ subject to collective bargaining under the National Labor Relations Act, giving new impetus to union demands for health benefits.
In the beginning, some of the major unions such as the United Mine Workers had operated their own health benefit funds. The 1947 Taft-Hartley Act prohibited union-run benefit plans, but established multiemployer Taft-Hartley plans, which were operated jointly by labor and management. Most employee benefit plans, however, were established by management. Unions tended to favor Blue Cross and Blue Shield contracts, which offered more comprehensive coverage, but large employers favored commercial insurers that offered more flexibility in the design of plans as well as generous rebates, which substantially reduced the employer’s net contribution to premiums. Employers with healthy workforces also favored commercial insurers because they used experience rating and thus could offer lower rates.
Health benefits were not limited to unionized firms; nonunionized firms often offered liberal fringe benefits to forestall unionization. Employers also saw health insurance as a means to stabilize employment (by making it more difficult for employees to leave), to keep workers healthy and productive, and to ward off a national social health insurance program.
Another factor underlying the growth of employment-related insurance was the continuing increase in healthcare costs. The proportion of the gross domestic product spent on healthcare grew from 3.6% in 1928–29 to 5.4% in 1958–59. Changes in medical technology were making medical care much more effective, and thus more valuable, even as it became less affordable. The growing burden of healthcare costs led, in turn, to an increased desire to spread those costs through insurance and to pass them on to employers.
Tax policy also certainly played a role. The 1954 Internal Revenue Code explicitly recognized the nontaxability of employment-related benefits. As more and more Americans began to pay income tax (which was paid primarily by the wealthy before World War II), the tax benefits of health insurance became more important. Tax subsidies played a particularly important role in increasing the share of premiums covered by employers as well as the scope of coverage.
A final factor that drove the expansion of employee coverage was the enactment of the Employee Retirement Income Security Act (ERISA) in 1974, which blocked the application of state insurance regulation and premium taxes to self-insured plans. Self-insurance gave employers increased power to control healthcare costs and the opportunity to earn interest on reserves, while protecting them from state premium taxes, insurance mandates (which became common in the early 1980s), capital and reserve requirements, and risk pool contribution requirements. Whereas only 5% of group health claims were paid by self-insured plans in 1975, an estimated 60% of employees were in self-insured plans by 1987.
Although most American employees had hospital coverage (and increasingly surgical and medical coverage) by the 1970s, that coverage was often quite thin. Until the 1980s, commercial insurance was predominantly indemnity coverage, and balance billing was very common. Moreover, dollar limits on coverage were often quite low. As late as 1959, when 72% of the population had hospital insurance, only 18.4% of personal healthcare expenditures were covered by insurance, whereas 56.5% were paid out of pocket. Blue Cross plans offered first-dollar coverage but initially limited the number of days of hospitalization they would cover, whereas Blue Shield plans often offered only indemnity coverage to higher-income enrollees.
Coverage, moreover, did not reach many who were not employed. The group most noticeably left behind during the coverage expansion was the elderly. Retiree health coverage expanded rapidly during the 1950s and 1960s, and many of the elderly purchased individual insurance, yet many remained uninsured. Efforts to provide public insurance for this group came to fruition in 1965 with the creation of the Medicare program, described earlier. The Medicaid program also offered supplemental coverage to the elderly and disabled, as well as basic coverage to impoverished families with dependent children.
Other new programs also began to partially fill other gaps left by private insurance. Community health centers, which provide services to lower-income families on a sliding-scale basis, were launched in 1964. Provisions of the Hill-Burton hospital funding program requiring grantees to provide free or reduced-cost care to those in need finally began to be enforced in the 1970s. The 1986 Emergency Medical Treatment and Active Labor Act required Medicare-participating hospitals to provide emergency services even to those unable to pay (although not necessarily for free). The 1986 budget bill also included a provision allowing persons who lost their employment or their dependency status to purchase continuation coverage for a period of time at full cost (so-called COBRA continuation coverage).
By 1980, the vast majority of Americans had health insurance coverage through their employment, and this coverage was increasingly comprehensive. That year, 82.4% of the population had private health insurance, a proportion that has not been matched since. Most employers paid the full premium for individual coverage and the majority of the premium for family coverage. Deductibles and coinsurance remained common, and indeed spread to Blue Cross and Blue Shield plans, but with the advent of major medical and then comprehensive coverage, out-of-pocket expenditures decreased and insured expenses increased in the final quarter of the century. By 1980, the proportion of healthcare costs covered by private health insurance exceeded that paid out-of-pocket, and in the HMOs that spread in the 1980s, cost-sharing virtually disappeared. The United States had apparently solved, through private initiative supplemented by public programs for those whom private markets could never protect, the problem of health security that other nations addressed through social insurance or public provision.
However, America’s health security system began to unravel during the early 1970s. The driving disruptive force was the increase in healthcare costs. Inflation generally was a serious problem during the 1970s, but healthcare costs grew even more rapidly than other costs. Public initiatives were adopted to restrain healthcare cost growth – including health planning, professional standards review organizations, and in some states, hospital rate review – but none achieved great success.
During the 1980s and early 1990s, health insurers responded to cost increases by turning from passive payers into managers of care. Within a decade, conventional indemnity insurance and service benefit plans gave way to HMOs and preferred provider organizations, which offered limited provider networks, attempted to review and control utilization, and experimented with incentive structures that would discourage rather than encourage the provision of services. This strategy worked for some time. By the mid-1990s, healthcare cost growth had declined dramatically; indeed, it briefly fell in line with the general growth of the economy.
But cost increases also began to have an impact on coverage. Beginning in the 1980s, the percentage of Americans with health insurance began to decline. The first to lose coverage were retirees, who fell victim to the declining power of the unions (which had been their strongest champions), to the steady increase in healthcare costs, and to a change in accounting standards after 1990 that required firms to carry the cost of future retiree health obligations as a current liability on their books.
Moreover, small businesses had never covered their employees to the same extent as larger businesses, and as the American economy shifted from a manufacturing economy to a service-based economy, and concomitantly from large unionized employers to small businesses, the percentage of employees who were insured fell further.
Small groups have for decades been underwritten on the basis of the expected claims costs of their members, and coverage can be very expensive, or even difficult to find, for older groups or groups in hazardous occupations. A number of states took steps in the 1990s to make health insurance more accessible to small groups, including statutes guaranteeing issue and renewal of coverage, limiting rating variations among groups, and restricting preexisting condition exclusions. A few states even required community rating. The 1996 federal Health Insurance Portability and Accountability Act established guaranteed issue and renewal requirements throughout the country and imposed limits on preexisting condition exclusions. Administrative costs, however, remained significantly higher for small groups than for large groups, and health status underwriting continued for small groups in most states.
Even for larger groups, managed care succeeded in stemming the growth of healthcare costs only temporarily. The more extreme forms of managed care proved intensely unpopular. Although Congress failed to adopt a national managed care bill of rights when the issue came before it in 2001, most states adopted legislation restraining managed care in the late 1990s. As the economy improved in the late 1990s and early 2000s, employers backed off from the most stringent forms of managed care, moving to broader provider networks and away from strict utilization controls.
Healthcare costs began to rise dramatically again by 2000, however. As the economy worsened again in the mid-2000s, the cost of employment-related health insurance began to reach levels that employers found intolerable. Employers reacted primarily by increasing employee cost-sharing, although some dropped coverage or increased the employee share of premiums. Many shifted to high-deductible policies, sometimes offering contributions to health savings accounts (held by the employee) or health reimbursement accounts (held by the employer), which received tax subsidies under the 2003 Medicare Modernization Act. As health insurance became more costly and less valuable to employees, more employees passed up employment-related coverage.
Public programs grew steadily for some time, offsetting the decline in private coverage. Employment-related coverage had never covered dependents on the same terms as workers, and many lower-income workers could not afford the premiums required for family coverage even when their employers offered it. Many children, therefore, remained uninsured. Medicaid coverage for children steadily expanded through the 1980s and 1990s, and in 1997, the State Children’s Health Insurance Program was created to cover children in families with incomes up to 200% of the poverty level, and in some states above it. Medicaid was also expanded after 1981 to cover pregnant women, in recognition of the cost-effectiveness of timely prenatal care.
The massive layoffs and economic retrenchment that accompanied the economic decline of 2008 and 2009, however, accelerated the decline in private coverage, overwhelming the expansion of public coverage. Only 55.8% of Americans had employment-based coverage in 2009, down from 63.9% in 1989. The decline in retiree coverage was even steeper: only 28% of large firms that offered health benefits covered retirees in 2010, down from 66% in 1988.
A small percentage of Americans have always been insured through the individual market. Administrative costs are even higher in the individual (nongroup) market than in the small group market, and premiums vary sharply from individual to individual based on health status, age, and other underwriting factors. Individual insurance, however, is often the only alternative available to a growing number of Americans: the self-employed, early retirees, the unemployed, part-time and temporary workers, and others who do not have insurance available through their place of employment. A number of states attempted to reform the nongroup market in the 1990s, but in most states these reforms were less ambitious than the small group reforms. The Health Insurance Portability and Accountability Act required only guaranteed renewal and imposed limits on preexisting condition exclusions for individuals who transfer from group insurance or equivalent public insurance. Many states also established high-risk pools for otherwise uninsurable individuals, but risk pool premiums were high and participation was generally low. Individual plans are now predominantly high-deductible plans, with the most common deductible levels in 2009 being US$2500 for individual policies and US$5000 for family policies. The individual market is characterized by high premiums and high turnover, but it is the only coverage currently available to many Americans.
In summary, the history of American health insurance is a story of successes as well as failures. It is true that healthcare costs have grown at rates in excess of general inflation almost without interruption for the past half century and that the number of uninsured Americans has now reached critical levels – 50.7 million, or 16.7% of the population, in 2009. But for half a century the vast majority of Americans had access to healthcare through private health insurance, and those who had the most difficulty obtaining insurance were covered through public insurance. Can we, however, do better?
The Patient Protection and Affordable Care Act of 2010 (ACA) adds a chapter to each of these three narratives. Some, although not all, of its supporters laud it as finally achieving the long-dreamed-of goal of healthcare coverage for all. If all goes according to plan, the legislation should dramatically expand health insurance coverage and reduce the number of the uninsured. It expands Medicaid to cover all American citizens and long-term legal residents with incomes of up to 138% of the federal poverty level and offers tax credits to help cover the cost of health insurance premiums for Americans and legal residents with incomes of up to 400% of the poverty level. It imposes a penalty on Americans who do not purchase health insurance and penalizes employers who do not offer their employees health insurance or who offer inadequate coverage. The Congressional Budget Office estimates that by 2020 the legislation will reduce the number of the uninsured by 32 million but will still leave 23 million Americans (including undocumented aliens) without health insurance. The dream of universal coverage is not yet fulfilled.
Free market advocates loudly criticize the ACA as a ‘government takeover’ of the healthcare system. They complain that the legislation extends government subsidies for and regulation of the healthcare system even further. They fret that the expansion of health savings and reimbursement accounts that they achieved in the early 2000s will be overturned. They assert that the legislation will result in unconstrained growth in healthcare costs.
The ACA does dramatically expand federal funding and regulation of private health insurance. It does not, however, significantly expand federal regulation of the healthcare delivery system. Fundamentally, moreover, the ACA adopts a market-based approach to healthcare reform. It establishes ‘health insurance exchanges’ at the state level to organize competition among health plans. It establishes new programs to increase competition by encouraging the extension of multistate private plans to every state and the formation of interstate insurance sales compacts and nonprofit insurance cooperatives. Finally, the legislation has no effect on health savings or reimbursement accounts other than to limit their use for over-the-counter drugs. Indeed, the typical employment-based policy currently has an actuarial value of over 80%, whereas the standard subsidized ‘silver’ policy under the ACA will have an actuarial value of 70% (‘actuarial value’ refers to the percentage of the total medical costs of a standard population paid for by an insurer, so the lower the actuarial value, the higher the percentage of medical costs borne by the insured). Most current health savings account-affiliated high-deductible plans will be permissible as 60% actuarial value ‘bronze’ plans. There is likely to be more, not less, cost-sharing under the ACA.
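To make the actuarial value arithmetic concrete, consider a purely illustrative example (the dollar figures here are hypothetical, not drawn from the ACA or any actual plan): if a standard population incurs an average of US$10 000 per person per year in covered medical costs and a plan pays US$7000 of that amount, then

\[
\text{actuarial value} = \frac{\text{costs paid by the plan}}{\text{total costs of the standard population}} = \frac{7000}{10\,000} = 70\%.
\]

The remaining 30% would, on average, be borne by insureds through deductibles, coinsurance, and copayments, which is why a 60% ‘bronze’ plan implies more cost-sharing than the typical employment-based policy with an actuarial value over 80%.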
However, the ACA is finally best understood in terms of the ‘patchwork of coverage’ narrative. The legislation stands in the long American tradition of expanding private health insurance coverage and filling the gaps with public coverage. Once the ACA is fully implemented, most Americans will continue to be covered by employment-related health insurance, Medicare, or Medicaid. The ACA significantly expands Medicaid, acknowledging that Americans below 138% of the poverty level simply cannot afford health insurance, although the Supreme Court’s decision making the expansion optional for the states limits the number of poor Americans who will benefit from it. Tax credits and cost-sharing reduction subsidies are offered to allow Americans earning up to 400% of the poverty level to purchase health insurance and to limit their exposure to cost-sharing, thus making insurance affordable for 19 million more Americans.
The biggest change the legislation makes in American health insurance is to outlaw health status underwriting and ban preexisting condition exclusions. Insurers will no longer be able to compete on risk selection; they must instead compete on price and value. The original Blue Cross plans community-rated their premiums, and community rating has long been the norm (and required by law since 1996) within employee groups. Outlawing health status underwriting reinforces this tradition while rejecting the equally long tradition of underwriting on the basis of health status in the individual and small group markets. The legislation also prohibits or limits other health insurance practices and policy provisions – some of which, like the imposition of annual limits, date back to the beginning of health insurance, whereas others, like limitations on access to certain specialists, are more recent.
The ACA fits within the narrative of the quest for universal coverage, and it can also be understood as imposing additional government regulation and subsidies on healthcare markets (albeit with the prospect of making them function better), but it is best understood as one more chapter in the ongoing story of helping our patchwork private/public health insurance system hobble along.