Tim Duer
Data, Healthcare
June 15, 2021
How did modern healthcare delivery and the managed care structure come to exist?
With so much talk about restructuring, it’s important to consider the factors that led American healthcare to its current form. Managed healthcare in the US is often described as “different” from care in nearly every other developed nation in the world. But not many people have considered the basic question: “How did we get here?”
The early 20th century saw the expansion of more complex surgical procedures and diagnostic testing. During this time, the importance of infection control was also more fully recognized. As a result, care shifted from being primarily home-based to primarily hospital-based.
With this change, costs also increased. It became more difficult to pay out of pocket for care provided, which led many European nations to move toward public healthcare. Meanwhile, private insurers and unions stepped in to mitigate costs for patients in the United States. This early divergence continues to this day and has resulted in two very different methodologies for managed healthcare.
Healthcare coverage in the US remained a mix of personal, private, and union coverage until World War II, when widespread employer-sponsored healthcare was introduced. The mid-war Stabilization Act of 1942 used restrictions on wages to manage inflation. This limited many companies’ ability to attract new and more desirable employees with higher pay. Seeking ways to make employment more appealing, companies began to use expanded benefits packages to increase compensation without increasing wages. This change rapidly shifted the availability of health benefits in the United States; prior to WWII, only 10% of working adults had health insurance. By 1955, that number was 70%.
The increased connection between employment and health insurance was recognized as problematic, especially for the elderly, disabled, and unemployed. During his time in office, Harry Truman renewed earlier pushes to introduce a national health insurance plan, like the universal coverage that had been adopted in Europe. However, this was met with sharp criticism by many in Congress and the American Medical Association (who had previously opposed third-party payers).
Private, employer-sponsored insurance options continued to expand and became the primary form of coverage. Other mid-century changes included the 1945 McCarran-Ferguson Act, which exempted insurance companies from federal oversight and left regulation to each state. The consequences would be felt for years to come as public insurance options became available.
The Social Security Amendments of 1965 ushered in the next substantial change to insurance, laying the groundwork for Medicare and Medicaid coverage for the elderly, the disabled, and the poor. These programs provided more options for people who had been excluded from employer-sponsored coverage plans, and effectively ended the era in which most medical care was paid for directly out of pocket.
This act firmly established the reliance on a “third-party payment system” for healthcare in the United States. It separated people into those who provide care, those who receive care, and those who pay for care. Hospitals negotiated rates (and discounts) with these payers, and out-of-pocket payment, at a full listed rate, became nearly impossible for the uninsured.
Under a third-party payment system, more attention was paid to utilization management. Questions regarding “medical necessity” became much more commonplace. Insurers became increasingly aware of expenses and sought new ways to exercise this oversight. As a result, managed care was established over the final decades of the 20th century.
Insurers introduced health maintenance organizations (HMOs) throughout the 1970s as a new means of utilization management and administration. HMOs used a “gatekeeper” approach to care, whereby authorization from primary care providers was required prior to any specialty or hospital care. They also introduced the idea of “pre-certification” prior to procedures. Benefits of HMOs included reduced spending waste (which was prevalent in the health system) and the inclusion of more preventive services; however, they also made it harder for consumers to access desired care.
Preferred Provider Organizations (PPOs) followed as a modified form of managed care. They focused on driving care toward “in-network providers” that had lower negotiated rates with insurers. While most PPOs removed the “gatekeeper” that was present in HMOs, any care provided “out-of-network” resulted in a much higher patient responsibility. Cost control measures, such as pre-certification, remained in place, and there was greater scrutiny over what coverage plans deemed medically necessary.
While managed care plans have evolved over the past 30 years, they generally remain in place as a method of utilization management within the present third-party payment system. Questions continue to arise regarding the definition of “medical necessity” among providers and patients alike, with both groups often frustrated by coverage limitations.
Most private coverage (as opposed to Medicare or Medicaid) remained employer-sponsored, but during the 2000s the amounts contributed to these plans by both workers and employers increased drastically. This period also saw growing numbers of Americans going uninsured. In response to these rapid rises in healthcare costs, the 21st century opened with renewed calls for an overhaul of America’s healthcare system.
In 2010, the Affordable Care Act (ACA) was signed into law, representing the largest change to healthcare payment in recent years. While it has undergone numerous challenges and changes, the law brought about several inclusions that altered the healthcare landscape. The ACA…
The positive and negative impacts of these inclusions remain under debate. However, few Americans appear to feel optimistic about the current state of healthcare. Rising healthcare costs consistently remain a high-priority concern for many, and more changes will likely follow soon.
While the list of complaints about managed care can be lengthy, and many of its features are perceived as negatives by both patients and providers, the increased attention on utilization management and cost control introduced data and analytics into care delivery operations.
Once insurers were included in the equation and their profits were affected, they quickly recognized that many medical procedures were inefficient, ineffective, or both. In 1972, the Professional Standards Review Organization (PSRO) program was created to assess the appropriateness of care provided under Medicare and Medicaid. While the outcome and effectiveness of the PSRO and its subsequent versions are debatable, recognizing the need for oversight was essential. This oversight not only focuses on providers’ clinical performance and adherence to evidence, but also requires the use of sound data to look for outliers.
For any utilization management program to be effective, it is important to identify the relevant key metrics for comparison. These metrics may be operational (costs, staffing, time) or clinical (health outcomes). Establishing these baselines enables comparisons between hospitals and interventions, and ultimately helps identify quality and value in care.
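To make the idea of a baseline comparison concrete, here is a minimal sketch in Python. The hospitals, costs, and 20% review threshold are entirely hypothetical; the sketch simply establishes a peer baseline for one operational metric (cost per episode of care) and flags outliers for closer review.

```python
from statistics import median

# Hypothetical average cost (in dollars) per episode of care, by hospital.
# Values are illustrative only, not drawn from real data.
cost_per_episode = {
    "Hospital A": 21_400,
    "Hospital B": 23_100,
    "Hospital C": 22_750,
    "Hospital D": 31_900,
    "Hospital E": 22_300,
}

# Establish the baseline as the median cost across peer hospitals;
# the median resists distortion by a single expensive outlier.
baseline = median(cost_per_episode.values())

# Flag hospitals running well above the baseline (here, more than 20% over)
# as candidates for closer utilization review.
REVIEW_THRESHOLD = 1.20
outliers = {
    name: cost
    for name, cost in cost_per_episode.items()
    if cost > baseline * REVIEW_THRESHOLD
}

print(f"Baseline cost per episode: ${baseline:,.0f}")
for name, cost in sorted(outliers.items()):
    pct_over = (cost / baseline - 1) * 100
    print(f"Review candidate: {name} at ${cost:,.0f} ({pct_over:.0f}% above baseline)")
```

In practice, a payer or health system would replace the hard-coded values with claims or cost-report data and would likely risk-adjust before comparing hospitals, but the basic pattern of baseline, comparison, and outlier review is the same.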
As the healthcare system moves toward a more value-based model (to be discussed in future articles), these baselines are increasingly important. And so, while the term “managed care” may initially cause many to wince, the oversight that accompanied the model introduced data analytics into healthcare management.
In the next installment of this series, we’ll discuss the shifting focus in payment from “fee for service” to “value-based care”, and review the benefits and challenges that accompany this change for payers, providers, and patients alike.
Ready to learn more? Contact Causeway Solutions to get started!