Introduction: If We Built It, Could They Come?
In Part 1 of this series, I explored the idea of a healthcare system that actually works: a cohesive Electronic Health System (EHS) that moves information seamlessly with the patient, incentivizes health outcomes, recognizes healthcare as a public good, and uses technology to empower patients, reduce disparities, encourage trust, and support clinicians. This vision may seem boring: it’s not a one-pill cure for cancer and diabetes, or an implant that detects diseases before the first symptom. But if that pill were invented tomorrow, could we identify everyone who needed one? Could we keep implants from becoming unaffordable? Would patients trust doctors enough to take them?
In Part 2, I’ll confront an uncomfortable reality: the healthcare system was not an accident. It was the product of political, economic, and technological decisions that reward and encourage fragmentation. To build the system we want, we must understand how we arrived at the one we have.
Fragmentation is not an accident. It’s policy.
I. Employer Insurance and the Birth of Fragmentation
The fragmentation of American healthcare began in earnest during World War II. As part of the wartime economy, the federal government imposed strict wage and price controls to curb inflation through the Stabilization Act of 1942[1]. Employers, facing labor shortages, were prohibited from increasing salaries to attract workers. Instead, the IRS ruled that employers’ contributions to group health insurance policies were exempt from taxation[2]. This gave businesses another way to compete for labor, and it cemented a model of employer-sponsored private insurance that persists today.
In 1945, following the death of FDR, President Truman championed an updated version of the Wagner-Murray-Dingell Bill. The bill called for compulsory national health insurance funded by a payroll tax (sound familiar?). His plan envisioned a federally funded, universal health system that would guarantee access to hospital and medical care for all Americans, with services provided largely through existing private providers but publicly financed[3]. However, private insurance companies, the American Medical Association, and business groups branded the plan as unnecessary, too expensive, and a threat to free enterprise[3].
Private insurance was already too profitable for reform.
While the fight over national health insurance continued, Congress passed the Hill-Burton Act (1946), which provided subsidies for hospital construction, particularly in rural and underserved areas[4]. In exchange for federal funding, hospitals agreed to provide some free or reduced-cost care. The program continued paying out through 1997, and the remaining hospitals built under the Act are still obligated to provide free or reduced-cost care[4].
The Medicare and Medicaid Act was signed into law by LBJ in 1965[5]. These programs, still very much a crowning achievement of American healthcare, accounted for roughly 23% of federal spending in FY2024[6] and provide critical coverage for the elderly, persons with disabilities, and the poor. Yet despite the Medicaid expansions that followed the Affordable Care Act, 44% of American adults ages 19-64 are under- or uninsured[5]. CMS and safety net programs created a noteworthy split: private insurers got to cover healthier, younger populations, while government programs bore the high costs of the elderly and the poor. Like the Postal Service vs. Amazon dynamic, it’s public infrastructure that takes care of the Americans the market deems too expensive.
II. Cracks in the Foundation: Hospitals, Doctors, and the Rise of Market Logic
By the 1980s, federal commitment to public health infrastructure had waned. The Prospective Payment System (PPS) introduced under Reagan shifted Medicare payments from reimbursing hospitals for their actual incurred costs to paying fixed lump sums based on Diagnosis-Related Groups (DRGs)[7]. This “one-size-fits-all” payment model disproportionately squeezed small rural hospitals, which often had higher operating costs due to lower patient volumes, less of the specialized care that commands higher DRG payments, and the fixed overhead of keeping critical services available.
At the same time, the pullback in public investment in hospital infrastructure encouraged a shift toward privatized, fractured models of healthcare delivery. Rather than close, struggling hospitals increasingly turned to private equity for financial viability, reinforcing dependence on employer-sponsored insurance and accelerating the market-driven logic of healthcare. Others did close. Between 1980 and 1988, 353 hospitals were shuttered in America: 190 in urban and suburban areas, 163 in rural ones[8].
“Public infrastructure takes care of the Americans the market deems too expensive.”
Sadly, rural hospitals continued to close at alarming rates, with at least 192 closures from 2005 to 2024[9]. The abandonment of direct public funding has left rural communities stranded, exacerbating healthcare deserts and magnifying inequality.
It’s more than buildings.
Medical education is largely dependent on government money. Public and private medical schools, as well as residency and training programs, receive subsidies from Medicare and Medicaid (CMS), not to mention federal student loans[11]. The physician shortage is estimated to exceed 10,000, and most experts expect the problem to accelerate: America has an aging population, increasing demand for medical services, and a relatively fixed supply of providers[10]. From 1997 to 2020, CMS made no significant increases to these federal subsidies. The COVID relief bill added money for just 1,000 residency slots[10].
III. Digital Islands: Innovation, not Integration
Alongside the creation of Medicare and Medicaid, the ’60s saw another major innovation: the computer! Not long after, academic medical centers explored the capacity of these systems to log clinical information. Importantly, many early systems were developed “in-house” and fulfilled different purposes. Some were mainly for billing and scheduling. Others, like COSTAR and HELP, were designed to support forms of clinical decision making and to benefit medical research[12]. But all early computerized systems were limited by storage and functionality.
With advancements in computing and the advent of the internet, the VA set out to develop a different kind of interoperable electronic medical record. VistA (an early, decentralized version) became the Computerized Patient Record System (CPRS) and was adopted across VA facilities nationwide. CPRS allowed VA facilities to share patient data seamlessly, improving coordination and outcomes[12]. The software was published as open source for all to benefit from, but interest was low.
Academic medical centers innovated too, developing proprietary systems that explored new clinical decision support features like drug-allergy alerts and flags for abnormal laboratory results[13]. Policymakers would go on to incentivize adoption (the HITECH Act of 2009, part of the American Recovery and Reinvestment Act[14]), but they never standardized design. So the software was hoarded, patient data became a competitive advantage, and security was a HIPAA checkbox[13].
Efforts like Health Level Seven (HL7) and later FHIR (Fast Healthcare Interoperability Resources) emerged to facilitate data exchange through shared standards, but progress has been slow and adoption often optional[13].
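To make the idea concrete: FHIR models clinical data as small JSON “resources” exchanged over an ordinary web API. The sketch below (Python, using the requests library) shows roughly what submitting a single blood-glucose reading as a FHIR Observation could look like. It is a minimal illustration, not any vendor’s real integration; the server address and patient ID are hypothetical placeholders, and a real system would also handle authentication, consent, and error cases.

import requests

# A minimal FHIR Observation resource for one blood-glucose reading.
# The code uses LOINC 2339-0 ("Glucose [Mass/volume] in Blood").
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "2339-0",
            "display": "Glucose [Mass/volume] in Blood"
        }]
    },
    "subject": {"reference": "Patient/example-patient-id"},  # placeholder patient ID
    "effectiveDateTime": "2025-01-15T08:30:00Z",
    "valueQuantity": {
        "value": 112,
        "unit": "mg/dL",
        "system": "http://unitsofmeasure.org",
        "code": "mg/dL"
    }
}

# POST the reading to a FHIR server's Observation endpoint.
FHIR_BASE = "https://example-ehr.org/fhir"  # hypothetical endpoint, not a real EHR
response = requests.post(
    f"{FHIR_BASE}/Observation",
    json=observation,
    headers={"Content-Type": "application/fhir+json"},
)
print(response.status_code)

The point isn’t the syntax. It’s that when every system speaks the same resource format, a glucose reading from a consumer app can, in principle, land in the same record a clinician sees. In practice, that handoff remains the exception rather than the rule.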
“Hospitals digitized, but remained digital islands.”
IV. There’s an App for That!
I think it’s important to talk about digital infrastructure because it is a space that has seen far more movement and shifting momentum than physical infrastructure. There isn’t much disagreement on physical infrastructure: we need more hospitals and more doctors. There are significant challenges in achieving those goals: funding, building, and recruiting. Private equity won’t fill these gaps. But what I don’t see discussed often are solutions for digital infrastructure.
I am a type 1 diabetic. It’s a reality that has kept me interested in these questions. I have a viewable patient portal, an app for monitoring trends in blood sugar, an account to order medical supplies, an app to manage my prescriptions, and an Excel sheet to track the use of those medications.
None of it ends up in my EHR.
All of this data is relevant to the discussions I have with my doctor, so why is it on me to aggregate it and facilitate the conversation? A more cohesive system would undoubtedly make my life easier, and I imagine it would mean even more for those who struggle with disease complications or insurance coverage.
V. Looking Forward
We know we need more doctors and more hospitals. But even if we met those goals, the experience of healthcare would remain frustrating and expensive. Jennifer Pahlka describes a conflict of “product management versus project management.” I believe healthcare has no shortage of project managers. These are people I admire and respect. But healthcare product management needs a 21st-century overhaul.
In Part 3, I’ll explore elements of successful, cohesive health systems from around the world — and what lessons they offer for building digital healthcare infrastructure in America.
Sources Cited: