According to a recently filed lawsuit, UnitedHealthcare, the largest health insurance provider in the US, is using an AI system known as nH Predict to limit how long severely ill patients can stay in extended care and to deny their claims for coverage.
The suit was filed this week in the US District Court for the District of Minnesota by the families of two terminally ill patients who were denied coverage by UnitedHealth. The plaintiffs argue that the insurer breached its contract by relying on AI-generated predictions that inaccurately assessed the patients' need for extended care.
Their claims are backed by a Stat News investigation into UnitedHealth's internal practices at its subsidiary NaviHealth, which found that the company required staff to adhere strictly to the algorithm's questionable forecasts of how long patients should remain in long-term care facilities.
According to Stat, the cost-cutting AI allegedly saved UnitedHealth hundreds of millions of dollars that would have otherwise been allocated to patient care.
Claim denials are rarely appealed, yet the lawsuit asserts that roughly 90 percent of the appeals that are filed succeed. That reversal rate, the plaintiffs argue, points to significant inaccuracies in the AI system and suggests that UnitedHealth may have exploited large numbers of vulnerable patients.
Spencer Perlman, a medical industry expert, told Stat, "If UnitedHealth is treating NaviHealth's systems as gospel, that is not sound medical decision-making." He stressed that healthcare decisions should be made for individual patients rather than dictated by an algorithm.
In response to Stat, UnitedHealth denied that NaviHealth instructs staff to use the AI tool to deny treatment, saying that coverage decisions are made by medical executives based on Medicare guidelines and that the algorithm is not treated as definitive.
Nevertheless, employee accounts and internal documents suggest that UnitedHealth's AI system has been involved in questionable coverage decisions.
According to Stat, the nH Predict system allocated just 20 days of rehabilitation to an elderly woman recovering from a stroke, about half the typical duration for stroke patients. Similarly, an elderly, legally blind man with kidney and heart problems was granted only 16 days of care.
The nH Predict system's inaccuracies stem from its reliance on historical data from roughly six million past patients. While that approach may sound reasonable, it bakes past errors and cost-cutting decisions into future predictions without accounting for a patient's current medical needs or practical circumstances.
Dr. Ziad Obermeyer, an expert on algorithmic bias in healthcare at the University of California, Berkeley, emphasized that the length of a patient's stay is influenced by many factors beyond biological ones.
UnitedHealth pushed for strict adherence to the AI's projections: in 2022, case managers were instructed to keep nursing home stays within 3% of the algorithm's estimates, and the target was tightened to under 1% the following year, with stringent consequences for case managers who missed it.
Amber Lynch, a former NaviHealth case manager who was terminated earlier this year, criticized the company's profit-driven approach, stating, "It's all about the money and the data points. I hated that because it takes the respect out of the client."
This case serves as a stark example of how the perceived objectivity of AI can mask unethical practices and exploit individuals during their most vulnerable moments.