Two families have filed a lawsuit against UnitedHealth, alleging that the company used a flawed AI algorithm to deny necessary medical coverage to elderly patients. The algorithm, known as "nH Predict," reportedly had a 90% error rate and frequently contradicted physicians' assessments of medical necessity, leading to premature discharges from care facilities or significant out-of-pocket expenses for families.
The case, filed in Minnesota, highlights a concerning trend of insurance companies relying on AI for decision-making, potentially at the expense of patient care. The AI tool was designed to guide coverage decisions, but the plaintiffs argue it was instead used to systematically deny claims, exploiting elderly patients who may lack the means to challenge those denials.
The lawsuit raises critical questions about the ethical use of AI in healthcare, especially as the American Medical Association urges human review of AI-based denials even as the industry embraces AI's potential to streamline administrative processes.