AI has a growing role in processing health insurance claims
Artificial intelligence and algorithms are a part of many businesses these days, including health insurance. It’s a conversation that’s resurfaced since the killing of UnitedHealthcare CEO Brian Thompson.
Starting in January in California, a new law will regulate how insurers use AI in the process of granting prior authorization for treatments.
Before AI made its way into health care, physicians at insurance companies evaluated what treatments would qualify for coverage.
Now, “the hospital will have to submit a lot of information on how sick a patient is to the insurer,” explained Jeff Marr, who’s researching insurers’ use of algorithms at Johns Hopkins University. “The insurer’s gonna use this in their algorithm to determine what level of care is needed and that’s the basis of decision making.”
There is usually still a physician on the insurer’s side, Marr said, but AI allows them to be less involved in the administrative work, saving time and money.
But using AI can be problematic. Ryan Clarkson is managing partner of Clarkson Law Firm, which is representing patients who say they’ve been denied proper health care because of faulty algorithms.
“There have been situations in which our clients have undergone a surgery, for example, [and] their prescribing physician has ordered over 21 days of physical therapy care,” he said. But those patients were then asked to pay out of pocket after just 10 days.
Clarkson said he’s spoken to hundreds of people with similar stories and believes AI’s use in insurance claims and requests is widespread.
“I suspect that anyone who has been invoiced, paid a bill has been touched or has touched AI — certainly if it’s occurred within the last couple years,” he said.
It’s hard to determine exactly how much insurers lean on algorithms, because the industry isn’t transparent. But these algorithms are usually trained on small datasets drawn from particular types of patients, according to Nicholson Price, a law professor at the University of Michigan.
“And that’s used to build a system, which is then applied to really different patients with really different stories,” he said.
Price added that medical care often requires nuance. “Here is just a decision being made about something that is intensely personal, which is effectively being made by a machine,” he said, rather than by a person.