The nonprofit Electronic Frontier Foundation, a digital and privacy rights organization, is suing the Centers for Medicare and Medicaid Services (CMS) for more information about a multistate program that uses artificial intelligence to evaluate requests for medical care.
The nonprofit filed a Freedom of Information Act (FOIA) lawsuit against CMS seeking disclosure of records related to the Wasted and Inappropriate Services Reduction (“WISeR”) model, which uses AI to evaluate prior authorization requests.
CMS announced the model, run through its Innovation Center, in June to test new prior authorization requirements in traditional Medicare. The program is set to launch on January 1 and run through the end of 2031.
The pilot will use AI to evaluate prior authorization requests from Medicare beneficiaries, expanding a practice that traditional Medicare has largely avoided. It is being rolled out in Arizona, New Jersey, Ohio, Oklahoma, Texas, and Washington.
WISeR participants will use AI to expedite prior authorization reviews for certain services deemed “vulnerable” to waste, fraud, and abuse. These include skin and tissue substitutes, which federal fraud investigators have increasingly targeted.
While the White House says the measure is aimed at combating fraud and waste, clinicians and Democratic lawmakers have raised concerns about the use of AI in prior authorization decisions. In a letter, 42 Democratic lawmakers said the model encourages higher rates of prior authorization denials and incentivizes participating companies to use artificial intelligence to make haphazard decisions, because private vendors are paid “based on the percentage of expenditures avoided.”
According to the Electronic Frontier Foundation, there is little information about how the AI algorithms used in WISeR work, including what kind of training data they rely on. “It remains unclear whether WISeR has any safeguards in place against systemic flaws such as algorithmic bias, privacy violations and unwarranted treatment denials,” EFF executives said in a statement.
“Letting algorithms make treatment decisions can lead to unwarranted and even discriminatory delays or denials of needed care,” Kit Walsh, EFF’s director of the AI and Knowledge Access Legal Project, said in a statement. “Given these serious risks, the public is demanding transparency, but we are not getting it. We are suing to get much-needed answers about how Medicare’s AI experiment will work.”
EFF says that by design, WISeR incentivizes contracting companies to deny prior authorization requests against patients’ best interests. Vendors will be compensated in part based on the volume of medical services they deny and may keep up to 20% of the associated savings, the group said.
Patients, doctors, and other lawmakers have also criticized what they see as tactics that delay or cut off access to health care, potentially resulting in irreparable harm or death, KFF Health News reported.
Earlier this year, EFF filed a FOIA request with CMS seeking records related to WISeR, including agreements with software vendors participating in the program; records related to testing the vendors’ technology for accuracy, bias, or hallucinations; and records related to auditing, monitoring, and evaluating WISeR and its participating vendors. To date, CMS has not provided these records to EFF, the group said.
EFF’s FOIA lawsuit seeks the immediate processing and release of these records.
“The public has a right to know more about the algorithms that drive their health care decisions,” said EFF staff attorney Tori Noble. “Without increased transparency, patients, health care providers, and policy makers will continue to be left in the dark.”

