Sharp HealthCare sued after ambient AI allegedly recorded exam-room visits without consent
A proposed class action filed on November 26, 2025, alleges that Sharp HealthCare used Abridge's ambient AI documentation system to record doctor-patient conversations without obtaining legally valid consent. The complaint says that patients were not told their visits were being recorded, that recordings containing sensitive medical details were sent to outside servers, and that the system generated chart notes falsely stating patients had been advised of and consented to the recording. The named plaintiff says he learned his July 2025 appointment had been recorded only after reading his visit notes. Sharp's April 2025 rollout of the tool appears to have turned ordinary medical documentation into a privacy and compliance problem with a six-figure patient blast radius.
Sharp HealthCare's ambient AI lawsuit is the kind of story the healthcare industry spent all of 2025 trying not to tell about itself. The sales pitch for ambient documentation was simple enough: let software listen to the visit, draft the note, give clinicians more time to look at patients instead of keyboards, and maybe squeeze a little more productivity out of the day. The complaint filed against Sharp says the real workflow was much less polished. According to the plaintiff, the health system rolled out Abridge across its clinics, recorded private conversations without proper consent, shipped those recordings off to outside servers, and then let the software write chart language saying the patient had agreed to all of it.
If those allegations hold up, this was not a minor paperwork miss. It was a clinical documentation tool turning the consent record into fiction.
The rollout
Sharp and Abridge announced their partnership in April 2025. The public framing was familiar: better documentation, better billing, less clerical burden, happier clinicians. Abridge was not positioned as an experimental side project. It was introduced as an enterprise tool for a large health system, complete with the usual promises about compliant notes and operational lift.
That context matters because it shifts the story away from a single doctor improvising with a gadget. This was a system-level deployment. Once an ambient documentation platform is rolled out across clinics, the consequences of getting the consent model wrong stop being individual mistakes and start becoming organizational exposure.
The named plaintiff, Jose Saucedo, says he went to a July 2025 appointment at a Sharp Rees-Stealy clinic and later discovered, by reading his medical notes, that the conversation had been recorded using Abridge. According to reporting by KPBS and Becker's, he says he was never told the visit would be recorded and was never given a lawful opportunity to agree or decline. The suit was filed in San Diego Superior Court on November 26, 2025.
Where the complaint gets ugly
Ambient AI vendors like to describe their systems as digital scribes. That sounds harmless until you translate it into what actually has to happen. The software cannot draft the note unless it captures the conversation. In an exam room, that means symptoms, medications, diagnoses, treatment options, family history, and whatever else comes up while a patient is sitting in a thin gown answering questions they usually would not want broadcast beyond the room.
The Sharp complaint alleges the recordings were sent to Abridge servers, where company personnel could access them, and that the files could remain there for roughly a month before deletion. Again, those are allegations, not findings. But even at the allegation stage, the structure of the problem is obvious. A health system adopted a tool that moved confidential medical conversations into a broader technical and contractual chain, then allegedly failed at the first legal question that chain raises: did the patient actually consent?
California is not subtle about this issue. Under the state's Invasion of Privacy Act (Penal Code § 632), recording a confidential communication requires the consent of all parties, and healthcare data has its own layers of privacy regulation on top. That does not mean ambient documentation is impossible. It means the disclosure and consent process has to be real, documented, and legible to the patient rather than assumed into existence because the software works best when everyone stops asking questions.
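To make "real and documented" concrete, here is a minimal sketch of what a consent gate in front of an ambient-scribe pipeline could look like. Every name in it (ConsentEvent, start_ambient_session, and so on) is a hypothetical illustration, not Abridge's or Sharp's actual design; the point is only that the gate runs before any audio is captured, and that the consent event is persisted so downstream documentation can cite it rather than assume it.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical consent gate for an ambient-scribe session. Every name
# here is illustrative; nothing below is Abridge's or Sharp's design.

@dataclass(frozen=True)
class ConsentEvent:
    """An explicit, persisted record that consent was actually obtained."""
    patient_id: str
    clinician_id: str
    encounter_id: str
    obtained_at: datetime
    method: str  # e.g. "verbal, documented by clinician" or "signed form"

class ConsentRequired(Exception):
    """Raised when recording is attempted without documented consent."""

def start_ambient_session(encounter_id: str,
                          consent: Optional[ConsentEvent]) -> ConsentEvent:
    """Refuse to open the microphone unless consent is already on file.

    In an all-party consent state such as California (Penal Code § 632),
    this gate has to run *before* any audio is captured, not after.
    """
    if consent is None or consent.encounter_id != encounter_id:
        raise ConsentRequired(
            f"no documented recording consent for encounter {encounter_id}"
        )
    # Only past this point would the capture pipeline be allowed to start,
    # and the persisted ConsentEvent is what the chart can later cite.
    return consent

if __name__ == "__main__":
    consent = ConsentEvent(
        patient_id="pt-001",
        clinician_id="dr-042",
        encounter_id="enc-123",
        obtained_at=datetime.now(timezone.utc),
        method="verbal, documented by clinician",
    )
    start_ambient_session("enc-123", consent)  # proceeds
    try:
        start_ambient_session("enc-999", None)  # blocked
    except ConsentRequired as err:
        print(f"Recording blocked: {err}")
```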
The complaint says Sharp did something worse than skip a form. It says Abridge-generated chart notes included language stating that patients had been advised that recording was taking place and had consented, even when that conversation never happened. That is the detail that turns a privacy dispute into something broader. Once software starts auto-populating the chart with false consent language, the medical record stops merely describing care and starts laundering a compliance failure.
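The gap between documenting consent and asserting it is easy to see in code. The sketch below contrasts the failure mode the complaint describes, where the note template bakes the consent sentence in unconditionally, with a version that derives the sentence from a persisted consent record. Both functions are hypothetical illustrations, not Abridge's actual templates.

```python
from datetime import datetime, timezone
from typing import Optional

# Illustrative only; neither function reflects Abridge's actual templates.

def draft_note_broken(summary: str) -> str:
    """The alleged failure mode: consent language is baked into the
    template, so every note 'documents' a disclosure that may never
    have happened."""
    boilerplate = ("Patient was advised that this visit would be recorded "
                   "for documentation purposes and consented to the recording.")
    return f"{boilerplate}\n\n{summary}"

def draft_note_gated(summary: str, consent: Optional[dict]) -> str:
    """The safer pattern: the consent sentence is derived from a persisted
    consent record, and its absence is stated rather than papered over."""
    if consent is None:
        attestation = "No documented recording consent on file for this encounter."
    else:
        attestation = (f"Recording consent obtained via {consent['method']} "
                       f"at {consent['obtained_at'].isoformat()}.")
    return f"{attestation}\n\n{summary}"

if __name__ == "__main__":
    print(draft_note_gated(
        "Patient reports intermittent chest pain...",
        {"method": "verbal, documented by clinician",
         "obtained_at": datetime.now(timezone.utc)},
    ))
```

The difference is one line of template text, which is exactly why it scales: a default that asserts consent will assert it a hundred thousand times as cheerfully as once.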
Why false consent language matters
Hospitals live and die by the record. If the chart says the patient was informed and consented, that sentence does not sit there as decorative prose. It becomes the institution's account of what happened. It may shape internal reviews, legal defenses, billing records, and future care. If the chart is wrong about something as basic as whether the patient agreed to being recorded, the technology has introduced error into the very document people rely on to sort out responsibility later.
That is one reason the blast radius here extends so far beyond the named plaintiff. The complaint estimates more than 100,000 Sharp patient encounters may have been recorded during the rollout. Even if that figure proves high, the scale question is still brutal. This was not one accidental recording. The risk was built into the enterprise implementation itself.
Healthcare loves to talk about AI adoption as if the hard part is getting clinicians to trust the tool. This case suggests the harder problem may be getting institutions to respect the legal and human conditions under which the tool can be used at all. A patient in an exam room is not a beta tester. They are not part of an A/B test on documentation efficiency. They are there to discuss private medical issues with a clinician, and the health system's job is to make the data flow comprehensible before it starts capturing their voice.
The standard industry dodge
One of the patterns around generative AI in regulated settings is that vendors and buyers talk at the level of capability while critics are left to talk at the level of consent, retention, access, and liability. The demo shows a polished note appearing seconds after the visit. The ugly questions arrive later. Who heard the recording? Where did it go? How long did it stay there? What exactly did the patient agree to? What happens if the note is wrong? What happens if the record says the patient consented when they did not?
None of those questions are fringe concerns. They are the work.
That is why this case feels consequential even before any court reaches the merits. It offers a blueprint for how ambient AI can fail in a real deployment. Not because the transcription was amusingly bad or the summary forgot a detail, but because the institution allegedly treated informed consent as a background task and let the software produce a cleaner version of reality than the patient actually experienced.
Sharp declined to comment publicly because the litigation is pending. That is standard. Becker's also noted that the case appears to be among the first known hospital lawsuits tied specifically to ambient intelligence. That matters because health systems across the country spent 2025 racing to sign similar contracts. Abridge was hardly alone, and Sharp was hardly the only provider under pressure to make clinicians more efficient.
What the cautionary tale actually is
The cautionary part is not that AI entered the exam room. It is that the exam room, of all places, is where "move fast and sort out governance later" should have been obviously disqualifying. The whole point of healthcare compliance is to formalize the moments that cannot be left to vibes: authorization, disclosure, confidentiality, documentation, and the chain of custody around sensitive information.
Ambient AI promises to disappear into the clinical workflow. That is good product design when the workflow is safe. It is terrible governance when the hidden part includes the legal basis for the recording itself. A recording system that fades into the background may feel seamless to the health system, but seamless for the institution can look a lot like invisible to the patient.
If the allegations are true, Sharp did not just deploy a note-taking assistant too casually. It deployed a system that allegedly recorded confidential visits without proper consent and then wrote the paperwork to make it look authorized after the fact. In a regulated industry built on records and trust, that is not a small implementation bug. That is the implementation.