AI Scribes Are Reshaping Doctor Visits: Here Is What Patients Should Know
Nearly a third of physician practices now use AI scribes, and most patients have no idea. Here is what consent looks like, what can go wrong, and how to protect your data.

Nearly a third of physician practices across the country are now using artificial intelligence scribes, or are actively working to add the capability, yet most patients walk into those appointments without a clear picture of what that means for their care, their privacy, or their medical records. The tools are spreading fast, driven by a healthcare system desperate to reduce the administrative burden that has long pushed clinicians toward burnout. Understanding how AI scribes work, what rights patients have, and what questions to ask has become an essential part of navigating a modern doctor's visit.
What AI scribes actually do
An AI scribe is a software application that listens to a clinical encounter in real time, transcribes the conversation, and generates a structured visit summary. Eric Boose, a family physician at Cleveland Clinic, has been using one of these tools for roughly two years. He describes the output as typically available within seconds after an appointment ends, a speed that compresses what once required extended time at a keyboard into something nearly instantaneous. The charting work that used to follow him home has been cut dramatically, and Boose credits the tool with giving him more time with his family.
The more immediate effect, he says, is what happens inside the exam room. "I can really just sit there and engage and just focus on them and listen," Boose told reporters, describing a return to what he calls "old-fashioned medicine." Maintaining eye contact, listening carefully, and responding without the competing demand of documentation is something many physicians simply could not manage while typing notes during the visit. That shift in attention has real clinical value: patients who feel heard tend to share more, which means clinicians have better information on which to act.
Consent: what you should be asked before the recording starts
Common practice in clinical settings is for providers to request verbal consent before activating an AI scribe, though some practices may seek written consent depending on their policies. Legal recording requirements vary significantly by state, which means what is considered adequate consent in one location may not meet the threshold in another. Patients should not assume a single notification posted on a check-in tablet constitutes meaningful informed consent; it is reasonable to ask your provider directly whether an AI tool will be recording the visit before the appointment begins.
Patients also have the right to ask a provider to pause the AI scribe for sensitive portions of a conversation, or to decline its use entirely. If you opt out, clinicians revert to manual note-taking, the same method that was standard before these tools arrived. Opting out does not mean forfeiting care; it means the documentation method changes.
Accuracy gaps: why AI scribes still make mistakes
AI scribes are not infallible. These tools can hallucinate, meaning they can generate plausible-sounding information that was never actually said in the room. They can also omit important context or miss details that carry clinical significance. The standard expectation is that clinicians review and edit AI-generated summaries before those notes are finalized and incorporated into the official medical record, but that editorial step depends on the provider catching errors that can be subtle.
Patients carry meaningful responsibility here too. Reviewing your visit summary carefully after an appointment and raising any corrections with your provider is one of the most direct ways to catch inaccuracies before they become embedded in your permanent health record. A misrecorded medication, a missed allergy mention, or an inaccurate symptom description can have real downstream consequences if left uncorrected. Treating the visit summary as a document worth reading, rather than simply filing, is among the most practical protections available to patients right now.
Data privacy: who can access what you say in the exam room
AI companies and health systems that operate these scribe tools may both have access to the medical data captured during your visit. These entities are subject to the Health Insurance Portability and Accountability Act, commonly known as HIPAA, which imposes specific requirements around the handling of protected health information. However, HIPAA has important limitations and nuances. One that matters particularly here: if patient data are de-identified, meaning personally identifying information has been stripped out, they can be made available for broader uses depending on the specific terms of contracts between health systems and AI vendors.
This creates a practical gap between what patients might assume about the confidentiality of a clinical conversation and what may actually happen with that data downstream. The questions worth asking your provider include whether your data will be stored after the visit, how it may be used to improve the vendor's software, and whether the AI company is formally designated as a HIPAA business associate. That last designation matters because it establishes the legal framework governing how the vendor can use or share your health information.
How the shift changes healthcare beyond the exam room
The rapid adoption of AI scribes reflects a healthcare system under significant pressure. Administrative documentation has long been identified as a leading contributor to physician burnout, and tools that reduce that burden carry strong appeal. With adoption already reaching nearly a third of physician practices, the pace suggests AI scribes will become a standard feature of clinical care in the near future rather than a novelty confined to well-resourced health systems.
That trajectory raises questions extending well beyond any single appointment. As automated transcription becomes the default method for producing clinical notes, the accuracy standards, consent protocols, and data governance frameworks governing these tools will shape the integrity of health records at scale. There are real benefits: reduced clinician burnout, more attentive face-to-face care, and faster documentation. But those gains come alongside new operational, legal, and privacy questions that neither health systems nor regulators have fully resolved.
Key questions to bring to your next appointment
Before your next visit, consider asking:
- Will an AI scribe be recording this appointment?
- Can I pause or decline the recording for sensitive topics?
- How is my data stored, and for how long?
- Will my visit data be used to train or improve the AI software?
- Is the AI vendor designated as a HIPAA business associate?
Raising these questions is not a sign of distrust. It is the kind of informed participation that helps ensure both the accuracy of your medical record and the appropriate handling of deeply personal health information. The clinicians adopting these tools are largely doing so in good faith, seeking to be more present and less burned out. The patients who understand what those tools involve will be best positioned to benefit from that shift while protecting themselves within it.
