Ottawa and Ontario patients may want to ask their doctors a few more questions after a new report from Ontario's Auditor General found that AI-powered note-taking tools deployed in the province's healthcare system produced inaccurate, incomplete, and in some cases entirely fabricated information.
The audit, released this week, examined AI scribe systems — software that listens to doctor-patient conversations and automatically generates clinical notes. The technology was pitched as a way to reduce administrative burden on physicians and free up more time for patient care. But what the auditor found was far less reassuring.
What the Auditor Found
According to the report, the AI tools demonstrated what researchers call "hallucinations" — a technical term for when an AI system generates text that sounds plausible but is simply not true. In a medical context, that's not just a software glitch. That's a potential patient safety risk.
The auditor found that the tools:
- Produced incorrect clinical information in patient notes
- Left out medically relevant details from consultations
- Were not adequately evaluated before being rolled out to physicians
- Lacked sufficient oversight mechanisms to catch errors before notes entered patient records
The province had moved quickly to adopt AI scribing tools as part of broader efforts to modernize healthcare and address physician burnout — a real and growing problem. But the audit suggests the rush to deploy came at the cost of proper vetting.
Why This Matters for Ottawa Patients
Ottawa is home to major healthcare institutions including The Ottawa Hospital, CHEO, and Bruyère, along with hundreds of family medicine clinics spread across the city. If AI scribing tools were rolled out across Ontario without adequate evaluation, Ottawa-area practices are almost certainly among those affected.
Patient notes generated by these tools could influence diagnoses, referrals, prescriptions, and follow-up care. An error buried in an AI-generated note — a missed allergy, an incorrect dosage, a hallucinated symptom — isn't abstract. It travels with the patient through the system.
Ontario's healthcare system is already under significant strain. Family doctors are in short supply, ERs are frequently overwhelmed, and patients in Ottawa and across the province are waiting longer for specialist care. The appeal of AI tools that can reduce paperwork is completely understandable — but not if the trade-off is compromised accuracy.
The Bigger Picture on AI in Healthcare
This isn't unique to Ontario. Health systems across North America have been experimenting with AI scribing technology, and concerns about accuracy have surfaced repeatedly. The key difference is whether institutions are running proper evaluations before deployment — and whether there are human checks in the loop.
Ontario's auditor is calling for stronger evaluation frameworks and clearer accountability before these tools are used more widely. That seems like a minimum bar, not a high one.
For patients in Ottawa, the takeaway is simple: you have every right to ask your doctor whether AI tools are being used in your care, and to request that your clinical notes be reviewed for accuracy. Transparency from healthcare providers on this front would go a long way.
Source: CBC Ottawa / CBC News.
