Clinical decision support software is beneficial, but if it malfunctions, a doctor’s duty of care likely makes the doctor liable
Published 4 August 2022
Doctors are being increasingly encouraged to rely on digital technology to guide care, but who carries the blame if doctors rely on software that makes mistakes, leading to patient harm?
Imagine this. A patient has recovered enough from a heart attack to be discharged from hospital. The treating doctor prepares the discharge on a hospital computer running clinical decision support software, which compares the patient’s data against inbuilt algorithms to make recommendations for their care.
Clinical decision support tools are increasingly used throughout our healthcare system to promote high-quality care that aligns with evidence and guidelines.
In this case, the software generates a pop-up alert recommending that the doctor prescribe a specific medication on the basis that the patient isn’t already taking it. The doctor prescribes the medication, and the patient goes home. A few days later, they die. An investigation finds that the patient had twice the recommended amount of the medication in their system.
It turns out the patient was already taking a dose of this same medication in a tablet that combined it with another drug. Because of the new prescription, the patient had in effect been taking a double dose of the medication, which proved fatal.
Information about the other medication the patient was already taking was in their medical record, but the clinical decision support tool was flawed: it didn’t recognise that the medication the patient was already on belonged to the same category as the newly prescribed one.
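To see how a flaw like this can creep in, here is a minimal, purely illustrative sketch (in Python, with made-up drug and class names) of a duplicate-therapy check that only looks at product names, alongside a version that also expands combination products into their ingredients. It is an assumption-laden toy example, not a description of the actual tool in the study.

```python
# Hypothetical sketch only: all drug names, classes and data structures are illustrative.

# Single-ingredient product names mapped to their therapeutic class.
DRUG_CLASS = {
    "drug_a": "class_x",
    "drug_b": "class_y",
}

# A combination tablet sold under one name; a lookup that only checks product
# names will not see the class_x ingredient hidden inside it.
COMBINATION_PRODUCTS = {
    "drug_b_plus_a": ["drug_b", "drug_a"],
}


def naive_alert(current_meds, proposed_drug):
    """Flawed check: compares therapeutic classes by product name only."""
    proposed_class = DRUG_CLASS.get(proposed_drug)
    for med in current_meds:
        if DRUG_CLASS.get(med) == proposed_class:
            return None  # patient already on this class, so no prescribing alert
    return f"Recommend prescribing {proposed_drug}"  # fires even when a combo tablet already covers it


def safer_alert(current_meds, proposed_drug):
    """Expands combination products into ingredients before comparing classes."""
    proposed_class = DRUG_CLASS.get(proposed_drug)
    for med in current_meds:
        ingredients = COMBINATION_PRODUCTS.get(med, [med])
        if any(DRUG_CLASS.get(i) == proposed_class for i in ingredients):
            return None
    return f"Recommend prescribing {proposed_drug}"


if __name__ == "__main__":
    meds = ["drug_b_plus_a"]            # patient already takes the combination tablet
    print(naive_alert(meds, "drug_a"))  # flawed tool still recommends drug_a
    print(safer_alert(meds, "drug_a"))  # ingredient-aware check stays silent (None)
```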
The doctor was well aware of the rule against combining both medications but had relied on the computer alert. Who is responsible under the law for the patient’s death?
A scenario like this isn’t far-fetched; in fact, it’s based on one story in a recent study of flawed clinical decision support software that led to patient harm.
There is a lot of research showing that clinical decision support software is generally beneficial. For instance, it reduces medication prescribing errors and increases the likelihood that doctors will follow guidelines for delivering high-quality healthcare. Yet there is also increasing awareness that malfunctions in clinical decision support software are more common than we think.
The person responsible for the mistake should bear responsibility for the harm. But who, in a situation like this, was really responsible? Was it the software company that created the flawed product and didn’t test it properly? Or was it the doctor, who should have realised the alert was wrong and overridden it?
As a legal academic, I have been working with a University of Melbourne team developing a new clinical decision support tool. I was interested in where a patient would find a legal remedy if they were harmed in this type of situation, and who they could hold accountable.
The doctor could be harmed too; for instance, they could face disciplinary action and develop mental health problems, and their job may be at risk.
My newly published research into Australian law has found that most of the legal risk is faced by the doctor and not the software developer. This is because doctors have a fundamental duty of care to their patients, which they can’t delegate to a computer when the computer is only providing recommendations and not independently carrying out decisions.
Clinical decision support software is designed to have a human in the decision-making chain; it’s intended that a doctor will use their own judgment about whether to follow each software alert. As a result, it’s quite likely that the doctor in the story would be found to have acted negligently, breaching their duty of care.
The doctor might also be liable under Australian Consumer Law for failing to provide services with ‘due care and skill’ (section 60). It’s unlikely that the patient’s family could succeed in a negligence claim against the software company, because the software was never designed to affect the patient directly, but only via the doctor. It would be hard to establish, then, that the company owed the patient a duty of care.
However, Australian Consumer Law could apply here too – the law says that manufacturers of goods must compensate those who suffer injuries because of a safety defect in those goods (section 138).
My research also looked at the regulatory oversight of clinical decision support software in Australia. In 2021, the Therapeutic Goods Administration (TGA) set out its new approach to regulating these types of software – those not intended to replace a health professional’s clinical judgment.
This software falls under a ‘lighter touch’ regime compared to software intended to make and execute clinical decisions by itself. It doesn’t need to be listed on the Australian Register of Therapeutic Goods, although it does have to comply with certain requirements for safety and quality.
There are substantial penalties (including up to 5 years’ imprisonment) where harm relates to a device’s failure to meet the TGA safety requirements. But otherwise, the focus is on self-assessment by the company that produces the software.
My work shows that a doctor who follows flawed computer-based advice and harms a patient probably can’t deflect legal liability to the software company. The logical, but concerning, finding is that the use of clinical decision support software can introduce new legal risk for doctors – they must guard against the very systems that are put in place to help them.
Software developers need to recognise their obligations under Australian Consumer Law. And where the stakes are high – for instance, where faulty medication alerts could cause injury or death if not intercepted – the regulator’s current light-touch approach may need revisiting.
Banner: Getty Images