Clarity is needed about liability when medical AI fails

Source: BMJ

Original: http://www.bmj.com/content/392/bmj.s320.short?rss=1...

Published: 2026-02-18T03:26:12-08:00

Artificial intelligence is increasingly used in medical diagnostics, particularly in image recognition and mammographic cancer screening, where it can improve diagnostic performance and efficiency. Clinical decision support tools, such as the PREDICT3 model for predicting breast cancer survival and treatment, are regulated as medical devices and must meet professional and legislative requirements. Although no cases have yet been reported of an AI model contributing to a delayed cancer diagnosis, situations in which AI leads to cancer being missed or misdiagnosed must be addressed before they arise. A key question remains: who bears responsibility for errors caused by AI systems in healthcare, and how should that responsibility be legally regulated? Without a clear legislative framework, patients could be left uncertain about their rights to compensation in the event of an AI error.