The privacy watchdog is examining a collaboration between Australia’s largest radiology chain and a local startup that uses patient medical information to train AI models, possibly without consent.
I-MED, which operates 250 radiology clinics across Australia, partnered with Sydney-based tech firm Harrison.ai in 2019. The partnership led to the creation of Annalise.ai, an AI tool designed to read and analyse chest X-rays for specific conditions.
The investigation follows reports that I-MED shared patient chest X-rays with Harrison.ai, without informing patients, so that the health technology company could use the scans to train its AI systems.
According to sources, there is no public evidence that patients consented to their private health data being used for this purpose.
The federal government and the Greens have since raised concerns about the revelations, and the Office of the Australian Information Commissioner (OAIC) has launched an initial probe into I-MED's data privacy practices.
“The OAIC is making preliminary inquiries with I-MED Radiology Network to ensure it is meeting its obligations under the Australian Privacy Principles in relation to reports it has provided private medical scans to a third-party entity for the purpose of training an AI model,” a spokesperson said.
In a leaked email, Harrison.ai distanced itself from the issue, stating that questions of patient consent and privacy were “not matters for Harrison to respond to.”
In a statement, Harrison.ai said the data used did not contain personal information and that the company complied with its legal obligations.
Under Australian law, personal information may be used or disclosed for the purpose for which it was collected, or for a secondary purpose the individual would reasonably expect.
However, the spokesperson noted that, given the unique risks associated with AI, it may be difficult to establish that such secondary use aligns with reasonable expectations.
Both companies say steps have been taken to protect patients’ privacy. I-MED said when announcing the partnership in 2019 that the data was “anonymised”, and Harrison.ai said last week that the data had been “de-identified and cannot be re-identified”.
Meanwhile, I-MED and its private equity owner, Permira, have not responded to multiple media inquiries regarding the partnership with Harrison.ai.