Jan 13, 2026

AI use for progress notes is causing serious privacy and clinical risks in aged care

As Australia’s aged care sector struggles with workforce shortages and a rapidly ageing population, staff have inevitably been turning to artificial intelligence (AI) tools such as ChatGPT to help complete progress notes and clinical documentation.

While these tools offer speed and convenience, their use introduces serious legal and clinical risks. Poorly governed AI use can expose providers to privacy breaches, inaccurate care records and regulatory action under both privacy and aged care law.

These risks are no longer theoretical. In the past year, multiple aged care providers have reportedly been scrutinised by the Department of Health and Aged Care after staff entered sensitive resident information into public AI platforms, triggering breaches of the Privacy Act 1988.

Karina Peace, an aged care consultant who works with more than 40 providers, says investigations linked to AI use are already occurring. “Dialogue from multiple sources indicates that more than one provider has been investigated in the last 12 months for data breaches against the Privacy Act using ChatGPT or AI tools,” she says.

Karina points to language barriers as a significant driver. Many nurses and care workers, particularly those from culturally and linguistically diverse backgrounds, are using AI to help phrase progress notes correctly.

“They’re using AI tools or ChatGPT to support their clinical documentation to ensure the words are correct,” she explains. “But because they don’t fully understand the words, they’re often using incorrect ones, which leads to clinical issues as it implies the management of care was not correct.”

The evolving legal framework

Australia does not yet regulate artificial intelligence in aged care through dedicated legislation. Instead, providers are expected to apply existing legal frameworks, with the Privacy Act 1988 playing a central role. Under the Australian Privacy Principles, health information is classed as sensitive information and requires a higher standard of protection, including clear consent for collection, use and disclosure.

From December 2026, amendments to the Privacy Act will require organisations to be transparent about the use of automated decision making, including AI assisted processes that have a significant impact on individuals’ rights. Progress notes that inform care planning or clinical decisions are likely to fall squarely within this scope.

The Office of the Australian Information Commissioner has repeatedly stressed the need for privacy by design when deploying AI systems. It has warned against using personal data to train AI models without consent and highlighted the risk of AI hallucinations, which can undermine accuracy and breach obligations to keep records up to date and correct.

The Aged Care Act 1997, updated in 2024, reinforces residents’ rights to privacy, dignity and informed consent. These obligations align with the Aged Care Quality Standards, which require accurate, contemporaneous documentation to support safe and effective care.

Karina says the human factor is often overlooked. “If someone doesn’t understand English to that level, they don’t know if what’s coming out of ChatGPT is correct, and most often it’s not.” Inaccurate or misleading progress notes can expose providers to breaches of care standards if they influence clinical decisions or misrepresent what actually occurred.

Some AI tools may also fall under medical device regulation. The Therapeutic Goods Administration classifies software as a medical device if it assists with diagnosis, monitoring or treatment decisions. While general purpose tools such as ChatGPT usually sit outside this framework, reliance on unverified outputs can still expose providers to negligence claims if harm occurs.

Key risks: privacy breaches and inaccurate records

Entering identifiable resident information into public AI platforms creates a clear risk of notifiable data breaches. Information may be stored overseas or used to train models, potentially breaching rules around cross border disclosure of personal information.

Karina highlights how documentation errors can quickly escalate. “Mostly around misinterpretation of wording or the wrong words being used, which alludes to outcomes that never happened or the wrong outcome, which is a serious risk. From a compliance viewpoint, if someone takes the provider to court for poor care, they have leverage if the documentation is incorrect.”

The Australian Health Practitioner Regulation Agency has made it clear that practitioners remain accountable for any AI assisted outputs and that informed consent is required where personal data is involved. Responsibility cannot be shifted to the technology.

Karina also notes that AI use often sits alongside other risky communication practices. “Not only in aged care but also in our hospital system, the use of unapproved communication tools like WhatsApp is widespread across Australia, which also impacts breaches of privacy, particularly when communicating about staff, patients, or residents.”

Awareness gaps compound these risks. “Management and executives understand the risks, but I don’t think they realise they are individually liable, as well as the registered nurse or carer using those tools, which is a significant concern,” Karina says. “It extends to the board level for governance responsibilities.”

This chain of liability means providers can face substantial penalties for serious or repeated privacy breaches, alongside civil claims if poor documentation contributes to substandard care.

Accountability and best practice

Ultimately, responsibility for AI generated content sits with the provider. Regulators have signalled interest in ethical AI use, but their focus remains firmly on resident safety, data protection and governance.

Karina argues that many risks could be addressed with basic controls. “There’s a lack of policies and processes. It’s very easy for IT to block access to these ChatGPTs or AI tools and then purchase a safer system if genuinely needed to support clinical documentation.”

She does not dismiss AI outright. “Staff are time poor, and anything that could speed up the process and make it more accurate would be beneficial, but there’d have to be legislation around that. Staff are so burdened by regulatory documentation that actual time is being removed from delivering care.”

Practical risk mitigation steps include prohibiting the entry of identifiable data into public AI tools, obtaining explicit and documented consent, training staff to critically review AI outputs, and implementing strong IT controls. Providers should also conduct privacy impact assessments and update policies to address automated decision making ahead of the 2026 reforms.

A harder reality for providers

AI is already embedded in day to day aged care practice, whether providers acknowledge it or not. Ignoring its use does not eliminate risk; it amplifies it. Regulators are unlikely to accept ignorance or informal use as a defence when resident data or care outcomes are compromised.

The real challenge for providers is not whether AI has a place in aged care, but whether they are prepared to take responsibility for how it is used. Without clear governance, approved systems and staff education, AI becomes another compliance failure waiting to happen.
