NSW authority’s AI breach exposes 2000+ locals

The NSW Reconstruction Authority says it is contacting 2,031 people affected by a data breach involving the Northern Rivers Resilient Homes Program. Photo: SUPPLIED
THE NSW Reconstruction Authority has confirmed that 2,031 people had personal data exposed after a staffer uploaded sensitive information from the Northern Rivers Resilient Homes Program to ChatGPT.
The breach occurred in March, but the full scale and nature of the disclosure were only confirmed this week.
“The breach occurred when a former temporary employee of the RA uploaded data containing personal information to an unsecured Artificial Intelligence (AI) tool which was not authorised by RA,” a spokesperson said.
The uploaded spreadsheet contained more than 12,000 rows of case information, including names, addresses, dates of birth, health details and limited financial notes.
“Importantly, we can confirm that no driver’s licence numbers, Medicare numbers, passport numbers, or Tax File Numbers were disclosed in the breach.”
The authority said it was working with Cyber Security NSW and forensic analysts to investigate the breach, and had found no evidence the data had been accessed or made public.
Affected individuals are being contacted this week and offered help through ID Support NSW and Social Futures.
“We understand this news is concerning and we are deeply sorry for the distress it may cause for those involved in the program,” the spokesperson said.
The NSW Privacy Commissioner has been notified, and an independent review is underway into how the breach occurred and why it took months to notify victims.
Cybersecurity researchers have warned that personal information shared with generative AI tools can remain stored on servers and be at risk if an account or system is later compromised.
While ChatGPT is an automated service, OpenAI staff and contractors can access stored conversations for safety reviews and technical troubleshooting. That means any sensitive information entered into the platform can potentially be viewed or retained beyond a user’s control.
Investigations by outlets including Cybernews and TechSpot have found stolen ChatGPT credentials for sale on dark-web forums, often obtained through malware or password reuse rather than a hack of the AI itself.
Experts advise using strong, unique passwords and multi-factor authentication, and avoiding entering any sensitive personal or client information into AI platforms.
The NSW Reconstruction Authority said it had since tightened internal systems, restricted staff use of unsanctioned AI tools, and added safeguards to prevent future incidents.