The Australian Federal Police (AFP) has suspended the use of surveillance platform Auror after a freedom of information (FOI) request revealed that more than 100 of its staff had used the platform, in some cases for more than a year, without considering privacy or security implications.
Internal emails obtained by Crikey describe how AFP staff gathered information from retailers that hadn’t been reported to police and entered their own information into Auror’s systems, all without any agency guardrails for its use.
So little was known internally about the company, despite the platform’s widespread use, that one employee was left asking colleagues whether it had been reviewed for use and whether anyone knew who owned the technology.
Auror is a “retail crime intelligence and loss prevention platform” owned by a company of the same name whose retail customers use it to collect and share information such as CCTV footage and licence plate data with each other and police about suspected crime in their stores.
Reportedly used by 40% of Australian retailers, including Woolworths and Bunnings, Auror uses machine learning, according to the company, to aggregate data sources to investigate alleged crimes and, it claims, even “prevent crime before it happens”. Earlier this year, Crikey reported on privacy concerns about a privately owned, widespread surveillance network accessible to police without any oversight.
Now emails released to Crikey in an FOI request show that AFP staff were quick to take full advantage of the trove of data available through Auror long before the agency’s higher-ups became aware of its use, carried out privacy and security reviews or formally partnered with the company.
In response to questions from Crikey about the AFP’s use of Auror, a spokesperson said it had suspended its use until a privacy assessment was finalised.
Auror director of strategic communication and engagement Vanessa Wills told Crikey the platform was designed to ensure privacy and security: “Using privacy-by-design principles, we have worked with law enforcement agencies globally to deliver a platform with robust safeguards built in at every step”.
‘Get Started on Auror!’
AFP staff began using Auror as early as 2021. On November 7 that year, an AFP staff member emailed Auror staff to ask about the platform’s functions: “I was looking at the track vehicle function in Auror and had a couple of questions. Do you have a list of which organisations in Canberra run the ANPR [automatic number plate recognition]?”
A major feature of Auror is its integration of ANPR. According to the company, Auror monitors number plate data in real time and retains it for 60 days. This means users can flag a number plate of interest and instantly find out if and where the vehicle has been seen in the past two months, or receive an instant notification if it turns up in the future. Just this week, customers at Bunnings spotted new signs alerting them to the use of this technology.
Emails from Auror staff appear to promote to police more than 10 ACT locations that were using ANPR cameras as of July 27, 2022 (the locations were redacted as part of the FOI process). Another email from Auror boasts that the technology can be used even when a vehicle is not associated with an alleged crime.
“A vehicle does not have to be linked to a store-reported event to be searchable on Auror,” said an Auror employee on December 13, 2022.
The documents contain dozens of emails signing up users for Auror accounts and several calendar invites to discuss its use. The emails don’t say how Auror staff first made contact with AFP employees, although some AFP staff mentioned they would recommend the platform to others, and others wrote that they’d met with Auror at the Australia New Zealand Policing Advisory Agency (ANZPAA) conference.
Emails show both Auror and AFP staff commenting on the AFP’s use of the platform. Some suggest AFP staff had been actively contributing data to the platform and that police were using it to obtain information that retailers hadn’t reported to them.
“I notice you have been providing some great information and updates on the platform and helping to identify some POI’s [persons of interest]!” an Auror staff member said in an email on April 11, 2022.
“It’s been a great intel gathering system for me. I’ve found some incidents are placed on Auror but not reported to police,” an AFP employee replied a few hours later.
AFP use prompts cybersecurity questions
One major hurdle for the company was that the AFP’s IT services appeared to intermittently block emails from Auror. The company’s staff got around this by providing accounts with a generic password.
Towards the end of 2022, AFP staff began to wonder how many employees were using Auror — so they asked the company. An October 31, 2022, email from Auror said there were 118 AFP employees with active accounts; “most” of them were from the AFP’s ACT policing teams, according to a police staff member.
It was around this time that AFP staff began to wonder about their use of Auror. A November 30, 2022, email chain between AFP employees shows them asking each other for information about the platform, to little avail.
“Do you know how you got onto Auror? Did it go through any security review? Do you know who owns it?” one employee asked. Emails from February 2023 show an AFP staff member starting the cybersecurity review process.
In early 2023, an email from Auror suggests it had restricted the AFP’s use of some features until the agency developed a policy around the technology’s use as part of a formal agreement with the company. In response to an email from an AFP staff member about not being able to use the ANPR feature, an Auror staff member said that it had been “turned off”.
“Key liaison personnel within your organisation know this decision and the importance of ensuring the internal governance and guidelines for using these features are in place and detailed within a partnership agreement,” it said.
“I had put this forward to our ACT and national futures teams last year but apparently they work slowly,” the AFP employee replied glumly.
Auror, police respond to privacy concerns
Wills said a formal partnership between Auror and police forces was not required as approved staff already agreed to the company’s terms of use. She said police could use it to review incidents, download evidence (rather than using a physical CD or USB), and communicate with retailers.
“The platform is the modern-day equivalent of retailers driving to the police station to report a crime,” she said.
The AFP’s ACT chief police officer Neil Gaughan made a similar argument when he was questioned about it in Senate estimates in May, after Crikey’s investigation into the platform: “We treat it in the same way we treat ingestion of a large number of CCTV capabilities across the territory”.
Gaughan also rejected a comparison between Auror and the controversial facial recognition technology from Clearview AI but acknowledged the AFP had not yet created a privacy impact statement.
In 2020, the AFP faced scrutiny after it was revealed staff had trialled Clearview AI despite the agency’s denials. The Australian Information Commissioner found that Clearview AI had broken Australia’s privacy laws and that the AFP had interfered with Australians’ privacy by using it.
This article was first published by Crikey.