Welcome back to Neural Notes for 2025. For those of you who are new here, this is a weekly column where I look at how AI is affecting Australia.
In this edition, I chatted with the Australian Electoral Commission (AEC) about how it is tackling AI in the lead-up to the 2025 Federal Election.
Why the AEC is staying AI-free
According to the Australian Electoral Commission's senior media officer, Alex Morris, the agency has no plans to use AI tools in the upcoming federal election.
"It's not something that we're looking at at this stage," Morris said during a call with SmartCompany.
Instead, the organisation is doubling down on traditional methods, relying on public servants and established processes to maintain election integrity.
Central to the AECโs mission is maintaining political neutrality, even as it navigates a rapidly changing technological landscape.
Every member of AEC staff signs a declaration of neutrality, ensuring that their work remains unbiased and focused solely on upholding the law.
This neutrality extends to how the organisation handles misinformation, with its efforts concentrated only on claims about the electoral process rather than on political campaigning.
Countering AI misinformation
While the AEC is steering clear of AI tools itself, it's important to note that using AI-generated content, including deepfakes, in campaigning is not illegal in Australia.
"At the end of the day, AI… doesn't change the basic rules around campaigning. So you still need to authorise content," Morris said.
"We want to make sure that when anyone is communicating to voters about how the act of casting the vote works, that the information is accurate."
This cautious approach also comes in the wake of controversies surrounding AI use in recent elections globally.
In the 2024 US presidential election, a series of AI-generated deepfake videos were circulated widely on social media, causing confusion among voters and prompting urgent fact-checking efforts.
One notable incident involved an AI-generated robocall impersonating President Biden, urging New Hampshire voters not to participate in the primary election.
Misinformation, whether intentional or a result of AI hallucinations, is likely to see an uptick in the lead-up to the Australian federal election.
To combat this, the AEC is evolving its long-running "Stop and Consider" campaign to include AI-related messaging. This initiative encourages voters to pause and verify the accuracy of suspicious or emotionally charged content before sharing it.
There has been debate about whether stronger regulations are needed to mitigate the risks of AI-generated content during elections.
While this seems like an obvious step in the right direction, Morris says it would also strain the AEC's resources.
Just last year, the AEC admitted it lacked the technical capability to combat AI misuse during the 2025 federal election.
"We've been fairly upfront with the parliament in the past that we think there might be some resourcing implications if there were to be some form of ban on AI content in general," Morris said.
"It's not necessarily an argument against it. It's just a consideration that would need to be there."
AEC's conversations with Big Tech
While the AEC won't be employing AI itself, that doesn't mean it isn't paying attention to AI's growing role in elections.
The organisation has been in active discussions with major tech companies like OpenAI, Meta, and Microsoft, encouraging them to implement safeguards for their AI tools. This is pertinent given the growing use of AI chatbots to search for information.
On election day in the US back in November, Big Tech's chatbots were widely used to look up voting information, with decidedly mixed results, highlighting the need for voters to be pointed towards official channels of communication and information.
According to Morris, the AEC's priority is ensuring that tools like chatbots point users to reliable, official information as early as possible.
"There's always a danger for chatbots to hallucinate, and that's something that we're conscious of," Morris said.
For example, Australia's preferential voting system can confuse voters, especially if chatbots index information from state electoral commissions that operate under different rules.
The risk is that voters could unknowingly receive incorrect advice, a scenario the AEC is working to avoid by pushing platforms to prioritise redirects to its website.
Morris noted that some of these conversations have already led to tangible outcomes.
"You can already see now, if you start typing questions into chatbots… you're seeing some redirects to the AEC's content already, and suggestions that you contact the Australian Electoral Commission for more information," Morris said.
Testing this on ChatGPT, I received mixed results. When I asked when the election would be, it made a rough guess, followed by some election rules as well as links to media articles.
However, it didn't link to the AEC, and a disclaimer at the bottom of its answer referenced the US election, not Australia.
When I asked how to vote in the Australian federal election, though, it redirected to the AEC quite high up in its answer.
"I can't necessarily go into detail about how each conversation is going… the conversations that we're having are broadly productive and we're reasonably comfortable with where the conversations are going," Morris said.