
Neural Notes: Why the AEC is staying AI-free this election cycle

The AEC won't be utilising AI tools during the federal election, but it still has to tackle the technology's inevitable use in the democratic process.
Tegan Jones

Welcome back to Neural Notes for 2025. For those of you who are new here, this is a weekly column where I look at how AI is affecting Australia.

In this edition, I chatted with the Australian Electoral Commission (AEC) about how it is tackling AI in the lead-up to the 2025 Federal Election.

Why the AEC is staying AI-free

According to the Australian Electoral Commission's senior media officer, Alex Morris, the commission has no plans to use AI tools in the upcoming federal election.

"It's not something that we're looking at at this stage," Morris said during a call with SmartCompany.

Instead, the organisation is doubling down on traditional methods – relying on public servants and established processes to maintain election integrity.

Central to the AECโ€™s mission is maintaining political neutrality, even as it navigates a rapidly changing technological landscape. 

Every member of AEC staff signs a declaration of neutrality, ensuring that their work remains unbiased and focused solely on upholding the law. 

This neutrality extends to how the organisation handles misinformation, with its efforts concentrated on claims about the electoral process rather than political campaigning.

Countering AI misinformation

While the AEC is steering clear of AI tools, it's important to note that using AI-generated content, including deepfakes, for campaigning is not illegal in Australia.

"At the end of the day, AI… doesn't change the basic rules around campaigning. So you still need to authorise content," Morris said.

"We want to make sure that when anyone is communicating to voters about how the act of casting the vote works – that the information is accurate."

This cautious approach also comes in the wake of controversies surrounding AI use in recent elections globally. 

In the 2024 US presidential election, a series of AI-generated deepfake videos were circulated widely on social media, causing confusion among voters and prompting urgent fact-checking efforts.

One notable incident involved an AI-generated robocall impersonating President Biden, urging New Hampshire voters not to participate in the primary election.

Misinformation, whether intentional or a result of AI hallucinations, is likely to see an uptick in the lead-up to the Australian federal election. 

To combat this, the AEC is evolving its long-running "Stop and Consider" campaign to include AI-related messaging. This initiative encourages voters to pause and verify the accuracy of suspicious or emotionally charged content before sharing it.

There has been talk about whether stronger regulation is needed to mitigate the risks of AI content during elections.

While this may seem like an obvious step in the right direction, Morris says it would also strain the AEC's resources.

Just last year, the AEC admitted it lacked the technical capability to combat AI misuse ahead of the 2025 federal election.

"We've been fairly upfront with the parliament in the past that we think there might be some resourcing implications if there were to be some form of ban on AI content in general," Morris said.

"It's not necessarily an argument against it. It's just a consideration that would need to be there."

AEC's conversations with Big Tech

While the AEC won't be employing AI itself, that doesn't mean it isn't paying attention to AI's growing role in elections.

The organisation has been in active discussions with major tech companies like OpenAI, Meta, and Microsoft, encouraging them to implement safeguards for their AI tools. This is pertinent given the growing use of AI tools to search for information.

Big Tech's chatbots were widely used to search for information on election day in the US back in November, with decidedly mixed results. This highlighted the need to point voters towards official channels of communication and information.

According to Morris, the AECโ€™s priority is ensuring that tools like chatbots point users to reliable, official information as early as possible.

"There's always a danger for chatbots to hallucinate, and that's something that we're conscious of," Morris said.

For example, Australia's preferential voting system can confuse voters, especially when chatbots index information from state electoral commissions that operate under different rules.

The risk is that voters could unknowingly receive incorrect advice – a scenario the AEC is working to avoid by pushing platforms to prioritise redirects to the AEC's website.

Morris noted that some of these conversations have already led to tangible outcomes. 

"You can already see now, if you start typing questions into chatbots… you're seeing some redirects to the AEC's content already, and suggestions that you contact the Australian Electoral Commission for more information," Morris said.

Testing this on ChatGPT, I received mixed results. When I asked when the election will be held, it made a rough guess, followed by some election rules and links to media articles.


However, it didn't link to the AEC, and a disclaimer at the bottom of its answer referenced the US election, not Australia.

When I asked how to vote in the Australian federal election, however, it did redirect to the AEC quite high up in its answer.

"I can't necessarily go into detail about how each conversation is going… the conversations that we're having are broadly productive and we're reasonably comfortable with where the conversations are going," Morris said.
