AI & elections

Artificial intelligence (AI) is increasingly being used to develop various forms of communication. The use of AI has societal benefits, including for elections, but it also has some obvious potential negative impacts. Recent community concern includes the impact of AI on people’s ability to determine whether what they’re seeing or hearing in election campaign communication is real.

This means that voters need to be made aware of the potential impacts of AI on elections and provided with information and tools to help them more deeply examine the information they see, hear and read.

AI for good

It is easy to overlook the potential benefits of AI in the electoral environment, but these are starting to emerge in recent elections around the globe. This includes the potential to enhance voter inclusion through easier translation of educative material into other languages. Campaigners with fewer resources have also commented that AI gives them a greater opportunity to communicate with voters more widely and easily.

The efficiencies and benefits of AI are still to be realised and will continue to evolve with the technology.

AI communication channels

AI can be used to generate content for just about any communication channel – this includes written, audio or visual media.

A chatbot is a tool that uses AI technology to answer questions or hold a conversation with a user. The technology that powers chatbots can be complex, but on a very basic level they are similar to the autocomplete function on a mobile phone.
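The autocomplete comparison can be illustrated with a deliberately simple sketch. Production chatbots use large neural language models, but the basic idea – predicting a likely next word from what came before – can be shown with nothing more than a frequency table. The example corpus and words below are invented for illustration.

```python
from collections import Counter, defaultdict

# Toy next-word predictor, loosely analogous to phone autocomplete.
# Real chatbots use large neural language models, but the core idea --
# predicting a plausible next word from the words so far -- is similar.
corpus = (
    "voters enrol to vote . voters attend a polling place . "
    "voters mark the ballot paper"
).split()

# Count which word tends to follow which in the corpus.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the most frequently observed word after `word`, or '' if unseen."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else ""

print(predict_next("."))  # '.' is most often followed by 'voters' in this corpus
```

A model like this has no understanding of the text – it only reflects statistical patterns in what it was trained on, which is one intuition for why chatbots can confidently produce incorrect output.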

Chatbots can be a helpful tool in everyday life for a range of information and services. They can offer availability, timeliness and other benefits that would otherwise not be there.

While chatbots can use natural-sounding language (some even allow voice chat), it’s important to remember that they are not alive and do not think or feel. While often helpful, chatbots have a documented history of ‘hallucinating’ – confidently presenting incorrect information as fact.

Due to the risk of chatbots hallucinating information about Australia’s electoral system, the AEC has reviewed major chatbot results relating to federal election processes. In addition, the AEC is engaging with relevant companies in an effort to have chatbot results direct users to the AEC’s website for information.

A robocall, sometimes called a bulk phone call, is a campaign in which voters receive a phone call featuring a pre-recorded political message.

Robocalls are not a new feature of Australian elections. However, the introduction of AI technology means that robocalls are now easier and cheaper to produce. AI technology can be used to produce audio that sounds similar to a real human voice. AI technology can also be used to ‘copy’ somebody’s voice – this is called a deepfake.

The AEC does not provide voter phone numbers to political campaigners and has no information about how phone numbers are obtained for either robocalls or bulk text message campaigns. Regardless of whether an electoral robocall uses a human voice or is produced using AI technology, it must include an authorisation message at the beginning of the call. This authorisation message usually needs to contain the name of the person or entity who approved the communication, as well as their address.

Deepfake video is a video that depicts somebody doing or saying something that they did not do or say. Deepfake video is often paired with deepfake audio, with the fake video adjusting the movement of an individual’s head and mouth to match the deepfake audio being used.

Some deepfakes might be obvious – many are not actually trying to deceive people but rather to make a point, and some are even designed as obvious satire. Other deepfakes can be more subtle and could be designed to mislead voters into thinking that a person did or said something that never happened.

While the technology for deepfakes has been around for a few years, recent advances in AI technology have made it a lot easier for individuals to create them. 

The use of the technology is not banned in election campaigning. However, like any electoral communication, the legality of creating and distributing a deepfake depends on the content. Political parties or campaigners may opt to voluntarily disclose when an AI tool has been used to manipulate a video by including a simple message in the video.

The Electoral Act requires electoral communication like videos and audio recordings to be authorised by the individual or entity communicating. If these authorisation messages aren’t present, the AEC can investigate and take further action.

A manipulated image is an image that is intended to look realistic, but which has been modified in some way to mislead viewers. While recent advancements in AI technology have made it easier to manipulate images, manipulated (or ‘Photoshopped’) images have been a feature of public debate for several years.

There is no requirement for political parties or campaigners to disclose that they are using a manipulated image, or that an image was manipulated using an AI tool. Political parties or campaigners may opt to disclose when an AI tool has been used to manipulate an image, by including a simple message in the image.

If an image (regardless of whether it was manipulated or not) is published to a social media account or a website for the purpose of campaigning for the federal election, that social media account or website must feature an authorisation message.

Falsified, or deepfake, audio is an audio recording that depicts somebody saying something that they did not say.

Some deepfakes might be obvious – many are not actually trying to deceive people but rather to make a point, and some are even designed as obvious satire. Other deepfakes can be more subtle and could be designed to mislead voters into thinking that a person said something that never happened.

While the technology for deepfakes has been around for a few years, recent advances in AI technology have made it a lot easier for individuals to create them. 

There are a number of ways that falsified audio can be used to campaign in an election, including for translated messages or robocalls. As with other forms of electoral communication, depending on the message and format, audio about a federal election will require an authorisation statement so people know the source of the message.

Political parties or campaigners may opt to disclose when an AI tool has been used to manipulate audio, by including a simple message in the audio.

Emerging impact

The use of AI in election communication is an emerging and dynamic space.

More than 60 countries (around half of the world’s democratic nations) conducted national elections in 2024. There have been examples in recent elections of false videos pretending to deliver messages from candidates, and of robocalls misleading voters about how to participate. The impact is hard to quantify, but to date there has been no evidence that the use of AI in election communication has been the determining factor in an election result.

During a federal election in Australia, it is reasonable to expect that AI will be used in election communication – both in ways that are not a cause for concern and in material that could endeavour to mislead voters about either the voting process or the candidates in the election.

Recent research external to the AEC is also dedicated to this topic.

Regulation & safeguards

There are many individuals, organisations and institutions that contribute to the communication environment around elections, either directly or indirectly.

Tech organisations

Many organisations assess their responsibilities and implement initiatives to combat the potential negative impacts of technologies like AI.

In 2024, many of the world’s leading technology organisations were signatories to the Tech Accord to Combat Deceptive Use of AI in Elections. The accord represents a set of commitments to deploy tools and technology countering harmful AI-generated content meant to deceive voters. The AEC continues to work with these tech companies regarding progress on their agreed obligations with respect to deepfakes.

The Coalition for Content Provenance and Authenticity (C2PA) is a related joint development between several of the world’s leading technology and AI companies with the aim of developing technical standards for the source and history of media content.
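The idea behind content provenance standards can be sketched in a drastically simplified form. The real C2PA specification defines cryptographically signed manifests with far more structure; the toy version below only shows the tamper-evidence principle of recording a hash of the media bytes alongside a claimed source. The source name and media bytes are invented for illustration.

```python
import hashlib
import json

# Drastically simplified sketch of content provenance. The real C2PA
# standard defines signed manifests with much richer history; this toy
# version only records a claimed source and a hash of the media bytes,
# so any later alteration of the bytes is detectable.

def make_manifest(media: bytes, source: str) -> str:
    """Record a claimed source and a SHA-256 hash of the content."""
    return json.dumps({
        "source": source,
        "sha256": hashlib.sha256(media).hexdigest(),
    })

def verify(media: bytes, manifest: str) -> bool:
    """Check that the media bytes still match the recorded hash."""
    record = json.loads(manifest)
    return hashlib.sha256(media).hexdigest() == record["sha256"]

original = b"campaign video bytes"          # hypothetical media content
manifest = make_manifest(original, source="Example Campaign HQ")
print(verify(original, manifest))           # unmodified content verifies
print(verify(b"edited bytes", manifest))    # any alteration is detectable
```

Note that a hash alone does not prove who made the content – that is why the actual standard relies on digital signatures, so that the manifest itself cannot be forged or swapped.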

Current legislation

There is no prohibition of the use of AI in election campaigning under the Commonwealth Electoral Act 1918. However, electoral laws do require certain campaign communication to feature an authorisation statement, so people know the source of the communication.  More information is available from our Authorisations Better Practice Guide.

There is also a criminal offence in the Electoral Act for misleading or deceiving an elector in relation to casting a vote. This section of the Act has been interpreted tightly by the courts as applying to information that could impact how a voter completes their ballot paper, and to the voting instructions, during the election period. If AI-generated communication misled voters in this way, the creator and communicator could be subject to criminal charges.

Parliamentary consideration

Parliament has been considering the issue of AI in communication through the Senate Select Committee on Adopting Artificial Intelligence. The AEC appeared before this committee in May 2024 and provided a submission.

Discussions with the AEC centred on the potential impact of certain uses of AI, as well as matters like the potential value of broader, whole-of-government digital literacy campaigns or requiring the labelling of AI-generated content. Whether regulation around the use of AI in election campaign material is introduced in the future is a matter for Parliament.

Education and communication

Each election, the AEC runs a digital literacy campaign called ‘Stop and Consider’. This campaign encourages people to “get tips” on how they consume information, linking them back to educational resources on the AEC website. The campaign has been expanded for the 2025 federal election and addresses new topics such as AI.

The AEC also maintains a prominent social and mainstream media presence, as well as other online educational material that includes a voter’s guide to election campaigning and a register of disinformation about election processes. These tools are provided by the AEC to assist voters in thinking critically when consuming campaign material. In addition, the AEC is developing a suite of digital literacy learning resources for use by community organisations and others.  

Outside of the AEC, there is also a range of educative content available from organisations and others in the community about AI and deepfakes specifically.

Reporting avenues

Some tech organisations and platforms have their own AI detection and content verification tools. There are also a range of online reporting avenues.

Links to these tools are available in the industry resources section of the AEC’s voter’s guide to election campaigning.

AEC environmental assessment (January 2025)

This webpage provides a range of contextual information on the topic of AI in elections that is designed to assist voters and other stakeholders in their own thinking. As outlined on this page, the AEC notes that for the 2025 federal election deepfakes in political communication are not illegal and, in many cases, are likely not to be used in an unethical manner. However, the general purpose of some deepfakes can be to deceive the person consuming the message. AI as an emerging medium makes that potential deception of voters a greater concern than in the past, when communicators had fewer, and less sophisticated, tools at their disposal.

Deepfake political communication that meets the definition of electoral matter (paid content or from a political participant) has some protection in the form of authorisation requirements. Voters are at least provided with a critical piece of information: the source of the message. This gives the voter the opportunity to assess the context in which the information is being provided, as they form their view about its purpose and potential accuracy. Of course, not all political communication requires an authorisation, but the law does cover most forms of communication likely to be a prominent part of the election environment.

Voters should exercise a healthy degree of scepticism and caution if they see or hear political communication where:

  • the source is unknown or can’t be identified,
  • the information is trying to invoke an emotional response,
  • it sounds or looks unusual,
  • it can’t be easily verified through other sources,
  • it sounds and looks too good to be true,
  • the pictures or video look constructed, altered or artificial,
  • it depicts someone doing something that is unusual, out of the norm, or out of character,
  • it does not contain any labels or warnings,
  • there is broad media reporting raising doubts about the accuracy and/or legitimacy of the information or its source,
  • the method of dissemination can’t be trusted or easily verified.

The AEC’s Stop and Consider campaign, and supporting digital literacy material, has been increased significantly for the 2025 federal election. As citizens continue to adapt their consumption habits to modern communication methods, it is important that they are assisted by authorities and communicators alike in performing that important individual task.

Updated: 29 January 2025