This election is a test of how AI-driven information could shape Australia’s democracy

A map of Australia filled with social media reactions
Banner: Created using generative AI

Australia is one of the most online electorates in the world. And in 2025, AI-driven algorithms are now shaping much of what we see and know about our elections.

    By Cory Alpert, University of Melbourne


    Published 17 March 2025

    5 min read

    The upcoming Australian election is a pivotal moment in global politics. It will be the first election in a major economy where a party that came to power in the post-COVID wave of anti-incumbency is now seeking re-election.

    The last electoral cycle saw pandemic-era incumbents removed across the world, as voters rejected the leaders who governed through lockdowns.

    Anthony Albanese speaks during the Labor Party election campaign
    The last electoral cycle saw voters reject the leaders who governed through lockdowns. Picture: Getty Images

    Australia was part of that trend in 2022, when the Liberal Party lost power to Labor.

    Now, as Labor seeks a renewed mandate, this election will reveal whether that anti-incumbent sentiment has passed or whether volatility remains the defining feature of modern democracy.

    But public resentment isn’t the only factor at play.

Artificial intelligence (AI) has had a hand in every election since social media platforms embedded it into their recommendation algorithms in 2015.

    This Australian election will be no different – except the stakes are higher.

Australia is among the most online electorates in the world: 99 per cent of Australians have internet access, 81 per cent use social media and 49 per cent rely on social platforms for news. The end result is that AI-driven algorithms, rather than traditional media, now shape much of what Australians see and know about their elections.

These algorithms value engagement over accuracy, setting aside what we used to imagine as journalistic ideals.

    The stories that go viral are the ones that elicit strong emotions – anger, fear, anxiety – but this doesn’t necessarily mean these stories are true. And that has consequences.
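The engagement-over-accuracy dynamic described above can be illustrated with a minimal sketch. Everything here is invented for illustration – the `Post` fields, the weights, and the scoring function are not any platform's actual formula – but it shows the structural point: accuracy simply does not appear as an input to the ranking.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    is_accurate: bool  # known to fact-checkers, but never consulted by the ranker

def engagement_score(post: Post) -> float:
    # Hypothetical weights: strong reactions (comments, shares) count for more
    # than passive likes. Note that post.is_accurate plays no role at all.
    return post.likes * 1.0 + post.comments * 3.0 + post.shares * 5.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

calm_report = Post("Measured policy analysis", likes=120, comments=10, shares=5,
                   is_accurate=True)
outrage_bait = Post("Shocking claim!", likes=60, comments=50, shares=40,
                    is_accurate=False)

feed = rank_feed([calm_report, outrage_bait])
# The emotive, inaccurate post scores 60 + 150 + 200 = 410;
# the accurate one scores 120 + 30 + 25 = 175, so it is shown second.
```

Under this (deliberately simplified) model, the false but emotive post tops the feed every time, which is the mechanism the article is describing.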

    Even when people know something is false, repeated exposure makes it feel true. AI doesn’t just shape what information we receive; it reshapes our perception of reality itself.

    One of the biggest challenges of the AI era is that truth has become fragmented.

    A close-up of a hand swiping through a news feed on a mobile phone
    AI-driven algorithms now shape much of what Australians see and know about their elections. Picture: Shutterstock

    We no longer live in a world where a few media outlets define the public conversation. Instead, millions of individuals, influencers and political players create and share their own narratives.

This can be positive – it allows for more perspectives and challenges to dominant viewpoints. But it also means there is no longer a single shared understanding underpinning democratic debate.

    Algorithms only exacerbate this fragmentation. They personalise feeds, isolating individuals in ideological bubbles. They amplify misinformation and disinformation, making it difficult to establish a common set of facts. And they erode a fundamental democratic principle: the ability to engage in a shared reality.

    Beyond algorithms, generative AI adds a new layer of complexity. It doesn’t just distort what people see – it changes how they interpret reality.

    AI-generated images and videos are not just fakes; they are something stranger. People don’t necessarily believe they depict real events, but they process them as representations of deeper, emotional truths.

    A clearly AI-generated image of a child wrapped in a flag being rescued from a burning building does not function as evidence but as an impression – an evocative rendering of how the world feels.

    This shift – from photographic realism to AI-generated imagery – highlights an expanding divide between what is real and what is true.

    In previous eras, misinformation relied on distorting real events; now, AI enables a more profound separation, where something need not be real to be accepted as true.

    AI-generated content can strip context from real events, detaching images and ideas from their origins and creating new narratives.

    Three rows of convincing-looking images of people created by AI
The realism of AI-generated images – like these created by Nvidia – highlights an expanding divide between what is real and what is true. Picture: Nvidia

    This decontextualisation accelerates the breakdown of a shared reality, allowing different audiences to construct entirely separate versions of truth based on the same AI-driven content.

    This leads to a deeper question: if AI increasingly mediates truth, then what do we mean by intelligence?

    One definition suggests intelligence is the ability to process vast amounts of data and draw connections.

    But that’s insufficient. Intelligence is not just about making predictions – it’s about understanding the limits of one’s knowledge. It’s the capacity for doubt.

    Doubt is fundamental to intelligence because it allows for growth, revision and self-awareness. AI, as it currently exists, does not doubt. It predicts, calculates, generates – but it does not question. It does not hesitate.

    And yet, we are shaping our understanding of truth around a system that lacks this fundamental characteristic of intelligence.

    These questions about AI, truth and agency aren’t just theoretical – they need to be talked about openly.

    We need to understand how these issues are affecting people in different communities, not just in tech hubs and political capitals, but in the places where everyday life is shaped by the quiet, accumulating influence of these systems.

    AI may reshape truth, but the responsibility to interrogate it remains ours.

    The Australian election is not just a test of political stability – it’s a test of how AI-driven information can shape democratic participation.

    The Treachery of Images Tour is travelling through regional Australia having conversations with people across the country, listening to their thoughts on AI and learning from their experiences. Cory Alpert and artist Scotty So aim to have these discussions where it matters – with the communities living through its impact. You can find out more about the tour here.
