Balanced regulation and better-resourced regulators would do more to protect Australians from online scams and fraud in the digital era
Published 10 January 2024
Open up your socials for more than a few minutes and you will soon come across a scam. But what can Australia do about it?
The influence of digital platforms like Facebook, X (formerly known as Twitter), Google and Instagram will keep growing in 2024 and, with it, so will the need to protect consumers from scams, corrupt apps and fraudulent reviews.
Australia’s regulation of digital platforms is already relatively strong. But there are still areas where people and small businesses are vulnerable to the substantial market power wielded by Big Tech.
In particular, these platforms function as gatekeepers to countless small businesses (many of which rely on online reviews for business) and to the range of digital services we access daily.
Ensuring a fair distribution of tech’s benefits while preventing the potential misuse of power is a delicate balance that needs even better regulation than we already have.
Our team has devised four priorities in our recent response to the Australian Government’s consultation on regulatory reform for digital platforms – aiming to keep Australians safer online.
When platforms are legally liable for user-generated content, they may over-moderate to reduce the risk that someone sues them. Conversely, if they are shielded from liability, they may under-moderate as they have a reduced incentive to police their site.
An over-moderating website might stop hosting user comments or reviews entirely, while an under-moderating one may leave scams up on its site.
Neither of these is a good outcome.
Striking the right balance is essential for effective digital platform management.
In our response, we support a ‘notice and takedown’ mechanism where platforms must promptly check user reports of harmful content and act accordingly.
Failure to respond would leave the platform liable, allowing complainants to hold it accountable for posts made by cybercriminals or organisations exploiting these vulnerabilities. This would encourage responsible digital platform management.
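As a rough illustration of the incentive this creates, here is a minimal Python sketch. The 48-hour window and the liability test are invented for the example; they don't come from the submission or any existing law.

```python
# A toy model of 'notice and takedown': the platform keeps its
# liability shield only while it reviews user reports on time.
# The 48-hour deadline is a hypothetical figure, not a statute.
from dataclasses import dataclass
from datetime import datetime, timedelta

REVIEW_DEADLINE = timedelta(hours=48)  # hypothetical response window

@dataclass
class Report:
    content_id: str
    reported_at: datetime
    reviewed: bool = False  # has the platform checked this report?

def platform_is_liable(report: Report, now: datetime) -> bool:
    """Ignoring a user report past the deadline forfeits the shield."""
    overdue = now - report.reported_at > REVIEW_DEADLINE
    return overdue and not report.reviewed

scam_report = Report("post-123", reported_at=datetime(2024, 1, 1, 9, 0))
print(platform_is_liable(scam_report, now=datetime(2024, 1, 4, 9, 0)))  # True
```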
We also recommend platforms adopt a standard set of pro-consumer privacy and security practices that should be jointly developed with regulators.
These would replace the current ‘notice and consent’ mechanisms for consumer protection. You will be familiar with these notices, even if you have never read them properly (or at all).
Think about when you download a new app on your phone. You may ‘consent’ to a platform’s poor practices by scrolling past the contracts, privacy notices and policies before clicking ‘I agree’ – signing away your rights.
Currently, these mechanisms are inadequate because most of us are unlikely to read the terms and conditions and the bulk of us are not experts in privacy law or contracts.
Any updated practices should be accessible, written in plain English and applied consistently across all platforms.
While Australian law is already relatively strong, there are areas where it could be strengthened, including adopting a framework where consent is not the be-all and end-all.
An alternative is a rights and standards-based regime, similar to the European Union’s General Data Protection Regulation (GDPR). Under the GDPR, users have a variety of rights that are enforceable by law.
One specific example is ‘purpose limitation’. This means that, in most cases, data can only be used for a reasonable purpose, one that a reasonable user would understand and expect given why the data was collected.
For example, your flashlight app can’t use your device ID to track your location, even if that use was buried in the terms and conditions.
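In software terms, purpose limitation behaves like an allow-list keyed to the reason the data was collected. The short Python sketch below illustrates the idea; the data types and permitted purposes are entirely made up.

```python
# A toy 'purpose limitation' check: each kind of data carries an
# allow-list of purposes declared when it was collected. The data
# types and purposes below are hypothetical, for illustration only.
DECLARED_PURPOSES: dict[str, set[str]] = {
    "location": {"navigation"},
    "device_id": {"crash_reporting"},
}

def use_is_permitted(data_type: str, purpose: str) -> bool:
    """Allow a use only if it matches a declared collection purpose."""
    return purpose in DECLARED_PURPOSES.get(data_type, set())

print(use_is_permitted("location", "navigation"))    # True
print(use_is_permitted("device_id", "ad_tracking"))  # False: the flashlight
# app can't repurpose your device ID, however deep the fine print goes.
```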
Already, the Australian Competition and Consumer Commission (ACCC) is moving in the right direction on this issue by devising a plan of action, in coordination with the public and the affected companies.
We should be proud of the work our regulators do, but they are currently outmatched.
These large platforms have money, technical expertise and armies of lawyers to fight unfavourable regulations.
The Government should level the playing field by giving regulators more resources, with a particular focus on technical research expertise. More technical researchers would help us all by studying the impact of the decisions tech platforms make.
This is crucial for better decision-making and effective regulation.
For example, regulators could scrape online reviews and run their own analysis of how effective fraud-protection mechanisms have been.
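As a hypothetical of what that review analysis might look like, the Python sketch below pulls reviews from a made-up page and counts those matching crude scam heuristics. The URL, page structure and phrases are all invented for illustration.

```python
# A hypothetical sketch of regulator-side review analysis. The URL,
# the page structure (each review in <div class="review">) and the
# scam heuristics are all invented, not taken from any real system.
import requests
from bs4 import BeautifulSoup

SUSPECT_PHRASES = ("guaranteed returns", "dm me to invest", "100% legit")

def fetch_reviews(url: str) -> list[str]:
    """Download a review page and extract the review text."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return [div.get_text(strip=True) for div in soup.select("div.review")]

def suspect_rate(reviews: list[str]) -> float:
    """Fraction of reviews matching the crude scam heuristics."""
    hits = sum(any(p in r.lower() for p in SUSPECT_PHRASES) for r in reviews)
    return hits / len(reviews) if reviews else 0.0

# reviews = fetch_reviews("https://example.com/product/123/reviews")
# print(f"Suspect review rate: {suspect_rate(reviews):.1%}")
```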
Or they could run experiments to see how users’ internet searches are affected by Google’s near-monopoly on web browsers. Think about it. If Google Chrome makes Google Search the default engine on your phone, to what extent does that change your search habits? Would more people switch to other search engines if Google weren’t the default option?
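And here is a toy version of how such an experiment’s results might be tested: a chi-squared independence test on made-up counts of users who kept or switched search engines. The numbers are placeholders, not real findings.

```python
# A toy analysis of the default-engine question: do users who got
# Google as the preinstalled default stick with it more often than
# users who picked an engine at setup? All counts are made up.
from scipy.stats import chi2_contingency

#                     kept Google   switched engines
observed = [[930, 70],    # Google was the preinstalled default
            [610, 390]]   # user chose an engine at setup

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.1f}, p = {p_value:.2e}")
# A small p-value suggests engine choice is not independent of the
# default setting, i.e. defaults really do shape search habits.
```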
These are questions that can be addressed through proper research. Our team is already working on this, looking at how regulators can incorporate empirical evidence into their decision-making processes.
Ultimately, it’s about using rigorous research techniques to give Australia’s regulators the capacity to distinguish true claims from false ones.
This, together with balanced regulation that goes beyond consent and clarity around the liability of digital platforms, has the potential to afford greater protection to consumers in the digital era.
Dr Suelette Dreyfus, Liam Harding and Dr Shaanan Cohney made up the team that submitted their response to the Australian Government’s consultation on regulatory reform for digital platforms. The ACCC’s inquiry into Digital Platform Services is ongoing until 2025.
Banner: Getty Images