Australia’s leading use of digital technologies in youth mental health services could help inform the Privacy Act review, with lessons for mental health services along the way
Published 25 October 2021
In 2018, a Melbourne high school mistakenly released the personal digital records of hundreds of students.
Among the records were details of students’ “mental health conditions, medications, learning and behavioural difficulties”. This was bad enough, but globally, breaches of privacy like this can be much worse.
Last year in Finland, the counselling records of around 30,000 people were hacked. The hacker(s) attempted to extort victims with the threat that the records would be publicly released. The records included details of family abuse, sexuality hidden from others, and suicide attempts.
‘Accidents’ like leaking and hacking aren’t the only issues.
Data concerning mental health is increasingly being monetised. Privacy International recently analysed 136 popular mental health websites and apps (all of which are accessible to young people) and found that 76 per cent of them included trackers or third-party code that enables data to be sold to advertisers.
These ‘data harms’ require urgent attention to ensure young people and their families can access high-quality crisis support online in a safe and secure way.
One key to creating trustworthy online support is robust legal regulation to govern data use and privacy. Australia is often positioned as a global leader in mental health service provision, particularly for young people.
This expertise could be leveraged as part of the Privacy Act review, and an upcoming Bill on ‘Enhancing Online Privacy’, to ensure that Australia has fit-for-purpose privacy and data protection regulations to safeguard children and young people’s lives online.
Likewise, the mental health sector could benefit by being better equipped to address its own emerging issues of data governance.
Children have the right to privacy and adequate data protection. The newly minted UN General Comment on children’s rights in relation to the digital environment explicitly outlines how the reckless use of children’s data can ‘result in violations or abuses of children’s rights’.
Harms concerning big commercial social media platforms, for example, include children being pushed towards extreme content, automated notifications that interrupt sleep, and children being targeted with ‘potentially harmful commercially driven content’.
Not all these risks are confined to data concerning distress or health, but such data are particularly sensitive. For example, in 2017 Facebook Australia was found to be boasting to advertisers about its ability to use children’s personal data to identify and target children who were feeling “worthless” and “insecure”.
Again, in April this year, Facebook was found using data to target children interested in extreme weight loss, alcohol or gambling. And just last month, the explosive ‘Facebook Files’ revealed that the company’s internal research indicated its Instagram platform contributes to eating disorders and suicidal thoughts in teenage girls.
These troubling forms of data production require attention by lawmakers, not least given their impact on children’s health and development.
For this reason, the upcoming Privacy Act review relates directly to the interests of young Australians, their families, and the youth mental health sector. The Privacy Act is the principal piece of Australian legislation protecting the handling of personal information about individuals.
The Attorney-General’s Department, in response to ACCC recommendations, has announced an industry-drafted Code for social media and online platforms that will provide guidelines around processing children’s data.
But industry codes alone will never be enough. Robust legal and regulatory frameworks, including amendments to the Privacy Act, are required to address the risks of data harm in the rapidly changing digital ecosystem.
This might seem outside the remit of youth mental health organisations and service user and family advocacy bodies. But these groups have a unique role to play in ensuring that data concerning mental health, and more broadly children and young people’s experiences online, are protected in line with community standards and social expectations.
Mental health-related data is a highly sensitive subset of data concerning health, which itself is already sensitive. It’s one thing for a childhood diagnosis of bronchitis to be entered onto an electronic record, and quite another for a diagnosis of childhood mental illness to be held in digital perpetuity.
In Australia, efforts to digitise and virtualise mental health services have been ramping up for several years and have accelerated under pandemic conditions. The recent Royal Commission into Victoria’s Mental Health System embraced online options for mental health-related support, a move that is likely to set the direction of national reform.
Some efforts to digitise mental health services raise ethical challenges that must be carefully considered. For example, up to 20,000 Australian high school students will have their phone data monitored – with their consent – for up to five years in an attempt to track how mental health issues develop in adolescence.
The study aims to test whether smartphones can help deliver preventive interventions on a large scale. This includes using movement and location tracking to identify triggers for the development of mental health symptoms.
Navigating the ethical and legal challenges raised by these developments – like the way biometric monitoring engages privacy and other human rights – is harder when privacy and data protection laws haven’t kept pace with technological developments.
The Australian Human Rights Commission’s recent report into new and emerging technologies explored these issues and made the case for comprehensive reform to Australia’s privacy laws.
Others have suggested that human rights approaches tend to be overly individualistic, and that attention is needed instead on the use of behavioural data as a democratic resource, an approach that would challenge the ability of Big Tech companies to hoard and commercialise population data.
Regardless, the Privacy Act review and the upcoming Bill on ‘Enhancing Online Privacy’ provide an opportunity for data governance and youth mental health experts, and children and young people themselves, to come together and contribute to a much-needed regulatory framework.
Australia’s prominent role in applying digital technologies in the mental health context, and particularly in youth mental health services, could be leveraged to inform regulatory approaches to children and young people’s data.
The contribution could go both ways, with data governance expertise informing trends to digitise health and social services.
A revamped Privacy Act and a strong Code that put children’s best interests at the heart of data production, processing and digital innovation would be a vital step in protecting children and young people in the digital age and improving national data governance for all.
Banner: Getty Images