Controversial facial recognition technology highlights the need for in-house lawyers who are trained in the legal risks of emerging technologies
Published 4 July 2022
Technology continues to transform the practices of every organisation around the world.
But if we look specifically at the legal profession, these digital transformations can bring unanticipated legal and ethical risks. In-house legal departments used to be viewed as a hallowed black box and, although it was unclear what they did, their decisions were highly regarded.
In recent years, in-house lawyers have extended into new areas of business, like environmental, social and governance (ESG), where they play a larger role in shaping and influencing the rest of their company.
General counsel are now devoting more time to strategic work and less to the day-to-day running of the business.
As Janet Taylor-Hall, chief executive of the legal service provider Cognia Law, puts it, in-house lawyers now engage with the sales team or procurement department, sitting down with them to develop templates, contracts or negotiation rules.
Most importantly, there’s been a major change in the deployment of technology by in-house legal departments to gather data. Analysing this data can identify emerging trends and provide early notice of larger problems, like an increase in the number of new lawsuit claims.
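To make that concrete, here is a minimal, purely illustrative Python sketch of the kind of early-warning analysis described above. The claim records, matter types and the two-standard-deviation alert threshold are all assumptions for the example, not anything drawn from a specific legal department or report.

from collections import Counter
from datetime import date
from statistics import mean, pstdev

# Hypothetical matter records: (date opened, matter type)
claims = [
    (date(2022, 1, 14), "employment"),
    (date(2022, 2, 3), "privacy"),
    (date(2022, 3, 21), "privacy"),
    (date(2022, 4, 7), "contract"),
    (date(2022, 5, 2), "privacy"),
    (date(2022, 5, 19), "privacy"),
    (date(2022, 6, 1), "privacy"),
    (date(2022, 6, 9), "privacy"),
    (date(2022, 6, 24), "privacy"),
]

# Count new claims per calendar month
monthly = Counter((opened.year, opened.month) for opened, _ in claims)
counts = [monthly[month] for month in sorted(monthly)]

# Flag the latest month if it sits well above the historical average
history, latest = counts[:-1], counts[-1]
threshold = mean(history) + 2 * pstdev(history)
if latest > threshold:
    print(f"Early warning: {latest} new claims this month (threshold {threshold:.1f})")

In practice, legal operations teams would run this sort of analysis over matter-management or e-billing data, but the principle is the same: aggregate, compare against the historical baseline, and surface anomalies before they become larger problems.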
A 2021 report by the Corporate Legal Operations Consortium (CLOC) shows that around two-thirds of participants (66 per cent) said their organisation views the legal department as fully integrated into its strategy and business.
Only three per cent of respondents said their in-house legal department provides strictly legal advice.
Australian businesses are now at significantly higher risk of cyber attacks because of their reliance on web-based applications. Yet only 54 per cent of companies have a comprehensive cyber-threat response strategy that involves their legal department.
Cyber breaches can result not only in significant financial loss but also in reputational damage. So, the intervention of a competent legal team can mitigate those risks.
As an example of one of these privacy breaches, Human Rights Watch (HRW) has accused the Adobe Connect application and Minecraft: Education Edition of violating their privacy policies and collecting children’s personal data for non-educational purposes.
This highlights the need for in-house lawyers with extensive understanding of technology, who are able to predict potential legal breaches and guide their employers throughout the early stages of new technology development.
Australia’s Privacy Act, currently under review, is out of date in the digital age.
While legislators attempt to amend and supplement the relevant legislation to make it more adaptable to technological change, lawyers continue to face legal challenges arising from digitalisation, and it’s hard to say whether these adaptations of the law really go far enough.
Recently, consumer group CHOICE investigated whether Australian retailers Kmart, Bunnings and The Good Guys were violating the Privacy Act by using facial recognition technology to capture and collect a “faceprint” of customers who entered selected stores.
CHOICE found that around 76 per cent of consumers were unaware that businesses used video cameras for this purpose, and likened stores using facial recognition technology in this way to “gathering your fingerprints or DNA every time you shop”.
Under the Privacy Act, biometric data gathered by facial recognition technology is considered sensitive personal data. Clearly, existing laws are lacking when it comes to protecting us from potentially harmful facial recognition technology.
In these cases, it’s not clear whether the decision to deploy this technology was run past the legal teams of the organisations involved, whether those legal teams under-estimated the potential public backlash, or whether they lacked sufficient understanding of the nuances of facial recognition to recognise the risks. Perhaps it was some combination of all three.
In any case, closer involvement from a better-informed legal team might have raised concerns earlier, or led these organisations to inform their customers in a more transparent way.
It is important to examine how in-house lawyers become involved in procuring, designing, deploying and overseeing emerging digital technologies in their firms, and how they mitigate the associated risks.
What is unclear is whether current legal education and training is sufficient for lawyers providing advice on emerging technology.
Firstly, for many new applications of emerging technology, several areas of existing legislation may apply, but most of this legislation was designed without these technological advances in mind.
Secondly, there is often no precedent for the issues emerging technology raises. This puts lawyers in the position of needing to understand the implications of emerging technology and provide legal and ethical advice without clear legislation or precedents.
Only once lawyers and law students understand the underlying legal and technical frameworks of emerging digital technology and are well-equipped to respond to technology-related issues will they be able to mitigate the legal risks associated with them.
The Centre for AI and Digital Ethics (CAIDE) is launching a new project to ensure that lawyers understand the underlying legal and technical frameworks of emerging digital technology. The four-year project New Legal Thinking for Emerging Technologies is funded by the Menzies Foundation under the Ninian Stephen Law Program.
The first phase is to understand how lawyers currently address these problems. If you are an in-house lawyer who has given legal or ethical advice on technology, we invite you for a one-hour online interview. Please contact Dr Fahimeh Abedi abedi.f@unimelb.edu.au if you are interested and available for an interview before 28 July 2022.
Banner: Getty Images