The privacy paradox: Why we let ourselves be monitored
Digital virtual assistants make life more convenient, but we are trading off our privacy. Here are some simple steps to keep the AI where you want it
On a normal day for many of us, Alexa, Cortana, Siri or our digital assistant of choice will helpfully tell us what the weather is while we get ready for work. When we return home, they will play us some music.
But increasingly, Silicon Valley’s virtual assistants are keeping us company pretty much all the time. They are probably listening to our meetings, noting our routines, and video-calling our friends and family. Meanwhile, applications like Fitbit and Strava are tracking our exercise patterns.
These technologies learn and understand us by identifying patterns in our behaviour to better accommodate our requests, but there is often a trade-off in relation to our data, privacy and security. We have humanised them with names, and their typically female voices reassure us by playing on our biases, making these compromises seem less sinister.
Despite this, many of us still welcome the convenience and maybe even the apparent connection they offer – perhaps even more so at a time when many are isolated at home in the wake of the COVID-19 pandemic.
Why do we let this happen? What should we be doing differently?
This disconnect between our desire for privacy and our actual behaviour has been called the Privacy Paradox: we express concern about our privacy online, yet continue to use digital devices that have considerable potential to erode our privacy, and even our autonomy.
It seems that while we value privacy, the experience of proactively managing our digital privacy is often too difficult and time-consuming. In particular, the contracts setting out what digital service providers will do with our data are long and complex. We may also lack the expertise to manage the technical aspects of digital privacy protection.
There are also some behavioural biases at play. People are poor at assessing future risks, and either exaggerate or downplay them according to current experience. Also, due to ‘present bias’ – the desire for instant gratification – people tend to choose present gain over future benefits. This means that if a privacy risk is abstract, it will be downplayed, particularly in the face of a present reward – like the convenience of a voice-activated assistant, the pull of social media, or a response to a present threat.
Over 5 million Australians have downloaded COVIDSafe – the Federal Government’s contact tracing app – suggesting that the urgency of a real health crisis can override concerns about privacy and the long-term security of the data collected.
Conversely, if there are concerns and no immediate reward, people may avoid even low-risk applications like participating in the Census.
Even where we do try to be more proactive, navigating the technology itself can be daunting. Moreover, our relationship with digital services is determined by the terms, conditions and privacy policies they present to us – usually our only option is to take it or leave it.
But there are simple practical steps we can take to better manage our privacy:
- Delete accounts and associated apps that you no longer use, and delete personal data that apps record. Data that digital assistants record is stored on your account and can be deleted. For example, on a Google Home, go to ‘My Activity’ in your settings and delete your data; on an Amazon Echo, you can do the same through the ‘Manage my device’ settings.
- If you don’t want to read the terms and conditions, at least pay attention to what access an application is requesting – for example, to your microphone, contacts, camera or location – and consider whether sharing that access is reasonable for the service. For example, a video app on your phone will need access to the microphone and camera to record content.
- Regularly review the privacy settings on your services as the default settings often change.
- Turn off input devices like microphones and cameras when they aren’t in use.
- Turn off features that allow people to remotely access your device, like incoming calls on digital assistants, which can be dialled into without the owner knowing.
There are also some protections in the legal system for individuals concerned about privacy. For example, if a service promises to treat your data in a specific way, then they must do so. Failure to act in accordance with their promises is misleading and in contravention of Australian Consumer Law.
Recently, the ACCC brought an action in the Federal Court against Google for allegedly misrepresenting how it collects users’ location data and how individuals can manage it.
But there are also calls for more wide-ranging reform to both privacy and consumer protection laws to enhance privacy protection in a digital age.
Practical strategies and law reform can help to ensure we are spending quality time, not privacy-eroding time, with our devices.
Melbourne Law School is launching a new expert panel series focusing on the current and predicted effects of COVID-19. On 19 May 2020 a webinar will examine the justifications for the dramatically enhanced digital and physical surveillance and restriction we are experiencing in the wake of COVID-19.
The newly launched Centre for AI and Digital Ethics at the University of Melbourne is dedicated to the cross-disciplinary research of AI with particular focus on ethics, regulation and the law.