Bringing democracy to the internet
Author and lawyer Lizzie O’Shea argues that we need to stop looking forward and start looking backwards in order to determine our digital future
CHRIS HATZIS
Eavesdrop on Experts, a podcast about stories of inspiration and insights. It’s where expert types obsess, confess and profess. I’m Chris Hatzis, let’s eavesdrop on experts changing the world - one lecture, one experiment, one interview at a time.
When we talk about technology we always talk about the future - which makes it hard to figure out how to get there. To claim our present, we need to make sense of the past.
LIZZIE O’SHEA
My name is Lizzie O’Shea, and I’m an author and a lawyer.
CHRIS HATZIS
University of Melbourne law graduate Lizzie O’Shea recently gave a public seminar at the Institute for International Law and the Humanities centred around her new book “Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Can Teach Us about Digital Technology.”
Lizzie says that we need to stop looking forward and start looking backwards. Weaving together histories of computing and social movements with modern theories of the mind, society, and self, Lizzie constructs a “usable past” that helps us determine our digital future. She shows us how very human our understanding of technology is, and what potential exists for struggle, for liberation, for art and poetry in our digital present.
Lizzie O’Shea sat down with our reporter Silvi Vann-Wall to discuss the themes in her new work.
SILVI VANN-WALL
What began your fascination with looking into the past to understand the future?
LIZZIE O’SHEA
Well, alongside my law degree I also did a history degree, so I’ve had a long-term interest in trying to understand the past as it’s relevant to our present. I recently completed a master’s at Columbia University, where I was lucky enough to get a scholarship to attend. I attended lectures with some academics who talked a lot about our digital future, and particularly about how the time in which we’re living now is a very important moment in relation to digital technology, because the rules of the digital age are still being set. A lot of this technology is still novel and new. Often it feels like it’s imposing itself upon society, that it’s unassailable, inevitable, that it has its own trajectory and its own agenda, that it’s just a phenomenon that exists and that we don’t have the opportunity to influence it.
But actually, given we are in the early stages of the digital and information revolution, I think this is the moment to start talking about how power is changing, and how we might have the ability to influence it. And that society is actually a group of people with agency and with desire to shape the world in their own interests, and that we have the chance by looking at how people have done this in the past, to do this in the digital age as well. That really became the motivation for the book.
SILVI VANN-WALL
Why do you think we have this idea that digital technology is just a growing thing that’s just sort of like evolving and existing on its own without stopping?
LIZZIE O’SHEA
Well, I think part of it is that it’s an extremely profitable industry, so the people who are leading these technology companies often think that what they’re doing is improving the world. When you look at some of the rhetoric that comes out of companies like Facebook and others, they talk about connecting people with this idea that that’s generally a good thing, that it’s beyond critique, that there’s no context for it, that it’s new and unprecedented and all of it’s good. Now, in the last couple of years, particularly since, I think, the US election in which Donald Trump was elected, people are starting to reflect on that, and it’s been commonly called the Techlash.
People are starting to realise how some of these ways in which we engage with technology are extremely divisive, and that that might also be a function of how the web has been monetised, and how technology is being used to make money rather than for more public-interest-focused activities, like connecting people for public participation in decision making, for example, or improving access to knowledge and reducing barriers to people being able to read texts and share information. But, in fact, the web as we know it currently is oriented towards profit, and so not everything it does is universally good. In fact, there are lots of ways in which we can critique it, and as we start to experience these negative consequences, this becomes the moment in which we can do so.
SILVI VANN-WALL
So perhaps now’s a good time to bring up the concept of surveillance capitalism. Could you explain that for the layman?
LIZZIE O’SHEA
Yeah, it’s a beautiful term, I think, because we often think about surveillance as coming from the state. That’s the other reason, I suppose, why this is a particularly important moment in the history of technology: we’re living in the wake of the revelations by Edward Snowden, which really exposed to the world just how large the reach of the surveillance state was. That’s often what people think about when they think of surveillance, that the state is doing this, watching us to monitor us for antisocial behaviour, terrorism and other kinds of criminal offences.
But what the term surveillance capitalism reveals, I think, is just how much that’s also a private endeavour. It was coined by an academic from Harvard called Shoshana Zuboff, who has written a very large text about this concept that took her many years to put together. What she’s saying is that companies surveil you all the time when you’re online, and they do that for a variety of different reasons: to make money from selling your data, and also to understand you as an individual, so they know what you might be sold as a consumer.
Also, once they’ve mapped the people who do participate in life online, it gives them the capacity to map out everybody’s behaviour. So consenting to terms of service, for example, is no longer an individual activity, because once somebody has done that, that data point can be used to map out people with similar features, as lookalike audiences. This is a very pervasive form of surveillance, and it’s done by private corporations for the purposes of selling things, which is not how surveillance is commonly thought of, I suppose, as we’re moving into this digital moment.
SILVI VANN-WALL
One of the core ideas of your book is the usable past as a way to understand the future, as we sort of mentioned before. So maybe I’ll bring up one of the examples, which is the establishment of the modern police force. How did that pave the way for this surveillance culture that we have?
LIZZIE O’SHEA
Well, I was thinking about this a lot. I do a bit of work in advocacy around digital rights. It’s so often the case that the surveillance state, or the agencies involved in state surveillance, will be asking for more powers, often out of all proportion to the threat that they’re trying to police. So even though terrorism looms large in our national consciousness, it’s not actually a threat that most people face in any meaningful way on a daily basis. Yet we resource these agencies, and we have a political narrative that supports them. I was trying to understand that as a concept, so I thought I would go back and look at what history can tell us about how policing has evolved over time.
It took me back to what I would argue was the first modern police force. There have been various ways in which police forces have come to be, but there’s an argument that the first modern police force was formed in about 1798 on the docks of the river Thames. Part of that project was by merchants, so private enterprise, coming together and subsidising a professional uniformed force who would surveil the workers unloading goods from the ships arriving in the Thames, bringing in all these goods from sites of colonialism.
So what we can learn from that is that it was initiated by private industry, that surveillance was a cheap and effective form of keeping people in line, of disciplining a workforce, and that really, it occurred at the moment at which the working class as a concept was beginning to take form. Those new social relationships were being developed in industrial capitalism, and the police were used to enforce that social division.
Now, modern policing has taken on whole new forms in all sorts of different ways, but I think that origin, of a force designed to surveil people in aid of preserving social division, is instructive for understanding why the surveillance state is so large today. I think surveillance today takes a very similar form in the sense that you’re never sure when you’re being watched. Sometimes it’s obvious to you, but often it’s not. That veiled threat can be a very powerful tool for suppressing radical thought, for pressuring people to conform, to avoid experimentation or exploring their full potential. So it’s got a very negative impact on our society, I think, in a variety of different ways, and I thought it was a history worth exploring to better understand our present.
SILVI VANN-WALL
I mean, more and more you hear about our data being used in various ways that we weren’t previously aware of. I think just recently, we had the Myki data scandal.
LIZZIE O’SHEA
Mm-hm.
SILVI VANN-WALL
Where we found that actually all our touch-on and touch-off points for these public transport cards were being recorded every time we used a train, a tram or a bus, and that we could potentially be identified with that data. Is being aware enough to steer things towards them not being so nefarious?
LIZZIE O’SHEA
Yeah, I think it is important to know. The other thing about that example, which I think is a really good one, is that the agency that shared that information did it as part of an activity - I think it was called a datathon - where the idea was that people could see what they could do with this data in an open way. It hadn’t occurred to Public Transport Victoria, I don’t think, that this information could be reverse engineered to identify people individually. When we think about privacy, for example - and this is a topic that I try to explore a bit in the book - that term is often used in a variety of different ways. Often there’s this sense that when you hand over your information to a government agency or a private corporation, if the data is de-identified, or separated from your name, that’s a form of privacy, when, in fact, that’s clearly not the case.
So with public transport data, even though each individual touch on and touch off might not, on its own, be linked to a real person because the name’s been removed, with a little bit of poking around on the internet it becomes very easy, very quickly, to reverse engineer that and identify us. So when we hand over data and somebody tells us it’s being kept private, I think we often don’t realise that it’s part of a vast web of data held by lots of different people, and that privacy is a much more expansive concept than just de-identification or anonymisation. It’s also much more expansive than just giving data to someone and expecting them to hold it securely against breaches. It’s actually about our autonomy and our ability to live our lives in society without the expectation that someone can use information we give away against us, for a purpose it wasn’t originally shared for.
SILVI VANN-WALL
This also links up to the way that digital technology and data collection can be used to reinforce the status quo in terms of classes and minorities and so on.
LIZZIE O’SHEA
Mm.
SILVI VANN-WALL
In your book you suggest that the only way to change that is to redefine what it means to be safe, and imagine alternatives to the police. How do we go about doing that?
LIZZIE O’SHEA
Well, I think that’s a really interesting idea, to be honest. There are lots of people working on these topics who are much better qualified than me. The purpose of me saying that, I suppose, was to show that it’s possible - that we don’t necessarily need the police to provide public safety. In fact, often what they do doesn’t actually protect public safety; it serves other purposes. So that’s the kind of provocation I’m getting at there. I think there are lots of interesting ways we could do this. One example I talk about in the book is called the Cease Fire program. It’s about identifying potential victims of gang violence in the United States, and using community interventions rather than police to try to address and prevent violence.
Now, that uses a bunch of data that you can collect from different places. There’s obviously potential for that to be misused, and in fact we’ve seen that program implemented in places like New Orleans and then shared with private corporations in a terrible way. I think it’s terrible to undermine confidence in a project like that. But in the right hands, the hands of community members and senior community members, you can see how it’s possible to have a group of people who can help protect public safety through early community interventions - not with the threat of incarceration or police violence, but by using social relationships better.
That’s the kind of idea of public safety that I would like to see improved: that we all have more time to work for and care for our fellow members of the community, and find ways to prevent violence, whether it’s gang violence, violence against women or other kinds of violence as well. Then we can build a stronger community that can prevent this violence, rather than relying on a professionalised force with the power of the state behind it, which leads to all sorts of terrible, devastating social consequences - which is how we have arrived at the modern carceral state.
SILVI VANN-WALL
I want to look at some more of the concepts covered in the book.
LIZZIE O’SHEA
Hm.
SILVI VANN-WALL
I was quite fascinated by the idea of the technological utopia, and that was a concept that was popular at the end of the nineteenth and start of the twentieth centuries. I think that’s right.
LIZZIE O’SHEA
Mm.
SILVI VANN-WALL
So it’s a concept that’s been around for much longer than I realised, and I know it’s not so popular now. So why hasn’t this idea of technology ultimately improving our lives become the case? What’s not working?
LIZZIE O’SHEA
It’s funny, because I think part of my claim here is that we often valorise, without criticism, people who think technological answers to problems are the best ones. So when we think of modern heroes today, often we think about people like Elon Musk, and I guess Mark Zuckerberg for a time. People often valorise Steve Jobs as the great architect of beautiful design. We think that engineers are the most important people. Often when we talk about climate change, we think science is the only real answer. And technocrats - Barack Obama considered himself a great technocrat, and that was the future of modern politics: getting the technology of politics correct, and using technology to do that as well.
So part of my criticism of these people is to say, well, we have to be very careful about how we think about technology and technocrats, and about valorising engineering as a way to solve social problems. I do that by looking back at these technological utopians you mentioned, who were very popular but are kind of forgotten now. Edward Bellamy, who was a writer, wrote a book called Looking Backward, in which a guy falls asleep and wakes up a century later. He sees a new society that has made use of technology: the acceleration of technological development has achieved this beautiful new society and escaped the misery of the industrial revolution.
But in that utopia there were also very serious limitations: often gender roles were prescribed, or there was a sense that hard work was the most important thing, that engineers, again, were to be lionised and valorised as people who knew how to run society properly. Essentially it was a way of escaping the problems of politics, of circumventing the difficulty of taking people with you, of making decisions collectively, of deciding society’s fate by collaboration, and instead investing all that energy in technological development. So the idea of that chapter, I suppose, is to critique that and understand what was going on in the nineteenth century so that we don’t make those same mistakes in the twenty-first. We should start to think about how utopianism can be a way of avoiding politics, and recognise that just assuming the acceleration of technology will solve our social problems is a mistake.
SILVI VANN-WALL
Yeah, it’s a sort of blissful ignorance.
LIZZIE O’SHEA
Mm, I think so. And it’s a way to avoid listening to everyday people who aren’t technocratic, or technologically advanced, or engineers themselves. It’s a way of ignoring the interests of people who aren’t engaged with the industry, and instead saying, I’m better placed to make these decisions for you. One of the examples I use, when I was looking back at the historical movement, is the Paris Commune, actually, and that’s how the Paris Commune came to feature in my text: looking at how that instance of history engaged with the technological utopians themselves.
SILVI VANN-WALL
I’ve noticed, even just in your answer, that this concept keeps coming up of the need for intersection between professions and passions. You yourself are an interdisciplinary individual.
LIZZIE O’SHEA
Mm-hm.
SILVI VANN-WALL
Can you talk a bit more about that? Why do we need different sorts of people looking at how to improve digital technology going into the future?
LIZZIE O’SHEA
I do think if we want to have more applied democracy, if we want to improve public participation in decision making, and we want to do that to solve some of the world’s biggest problems, like climate change, or to address things like wealth inequality, then the best way we can build robust social movements capable of doing that is by collaboration across different practices and disciplines. So part of the idea of this book, or the audience that I hoped to cultivate for it, is people who may know a lot about history or politics, and may be activists in different kinds of social movements, but have been reluctant to engage with technological questions, because those questions often feel very abstract and obscure and difficult to navigate. I want to give them an introduction to some of these debates so that they can speak with confidence and provide their historical and political input into these discussions.
Then on the other hand, I’m trying to talk to people who are living their lives saturated in technology, whether as engineers themselves or because they work in the industry, and trying to say, you can bring your experience to the debates we have about technology today by looking at historical examples from the past to help us navigate these questions. Between the two, we might be able to build a common language, a common understanding, both of the problems we face and of ideas for potential solutions. So really, I think the best way we’re going to be able to challenge those who currently hold power in society - companies and also the state - is by working together, bringing disciplines and experience from different parts of society to communicate, collaborate and coordinate actions to challenge those poles of power.
SILVI VANN-WALL
Great.
LIZZIE O’SHEA
Just a small job we’ve got to do. I hope you can attend to that by the end of the day, thanks.
SILVI VANN-WALL
All too easy.
[Laughter]
LIZZIE O’SHEA
Yes.
SILVI VANN-WALL
You also say in your book that you are hoping that we can gain democratic control of technology. Is that the right answer for steering it in the right direction?
LIZZIE O’SHEA
Absolutely. I engage with these kinds of questions all the time, because I’m also a board member of Digital Rights Watch, an advocacy organisation in relation to digital rights. I see all the time how people’s understanding of technology and what it does does not match the reality. That may be because companies have made decisions on their behalf, or because they don’t think politicians are prioritising the interests of their constituents over the interests of others. So I absolutely think the development of technology needs to come under democratic control. What that can mean, I think, can take many forms.
One of the examples that comes to mind is recent, actually: the protests in Hong Kong. Twitter has faced some scandal there, because for a long time it published advertisements paid for by Chinese agencies, news agencies in particular, that vilified protesters in Hong Kong. Recently, Twitter has cancelled a bunch of accounts and said it won’t accept money from state-based news agencies. I think people would wonder why it is that a private corporation gets to make those decisions. Twitter is almost, really, a form of public infrastructure. People use it and rely on it all the time. Yet decisions about how it works and who’s allowed to use it fall to a private corporation. Those private corporations can be subject to influence by the public, but they can also ignore it.
So Twitter, for a long time, has ignored abuse that occurs on its platform, and people have raised concerns about it over a long period of time. Yet at this moment, they’ve decided to take action. I think both of those instances are worthy of reflection: why is it that we outsource these decisions to private corporations? Increasingly, I think corporations realise this as well. Mark Zuckerberg quite famously penned an op-ed in the Washington Post calling for greater regulation of his platform, on the understanding that he thinks he currently has too much power.
So these debates about who gets to develop the technology we use in our everyday lives are happening all the time, and unless we, as a group of people who are interested in advocating for the public interest, get together and organise ourselves and make a claim as to what that should look like, the heads of these corporations will do it themselves.
I see Mark Zuckerberg’s call for regulation of his platform somewhat cynically. I think he’s making it in anticipation of regulation coming inevitably, so that he can manage the process. I think this is the moment for us to organise, on behalf of communities, social movements and marginalised people, and to say, well, let’s have this become more democratic. Let’s think about how we can build more democracy into how decisions are made over these platforms, over what kind of technology gets prioritised and developed, and over how we allocate those resources, rather than just leaving it to private companies to do themselves.
SILVI VANN-WALL
While we’re on the topic of regulation, I want to look at one of the other examples from your book about a past event that can help us understand the future.
LIZZIE O’SHEA
Yep.
SILVI VANN-WALL
That’s the design of the Ford Pinto car. What has that got to do with the regulation of the internet?
LIZZIE O’SHEA
Well, I love this story as an example of what’s possible, partly because I’m a lawyer and was once a law student, and many law students will have read about the Ford Pinto scandal. It’s about a car put out by the Ford Motor Company, designed for a low-income audience, so it was a relatively cheap car. The famous thing was that the fuel tank was very close to the bumper bar. That made it very dangerous, because if you got bumped from behind, the fuel tank would leak and often catch on fire and explode, and people would be burnt alive inside the car. Ford had the option of putting in place a plastic buffer between the fuel tank and the bumper bar. For a variety of reasons it decided not to do that, in part because of the expense. The buffer itself didn’t cost much - it was only $1.25 - but changing the design would have meant getting the car out later, all these things.
They did a cost-benefit analysis and thought, well, people will most likely die in these vehicles, or be injured quite severely, but that’s a cost we can bear. So what that’s really saying is that design becomes beholden to the bottom line in a way that’s very objectionable. How the scandal evolved was that a bunch of journalists got together, lawyers brought cases on behalf of people who had been victims of these crashes, and Ford was exposed for having done this. One of the issues, of course, was that Ford was not alone. Many other car manufacturers similarly made calculations about the cost of changing design, or didn’t prioritise safety in design at all, prioritising instead how cars looked and how fast they could go.
So there was a call for federal regulation, spearheaded by another lawyer, who I’m very fond of, Ralph Nader, who wrote a book called Unsafe at Any Speed. Federal regulation of cars in the United States now means that they have seatbelts and a bunch of internal and external design features that prioritise safety, things like airbags and other kinds of amendments that make cars much safer. I was reading a retrospective recently in which the calculations showed that Ralph Nader’s book probably accounts for about three and a half million lives saved on US roads, which I think is an astonishing figure. If, as a lawyer, I get to the end of my career and I’ve saved three and a half million people’s lives, I’d feel pretty good about myself.
So part of my claim, really, is that we should apply this to technology as well. Problems in technology that are experienced by individuals are not the individual’s fault - it’s not like driver responsibility, which is often what the industry claimed about vehicles. These are problems of design. So we need to reset how we understand these problems, to intervene in the design process and force companies to prioritise safety in their products, to make sure their products do what they say they do, to test them, to have crash testing for algorithms and software products. Rather than prioritising getting products out to market as quickly as possible to profit from them, we should make sure that the profession of engineering, of designing these kinds of products, is subject to safety standards, so that it’s not left up to individual engineers; rather, these companies are required to prioritise safety above other things. That’s really what I’m saying: we can learn from that experience and try to reapply it to the problems we face today.
SILVI VANN-WALL
Why is the internet so hard to regulate?
LIZZIE O’SHEA
Maybe what makes it hard to regulate is also what makes it wonderful. It’s cross-border, it’s chaotic, it’s evolving at different paces, at different times, and it’s different for every single person. I think that also gives it its great potential. It’s so international and chaotic that it becomes hard to regulate, which also gives it the immense capacity to bring people together in unexpected ways and make them more powerful than they’ve ever been. Other people talk about this too: Tim Berners-Lee, for example, talks about re-decentralising the web. He talks about how originally the web was like this, almost ungovernable, and how, as companies have worked out how to make money from it and states have tried to claim power over the internet, it has become a regulated space. So I sort of think it isn’t unregulated.
How we engage with it is very designed and engineered; it’s just not rules that we get to have a say in making. One example that comes to mind is that Facebook recently talked about a return to privacy, but it has also talked about improving its walled garden. What I mean by that is that Facebook talks about being very careful with how it manages your information, but you’ll be able to go onto Facebook and apply for a job, find dates, sell things, and do almost everything we spend a huge amount of time on the internet doing, all within its walled garden. They’re really trying to close that space. When Tim Berners-Lee talks about re-decentralising the web, he’s trying to say, let’s stop these companies and the state having power over how we engage with it, and return it, I suppose, to its decentralised format so that people are more able to make the rules themselves.
SILVI VANN-WALL
If we alter the way we perceive the internet, perhaps through the lens of environmentalism, particularly that of indigenous peoples, what can we learn, and what will change?
LIZZIE O’SHEA
In one of the chapters in my book, I’m trying to suggest that indigenous ways of learning and governing have a role to play in how we’re going to govern lots of common resources into the future. That’s got relevance, obviously, to our environment, and to respecting the environment as a source of life but also as something that we need to protect. I think we can think about the internet almost as an environment in and of itself: if it is shaped by corporations, for example, that will give it a particular quality. But it’s also a huge life-giving force. We know so much about the world thanks to what we can collect and store on the internet. You could access almost any written text within seconds, and yet people can’t do that on a day-to-day basis because that knowledge is locked away.
I suppose what I’m trying to get at here is: let’s talk about building an environment of respect for that common knowledge, so that people are able to access that knowledge collectively, and let’s try to cultivate a culture of respect in our social environments as well, and engineer them in a way that prioritises respecting each other. That’s a conceptual framework I’ve sourced from indigenous ways of knowing and governing, looking to places like Aboriginal Australia, New Zealand prior to colonisation, and also North America and Canada - societies whose peoples have collectively managed resources for a long time, often in hostile environments, and from whom we might have something to learn about managing the environment we live in online in ways that prioritise respect and common humanity rather than money-making and exploitation.
SILVI VANN-WALL
I think that’s a fantastic idea.
LIZZIE O’SHEA
Good. I’m glad that you think so.
[Laughter]
SILVI VANN-WALL
Lizzie, what surprised you most about your research into this book?
LIZZIE O’SHEA
That’s a really good question. What surprised me most? I suppose for a long time I’ve had the idea that thinkers from the past have relevance to technology and the technological debates we have today, and I’ve obviously been thinking about that the whole time I’ve been writing the book. But probably the one that surprised me the most was my reading about Frantz Fanon. Part of the way I approached the book was to think about a bunch of thinkers who I think are really important to understanding our moment, and then try to figure out if they had something to tell us that was specifically relevant to technology. Fanon is somebody I had thought was in that category. He’s an extremely influential person, I think, for understanding post-colonial Africa. He was a psychiatrist and he was active in the Algerian War of Independence. He died very young, but he wrote a number of texts that I think are really fascinating for understanding how colonialism works, how race works as a construct in society, and also how we can overcome it. He’s just a beautiful writer as well. Part of the claim is that people should read him, which is why I included him in the book.
But when I was reading more, I came across a text I hadn’t encountered when I started the research, about Algerians’ use of radio during the war, which was a form of technology. What Fanon claimed was that the radio was traditionally used by the colonial French as a form of oppression, because it was in French and it broadcast the news of the oppressors. Then, during the course of the revolution, Algerians took it over and started using it as a way to get information out to large numbers of Algerians, many of whom weren’t literate. So radio became a very important way in which people learnt about the events of the revolution.
It’s an example, I guess that I hadn’t known about before, of actually seizing technology that belongs to others, and turning it into a tool of liberation. It was just so wonderful because he was someone that I thought I really wanted to include in my book, and then when I started researching, I found this example of how he did that, and it worked perfectly with the chapter. I like to think anyway. I’m hoping more people then think about reading more expansively about what Fanon was writing about, because I think those post-colonial ideas will really inform a robust and engaged social movement that’s got the best chance of bringing together large numbers of people to confront power.
SILVI VANN-WALL
Your book has an augmented reality artwork in it, which I found really interesting.
LIZZIE O’SHEA
Mmm.
SILVI VANN-WALL
Perhaps you can explain what that is, and why you’ve decided to put it in your book?
LIZZIE O’SHEA
I’m so glad you noticed. I’ve got a picture in the front few pages of my book. It’s by a Melbourne artist called Marc-O-Matic. He makes the most beautiful artworks, I think. You look at the graphic and it looks like a drawing. Unfortunately I couldn’t print it in colour, but he does these beautiful drawings in colour that look beautiful on their own. Then he layers a technological aspect into it. So you download an app, and you hover the app over the artwork and it animates. I think it’s really exciting to see what’s possible with these kinds of innovations in art, because it had never even occurred to me, as someone who’s not very artistic, that this was possible. If you follow Marc-O-Matic, you’ll see that he puts up these artworks in all different places.
You can also find them in the streets of Melbourne. You can just be wandering along, see a beautiful graphic, and not realise that it’s got this hidden meaning. But then when you do, you can download the app and see what he’s done with it. I love it. I think it’s the perfect expression of what I’m getting at: how technology and art can reveal to us new ways of thinking that are so incredible and persuasive. That’s what we should be using technology for, to reveal new meanings, rather than what I think it’s currently used for.
SILVI VANN-WALL
That’s great, and what a brilliant way to show that idea.
LIZZIE O’SHEA
Mmm, I thought so.
SILVI VANN-WALL
It’s so exciting.
LIZZIE O’SHEA
Good. I’m glad that you liked it. Sometimes I wonder if people will worry about downloading an app, but anyway, I hope more people do.
SILVI VANN-WALL
Sure. Perhaps there is something to be said that in order to see that hidden meaning, you have to have the technology to do it.
LIZZIE O’SHEA
You do.
SILVI VANN-WALL
Therefore, you have to be of a certain class, where you’re able to download that app and so on. Yeah.
LIZZIE O’SHEA
That is true. Yeah, well, that’s one of the things, I suppose. You can appreciate the artwork without even knowing that it’s like that. But, of course, what we should all be able to do is access the technology to reveal that hidden meaning. I agree with you. There are certainly access-to-technology issues that we should solve, and they all appear solvable to me. It’s cheaper than ever to produce some of these products. We should be recycling them, of course; that’s part of the problem, that we don’t do enough of that. But devices and software should be accessible to as many people as need them. It seems a terrible shame that that’s not where we’re going. In fact, quite the opposite.
SILVI VANN-WALL
Okay, the next time someone logs onto the computer and jumps online, what do you want them to think about?
LIZZIE O’SHEA
I want them to think about how their engagement with technology might be used by others. That might be private companies who design our online spaces. It might be private companies who make money out of people in the gig economy who staff the kinds of services we expect in our online lives. It might also be the state, which takes that information and uses it for its own purposes. I want people to start thinking about how our idea of privacy has to be much more expansive, and that we shouldn’t always accept what politicians and companies say privacy is. We should start to interrogate that concept, think really critically about it, and understand that privacy is about freedom, liberty and autonomy, really. That will help us understand who’s profiting from our online lives, and also how we might be able to take some of that power back. And we should always be thinking, I suppose, about what we can be doing, whether it’s small or large, whenever we can. That’s not available to everybody, but it’s about doing your bit towards creating a fairer society.
I think there are lots of gains to be made in the digital space. But I also appreciate that many people aren’t working in the digital space, and that’s very important work too. My claim is that you still need to start thinking about these issues, because they affect us all, even those of us working in an activist space that has nothing to do with technology. I don’t think we can ignore those implications. In fact, all organisations, I think, are really digital organisations in that way. So if you’re an activist in another space, I think it’s time we came to terms with some of these questions and started agitating, organising and working together to try to solve some of these problems.
SILVI VANN-WALL
Well, thank you, Lizzie O’Shea, for joining us and allowing us to eavesdrop on your expertise.
LIZZIE O’SHEA
Thank you so much for having me.
CHRIS HATZIS
Thank you to Lizzie O’Shea, human rights lawyer, writer, and broadcaster. And thanks to our reporter Silvi Vann-Wall. Lizzie O’Shea’s new book, Future Histories, is out now.
Eavesdrop on Experts - stories of inspiration and insights - was made possible by the University of Melbourne. This episode was recorded on August 23, 2019. You’ll find a full transcript on the Pursuit website. Audio engineering, by me, Chris Hatzis. Co-production - Silvi Vann-Wall and Dr Andi Horvath. Eavesdrop on Experts is licensed under Creative Commons, Copyright 2019, The University of Melbourne. If you enjoyed this episode, drop us a review on Apple Podcasts and check out the rest of the Eavesdrop episodes in our archive. I’m Chris Hatzis, producer and editor. Join us again next time for another Eavesdrop on Experts.
In her new book Future Histories, author and lawyer Lizzie O’Shea asks what historical social experiments like the Paris Commune can tell us about modern online democracy.
“I sort of see Mark Zuckerberg’s call for regulation of his platform somewhat cynically,” she says.
“I think he’s trying to do that in anticipation of it coming inevitably. So, he’s going to try and manage that process.”
According to the author, this is the moment for us to organise and think about how we can install more democracy into how decisions are made over these platforms.
This includes asking questions like what kind of technology should get prioritised and developed? And how do we allocate those resources rather than just leaving it to private companies to do themselves?
“I guess this is a conceptual framework that I’ve sourced from indigenous ways of knowing and governing. I looked to places like Aboriginal Australia, New Zealand prior to colonisation, and also in North America, in Canada,” says Ms O’Shea.
“So looking to those societies as peoples who have collectively managed resources for a long time, often in hostile environments, and that we might have something to learn from them in terms of managing the environment that we live in online in ways that prioritise respect and common humanity rather than money-making and exploitation.”
Episode recorded: August 23, 2019.
Interviewer: Silvi Vann-Wall.
Producer, audio engineer and editor: Chris Hatzis.
Co-production: Silvi Vann-Wall and Dr Andi Horvath.
Banner: Shutterstock