It’s time to retire Lena from computer science

Why has a pornographic image been widely used to train computer scientists and their algorithms? And what sort of message does it send to women?

In 1972, a Swedish woman named Lena posed as ‘Miss November’, the centrefold in Playboy magazine. It would be the one and only time she posed for Playboy.

But in the male-dominated world of computer science, her photo, cropped at the shoulders, was widely disseminated, becoming the default test image for image-processing research and eventually being used in the development of the JPEG format.

Lena as she appears in the documentary ‘Losing Lena’. Picture: Supplied

How did the face and shoulders of a Playboy model become the go-to picture for computer image-processing, and why is it still being used across computer science departments?

The story has been well told in The Atlantic, but in summary: in 1973, computer scientists at the Signal and Image Processing Institute at the University of Southern California were looking for a new image on which to test their image-processing algorithms. One of them had a copy of Lena’s Playboy photo, so they used it.

From then on it became a popular image for computer scientists to use, despite its poor quality – it was a scan of a printed photo that included a scanning error. It was also used despite being protected by copyright: permission was never sought from Playboy, and Lena herself, unaware of her popularity among computer scientists, never gave her consent either.

Lena’s image would be one of the very first pictures uploaded to ARPANET, the precursor of the internet we use today.

A new documentary, Losing Lena, is helping to galvanise efforts to end the use of Lena’s image in technology research. It makes clear that the image is emblematic of a sector whose often unwelcoming culture for women entrenches female under-representation.

Retiring Lena’s image from computing goes beyond copyright, consent or ethics. It is a necessary step towards helping women feel more welcome in tech.

The widespread use of “Lena” reinforces the under-representation of women in computer science and engineering. Picture: Getty Images

At a time when we seem to be rushing headlong into artificial intelligence and machine learning, removing Lena from our visual vocabulary and training sets is one small step towards making it clear that we won’t accept the unintended biases of the past. Instead, we need to actively ensure diversity and equality in all of our computing and information systems research and teaching.

This is particularly important given technology’s power to shape the future.

Every technology has a history and a context. Langdon Winner, in his book The Whale and the Reactor, recounts a prominent example of bias being embedded in technology: early traffic overpasses in and around New York, many of which were built so low that public buses could not pass beneath them.

As a result, low-income people who depended entirely on public transport – disproportionately racial minorities – were effectively excluded from using these roads.

Winner argues that politics is built into everything we make, and that moral questions asked throughout history, including by Plato and Hannah Arendt, are questions relevant to technology – our experience of being free or unfree, the social arrangements that foster equality or inequality, the kinds of institutions that hold and use power and authority.

The capabilities of AI, and the way it is being used by corporations and governments, continue to raise these questions today. Facial recognition systems, or policing tools that reinforce prejudice, are examples of technology with politics built in.

The difference is that, in non-physical systems, the embedded politics isn’t as easy to identify as an overpass built too low, although the technology may be similarly challenging to rectify once it’s built.

The growing influence of technology on our lives makes it even more critical that diverse people are at the forefront of developing and regulating technologies. Picture: Getty Images

Interestingly, many of the contemporary warnings about how we develop and use technology are coming from women – Jeannie Paterson, Virginia Dignum, Ellen Goodman, Julia Powles, Kate Crawford, Genevieve Bell, Mireille Hildebrandt, the list goes on.

It is crucial that a diversity of voices is imagining and building the world that technology is creating. Writers like Meredith Broussard, Ellen Broad and Cathy O’Neil have described how those who built the internet embedded values and assumptions that persist to this day.

Code Like a Girl, an Australian organisation that encourages women and girls to code, is supporting the documentary. It notes that:

“There are so many reasons why having women in tech is beneficial and vital: it will future-proof our workforce, it addresses sexism within the sector, provides role models for girls and young women and secures long-term profitability for businesses and organisations.

“But to us, the most important reason to grow the number of women building tech is so that the world we live in can be equally enjoyed by everyone. We need diversity of experiences, perspectives and stories to build a world that is more empathetic, innovative and equal.”

Given the power that now resides in technology, not enough women are seated at the table, nor, more generally, is there enough diversity.

It is time to retire a pornographic image, and to ensure that more real-life women are participating in these fields, and that they all feel welcome.

The University of Melbourne is the first university to sign up to a moratorium on the use of the Lena image, and on 19 December is hosting a special screening of Losing Lena and a panel discussion.

Banner: Supplied