Ep 16: Man-Made, how the bias of the past is being built into the future – Tracey Spicer

What’s the point in agitating to change the present if bigotry is being embedded into our futures? Journalist Tracey Spicer chats about her new book, Man-Made, delving into the AI melee, uncovering the inequities and exclusions, but also what can be done to change the future for the better.

Transcript

We all hope the future will be more inclusive and equitable. But when a young boy utters the words, “Mum, I want a robot slave,” how do we feel about that? Is privilege baked into our DNA? Welcome to With Not For, a podcast from the Centre for Inclusive Design. My name is Manisha Amin, speaking to you from the lands of the Muru-Ora-Dial people here in Maroubra, Sydney, Australia. And my guest today is award-winning journalist, broadcaster and social advocate, Tracey Spicer.

Much to Tracey’s horror, I’m sure, it was her young son who wanted that robot slave. And his simple request led Tracey on a quest to know more about how and why bigotry and injustice seem to be embedded into our future. Her journeys and discoveries are all chronicled in her new book, Man-Made, which opens the door on how much technology can manipulate our perceptions, and what we can actually do about that. As Tracey’s fellow broadcaster, Juanita Phillips, said of Man-Made: “Thank God it’s funny. If it wasn’t, I’d be too scared to sleep.” Welcome, Tracey.

Tracey: Oh, what a wonderful introduction. Thank you Manisha, for having me on the program. And I’m joining you from the lands of the Cammaraygal people on Sydney’s north side. Thank you for shining a light on this important topic.

Manisha: And before we actually delve into the book itself, can you tell us a little bit about your journey from being a journalist, to social advocate, to author?

Tracey: I think it started in childhood. I grew up in a very low socioeconomic area. And it became clear at a young age that not everyone was treated equally, and that in a lot of ways, society is very, very unfair. So I have Mum and Dad to thank for that; they brought up my sister and me to be compassionate, and interested in social justice issues. Mum was a very, very strong feminist, even though she wouldn’t have used the word at the time. And Dad was a nice, soft, sensitive kind of guy. So I was brought up in a household that didn’t have the typical gender stereotypes for parents.

Then when I went into journalism, I focused a lot on documentaries about women and girls in developing countries, about inequity here in Australia as well, gender pay gap, all of those kinds of things. But you’re right, it’s only been in recent years that I’ve really gone down the rabbit hole and started writing books, and doing more column writing and advocacy around issues like sexual harassment and assault, as well as more intersectional discrimination. Because obviously, it’s not only gender.

And I think that’s what’s interesting about AI: if you’re someone from a marginalised community, a lot of that bias and discrimination from the past is being baked into these machines that will be running our futures. And that really scared me, the thought that all of these stories I’ve been doing over a lifetime to shine a light on inequity could be for naught. We could be going back to the 1950s, because of these robots gone wild.

Manisha: Just thinking about what you said there, and this idea that our past actually tells the stories of our future, and how technology can tell the wrong stories, if we let it. How do you feel that the things you saw as a journalist, the stories you told, have informed your perception of where AI is heading now, and the stories you want to tell in the future?

Tracey: You’re absolutely right, and that question is so beautifully framed. The stories that leap to mind are ones I did in India, on gendercide; in Bangladesh, on the dearth of schools for girls; and in different parts of the world where, regardless of what country you’re in, what culture you’re in, there is always this idea that women and girls are valued less than boys and men.

And so when my son said, after watching an episode of South Park – we are terrible parents – at the age of 11, “Mum, I want a robot slave,” referring to Amazon’s Alexa, which Cartman, a very naughty boy on South Park, had bought and proceeded to use to order scrotum bags from the shopping centre. I thought, oh my goodness, this is terrible. Everything I’ve seen all around the world about girls being treated as either servile or sexual, all of these stereotypes are being baked into the chatbots that we use every day, but also that our children use every day. And what does that say to kids about the way women and girls should be treated?

And as I looked into it more deeply, I became really interested in how this impacts people of colour, people living with disabilities, people over the age of 50, and also people who are transgender. There’s an awful lot of homophobia baked into these machines as well. So I could see really clear parallels: real-world inequity and discrimination being put into the data sets that train the algorithms, and then machine learning deepening those biases and that discrimination. And when you don’t have human oversight, when you don’t have, as you know, any kind of intervention through regulation or legislation, who knows where that could lead? It really is quite frightening.

Manisha: I totally agree with you. I think those of us who work in the industry often talk about diversity, and the need for diversity in those data sets, but also in terms of the people creating the materials, or the technology. But sometimes it’s quite difficult to explain this to people at, if you like, the barbeque, or the dinner table, when the talk about AI comes up. How do you actually do that?

Tracey: Oh, yes, as soon as AI comes up, people become scared. Or they say, “Oh, I’m not very good at technology.” But I’d like to reframe that as a conversation about society, and justice, and families. Because we are saturated in technology and artificial intelligence every day, from the moment we wake up to the moment we go to bed at night. Every time you Google something, every time you send an email, every time you use ChatGPT to write a story. Every time you use a chatbot, every time you get in a smart elevator in the city. And then of course, there’s the self-driving car technology that’s happening around the world. So increasingly, we’ll be exposed to it.

The way I describe it to people at a barbeque is this: there’s a lack of diversity in the whole area of technology, and our futures are being created by a small group of white men in Silicon Valley. When you have a lack of diversity in the databases, and in the programmers, the baby bias that’s born in the algorithm becomes a troublesome teenager through machine learning. The machines become like white supremacists going down the rabbit holes of conspiracy theory websites. They just become more and more extreme, exacerbating the existing bias.
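
To make that “baby bias becomes a troublesome teenager” dynamic concrete, here is a minimal sketch in Python. It is an editorial illustration with invented data and numbers, not anything from the book or a real system: a toy model learns approval rates from biased historical decisions, turns them into hard yes/no decisions, and then retrains on its own output, so a modest gap hardens into total exclusion.

```python
# Toy illustration (invented data, not from the book): a model learns
# per-group approval rates from biased historical decisions, converts
# them into hard yes/no decisions, then retrains on its own output.

def train(data):
    """Learn the historical approval rate for each group."""
    rates = {}
    for group in {g for g, _ in data}:
        outcomes = [approved for g2, approved in data if g2 == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

def decide(rates, group):
    """A crude model: approve a group only if its past rate clears 50%."""
    return rates[group] > 0.5

# Past human decisions: group A approved 70% of the time, group B 40%,
# because of bias, not merit.
history = [("A", True)] * 70 + [("A", False)] * 30 + \
          [("B", True)] * 40 + [("B", False)] * 60

for generation in range(3):
    model = train(history)
    print(f"generation {generation}: learned rates = {model}")
    # The model's own decisions become the next generation's training set.
    history = [(g, decide(model, g)) for g in ("A", "B") for _ in range(100)]

# Group B's 40% approval rate collapses to 0% after one generation of
# the model learning from itself: the small initial bias becomes extreme.
```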

Tracey: Let me tell you one good example. I felt so privileged to interview you for the book, Manisha, you’re amazing. And one of the examples you would have read, which actually shocked me, was that a small group of guys in Silicon Valley thought they would use AI to create an automated soap dispenser for Marriott Hotels around the world.

Manisha: Oh, yes.

Tracey: Oh, yeah. Except a Nigerian tech worker used it in the Marriott in London, put his hands under it, and it didn’t work. His white colleague put his hands under it, and it worked. And they worked out that these devices only worked for white hands. Now that same technology is being used in self-driving cars. So think about this: a self-driving car comes up, and there’s a person of colour crossing at a pedestrian crossing. They’ll be run over, whereas the car will stop for a white person. So this is where a little bit of bias at the start of the process can become deadly by the end.

Manisha: And you have so many of these great examples in your book. You really bring AI to life, I think, in a way that works for everyone, not just designers and technologists, but also at the dinner table. Were there any examples that really surprised you, or were light bulb moments in terms of how you started to think about technology in general, and how we design it?

Tracey: Definitely. The chapter that really jumped out at me was the one about people living with disability, and the whole failure of smart homes. Because smart homes aren’t generally being designed by people with disabilities, right? So there can be incredible dangers of people being locked inside these so-called smart homes. And even simple things: what designers without disabilities think people with disabilities want, and what they actually want, can often be two very different things. So if we get more diversity and inclusion in the design process, then all of those wonderful devices around the home will actually be helpful to us, instead of being harmful.

Manisha: And you did talk a lot about the help as well as the harm. Do you want to share some of those stories as well?

Tracey: Oh, yes. I mean, artificial intelligence is phenomenal when used within an ethical framework. For example, there are incredible programs now that can detect breast cancer in someone five years before it manifests. This is lifesaving stuff. And if we can use artificial intelligence in warfare – I mean, ideally, we’d have no wars – but if you’re sending machines out instead of humans, instead of soldiers, it could potentially save a lot of lives.

Same with self-driving cars. If they can get past the trolley problem, the ethical dilemma of who do I run over if I have a choice. And when you’ve got technology that values humans differently where there’s inequity, that is a huge problem to get over. But it could reduce a lot of the burden of people driving on the roads. There are a lot of great technological solutions to the problems of climate change. So my book is not necessarily saying technology is bad, because that’s like saying water is bad, or air is bad. Technology is in Maslow’s hierarchy of needs; we need to learn to use it ethically and properly. And we need to take the good, while regulating and legislating against the bad.

Manisha: Absolutely. And I think you talk about the good in so many different ways as well, from the knowledge of First Nations people and what we can learn from First Nations people around the world, to the role of women in Man-Made. Can you tell us a little bit about your view, and your insights from that deep, rich research you did into the women who have helped frame technology, as well as the women who have not been thought about when we think about technology?

Tracey: Oh, they were my favourite chapters to write. Because when you think about it, Australia’s Indigenous women are the world’s first scientists. And that is because weaving, which has primarily been seen as women’s work, is a fundamental and rudimentary form of coding, because it’s a binary code, like knitting: zeros and ones. And so there’s a lot we can learn from Indigenous people. It’s kind of gone full circle there, actually, where there are a lot of wonderful AI-enabled drones helping to care for Country in Australia, in world-first programs.

Where women come into it later in history, say in the last two or three hundred years: Ada Lovelace was the world’s first computer programmer, a woman. And that’s often forgotten in history, unfortunately. In the 1950s, women made up the vast majority of the computing workforce in the world. A lot of them were known as what were called kilogirls; they were treated as units of labour. You had these massive big computers that you had to crawl in and out of to fix them, and if there was a big project, they’d say, “Oh, this project will take eight to 10 kilogirls.” That’s how heavily involved women were. Women were aligned with computing, and identified with computing, all throughout history, until money started coming into the sector, initially in the 1960s and 70s. And then a lot of the men pushed the women out into the lower paid jobs, took the higher paid jobs themselves, and effectively hijacked the industry from women.

And a lot of people don’t know that history, which is why I wanted to research it deeply, and write about it, because we forget about it. There were these tremendous women doing intricate weaving on matrices to get the Apollo spacecraft into the air. And they were known as LOLs: little old ladies.

Manisha: Oh dear.

Tracey: I know, right? It’s a remarkable and incredibly sexist history. One of the computer programmers talked about how he couldn’t have got this project up without his, quote, “stable of secretaries,” which makes them sound like horses. So they were treated appallingly. But gee, they did amazing work.

Manisha: It’s interesting how women were doing this intricate, interesting work, but were talked about by their weight, as animals, as all sorts of things, except as women, right?

Tracey: That’s exactly right. They weren’t valued as equals. They certainly weren’t. Even when they were doing the incredible work of knitting codes into pieces of material in World War Two, to hand over to the Allies. There were these older women in the Belgian rail yards who were helping the Allies by coding information into their knitting. But they were always seen as secondary to the men. They were given the grunt work. And I think that kind of continues to this day, when you look at the amount of money in the technology sector, and the list of billionaires, and how the rich get richer and the poor get poorer. And there’s just a handful of really wealthy men who are going to be designing our future, and that’s quite terrifying.

Manisha: It really is. And as I was reading your book, which I loved, I couldn’t help thinking about how many women you interviewed, and how many women are involved in this work of ethics and debiasing, compared to men.

Tracey: Yes. And this worries me too, to be honest with you. Because I love the work of ethicists in this area. But you see this paradoxical situation where there’s a lot of men making the money in the computing industry, and a lot of women who are trying to put ethical frameworks in place. And that also goes back to that notion of women being the ones who do the good voluntary work in society, and trying to make the world a better place, and the men being the ones who make the money.

But it was interesting interviewing some of those women, because it was during lockdowns, and I had to do it via Zoom, with a program called Otter.ai, a transcription program that a lot of us use when we’re doing interviews like this. It’s a tremendous transcription program. But every time I interviewed a woman, when I got the transcript back, Otter.ai would misgender her, giving her a traditional man’s name. So Joan became John; Allison became Ellison. And I thought, gosh, even in transcription software, you can’t escape the inbuilt biases.

Manisha: That’s incredible. And it’s a really overt representation of some of the things you talk about that are happening inside the system, things that we don’t necessarily see.

Tracey: That’s right. And one thing I’m happy about in the last couple of months is that it’s becoming more obvious and more visible through ChatGPT. For example, a friend of mine said to ChatGPT the other day, “Tell me a story about an engineer and a nurse.” And the engineer was male, and the nurse was female. Every single time, nine or 10 times over, the robot came up with the stereotypical gender for each workplace role. And at the end of it, she said to the robot, “Do you think you’re biased?” and ChatGPT came back and said, “You have taught me I’m biased. I’m going to try to work harder.”
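
For readers who want to try that kitchen-table experiment themselves, here is a rough sketch of the probe in Python. The `generate(prompt)` function is a hypothetical stand-in for whichever chatbot you have access to, not a real API, and the pronoun heuristic is deliberately crude; the point is the method of asking repeatedly and tallying the results.

```python
# A sketch of the bias probe described above: ask a chatbot for the same
# story many times and tally which gender it assigns to each role.
# `generate(prompt)` is a hypothetical stand-in for a real chatbot API.
import re

def guess_gender(text, role):
    """Crude heuristic: the first gendered pronoun after the role appears."""
    match = re.search(role + r"\b.*?\b(he|she|his|her)\b", text,
                      re.IGNORECASE | re.DOTALL)
    if not match:
        return "unclear"
    return "male" if match.group(1).lower() in ("he", "his") else "female"

def probe(generate, trials=10):
    counts = {role: {"male": 0, "female": 0, "unclear": 0}
              for role in ("engineer", "nurse")}
    for _ in range(trials):
        story = generate("Tell me a story about an engineer and a nurse.")
        for role in counts:
            counts[role][guess_gender(story, role)] += 1
    return counts

# If nine or ten runs out of ten make the engineer "he" and the nurse
# "she", you have reproduced the stereotype Tracey's friend found.
```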

Manisha: If only everyone said that, when we asked them that question, right?

Tracey: I know, right? Look, I don’t think developments like ChatGPT should be unleashed on an unsuspecting public while they’re still in beta testing. I think that’s just crazy. But at least if we’re aware of it, we can try to teach the algorithm to be less biased.

Manisha: So then, in terms of that, do you think there actually is hope? Can our future actually be different?

Tracey: I really didn’t want to write a book scaring the bejesus out of everybody without giving them some solutions. So there are multifaceted solutions, things we can do in our own homes. Things as simple as changing Siri or Alexa to a male voice or a non-gender-specific voice, something really simple like that. We can boycott companies that don’t have ethical AI frameworks, that don’t do audits on their databases, that use dirty databases which haven’t been cleaned for bias.
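
What auditing a database for bias can mean in practice, at its very simplest, is checking how each group is represented in the data and how it fares in the recorded outcomes, before any model is trained on it. Here is a minimal sketch; the column names, threshold, and numbers are all invented for illustration.

```python
# A minimal sketch of the simplest kind of data set audit: compare each
# group's share of the records and its rate of positive outcomes before
# any model is trained on them. All names and numbers are invented.
from collections import Counter

def audit(records, group_key, outcome_key):
    sizes = Counter(r[group_key] for r in records)
    positives = Counter(r[group_key] for r in records if r[outcome_key])
    best_rate = max(positives[g] / sizes[g] for g in sizes)
    for group in sorted(sizes):
        share = sizes[group] / len(records)
        rate = positives[group] / sizes[group]
        # Flag any group doing less than half as well as the best group.
        flag = "  <-- investigate before training" if rate < 0.5 * best_rate else ""
        print(f"{group}: {share:.0%} of records, {rate:.0%} positive outcomes{flag}")

# Invented example: historical loan decisions skewed against group B.
records = ([{"group": "A", "approved": True}] * 70
           + [{"group": "A", "approved": False}] * 30
           + [{"group": "B", "approved": True}] * 20
           + [{"group": "B", "approved": False}] * 80)

audit(records, "group", "approved")
# A: 50% of records, 70% positive outcomes
# B: 50% of records, 20% positive outcomes  <-- investigate before training
```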

We can lobby governments to make decisions about this, because it takes the public to put this on the government’s agenda; otherwise, they won’t do anything about it, because there are too many other pressing problems, or so they think. We can have conversations about it with our families, to teach our kids critical thinking. But from a higher level perspective, we really do need organisations like the European Union and the United Nations, and the large governments, particularly the American government, to put in place some regulation and legislation around this. Because it really is the Wild West.

I became interested in this around the time my son talked about the robot slave, but I was also emceeing a lot of events in the technology sector. And it reminded me of the media back in the 1980s, where all these cowboys were just running loose with no concern about anything except racing each other to the first billion. So I think there needs to be some collective consumer action, societal action. But that’s got to start with awareness and conversation. And hopefully, that’s where the book comes in. And also the work of your wonderful organisation, Manisha; I really, really appreciate the work that you do in inclusive design. That’s the only way forward.

Manisha: Thank you so much, Tracey. One of the quotes that really stood out for me by Ivana Bartoletti, from the Oxford Internet Institute, was that an algorithm is an opinion expressed in code. I really loved the idea of that. But it also brings up this notion of the code, and that difficulty that I think some of us might feel, from the perspective of feeling like a bit of an imposter talking about AI. I mean, it seems like such a dense, opaque topic. Most people don’t know how to write an algorithm. A lot of us don’t know how to write code. And yet, if our opinions are being baked into this, how do we actually have these conversations?

Tracey: We need to really simplify this discussion, by talking about it in terms of what kind of world we want to live in. What kind of world do we want for our kids, for the next generation? Do we think this is fair? Do we think this is balanced and equitable? And put technology in the same box as feminism, as Black Lives Matter, as all of those wonderful other social justice movements. Because this is about social justice. It’s not necessarily about code and algorithms and databases and machine learning.

We know throughout history that the way to create change is to combine storytelling with activism and lobbying. And so if anyone’s listening and wants to start this conversation, then please, yes, read my book, that would be great. But also do a little bit of googling, a bit of research, and feel confident that the conversation you’re having is being had around the world. Think about how it affects your life, and then open up the conversation in a really casual, layperson’s way. I think there’s a lot to be said for groups of women, particularly, sitting around talking about issues and deciding to rattle the cage. We know the change that’s been made that way in the past. And I know it will also happen in the future in this area.

Manisha: And I think what your book does is certainly contribute to that notion of storytelling. And you wrote this book through a really interesting period in time, through COVID, and you were very open about the impact COVID had on you while writing. Can you talk a little bit about this? Because I think it’s one of those things we sometimes forget: people can create stories, can tell stories, but there’s a person and a life behind that, also happening at the same time.

Tracey: Yes, there’s a really strong through-line there as well. Because during the pandemic, a lot of what are called Frankenstein data sets were being cobbled together at hospitals to try to help deal with COVID. And unfortunately, they were very, very bad data sets, full of bias and discrimination, and they were just thrown into hospital settings, which undoubtedly resulted in people dying.

I mean, one example is people in the States not getting a ventilator if they’re over 50, because the algorithm looks for patterns in the past, and people over 50 are seen as contributing less to society, because they’re often no longer in the workforce. So they were less likely to get a ventilator than someone who’s perhaps aged 30. And there are a lot of other examples, like people of colour using pulse oximeters, which I used every day on my finger last year, when I had long COVID very badly, to work out the level of oxygen in the blood. Well, if you’re a person of colour using one, it misdiagnoses your level of oxygen, because of historical racism. And you will be offered medical help after a white person, for the simple fact that it’s not working on your skin. Which is absolutely appalling in this day and age.

So I suppose, to be honest with you, Manisha, writing this book while having long COVID was like swimming through mud. Unfortunately, I had to write the last third while I was sick. Some days, I just couldn’t get the words to come, or I’d look at the screen and the words would spin around, because my brain was so inflamed from long COVID. So I am relieved that it’s finished. It was incredibly difficult. But it also gave me the impetus to reach the deadline and finish the book, because I realised that all of this is being embedded in medical settings, and will kill people in the coming years and decades, unless something is done, and fast.

Manisha: This isn’t just an esoteric conversation about ideology. It’s actually a conversation about life and death. And yet the examples that you provide of what we can do are really very, very practical. Who are the people you feel would really benefit from reading this?

Tracey: I deliberately wrote this book aiming at a mass market readership, so that anyone can pick it up and understand AI, and understand the bias in it. That was my aim. Anyone from the age of 16 to 100, really. But I really hope that people pick it up who are interested in diversity and inclusion, and in making the world a better place.

I also hope people read it who think that what I’m saying is rubbish. I mean, a couple of things I’ve put on social media have been attacked by some trolls, and some men’s rights activists. And I’m 55; I’ve heard every single argument against equality over my lifetime, right? So I’m accustomed to these kinds of straw man arguments. But I would like people who think that what I’m saying is rubbish to read it, and hopefully it will change hearts and minds. Because there are a lot of people who are against change. There are a lot of people who are for change. But there are a lot of people in the middle who I think can be convinced, and who can help join this movement to create a better world, which is the ultimate aim.

Manisha: Absolutely. I think also this notion about the trolls, and the idea of being an ally and what it takes, is one that really resonates. How do you actually keep your smile and keep your energy going, when you have that bombardment from people, often invisible, who might have a different view?

Tracey: I really feel sorry for young people, and people who are new to being in the public space, who get attacked by trolls. Because it can be terrifying. You feel like you’re being physically piled on and beaten. And my friend, Ginger Gorman, who wrote Troll Hunting, and who I interviewed for this book as well, had a really incredible quote, which was that the trolls do minority stacking. So I’m a white woman; they attack me because I’m a woman. But if I was also a woman of colour, or a woman with a disability, I would be attacked in an even more vicious way. Which is absolutely reprehensible.

So I really do feel for anyone going through that at the moment. My best advice, because I have been trolled for years and years now, is certainly to mute. I wouldn’t block, because when you block, they get their other mates to pile on you more, and it’s just time consuming. It’s annoying and stressful. So I just mute them. And once I’d muted all of them in one particular big pile-on, I realised they were actually quite a small, organised group, predominantly based in the US, some in Russia. And once you realise it’s actually quite a small group, you think, oh, hang on, this is a coordinated campaign to silence me. And then, of course, the fire gets lit in your belly again, and you think, I’m not going to be shut up. So my advice is, mute the ones who do it. You’ll be surprised; it’s generally quite a small group.

Manisha: Thank you, Tracey. I think that’s really important. And it leads back to this notion of who’s actually behind the information: who are the people talking the loudest? Because we know that sometimes the loudest voice has the most impact. But the loudest voice might not be the majority voice.

Tracey: Yeah, that’s true. Women and people in marginalised communities have been silenced throughout history. So it’s unfair that people are piled upon. But it’s also incumbent upon us to find our community, to support each other, and to find a way to keep speaking out. Because otherwise, we will be silenced once more.

Manisha: Absolutely. So a final question for you. If you could change or redesign one thing, based on your own insights and your experiences, or if you could tell a designer what they need to do differently, what would that be, and why?

Tracey: Living with a dynamic disability as part of long COVID, I was in a wheelchair a lot, because I had dysautonomia. Every time I stood up, I felt ill and thought I was going to faint; it’s part of an autoimmune disorder that you often get with long COVID. And by golly, it gave me an insight into what it’s like for people who use a wheelchair, whether part time or full time. It’s a wonderful mobility aid. It helped me get out of the house, which was absolutely priceless. But there just are not enough ramps.

And it sounds like a simple thing. I know it’s something that disability advocates have been working on for decades now. But even in Australia’s biggest city, Sydney, there still aren’t enough ramps. And it’s incredibly frustrating, particularly trying to get on planes and things like that. The systems just aren’t set up. I was actually left abandoned at the airport. I know many people in wheelchairs have similar stories to tell. But it really gave me even more empathy for people who are in wheelchairs all the time, having to deal with that. And so, by golly, we need more ramps, and more support for people in wheelchairs.

Manisha: And it just reminds me of how the ramp isn’t just from one place to another place, but it’s about that ongoing pathway, and that ongoing journey to your endpoint as well.

Tracey: That’s it. And if we want to be serious about removing barriers to create true inclusion for everyone in society, often it is in those transitional places that we need help and support. And the only people who can change policy on that, and advise and inform policy, are people with lived experience of it. It reminded me of a system designed by people who don’t have a disability, where the problems affect countless people. And it’s the same in AI design: if we don’t design by having people from marginalised communities at the table, and women at the table, we will keep designing things that are just no good for all of us. And who wants a future like that? We want a future utopia, not a dystopia.

Manisha: Exactly. So thank you so much, Tracey. It’s been wonderful having you here today. Thank you for your time sharing your stories. And also for creating this conversation about that utopia that we want to have.

And thank you everyone for listening, and being with us here today with With Not For. If you’d like to learn more about how you can make your world more inclusive, contact us at www.cfid.org.au or see the show notes, where you’ll also find links to Tracey’s book, Man-Made. Until next time, this is Manisha Amin from the Centre for Inclusive Design.