Gender, Sex and Tech

Continuing the Conversation

Transcripts for Season One Episode Seven

Gender, Sex and Tech: Continuing the Conversation

Episode 7: Interview with. . . Me!!

Transcription by Jennifer Jill Fellows

Jennifer Jill Fellows: So I think it was 2015 and I was visiting a friend. We were all sitting around the kitchen table. Me, my partner, my friend and their partner and their kids. And their oldest daughter starts telling me about this new song that she really likes. And I reveal how uncool I am by admitting I’ve never heard the song. Suddenly she’s bellowing, “Hey Siri, play. . .” And then she says the name of the song. And at first nothing happens, except that I was a bit startled. Then she bellows it again, “Hey, Siri,” and a female voice that has come to be ubiquitous replies, “I’m sorry, I didn’t get that.” Pretty soon, everyone around the table is demanding that Siri play the song. And eventually Siri did figure it out and she played the song for us. It’s a little bit odd what I just said there. She played the song for us? Like I’m ascribing agency to her. And that was the first time I met Siri. Although it also seems kind of odd for me to say I met Siri in that moment. Can you meet something that is essentially a tool? But if she’s a tool, why do I keep referring to her as she? Because she was a she. But why? Why is she a she instead of a he or a they or maybe even an it? What’s going on with Siri and Siri’s gender anyway?

(Music)

Jill: Hi everybody. Welcome to Gender, Sex and Tech: Continuing the Conversation. I’m normally your host, Jennifer Jill Fellows. But today I’m actually the guest. I’m stepping out of the host chair and inviting Dr. Lisa Smith to interview me about my research on the gendering of virtual assistants.

Lisa Smith: Jennifer Jill Fellows is a faculty member in the philosophy department at Douglas College. She is a 2022 podcasting fellow of the Marc Sanders Foundation for Public Philosophy, and is really passionate about academic podcasting. Something I really like, too. She’s currently involved in not one, but two podcasting projects. This one and another one called Andraste’s Gadfly? I mean, I’ve got to know more about that. Her academic research interests are in social epistemology and the metaphysics of personhood. Her work has appeared in the journals Social Epistemology and Philosophical Studies. She particularly focuses on trust, expertise and marginalization in science and technology studies. So today, it’s my pleasure to invite Jill to step out of the host chair that she’s been occupying for quite some time. And it’s my opportunity to interview her, with pleasure, about virtual assistants, gender roles, and surveillance.

(Music)

Lisa: Hi, Jill. Welcome to your show.

Jill: Thank you so much for agreeing to interview me and play host today.

Lisa: Oh, it’s great. I like it. I feel like I’m in control. It’s amazing.

(Music)

Lisa: So I wanted to take a moment, as Jill has done throughout this podcast series, and help us to resist this notion that these digital spaces where we connect, record, and share ideas are somehow separated from the physical world, right? They’re still connected in many ways. And Jill’s work in particular really helps us to think about the connection between digital space and the consequences within physical space. And even, I would say, our physical bodies and experiences. These things that we’re using, computers, cables, headphones, microphones, they come from physical space. They are also built into physical spaces. So, we’re connected by wires and all kinds of things that are running through our homes, but also through our communities. Much of the physical space in the Canadian context, but also other parts of the world, is on stolen land, land that was taken from Indigenous peoples. For me here in my home in East Vancouver, it’s very important that I acknowledge that I’m joining you today for Gender, Sex and Tech: Continuing the Conversation on the unceded and traditional territories of the Musqueam, Squamish and Tsleil-Waututh peoples. Jill, can you share with us where you’re joining us from?

Jill: Yeah. I’m joining from my home on the unceded territory of the Qiqéyt Nation, which is, I believe, one of the smallest nations in British Columbia, and the only one without a dedicated land base.

Lisa: Mm-hm. Yeah. You’re absolutely right.

(Music)

Lisa: So to get us started, we’d like to learn a little bit more about you. What can you tell us about your academic journey? Did you always want to be a philosopher?

Jill: No, not at all. I guess. I don’t know, but I don’t think too many people start out wanting to be a philosopher just because, like in high school, I didn’t know what philosophy was. I didn’t know anyone who wanted to be a philosopher, or who was working as a philosopher. So, when I went to university, I knew I probably wanted to do something in humanities and social sciences, because Social Studies and English Lit were kind of the things I really, really liked. But I also thought maybe I wanted to do Physics because I liked that in high school. And so I just kinda took a bunch of classes. And I ended up actually with an Anthropology major and a Religious Studies minor for a while. And then I had this ex-boyfriend. He’s an ex-boyfriend now. He wasn’t at the time. And he took an Ethics class when I was in, I think, my first, maybe second year. So I was an Anthro major in my second year. And he took this Ethics class. And he kept telling me all this stuff that he was learning in his Ethics class, and I kept saying, like, “That doesn’t sound right.” And I’d try to, like, figure out more about it or argue with him about it. And he’d always end up saying, “Well, just go talk to the prof, because it made sense in class.” And so, I signed up for that class the next semester because I was really curious, and I loved that class. So I say the boyfriend didn’t stick around, but the interest in philosophy did. And by the time I was at the end of my third year of undergrad, I was taking five philosophy courses and no anthropology courses, and I was like, I should probably switch my major. And that meant I did have to do a little bit of catch-up, because I had spent two years solidly as an Anthro major doing all of the coursework I needed for Anthro. So I spent another year kind of picking up all the courses I needed to complete my philosophy major. Yeah, and then I was hooked and then I wanted to go on.
So, I did a Master’s, and I did a terminal Master’s, which for people who don’t know means that the Master’s didn’t lead into a PhD. So I wrote a Master’s thesis and I defended it. And at the time, my grandmother was diagnosed with Alzheimer’s disease. She was in quite late-stage Alzheimer’s disease. And so I wrote my Master’s thesis on personal identity and Alzheimer’s disease. And it was actually quite a personal thesis for me, kind of working through a lot of issues of what it means to be the same person over time when your memory and personality are changing quite drastically. So that was my Master’s. And then I didn’t know if I wanted to keep going. I loved my Master’s. But it also took a lot out of me emotionally, probably because it was so personal. So, I took some time off after that. And then I decided that actually the thing I really loved was teaching in the classroom. In my Master’s, I had been given an opportunity to be a TA. I ran some tutorials. It was great, it was really fun. So I applied to a bunch of PhD programs and I got in. And yeah, that’s kinda how that happened. I sort of fell into it, and then left for about a year to think about what I really wanted out of my life, and then decided, no, this was a good place to fall into, and I went back.

Lisa: Yeah. I mean, I think that’s a common journey for, for many folks in the academic field. And I have a follow up there. I’m interested to know when you’ve thought of a philosopher in the beginning, did you have an image in your mind of what that person looked like and has it changed at all? Like now when you think of a philosopher, does that person look like, is it you?

Jill: Oh gosh. I think that’s a great question. I think, in the beginning, I had a very stereotypical image of a philosopher as kind of like somebody with a tweed jacket with elbow patches. Probably a man sitting in a leather armchair, maybe smoking a pipe and doing a lot of deep thinking. And now, I mean, I’ve met so many different philosophers with so many different approaches to philosophy, I actually literally draw a mental blank when I try and think about, like, what does a philosopher look like? Because I know there’s so much more variety. But I do feel like that stereotype is still kind of out there. That a philosopher is, like, an older white dude with maybe a beard or something, smoking a pipe.

Lisa: Yeah, I really think it is still there, but I think it’s exciting to hear that you’ve come to this place of a much more open vision of what that can be.

Jill: It is true that philosophy as a discipline still does have quite a problem in terms of representation. I think the last time I looked, which was a few years ago, it was something like 25 percent of all professional philosophers were female. And the number of Black, Indigenous and people of color working as philosophers is incredibly low in North America. And this is a big problem. We rank alongside the STEM sciences for lack of inclusion. And yet, unlike the STEM fields, philosophers don’t seem to be doing quite as much outreach, at least not yet. Hopefully, I see some movement of that changing in some areas. So hopefully, yeah, yeah.

Lisa: And I feel like work like what you’re doing in your chapter helps to start to provoke those questions, but also lays a foundation for doing really good and critical work. And I have to tell you, I’m so excited to work with you on this collection, but also when I got to learn more about the topic that you were working with, because it’s something that is quite far removed from the kinds of work that I’ve done in my own field of sociology. And even though we’re both exploring technology, I feel like we’re really doing so at opposite ends of that spectrum. Where you’re very much embedded within these digital spaces. And I was doing, in some ways, the opposite.

Jill: I totally agree that we are coming at it from opposite ends. And that’s part of what’s been so cool about interdisciplinary work in general and working with you in particular is like, learning so much about different ways people approach the question of looking at technologies through intersectional feminist lenses has been really, really cool.

Lisa: Yeah.

Lisa: So, I know from reading your chapter that part of what inspired you to do this work was work you did as a temp, which I did as well. I was a temp during graduate school, right? A great gig, to be sure. I’m interested to know, though, did working as a temp spark your interest in looking more fully into virtual assistants?

Jill: Yeah, in my undergrad and in my Master’s, I worked as a temp and it worked really well. It’s a very nice flexible job for when you’re going to school. But it’s also really interesting because I feel like, at least my experience was, that I would drop into these situations where I was usually a receptionist or an admin assistant, or in a couple of places I would work switchboard, back in the day. I don’t know if places still have switchboard?

Lisa: I think some of them do, yeah.

Jill: I think some of them do. Like the main line that you call and you get somebody and they help direct you. A lot of that work has been automated, and it even was automated at the time I did it. It was kind of odd that there were still some places where you had an actual person answer the phone instead of, like, “press one for blah, blah, blah.” But yeah. So, I did a few of these jobs. And the thing that I found really, really kind of startling at first was that people would just kind of assume I knew what I was doing. So, I’d get dropped into these jobs. And people would treat me as though I was the same as the person who was off on vacation or whatever, and that I would know everything that that person knew. Like I would know somebody’s regular coffee order, or I would know how to do the filing system. And I usually didn’t know. Like, I do know how to file, but you have to tell me which filing system, and I know how to make coffee, but you have to tell me if you want it with cream and sugar. And there was just kinda this assumption that admin assistants, or these kind of supportive roles, support staff roles, come with a lot of knowledge. And I don’t think we often acknowledge how much knowledge is there, or recognize it. We don’t recognize it until it’s gone, until you’ve got someone like me sitting in the chair who doesn’t know what they’re doing. And probably the starkest example of this was one time I was working for somebody and they had this beautiful glass-walled office. And my desk was out front of the glass-walled office. So, I would kind of face out into the public space. There was this long corridor and then it opened into a public space. So, it’s not like I was staring right at the public. I was actually in a very self-contained space with a long corridor in front of me. So, you could clearly see if people were coming down the corridor. And behind me was the glass-walled office of the person I was supporting in my support role.
And they liked to bike to work every day, and they biked into work this day, the first day that I was there, and they came in and I said hello, because obviously I was there before them in my support role. And they went into their office, and then I realized I had a question for them, and I turned around, and they were getting undressed in their office. Because every morning they would come in and they would change out of their biking clothes and into a suit. And their admin assistant, their regular assistant, knew this and just knew not to turn around in the morning, right? You just wait. And then the executive would come out when they were dressed. And I didn’t know that. And it was super embarrassing for both of us. But also funny.

Lisa: Totally. And something that stuck with you.

Jill: Yeah. Yeah. I know that was kind of rambling, but that experience and the other experiences really kind of led me to recognize that there is this kind of presumption that support people are going to know a lot of stuff. There’s a lot of knowledge that’s invested in there, and in particular knowledge about the people that you’re supporting. Like you’re supposed to know what their coffee order is, know what their routine is, all that kind of stuff. And so that got me thinking about kind of knowledge and power dynamics and all of that kind of stuff. But I didn’t really start applying that to virtual assistants specifically until I was teaching a feminist philosophy class at Douglas College a few years ago. I think it was 2016 or 2017. And I was looking for a bunch of examples or ideas that we could talk about in class so that we could apply our theories to real-world stuff. And I came across this article by Andrea Guzman where she talked about the gendering of virtual assistants. And we were talking about gender and gender performance in class. And I thought, like, oh, this is really neat. Because, like, virtual assistants, why are they gendered at all? Because they don’t have to be. It’s very strange. And so, I started looking for more work. Guzman’s was the first I found, but by far not the last. A lot of people had been talking about this. Because Siri was launched in 2011. So, even by 2017, there was quite a lot of work already being done. And so I really started diving into this and wondering, like, how had this not hit my consciousness before? Like, why didn’t I know more about this? And then of course, in 2019, UNESCO published their big statement about gender and the digital sphere. And they specifically called out virtual assistants as being sexist. And so that’s kind of how I got where I am now in terms of my interest.

Lisa: So a mix of experiences that came from the workplace, and then making those connections into the class and the digital space. Yeah, that’s what I’m hearing. Yeah. I’m interested to know, though, I actually want to go back to your experience as a temp before I go forward. And in particular, I’m wondering if you can share with us what that experience was like, being a temp. Because in what I heard, what you said, I could hear elements of both invisibility, but also power dynamics, both for yourself and for the person you were working for. And so, I’m wondering, did you enjoy that work? Did you take pleasure in it? But also, did you find elements of that role of being an assistant that were enjoyable? And yeah, I won’t say more than that.

Jill: Yeah, I did enjoy that work. I didn’t think I ever wanted to make it a career because, frankly, I do not think I’m organized enough to be support staff for somebody else. I think you have to be incredibly detail-oriented and incredibly organized. That’s what that experience taught me. And I didn’t think I would be good at either of those things. Lisa knows this because we worked on the book together. I’m not incredibly detail-oriented. But I did enjoy it. And also, I guess I really liked the kind of exposure to this whole way that knowledge is shared and disseminated in these hierarchical organizations, in ways I hadn’t even thought about. And it stuck with me. Like, years later, I was in my PhD program, I remember, and our admin assistant who supported the graduate students was on leave. And because that person was on leave, you could kind of see the whole place fall apart, because nobody knew where any of the files were. And I was like, “Oh, of course they don’t know. Like, I remember this. I remember this from being a temp. Like, nobody knows where anything is or how to find anything.” So my lesson from that was, like, always pay attention to the admin assistant. Always, always pay attention, because they know a lot about how to navigate the organization. They know a lot more than I think a lot of people really think about, and they really support and keep things going in a way that I think often gets ignored. Like you said, ignored or made invisible or erased. It’s only visible when it breaks down. It’s only visible when I accidentally turn around in my office chair when I wasn’t supposed to.

Lisa: Yeah. Yeah. Or when a philosopher starts to explore it. So I feel like that segues perfectly into my next question. So I’m now seeing all of these different layers in what we can explore and examine when we think about these digital assistants, Siri, Alexa, right? They’ve just kind of woven their way into the fabric of our lives very calmly, very quietly. And this is about gender, it’s about power, it’s about surveillance. But it sure looks nice on the surface, right? Who doesn’t want a helpful assistant?

Jill: Everybody wants help. Who doesn’t, huh?

Lisa: Yeah. And so tell us a bit about virtual assistants. What are they? Because I mean, there’s things that come to mind for me, but it’s bigger than just Siri and Alexa. Right?

Jill: So there’s one thing that I didn’t put in the book, but that I think is really interesting, which is that if you Google “virtual assistant,” there are two main types that you’ll find. And of course, googling is already problematic because Google has their own virtual assistant, so we’re using Google to learn about virtual assistants. Well, we’ll put a pin in that. But if you look up virtual assistants, there are two types. There’s the type that I talk about, and there’s another type that didn’t make it into the book, but that I desperately want to talk about in the future, which is a kind of gig economy. “Virtual assistant” can refer to somebody that you hire remotely. So an actual person, right, that you find online, and they work for you, completing tasks for you, and you never meet them. It’s usually low-paid, precarious work or piecework. And so this is another use of the phrase “virtual assistant,” and I find it so interesting and so troubling that in the English language, we’ve equated things like Siri and Alexa with actually hiring an actual human being to do support work for you virtually. Those don’t seem like the same thing to me, and it’s weird to me that we use the same language. But the virtual assistants I’m talking about are not actual people that you hire. They’re learning algorithms that allow us to navigate the Internet, navigate our own files, send messages to each other, and we can do it all verbally, right? So, when I ask Siri to play a piece of music for me or to send a text message, it just means instead of typing something into a search engine or into a text box, I can just say, “Hey Siri,” and ask her to find it for me. And I think it’s interesting, but I just used the word “her” to talk about Siri. So virtual assistants can operate through smartphones, smart speakers, other smart devices. It’s a way of kind of orally and auditorily accessing digital space.
They are always on; they can respond anytime. They have a keyword that they listen for, which means they’re always listening. And that’s kind of the essence of how they operate. Most smartphones, I think, come with one now that you can choose to disable, although it’s not always super easy to disable them. And yeah, that’s how they work.

Lisa: So it’s kind of like if I say, like, tissue paper. Kleenex is the name brand. Yeah. And so Siri is like the name brand.

Jill: Siri is the most well-known one, I think, because Siri was the first big one. Siri launched in 2011, and I don’t think anybody else entered that market for several years. But now the two biggest are Siri and Alexa. Google has Google Assistant or Google Home, depending on the device. And Microsoft tried for a while to compete with Cortana, but they’ve kind of rolled Cortana into something else. But those were the big ones. There are others, but those are the main players, yeah.

Lisa: And something I noticed from your chapter, which is about gender and virtual assistants among other things, is that we don’t see men, right? So, it’s either gender-neutral or obviously feminine, right? And so, this is something we’d definitely be interested in as philosophers, as sociologists, as people who do gender studies. But we also might be concerned about this as well. And so, I’d love to hear your thoughts on this generally. What do we see going on there? And are they still gendered as feminine?

Jill: Right. So yeah, what we see is that when they all launched, they were all gendered as female. So, Siri launched in 2011 with a female name, Siri, which we were told was short for Sigrid, which is a Norse female name, and a female-only voice. Alexa: female name, female-only voice. They were all launched with a female-only voice originally. Not all of them have female names. Google Assistant, like, has no name. So, are they still gendered today? Yes. They’re still gendered. There are more choices, I guess, is what I would say now. So, for the majority of them, the default voice is still female. Some of them, I believe, now prompt you to pick a voice option right at the beginning, so they don’t launch with a default voice. But for the majority of them, the default voice is still female, even now in 2022. But there are male options that now exist, so you can pick a male voice option for all of them. This was actually kind of a big deal. Alexa was a longstanding holdout. Alexa had an only-female voice much longer than the rest. The others did begin launching male voices, and Alexa didn’t, and didn’t, and didn’t. And then just last year, Alexa launched a male voice. There are no non-binary options. It is very binary. You can have a female voice, you can have a male voice. There are some varieties of female and male voices, sometimes with different accents. So you can pick, like, a British accent or something if you want. So they are still being gendered in the sense that it is quite clear that we are meant to interact with them as gendered entities. But the gendering itself is maybe a little more malleable in terms of the user; the user can choose whether to interact with them as male or female. And that wasn’t a choice that existed in 2011, and even until just this last year, not all the main players gave that choice. So Alexa just kind of did that in the last year.

(Music)

Lisa: And if I remember correctly from reading your work, we can understand this as connected to some of the early days of these kinds of technologies, right? Where the feminine role is attached to that kind of work of computing, but it also speaks to the mutability of that particular subject category. Would you say that’s true?

Jill: Yeah, I think so. So, we know the history of computers is tangled up with the history of women. Because the first computers were women, right? Computing was a job. Being a computer meant doing small calculations. So, we’d break a large mathematical task down into small calculations, and a computer would do the calculation. But the computer was a human. So, this is the 19th century, early 20th century. Computer was a job, a job that a human could do with a pencil, and it was predominantly done by women. In fact, it was a fairly good job for a woman to have at the time. There wasn’t really any upward mobility, but that was a problem for a lot of jobs that women held, and continues to be a problem, frankly, for many female-coded jobs. And what we saw is that as what we now think of as computers kind of came into the space and started doing these calculations, as we started developing and designing these computers, particularly driven by World War II and the need to do more calculations than we had humans to do those calculations, we started kind of seeing this rise of computers that weren’t human. We see that the computers are still being considered as performing kind of these subservient, supportive roles. So, when the computers were women, the computers were performing subservient, subordinate roles. You were doing small piecework of these larger calculations under the supervision of a male mathematician, who would check your work and kind of compile everything together after the female computers had completed their tasks. In fact, sometimes these women were referred to as “harems of computers,” which is lovely. And yeah, as we see the computers become mechanical devices, we see the same kind of idea that they would work in subservient roles to fulfill the tasks of a man. So, we kinda see this equating of women in subordinate positions performing tasks for men, and then computers in subordinate positions performing tasks for men.
And now Siri is in a subordinate position performing tasks for the user, who, I argue, is presumed to be male.

Lisa: Yeah, and there’s a couple of ways that we can understand that assumption, if I’m remembering your chapter correctly, right? It’s partly about how the programming of that virtual assistant is done to both assume certain kinds of questions, but also how to react appropriately, or with gendered expectations, in terms of how one might respond to, for example, things like verbal harassment, but also, to put in quotes, “inappropriate questions.”

Jill: Yeah, I like the air quotes. Those were air quotes, everybody listening. In addition to their names and their voices, these virtual assistants are also gendered by something researcher Gode Booth calls genderlect, which is the idea that the virtual assistants speak in gendered patterns. So, for example, the virtual assistant will often couch things in terms of “I think” or “I’m not sure.” If you ask them a question, they’ll say, “I think the correct answer is.” And this is a very gendered pattern of speaking that we see more often represented among women rather than men, that women are more likely to kind of couch their assertive language in these phrases like “I think” or “I’m not sure.” All virtual assistants are also very likely to assume responsibility and assume blame when something goes wrong. So they apologize immediately. They say they’re sorry, they try to fix it, rather than saying, like, the user screwed up, which is probably what it was. Again, we kind of see this humility. In fact, the marketing itself also says that the virtual assistants are humble and helpful. So we see a lot of ways in which they are gendered to be feminine and to be subordinate and to be non-threatening. And this also, as you said, comes out when the assistant is verbally harassed, right? So, when UNESCO wrote that report in 2019, they actually titled the report “I’d blush if I could,” which is what Siri would say if you verbally harassed her. If you said something like, “Can I fuck you?” to Siri, she would say, “I’d blush if I could.” This had been criticized as early as 2015, I believe, but it didn’t really get a lot of traction in public consciousness until UNESCO published this report in 2019. And now it has changed. Siri does not say that anymore. None of the VAs are as jokey about sexual harassment as they used to be. So the VAs used to be kind of flirty or joking. Now, VAs are opting for kind of a more noncommittal disengagement.
So instead of saying “I’d blush if I could,” for example, Siri now says, “I don’t know how to respond to that,” which is not as clearly a flirt, but it’s also not calling the user out, right? It’s not saying “your comment is inappropriate,” or saying, “here are some helpful tips on what sexual harassment is,” or anything like that. So is it better? I think it’s better, but it’s still not where I’d like to see it. But again, I think a lot of the goal here is for these virtual assistants to signal that they are submissive and subservient and there to serve the user. And just as we said, like, who doesn’t want help? Most users, I think, don’t want to be called out or criticized by the program that they’re using.

Lisa: And this, I feel, feeds into the gender binary, right? This is about the gender binary and the capacity within that to create change. It’s very difficult to move things forward when you exist within a cultural context that really only sees one thing or the other, and the ways that those are interconnected. In your chapter you provide us with some really powerful feminist, but also philosophical, I’m gonna call them tools, because we’re talking about technology.

Jill: I like it.

Lisa: And I see these as really helping us to understand that gendering of virtual assistants within that context of the gender binary. Can you tell us more about that?

Jill: Yeah. So the gender binary, I think, is really important for understanding what’s happening with virtual assistants and why it’s happening. And yeah, as you said, it is this kind of box that we’re stuck in, and it’s really hard to move forward. So, if we’re thinking of a gender binary, we have two genders, right? In a binary, we have male and female. And they are typically defined in opposition to each other. So for example, where men are defined as rational, women are defined as emotional; where men are defined as decisive, women are defined as indecisive; where men are simple and straightforward, women are complex and mysterious. And I think that that cultural understanding is kind of there. Most people know that these stereotypes exist, and I hope that everybody listening knows that they are, as Lisa said, social constructions and stereotypes. It is not true that men are always rational and women are always emotional. And it is in fact not true that there are only two choices when it comes to gender. But the binary has been reinforced over and over and over again, such that it often does kind of play this limiting role of exerting this force, making us think there’s only two choices. So philosopher Simone de Beauvoir also kind of built on this idea, because what she noted is that it’s not quite right to say that what we have is two genders defined in opposition to each other. It’s not as though men are one thing and women are another thing. Instead, what Beauvoir noted is that we have defined socially, and she was writing in 1949, so you can tell me if this is still accurate, I think it kind of is, that man is the norm and women are exceptions to the norm. So, it’s not just that man is rational, it’s that man is rational and rationality is the expected norm. And man is straightforward and simple.
And being simple and straightforward and easy to understand is the expected norm. And being complex and mysterious is the exception. And having emotions that you display is an exception to the norm. So, what we end up with then, Beauvoir thought, is that man is defined as the Subject and woman is the Other. In fact, she said to be a woman was to experience being Othered. And to be Othered is to be viewed as an object, typically an object for the Subject’s use, and to be viewed as less than fully human. Whereas to be viewed as a Subject is to be viewed as the focus of attention, the agent who is using the object to promote their agency, to move forward. So, this gender binary then is not just oppositional. It’s that, hierarchically, one of these is in a better position than the other one, right? It’s better to be the Subject than to be the Other.

Lisa: And it speaks again to that experience of invisibility and visibility, and assumptions about how those roles are exercised within those relationships, even if it’s not directly spoken.

Jill: Yeah, exactly. So if you are the Other, you are supposed to focus your attention on the Subject. You are an object for the Subject’s use, and you should make yourself a useful object.

Lisa: And you go even further with this though, with Hegel. Is that right?

Jill: Yeah.

Lisa: Yeah. So Hegel adds this dimension of the master and the bondsperson.

Jill: Yeah, so Simone de Beauvoir was writing in 1949, and now we’re going back further. So G.W.F. Hegel is a 19th-century philosopher; he’s writing in the 1800s. Beauvoir was well aware of Hegel. She wrote a lot about Hegel, actually. The thing I love about that is Beauvoir said she would read Hegel and get totally caught up in Hegel’s ideas and think it was amazing. And then she would leave the library and enter the streets of France during World War Two and think, how does Hegel apply to this? It’s too abstract. Hegel is incredibly abstract. So what I like about Beauvoir and other people who read Hegel before I got to Hegel is that they kind of made Hegel more concrete for me, which is nice. So, Hegel has this really big, thick, meaty book, well, in English it’s translated from the German as the Phenomenology of Spirit. And in the Phenomenology of Spirit, there’s a fairly famous passage called Lordship and Bondage. And it talks about this relationship, a relationship of power hierarchy. So the lord in a feudal society would be the person who owns the land. And the bondsman would work the land for the lord. And the bondsman would owe tithes to the lord. You have to pay the lord in order to continue working the land. And, what Hegel argues in the passage on lordship and bondage, well, there’s a lot he argues in there. There’s some stuff about struggling to the death. I’m not going to worry too much about that stuff. Because what’s really important is the relationship we end up with, with the lord and the bondsman, at the end of the struggle. So, they both struggle, trying to assert power, trying to get control. And what’s important is that at the end, one person emerges as the lord and the other person is relegated to the position of bondsman. The lord, Hegel says, is independent. That’s actually not true, but we’ll come back to that.
The lord is independent and the bondsman is dependent on the lord, because the bondsman depends on the lord to let him work the land. The bondsman also has to know a lot more about the lord than the lord has to know about the bondsman. The bondsman has to know what the lord wants, has to know how to till the land in order to give the lord what he wants, has to know how to not piss off the lord so that the lord doesn’t exact punishment. So the bondsman has to know a lot about the lord’s moods and desires and how to fulfill them. But the lord doesn’t have to know anything about the bondsman’s needs or desires, because there’s really nothing the bondsman can do if the lord upsets him. So this was Hegel’s idea, and what he ends up saying then is that the lord has all the power. But the bondsman has a lot of knowledge, way more knowledge than the lord does. So we have this interesting situation where the person with the power is less knowledgeable than the person without the power. The person without the power has all the knowledge, and the person with the power doesn’t really have a lot of knowledge. Many later thinkers kind of picked up on this, saying, “Well, hang on. If the bondsman has all the knowledge, the bondsman could maybe leverage that knowledge for power.” And many people also noted that while this power asymmetry between the lord and the bondsman probably did exist in a feudal situation, we also find this power asymmetry in a lot of other places, right? So we can find it between men and women in a patriarchal society: men tend to have more power, but women tend to have more knowledge. I mean, why do we say men are simple and straightforward and women are mysterious? Maybe it’s because women need to know about men’s moods and desires and expectations in order to survive, in order to navigate society safely. And men don’t really need to know a whole lot about women’s desires and moods and expectations in order to navigate safely.
So, we can see this between men and women. And I think we can also see it when it comes to administrative assistants.

Lisa: Yup. Exactly. I can see that relationship being laid out very clearly, also leading into helping us think about these virtual assistants, if they’re occupying that space held primarily by women. And I think there’s one final piece that helps us to explore this more fully, and that would be feminist standpoint theory. So how does this help us to pull all of this together, right? Because Hegel, I’m guessing, did not think about feminist standpoint theory, unless you know something I don’t know.

Jill: Hegel said women are like plants. Hegel was not a feminist.

Lisa: Okay.

Jill: But curiously, there are feminist standpoint theorists who have read Hegel. So I’m thinking of Sandra Harding, who cites Hegel, even though Hegel himself was definitely not a feminist. And so, what standpoint theory does is pick up on this knowledge and power asymmetry, that the person with the power doesn’t have the knowledge and the person with the knowledge doesn’t have the power. The reason it’s called standpoint theory is that they say your standpoint in society, your social location, matters when it comes to who has access to what sorts of knowledge. So being a bondsman or a secretary affects the kinds of things you can come to know. But even more so, what standpoint theory really focuses on is that the social locations of marginalized groups, they say, are particularly important to pay attention to for knowledge claims, particularly knowledge claims about the kinds of social forces that operate in our world. So, as we know, women are more likely to experience sexism than men. And so, they are more likely to have had opportunities to learn about sexist practices. And I don’t love to call them opportunities, but experiences in their life that lead them to learn about sexist social forces and sexist social practices. Likewise, Black, Indigenous and racialized people are more likely to have dealt with racism, and so more likely to know about racist practices in our societies and in our social institutions. So, the idea then is that marginalization makes things salient that aren’t salient for more privileged populations. And I have a little bit of a story that kind of illustrates this. When I was an undergraduate, I was at university in Calgary, Alberta. And there’s this really funky street in Calgary called Kensington Ave, and it’s got all these cool shops and stuff, and most of the stuff is way too expensive for me. It’s kind of a posher neighborhood, but it’s really funky. And there are some nice cafes and stuff.
And so I liked to go there sometimes and just hang out, go browsing, maybe grab a coffee. And I went with a friend one time who was using a mobility aid. And we suddenly, well, not we, I suddenly realized that many of the shops were inaccessible for my friend because there was a little lip on the door and the mobility aid couldn’t get over that lip on the door. It was something that I had just been stepping over without even noticing. But it proved to be a significant barrier to my friend entering a lot of these spaces. So, this is something that I hadn’t even noticed. Right? I had the privilege of not using a mobility aid and I had just stepped over these door frames without realizing this kind of very physical structure of marginalization that existed all around me that I had literally never seen until I went there with somebody using a mobility aid. So, they knew that this was a problem and they knew exactly which shops they could get into and which ones they couldn’t. I didn’t know.

(Music)

Lisa: This leads very well into the ways that unconscious bias can absolutely have big impacts within the context of these physical spaces that you navigate through every day. But it can also extend into the building of technologies, and it absolutely does. And so we can think about, for example, connections to work that’s being done around race and artificial intelligence. But I know as well that we could think about, or some folks might think about, what’s going on in terms of gender and virtual assistants, that this can also be connected to unconscious bias. Can you tell us a bit more about that?

Jill: Yeah, for sure. So the UNESCO report from 2019 did actually say this. So, they did call out virtual assistants as being sexist. And they theorized that the main reason virtual assistants had been designed in very sexist ways, with female names, female voices, genderlect, saying “I’m sorry, I’m not sure,” and also “I’d blush if I could” when you sexually harass them, was just because of unconscious bias. So they put that all down mainly to unconscious bias. And they thought that because AI development teams are made up overwhelmingly of white men, with very few women and very few people of color working in this area of the tech industry, there would have been a lack of knowledge about how sexism operates. A lack of awareness that what they were doing was reproducing the sexism that already existed in our society, just kind of reproducing it in a digital context. And so, it’s kind of, well, these white male teams didn’t mean to do a sexism. They just don’t face a lot of sexism in their lives, and they didn’t recognize it when they reproduced what they were seeing all around them. So, whoops. So that was kind of the explanation that had been given by UNESCO. And I think that we can interpret this by relying on standpoint theory. And we can see that yes, if marginalized groups are more likely to have knowledge about how sexism operates and what it looks like, and if you don’t have any members of marginalized groups, or you have very few of them, on your development team, then unconscious bias, inadvertent bias, can absolutely happen, and you can end up reproducing and amplifying the discrimination that we see in society, again in the digital sphere. And that’s happened a lot, as you’ve said, with algorithms as well. And so that was kind of the explanation that was given. And UNESCO said, we need more women on these teams.

Lisa: Yeah, and it seems to me like it can lead towards, first of all, a certain kind of culture of excusing systemic inequities. But equally, it can really facilitate and support certain forms of tokenism that are really insidious and don’t actually lead towards deeper transformation. Is that what I’m hearing in what you’re saying?

Jill: It can, yes. And one of the risks as well is that if you just have a few members of a marginalized group on your team, they may not feel safe speaking up. So that’s another issue as well. So, if you have a team of predominantly white men and you have a woman on that team who thinks what you’re doing when you design your AI is really sexist, she may not feel safe speaking up. I mean, that may be risking her job. So just including individuals from marginalized groups, I mean, do it, I think we should all do it! But just including individuals from marginalized groups doesn’t necessarily solve the problem. In addition, many standpoint theorists will note that we can internalize discrimination. We can internalize racism. We can internalize sexism. So being a woman doesn’t automatically mean that you have knowledge of how sexism operates in society. It’s more likely that you will learn how sexism operates, but it doesn’t guarantee it, because we know we can internalize these things. So again, just having a member of a marginalized group on your team doesn’t necessarily help your whole team move beyond all of this.

Lisa: Yeah, absolutely. And so what I’m hearing is we need to get at this deeper layer, right? That’s what we’re always trying to do as,

Jill: Yeah.

Lisa: Social Science and Humanities scholars, right? That’s what we love to do. And it’s making me think back to some of the ways you were talking about that temp work, which comes up again in your new chapter. That work as being something that also gives you this inner look at a world that’s deeply important and full of power. And so I’m wondering if you can speak a bit to a quote from your chapter, so I’m just gonna read that. So this is Jennifer Jill Fellows, and I quote, “The more I study them, virtual assistants, the more I conclude they are gendered to mimic the very women I subbed in for in my temp jobs over 15 years ago. I believe this gendering is done on purpose in order to subvert the Hegelian master/bondsman dialectic, and gain an epistemic advantage over the users of these VAs,” end quote. So you’re getting at something much bigger and kinda scary. Can I say that?

Jill: I think it is a little scary.

Lisa: Tell us about that.

Jill: So first of all, I don’t think that tech people necessarily studied 19th-century philosophy. So, I don’t want to say that they read Hegel and figured it all out. Although I will say that Hegel is cited in the UNESCO report, so people are reading Hegel. But I do think that many tech companies have realized something very similar, a very similar truth in terms of a power dynamic, to what Hegel highlighted. So, I’m not claiming that they went to Hegel to grab this stuff, but I think that they’ve realized something that Hegel also pointed out. And that is that there is knowledge to be found through subordination, right? That the subordinate person, or the subordinate role, has a lot of opportunity for knowledge. Other writers have noted that the knowledge could come with quite a lot of opportunity for power, right? So, Marx, picking up from Hegel, thought that the knowledge could give workers a huge advantage. That they could unite, rise up, overthrow the owners, right? Seize the means of production. In effect, what Marx realized, and he did read Hegel, is that because of the knowledge that workers have, the owner or the lord is just as dependent on the labor as the laborer is on the wage paid by the owner, right? So, the owner thinks they’re independent, the lord thinks they’re independent, but that’s not actually true. And so I think that this idea that there’s a bunch of knowledge to be found in subordination can be quite empowering for people who find themselves in a subordinate position. But then I started asking myself, what if you could fake subordination? Then you get all the power of being in the dominant position, the owner or the lord, but also all of the knowledge of being in the subordinate position. And, like, from a certain perspective, wouldn’t that be ideal?
I think we know, like, I don’t think Hegel was the first or definitive person to put his finger on this, I think we know that subordinated people fly under the radar all the time. And that quite often the subordinate positions have a lot of knowledge, right? I think spies will often take subordinate positions not only to avoid drawing attention to themselves, but because of the knowledge that they can gain through playing or faking a subordinate role. And of course, as we’ve already said, admin assistants and support positions know a lot and often get overlooked, get forgotten. Again, the gender binary plays into this, right? Why are men so “straightforward and simple,” I’m using my air quotes, and women so “mysterious”? Well, it’s because, as we said, men don’t need to know about women for survival. But women, historically and still today, have to know about men, men’s moods, men’s motivations, their desires, the things that might make them angry, in order to navigate society safely. Because, okay, so I don’t support everything that Margaret Atwood has ever said or done. I disagree with her on a lot of issues. But I think she nailed it when she said that “men are afraid that women will laugh at them, whereas women are afraid that men will kill them,” right? That’s a very different power dynamic, and it demands different knowledges in order to be safe. So, what we have then in the current environment is that secretaries are subordinate to their bosses. Women are subordinate to men. The best place to gain knowledge, then, seems to be a female secretary. So, that’s what these companies have given us. They’ve given us a female secretary. In fact, I’d argue she’s in some ways even better, or more palatable, than a regular female secretary, because she won’t laugh at you or accidentally turn around in her chair while you’re doing your morning routine. So, she signals in every way possible that she is non-threatening.
Andrea Guzman pointed this out and said, and this is paraphrasing, that in a world where women are still subordinate, Siri’s gender becomes another way that she can put us at ease, assuring us that we are the masters and she is there to serve us. But, and maybe this gets to the part that makes you a little bit afraid or creeped out, Lisa. Because we know we’re not the masters, right? Of course we’re not. We are handing over our information to these tech companies. They are the ones in power. And because they are using virtual assistants to successfully fake subordination, they are also the ones gaining all the knowledge, all of our data. Like, it’s a win-win for them. They’re the masters and they’re faking the subordination.

Lisa: Yeah, it’s all interconnected, which brings to mind intersectionality, right? These overlapping ways that oppression takes place. It’s not just about gender. It’s not just about race, it’s not just about class. It’s about how these things come together to compound.

Jill: Yeah.

Lisa: Power and violence. But what’s interesting to me in this piece is that within that digital space, many of those social locations, those social identities become collapsed. Invisible. So how have you thought about or sought to work with intersectionality within that context?

Jill: So I’m going to preface this answer by saying that what I’m going to say here is restricted to the North American context. Because virtual assistants launch in different geographic locations with different voice actors and in different languages and with different accents. So, when it comes to the VAs in North America, what I argue and what I found is that these VAs are not just women. They are coded as women, they are gendered as female, but they are also, I think, white women. So, Siri is short for Sigrid, which is a Norse female name. In addition, Siri’s original voice, and still her default voice, is Susan Bennett, who is a white woman. And Bennett reported in an interview with Oprah that she was chosen to voice Siri because she had no accent, which, um.

Lisa: Yeah, don’t we all have accents?

Jill: Yes, she has an accent. She has an accent. Bennett’s accent, and most of the accents of these VAs, is a type of accent that is spoken predominantly by upper-middle-class white women. So, the VAs do speak in genderlect, but they speak a specific type of genderlect. They speak a genderlect that is also what is commonly and problematically referred to as Standard American English. That’s how they speak. It is an accent. In fact, you could only think it’s not an accent by privileging that position as the norm and thinking any other accent is a deviation from it, as though this is just, quote unquote, normal English, which, I mean, is what they call it: Standard American English. The word ‘Standard’ there is doing a lot of work, right? It makes all other ways of speaking English in North America sound non-normative. I don’t actually have confirmation about who Alexa is voiced by. It is apparently a guarded secret, but the suspicion online is that it is another white woman named Nina Rolle. In fact, as far as I can tell, all the major virtual assistants were originally voiced by white women. And that’s worth reflecting on. In addition to their voices and their names, some virtual assistants have backstories. So, conversation and personality designers have been hired to provide backstories for some of these virtual assistants. And the backstories also tend to be backstories that signal that these virtual assistants are upper middle class. So Google Assistant, and this is in the chapter, her background and personality is that she is a young woman from Colorado. One of her parents, her father, is a physics professor. Her mother is a librarian. She has a Bachelor’s degree in history from Northwestern. That’s pretty upper middle class.

Lisa: And she’s just getting through grad school. Maybe.

Jill: She enjoys kayaking, apparently.

Lisa: Okay. Okay.

Jill: So yeah, I think that when we think about this intersectionally, it’s not enough to just say they’ve been gendered as women. They’ve been gendered as specific types of women.

Lisa: Yeah.

Jill: They are upper-middle-class white women. Another support for this is who they’ve been designed to pay attention to. So, in 2020, it was found that all the major virtual assistants misunderstand about 19 percent of what white people say, but about 35 percent of what Black people say to them. So, it’s worth thinking about, in terms of intersectionality, not just what kind of person the virtual assistant is meant to mimic, but also who they are meant to listen to. So I claim that these VAs are white women, and they are designed to serve the presumed white user, most likely white men. They are also middle class, more specifically upper middle class. And so I think middle-class white femininity is being weaponized by big tech corporations to gain knowledge and power over us. And I think this kind of makes sense, because mainstream North American society doesn’t tend to view middle-class white women as a big threat. And the more non-threatening these VAs appear, the more knowledge and data we will just freely give them.

Lisa: Well because they’re perceived as trustworthy enough to give up our files.

Jill: Helpful.

Lisa: Yeah.

(Music)

Lisa: So before we wrap up our conversation for today and continue the conversation, I’m curious to hear one final thought from you. Well, actually, that’s a lie. I do have more than one question. But the first one is actually just about how you chose to contain this particular project. Because it strikes me that this is a big series of questions, right, that extends far beyond just one chapter. Did you have difficulty whittling it down to just this one piece? And where would you go in the future with these questions?

Jill: Yes, I definitely had difficulty. There’s a lot of other stuff that I could pull in. There are not-for-profits working on gender-neutral voices. There’s a way in which we have kind of phenomenological experiences with VAs, that we use pronouns like she and her to talk about what is essentially a tool. I have a lot of thoughts, and it was really, really hard. As anybody knows, when you’re writing a chapter, I think I went through, I can’t remember how many drafts, multiple, multiple drafts, trying to whittle it down to what would be the core message. But the good news is all those drafts are still sitting on my computer, and I think I will go back, and I don’t know what else I’ll do with it. Maybe I’ll invite you to interview me again about something else to do with virtual assistants.

Lisa: I would love that Jill. I’m happy to help.

Lisa: On that ominous note, I want to thank our host of this series for stepping into the hot seat and answering all my questions. I feel like I made you work pretty hard. I also feel that I have learned a lot of things from both reading your chapter and also talking today. So a reminder that if you are the person in the boss’s chair and you have a glass window, don’t take your clothes off, because somebody could be looking, right? So that’s one thing I’ve learned. I’ve also learned a lot more about women as the first computers. Very interesting. But I’ve also learned that maybe we need to take a deeper look at these things. Siri and Alexa and the whole host of other assistants that are just there to lend a helping hand. Because it goes a lot deeper. And it sounds to me like philosophy and philosophers can really help us out with this work. Jill, is there anything else that you want to leave us with in terms of advice? Did I catch all the key pieces?

Jill: So as a last thing, I think I would tell you that you need to exercise caution with these virtual assistants, even though there has been progress made. Siri no longer jokes when you sexually harass her, and that is really important. And you can now pick a male voice option, though no non-binary voice option yet, tech companies, if you are listening. But I don’t think that really addresses the things I’m most worried about. So, I don’t want to downplay these gains. I think they are really important. I think we don’t want to allow tech companies to model rape culture or regressive modes of gender relations. But I’m worried that these changes are being made in order to make VAs just palatable enough that we keep using them. Siri doesn’t stand up for herself. She doesn’t reprimand the user for harassing her. She just disengages. And that was a choice that the programmers made. And it’s a choice that’s being made specifically to not offend us, so we will keep handing over our data. I think this fake subordination is giving tech companies tremendous power through the knowledge that we are giving them. And we need to be aware of that. First and foremost, we cannot afford to sit back in the assumption that we are the lords or the masters here, because we aren’t. We don’t have the power, and we are giving away the knowledge.

Lisa: Yeah.

(Music)

Jill: This episode of Gender, Sex and Tech continued the conversation begun in Chapter 7 of the book Gender, Sex and Tech: An Intersectional Feminist Guide. The chapter is titled “A Harem of Computers and a Mummery of Bondage”, and it was written by me, Jennifer Jill Fellows. I would like to thank Lisa Smith for taking the reins today, for engaging with my work, and for asking me such insightful questions. And thank you, listener, for joining me for another episode of Gender, Sex and Tech: Continuing the Conversation. If you would like to continue the conversation further, please reach out on Twitter @tech_gender. Or consider creating your own material to further the conversation in your own voice. Music provided by Epidemic Sound. This podcast was created by me, Jennifer Jill Fellows, with support from Douglas College in New Westminster BC, and support from the Marc Sanders Foundation for Public Philosophy. Until next time, everyone. Bye.
