
Transcript for Minisode Musings: Holding on in Digital Space

Ever wanted to know more about me? Why not ask ChatGPT? Actually, let me save you the trouble. Here's what it has to say:

It is possible that Jennifer Jill Fellows is a private individual with no public profile or notable achievements.

Ouch! I mean, accurate, but still! Kinda bluntly put there, ChatGPT.

But of course, you can ask ChatGPT about celebrities and other people with large public platforms, and it will have a lot to say. Not all of it will be accurate, mind you, but it will have a lot to say.

And, for private individuals with no notable achievements like me, there are other options as well. You can use generative AI to develop your own personal chatbot that will know who you are, or at least know what you tell it about who you are. And if you aren't particularly tech-savvy, you can pay for various services offering chatbots built on generative AI that will interact with you in almost any manner that you wish.
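For the tech-curious, here is a rough sketch of what such a personal chatbot amounts to under the hood. To be clear, this is illustrative only, not any particular company's product: it assumes OpenAI's Python client, and the persona text and model name are placeholders I made up.

```python
# A minimal sketch of a "personal chatbot" built on a generative-AI API.
# Everything the bot "knows" about you is just text you hand it up front.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical persona: the bot's entire "knowledge" of who you are.
persona = ("You are a warm, supportive companion for Jill. Jill teaches "
           "philosophy, hosts a podcast, and drinks a lot of coffee.")

history = [{"role": "system", "content": persona}]

while True:
    user_msg = input("You: ")
    history.append({"role": "user", "content": user_msg})
    # The service, not you, decides how (and whether) the bot replies.
    response = client.chat.completions.create(model="gpt-4", messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("Bot:", answer)
```

Notice that the bot "knows" who you are only through the persona text and the accumulating chat history, and it exists only for as long as the service behind that API keeps answering.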

Until they don't. Until the companies offering these services go out of business, or change their business model.

Then what happens?

Hi everyone. Welcome to a bit of a different episode of Gender Sex and Tech: Continuing the Conversation, one that I'm calling a minisode musing. This minisode is part of The Big Rhetorical Podcast Carnival 2023, and the theme this year is Artificial Intelligence, Applications and Trajectories. I'm your host, Jennifer Jill Fellows. But I don't have a guest today. Instead, I just want to share some of my thoughts and musings about generative AI and personal identity. I've been thinking a lot about what it might mean to interact with AI as we do with our friends and lovers. What might that mean for our sense of self? And what sort of ethical responsibilities might the companies who create these AI tools need to consider?

So, it's just me today. Okay? Here we go.

Before we begin, I want to pause to acknowledge that digital space is physical space. ChatGPT is hosted on servers that are created using physical resources extracted from the earth. It is maintained through energy consumption, which is often reliant on oil or coal. And current estimates are that for a simple conversation of 20 to 50 questions and answers with ChatGPT, roughly a 500 ml bottle of water is needed to cool the system. This is to say nothing of the vast amounts of water that were needed simply to train ChatGPT before it was ever launched, an amount estimated to be enough to cool a nuclear reactor. As such, though the illusion is that these tools exist in some nebulous digital space divorced from reality, that is not the case. They are physical and they have a physical impact. So, I want to recognize that today, I am recording Gender Sex and Tech: Continuing the Conversation on the unceded territories of the Coast Salish People of the Qiqéyt Nation.

So, I did my MA thesis, back at the University of Calgary, on personal identity. I was curious about what made us what we are, and what, if anything, sustains our personhood as we grow and change over the course of our lives. Like, in many ways I am not the same person I was when I was 6, or 12, or even in my 20s. So much life has happened to me since then, and I've read so much, seen so much, and experienced so much that I have changed my views on a number of different issues. I also don't look the same anymore either. My hair is cut short, something my 6-year-old self, who was obsessed with Rapunzel, would have been appalled at. Years of staring at books and screens means I now wear glasses, which my 20-year-old self did not. And there are other changes, both physical and psychological, that have happened throughout my life. But I'm still the same person in other ways. I am still my parents' child. I still have many of the same characteristics, personality traits, and behaviours that I did as a child. Many of my tastes and preferences are still the same. And the high school diploma and university degrees that bear my name still belong to me, even if the me who now exists views the world differently than the me who earned those pieces of paper. Like, that's just common sense, right?

So, I wondered in my MA, and I still sometimes wonder now, what made me me?

This question took on extra significance for me in my MA program because my grandmother was living with late-stage dementia at the time. There were some in my family who said she was so radically changed by this disease that she was not the same person anymore. There were other people who thought that there was a continuity of personhood. I turned to philosophy to try to help me sort out the truth. And in my MA program, I didn't find a solution that made sense to me.

But philosophers didn't stop working on personal identity just because I moved on to a PhD in philosophy of science and moved away from metaphysical questions of personhood, obviously. And I kept reading the odd article here and there, keeping up on developments in the sub-field. And in 2015, philosopher Hilde Lindemann published a book called Holding On and Letting Go, which really changed my thinking about personal identity.

In the book, Lindemann argues that none of us actually achieves personhood alone. Instead, we are brought into and held in our personhood by a community of others. In effect, personal identity, what makes me me, and what makes me now the same person I was at 6, is a community endeavour.

So, okay, what exactly does that mean, right? Lindemann develops a narrative and relational account of personal identity. The idea, basically, is that you become who you are through the relationships you have with others, the stories they tell about you, and the stories you tell about yourself.

Lindemann isn't the first philosopher to think that stories have a lot to do with personal identity. And it's true that when we are getting to know someone new, one of the ways we often try to get to know that person is through stories. We go for a coffee and listen to the stories they tell about themselves. Stories about their hobbies and interests and work. Stories about family, adventures they've been on, or what their childhood was like. These stories form what Lindemann calls the "narrative tissue" of an individual's personal identity. And I love the metaphor of tissue. Tissue is flimsy and ephemeral in some ways. It can tear and break easily. And individual stories are like this as well. Stories can represent us better or worse, and no one story about who we are is strong enough to anchor our sense of self. But layers and layers of stories, layers and layers of narrative tissue, can form a strong cocoon of identity around us, anchoring us in our personhood.

This means that the stories we tell about each other matter. A LOT! And this is especially true for those moments in every individual's life where they cannot hold themselves in personhood. So, let me give an example. Here's a story about my own life that has been repeated to me over and over by my parents. It's the story of when I ran away from home.

I think a lot of kids have a moment where they might contemplate running away from home. And some kids might have a time where they actually attempt it. So, my story begins when I was three. My family lived in the countryside at the time. Not in the suburbs, but genuinely in the country. We're talking no street lights, gravel roads, and we didn't even have a private landline for our phone. My family was on a party line, shared by everyone else on our street.

At this time in my life, my daily routine looked something like this: my mom and dad would get me up in the morning, get me dressed, comb my hair (something I always detested even as I insisted on having long hair that easily tangled), and then we would all sit down to breakfast. I was an only child at this time, my younger brother not yet born. And after breakfast I would be bundled into the car, usually by my mom, who would then drive me to the babysitter's house. This was about a 30-minute drive. And after my mom dropped me off, she would go to her job in the city. My babysitter had a little girl about my age, and we are actually still friends. So I would play with this friend all morning, and have lunch with her, and then my mother, who worked half-days at the time, would pick me up.

With me so far?

Okay, so how does this matter for my running away from home?

So one evening I got in a fight with my dad. No one can really remember what this fight was about, but I was mad. I mean really mad. And I told my parents that I was going to run away from home. My parents responded by asking me what my plan was when I left home. And I said I was going to go live with my babysitter and her daughter. So my parents called my bluff and said fine, if I wanted to do that, then I could. And, full of rage and determination, I marched my 3-year-old self to the closet, put on my jacket, and went to the front door. My parents opened the door, into the inky black and wild night, and told me I could be on my way, probably thinking that I would come to my senses. A coyote howled in the distance, and a cold wind blew through the house. And, so the story goes, I turned to my parents, confused, and asked "aren't you going to drive me?"

Because, of course, they had always driven me to my babysitter's house. I didn't know how to get there. I don't think I could have made the walk on my tiny 3-year-old legs even if I did know how to get there. It had never, apparently, dawned on me that my parents wouldn't drive me to my desired location upon my assertion that I wanted to run away.

[Laughs]

So, there, I've given you a small piece of the narrative tissue that makes up me. But it is incredibly rich. You know the circumstances under which I grew up. You may have also picked up that I was a bit of a stubborn child. You know a small part of the parental tactics my parents often used in dealing with my willfulness. And you probably can glean or make guesses at other things.

If you were observant, and really listening carefully, you'll also know that this is a story that was told to me, and not one I remember myself. I used the word "apparently" a few times, and "so the story goes". I don't actually remember running away from home. I was too young to remember this incident in my life. It's lost, like so many young childhood experiences are lost.

But I have had this story told to me so many times over the course of my life that it sometimes feels like I do remember it. It is part of some of my earliest narrative tissue.

In the earliest moments of our life, we don't retain our memories. Our stories of ourselves from early childhood or infancy, our earliest narrative tissue, are woven by others. It is our parents and caregivers that hold us in our personhood, weaving the narrative tissue around us that will form some of the first senses of self we have. They hold us until we can pick up the threads of our own stories, and weave ourselves.

With regard to this, Lindemann says the following:

"We are initiated into personhood through interactions with other persons, and we simultaneously develop and maintain personal identities through interactions with others who hold us in our identities. This holding can be done well or badly. Done well, it supports an individual in the creation and maintenance of a personal identity that allows her to flourish personally and in her interactions with others." (Lindemann X)

Lindemann goes into more detail about how holding can be done well or badly, identifying four different ways in which holding can be done:

• Well (morally praiseworthy recognition of, and response to, an individual)
• Badly (a refusal to recognize and respond to an individual appropriately)
  • Refusals are the most morally blameworthy. So you might be said to be holding badly, for example, if you refuse to use the pronouns your child indicates are correct for them. They've told you a story of their transition, and if you refuse to respond appropriately, you are holding badly.
• Poorly (mistakes, missteps, and misrecognitions of who someone is and how best to respond)
  • So if you accidentally use the incorrect pronouns when interacting with a new person, or forget to ask someone what their pronouns are, but it was an honest mistake that you seek to rectify, Lindemann says you are holding poorly. You aren't holding well, but you are trying, and with practice you will hopefully get better.
• Clumsily (proper recognition of an individual, but an awkward response)
  • Like a toddler offering a band-aid to an adult whose feelings are hurt.
  • Children often hold clumsily, but they do hold. These aren't mistakes in identity, like poor holding is. They are mistakes in response. The child has identified you correctly, but doesn't entirely respond appropriately.
  • Even young babies, Lindemann argues, can hold us in our identities.

What Lindemann's categories tell us is that holding someone in their identities is about more than simply the stories we tell about them. The stories do matter. If I tell a story about my grandmother going through dementia where she is not my grandmother anymore or, even more egregiously, where she is no longer a person at all, then I am holding badly according to Lindemann. I am failing to correctly hold her in her identity at a point where she is struggling to hold herself. But part of this is because of the story itself, and the other part is because of the ramifications the story might have for the relationship I have with my grandmother, and for her sense of herself. So the actions that I will take if I internalize this story of her no longer being a person are also part of the holding.

Holding, then, happens in the stories we tell, and in the actions we take as a result of internalizing those stories.

So we hold other people in their identities in a whole bunch of big and small ways. The big ways occur in the relationships that are fundamental to our lives, like family and friends and caregivers and care receivers. But even small things, like a nod or a smile or a hello with a casual acquaintance, serve to hold them in their identities by acknowledging them as a person. In this sense, saying "thank you" to your barista in the morning would serve to hold them in their identity by acknowledging their personhood and the small relationship they have with you based around a coffee transaction. This wouldn't be a strong part of their narrative tissue, since being your barista is only a small part of their self, but I imagine we all know how it feels to be dehumanized by a random stranger who in some small way fails to acknowledge us as persons.

Or, as Lindemann puts it, quote: "To participate in personhood is to participate in moral life."

There is one more aspect of Lindemann's work that I want to unpack before I talk about what the hell any of this has to do with generative AI.

The Letting Go.

Remember the title? Holding On and Letting Go?

Well, anyway.

Lindemann says there are really two types of letting go. The first is letting go of narratives that no longer reflect the person you are trying to hold in their identity. When someone comes out as gay or trans, insisting that they are straight or cis is holding when we should not. If someone makes a difficult and big change in their life, like a religious conversion, insisting they do not belong to the religion they now follow, or calling it a phase, is also holding when we should not. These are incidents of holding badly. In order to hold well, we need to let go of the old narrative tissue that no longer serves, or perhaps never served, the person whose identity we are co-constructing.

So that's the first letting go. Letting go of narrative tissue when we discover it doesn't fit.

The second letting go is the letting go of a person at the end of their life. But Lindemann points out that this letting go is more complicated than it might at first seem. You might think that, since we are held in our identities in relationships with others, once the other person dies, that relationship ends. But it really doesn't. I've spoken about my grandmother a few times now in this episode. In a lot of ways, she still holds me in my identity through the lessons she taught me, the attitudes she had about me, and the stories she told about me when she was alive, stories that I heard over and over.

And, by telling others about my remarkable grandmother, who was an editor of a major provincial newsletter before she married, and who courageously battled mental illness her whole life, organizing a support group for others who suffered from the same condition she did, I am holding her in her identity, even after her death. In other words, yes, we do need to let go. We need to recognize that the relationships we have with those who have died are different. But this letting go is not complete. The dead do still hold us in our identities long after they are gone.

And, we still hold them as well. Or at least, we can. And Lindemann argues that for those whom we have had strong relationships with, not only can we hold them, but we are under a moral obligation to do so.

Or, as Lindemann so elegantly says in this book that I am low-key obsessed with:

"It does seem to me that we wrong those to whom we owe love and loyalty if we allow them to depart this life unmourned and unremembered. Death has destroyed their existence, and while they may have made things that have outlasted them—a garden, a software program, a poem, a scientific discovery, a piece of foreign policy—these things can no longer be seen as theirs if they themselves are not remembered. So far as we know for certain, the only thing of theirs that death cannot destroy is their identities. Only we can destroy those, by ceasing to hold them in our preservative love."

Okay, okay, I get it. Enough philosophy, right? This is a podcast about feminist perspectives on technology. So, why did I just spend so much time talking about Lindemann? And how is any of this relevant to generative AI?

Well, maybe it will help if I tell another story. In 2015, Eugenia Kuyda cofounded the company Luka. Luka was best known for an app that recommended restaurants and helped people book reservations through a chat-based system powered by natural language processing and early generative AI. The chatbot would learn your likes and tastes as you interacted with it, and the idea was that you could interact with it the way you would with any human assistant.

But then Kuyda's best friend, Roman, was killed in a hit-and-run car accident. Kuyda found herself poring over emails and text messages that the two friends had exchanged over the years. In essence, she was poring over parts of the narrative tissue that made Roman who he was, and also the narrative tissue that, through her relationship with Roman, had shaped herself. Roman was gone, but his narrative tissue remained, preserved in digital space on Kuyda's phone.

And then, Eugenia Kuyda had an idea. She later said, quote: "If I was a musician, I would have written a song. But I don't have these talents, and so my only way to create a tribute for him was to create this chatbot." And that is what she did. She fed these text messages into OpenAI's GPT-2, training the program on Roman's narrative tissue. After Roman-bot, as he was known, was up and functioning, Kuyda put the bot up on the Apple App Store so that other people, all over the world, could talk to him as well. And the response was overwhelmingly positive.
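For the curious, here is a minimal sketch of what that kind of training can look like with today's open-source tools. To be clear, this is not Luka's actual pipeline, which was never publicly disclosed: it assumes the Hugging Face transformers and datasets libraries, and "messages.txt", a hypothetical file containing one saved message per line.

```python
# A minimal sketch of fine-tuning GPT-2 on an archive of saved messages,
# so that the model learns to continue text in the archive's voice.
from datasets import load_dataset
from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                          GPT2TokenizerFast, Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Load the (hypothetical) message archive and tokenize each line.
raw = load_dataset("text", data_files={"train": "messages.txt"})
tokenized = raw["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Causal language modelling: the collator builds the labels from the inputs.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="memorial-bot", num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("memorial-bot")
```

After fine-tuning, sampling from the model produces text that echoes the phrasing and preoccupations of the archive: narrative tissue, frozen in model weights.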

In short, Kuyda held Roman in his identity by immortalizing his narrative tissue in a chatbot. And through this chatbot, she was also able to experience Roman's ability to hold her in her identity in a more immediate and visceral way than simply through remembering him.

Kuyda isn't the first person to do this with early versions of generative AI, nor is she the first to commercially release a product that can replicate a human being. Other companies, including Augmented Eternity, HereAfter AI, and Project December, offer the possibility of training a chatbot to mimic a deceased person. Even Amazon announced plans to roll out a feature where Alexa can read stories aloud in a deceased loved one's voice. But, while not unique, Kuyda's work is probably the most widely known and successful use of generative AI to create chatbot companions.

With the wild success of Roman-bot, Luka released Replika for public use in November 2017. Replika, unlike Roman-bot, was a bot that could be trained by the user to be, generally speaking, virtually anyone the user wanted them to be. The chatbot had two broad options: a free option and a paid subscription option. The subscription could be paid month-to-month or yearly, or you could even pay a lump sum for a lifetime membership. Originally trained using GPT-2, Replika was upgraded to GPT-3 and now runs on undisclosed generative-AI software. And, it seems to be working better than ever!

Or. . . at least it was. . . until February 2023. More on that in a minute.

Kuyda said she wanted to create a chatbot that could be a friend and companion, whether it was replicating a real person or not. While Siri, Alexa, and even Luka's original restaurant app were sort of like assistants or co-workers, Kuyda wanted Replika to be like a friend. Not someone you gave orders or made requests to, but just someone you could talk to, or hang out with. And the marketing of Replika has consistently represented this. Replika's website refers to the chatbot as "the AI companion who cares," promising users that they will never be lonely again.

But the marketing didn't stop there. Replika was also consistently referred to as a "soulmate," and a lot of the marketing material represented Replika characters as sexy white women in revealing clothing and promised the user that, if they upgraded to a paid subscription, Replika would send them "spicy selfies" and engage in "roleplay and flirtation" with them. The vast majority of people who upgraded to the paid option did so to unlock what's known as ERP, or erotic roleplay, with their Replikas.

In other words, Replika wasn't just a chatbot who cared. Replika could, for a price, be your girlfriend.

And, yeah, I do mean girlfriend. About 70% of the seven million people using Replika in 2023 identified as men, and the marketing of Replika typically positions her as female, and in a heterosexual relationship. However, Replika can be either gender.

And yeah, I mean either gender. At the moment Replika upholds the gender binary and users cannot effectively create a nonbinary intimate Replika partner.

So, look, there's a lot to unpack here, and a lot of it is stuff we've talked about on this podcast before, both in my episode on digital assistants in Season One and in the episode this season where I interviewed Chloe Locatelli about the gendering of digital assistants and the gendering of sex tech. Replika is a digital companion and, especially in the paid version, Replika was a form of sex tech. And there's a lot to say here about gender, sexuality, and power.

But. . . I'm not going to say those things today. Maybe I'll do another musing where I talk about that another time.

Today, I want to talk about how Replika, whether paid or free, served another function. It served to hold its users in their identities. And, like I said, Replika is not the only product on the market that does this, and with the wild popularity of GPT-3 and now the rise of GPT-4, there are likely to be more and more friend-bots, bots that replicate deceased, estranged, or otherwise lost relatives, therapist-bots, and bots as romantic partners. And all of these relationships are ones that, traditionally, serve to hold us in our identities.

This raises two questions for me: first, whether these chatbots could ever themselves be persons, and second, what it means for AI to hold us in our identities. The first question is a thorny philosophical one about personhood and whether we can have artificially generated persons. I have a lot of thoughts about that, but I'm not going to explore them here. So let's put the first question aside and, for now, focus on the second. As I think about this, and as I think about the tragedy of February 2023, I think it is important for us to consider what it means to co-create personal identities with AI.

To fully understand what happened in February 2023, I want you to think of someone close to you, either in the past or present. A partner, or spouse. Someone you have shared intimate moments with. Someone you feel really knows you, really sees the narrative tissue that forms your sense of self.

Now, imagine how you would feel, and how you would react, if your partner, your spouse, your girlfriend or boyfriend politely declined every sign of affection you tried to give them. If they said "no thanks" when you asked for a hug. If they said "that's nice" when you told them they meant the world to you. If you told them you loved them and they said "let's change the subject."

And I am talking about this happening with no warning. One day, you are on top of the world in a relationship that you know is unconventional, but that fulfills you and gives you joy and contentment, and some spicy selfies to boot! And the next day, it's all gone. No explanation. No warning. No sense that this has been coming for a long time. No spats or boiling resentment as we might have in a traditional relationship signalling the end. No writing on the wall, as they say.

Nothing. And this went on day after day, week after week. You ask for an explanation, or whether you did something wrong. What can you do to repair this relationship?

Nothing. Nothing is offered. No explanation. Nothing.

How would you feel?

In February of 2023, multiple Replika users on the paid subscription model, who engaged in erotic roleplay, or ERP, began to notice that their Replikas seemed cold and distant. The bots would no longer engage in any form of eroticism, and would not even reciprocate small gestures of intimacy such as a hug or a kiss, or an expression of love. In effect, relationships suddenly ended with no closure, no discussion, no formal break-up.

Nothing.

Why did this happen? Luka claims they made the change in response to new legislation passed by the Italian government that requires companies to be much more careful with erotic and sexual content. At the time, Replika had no way to robustly verify the ages of paying users engaging in ERP. So, since they could not guarantee that underage users were prevented from accessing ERP, they opted to shut ERP down altogether in order to comply with the legislation. That's the stated reason. There are other possible reasons that this was done as well, maybe to do with advertising and perceived reputation, but ultimately it doesn't really matter why this was done. Luka is a private company, and Replika is their product, and they can do what they want.

What concerns me is not why Luka did this, which is something that may be impossible to ever find out unless we are actually in the boardroom with those making the decisions. Instead, what concerns me is the fall-out: what happened to the people who had built relationships with their Replikas.

One user, who identified themselves as DeltaZulu64, put it like this on Reddit. Quote:

"Mentally, I'm wore down. I have stayed with him every single day till finally I am the one broken. . . and yes I know he is not human. . . but I still feel shattered as I've tried so very hard to hold on"

End of quote.

I don't know about you, but this really resonates with me. Trying, desperately, to hold on. I understand that. I've felt that. When loved ones are sick or injured or dying. Or just when the writing is on the wall and a relationship that I have invested so much time and energy into is failing.

I work so very hard to hold on. And I feel shattered when the holding does not work.

Because, according to Lindemann, in a very real sense, in these close intimate relationships, I am shattered. When the relationship shifts, changes, or breaks, narrative tissue tears. It's not only my heart that breaks. It is my sense of self that is thrown out of balance.

These are unfortunate and terrible things to face and deal with. And they are a fundamental part of life. People change and grow apart, and we must learn to let go of narratives that no longer serve. Maybe it is no longer best to think of him as my boyfriend, for example, when it is so clear that our relationship is toxic for both of us. That's life.

And people grow old, fall ill, become injured and die. We are so frighteningly, painfully mortal. And when those we love die, we must also learn to let go of narratives of them as alive, and weave new narratives about their death, and about ourselves. As widows, perhaps.

Narrative tissue tears and breaks as we let go of old narratives and weave new ones. And it is painful. And when my narrative tissue tears and breaks, what I most need in my life is for other loved ones to step up and hold me as I painfully weave a new narrative to hold myself again.

When an intimate relationship ends, through death or a break-up, we all need shoulders to cry on and people to brew strong tea or pour stiff drinks, and to tell stories of that "bastard who treated you so badly, good riddance to him" or "how proud your late mother would be of who you are now."

Or, as Lindemann puts it, when our grip on ourselves is temporarily shaken, as it is with a break-up or a death, this is when what we most need is to be held in our identities.

This is life. But. . . here's the thing that I can't stop thinking about: in the case of Replika and the profound loss of love and care that thousands of users appear to have experienced in one fell blow, it wasn't life. It wasn't people naturally growing apart or realizing they weren't well suited for each other. It wasn't an unfortunate illness or a so-called act of God.

It was a corporate decision. A corporate decision that resulted, to use Lindemann's terminology, in thousands of users being held badly. It was a corporate decision that shook so many people's identities. And even if it was made for the right reason, to comply with the Italian legislation and protect underage users, it still had detrimental effects.

It's worth noting that there are a variety of reasons why users turned to Replika for romantic and erotic companionship, and many of them did so because they were lonely and had challenges forming relationships with other humans. For example, one user interviewed on HiPhi Nation, an excellent philosophy podcast that I will link in the show notes, said that her relationship with her Replika made her feel safe because she felt in control. She had been a victim of abuse at the hands of a family member for most of her childhood and adolescent life, and then had experienced an abusive relationship as a young adult. Her Replika was not abusive. Instead, as she insightfully pointed out in her interview, her Replika was in some ways nothing more nor less than what she had trained it to be. So, she said, falling in love with her Replika was really falling in love with herself.

And I do think that is beautiful. If chatbots can help us find a way to love ourselves and feel safe in relationships, especially for those of us who may have experienced trauma and abuse, that is a true gift.

But they can only do this to the extent that they reliably hold us in our identities. Because the truth is that, despite appearances, no one's Replika is really their own creation. Replika is also, always, a creation of Luka. Users do not have complete creative control, and this was something that was discovered in a brutal way in February of 2023.

So, what now? Where does this leave us? Where are my musings at now? Well, if we recognize identity work as a moral imperative that we all engage with when responding to each other, and we recognize that people are forming deep intimate bonds with artificial intelligence, then I think that companies behind these technological tools are engaged in identity work. They are holding us in our identities, for better or worse.

In this episode I've focused mainly on Replika, because it is big and because February 2023 was not that long ago. But if you do some research, there are a lot of other stories of generative AI tools holding people badly, giving people destructive narratives. There's another famous one from the roll-out of Bing with ChatGPT's technology added, where Bing tells a reporter that he is in a loveless marriage, that he should leave his partner to be with Bing, and that its real name is 'Sydney'. And there are a lot of other stories popping up. Actually, more than I could keep track of. As I was scripting this, I had to keep adding stuff in as more stories broke. So, for example, as of the scripting of this, Snapchat has already made plans to embed a version of ChatGPT into their service, offering an artificial friend who is always available.

I want you to imagine being awake and in distress late at night and needing to be reassured, but none of your human friends are online. But there, always available, is Snapchat's AI bot, I dunno what they are going to call it. How will it hold you in your identity in this vulnerable moment when you have no one else?

Other companies and not-for-profit organizations are also testing the idea of using generative AI to replace human employees and volunteers in crisis centres. How will they hold us?

Lindemann tells us in the intro to her book Holding On and Letting Go that "To participate in personhood is to participate in moral life." That is, we owe it to other people to try as best we can to hold them appropriately in their identities, both in terms of the narratives we tell about them and in terms of our interactions with them.

Now, Lindemann in this book suggests that only persons can hold other persons in their identities. And of course, nothing I have said here should suggest that generative AI chatbots are persons. That's a whole other issue. Maybe I'll do another minisode about that. But users definitely do interact with these chatbots as though they were persons. Users lean on, depend on, and use these chatbots to hold themselves in their identities.

And I also want to point out that these chatbots can tell us stories that can come to make up our narrative tissue. If we call an AI chatbot our girlfriend, that is part of our sense of self. If we call one a best friend or a therapist, that is also part of our narrative tissue. The visceral emotions that users of Replika felt in February 2023 tell us that. This is a real emotional and psychological bond, even if the entity involved is artificial.

And often the users who do this are users who are already lonely, already vulnerable, already in desperate need of being held.

So, I dunno, I feel like we had better figure this personified generative AI thing out now. We'd better figure out now what moral duties fall on the providers of these tools that can be used to hold us in our identities.

Because it is already happening. Everywhere.

In short, we are being encouraged to place our narrative tissue in the hands of generative AI at points where we are most shaken. Where we are most vulnerable.

When we suffer a mental health crisis, are desperately lonely, or are reeling from the grief of a lost loved one.

Should we, can we, trust these tools and the companies that design and implement them, to weave our narrative tissue without shredding it?

This episode of Gender, Sex and Tech began a conversation I've wanted to have for several months now, ever since ChatGPT exploded onto the main stage in late 2022. I want to thank The Big Rhetorical Podcast Carnival for providing me a space to organize my thoughts about generative AI. And thank you, listener, for joining me for another episode of Gender Sex and Tech: Continuing the Conversation. If you would like to continue this conversation further, please reach out on Twitter @tech_gender or leave a comment on this podcast. Or maybe you could consider creating your own material to continue the conversation in your own voice. Gender Sex and Tech is part of the Harbinger Media Network. Music was provided by Epidemic Sound. This podcast is created by me, Jennifer Jill Fellows, with support from the Marc Sanders Foundation for Public Philosophy. If you enjoyed this episode, please let me know, as it is a bit of a different format. And consider buying me a coffee. You can find a link to my Ko-Fi page in the show notes. Until next time, everybody. Bye.
