Gender, Sex and Tech: Continuing the Conversation
Episode 15: Interview with Dr. Heather Barnick
Transcripts by Ganesh Pillai
Jennifer Jill Fellows: Today’s episode comes with a content warning. We will be discussing sexual assault, sexual violence, so-called revenge pornography, and exploitation, and there will be mention of suicide.
Jill: Let’s say I want to spice things up in my relationship, so I take a nude photo of myself and text it to my partner, with his consent. Let’s say, as often happens, that we subsequently break up. What could I do if my now ex-partner were to share the photo with family and friends in Canada against my wishes? Or to post the photo to a pornography website as amateur porn? Or, to complicate matters more, what if they shared the photo with family overseas? Could I stop them? I mean, the answer should be yes, right? It should be yes. . . But as my guest will explain today, things are not so straightforward.
Jill: Hello and welcome to Gender Sex and Tech, Continuing the Conversation. I’m your host, Jennifer Jill Fellows, and I am joined today by Dr. Heather Barnick. Heather Barnick is a PhD candidate at York University, and a lecturer in the Department of Sociology and Anthropology at the University of Prince Edward Island. Her research has examined cultural identity in China’s gaming industry, and privacy, surveillance, and control of young people in the digital world. Her work has previously appeared in the book “Youth in a Digital Age: Paradox, Promise, Predicament,” published in 2019. Her research interests include empathy as a tool for digital theorizing and methodology, massively multiplayer online role-playing games, and the game design industry in China. She also has research interests in tech ethics, and the relationship between power and digital technology. And today, she’s here with me to discuss revenge pornography, sexual violence, and the Canadian criminal justice system. Hi Heather, welcome to the show.
Heather Barnick: Hi, thanks for having me.
Jill: Thank you so much for making the time to talk to us. I want to take a moment before we begin to resist the idea that somehow digital space floats free of physical space. What we’ll be discussing today will demonstrate quite clearly that digital space has consequences in physical space. But I think it’s also important to remember that digital space itself only exists because of physical space. The materials our computers, cables, and servers are made of are extracted from physical space, built in physical space, and occupy physical space once put into operation. And much of this physical space is stolen land. So today, I acknowledge that I am recording “Gender, Sex, and Tech: Continuing the Conversation” on the unceded territory of the Coast Salish people of Qiqéyt nation. And where are you joining us from, Heather?
Heather: I’m joining you from Charlottetown, Prince Edward Island. And I should say that is the traditional and unceded territory of the Mi’kmaq First Nations.
Jill: So before we jump into our topic today, Heather can you tell us a little bit about your academic journey? Like, how did you come to be a sociologist? Did you always want to be one?
Heather: Well, in high school I think I didn’t even know what sociology was, and I had a vague idea of becoming a forensic anthropologist. So it’s quite a journey from there to where I am now, which is sort of at an interdisciplinary intersection between sociology, anthropology, and science and technology studies. So I started my undergrad in anthropology and I did research on oyster fishers here in Prince Edward Island, and issues around gender. When I started thinking more about youth and technology, that was at the end of my master’s research on international exchange programs and university internationalization. And I was staying with a group of students, some of whom were from China, Indonesia, places like this, and they were playing online games. So they had guilds and I didn’t know, I couldn’t understand what they were talking about. It was really going over my head, but I was so fascinated by how engaged they were, and how much of a social phenomenon this was. And so that, that led me to China. And this research is more in preparation for a class I was teaching on gender in Canadian society. So that’s how I started thinking about youth and safety in online environments, and issues around privacy, sexual violence online, misogyny online, and things like that. So a circuitous route, not a straightforward line to this topic, but. . .
Jill: I think the meandering route is often the more interesting one.
Heather: Yes. I think a lot of academics find themselves taking those kinds of routes.
Jill: Yeah, because I think a lot of the things that you can major in in university aren’t necessarily things that you know you can major in when you’re in like junior high or high school.
Heather: Yeah you discover it when you’re halfway through a degree or something like that.
Jill: I also find it, yeah, your discussion of guilds was kind of a throwback to me because when I started grad school, I sort of got sucked into World of Warcraft, and sort of belonged to a few guilds for a while.
Heather: That’s where I started too. That was my initiation. And that was a big game in China when I went. So that was a great way to sort of do the participant observation methodology that anthropologists use.
Jill: Right, right. Yeah, that’s really cool.
Heather: I had to play games.
Jill: Are you, what was your character in World of Warcraft? Can I ask you that?
Heather: Sure. I was an undead warlock.
Jill: Oh no, we were on different factions.
Heather: Oh, you were Alliance. How dare you? I couldn’t play that character, at least not the same way, in China because the undead faction was deemed culturally offensive by the Chinese government, especially the depiction of bones. And this was quickly pointed out to be a contradiction because there were plenty of Chinese games that did feature characters that were skeletal and had bones. But this was a way of, I think, controlling the encroachment of western businesses in China because Blizzard, of course, is based in America. So there’s lots of complexity, and that’s really a little bit of geopolitics going on in the world of gaming.
Jill: Okay. Your later research turned toward youth in Canada, and privacy and surveillance and safety. So can you talk a little bit about what got you interested in safety and privacy for young people online, and particularly in a Canadian context?
Heather: Yeah, so when I was preparing content for the course in Gender in Canadian Society, I came across the cases of Amanda Todd and Rehtaeh Parsons, which were just two horribly tragic cases. And I remember watching Amanda Todd’s cue card story on YouTube. And it was just, so, it was chilling, but also really compelling and moving. And so that really got me following their cases through, because after both girls took their own lives, their parents were really involved in getting legislation moving around anti-cyberbullying, getting police and the government to take this seriously as a crime. So your introduction, what you said, was so important about the fact that the Internet is physical.
Heather: So not just the infrastructure, the server farms that are required to make it run, but just because there’s no immediate proximate contact doesn’t mean there’s no physical harm. Or just because we talk about the harm as psychological trauma doesn’t mean there’s no bodily component to that trauma. Emotions need a body. And suicide can be the result. It was really their cases, along with one other cyberbullying case, that got the legislation moving, and that I then started following through to revenge pornography, which can involve teenagers and children, but is now also involving adults, people of all ages, really. So that’s where it started, was really with their cases. And they had a global impact, too. In the States, their cases were referenced as well.
Jill: So growing out of this class, and this research, you track, and continue to track I think, a number of forms of online sexual violence that exist. And you looked at the different, and often inadequate, ways that the Canadian criminal justice system has attempted to address these forms of violence. So could you tell the listeners a bit about what some of these forms of violence might look like in digital space for people who don’t know, for example, what revenge pornography is, or what these other forms of violence can look like?
Heather: Sure. So what I’ve been tracking are a series of crimes that the law, the Criminal Code, refers to as “non-consensual distribution of intimate images”, which is a really long term. So that is often shortened into the acronym NCDII, which is still not that easy to say. So as we go forward in our conversation, I’ll probably just say things like image-based sexual abuse, or non-consensual pornography. But when I say that I’m referring to NCDII. It’s a very awkward term. And so that’s an umbrella term that could include all kinds of things. So maybe I’ll start with an example. Something I was looking at this morning was a case in Winnipeg where a woman had broken up with her ex-husband, and I think she had custody of her children. And he had planted hidden cameras in her house, in the bedroom, in the bathroom, and other rooms. He had captured her having sex with a different partner, and images of her nude in the bathroom, and things like this. She reported this to police. She had noticed odd things in the wall by the outlets and things like that. And so she called police, and they discovered these cameras. She then learned that he had sent these images of her sexual activity, and of her in the nude, to his family in Pakistan.
Heather: And so then the family was sending them to her family…
Heather: in order to threaten her against bringing this up with the police, or pressing charges against her ex-husband for doing this. So he sent them to his family and they weaponized them from Pakistan, because then he wouldn’t be guilty of using threats with intimate images.
Jill: Oh, okay.
Heather: Once it’s crossing international borders, it’s much harder for the police to get involved, to stop the activity, to do anything about the threats. So they were threatening to show them to her employers, other friends and family, post them on pornography sites to humiliate her. And all of this to prevent her from going to the police or pressing charges against her ex. So that involves multiple forms of NCDII. Voyeurism, so that’s hidden cameras in bathrooms or change rooms. Sextortion, where sexual videos or intimate photos are used for blackmail purposes, to get money, or to make requests for more sexual images. So, I’ll put these on your Facebook unless you give me a video of you naked, or whatever.
Jill: Or in this case, to compel a different kind of behavior?
Heather: Yeah, to prevent police action. And then revenge pornography, at least in criminal case law in Canada, has so far been the most common form of NCDII. Moira Aikenhead, a law professor at the University of Victoria, looked at all the English-language cases in her recent study. And she said 80 percent of them are revenge pornography. And 100 percent of the victims were women. And usually the perpetrators are men.
Jill: Right. And usually if we’re talking about revenge pornography, are we often talking about somebody who was a former partner, or hook up, or something like that?
Heather: Yes, exactly. So this is called revenge pornography because it’s seen as getting revenge for, say, if an ex-partner cheated, or current partner cheated, so alleged infidelity. And sometimes it’s just because they broke off the relationship.
Jill: Right. So it’s getting revenge for, the person committing the revenge porn either believes the other person cheated on them, or believes they dumped them, and this is a way of trying to get even, I guess.
Heather: Yes. And so obviously, feminists object to the term revenge pornography. Because there’s a presumption in there that the victim has done something wrong, when often the alleged wrong is in the mind of the perpetrator, and it’s based on patriarchal or misogynistic norms.
Jill: You can only seek revenge if you’ve been harmed, right? Revenge implies that there has been a harm.
Heather: Often it’s not justified.
Jill: The perception of harm is not obvious.
Heather: So to give you a really horrible example, this was a civil case. And I hate how they use Jane Doe. They don’t want to identify the victim, but they use the term Jane Doe. So Jane Doe versus N.M. He had a long history of assault and battery against his female partner. And so she called the police for this, for the physical violence. In response, he posted videos, sexually explicit videos of her, on a pornography website, and he sent her a text indicating he had done so and he said, “Well, I have a criminal charge for life. Now you’re an internet whore for life.”
Heather: And then he said this is a fair trade. So this is the reason feminists are concerned about the use of this term revenge pornography. This is how twisted justifications can be for the revenge, and it’s often disproportionate to the harm it does to victims, who are harassed for years after the original incident by complete strangers. The images are usually doxxed, so full name, phone number, and address are included. So they get threatening phone calls. Some women have been stalked in real life. Sometimes part of the revenge is a sex solicitation ad, if it’s posted on Pornhub or something like this. So women will have strange men coming to their house looking for sex.
Jill: Because of this ad posted by someone else?
Jill: The ex-partner, or whoever?
Heather: Yeah. And for years these images are circulated and shared. In one case, it was viewed 60,000 times. In another case, it was downloaded 10,000 times. These are Canadian cases I looked at. There are communities of men online whose hobby it is to threaten and harass women who have been victims of revenge pornography. So it’s almost like a culture has formed around this online, perpetuating the harm. So if they take the images down, this group of men will make it their mission to keep putting the images back up, to keep the humiliation going. So the courts, and this is my big issue with the law right now, are still treating it as an interpersonal crime.
Jill: Oh, okay.
Heather: And the more it’s shared, the more people view it, that will increase the punishment or the judge’s understanding of the severity of the case, the extent to which privacy has been violated. But it fails to understand that the Internet opens up harm well beyond the initial incident. And it allows groups of men to participate in, and perpetuate, the harm. And it is most often men, so far, as far as we know.
Jill: So what we’re seeing here then is that the courts are still treating this as though it were like between two people kind of, that there’s this issue of the perpetrator of the NCDII. Did I get that right?
Jill: So there’s a perpetrator of the NCDII and then there’s the victim. And they’re not necessarily thinking about how, once these images, or this material, is up on the internet circulating, it’s not really just between these two people anymore, that a ton of other people are getting involved. And it may even be people that don’t know either of the initial people involved, right?
Jill: So in the one case, you gave, family members were involved. But there are other cases that you have talked about where it’s just kind of random strangers on the Internet picking up this kind of “vigilante,” I’m using air quotes, “vigilante,” mission to keep these images up.
Heather: Yeah, from all over the world. Who knows what country? Yes. So the law as it stands, our rights and our laws, are based on the idea of personal autonomy. The individual has dignity and worth. But the Internet doesn’t work like that. So the legal system…
Jill: Trying to play catch up, maybe?
Heather: Very outdated. And just the speed at which the Internet moves, literally at the speed of light, fiber optics. But even if you get injunctive relief from a court saying the person who committed the crime, or posted the content, is responsible for taking it down, they might only take down the initial post. They might stop posting, but that doesn’t stop all the other strangers on the Internet from sharing it, copying it, screenshotting, re-posting. So that’s often left to the victim, and that can be a lifelong struggle. Some have to pay professionals to deal with this. And some porn sites have started to make money off of this. So porn sites will post ads for scrubbers. Something on here that you don’t like? Well, you can pay the service
Jill: Oh my goodness
Heather: to keep taking images down if they keep getting re-posted. So it’s not only making them responsible, but making them pay, when really these companies should be held to account for this kind of activity.
Jill: It’s making this criminal act a source of revenue.
Heather: Exactly. Yeah.
Jill: Oh my gosh. Okay. So I feel like you’ve already kind of addressed the next question I’m going to ask. But just to really drive the point home, perhaps, why does it make sense to call NCDII-type acts violent, or a type of violence? Or why might it be important to label this as violent?
Heather: Maybe I’ll just quote directly from Amanda Todd’s cue card story because I think she says it so well. She said, on one of her cue cards, “I’ll never get that image back. It’s out there forever. And sometimes I wonder what’s left of me now.”
Heather: And she talks about cutting herself. So she was physically cutting. Just the way she expressed how loss of control of this image was so destructive to her physically and emotionally, to the point where she ended up taking her own life. I mean just those words “what’s left of me now?” So when we give an image, this almost seems, this might be my anthropological training, but I’m thinking about it as a kind of exchange system because these kinds of exchanges are becoming so normalized now among age groups as young as middle school, 13, 14, they’re starting to exchange nude images, semi-nude images.
Heather: And so I think we have to start asking about the social and cultural dimensions of this, as well. What is happening in this exchange system? It’s obviously more than symbolic.
Jill: And what are the norms in this system?
Jill: As an older millennial woman, I know nothing here.
Heather: Because in an exchange, the fundamental principle is supposed to be reciprocity. So there’s a ritual obligation to give, and a ritual obligation to receive something that’s given. But in these exchanges of intimate images, we still live in a world where women are punished for their sexuality, and men are rewarded for it, especially when it’s a question of public display. And so this seems to me to be a way men are weaponizing already existing patriarchal norms, using digital technologies to further harm women.
Jill: We already live in a society in which men are studs and women are sluts, to kind of paint the starkest extremes, and that this exists in the digital realm as well.
Heather: And even if we’re talking about just an exchange of photos on phones, women are not going to gain clout by showing naked pictures of men, at least I don’t think so. This might be changing; I don’t know what Gen Z is up to these days, or how they’re thinking about this.
Jill: Yeah, the way I hear people talking about dick pics, it does not sound like something that’s gaining you a lot of prestige when you receive one of these.
Heather: But then if you look at, say, a website like Anon IB, which is still operating, this is where men will request pictures of women by name. And it might be something like, do you have any naked pics of girls from the St. F.X. basketball team? And then some random guy will drop photos of girls from that team. And so it’s like they’re collecting them, almost in the way teenagers used to trade hockey cards or baseball cards when I went to school, except now these are women who they go to school with, or work with. And this is how they’re treating them, and sometimes it’s not even them as a whole person. So one quote that disturbed me so much was, and I’m sorry to your listeners, but he said, “tits or holes?”
Heather: So they’re asking for images, but, well, what’s your preference? We don’t care about the face.
Jill: We don’t want the whole woman, we just want parts.
Heather: Yeah. Yeah. And to collect them and itemize them. Yeah.
Jill: So we’ve talked about this in terms of, by-and-large statistically, men being the perpetrators and distributors of these images, and women being the victims. Can we bring in the theory of intersectionality, and talk about how that might help us understand the various risks that victims, most of whom as you’ve said are women, face with regards to these online forms of violence?
Heather: Yeah, I think, going back to my ideas about this as an exchange, it’s pretty clear that men and women, and especially when we start adding in other intersectional dimensions like age and race, are not positioned equally at the site of exchange. So women’s images have more value to men than vice versa. So that’s one thing. But we see the same kinds of patterns that you’d see with other forms of sexual violence. So women who are racialized are targeted, and their race or ethnicity can be used to further the harm. So the photo might be tagged with slurs, racial slurs. That happens quite frequently. It’s more threatening if, say, revenge pornography is targeted against a woman who comes from a culture or religion where modesty is very important. In one case, just the threat was enough for a university student in the UK to take her own life. She was from Kazakhstan and her family’s Muslim. So these kinds of intersections matter, the same sorts of vulnerabilities that we see in other kinds of sexual violence. Indigenous women, for instance: there’s a huge issue in Canada with sex trafficking of Indigenous women. And I think there’s now a lawsuit against Pornhub for allowing images of women who were sex trafficked, and underage as well, to go up on their website. So an international law firm from the States filed a suit against them. Visa and MasterCard blocked charges because of this kind of thing happening on Pornhub. So Pornhub has been targeted, but it’s not as if they’re the only one. But you can see that the same kinds of intersections we talk about with other forms of sexual violence are repeating themselves with NCDII-related crimes.
Jill: And this is something that’s come out in a few podcast episodes. There is sometimes this perception that the digital world is going to somehow float free of our social and cultural norms, and patriarchal oppressions, and racism, and sexism, and ableism, and all of this stuff. And, and no. I mean, what we see is that digital space reproduces all the same kinds of stigmas and prejudices that existed before the Internet, and before NCDII was possible.
Heather: Absolutely. Every year, Pornhub publishes what they call “the year in review” and they give all the data they’ve collected on users. So the top searches, and they’ll break it down by country, and other kinds of demographics like age, gender, sexuality. And you can just tell by that list of most frequently searched terms how intersectionality plays into an ecosystem of desire, which we see in the categories that show up on pornography sites. So I think in 2021 “Hentai” was the top searched term, Japanese was second. Then I think lesbian is third. But you can see in the list of the top 20, just about every ethnic minority is featured there, Latina, Ebony. So this is part of the violence built into desire, and it shows up in data like that, top searched terms. Now, because revenge porn has been targeted, and companies are being coerced to get rid of it, it’s showing up under other headings, like cheating, or leaked sex videos, or amateur videos. Girlfriend is another one where you’ll see revenge pornography show up in different categories. But just like any internet community, people know the code. So they know how to find this content, even if it’s not superficially obvious.
Jill: Even if they have to use a different term, because now they could potentially be in legal trouble if they put something up as “revenge porn.”
Heather: Yeah, or there might be algorithms now, based on how something’s titled, to not allow it to be uploaded in the first place. But there’s always a way to bypass those kinds of mechanisms. You just insert a number, or spell it incorrectly, or something.
Jill: So it sounds like we’re also still seeing sexual racism. This idea of exotification, the reducing of people to stereotypes, and also some homophobia sounds like it’s there as well.
Heather: Oh yeah, Absolutely.
Jill: Okay. So discouraging but not surprising.
Heather: Right. That’s a good way of putting it.
Jill: So we kind of now know what NCDII is, the different forms that it might take, how it is commodified online by pornography industries, and how it is kept going online by this team of, for lack of a better word, vigilante men who keep posting things even after they’re taken down. Let’s flip to the criminal justice side of things.
Jill: So in your research, you’ve listed a number of ways that the justice system has tried to deal with NCDII. So could we go through some of these ways? For example, you talked about Bill C-13. So what is that, and how can it be used?
Heather: So that was the Protecting Canadians from Online Crime Act. And so this was the proposal that was debated in parliament, partly in response to the tragic deaths of Amanda Todd and Rehtaeh Parsons. So it included anti-cyberbullying legislation, because we have to have something explicit and clear so that police can pursue these crimes, they can be prosecuted, they’re recognized as crimes. So a big part of the problem was that police were just saying, “well, we can see how this is annoying or terrible, but it’s not a crime, and there’s nothing we can do about it. It’s not sexual assault because there’s no physical contact, no physical body was harmed.” So again, this assumption that what happens online is virtual, and therefore…
Jill: Somehow unreal?
Heather: Not really real, yes. And so this is why cyberbullying, online harassment, and these kinds of image-based sexual abuse went unchecked for so long. So that act came into effect. It was assented into law in 2014.
Heather: Which to me just seems crazy because we’ve had the Internet for a long time.
Jill: Yeah. It’s like maybe 14 years too late.
Heather: Exactly. What I was interested in, with that act, was the way it amended the Criminal Code, Section 162.1 of the Criminal Code, to include this new offense of non-consensual distribution of intimate images. So it made that a crime. It also increased police powers to seize digital devices like computers or cell phones, if they were implicated in these kinds of crimes, sometimes without a warrant. So this could also mean they could obtain emails and access to what was on a computer, computer files and clouds, to investigate and prosecute these crimes. So I know Carol Todd, Amanda Todd’s mother, was concerned about some of these increased police powers
Heather: Because she felt like this might be an overreach, going too far, violating privacy rights. But nonetheless, I think a lot of those increased powers remain in the law. So then we get the definition of intimate images as well, which is part of the Criminal Code. So intimate images could be anything that includes sexually explicit content. It could be fully nude, semi-nude, or a photograph or a video showing sexual activity. And I think the challenge is that you have to be identifiable in the photo in a lot of cases, especially if you want the image taken down. So I’m not sure, under the law, if you could still go through with the case if you’re not identifiable, as long as it can be established that that is you in the photo, that you created the image. But often big tech like Google, Facebook, Pornhub, if you’re not identifiable, they often won’t process your takedown request.
Jill: Wow. So that seems to even further incentivize the treating of women mostly as body parts, because if it’s a body part, then maybe it doesn’t have to come down.
Heather: Yeah. Going back to the violence, the dehumanization, objectification of it all.
Jill: It’s like doubling down on it.
Heather: Exactly. Yeah.
Jill: So some progress has been made though. We have this bill in place now.
Heather: Yes. So we have this law. And so that is if you want to pursue criminal action. So for NCDII crimes, you have to establish that you had a reasonable expectation of privacy when you created the image, but also in its distribution. So you did not give consent to its distribution. So that gets muddled too, if consent was given initially and someone changes their mind. Often those cases aren’t successful. So I need to think through consent more, and how it parallels consent as it’s understood in sexual assault cases. But it seems like a problematic area of this new law, and so is the reasonable expectation of privacy. Because if you were drunk and maybe flashing at a party and someone took pictures, if this was a public place, or say you’re at Mardi Gras where it’s kind of a tradition, and someone took a picture and then used it, said, “I’m going to send this to your boss unless…,” in those cases, I don’t think you could meet the threshold for reasonable expectation of privacy.
Jill: Because you were in public.
Heather: In a public place, yeah.
Jill: But of course, a public physical place is very different than your image is online forever.
Heather: Exactly. So this is another area where feminist legal theorists have criticized this law, this idea of reasonable expectation of privacy. And just the way judges are understanding violation of privacy when they start thinking about the severity, and how stringent the punishment should be, or how much damages to award if it’s a civil case. So if it’s shared more widely on the Internet among strangers, this is understood as a more egregious violation of privacy. But in some cases, the impact on the victim could be worse if it’s people they know.
Jill: Right, if it’s your boss or your family.
Heather: Exactly. In some ways, strangers, it’s sort of like, Well, who cares? But yeah, when it’s people in your social circle, when you’re losing employment opportunities, losing friends or family over this, which happens in a lot of cases.
Jill: So we’ve talked about the criminal side. And there are some problems here around consent, privacy, and whether or not a widely shared image is more harmful than an image that is targeted at people you know, who may have power over you, or whom you may have a complicated relationship with, or what have you. So we know that there are some problems or some issues with the criminal law, but that’s not the only recourse legally that victims have. So can you talk a little bit about civil law?
Heather: Sure. So I think what’s important to understand, and I didn’t know this before I started looking into sexual assault and these kinds of laws, is that criminal law is not there to compensate victims. That’s not the purpose of criminal law. And I think for a lot of victims of sexual assault, this needs to be made more clear, by lawyers for example. So if you refuse to testify in a sexual assault case, a warrant could be put out for your arrest. Criminal cases are for deterrence, or denunciation; that’s the purpose. But as far as compensating victims, civil law might be the better way to go. So most provinces now have torts that are very similar to NCDII in the Criminal Code. But civil action allows you to sue the perpetrator for damages. So that’s a financial award. So the big advantage of taking that route is you would at least get some financial compensation. If you end up having to pay a scrubber to remove your images, you might need that, it might come to that. But that might be also what justice means to you as the victim. Sometimes damages are awarded for the purposes of denunciation in civil cases too. So there might be additional damage awards, understood not so much as compensating the victim for harm or loss, but as a demonstration to society of how severe this crime is. It deserves extra punishment, more damages. So some of these damage award amounts have been in the $100,000 range. And there’s nothing stopping you; let’s say your case didn’t work out in the criminal system, that doesn’t mean you can’t still pursue civil action.
Heather: The burden of proof is lower, so you might have more luck in a civil suit than in a criminal action. You probably won’t be able to get jail time, so if that’s something you’re looking for, then criminal law, a criminal trial, is the way to go. But so far in the civil cases, what I’m seeing from judges is that they are understanding this as sexual violence, as assault. In one case, a judge used past cases of sexual assault and battery as a reference to determine damage awards.
Jill: Past physical cases to determine damage awards for a digital case?
Heather: Yes. And the summary of reasons even indicated that, although this isn’t a physical crime, it’s akin to sexual assault.
Jill: That’s awesome.
Heather: And especially because the violence can be repeated again and again by strangers, this is analogous to multiple assaults.
Heather: I’m paraphrasing the judge’s words.
Jill: That sounds quite positive.
Heather: Yes. It’s promising to see that that kind of understanding is happening on the bench, I guess, because there have been some pretty horrific statements from judges in sexual assault cases, like “why couldn’t you just keep your knees together.” So that’s the civil route. In both the criminal and civil routes, you can get a court injunction to stop the activity and get the images taken down, and have them deleted from phones, computers, and online sites. So that injunctive relief is also an important reason to go through the court system.
Jill: Though that may not remove the images entirely?
Jill: So is there a difference between civil law and things like copyright acts or takedown notices that you might also try to use as a victim?
Heather: The copyright route would be separate. And this also gets complicated, because if you’re a victim of voyeurism, where you did not create the image on your phone, you can’t use copyright, which just seems insane to me.
Jill: But this is an image of you.
Heather: And this is how copyright law works. So if someone, say, painted an image of you, the painter has the copyright even though you are the one depicted. And so I guess that same logic is applying to these cases.
Jill: So you have to take the image yourself?
Jill: So if we’re talking about situations like you were talking about earlier, with hidden cameras, you wouldn’t be able to pursue copyright law there. But if it’s a case where you took an image and shared it with a current boyfriend, and then they broke up with you and started using that image, since you took it, you own the copyright on it. Is that right?
Heather: And you have to establish that some kind of creativity was involved in order to claim copyright. There are all sorts of criteria to establish that it’s copyrightable content, I guess. But the threshold is pretty low, so as long as there’s some indication of camera angle, lighting, or posing, that would meet the threshold. So it’s not usually difficult to establish that you have copyright. But these takedown requests have to go through big tech platforms, who all have various policies. You have to fill out a form on their website, which they then have to review and process. So even just the lag, in some ways I’m thinking about bureaucratic lag as a form of sexual violence in these crimes.
Jill: Kind of re-victimizing the victim.
Heather: Because every time there’s red tape, there’s some kind of stall. Five business days to process this, or it’s five years after the initial incident before you have your day in court.
Jill: And how far can the image travel in five days, let alone five years.
Heather: Right, in 24 hours I mean-
Jill: So it sounds like you can, under some circumstances, though, use copyright as a way of trying to pull the images off the web.
Heather: Yes, but that doesn’t always work. In one case I was reading about in the States involving Google, the lawyer went through the trouble of registering copyrights. So sometimes they require registration of the copyright, some sort of official legal document indicating that you do have copyright, even though it’s clearly you in the image, and it’s obvious that you created it. A platform like Google might require this extra step of registration.
Jill: Oh my goodness.
Heather: And even then, this lawyer said, Google just responded, “We don’t believe you actually have the copyright.”
Heather: So it seems like there’s such a variation; it just depends on who is processing your request. The response is so inconsistent across these platforms. Sometimes it’s taken down immediately and there’s no trouble, and other times there’s just reluctance, or resistance to do so. Indifference. A lot of women are complaining about not being taken seriously, or that it’s taking too long to process. They have to repeatedly make these requests in some cases, and then in other cases it’s fine and it’s gone in six or seven hours. This is another issue with the Internet: because platforms can’t be held accountable, legally, for third-party content, they have policies saying these are clear violations of Terms of Service and that sort of thing, and they have takedown request procedures in place, but that doesn’t mean there’s any way to legally compel them to act if they just don’t feel like it, or they’re taking their time to process a request.
Jill: And also, I would imagine, again, this is labor that falls to the victim, right? Like you have to prove that you have copyright. You have to go through this procedure. You have to, I guess, go through it more than once if you get the wrong person reviewing your claim the first time around or whatever.
Heather: You have to collect the links of your images.
Jill: You have to view them, then.
Heather: The trauma, right? You have to look at them, and copy and paste the links. And you can’t just put in the request and be done; you have to keep the images up to have the request processed.
Jill: So you can’t try and block them or take them down yourself. You have to document it all and leave everything as it is, and be a witness to this.
Heather: Yeah, or screenshot it. And even for the police investigation and things like that, there are certain requirements for evidence, which make things really hard for victims, a lot of whom just want it taken off the Internet as fast as possible.
Jill: Don’t want to spend hours documenting it all.
Heather: But if you want to go through the court system, you have to leave them up for a certain period of time.
Jill: Okay. So we have talked about what these crimes look like, what people’s recourse might be, and the challenges that exist in various routes of recourse. But there’s also something you alluded to quite early in our conversation, when you talked about the case you looked at where the ex-husband, I think it was, sent the material to his family in another country. So we began this discussion by noting that digital space is physical space in the land acknowledgment. And I think that also pretty clearly applies here: when the crime happens in different physical spaces, this can make things challenging. So how does legal jurisdiction complicate things? Like, if the server hosting these images is in another country, or the people that the images have been shared with are under a different jurisdiction, how does that complicate the matter?
Heather: There are about 1,000 ways.
Jill: Oh my goodness, let’s hear a few.
Heather: Okay. So with copyright in the United States, there’s the Digital Millennium Copyright Act, and Canada has different copyright legislation. So if you’re using the Canadian court system but, like you said, the company or the server is located in the United States, there might be a mismatch between Canadian legislation and what’s happening under the Digital Millennium Copyright Act in the States. And so you might run into some trouble there. A big issue in the United States right now, around jurisdiction, is that, unlike Canada, they don’t have a federal law in place. This means that even if a crime crosses state boundaries, there might be different requirements or different thresholds. I think most states have legislation like Canadian provinces do, but it’s not the same. In some states you have to demonstrate that there was malicious intent to harm the victim. And sometimes men’s understanding of this is pretty twisted. They think it’s funny or a prank, or it was just for fun, just for kicks. I didn’t mean to harm them or shame them; I just thought I’d share it with my friends, or post it on a porn site.
Jill: It was a joke.
Heather: Right, and so if you can’t prove malicious intent in some States, you won’t get your case heard.
Jill: Even if you are in Canada?
Heather: Right, or even if your state has laws saying that that’s not required. I think it would depend on what state they’re in, and that’s where I’m not sure; you’d have to ask a legal expert. Which jurisdiction comes into play? Is it where the victim is? When it’s the Internet, we can’t really say where the crime took place. I know in Amanda Todd’s case they have now, just in 2020, extradited the Dutch man who was found responsible for harassing her and posting her image. But her case was, how long ago was that now? I think it’s been about six or seven years since that happened.
Jill: It’s been a number of years since the case. Yeah.
Heather: He had to stand trial there first. And it’s only because Canada has an extradition agreement that he’s going to be able to come to Canada and stand trial. But a nation could just refuse, even if there is an extradition agreement, and they might refuse because the laws are stricter in the country where the trial is going to take place. So that’s a complication in the case in Pakistan. The RCMP could investigate in Pakistan and try to stop the family from making these threats or posting these images again, but the request would have to come from Winnipeg police to the RCMP. And then the RCMP has to work with Interpol, which I think has 195 members, coordinating police departments from those 195 different countries. But you’d probably have to work with multiple police jurisdictions in Pakistan as well. So like I said, every time a different jurisdictional boundary is crossed, it’s more red tape, more bureaucracy, more delay, but the harm can continue in the meantime.
Jill: And the Internet is incredibly good at crossing jurisdictional boundaries. Really good at it.
Heather: That’s sort of how it was set up.
Heather: But bodies still continue to matter, like you said. There was this idea that the significance of race and gender and ability would suddenly disappear with the Internet, and it didn’t. And this concerns me with Web 3.0 on the horizon, because now we’re talking about augmented reality and Meta, where it’s a much more embodied kind of Internet.
Jill: Yeah a very immersive experience.
Heather: Yeah, and integrated into physical space in certain ways. You have a physical kind of avatar representation of you, which I think heightens or intensifies the trauma. There’s a potential for sexual violence in ways I think we haven’t even thought about yet. So if we don’t think more about this now, again, we’re going to be chasing after the technology when Web 3.0 comes around. And that will complicate jurisdiction too.
Jill: So let’s say, despite all of this, despite the fact that when it comes to criminal law there are issues around consent and other issues in understanding when there’s been an NCDII violation, and in terms of civil law there are other complications, and with the Copyright Act you have to prove that you have ownership and that there’s some kind of creative element, and then we bring in jurisdictions and now we’re potentially working with multiple layers of police and different national laws in different nations, let’s say that even with all of that, we win the case. Let’s say we win a legal case regarding one of these NCDII crimes, and we get our takedown notice. Let’s not worry too much right now about what happens to the perpetrator; we’ll just put a pin in that. But one thing that happens is that these images have to come down. How then is that accomplished? How are the images removed?
Heather: So if you put in a request on a website, say Instagram or Facebook, I think they just take the links that you provide as part of that takedown request, and then they remove the images that way, from the back end. Whoever’s the moderator or administrator of the site would have the ability to do that. Facebook was toying with hashing technology for a while. They had, and this is where we need more women in Silicon Valley, what they called the NCDII pilot project. And what they were doing was requesting people to send their naked images in advance…
Jill: Oh no.
Heather: if they thought that someone might do this to them, as a way to prevent it from happening. They would use these images to train their algorithms, so that, I guess, the algorithms would recognize things like skin tone, or identifiers like moles, so that even if it was just a body part, they would still be able to recognize it and prevent it from being uploaded in the first place. So I think this was an attempt. The understanding was that the Internet moves fast, and these things can be shared quickly, so let’s take a pre-emptive measure to potentially help victims. But it was just so misguided. Send us your naked pictures so we can train our algorithms. And one of the reasons they abandoned the project was because, globally, the definition of an intimate image was so varied that they couldn’t train the algorithm to work effectively. But the other issue is that all it takes is one rogue employee at Facebook.
Heather: And then of course they’d have identifying information too, so they could potentially threaten or harass or whatever. It was just…
Jill: The listeners can’t see my cringe face right now.
Jill: I’m just thinking about being asked to send intimate images of myself to a company that began as a website for rating hot women. Like, Facebook, you know, the initial idea was, we’re going to make this website so that we can rate the women at our university in terms of hotness. And now it’s, we’re going to look after you, send us your images. Ummm, maybe no. No. Maybe I won’t give you control over that.
Heather: Exactly. Especially considering the age and other demographics that are, sort of, typical of Facebook employees, or Silicon Valley employees generally.
Heather: In their 20s, 18 to 20. My thinking is, why not put the power in the hands of users? We know that we have this hashing technology that can, sort of, follow the trace of an image online, potentially identify it in multiple places on the Internet, and remove it. So why not build that kind of safety feature into personal devices? I figure this technology has to be possible, because if I can get a notification when some random person likes a comment I make on TikTok or Facebook, shouldn’t I be able to get a notification if I sent someone a photo from my phone and they’re attempting to upload it somewhere? So a kind of permission: so-and-so is uploading this photo. Do you give permission?
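The image-tracing Heather describes generally rests on perceptual hashing: an image is reduced to a short fingerprint that survives small changes like recompression or brightening, so re-uploads can be matched by comparing fingerprints rather than exact files. Here is a toy sketch of one simple scheme, difference hashing (dHash); it is purely illustrative, not any platform’s actual system, and the small pixel grids stand in for real downscaled images:

```python
def dhash(pixels):
    """Compute a difference hash from a grayscale pixel grid.

    `pixels` is a list of rows of brightness values (0-255), already
    scaled down to a small fixed size (here 9 columns x 8 rows).
    Each bit records whether a pixel is brighter than its right
    neighbour, so the hash captures the image's gradient structure.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits


def hamming(a, b):
    """Count differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))


# Two nearly identical 9x8 "images": the second is uniformly brightened,
# as might happen when an image is re-uploaded or re-encoded.
img = [[(r * 29 + c * 31) % 256 for c in range(9)] for r in range(8)]
img_brighter = [[min(255, v + 3) for v in row] for row in img]
unrelated = [[(r * 7 + c * 113) % 256 for c in range(9)] for r in range(8)]

# The brightened copy hashes identically; the unrelated image does not.
print(hamming(dhash(img), dhash(img_brighter)))  # small (0 here)
print(hamming(dhash(img), dhash(unrelated)))     # larger
```

A device-level permission feature like the one Heather imagines could, in principle, keep hashes of photos the user has sent and flag an upload whose hash is within a small Hamming distance of one of them, though real systems use far more robust hashing than this sketch.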
Jill: And that wouldn’t necessarily stop the voyeurism stuff. But it would stop when you’ve taken a picture and shared it with a partner.
Heather: Yeah. And since revenge pornography, at least in Canada, seems to be the most common type of this crime, I just feel like this is a good solution. And from a tech perspective, wouldn’t this be a potential for innovation, since that’s one of their favorite buzzwords? Why not think about cyber safety in terms of innovation potential, and building safer devices that better facilitate gender equality online? We need to stop this kind of response, like, well, just stop sending nude pictures, or stay off the net. That’s what Amanda Todd was told. It’s parallel to saying “stay indoors if you don’t want to face the risk of sexual violence or misogyny.” Because online life is part of participation; it’s essential for the labor market, for social life, for running a business, for everything. So that’s not a solution. It’s another blame-the-victim kind of response that is not helpful, or fair, or just, if we want to take it to that level. So, Europe has the right to be forgotten, and this is another really interesting possibility. Could Canadians have the right to be forgotten, so that if something’s showing up online, it could be de-indexed from all search engines?
Jill: Yeah. Because in Europe, yeah, you don’t have to prove anybody’s harmed you. You just say, I want all this stuff down. I just don’t want it up there anymore. It all pertains to me and I don’t want it there anymore.
Heather: Yeah. As long as it’s not criminal activity, shouldn’t I be able to control my image, my reputation? And when it’s sexual images, we’re talking about sexual autonomy, dignity. So maybe this is another possible solution.
Jill: So I’ve really enjoyed this talk today. I think I’ve learned a lot about the ways in which the Canadian government has made strides to recognize that this is a genuine crime, and not just weird digital, not-real stuff.
Heather: Not really real.
Jill: Yeah. So that’s good and that’s positive. And we can still see that there’s quite a lot of work to do, in terms of really being able to control our images online, that we don’t really have yet. Is there anything else, Heather, that you would like to leave our listeners with regarding non-consensual images, and revenge pornography, and the Canadian justice system today?
Heather: I think that, well, we need to find ways of making big tech accountable to the law. I think it’s Section 230 of the Communications Decency Act that means they’re not liable for third-party content.
Heather: This can’t stand. If we’re looking for equity and gender equality online, that’s incompatible with not being able to hold big tech accountable to the law. So that’s a big thing that I think feminist legal theory needs to interrogate, as well as this: what are the solutions that operate at the same speed as the Internet? There’s a temporal dimension that we can’t address through the legal system.
Jill: And I also don’t want to address it through Facebook collecting my images.
Heather: Through scary pilot projects.
Jill: There has to be a better way.
Heather: Yes. So I think I’ll just leave it there: thinking about that temporal dimension, and holding big tech companies like Pornhub accountable.
Jill: This episode of Gender, Sex and Tech continued a conversation begun through email discussions with Dr. Heather Barnick. I would like to thank Heather for joining me today, for this really important and sometimes difficult conversation. And thank you, listener, for joining me for another episode of Gender Sex and Tech, Continuing the Conversation. If you would like to continue the conversation further, please reach out on Twitter @tech_gender, or consider creating your own essay, podcast, video or other media to continue the conversation in your own voice. Music for this episode provided by Epidemic Sound. This podcast is created by me, Jennifer Jill Fellows, with support from Douglas College in New Westminster BC and support from the Marc Sanders Foundation for Public Philosophy. Until next time, everyone. Bye!