Real Talk Podcast: Shocking Information About Your Digital Privacy You Need To Know

October 26, 2022

In this episode of Real Talk, KJK Student Defense Attorneys Susan Stone and Kristina Supler are joined by Danielle Citron, author, privacy expert, and law professor at the University of Virginia School of Law. They discuss digital privacy and the internet. The conversation covers the little-known ways your data is being collected and sold, how your data can potentially be weaponized against you, the sad reality of how the law works against victims of digital privacy violations, and how to become a better digital citizen.


Show Notes:

  • (00:28) How the internet has made life a lot more convenient these days
  • (00:52) Why the internet is also a dangerous place for students 
  • (03:13) What is the concept of intimate privacy on the internet?
  • (03:50) Why your personal data is not actually "safe" and is being tracked and sold to marketers
  • (05:16) Why even the Department of Defense advises its enlistees against using 23andMe or similar services
  • (05:54) How your DNA is legally being sold and exploited by ancestry composition services, even outside of the United States
  • (08:30) Don’t take nude photos or sex videos if you don’t want to be vulnerable 
  • (09:15) Why you may be charged with child pornography even if you take your own nudes or send them consensually
  • (10:15) What consenting adults need to know before sharing their nudes with others
  • (11:03) The harsh reality of what happens when you report your vulnerable photos being misused to the authorities
  • (12:24) Why women and minorities are more vulnerable to being exploited online
  • (13:38) Can data on your period tracking apps be used against women since the criminalization of abortion in some states?
  • (15:56) How our phones can essentially be weaponized against us by law enforcement, thanks to advertisers, marketers, and data brokers
  • (17:03) How even your location and Google search history can create a domino effect of circumstantial evidence
  • (18:39) Will the purpose of your search history be considered should it be used against you in a criminal case?
  • (20:22) Hate speech online: Does the First Amendment favor violators outside the private sector?
  • (22:00) How intimate privacy violations are handled in the private sector: working with Attorney General Kamala Harris on building the cyber exploitation task force
  • (25:22) Dealing with intimate privacy violations: Why your photo may legally be allowed to stay online because of Section 230 of the Communications Decency Act
  • (26:55) How Section 230 has been drastically misconstrued, especially in social media violations
  • (28:07) Why subreddits are the new breeding ground of non-consensual intimate imagery
  • (28:32) How the law is further victimizing victims of digital privacy violations 
  • (30:06) Why it is crucial to change the law that protects the solicitors of intimate privacy violations instead of the victims
  • (32:10) How to be a better digital citizen: For you and for other people
  • (33:40) Why speaking up is necessary to put a stop to digital privacy violations

 

Transcript:

Susan Stone: So this is the second in a two-part series. Is it a series if there's only two?

Kristina Supler: I think we’ve just made it one. 

Susan Stone: Okay. On digital privacy and the internet. I think we can all agree that the internet brings a lot of ease to our lives. I know that today I ran out of toothpaste, went right on my Amazon, and clicked. Didn't have to run out.

There you go. But it can also be a scary place.

Kristina Supler: In our practice representing students in various contexts, we're dealing and wrestling with digital evidence every day. We handle cases involving sexting, cancel culture, and different iterations of that, and it's amazing to see.

I'm still amazed, Susan, I don't know if you feel the same way, by what our clients and their peers say and do and put on the internet.

Susan Stone: Well, it's not just that. It's that I still have a lot of trouble with the fact that the whole etiquette of our society has changed with the internet and with cell phones. I still think it's incredibly rude to look at your cell phone at the dinner table. And I will often say to my adult children and my high-school-age children, put it down.

Well, talk to me. I’m right here. 

Kristina Supler: Absolutely. I agree. Well, today we are thrilled to be joined by the esteemed Danielle Citron, who's a law professor at the University of Virginia School of Law, where she teaches and writes about privacy, free expression, and civil rights. For the past decade, she's worked as a civil rights advocate and has worked with lawmakers, law enforcement officers, and various tech companies to combat online abuse and to protect intimate privacy.

She's been directly involved in some reform efforts surrounding the regulation of various online platforms. Since 2011, Danielle has been a member of safety task forces for Facebook and Twitter, and she also serves as an advisor for dating apps like Bumble, streaming services, and TikTok, so we'll be interested to hear more about that.

She's written countless articles published across the globe, and her most recent book, which just came out, is titled The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age. Danielle, thanks for joining us.

Danielle Citron: Really excited. Thanks so much for having me.

Susan Stone: Awesome. Well, Danielle, you study how most of our private data is collected and stored.

Can you talk to our listeners about intimate privacy? What is it, and how does it impact our daily lives?

Danielle Citron: So the concept of intimate privacy is all the ways that others have access to and information about our bodies, our health, and our innermost thoughts, which we essentially convey all day long.

Our browsing, our reading, our searching, and of course our texting and emailing; information about and access to our sexual activities, our sexual orientation, our gender, and our closest relationships. And all day long, every day, we go about our lives sharing and providing access to our intimate information, kind of expecting, hoping, and of course deserving intimate privacy.

And I wish I could say that we have it, but unfortunately we often don't. When we go to a hotel room or a public bathroom, we sort of assume that no one is taping us there. When we take a nude photo of ourselves or share really intimate thoughts with a loved one via text, we assume that they are gonna keep that confidential. And when companies ferry and store that information, we assume, right, that they're gonna protect it from hacking.

And when we use apps and digital assistants, our health apps, our Fitbit, we share information about our health conditions, whether we've gotten our period, whether we have visited adult sites, the videos that we watch. All that information, we of course want, expect, and hope to keep private. What we don't realize is that all that information is being tracked and sold to advertisers and marketers, and then to data brokers.

Susan Stone: You know what's so interesting? I did the 23andMe, and, oh boy.

Danielle Citron: Oh boy.

Susan Stone: Well, there was nothing surprising. I am mostly an Ashkenazi Jew, which should be no surprise to anybody, and part Neanderthal. But what I was shocked with is the emails that have flowed in as a result.

Kristina Supler: You've just been inundated by junk, or, you name it.

Susan Stone: It’s bizarre. I mean, now I guess the government knows all my genetic information. Wow. 

Danielle Citron: Right. So I'm a little worried, right? The Department of Defense tells all of its enlistees and all of its officers that they shouldn't use 23andMe, because that information is gonna be shared outside the United States and potentially with governments that could use it to extort and blackmail their enlistees. So, like, if the DoD is telling them not to do it, why are you doing it?

Kristina Supler: Where did Susan’s information go? 

Susan Stone: Yeah, tell me. What are they gonna do with the fact that I'm an Ashkenazi Jew? I mean, I don't know.

Kristina Supler: Well, is it in...

Danielle Citron: Your DNA isn't just relevant to you.

It's relevant to everyone who shares some of that material. And so that makes your identity, and the identity of people in your family and those you care about, visible and detectable to others. And that includes, of course, marketers and advertisers, which I don't want happening either.

But still, it's happening. This information is not covered and protected by HIPAA. Because it's eligible to be sold and exploited, it's being sold to data brokers who are selling it to non-US governments.

Susan Stone: Danielle, my husband wouldn't participate because he said he didn't want anyone knowing about his DNA.

And I told him he was crazy. So, David, sounds like you were right.

Kristina Supler: You could say that's...

Danielle Citron: Pot-kettle, right? And it's not like I'm wearing a tin hat, right, friends. I'm not. No, I agree. I gotta say, I love my Spotify app. There are all these ways in which I love these tools too. So I'm not suggesting that we throw them in the sea, our phones, right, our apps.

But what I am saying is that there's so little protection. With 23andMe, you might think, gosh, that's health, of course they're protecting my DNA, and the answer is absolutely not. HIPAA does not apply, and the Genetic Information Nondiscrimination Act only applies to employers. So honestly, I wanna allow us some room to say, some things, don't do.

But also to call for structural reforms, cuz there's only so much I want you to have to get rid of. I want us to use these tools, but I want us to use them in ways that come with commitments of protection.

Kristina Supler: Mm-hmm. So I'd like to circle back. You had mentioned nude photos, and again, that's something, Susan, I mean, we see...

Susan Stone: ...all the time.

Kristina Supler: ...many, many cases involving nude photos. And you speak in your book about how nude photos, extortion, revenge porn, it's something that is on the rise in terms of abuse. When we talk about nude photos, I think sometimes society as a whole might be quick to judge and say, well, if you don't want people to see 'em, don't take 'em; otherwise, you incur the risk. What's your response to that type of thinking?

Susan Stone: And might I add, I know we give advice all the time to parents to say to their kids, this is a hard no. I know you just spoke about not wanting to put up too many fences or guardrails.

You want people to enjoy some of the benefits of the internet. But this is an area where I know we feel strongly, especially with minors: no nude photos, first of all.

Danielle Citron: Yeah. Yeah. 

Kristina Supler: What’s the response? 

Danielle Citron: Yeah, I've got two. The first, to the response that if you don't wanna be vulnerable, don't take the photos: it belies the reality that sex videos can be made of you without your involvement.

Kristina Supler: The deep fakes.

Danielle Citron: Right, right. Not only the non-consensual videotaping in your bedroom that you don't know about and haven't permitted, but fakery. So women's and girls' faces are swapped into porn. There are now like 60,000 deep fake, digitally manipulated videos online.

And guess what? 95% of them are deep fake sex videos, and over 98% of those are women's and girls' faces. It's terrifying. So to the idea that you shouldn't have done it, the answer is, well, you didn't. Okay, that's the first. The second is heated agreement about anyone under 18. Right, if you're under 18, it's understood as child exploitation material.

Yes, so even if you create it yourself, even if you consensually share it with someone your own age, you know, like with another 15-year-old with whom you're in a relationship, the answer is that for both of you, it's child sexual exploitation material that violates federal and state law, even though the whole point of these rules is predation, right? Child predators. But they're very formalistic, these rules. It's like, you make it, you share it. And even if you're in a consensual relationship, you're 16, you have a partner who's 16, don't do it. So I'm in total agreement. Whenever I give talks to folks who are under 18,

I say, don't do it. You're not allowed. State and federal law says it's child pornography. Too much risk. And I do also say to young people who are over 18, and I have some of them in my own house, two 20-year-olds in that age range, I say there's nothing wrong with sexual expression at all.

What making mix tapes and writing love letters was for my spouse and me has a different valence now; you do it differently in the 21st century. But I do say, make sure you trust the person. Of course it could still be fakery, but crucially, you gotta be sure you're sharing in a confidential relationship.

It doesn't guarantee anything. But I don't wanna be that person who says you can't do it when you're 25. So I do say, be careful with whom you share, because trust is everything.

Susan Stone: Well, and I'd like to add, a lot of people aren't aware that it's the one who takes the picture who owns the copyright of the picture.

So you might think, give it back, it's me, it's my photo. But the law says otherwise.

Danielle Citron: Right. And that we have to look to copyright to protect us is unfortunate, cuz it's not about property and, you know, creativity and making money off of the photo. This is about privacy.

And it's about, you know, my image doesn't belong to you and shouldn't be appropriated, even if you took it. I wish I could say the law was more responsive. When people non-consensually share nude images of you without your permission, assuming you're over 18, often you go report it to law enforcement and they say, sorry, close your computer. Boys will be boys.

It's your fault. They don't do anything. We see a lot of that. And then it's really hard to get lawyers who are willing to represent you low bono or pro bono, cause we attorneys gotta make a living somehow. And there's no deep pocket; you can't go to the platforms, right?

And when you wanna sue a perpetrator, they probably have very little money. So the response to the victim becomes, well, go sue your perpetrator, or go put them in jail. And the answer is, as a practical matter, you can do almost neither. So we need to rethink how we protect intimate privacy in the digital age.

Susan Stone: In your opinion, what groups of people are the most vulnerable when it comes to intimate privacy and the collecting, mining, and selling of data?

Danielle Citron: Okay, so first things first: it's not my opinion; this is what the evidence shows, right? We have studies showing, when it comes to the non-consensual sharing of intimate images, and this is across the globe, that women in their twenties are most vulnerable. Okay? So that's first things first. The second is that we also know, when it comes to the collection, use, and sharing of our data, so that's the everyday companies, right, collecting, using, and sharing our data, that it's gonna be more costly for women, non-whites, LGBTQ individuals, people from vulnerable communities, because it's their bodies, right, that are stigmatized.

So when a nude photo of a woman is posted online versus a man, the response to the man is like, go get 'em, you know, good for you, guy. And for the woman, it costs her her job. It makes it impossible to date. She sort of disappears. So we know that the exploitation of intimate information, the information about your bodies, your health, your sexual activities, your close relationships, is gonna be more costly to women and vulnerable people.

Kristina Supler: You mentioned that law enforcement agencies in the United States are some of the biggest consumers of intimate data. You talk about this in your book. How pertinent is that now, especially since the overturning of Roe versus Wade in June? What can you tell us about how data collection can be weaponized against women?

Danielle Citron: So, what do they say? We were all holding our breath, right, before the leak of the Dobbs decision. And now that we have the Dobbs decision, we know of course that there are over 14 states that have criminalized abortion, some from the start, and others within a certain band.

And the information that is collected by our period tracking apps, our search engines, and our location data, collected by apps and then shared with data brokers, all of it tells a story about where we've gone. Have we seen a health provider? Did we cross state lines and go visit a Planned Parenthood in a state where abortion is legal? Have we gone to CVS and purchased menstrual pads?

Right? Did we tell our period tracking app that we didn't get our period, and then we got it again? All of that is circumstantial evidence for a prosecutor that we terminated a pregnancy, or potentially so.

Susan Stone: This sounds so Big Brother, Orwellian. Are you trying to say that you think there's gonna be a tracker on young women, ages like 15, 16 to 30?

I mean, it just seems outrageous. I can't imagine an individual woman thinking about this while going about her business, regardless of whether she's gonna have an abortion, I'm just talking in general. Are you saying there's some sort of geo-tracker, or that the government is watching every young woman?

Danielle Citron: Yes. Right now. Look at your phone. You have your phone with you, right? You've got apps on your phone. If a young woman, girl, woman, brings a phone with her to a clinic, her phone tells the story of where she's been. There are 40 data brokers whose focus is location data; they track everywhere you go. And those data brokers right now, and I'm not kidding when I say right now, have contracts with law enforcement at the state, local, and federal level.

Those location data brokers right now are selling that information.

Kristina Supler: That is just wild.

Danielle Citron: So I'm not suggesting, and this will sound very tin-hatty, I'm not suggesting that law enforcement has placed a chip on you. But your phone... and we love our phones of course, and the Supreme Court has recognized in Riley that our phone is like an extension of our souls, right?

It knows more about us than our diaries in our homes did. This is Chief Justice Roberts speaking about a Fourth Amendment decision with regard to our cell phones and needing a warrant to get into our cell phone. Well, our phone is leaking data about us all day long to advertisers, marketers, and in turn to data brokers.

Kristina Supler: Sorry to interrupt, I'm just... this is fascinating.

Susan Stone: I’m like ‘Mic Drop!’ 

Kristina Supler: Well, we do criminal defense work. And so without getting too deep into the Fourth Amendment and probable cause and warrants and all that, I'm just curious, because I did not know this. I've learned some really, really valuable information.

Once law enforcement purchases this data, like, what do they do with it? Just put it in a database that they cruise through?

Susan Stone: Or do they send it to a prosecutor to take it to a grand jury?

Danielle Citron: Yeah, and they can use it. I mean, what I think I'm most worried about is the use of the purchased data to tell a story in a search warrant.

A warrant that you then go and get, you know, with probable cause found and the warrant issued by a judge, and then you use that search warrant to go get the person's communications, right? Their text messages and emails. And we did see that in the Nebraska case, right? There was evidence that was used as the basis of a search warrant, and then they got texts between a mom and a daughter,

their Facebook messages to each other, in which they were talking about getting, sort of, abortion medicine. So I do worry that information about our location can be used as the circumstantial evidence and basis for a search warrant that then is used to get the communications that we think, gosh, are the most protected, right, our electronic communications,

not only in real time, but then subsequently in storage, which you at least need, you know, a warrant for. What makes it easy to get that warrant is all the circumstantial evidence that's being sold to data brokers and in turn to law enforcement.

Kristina Supler: I mean, you wanna tell the story of your day? For me, it's, look at my Google search history. What did I do all day?

Susan Stone: Well, you know, normally I would say, Kristina, there's nothing juicy on there. But the fact is, we represent students involved in sexual assault cases, and sometimes we Google things for professional reasons. How do people know that when we Google consent laws in different states, it's for our work and not personal?

Danielle Citron: Beautifully said. That's so well said. All of this is so taken out of context. Our searches, you know, we think they tell a story of exactly what we're thinking. But as you noted so well, you're thinking about a case you're working on. Let's say you're representing someone accused of a crime related to terrorism.

In your practice, why not? Of course. You're searching bomb-making instructions, you know, because it's part of the work that you're doing for the client, right? But we're gonna attribute it to you. So it's a really wonderful example to show how people often say we have nothing to hide.

As my colleague Dan Solove has written a whole book about, Nothing to Hide, it's nonsense. A, we all have something to hide, and B, it's all taken outta context. So what you're searching tells the story of your clients. It tells the story of your own life, and privacy is ours. We shouldn't have to have anything to do with hiding,

or having it framed that way. Right.

Susan Stone: Danielle, actually, I was gonna ask you the question and you just answered it: why should people care? So I'm gonna go to the next thought. Going off of what we've been talking about: big social media companies like Twitter and Facebook, and getting people banned off of those platforms.

Let's just talk Andrew Tate, Kanye West. I know when I heard what Kanye said, it was really upsetting to me. And I know that there are mental health issues and battles that he's having, but still, the impact on the listener, not to sound like a teen, I was triggered. It was really difficult.

How do you balance First Amendment rights and free speech with saying to these platforms, you gotta cut it out, you gotta do a better job monitoring speech and cutting it off at the pass,

Kristina Supler: or just protecting intimate privacy?

Danielle Citron: Yeah, so what’s really, I think, gratifying in my work with companies is that they’re not First Amendment actors.

They're private companies. They curate their communities. Their community guidelines sort of express their values and priorities. And of course, we know they're data surveillance hubs, right? How do they make their money? Advertising. But at the same time, they're hosting communities, and because they're not the government, they can prohibit and ban hate speech.

Right, defined as speech that demeans, that dehumanizes, that incites violence against members of a group because of their membership in that group, and that is subordinating and dehumanizing. And so the, I think, gratifying part of my job is that because I'm not advising the government, I'm not constrained by the First Amendment.

I can say to companies, you know what, hate speech creates an environment in which there's permission to discriminate against, attack, abuse, torment, physically attack individuals, right? Hate speech, we know, leads to murder. And so, you know, you asked about the First Amendment and its role in toggling through and dealing with all types of speech.

And the first example was hate speech and Kanye's remarks about, you know, Jewish individuals. And then the question about intimate privacy. I've been lucky to work with companies that wanna tackle intimate privacy violations, in part because, when she was California's attorney general, Kamala Harris enlisted me to advise her on privacy for two years, and then to work together on what she called the cyber exploitation task force.

And we brought together 50 companies in a basement room in the AG's office in San Francisco. This is February 2015. Before then, Google and Bing's view was, we don't touch speech on our search engines. And many of these companies were like, sorry, we're not gonna ban non-consensual pornography.

But after we broke into working groups, public pressure came to bear. And essentially, in June of 2015, Google and Bing announced that they're gonna de-index non-consensual intimate images from searches of people's names, and that's so much of what victims wanted. And companies like Twitter, YouTube, Facebook, you name it, Reddit, jumped on the bandwagon and said, yeah, we're banning it as well.

So intimate privacy violations, you can tackle them even under the First Amendment. At the Cyber Civil Rights Initiative, we worked with state lawmakers across the country, and there are now 48 states, D.C., and two territories with laws that criminalize the practice, unfortunately as misdemeanors, but they're laws on the books.

And in the five states where these laws have gotten to the state's highest court, all five laws were upheld. The courts ran them through the crucible of strict scrutiny and said, these are constitutional laws, right? They're narrow; they get at a compelling interest in protecting from harm individuals whose nude photos have been posted without consent.

It's the least restrictive means available. So we can regulate intimate privacy violations even under the First Amendment. I hope I answered that. They were both good questions, and I wanna make sure I answered both of them.

Kristina Supler: You did a great job. And I'd like to follow up even more.

In our practice, it is not uncommon for us to meet with students and parents whose lives have been just decimated because various content has made its way onto the internet.

Susan Stone: I mean, cancel culture is kids throwing up the words, this kid is a racist, this kid is a rapist. And immediately, when other students read it, they believe it.

They don't consider, well, what's the source? Who's saying it? If you read it, ergo, it must be true.

Kristina Supler: One of the most difficult conversations we have with these students and parents, people whose lives have been turned upside down: they say, make it stop, make it stop. Someone must be held responsible. This can't go on and on.

And unfortunately, we have to explain that there are laws and protections and immunity for these platforms, and it's really difficult for these families. In your book, you write about Section 230 of the Communications Decency Act. Can you tell us a little bit about the immunity some of these platforms have for content? Because I know this issue of content on the internet, we wrestle with it every day.

And in particular, I'd like to add, can you frame it in a way that parents can get a nugget of what they can do if their child's being canceled online?

Danielle Citron: Okay, so first things first, just to emphasize the point about what happens when information is posted online. I interviewed 60 people for my book from around the world, and what resonated, story for story, right, about the posting of nude photos without permission, was that it was like an incurable disease: no matter when you Googled yourself, there probably would be more nude photos posted of you,

and it was impossible to get that content taken down. And you might say, okay, how is that possible? And this goes to our question about Section 230, a federal law passed by Congress in 1996, which at the time was designed to encourage, they called them interactive computer services, but you know, online providers, to clean up the internet.

Right. So the deal that these two congressmen struck, then-Congressman Ron Wyden and Congressman Chris Cox, was that they said, listen, we're gonna provide a legal shield, an immunity from being sued. We're not gonna treat you as if you've been publishing or speaking content that somebody else posts. We're gonna let you leave up or take down information.

And they framed it as thinking about companies as good Samaritans who'd be filtering and blocking offensive content.

Susan Stone: Danielle, that is not how I view Section 230 today.

Danielle Citron: Of course, yeah. Let me explain. That's how Chris Cox and Ron Wyden framed the statute.

Susan Stone: Whoa. Second mic drop.

Danielle Citron: And how has it been interpreted? There are two provisions, and probably in your world, you're focusing on the leave-up provision. It's been interpreted really broadly to mean that if you leave up information that's illegal, you're free from liability.

Even if you've encouraged it, even if you've solicited it, even if you know for sure, and you keep it up despite the fact that people have given you proof that it's untrue, that it's not what you want, that it's non-consensual intimate imagery, no matter: these sites enjoy immunity from responsibility. So that means that when you go to TikTok and there is a video created by someone that reposts, let's say, non-consensual intimate imagery, or that repeats lies about someone that ruin their reputation, well,

the company will accept complaints about it, but they don't have to take it down, and you can't sue them to take it down, because of Section 230. Now, TikTok has very comprehensive community guidelines, and I'm working on those guidelines. But let's say we're not talking about a TikTok. We're talking about 4chan.

We're talking about a subreddit.

Susan Stone: Which is getting more popular, the subreddit.

Danielle Citron: That's right. The subreddits are on fire with non-consensual intimate imagery and lots of abuse, and the company just ignores complaints. I've reported some myself, and I literally have nothing to do with the people in the photos.

It's so clear from the photos: these subreddits are totally devoted to non-consensual intimate imagery. And they don't care. They just say, sorry, you haven't violated the community guidelines, and therefore we're gonna keep it up.

So, you know, you ask how it is that content that destroys people's lives can remain online and individuals have no recourse. And the answer is that the parties in the best position to minimize the damage, to make it stop, not to prevent the harm that's already happened, but to stop the harm from continuing, those parties have been understood very broadly to be immune from responsibility.

And so a platform can get request letter after request letter, plea after plea, and ignore those pleas to take down content. Even though that content is destroying the life of a minor, even though that content is invading intimate privacy in cruel and horrific ways, they can just ignore it. And there are sites whose whole purpose is abuse, some 9,500 sites.

That is, they focus on intimate image violations; they're called things like hidden cam, hidden camera, they're not that sophisticated, Mr. Deepfakes. Those sites, even though they've solicited and encouraged users to post intimate images that are not consensually posted, even though they have received complaints from victims saying,

please take it down, this is destroying my life, they can ignore it and enjoy immunity from responsibility. So I hope that helps illustrate just how broad this immunity is, right? Even sites whose business model is illegality, intimate privacy violations, get off scot-free.

Kristina Supler: That’s wild. What can we do short of lobbying to change the law?

Danielle Citron: We gotta change the law. Okay? Join me in the fight, right? I work with folks on the Hill, both Democrats and Republicans, Senate and House, on proposals. And my pitch, where I've been somewhat successful but not as successful as I had hoped, is to exclude from the immunity provision

bad Samaritans: sites that encourage, solicit, or keep up intimate privacy violations. They should not enjoy the immunity. And otherwise, for the everyday companies that are trying, but at scale it's hard, they should have duties of care to address intimate privacy violations and other content that amounts to cyberstalking.

You know, and in Congress...

Susan Stone: I want to ask a question, because a lot of these kids, when they call someone out for what they perceive as a bad act, they don't see themselves as bad Samaritans. In fact, they think it's their duty, if they hear something, to let the world know. And so it seems like there's a shift in culture as to what information should be spread.

I mean, I know that I was raised with the concept of, if you don't have anything nice to say, don't say it. And if you don't know for sure, really don't say it. But that's not the culture today. Don't you think we need a cultural shift toward fact-checking, toward being more skeptical?

Kristina Supler: I mean, I think it's hard at the root, because on the one hand, we've had to work for so many years...

Danielle Citron: That's right.

Kristina Supler: ...to encourage students and individuals of all ages to speak up and speak out about injustices. But yet now we've had this big shift, and you have to ask, has the pendulum swung too far in terms of people speaking up and speaking out about what they perceive to be injustice?

Susan Stone: That's really nice context. How did we get here? And we got here because everyone was so silenced. Good point.

Danielle Citron: Yep. No, that's right. And I think the first thing, and I imagine you're doing this in your work and in your practice all the time, is talking to parents about teaching their kids how not only to protect themselves, but crucially to protect other people. And to think about privacy for me as privacy for thee. That is, we're all in this together, and we've gotta think about how to be better digital citizens and, as you said, really think before.

Kristina Supler: Oh, I like that. How to be a better digital citizen. 

Susan Stone: I like that. Love that. We're gonna steal that line, cause we might have to, of course. Did you copyright that one? If not...

Danielle Citron: I have an article called Intermediaries and Hate Speech Fostering Digital Citizenship for Information Age. And it came out in the LAR review in, in 2011.

And it was about how we teach our kids and how intermediaries mean platforms can be a part of the conversation about hate speech and, and what that means. Check that out. That’s a great article. Citizen. Yeah, so it’s a BU law review. It was like July, 2011. No, feel free, we all should talk about digital citizenship, however you wanna conceive of it but, I’ve conceived of it is how we think about our own ourselves and our duties to other people.

And how we wanna make sure everyone can get the most out of online life, you know, life that's networked. There's no other place like cyberspace; it's in us, all of us, all the time. And we have to think about ways to make it a place where we can thrive. And sometimes that means being really careful about what we share.

And sometimes it does mean speaking out, because for far too long, and this is the lesson of the Me Too movement, there has been silence around sexual assault, and sadly, the ones who get hit and burned are the victims. You saw that in the Johnny Depp, Amber Heard defamation trial: to this day, misogyny is alive and well, living and breathing, and internalized by its victims.

And also, of course, if you're falsely accused, it absolutely is earth-shattering. So I think what is great, cuz you're in touch with parents and students, is to teach them about their responsibilities as digital citizens. Not their entitlements, but their responsibilities.

Susan Stone: We have to end on that note, even though I wanna talk to you about more things.

That is so poignant and so helpful, and I can't thank you enough. And I feel like the three of us have more collaboration in our future. I see some synergies in what we do, so thank you.

Danielle Citron: Oh gosh. Thank you.

Kristina Supler: Danielle, thank you so much for joining us today. And for our listeners, check out her recent book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age.