Legal Literacy Training

Yves, Lorelei Lee, Kendra Albert, and Korica Simon will present on the First Amendment, Section 230, and the Patriot Act; the ways in which fear creates a push for state surveillance; and the impact that this has on our community.

DANIELLE BLUNT: Hi, everyone. I am Danielle Blunt. I use she/her pronouns. I’m with Hacking//Hustling, a collective of sex workers and allies working at the intersection of technology and social justice to interrupt state surveillance and violence facilitated by technology. Do we want to go around and do introductions first?

KENDRA ALBERT: I can go. Hi, everybody. My name is Kendra Albert. I’m an attorney, and obliged to say that none of this is legal advice. I work at the Harvard Cyberlaw Clinic as a clinical instructor there, and also do some work with Hacking//Hustling, and my pronouns are they/them. I’m super excited to be here with y’all.

YVES: Hi, I’m Yves. I use she and they pronouns. I’m an organizer with Survived & Punished New York, which is an abolition group. And with Red Canary Song, which is around supporting migrant sex workers.

KORICA SIMON: Hi. I’m Korica Simon. My pronouns are she/her. I’m a third‑year law student at Cornell. I got involved with sex worker legal rights when I was in Seattle, where I worked for a nonprofit called Legal Voice. And I also got involved through a clinic at Cornell called the Gender Justice Clinic. So I’m excited to be here today and talk to you all more about this subject.

LORELEI LEE: Hi, everyone. My name’s Lorelei. I’m an activist and I work with Red Canary Song, as well as Hacking//Hustling, and have worked with these folks on issues of surveillance and tech‑related violence that’s impacting people in the sex trades for the last few years. So I’m really excited to be in conversation as well!

DANIELLE BLUNT: Awesome. And Yves, do you want to kick it off with a little community report back? That’d be a great place to start.

How does surveillance impact sex workers?

YVES: Hi. So, yeah! I’m just going to talk a little bit about how we got here, and then a little bit about how it’s affected mainly the sex working community, but also people in general.

So what we’re mainly talking about, or what I’ll mainly be talking about, is around contact tracing and public health, like, uses for surveillance. And then also SESTA/FOSTA and the EARN IT bills.

So these have all increased policing ‑‑ and I don’t just mean the police department, but also citizen surveillance and the deputizing of not just, like, people, but also nonpolice agencies and companies. Right? This includes a lot of different people, and this really increases the scope of policing and criminalization in a way that it wasn’t previously. And the way that this has happened is, like, structural violence that has led to so much harm to already marginalized and stigmatized communities.

So kind of the reason why this is happening the way that it is is because a lot of the conversation around surveillance belies a powerful moralism that resists evidence and logic. We see this happening post‑9/11, where you get this idea of, if you see something, say something. Which sort of creates these hypervigilant crusaders for antiterrorism but also anti‑trafficking, which is an issue that is often tied up with sex work.

So when we see this happening ‑‑ the surveillance has increased in such a way, but we’ve seen this happen before. Right? This predates all of these sorts of things. They’ve used contact tracing before to criminalize different marginalized communities and sex working communities for a really long time. We see this with HIV/AIDS as an example, right? They’ve also criminalized the spread of HIV, period, right? But when you go into a clinic, you’re gonna be asked if you test positive, or if you think that you have any STI, right, but especially HIV/AIDS, they’re going to say, who have you had sex with? Like, name those people. In what timeframe did you have sex with those people? We want to contact those people; how do we get in contact with them? Right? Whether that’s from someone directly, or from you yourself when you go in to get tested. They’re getting that information, and the truth is that information doesn’t just stay at the health clinic, doesn’t just stay where you think it stays. Right? That information gets passed on to the police, to these different agencies like CPS, right? And that leads to the criminalization of a lot of people, not just sex working people ‑‑ also people who are profiled as sex workers. They use that information to be like, oh, you’re selling sex. So then the cops are going to come knocking on your door. And this similarly happens with COVID, right?

How is contact tracing used to surveil marginalized communities?

So what we’re seeing with COVID is that they’re using contact tracing in a very similar way. And also, if you’ve seen the public rhetoric around this, right, the way public health officials and government officials are talking about it, they say that sex workers are high‑risk people, right, to have COVID. So if you are working in a massage parlor or on the street and you come in to get tested for COVID, they’re going to be like, oh, how did you get this? Who were you in contact with? Or if someone you know has got COVID and goes to get tested, they ask, who were you in contact with, and if you tell them, oh, I went to this massage parlor or met them on the street, that’s a path to criminalization. The truth is they are not just out to treat you, right? They are going to turn over that information to the police. This happens to spread the scope of policing in so many different ways, not just in the sex working community. Right? In a lot of different communities, they use these exact same surveillance techniques. We should think of surveillance as a strategy that exists within the larger frame of policing.
Also, contact tracing ‑‑ like, these different methods of policing ‑‑ is used to marginalize other communities, who are not sex workers but are targeted as if they were anyway. We’ve seen this targeting protesters recently. We see them surveilling a lot of communities in this way. How did you get this? Oh, you were at a protest last week? Who was there? This all happens in the same scope, and through all of these different agencies that have been deputized. Right.

What happens when you have these situations ‑‑ with SESTA and FOSTA and EARN IT, we wouldn’t directly think of them as surveillance. Oh, they’re censoring and taking down these sites; that’s not directly surveillance. But that’s not exactly true. A lot of these bills are also, like, collecting information, right? Because they’re putting these sorts of laws in place to go against sex trafficking. So they tell you, these are the indicators of sex trafficking; look for these indicators. But really, those indicators aren’t indicators of sex trafficking most of the time, and are also spread out to sex workers and other people and people who are profiled as sex workers. So it’s also a lot of data collection that is happening in order for them to censor, shadowban, and do all of these things anyway. And that data is also being turned over to the police to be used.

But also, we generally see the kind of attack that happens to sex work coming in from all ends. Where like, so SESTA and FOSTA, we saw this. Right? This is just what happened already. People are pushed to work in more dangerous ways because you can’t go on Backpage, you can’t go on Craigslist ‑‑

DANIELLE BLUNT: Can I stop you for one sec? That is amazing. Thank you so much for that. I just want to take a moment to ask for a definition of what contact tracing is and, from anyone who wants to jump in, a one‑sentence summary of FOSTA‑SESTA and EARN IT, just so folks are on the same page from the start.

What is contact tracing?

YVES: Well, I can expand a little bit on contact tracing, right? So when an epidemic occurs ‑‑ I brought up HIV/AIDS ‑‑ it’s how they figure out who might have it so that they can “treat them,” right? In the best case scenario, this information would not be used to police people. Right? But that’s not what happens. Contact tracing is when they try to figure out who has had it, or who gave it to you, or who you could have possibly given it to, right, in order to stop the spread, so you can get those people into treatment centers or in treatment.

So if I, for example, was to go into a clinic, and I was like, I have chlamydia. Right. And they’re like, okay, so you have chlamydia. Who did you have sex with before this? Who did you have sex with after you thought you might have shown symptoms? And then you list them off, like, okay, I saw Blunt! I saw Kendra! I saw Lorelei! I saw all of these people! And then they’re like, oh, did that person tell you that they have chlamydia? Did that person tell you that they have HIV/AIDS, da da da da da. We see the criminalization of HIV/AIDS similarly to the criminalization that’s happened with COVID, where they’re directly criminalizing the spread of COVID and HIV, but also generally, right? They use this information to be like, oh, who did you get this from? And ideally, they wouldn’t police people, but what ends up happening is they ask you all of this information, and then they pick out those indicators to be like, you’re a sex worker. Like, you’re selling sex. Right? So we’re gonna, like, show up, and we’re going to arrest you. Right.
I hope that that makes sense.

DANIELLE BLUNT: Yeah. And I also think, too, with the protests that are going around, I think that there’s a lot of contact tracing that’s being done with, like, stingrays and cell phone tracing, which I hope some other folks can talk about in a little bit. And I would just love, like, a one or two‑sentence summary of FOSTA‑SESTA and EARN IT before we go into them in more detail.

KENDRA ALBERT: Um. I will do my best. Lorelei is watching me with an amused look on their face.

What is FOSTA-SESTA?

So, FOSTA and SESTA are laws that were passed in 2018 that greatly increased the incentive ‑‑ F‑O‑S‑T‑A and S‑E‑S‑T‑A. Thank you, Blunt. That greatly increased the incentive for online service providers, folks like Facebook or Twitter or Craigslist, to remove any content related to sex work or possibly attributable in any way to sex trafficking or related to sex trafficking. And we can talk a little bit more about how specifically they did that, but that’s like the one‑line top‑level summary.
EARN IT is a pending bill in front of Congress right now that is meant to do something similar ‑‑ basically, it engages in some similar legal stuff around child sexual abuse material, incentivizing companies and online platforms to be more potentially invasive in their searches for child sexual abuse material by creating more liability if they’re found hosting it.
Lorelei, how was that?

LORELEI LEE: Great. I think that was great. It’s very ‑‑ I mean, FOSTA‑SESTA has a lot of parts, and so it’s… But I think what you are talking about is the most important part, which is the impact of it and what it incentivizes.
And the one thing I would add about EARN IT is I think what EARN IT will do that is similar to FOSTA‑SESTA is that it will incentivize companies to remove all information related to sexual health and anything that teaches youth about sexuality.

DANIELLE BLUNT: Thank you.

KENDRA ALBERT: Very upbeat.

DANIELLE BLUNT: And ‑‑ yeah. (Laughs) We’ll be getting into those a little bit more. I have one more question for Yves: Is turning health info over to the police doable via legal loopholes, like in HIPAA, or is that happening in the shadows?

KENDRA ALBERT: I can also take that one, if you prefer. So HIPAA ‑‑ HIPAA, which is the U.S. health care privacy law, federal health care privacy law, explicitly has a carve‑out for so‑called “covered entities,” health care providers, turning over information to law enforcement. So it specifically says, you don’t need to get consent from people to turn their information over to law enforcement. So HIPAA doesn’t prevent that.

You know, another thing ‑‑ and actually I think this ties in really nicely to some of the stuff we want to talk about, like the Patriot Act, which expanded the surveillance powers of the U.S. government and was passed in 2001, right after 9/11 ‑‑ is that a lot of times, even if there isn’t an explicit process for a law enforcement agent or even a public health entity to request information, say if they went through appropriate legal process, there are often legal regimes that encourage what’s called “information sharing.” Which just basically means that they try to eliminate, like, privacy or other reasons that information might be siloed between different parts of the government or different governments, like federal, state, local. So even if, you know, you don’t have law enforcement knocking on the health provider’s door with a request for information, like a subpoena or whatever, there are often these efforts to kind of standardize and collect and centralize these forms of information.
Yves, do you have anything ‑‑ do you want to add to that? Did that feel like an adequate summary?

YVES: I feel like that was very clear. Yeah.

What is the Patriot Act?

KENDRA ALBERT: So I think ‑‑ Blunt, do you mind if I keep going to talk about Patriot Act stuff for a second? So I think that, you know, one of the things that is worth noting is you see a couple general trends in surveillance. And I’ll also let Korica talk more about some particular ways this plays out in particular communities. But just on a super high level, we can see this sort of movement from the Patriot Act to now, where, A, we see more requirements around information sharing. One big critique of the U.S. government’s… I hesitate to say intelligence‑gathering apparatus with a straight face, but! ‑‑ and what I mean by that is the CIA, the NSA, the sort of intelligence agencies, as opposed to more traditional law enforcement agencies like the FBI or police ‑‑ was that they were gathering all this data, but they weren’t sharing it in ways that were actionable across multiple agencies. So when the Patriot Act was passed after 9/11, one of the goals was to make it easier for agencies to share information. I think that’s a general trend that’s happened, post‑Patriot Act to the current moment, where we see things like fusion centers and other ways to collect surveillance data ‑‑ Palantir’s databases, and ICE’s data collection… Data that’s collected across multiple methods of surveillance and put together to gain more information about the lives of individual people.

Obviously, this has dramatic effects on sex working populations ‑‑ A, because they’re often specifically criminalized and over‑surveilled. But also, you know… often, information that is innocuous, sort of not raising a red flag on its own, when combined with other information can suggest more specifically what kind of activities folks are engaged in or who they’re spending time with.

The other thing I want to highlight about the Patriot Act is surveillance after ‑‑ well. Two more things. I’m trying to be brief, but the lawyer thing is everything comes in threes, so I have to have three things I just want to highlight about the Patriot Act. Okay! So number two is that you see these particular surveillance tools originally deployed for what’s considered very, very important law enforcement activity. So originally, a lot of this stuff was talked about in the context of antiterrorism work, and that’s what the Patriot Act was about. But over time, these law enforcement tools sort of “trickle down,” for lack of a better term, into more day‑to‑day enforcement activities. We’ve actually seen this a lot with something called the sneak‑and‑peek warrant. Which is actually a term that, at least Wikipedia tells me, the FBI coined, not anti‑surveillance activists? It’s kind of funny that the FBI thinks that’s a good description of what this thing is. But basically, traditionally, if someone gets a warrant for searching your property, it’s basically a document where the police go before a judge and say, here’s why we want to search, and the judge says, okay. You said where, you said why you’re allowed to. I’m going to sign off on this, and you police can go search that person’s house, for example.

So traditionally, you know, if you’re at the house, and police show up, you can ask them to see the warrant. And say, hey, I want to confirm that this is the warrant that allows you to search my house. What a sneak‑and‑peek warrant does ‑‑ this kind of sounds like bullshit, but this is what really happens, right? ‑‑ is allow the police to set up a ruse. Like, to get you out of the house, go into the house, and sort of search your stuff. And actually, one of the contexts in which we’ve seen this really recently, and one of the reasons I connect this back to the Patriot Act, is the surveillance of massage parlors in the Robert Kraft case in Florida. What the police actually did is get a sneak‑and‑peek warrant and claim there was a bomb threat or a suspicious package at a nearby building. All of the folks who worked in the massage parlors were sort of escorted ‑‑ had to be away from the building for their own safety, and the police went in and put cameras in.

And we know this because they tried to criminally prosecute Robert Kraft, and Robert Kraft had enough money to hire a legal team that was able to challenge the validity of the sneak‑and‑peek warrant that they used to surveil the massage parlors and the people working there.

So, when those warrants were included in the Patriot Act, there was nothing in there about human trafficking investigations, let alone the stuff that happened in Florida ‑‑ which, actually, there was no human trafficking prosecution coming out of it. Right. From what I read, and others may know better than I do, so I’ll cede whatever claim I have to the truth there. But it doesn’t look like human trafficking was involved.

So sneak‑and‑peek warrants weren’t written into the law for those sorts of investigations, let alone surveillance of prostitution. But these information technologies ‑‑ and here, I include technologies in the computer sense but also the ways governments do surveillance ‑‑ once they get written into the law, their use often gets broadly expanded to new populations, new circumstances. It’s sort of like, you might as well use it.

I had a third thing about the Patriot Act. But… I guess the third thing I’ll say quickly, before I stop, is that I think one thing you’ll see a lot of in discussions about surveillance reform, and especially how it fits into conversations like we might have here about sex work, is sort of an inherent trust that procedures are going to save people. (chuckles) Which I’m really skeptical of, just sort of personally? But you know, if you look at what happened between the Patriot Act and now… There’s a thing called Section 215 of the Patriot Act, which basically functionally allowed the National Security Agency ‑‑ the NSA ‑‑ to search people’s call logs to see who was calling who. And there was a legitimately robust debate about this. But one of the reform methods that was actually put on the table and passed as part of the… USA Freedom Act, in I want to say 2015? Don’t quote me on the date. Was actually, they were like, okay, great. The U.S. government can’t hold this giant database of call data anymore. They can’t see who you called and when. But they can go to the phone company and ask them.

And like, yes, that is better. I’m not gonna ‑‑ I would rather they have to go and ask Verizon nicely before they get the call data. But functionally, I’m like, that’s not ‑‑ that’s not safety. Right? And I think that, you know, when we see a lot of the surveillance ‑‑ some types of surveillance reform activity, especially post‑Patriot Act, we’re not even getting close to back where we were pre‑Patriot Act. We’re sort of, like, trying to kind of tinker around the margins, slash maybe add a teeny bit more process. That’s not going to help the folks who are most criminalized and most surveilled.

Anyway. That was a lot from me, so I’m going to stop. Blunt, do you have another question you want to tee up, or folks want to react to any of that?

LORELEI LEE: I have a question, actually. What is a fusion center?

What is a fusion center?

KENDRA ALBERT: Um. Well, so I’m gonna do my best. It’s been a little while. But… Ha. Despite the weird, kind of futuristic name, it’s basically where all the cops get together. So ‑‑ (Laughing) Yeah. Different types of law enforcement often, like, have different beats. So one of the goals of fusion centers is to combine information and share policing and surveillance information from different law enforcement agencies. The one I’m actually most familiar with is the one outside of San Francisco. And there’s been ‑‑ I don’t want to erase this ‑‑ a lot of really amazing research and activism against fusion centers, often and primarily by communities of color. But usually, they’re like… It’s where the San Francisco PD and the Marin County PD, which is the county north of San Francisco, and the BART police ‑‑ BART being one of the public transit organizations ‑‑ would all share information and sort of share tips. And fusion centers were a result of the attempt after 9/11 to deal with what people saw as this problem of all this information being siloed.

DANIELLE BLUNT: Awesome. Thank you so much, Kendra.
Korica, I would love to hear from you if you feel like now’s a good time to chime in.

The history of surveillance in marginalized communities

KORICA SIMON: Yeah. So I can speak a little bit on the history of surveillance. So, as Kendra stated earlier, marginalized communities have historically been affected ‑‑ have been victims ‑‑ of government surveillance. And surveillance does have roots in slavery. In 1713, New York City passed lantern laws. These laws were used to regulate Black and Indian slaves at night. So if you were over the age of 14, you could not appear in the streets without some kind of lighted candle so that the police could identify you.
And we’ve seen this same thing recently with the NYPD, where they are shining surveillance lights ‑‑ like floodlights ‑‑ in Black communities. And we saw that increase after they received a lot of criticism over stop and frisk. And people in those neighborhoods were reporting that they could not sleep at night. Like, the lights were just blinding them.
And Simone Browne has written a lot about this subject and the ways in which light has been used to surveil people. And I believe she also writes about technology as well, and how we’ve moved to that side of things.

What is COINTELPRO?

So in regards to technological surveillance, one of the most well‑known abuses of surveillance by the government is COINTELPRO, which stands for Counterintelligence Program. It was basically a series of illegal counterintelligence projects conducted by the FBI aimed at surveilling, discrediting, and disrupting political organizations. The FBI in particular targeted feminist organizations, communist organizations, and civil rights organizations. And the government’s goal was basically to do whatever they could to disband them and get rid of them by any means necessary. And they mostly did this through wiretaps, listening in on people’s phone calls, tracking them down, as well as having informants involved.

And as a result of this, quite a few people were murdered or put into prison. Some Black members of the Black Panther Party are still in prison. And… two of the most talked about people who were victims of this are Martin Luther King, Jr., as well as Fred Hampton, who was drugged by an FBI informant and then murdered by Chicago police. But also Angela Davis has been a victim of this as well. And again, we know that these practices are still continuing today. So we kinda got into the protesters and how they’re being surveilled. I think it came out in 2018, 2019, that Black Lives Matter activists were being watched ‑‑ their activity was being watched on the internet. And now we have seen recent reports that protesters today are being watched as well, either through body cameras, cell site simulators, license plate readers, social media, drones, as well as cameras in the area that may use facial recognition technology that could help the police identify who a protester is and get access to their social media accounts.

So these are all, like, issues that are happening today as technology advances. We’ve only seen it get worse. And we know that marginalized communities are the most affected by this. If they use this on Black, Native, Latinx, and immigrant communities, they’re also going to use it on others as well ‑‑ including sex workers, and sex workers mostly fall into marginalized communities. So.

I don’t know if I should talk about the third part right now, or if I should wait? ‘Cause it’s a little bit different, but… Okay. I’ll just go ahead. (Laughs)

What is The Third Party Doctrine?

So, kind of transitioning a little bit. The Third Party Doctrine is a doctrine that comes out of two Supreme Court cases, United States v. Miller and Smith v. Maryland. And what they state is that if you voluntarily give your information to third parties, then you have no reasonable expectation of privacy. Third parties include your phone company, Verizon, Sprint; e‑mail servers, if you use Gmail; internet service providers; as well as banks. And so that means that the government can obtain your information from these companies without having a warrant. So they don’t have to have, like, probable cause that you’re doing something in order to get access to this information.

And the Supreme Court’s logic behind this decision was that, well, if you tell someone something, then you’re giving up your privacy, and like you can’t expect that that will stay private forever. What ‑‑ I should also back up and say that these cases were decided in the 70s? So. Not today, where like our whole life is on the internet, and we are constantly giving third parties our information. And actually Justice Sotomayor, she has suggested that she would like the Court to rethink the third party doctrine, because it’s just a completely different time today. A lot of us use our GPS, and we wouldn’t think that ‑‑ I don’t know. That they could just share all of our information without us knowing.

And I will say that if you’re ever curious about, like, how often the government is requesting access to this information, some companies ‑‑ like Google, I think Facebook, and Sprint ‑‑ do report this. I know Google reports it in its transparency reports. And you can see how often the government has asked them ‑‑

KORICA SIMON: Oh. Well, hopefully, they’re still doing it, and you can see. I think it’s roughly a hundred thousand people a year. But we don’t know, like, what the result of that is. It’s honestly probably a lot of people who aren’t doing anything at all.

And so we’ve also seen, like, some people starting to move their e‑mail accounts from using Gmail to e‑mail servers that care a little bit more about privacy and that are more willing to fight these requests from the government.

And then I’ll also say the last thing is that the government can also request that these companies, like, not tell you at all that they’ve requested this information. So… This could be done completely in secret, as well. So.

Privacy from law enforcement and the Fourth Amendment

KENDRA ALBERT: So. I want to flag some stuff that Korica said and highlight certain parts of it, and I want to contextualize a little bit of this. We’re sort of talking here about privacy from law enforcement, and the primary source of privacy from law enforcement in the U.S. is the Fourth Amendment ‑‑ which is so obvious Korica didn’t say it, but I’m going to say it just in case it isn’t obvious for other folks. And, you know… One thing worth noting about the Fourth Amendment, for folks who are concerned about the relationship between all of these legal doctrines and their actual lives: having Fourth Amendment protection, or saying, oh, the U.S. government violated the Fourth Amendment, only gets you so far. Because if what you want is for the government not to have access to that information, the horse has already left the barn, to use the right metaphor. Which is to say that most of the remedies that come from, you know, unconstitutional searches and seizures, or unconstitutional requests for information, are just about that information not being able to be used against you in court. Which is of very limited value if what you’re concerned about is the safety of yourself or your community, of not getting folks arrested ‑‑ or if you don’t have access to the kinds of representation and resources that would allow you to go through a legal battle, and you’re going to plea out the second that you get arrested.

So, you know, I always want to caution any story I tell, or any story we tell, about the importance of constitutional rights in this area with a little sort of real politic about what does it mean, or real talk about what does it mean, to have access to these kinds of rights.

The other thing I want to flag: what Korica said is 100% correct as a matter of doctrinal law. But there is a weird thing that has happened with regards to government access to data, which is that a lot of the larger online service providers ‑‑ and in this I include Google and Twitter, Facebook ‑‑ actively will not provide certain types of information absent, like, appropriate legal process. So it’s actually legally contestable whether the government can get access to your e‑mail that’s stored on a server without a warrant. It has to do ‑‑ in certain contexts ‑‑ with how old the e‑mail is, and whether it looks like it’s been abandoned on the server? The statute that covers that, the Electronic Communications Privacy Act, was passed in 1986, and boy does it read like that! Like, good luck!

Does the government need a warrant to access your e-mail?

But point being, Gmail will require a warrant to get access to your e‑mail content. That’s great, except that, you know… If your e‑mail content gets turned over and then you then want to challenge it, you’re still in the same place you were previously, which is that the government has access to the e‑mail, and that could mean serious consequences for you independent of whether it’s later admissible in court.

So part of what we’re talking about with legal literacy in this space ‑‑ what I want to encourage folks to think about is, okay, how do I keep me and my community safe independent of the legal remedies? Because oftentimes, those legal remedies aren’t accessible to everybody, just realistically and very obviously. And/or will only sort of help you after the fact? Maybe it means you recover money. Or maybe it means the evidence isn’t used against you in court. That doesn’t help very much if what you’re concerned about is the safety of you and your people.

So making sure that, like, we don’t pretend these remedies will make people whole for the harms they experience from surveillance or from the government. But rather that, you know, some of these protective measures that folks can take are about sort of trying to prevent the types of harm that the surveillance might cause in the first place.

And one more quick note is that a relatively recent Supreme Court decision has suggested that the government does need a warrant to access your cell phone location data. So that was one little bit of good in a sea of terribleness that is the third party doctrine.

LORELEI LEE: I think ‑‑ so I think I’ll respond to a little bit of that to say that I… you know, I’m thinking about the connections that are between what Yves has been talking about, and then what you folks are talking about, in terms of… the way that information gets used against you that isn’t really cognizable in the law, but once they have your information and have you on their radar, they use that information to get more information, to follow you, to trace your contacts, and happening in multiple different contexts. And… Something that I think is really important that people don’t think about all the time is that everyone breaks the law. And… (Laughs) And it is people who are criminalized. It is ‑‑ so, we think of criminalization as being about behavior. But it is really about people in communities.

Because everyone breaks the law, and it’s only the folks who are targeted by police ‑‑ and I mean state police, federal police, et cetera ‑‑ who get punished for breaking the law. And that punishment can come, you know, because they have followed you for a certain period of time in order to collect enough information to make something cognizable in the law.

So I’m thinking about how one of the themes of what we’ve been talking about is the deputizing of folks who are not thought of as law enforcement agencies, but whose collection of your information becomes a way of enforcing behavioral norms. And that happening in a way that is ‑‑ goes beyond what criminal law can even do. And so… Thinking about sort of the modern history of how this has happened in law that ‑‑ in some of the laws that we’ve been talking about, some of the federal laws that we’ve been talking about. So, the Patriot Act, one thing to think about in terms of the Patriot Act is how prior to 9/11, Congress had been considering passing some form of regulation for internet companies and data ‑‑ regarding data collection. And Kendra, please jump in if I am messing this up. Or anyone, obviously. (Laughs) But… When 9/11 happened, that regulation sort of was pushed to the side. And it is during this time period that we have this sort of ‑‑ we have the increase in government surveillance, but we also have this sort of recognition by private corporations that data collection is something that can be monetized. And they are unregulated in doing this, and there is this idea that if you have given over your information voluntarily, it belongs to those people, regardless of whether you did it knowingly or not.
So we have this sort of rise of data collection that is a creation of surveillance tools by corporations, and there’s sort of a monetary incentive to keep creating stronger and stronger data collection tools that can be more and more invasive and do this kind of contact tracing that does the thing that Yves has been talking about, that you folks have been talking about, where you don’t ‑‑ each piece of information looks innocuous, but when you put it together you have a map of who you are, and that’s especially concerning for sex working people because they identify sex working people based on this collection of seemingly innocuous information. And you have the incentive to create tools that are more and more effective at collecting that information. And then you have the partnership between government and private companies that then allows that information to be used in order to enforce norms that are… expected by the state, that are thought of as beneficial to the state. And, obviously, targeting people in the sex trades at a high rate. And especially people in the sex trades who are parts of ‑‑ part of other marginalized communities.

And so FOSTA and EARN IT are just sort of… I think we talk about FOSTA a lot as though it is a, like, a revolutionary law that was passed, as though it made huge changes in the law. And it, you know, it did make a change to one specific law that I think people thought of as being sort of dramatic. That’s Section 230. However, it is ‑‑ it really was just an evolution out of stuff that had already been happening. So FOSTA’s purpose, and EARN IT’s purpose, as well, one of the main purposes of both of these laws is to decrease limits on civil liability for internet companies. And you can think of that as being Congress sort of taking out, taking themselves out, taking their responsibility away from themselves in terms of regulating these companies and putting, deputizing civilians to do that regulating for them, and deputizing corporate ‑‑ also deputizing corporations to create… rules and collect data that is thought of… (Sighs) Or is publicized! As being some, having some impact on trafficking and sexual exploitation and sexual violence? But all of that being simply… a show. And actually increasing sex workers’ vulnerability to exploitation. And when you decrease our ability to communicate with each other, when you decrease our ability to be visible online, when you decrease our ability to share information, health and safety information, you increase folks’ liability ‑‑ sorry, folks’ vulnerability, to violence and exploitation.

And I notice that someone asked earlier whether EARN IT had a piece about not prohibiting sexual health information for youth. And it doesn’t, at all. But what it does is increase civil liability so that it incentivizes companies to draw the line further than the law specifies. And that is the same thing that FOSTA does. So, these laws ‑‑ because civil liability can be ‑‑ right, anybody can sue. You know… So, think about it in terms of criminal law enforcement ‑‑ you need specific resources. Like, the police and the FBI ‑‑ policing happens on all of these different levels, state and federal. Obviously, this has been in public conversation, especially recently: how many resources these folks do have. And it is an obscene amount of resources. However! It is still less likely that you will be subject to criminal prosecution than that, as a company, you will be subject to a lawsuit. And lawsuits also have a lower bar in court: to establish civil liability, you have to show less than you do to prove criminal liability.

And so when you increase civil liability, you incentivize companies to draw the line much further than the law specifies… because they want to get rid of even the appearance of liability, and even, you know ‑‑ because also, lawsuits are expensive, regardless of whether the claims can be proven or not! So, that’s ‑‑ I’ll stop there.

DANIELLE BLUNT: I wanted to make sure that we take a few minutes to sort of talk about what FOSTA‑SESTA and what EARN IT are amending. So Kendra, I would love just like a two‑minute summary of Section 230, and then Lorelei, if you wanted to sort of continue with what ‑‑ like, what EARN IT is, and what EARN IT’s proposing, and why the over‑‑‑ how ‑‑ and the ways in which it’s so overly broad that things like queer sex ed could get caught up in it.

KENDRA ALBERT: Yeah. And I think actually I want to sort of tag on to the end of what Lorelei was saying, ’cause I think it ties perfectly into a discussion about Section 230 ‑‑ the sort of, what we lawyers would call “commandeering”: the use of private companies to do things that the government isn’t sure it has the political capital or will to push forward itself. It’s not just that they can make it happen using civil liability. It also is much harder to bring a First Amendment challenge to companies deciding “voluntarily” to over‑enforce their own rules ‑‑ they’re not bound by the First Amendment ‑‑ versus the government making a particular rule, which would be susceptible to a First Amendment challenge.

So I can talk a little bit more about that, but I just want to underline that point Lorelei is making. Which is: delegating these responsibilities to private companies is not just better because you can kind of throw up your hands and claim no moral responsibility for what happens; it also limits the ability of individuals who are harmed by these sorts of changes to legal regimes to effectively challenge them.

What is section 230 of the Communication Decency Act?

So let me talk about 230, which I think will help us conceptualize the stuff, and then we can jump back to SESTA and FOSTA and EARN IT.

So Section 230 of the Communications Decency Act was passed in 1996…? I’m really bad with years. Anyways! Passed in 1996. And it was originally part of an omnibus anti‑porn bill that was supposed to restrict minors from seeing porn on the internet. Everybody in the 90s was real worried about porn on the internet. And… It turned out that most of that bill was unconstitutional. It was struck down by the Supreme Court in a case called Reno v. ACLU. But what was left was this one provision that hadn’t gotten a ton of attention when it passed, called Section 230. And what Section 230 does is it says that no online service provider can be treated as the publisher of content where someone else, like, sort of spoke it online.

Okay. What the fuck does that mean? So, let’s take Yelp, for example. On Yelp, there are Yelp reviews, posted by third parties. So say I post a Yelp review of my local carpet cleaner. I always use carpet cleaners as the example, because there’s a funny case involving carpet cleaners. Um. Anyway! I post a review of a carpet cleaner I have not used. They have cleaned zero of my carpets. I say: these people are scumbags. They ripped me off. They told me it would cost $100 to clean all my carpets, and it cost me $3,000, and I got bedbugs. So that’s inflammatory. They could potentially sue me, because it’s not true and it harms their business.

What Section 230 says is the carpet company can come after me, Kendra, for posting that lie, but they can’t sue Yelp. Or if they do, they’re going to lose. Because Yelp has no way of knowing if my claim is true or false.
So that’s the original meaning of Section 230. It’s gotten broadly interpreted, for lots of good reasons.

So right now, there are something like 20 lawsuits against Twitter for facilitating terrorism, all of them thrown out on Section 230 grounds. The one that is most relevant to our conversation right now is a case out of Massachusetts called Doe v. Backpage, which was brought by a number of folks ‑‑ survivors of trafficking ‑‑ against Backpage.com, for what they claimed was complicity and sort of knowledge of the ads that were placed on Backpage that they were harmed as a result of. And the First Circuit, which is sort of… one step below the Supreme Court, in terms of courts, said: That’s all very well and good, but Backpage isn’t the speaker of any of those ads. They didn’t write the ads. They don’t know anything about the ads. We’re throwing out this case. And in the aftermath of that, Congress was like, this shall not stand! And passed FOSTA and SESTA. And I’ll turn it over to Lorelei to talk more about that.

LORELEI LEE: I think I’m curious what the audience’s questions are about FOSTA‑SESTA, because I think there’s been quite a bit of information written about them, about that law, and I wouldn’t want to just talk on and on about it when it’s not focused to what people want to hear. Or should I talk about EARN IT? Or ‑‑ I don’t know, someone tell me ‑‑

DANIELLE BLUNT: I think we did a summary of FOSTA‑SESTA. I would like another one‑sentence summary of FOSTA‑SESTA, the impact. And then what the fuck EARN IT is and where it’s at, would be helpful.

LORELEI LEE: Yeah, so we can talk about why Section 230 matters to these laws.

DANIELLE BLUNT: Yeah.

Why is Section 230 important?

LORELEI LEE: So FOSTA‑SESTA does several things ‑‑ does like six things in federal law ‑‑ including create a new crime under the Mann Act, which was originally the White‑Slave Traffic Act of 1910, and it’s been renamed, but it’s not better…? (Laughing) Oh, boy. So it creates new crimes. That’s one thing that it does. But it also changes Section 230 so that it no longer protects against lawsuits under federal law regarding, specifically, the federal anti‑trafficking law ‑‑ 18 USC 1591, in case anyone wants that number. Um. (Laughs)

And so, what that does… It’s not clear that Section 230 was actually preventing people from suing companies ‑‑ specifically Backpage. Backpage was the center of congressional conversations and the center of media attention. And… The claim was that Backpage was not able to be held accountable for trafficking that was happening on their website. But actually, in the First Circuit case, there maybe… just wasn’t enough evidence yet to show how Backpage could have been held responsible regardless of Section 230, because Backpage was doing things like editing ads ‑‑ that kind of thing that would have made them liable in a way that’s unlike Yelp.

And so… So, okay. But! People started talking about Section 230 because there was a documentary made about the first circuit case, and it was very well‑publicized, and that documentary was shown in Congress. And people started talking about Section 230 as though that was the thing preventing lawsuits.
I mean, another important piece to remember about making civil liability the place where we enforce anti‑trafficking law and anti‑exploitation law is that it puts the onus on victims of trafficking and victims of exploitation to bring those lawsuits that are very expensive, that ‑‑ lawyers for those claims are inaccessible. You have to spend years talking about your trauma. And! You know, it takes such a long time to get ‑‑ if you are going to even get compensation ‑‑ and then, at the end, you get monetary compensation if you win your lawsuit. But ‑‑ it doesn’t prevent trafficking! And it doesn’t prevent exploitation. And we know that there are other methods of doing that that are much more effective. And many of those methods involve the investment of resources, I think ‑‑ I think this is one of the reasons that this is happening, is that many of those solutions involve the investment of resources in marginalized communities. And instead, Congress wants to pass bills that don’t require the redistribution of wealth in this country.

So EARN IT does something similar to Section 230. In the way that FOSTA makes a carve‑out in Section 230 around the federal anti‑trafficking laws, EARN IT makes a carve‑out in Section 230 around the child exploitation laws ‑‑ specifically, child sexual abuse material laws. And, similarly, when Kendra and I did this research, there haven’t been a lot of examples of Section 230 preventing those claims from being brought. So, again, it feels a little bit more like this is for show than anything else. But we can predict that the impact that EARN IT will have will be very similar to the impact that FOSTA had, in terms of the removal of information online and the censoring of people online. And ‑‑ not just the removal of information, but the prohibition on specific people talking.

And we think, based on our, like, analysis of EARN IT, that that impact will fall really on queer youth. Because that’s a specific community for whom there’s a lot of fear around sexual information being shared, and it’s also a specific community who is seeking that information out! Because, I mean ‑‑ having been a queer youth once myself! I know that, like, you just don’t ‑‑ you just don’t necessarily have access to folks when you’re a kid who can tell you that you’re okay and you’re normal.

DANIELLE BLUNT: Yeah. Thank you for that, Lorelei. And I wonder, too, if Yves and Korica, if you have anything that you would like to add before we open it up to the Q&A.

YVES: I mean, I think that… you know, y’all covered it pretty well, right? Like, I think that like everybody covered a lot of stuff. I mean, I’ve been looking at the questions, and I also only really have, like, a little bit to say in terms of… you know, the way that we see a lot of this happen, right? We’ve obviously talked about criminalization; we’ve talked about censorship, and kind of what happens. But specifically, in talking on this panel around the impact on sex workers and marginalized communities, like the ways that we really see a lot of this happen, and the push for this, right? Like, whenever there’s an increase in surveillance, like that increases the scope, it’s going to increase the scope of policing, and also the general stigmatization. Right?

So we don’t ‑‑ like, I think everybody kind of knows that I’m, like, most knowledgeable in terms of the impact on in‑person sex work. But when we also look at the impact these things have on, like, digital sex work, which has gotten so much more popular during these times, right, we also see that all of these groups ‑‑ the groups that are behind these bills to begin with, right? ‑‑ are pushing for other forms of censorship or limitations being put on not just sex workers but other marginalized communities. And also, like, you know, we know that these intersect. We know that there are intersections here. Like, they ask people like credit card companies to not accept payments for sex workers. We’ve seen that happen, right? And in terms of in‑person sex work ‑‑ and Lorelei talked about this, right? People get pushed into the most dangerous forms of sex work, making them more vulnerable. In fact, making them more vulnerable to the human trafficking that these people are so against, and making everybody so much more vulnerable to all of these things, which we kind of talked about.

And I think it’s important, in talking about the questions that I’m kind of seeing about, you know, what do we do? Because I feel like a part of our conversation kind of scares everybody into being like, oh, my gosh, I should just not use social media! I shouldn’t even text! (Laughs) Which ‑‑ I don’t want people to think that that is the case? Right? I think that, like, all of these different encryption methods and these things, right ‑‑ although! Right? I do not think that they’re foolproof, which they’re not! Like, there are still many ways in which the police and, like, other agencies can get access to this information, one of those ways being, like, whomever you’re sending the information to, and wherever that information kind of ends up. If you, like, you know, sync it to your laptop, sync it to your phone. All of these different ways, right?
But these are tools that can protect you.
So I think that if you want to use these encryption methods ‑‑ like ProtonMail to encrypt your mail, iPhone messages ‑‑ that’s a good thing. Take what you have. But know they’re not foolproof.

I also want to talk about ‑‑ Kendra kind of talked about this ‑‑ the reforms and what they look like, and how we think we’ve solved this, or people think they’ve been solving it? Obviously, I came into this conversation, I told everybody I’m an abolitionist. Right? I work with abolitionist groups. I think at the end of the day, surveillance is a strategy that they used in policing before this technology existed. Before they started doing this stuff, they always surveilled. It’s just an arm of policing. At the end of the day, the problem is policing ‑‑ policing that has always been used, and meant to be used, to disappear communities. Right?
So the bigger fight that we’re fighting is policing. So I don’t want people to think that the fear is, oh, you shouldn’t do anything. The truth is, if you’re a marginalized person, if you’re a sex working person, they are going to want to police you, no matter what, and we’re fighting against that.
(silent applause)

DANIELLE BLUNT: Thank you so much for that. Korica, did you have anything that you wanted to add?

KORICA SIMON: I will say I went to a conference. It was Law for… Black Lives? I think is what it’s called? And they do things around, like, the law and the liberation of Black people. And the speaker talked about how, like, have you ever noticed that if you live in a Black neighborhood, or a people of color neighborhood, police are everywhere? But if you live in white neighborhoods, police are not there. And it’s not because there is more crime in one area over another. In fact, like, police just make the crime worse. And I thought about that a lot, ’cause I’ve lived in Black neighborhoods. I’ve lived in white neighborhoods. And there is a stark difference. And there is still, like, “crime” happening in the white neighborhood, but nothing ‑‑ like, police weren’t there. So I think it is important to think about how policing is the problem.

And then one other thing I forgot to bring up is that before this talk, I was doing, like, research on what’s been happening lately, ’cause I feel like there’s just always so much happening. And something that I missed was that some police departments, like the NYPD and the Chicago Police Department, have been putting, like, ads… sex ads on websites, and people will text that number looking for services. And the police department will send them a message saying ‑‑ I have it pulled up. It’ll say, like: This is the New York Police Department. Your response to an online ad for prostitution has been logged. Offering to pay someone for sexual conduct is a crime and is punishable by up to seven years of incarceration. And… Yeah!
So, people in that article were talking about how the police have access to their name and their phone number, and they don’t know, like, what’s gonna happen to them. Like, are they just logged in some database? And I think it’s safe to assume that they probably are logged into some kind of database. And I think, as we think about how ‑‑ as Yves said ‑‑ sex work is becoming even more digital in the time that we’re in, like, the impact of this on sex workers, I’m guessing, is gonna be really large. So, yeah. I just wanted to bring up that extra way that surveillance is happening.

LORELEI LEE: I actually wanna add one more thing that I intended to say and forgot to say, which is just that, in terms of this question of what we do ‑‑ one other point, to me, is that when they’re passing these laws, something else they’re doing is deputizing us to police ourselves, and to chill our own speech, and to prevent us from organizing, and to prevent us from using any tools at all to communicate with each other and to talk about these issues. So, I don’t know. I do think that it is a mistake for us to use this as a reason not to… speak to each other! I mean, that is really what we’re talking about, when we talk about not using online tools and other electronic tools of communication.

DANIELLE BLUNT: Yeah, it’s really interesting, too. And right now, some of Hacking//Hustling’s researchers are wrapping up the survey that we were doing on content moderation and how it impacts sex workers and activists who are talking about Black Lives Matter over the last few months. And like, we’re definitely noticing themes of speech being chilled, just like we did with our research on FOSTA‑SESTA, as well as the impact of, like, platform policing on both social media and financial technologies has just about doubled for people who do both sex work and are vocally protesting or identify as activists. And… The numbers are just… very intense.

So like, I think… Being mindful about how we communicate, rather than not communicating, is a form of harm reduction and community care. And I also see this panel as a form of harm reduction and community care, in partnership with our digital literacy training. Because I do believe that the more that we know, the more we’re able to engage meaningfully when legislation like this comes up. And… Like… A lot of these laws aren’t written to be read by the communities that they impact ‑‑ they’re often intentionally written to be unintelligible to the communities that they impact ‑‑ and they think that they can just, like, get them signed into law without having to check in with the communities that are harmed by this legislation.

So I think that anything that we can do to better understand this and decrease that gap between the people who are writing this legislation, or the, like, tech lawyers who are opposing this legislation, and like bringing in our own lived experiences? Is incredibly important work.

KENDRA ALBERT: I also ‑‑ well, I know we want to ‑‑ well, I’ll stop. Blunt, you want to do Q&A?

DANIELLE BLUNT: Sure, we can do Q&A. If you had one thing to add, that’s fine.

KENDRA ALBERT: So one thought I had there is ‑‑ one, it’s totally right? But it can feel like, oh, my god, there’s so much? That’s one of the hard things with talking about surveillance. It’s like, yeah! You know, police and law enforcement have so many tools at their disposal, and… You know? But at the same time, like, we care for each other by, like, creating space to talk about what makes us feel safer, and how we can take risks that we all agree to be taking? Right? Risk‑aware consensual non‑encrypted information.

DANIELLE BLUNT: I love that! (Laughing)

KENDRA ALBERT: It rolls just right off the tongue.
But I think I want to highlight what Yves was saying, in terms of the problem is policing? And I think one of the ‑‑ and Korica also said the same thing, so, you know. What we’re all saying, in terms of the problem being policing, and the solution not just being like finding more ways to like slightly narrow the surveillance tools? I think one of the real problems around surveillance, sort of surveillance debates generally, and I say this as somebody who comes out of a technology law tradition, is that they are ‑‑ the folks who are doing work on like sort of high‑level surveillance tools, like things like the sneak‑and‑peek warrants or Section 215 of the Patriot Act, are often deeply disconnected from the communities who are most likely to be harmed once these surveillance tools are widely used. Right? Like, just like with the technology laws, right, there is this way in which, you know, the conversation around like mass surveillance is up here, and we’re supposed to be afraid of mass surveillance, because mass surveillance means surveillance of white folks like me and not communities of color. Right? But at the same time, like, that… So much of the rhetoric relies on like the idea that it’s okay to surveil some folks, but it’s not okay to surveil others. And part of how we fight back is by deconstructing the notion that it’s okay to do this ‑‑ to use these tools on anybody. That like, you know, it doesn’t ‑‑ it’s not actually like, oh, there’s a bad enough set of crimes to make this okay. Right? And that’s part of ‑‑ part of it is not getting sucked into the sort of like whirlpool of like, well, you know, is terrorism worse than human trafficking? Well, if terrorism isn’t worse than ‑‑ if they’re both equally bad, then we need to have the same tools to prosecute human trafficking as we do to terrorism. And here we are where they’re getting a sneak‑and‑peek warrant to go into a massage parlor in Florida.

And I don’t say that flippantly, because those are real folks’ lives, just like there are real folks’ lives impacted by surveillance of supposed terrorist communities. Looking at all of the mosques in New York and Detroit, where folks were under persistent surveillance after 9/11.

So I think part of what we do is we resist the idea that it’s okay if this happens to other people. Because, you know, that’s how… That’s how the tools get built that will, like, eventually be used against all of us. That was what I wanted to say. I’ll stop there.

DANIELLE BLUNT: Thank you, Kendra. Okay. We’re going to open it up for Q&A. Someone asked if we could touch on the recent encryption legislation and how protected we are using services like Signal, WhatsApp, and Telegram.

KENDRA ALBERT: I can take that, and then if Lorelei and Blunt, if you want to jump in if I screw it up.

So, you know, EARN IT was one of the sort of pieces of legislation that was kind of proposed to… I don’t want to say “end” encryption, but it would have had the practical effect of making encryption ‑‑ encrypted services ‑‑ more difficult to sort of produce. The other is the Lawful Access to Encrypted Data Act ‑‑ LAED, which I think is pronounced “laid”? That’s probably not how people have been pronouncing it. But.
(Laughter)
The ‑‑ that bill is way worse. I do not think it’s going to pass. It sort of all‑out tries to ban encryption.

EARN IT actually, sort of between the initial proposal and the version of the bill we’re currently on, got much better on encryption? So now it specifically says that, you know, using encryption isn’t supposed to be able to be used against a service ‑‑ like, for purposes of figuring out whether they’re liable for child exploitation material. It’s really ‑‑ it turns out that that construction is not just complicated when I say it, but very complicated in the bill, and might do less.
In terms of what the impact is gonna be on, like, Signal, WhatsApp, and Telegram ‑‑ you know, what I’m hearing in this question is sort of end‑to‑end encrypted services, where the service provider doesn’t have access to your communications? You know, I think that if, God forbid, EARN IT as currently written passes, it would be unlikely to result in Signal or WhatsApp going away. In fact, actually, some advocates ‑‑ like Mike Masnick in particular ‑‑ are currently arguing that the current construction of the bill protects encryption? I’m a little more skeptical about that than he is. Happy to sort of ‑‑ you know, at me on Twitter if you want to talk about that.

But I don’t ‑‑ those services are not going away under the current version of EARN IT. However, the Justice Department has been trying to sort of get back doors into encrypted services for a long time, and they’re not going to stop. So it’s sort of… nothing to panic about right now, but stay vigilant on that front.

YVES: I just wanted to generally say, right, like, I think… if the question’s kind of getting at, like, in your personal life, what’s the danger ‑‑ or, like, if you’re doing some kind of criminalized work, or something that you’re afraid of the police getting information about, right? Like, it’s not gonna do you harm to use an end‑to‑end encrypted service, like iMessage, like WhatsApp, like Signal, like Telegram. Right? But it’s not something that’s gonna protect you wholly? I should also note that, you know, maybe the person asking this question is, like, an organizer or a whore, like, you know. So, like ‑‑ most of the time, when this information gets in the hands of police from, like, your texts or things like that, it’s not because they’ve, like, hacked the system. It’s not something like that. It’s usually because the police have gotten ahold of someone you’ve talked to, and they’ve given that information to them. And, like ‑‑ I kind of talked about this in the beginning, right? When they deputize civilians, I don’t just mean generally ‑‑ I literally mean there are also people who are just going to be, like: I think that there’s a sex worker at my hotel! Like, da da da da! I think there’s a sex worker in my Uber! Right? And, like, handing over that information.
So I don’t want people to be like, oh, I’m just not safe anywhere. Because that’s not really what the scenario looks like in real life, when you’re, like, on the street and working. Right? But these services are not, like, fully safe either. It’s not like, oh, you can type anything into Signal, and it’s, like, Gucci.

DANIELLE BLUNT: Right. And I think, too, people can take screenshots. Oftentimes, that’s how information is shared even when you’re using encrypted channels. So I think also just being mindful about what you say, and who you’re saying it to. If you’re using Zoom, knowing that this is going to be a public‑facing document ‑‑ and we’re not currently planning any political uprisings in this meeting? So it feels okay and comfortable to be using Zoom as the platform. But, like… Personally, in my work, even if I’m using an encrypted platform, I do my best to avoid saying things that would, like, hold up in court as evidence, in the way that I use language.

KENDRA ALBERT: Yeah. I think, in the immortal words of Naomi Lauren of Whose Corner Is It Anyway, people need to learn how to stop talking. Which it turns out is both solid advice generally, and my advice if the police want to talk to you. So, solid on many different fronts.

LORELEI LEE: I think it’s really important ‑‑ like, I think several people have said this already, but just to really emphasize that when we’re talking about this stuff, the intention is to have… you know, informed consent, for lack of a better word, in using these tools. And that… You know, especially if we’re talking about sex working people, we’re talking about ‑‑ Aaa! Caty Simon! (Laughing) I’m sorry, I had to interrupt myself to get excited that Caty Simon is here. Another expert on all of this stuff.

The thing I was going to say is I do think that sex workers and criminalized folks from many marginalized communities are really good already at risk assessment. Understanding what level of risk you are comfortable with, and using these tools with that in mind. And knowing that ‑‑ there are no answers! Right? There’s no system except abolition that is going to prevent these kinds of harms from happening. Abolition of policing and capitalism, perhaps! So.
Oh, and the thing I was gonna say, which is maybe not that important, but the question I had for Kendra, is whether you think encryption is still implicated by EARN IT through the best practices, and how that might inform what corporations do going forward. I’m not sure if that’s too far in the weeds?

KENDRA ALBERT: I think it could be. So one of the things we’ve been saying internally about EARN IT, like in Hacking//Hustling, that I want to emphasize here, is that the lack of clarity about what the bill is going to do is a feature for its creators, not a bug? It’s not that we’re failing at interpretation? ‘Cause we’re not. You know. But the… You know, I can say all I want about what I think EARN IT means with regards to Signal and Telegram, but as Lorelei pointed out, one of the things EARN IT does is create this commission that creates best practices, and who the hell knows what’s going to be in there. And it’s really unclear even how the liability bits are going to shake out.

So even with a specific amendment to the current version of EARN IT that’s supposed to protect encrypted services, we don’t really know what’s going to happen. So really good point, Lorelei. Thank you.

DANIELLE BLUNT: “Other than being educated or somehow not using any technology, what can we do?” I feel like we touched on that a little bit, but if anyone wants to give a quick summary.

KENDRA ALBERT: Yeah, I mean, I think just to echo what Lorelei and Yves have already said, right. Engage in thoughtful conversations around how you’re using the technology, and be thoughtful around how you’re using it, and who you’re sharing what with. I think for me ‑‑ and actually, maybe this gets to the next question, which is sort of like… It doesn’t matter if you don’t break the law? Or ‑‑ but I don’t have anything to hide! Right? You know, the way I think about this is that, like, everyone has something that law enforcement could use to make your life miserable. That’s just reality. Some folks have many more things! Like. But everyone has something. And so… My goal, like our goal here, is not to suggest paranoia ‑‑ they could be listening to everything. Although, you know, yes! I’m not sure that there’s actually legal authority for most of the things that law enforcement wants to do, and, like, I’m not under any illusions about that. But part of how we think about this is, you know, how do we take care of the folks around us, be thoughtful around the risks that we’re taking, and make sure that we’re taking risks that are aligned with our values and the things we need to do? Right? And those are gonna look different for everybody. But I welcome thoughts from other folks on the panel, ’cause I think I’ve said enough.

LORELEI LEE: I think I’d like to add something, which is that oftentimes when we talk about this stuff, we talk about it in terms of personal risk, as though risk belongs to us alone. I think it’s really important to recognize the communities and the people that you’re interacting with, and to understand that even if you feel as though law enforcement won’t do anything to you ‑‑ you’re not a likely target ‑‑ it’s highly likely that there are people in your life who are likelier targets. And your refusal to talk to law enforcement, or your care around how you communicate with folks, is protective of the people around you and the people that you care about, whose levels of risk you might not even know.

And then the other thing I want to add, in terms of things to do, is to think about what actions you’re capable of taking to oppose the passage of anti‑encryption laws, to oppose the passage of laws that target sex working people and people of color and people in marginalized communities. And to think about this: if you feel as though you have a lower level of risk of being targeted by law enforcement, that means that you have a greater capability for maybe going out and protesting! I have to tell you, I know a lot of criminalized people who do not feel safe protesting on the street, do not feel safe talking on unencrypted platforms, don’t feel safe talking on panels like this. And if you feel safe doing those things, then it’s your responsibility to do those things. So.

YVES: I mean, I don’t know if you were going to ask the next question, but Lorelei and Kendra kind of talked about it a bit. I just want to say: if someone says to you, like, it doesn’t matter if you don’t break the law ‑‑ that’s not really the issue, right? Like, laws, crimes, the things that we define as crimes are entirely arbitrary. Right? So who gets arrested, who gets criminalized, all of these things are… just simply based on who the system is against. Which we know means Black, Indigenous, people of color, especially trans people, any gender nonconforming people, sex working people, anyone who is outside the scope of what white supremacist culture would consider to be a good and appropriate person! Right?

So it’s not really about breaking laws. Or, you shouldn’t be afraid of anything if you haven’t done anything. Because it doesn’t matter. They’re going to criminalize people regardless of that. Right? They’re going to incarcerate people regardless of that. Like, all of these things are a death sentence to marginalized folks, which is why we talk about it in this way. It’s not just that, like ‑‑ well, I mean, it is about how surveillance is bad? It’s infringing on people’s rights, right? But it’s also about the fact that surveillance is a tool that is used for policing, for incarceration, in order to just disappear whomever. Right?

So, when talking about that, surveillance is bad for that reason. And for the reason I talked about a little bit with contact tracing, right? That, in theory, should be a good thing. It should mean that we are keeping people safe. Should mean that people aren’t getting COVID, or are getting treated for COVID, are getting treated for HIV/AIDS. But we know that in a world where we have policing, that is simply not what happens! Right? It’s not a question of, will they use it? They will. They will use it, they will criminalize people, they will arrest people. So we want to get rid of it. Wholesale.

DANIELLE BLUNT: Yeah. And I think, too, when you talk about contact tracing in that capacity, I can’t help but think about the ways that data is scraped from escort ads to facilitate the de‑platforming across social media and financial technologies of sex workers and other marginalized communities, as well as activists. So I think both on the streets and on the platforms, like… This is not being used for good? And that it needs to end.

Okay. I’m gonna try and get one or two more questions in. Someone asked, do you think that EARN IT is going to be passed?

(all shrugging)

I think that’s our official comment!

KENDRA ALBERT: Yeah, for anyone that’s not watching the video or is not able to watch the video, there is just a lot of shrugging.

DANIELLE BLUNT: Yeah. But if you keep following Hacking//Hustling, we’ll keep talking about EARN IT and updates when they come, so. If you want to follow @hackinghustling on Twitter, that’s usually where our most up to date shit is.
Someone said, hypothetically, if someone wants to be a lawyer and is studying the LSAT and hoping to apply in the fall, should they not post publicly about these things or attend protests where you could be arrested?

KENDRA ALBERT: I can take that one. So, you can absolutely post publicly about these things. The thing to worry about here, for lawyers, is this thing called character and fitness, which is basically: if you want to practice as a lawyer after you go to law school, you have to get admitted to one of the state bars, and state bars have particular requirements. I actually don’t know a ton about how those interact with, like, a past history of sex work? But the sort of watchword in thinking about character and fitness is honesty, generally speaking. Pretty much most things are overcomable through character and fitness, if you explain sort of what happened. So getting arrested at a protest ‑‑ you can totally still pass the bar and become a lawyer after that. Same with posting publicly about, like, abolition or sex work or, you know, those kinds of things.

You know, where I would start to think about whether you want to talk to someone who has more experience with this than me is, um, if you have felonies on your record, or if you are worried that you have any behavior that folks might use ‑‑ might believe makes you less honest. So things like fraud convictions often come up. But I’ll stop there.

DANIELLE BLUNT: Awesome. And then I think this will be our last question, as we’re just at time. Any book recommendations along with Dark Matters: On the Surveillance of Blackness by Simone Browne? So it sounds like folks are interested and want to learn more.

KENDRA ALBERT: This isn’t a book, but Alvaro Bedoya recently wrote a piece on The Color of Surveillance. It’s really amazing. So I recommend that.

DANIELLE BLUNT: Will you tweet that out? If folks say things, will you tweet them?

KENDRA ALBERT: Yeah.

KORICA SIMON: I have a few books that I’ve ordered and need to read before the summer is over? Black Software: The Internet & Racial Justice, from the AfroNet to Black Lives Matter. It’s talking about how technology can… Oh ‑‑ how digital racial justice activism is the new civil rights movement. There’s Automating Inequality: How High‑Tech Tools Profile, Police, and Punish the Poor. Have you read that?

KENDRA ALBERT: It’s really good. I really recommend Automating Inequality.

KORICA SIMON: The last one is Race After Technology: Abolitionist Tools for the New Jim Code.

LORELEI LEE: I think I would add The Age of Surveillance Capitalism, which talks a little bit about the history of the development of some of these data collection tools.

DANIELLE BLUNT: Yves, did you have one you were saying or typing?

YVES: I mean, I would recommend The Trials of Nina McCall, which is about sex work surveillance ‑‑ I think the subtitle is the decades‑long government plan to imprison “promiscuous” women. I would also recommend, if you’re interested in learning more about how public health is weaponized as surveillance against marginalized communities, Dorothy Roberts ‑‑ she writes a lot about this. So, yeah.

DANIELLE BLUNT: And ‑‑ that’s a beautiful place to end. Thank you, Yves, for sharing that. I feel like ‑‑ (Laughs) We’re all ‑‑ everyone’s crying. I’m crying. Speaking for everyone. (Laughs) If people want to find you online, or if you want to lift up the work of the organizations that you work with, can you just shout out the @?

KENDRA ALBERT: @hackinghustling. It’s really great!

YVES: You’re fine. @redcanarysong, and @SurvivePunishNY.

DANIELLE BLUNT: Well, we are slightly overtime. Thank you so much to our panelists, and Cory, our transcriber, for sticking with us. I’m going to stop the livestream now, and stop the recording.

Panelists

Lorelei Lee (they/them) is a sex worker activist, writer, recent law school graduate, and 2020 Justice Catalyst Fellow. Their adult film work has been nominated for multiple AVN awards and won a 2015 XRCO award. Their essays, fiction, and poetry have been published or are forthcoming in The Establishment, Denver Quarterly, $pread Magazine, Salon, Buzzfeed, n+1, WIRED, The Believer, and elsewhere. They are a contributor to the anthologies Coming Out Like a Porn Star, The Feminist Porn Book, Hustling Verse, and others. They were a founding member of Survivors Against SESTA, are a researcher and analyst with Hacking//Hustling, and serve on the steering committee of Red Canary Song.

Yves (they/she) is a queer and disabled Viet cultural worker and sex worker whose organizing home is with Survived & Punished NY, Red Canary Song, and currently FTA4PH. Yves comes from a background in Rhetoric and focuses on the study of collective and public memory and uses it as a framework for their work in art and organizing for prison/police abolition and the decriminalization of sex work.

Kendra Albert (they/them) is a clinical instructor at the Harvard Cyberlaw Clinic at the Berkman Klein Center for Internet and Society, where they teach students how to practice technology law by working with pro bono clients. They have also held an appointment as a lecturer, teaching classroom courses on the First Amendment as well as transgender law. Kendra holds a B.H.A. from Carnegie Mellon University and a J.D. from Harvard Law School. They previously served on the board of Double Union, a feminist hackerspace in San Francisco, and run a side business teaching people how to use their power to stand in solidarity with those who are marginalized. Kendra’s research interests are broad, spanning constitutional law, queer theory, video games, and computer security. Their work has been published in Logic, WIRED, and the Green Bag, and covered in The New York Times.

Korica Simon (she/her) is a third year law student at Cornell University and a fellow for the Initiative for a Representative First Amendment through Harvard’s Cyberlaw Clinic. This past year, she worked as a graduate teaching assistant for an information science course at Cornell called Information Ethics, Law, and Policy, where she taught a course around the ethics of up and coming technology and engaged with students on how the law should respond to these innovations. In addition, she has had the pleasure of working on sex worker rights issues through an internship at Legal Voice in Seattle and through the Cornell Gender Justice Clinic. After graduation, she’s hoping to become a privacy lawyer focusing on freedom of expression issues that marginalized communities face in the age of technological surveillance.
