Movement Lawyering: Challenging Narratives Around Online Laws Pertaining to Sex Work

Movement Lawyering: Challenging Narratives Around Online Laws Pertaining to Sex Work. The Office of Clinical and Pro Bono Programs, the Cyberlaw Clinic, and the Berkman Klein Center at Harvard University. July 2020.

What is Movement Lawyering?

Mason: Thank you all for coming. Welcome to day two of our two-day series on Movement Lawyering. We're very excited today to have two incredible speakers with us. First, we have Danielle Blunt, who is a professional New York City-based dominatrix and sex worker rights advocate. She has her master's in public health and researches the intersection of sex work and equitable access to tech. Blunt is one of the co-founders of Hacking//Hustling, a collective of sex workers and accomplices working at the intersection of technology and social justice, formed in response to SESTA-FOSTA. Blunt is on the advisory board of Berkman Klein's Initiative for a Representative First Amendment, and she enjoys redistributing money from institutions, watching her community thrive, and making men cry.

We also have Kendra Albert, who is a clinical instructor at the Cyberlaw Clinic where they teach students to practice technology law. They hold a degree from Harvard Law School
and serve on the Board of the ACLU of Massachusetts. They enjoy redistributing money from institutions, working on their solidarity practice, and making people in power uncomfortable.

So two amazing speakers. And I'm going to turn it over to Kendra to begin the first part of our discussion.

Kendra Albert: Awesome. Thank you, Mason. I'm super excited to be here and be in conversation with Blunt, building off of Asana and Yumina's fantastic introduction to movement lawyering yesterday. Blunt and I talked about this a little bit in advance: I think what we're hoping to talk about is, realistically, how conversations around movement lawyering-style relationships might work in practice, using the example of some of our work together.

I figured we'd start with what I've been jokingly calling our organizing meet-cute, which was how Blunt and I became connected and started working together. Then we'll talk a little bit about how we think about our work, some of the stuff that we've done together, how that fits into the movement lawyering frame, and how there may be other frameworks to think about it, and then what lessons we might be able to learn from some of our work together that we're taking into the future and that may be helpful to you. That's my plan. Blunt, anything you want to add to that?

Blunt: No, I'm really excited to chat about this and for the opportunity to reflect on it, because from my perspective, it wasn't so much an intentional seeking out of movement lawyers as it was screaming into the void, and Kendra responded.

Kendra Albert: Do you want to talk a little bit about that?

What role does social media play in movement work?

Blunt: Yeah. I retweeted it yesterday, but I… Hacking//Hustling was formed in response to SESTA-FOSTA. Melissa Gira Grant and I, and some other comrades, put on some immediate harm reduction programming with Eyebeam. Eyebeam is an art and tech organization in Brooklyn that funds some pretty awesome work. They had recently done a panel series on women in tech, and sex work was left out of the conversation, so we invited them to continue the conversation. That became our programming there, which then turned into the organization that Hacking//Hustling is.

While we were organizing against FOSTA-SESTA, we were met with this sort of deafening silence from the tech community, from tech lawyers, from just about everyone other than sex workers and very few allies. I was researching content moderation and doing as much research as I could, because the people who I expected to be having these conversations weren't. I was reading Custodians of the Internet and saw that Tarleton was giving a talk at, I think, Berkman Klein. And Kendra just happened to be moderating that conversation.

I raised my hand on Twitter and tweeted. I didn't even tweet at you; it's just something we were discussing. But I tweeted at Berkman Klein and Tarleton, asking how you can write a book on content moderation with a whole chapter on Section 230 and never mention FOSTA-SESTA. Kendra, do you want to talk a little bit about what your response was?

Kendra Albert: Yeah, sure. I think I asked the question to Tarleton at the session. A little bit of backstory prior to that moment: sex worker rights issues were something I'd been paying attention to in the background for a while. I'd read Melissa Gira Grant's Playing the Whore, which is an excellent book for folks who haven't read it, and I had been roughly following the aftermath of FOSTA-SESTA and the organizing that happened before it, mostly through the lens of EFF talking about sex workers and working with sex workers a little bit in their organizing against FOSTA-SESTA. When it came time for questions, I asked the question, and I don't really remember what Tarleton's response was.

But then afterwards I reached out to Blunt on Twitter and found the thread and was like, "Hey, I hope I asked it okay. I didn't see you." There was a longer bit that Blunt had screenshotted.

Blunt: It was very verbose.

Kendra Albert: Yeah. And I don't think I asked quite that. I think I just literally was like, "So what about FOSTA-SESTA?" Right? But happy to talk more. And so then we ended up, I think, having a phone call where I got to hear more about what Hacking//Hustling had been doing. I actually had tried to tune into the Eyebeam event, but I hadn't quite been able to hear it.

We started talking about what next steps looked or felt like for Hacking//Hustling and how maybe I could be useful. And I think one thing to note about those initial conversations is that I was very much approaching them from a lawyer frame. Not necessarily that I would take Hacking//Hustling on as a client, but: my expertise is as a lawyer; what I'm bringing to this is my ability to interpret FOSTA-SESTA, or my ability to do legal reasoning, or whatever. And that wasn't exactly what you were looking for from me. Can you tell me more, tell us more about that?

Blunt: Sure. Yeah. We'd been very frustrated trying to get a response. And I also want to note that your initial response made me very excited to pursue working together and having conversations together, because it was something like, "Yes, this is really exciting for me. I've been really interested in thinking about FOSTA-SESTA but didn't want to do that without input from sex workers." And I was like, "Great. Like an ally. Great, amazing."

And so we took it from there. I think we had a phone conversation, and then you invited me and a few other sex workers to Berkman Klein to have a conversation. What I also remember about that conversation is that it was a group of folks who have access to those institutional spaces, who benefit from the privileges of being employed by or going to Harvard, sitting in a room listening to three sex workers scream about how horrible this legislation was, what our fears were, and what we were experiencing in community.

Reflecting on that, it very much was in alignment with the work that Hacking//Hustling had been doing, where people who hold power in institutions are put in a situation where they first have to listen to the communities who are impacted and whom they purport to serve.

That meeting, to me, followed the same structure as our initial programming at Hacking//Hustling at Eyebeam, where the first day was a panel of sex workers talking about their experiences navigating online spaces and losing access to those spaces. And then it was followed by a day of… We actually found T for Tech, a trans-led organization providing harm reduction materials, which also had sex working teachers to give the harm reduction and digital security trainings, which was actually very cool. Yeah. It was really amazing that my first time entering the Berkman Klein space, people were really interested in listening before moving into brainstorming solutions.

Kendra Albert: Yeah. And one thing to flag there is that we also did think about next steps and brainstorm solutions, and a couple of different things came out of even that meeting. I think there was a harm reduction zine on understanding financial systems and how platforms track you, with an eye towards reducing the chances of folks getting their financial accounts shut down, because, for folks who aren't aware, that's something that happens to sex workers all the time. And then we also drafted, with some clinic students, model letters that could be sent to a platform that deleted your account because there was sexual content on it.

Not that that held any particular legal weight; there's no legal claim you can bring. But just in terms of having access to a draft or a template letter that's in lawyer language, that was something we worked on. What I remember about that first meeting is that I was really nervous, because I was really worried that it wasn't going to be useful or whatever, and that I went and bought very fancy donuts because-

Blunt: I remember the donuts.

Kendra Albert: I wanted to suggest that y'all were worthy of very fancy donuts. And my colleagues in the clinic, including Nathan and Adam Nagy and many other wonderful folks, helped me carry all the coffee equipment upstairs, because I felt very strongly that it should be hospitable. I don't know. My relationship to my Judaism is questionable at best, but the Jewish mother instinct of "I will make sure you get the appropriate fancy food" is strong, right?

Kendra Albert: So yeah. And then I think one of the things that we worked on from there was something that came to fruition yesterday. As I started prepping for this conversation, one of the things I wanted to do was talk a little bit about the legal context of FOSTA, again, because that's where I felt comfortable. And I realized that nobody had really written a ton on it, and it wasn't really clear what the law did.

I did some analysis, but was also like, "This is vastly incomplete." And we ended up, along with Hacking//Hustling comrade Lorelei Lee, working with the Cornell Gender Justice Clinic to produce this very long form guide. Every time we thought we were done, it grew three sizes, so it's 87 pages. It's on SSRN. I tweeted it yesterday.

In some ways, it's not a great example of movement lawyering, because it's not really community-facing. But on the other hand, it is a response to a need that we identified together, which was the lack of ability to really understand what FOSTA was doing and to be able to point people to things.

Blunt: I also want to backtrack just a little bit on framing that as an act of movement lawyering, because it was addressing the needs of the community, I think, by better preparing other lawyers to have the conversation. It was impossible to find a lawyer after FOSTA-SESTA was signed into law to give a "know your rights" training, because no one knew what the law did or understood it. So I think that it did meet a need. And what I think you're not giving yourself credit for is that we also provided a brief, a little card, that could be handed out to street-based workers who were interested in knowing what FOSTA-SESTA was. There were also educational components.

Kendra Albert: That's fair. Yeah. I do think, in some ways, it was the analysis we needed in order to do a lot of the community-based work, even if it isn't directly accessible to community. I'm wondering if we want to talk a little bit about the bigger event that we did and that process, and then maybe we can migrate towards talking about how we think about movement lawyering as a term to describe what we're doing, or other lessons we've learned.

Blunt: Yeah. Something that Hacking//Hustling is interested in is moving forward these conversations about how sex workers utilize technology and the ways that sex workers are harmed by the same technologies that they need to use to survive: to stay in touch with community, make money, organize, and fight legislation like FOSTA-SESTA and the EARN IT Act. We want these conversations to be primarily by and for community, but something that is also very important to me, and to the work that Hacking//Hustling does, is that these conversations are also being had in spaces that have institutional power.

Blunt: And I think that people whose MO is already operating within those spaces of institutional power often overlook how much can come from attaching movement work to a name. There are definitely pros and cons to this, but if my organization collaborates with Berkman Klein, or has Berkman Klein's name on something, which we'll talk more about later, that allows the work that we've been doing to be seen by the academy as worthy of putting resources toward, and it led to collaborating with Cornell to create the legal document that we needed and that served a need of the community.

Kendra Albert: I note that Tim in the Q&A has been like, "Kendra said they'd talk about how they screwed up, and I haven't heard about that yet." This is perfect. I'm so glad you asked that, Tim, because actually, what Blunt just said about how powerful these institutions are as conveners and legitimizers of work was something I didn't understand before I started working with Blunt.

I think when I started working with Blunt, I didn't really understand why Hacking//Hustling was so invested in throwing an event at Harvard. And this actually led to a really interesting miscommunication, which I'm going to talk about for a second. Even before I started working with Blunt super formally, I felt strongly about making sure folks got paid. And I feel way more strongly about that now, as anyone who is working with me or has worked with me knows.

Kendra Albert: I ran into a lot of barriers around getting Hacking//Hustling folks paid when we were trying to put on an event at Harvard. And it made me really uncomfortable because I felt weird going to Blunt and saying, “I can’t pay you.” Because I understand how important getting money for this kind of work is.

I think basically what I did was stop responding to email for three weeks, out of shame. Eventually, we got on the phone and I was like, "Look, I've tried, and I just don't know how to pay you. I cannot pay you what this work is worth. Do you want to cancel?" Blunt, do you want to talk about what you said?

Blunt: Yeah. I was trying to parse that, because I was like, "Canceling wasn't on my mind." First of all, we do this shit for free all the fucking time, and that's fucked up. But I also want to call out that for the first year, Hacking//Hustling was 100% funded through our main organizers' direct labor in the sex trades and through a client donation that went through a 501(c)(3). I frame all of my work as hustling, so when we're having this conversation about movement lawyering, I'm like, "Oh, like hustling academic institutions to shift their power? I can talk about that." Since we had that client donation, Hacking//Hustling was able to do what we had done with Eyebeam as well: Eyebeam was able to find X amount of money, and where we felt people should be paid more for their labor, we were able to fill in the rest. That's also what we did with the event that we put on with Berkman Klein.

But it wasn't just about access to the financial resources that Harvard has, which they do have; it was just very difficult to find them for this purpose. We were able to pay people through the work that our main organizers were doing in the sex trades, and we pay people fairly well. I was like, "Don't worry. We hustle in other spaces too. We've got this covered."

Having this event take place at Harvard is something that would get press coverage in a way that it otherwise wouldn't, and would bring these ideas and this community's expertise to people who don't normally have access to them. There were a few things that came from that. I don't know if you want to talk about a little bit of the internal process of organizing it, and the work with Whose Corner, and what came from it.

Kendra Albert: Yeah. I think that through that event, I got introduced to the folks at Whose Corner, which is a mutual aid org out in Western Mass focused on street-based sex workers who are homeless or use drugs. There was some stuff where they needed not necessarily legal advice, but counseling that had to do with law stuff.

I ended up working with them on that. And then it turned out that they had this need, where what they were looking for was record sealing for a number of their members who had prior felonies or misdemeanors on their records. And that was the thing where, actually, I think this is the point at which I maybe crossed the threshold into movement lawyer, where I was like, "Oh, I could learn how to do that. I can find somebody to do that." Not that it can't be hard, because yeah, it's hard work and it's real, but the fact that this is not my core area of expertise or the work that I studied in law school doesn't mean that the need is not real or not important to folks I'm in community with. If the need is real and the work is important to folks I'm in community with and it's within my capacity, then that's a thing that I should be doing.

That project actually got put on hold because of the pandemic, but we were working with them to figure out everything from where we could get a photocopier when we were in Holyoke, Mass, to notarizing all this paperwork, right? And I think that felt really good, because Whose Corner Is It Anyway is an amazing group of folks doing really, really fantastic work. I got to know them because Blunt and Red, who organized the Hacking//Hustling event on Thursday, and Blunt is probably about to hustle by putting their link in the chat, and I'm here for it, said, "We don't want this to just be sex workers who work primarily online in terms of who has access to the space. We want to hear different sets of concerns, different folks, different views on sex work and surveillance." They knew the Whose Corner folks, and the Whose Corner folks were able to come because they were paid through the efforts of Hacking//Hustling. I think that's super important.

Blunt: Yeah. I was going to say, the work of Hacking//Hustling is also about bridging gaps: between communities and institutions, between who is funded for their labor and who is not, between who is speaking on behalf of themselves and who is being spoken for. I think there's a very big difference between inviting me to have this conversation with you, Kendra, and you telling this as a story, as if I'm not a person who is also involved in this work.

We thought it was incredibly important to also have the perspective of our incarcerated comrades, so my comrade Red called up Alisha Walker, whose GoFundMe I'll also drop in the chat in a second, who is an incarcerated survivor of gender-based violence, currently locked up in Chicago in the middle of a pandemic. But she was able to call in, and I think that was the most moving part of the conference that we put on. For me, it was hearing the process that Red goes through; I feel a lot of people just maybe haven't called folks who are incarcerated. Hearing that process, and not knowing if we would actually be able to get in touch with Alisha to hear what she wanted to say and what she wanted to share with folks, I think that was my favorite part. And I think broadening the conception of what technology is and how it affects and impacts people was also a very important part of that project.

Kendra Albert: Yeah. Do we want to talk a little bit about movement lawyering as hustling? Blunt came up with that before this call, and I love it. I'm going to let her talk about it for a little, and then I can talk about my own relationship to it and reaction to it.

Blunt: Yeah. When you asked me to have a conversation with you about movement lawyering, I was like, "I don't know anything about movement lawyering." I also think that this conversation is interesting because it's bringing to light a lot of work that we were both doing internally. I didn't know some of the fears or hesitations that you're talking about now. Like I said in the beginning, it wasn't my intention to put on this programming at Harvard when I reached out to you. I was literally screaming into a void. You were one of the only people who responded in a way that made me feel comfortable engaging with you.

I think of all of my work as an act of hustling, whether or not it's directly my labor in the sex trades. All of my work is currently funded by my direct labor in the sex trades. And this was something that, I believe it was Yumina, mentioned on the last call: that her work is largely funded through the corporate law that she does. And I'm like, "Oh, I know something about that." I know something about finding alternate ways of funding work that is traditionally unpaid. And so much of sex worker organizing is unpaid.

My work comes out of a space of harm reduction care coordination, which I frame as hustling fucked-up systems that were never meant to work in the first place and making them work for beautiful people that I care deeply about, and trying to bridge that gap of service for sex working people who are trying to access healthcare. When it moved into more of a space of tech, I saw a gap that needed to be bridged with these spaces that have institutional support and power. So I truly think of movement lawyering as: how can I hustle lawyers and people with access to institutional power into having the conversations that I want them to be having, and encourage them to do that?

Kendra Albert: I just want to note that that frame, of how you serve, how you get these institutions that were never meant to take care of folks to take care of folks better, is one that I really love and think is really beautiful. It also speaks to things that I think about in my work, and I think that's a point of commonality between us.

Kendra Albert: Yeah. I often make this joke that Blunt taught me how to hustle, and it's totally true. I know that there are some folks on this call who have benefited from my advice about how to get paid. I don't know where you are in the [crosstalk 00:27:51]-

Blunt: I love teaching people how to hustle.

Kendra Albert: Institutions often make us feel we should be grateful for whatever we get. What I've learned from working with Blunt is just, "No, ask for more." Right? Often our relationship to asking and pushing is that a lot of us who have had access to these spaces are afraid, because maybe our access will get taken away or maybe someone will get annoyed at us. For me, what I've learned in some of our work together is that that fear pales in comparison to the harms that the people I care about and am in community with are experiencing. And so it's my fucking job to be able to be like, "Okay, does this make you a little bit uncomfortable when I ask for this thing? I'm sorry." Because the people who need it, need it.

I don't think I had that frame or that understanding before I started doing work in community, because it's really easy, especially as a lawyer, where you're kind of role-constrained. The whole point of certain forms of lawyering is to put a barrier between you and the client, to separate you from the client in terms of their emotional needs or their material needs. I think Massachusetts maybe just allowed lawyers to occasionally pay for food for their clients. I have friends who are public defenders, and often they can't actually pay for a sandwich for a client who's really hungry, because that's a violation of the ethical rules.

That kind of relationality is such a big part of actually working together rather than sort of standing up and telling the trauma story in order to serve some greater political point. And I also think, for me, the other thing I’ll say about it is it’s really changed the way I think about scholarship.

I don't produce a lot of traditional legal scholarship. It's just not really my bag. And I think part of that is because, obviously, I have opinions, plenty of opinions, on how things should be. But in some ways, on many of the subjects I'm most expert in, I'm most interested in doing work for clients or community work, because what I think should happen doesn't feel that meaningful. Right?

I was talking to a legislative staffer for a Senator about Section 230 reform. They were like, "What do you think?" And I'm like, "I don't know how to answer that question." What I said was the Hacking//Hustling party line from our last press release: "Here are the five things that we're thinking about." Right? Because Blunt taught me how to hustle, I'm not that much of an idiot.

Academics and lawyers have this idea that our personal beliefs are what should inform our work, rather than asking, "What does the community of folks that I work with need out of this moment from me?"

Blunt: Yeah. I'm thinking back to when there was just infinite programming and infinite bills about Section 230. No one was talking about the communities that would be impacted by it. Everyone was talking about platform liability. And I'm like, "People are literally dying because of this legislation. They need to be at the front and the center of these conversations and not added on as an afterthought."

The work needs to center the people who are literally feeling it. Platforms are not people; people are people, and we need to listen to the humans who are impacted by these policies. And I think there's this difference. I'm thinking of it as movement lawyering versus savior fetishism, and how different those power dynamics need to be so that you're not causing harm by telling someone's story as if it's your own, so that you as a lawyer don't own that story and then build a brand around it.

Kendra Albert: Yeah. Part of it is also, I think, something Asana said yesterday about who's the expert, right? Having worked with Hacking//Hustling and Blunt for a while, I'm not an expert on much of the stuff we talk about. Yeah, I maybe know more about Section 230 than some of the other folks in our conversations, but even as a lawyer I have so much to learn about movement work, about organizing, about sex workers, about folks' lives and where they're at and how I can be helpful. I think that's really humbling, but also, as someone who likes to learn new things, it's really just… Even since we've been in the pandemic, there's now a group chat that at times has been very active.

And just feeling close to and in community with folks, in terms of being like, "Okay, this is who I talk to every day." And at this point we often do talk every day. Thinking about it less like, "Oh, this is my movement lawyering work," and more like, "Oh, this is who I talk to, who I work with, who hears me, who watches me drink too much rosé on Zoom and then not finish the book club book," that kind of thing. There's another thing I screwed up, Tim. I didn't finish the book club book. To be fair, nobody else did either.

Anyway. Before we open up for questions, since we're a little more than halfway through, is there anything else you want to add? I know we talked a little bit about your view about sort of institution…

Blunt: Yeah, I think one thing that we talk a lot about, and that I think this conversation helped facilitate, is that a lawyer who is doing movement work doesn't necessarily represent the institution that they work for or the beliefs of that institution. I remember having a conversation with you about how important it felt to have a Harvard affiliation on some of the programming that Hacking//Hustling has done, and how that will literally help us get grants to fund the unpaid labor that we're doing. And you were like, "That happened because I put…" The internal work that a movement lawyer is doing within their institution doesn't necessarily reflect the values or the principles of that institution.

Berkman Klein or Harvard might be happy to have me on this panel, or to have Kendra push to have this really radical conference and give us space and a little bit of money, but I, as an actively sex working woman who's naked on the internet, am not going to get a fellowship from Berkman Klein or from other institutions like that. And I think that that's something that's become really… Frankie is mad. That's something that has become more and more apparent to me: what are the ways that my privilege allows me to move in and out of these academic spaces, and how can I create a bridge for other folks to come with me? And then what are the barriers that I may be blind to, because of my ability to move through those spaces, that actually hinder me from moving forward?

Blunt: What I'm also interested in talking about is that Kendra is awesome and a great accomplice, and is consistently inviting me into those spaces in a way that allows me to have these conversations publicly, but also gives me more options in the work that I do and the choices that I make, and will ideally, hopefully, eventually end in funding. I can't emphasize enough what it looks like to be invited into a space that values my expertise and my experience, and that provides me with opportunities to move further into those spaces without that person.

Kendra Albert: Yeah. I think that's so important, because what you don't want is for the movement lawyer to always end up as the gatekeeper who is like, "Oh, you only get access to these spaces through me." That's a really shitty dynamic. I will have succeeded when Hacking//Hustling throws a conference at an Ivy League institution and I have literally nothing to do with it. And actually, I will have succeeded when we've abolished prisons and sex work is decriminalized and lots of other stuff. But in terms of short-term movement goals.

Kendra Albert: But yeah. I see Asana threw a question in our chat. I'm going to take that first. And then if you have questions for either of us, put them in the Q&A; we would really love to hear them. Or just topics you'd like us to talk more about. That also works.

Kendra Albert: Someone asks if we could discuss the notion of recruiting and creating more movement lawyers and how you’ve been successful/unsuccessful in doing it.

That's a hard question, because I actually don't think we've spent a ton of time on it. Well, I was going to say that, and it's not true. We haven't necessarily explicitly set it as a goal, but I've watched Blunt work on bringing law students into these conversations. One tricky thing I will flag, and then I'll let Blunt react as well, is that there are ways in which my position at a technology law clinic is really ideal for the work that we're doing, because it's often consistent with some of the needs that Hacking//Hustling has in terms of subject matter. But folks who go into a technology clinic are not necessarily always the most motivated by social justice, or don't have the background in sex worker issues.

If I worked at a gender justice clinic like the one at Cornell, it might be easier for me to attract students who are really invested in doing work that serves sex workers and are ready to do movement lawyering. There are many things I love about my current position, but I definitely don't feel that, for the majority of students I work with, their dream is to become a movement lawyer. Maybe I'm just pushing them a little bit further towards doing public interest work, even if I'm not being like, "Movement lawyering, that's the paradigm."

I also think I'm still learning how to do it. There is definitely an "Oh, you can totally teach while you're learning things." But some of the work we do feels really high stakes, and I don't want to harm people. I think that sometimes leads me to be more cautious about including folks I don't know super well in it. I'm so grateful for the trust that folks place in me and the conversations that I get to be a part of, and I wouldn't want to do anything to jeopardize the people who have trusted me in that way.

Blunt: It's so interesting to me, because it's not like I was intentionally seeking out movement lawyers when Kendra and I began our working relationship. It wasn't an intentional process so much as one that evolved. And just being in contact with community, I think, is a helpful way to push people and recruit movement lawyers.

I think that the folks, the law students, who came to the Hacking//Hustling event that we put on at Harvard would not have gotten that type of education and heard the expertise of the communities that they may or may not be working with without that.

The other thing that I think is really important: we're talking about the relationship of a sex working person and someone who's working at a tech law clinic. I'm currently shadowbanned online and at constant risk of being de-platformed. And when I think about Kendra's and my relationship, I think about: what would have happened if I did not have access to Twitter and could not have asked that question? It's something that comes up a lot for me.

I think about fighting against bills like the EARN IT Act and bills like FOSTA-SESTA because… These laws have killed people; people have died because of these laws. And they also destabilize movement work and destabilize our ability to be present online and speak for ourselves. When I think about this, it's all so related to me: fighting for things that allow me to have the same access to online spaces as my non-sex working peers is part of that work. Because if these communities disappear from online, how are you going to get in touch with them? Especially in the middle of a pandemic.

It's just another layer that invisibilizes and decreases the power of a community when people are banned from these spaces. Yeah, I think it's just a matter of inviting community in to educate you. And this is also something that I did as my job at Persist Health Project, which serves sex working people in New York State in accessing healthcare.

One of my jobs was providing best practice trainings for doctors and med school students. Hacking//Hustling also provides best practice trainings for people in the academy who are interested in bettering their sex worker competency. Something that came out of that, when I taught those classes at all the major New York City teaching hospitals, was that the med students were like, "This hour and a half is the most that we've ever talked about sex in our entire three years of med school so far." That was feedback that I found really interesting.

Mason Kortz: Yeah. I was just going to jump in and once again ask anyone who has questions to please feel free to ask them. First of all, thank you. This has been really incredible, and it's really inspiring hearing both of you speak, as it is every time I interact with you. One question I wanted to dig a little deeper into: Blunt, you specifically mentioned bringing in different members of the sex working community who have different needs, who are affected in different ways by the same policy changes. And that's something that, as I started to learn about movement lawyering, was one of the first things I really became aware of.

Mason Kortz: Communities that often look, or are represented as being, monolithic from the outside turn out not to be once you begin interacting with them, and people have different interests; what may be helpful to one person could be harmful to another. I was just wondering if the two of you could speak a little bit more about navigating that, and about making sure that the communities you're working with are not reduced to the people who have the most access or the most voice.

Blunt: Yeah. I'm starting to conceptualize the work of Hacking//Hustling as threefold: tech law and policy that affects how we interact with these online spaces; what happens when we lose access to those online spaces; and making sure that we're providing harm reduction resources for our street-working comrades, figuring out the tech needs of folks who are trading sex on the street, and staying in touch with our incarcerated comrades.

That's sort of how I've been conceptualizing the work that we've been doing. And I'm always interested in learning more and how to do better, and in making sure that I'm not speaking over other people, that Hacking//Hustling isn't providing resources based on what we think people should be learning, but rather on what people actually need. Meeting the needs of community means not assuming them.

Kendra Albert: I think that makes a ton of sense. And the thing I want to add is, Blunt said a lot of nice things about me, which is very kind, but one thing that I don't want to lose track of is how amazing Hacking//Hustling is, and Blunt is specifically, at making sure that we're not just hearing from the sex workers with the most privilege. One of the terms I learned for the first time at the Hacking//Hustling convening last November… Oh my God, an eternity ago, right? Was the term "whorearchy," which is the idea that within sex work and sex worker-adjacent fields, and Blunt, please correct me if I fuck this up, there are inherent hierarchies about how folks interact.

I didn't know that when I started working with Blunt. To Blunt's credit, she did. She had thought about: how do we bring in different folks? How do we make sure that Hacking//Hustling isn't just the folks talking about SESTA-FOSTA online, but is also serving the needs of street-based sex workers?

I also think how you might show up for folks really does vary based on what their needs are. Obviously. I mean, that sounds obvious when I say it, but just to be very clear: if Hacking//Hustling needs a bill analysis, that's something I can do. Some of our street-based comrades might need money. They don't need legal advice; they just need money so they can pay rent, right?

Blunt: Or a letter from their PO so they can come speak at Harvard.

Kendra Albert: Yeah. Being willing to show up in different ways and thinking about… I think if you'd asked me six years ago, I might have been like, "Well, maybe I'm a little uncomfortable giving money to these folks I also work with, because this is going to reduce me to my money." And now I'm like, "Oh, I'm sorry."

Actually, I was being interviewed for something for a nonprofit. They were like, "We want to be one of the primary places you donate." And in all seriousness, to these nonprofit folks, I was like, "Actually, I don't really give to many nonprofits anymore. Most of my giving is to sex worker mutual aid funds." They just kind of looked at me and I was like, "Oh, is that not…" That now just feels like a natural extension of this work, of being in community with folks and showing up for them. And that can mean bill analysis, and it can also mean money. That feels important to me.

Blunt: I think something that's very interesting about the work that Hacking//Hustling has done is that we conduct very casual needs assessments before putting on programming, to check that what we think folks need is accurate, and if it's not, where we need to fill in the gaps. The same went for when we were conducting our research on FOSTA-SESTA, and I also want to point out that the only actual research that exists on the impacts of FOSTA-SESTA has been done by sex workers. I think there are two or three reports, one of which was done by Hacking//Hustling.

When we did that research at Hacking//Hustling, we distributed the survey primarily online, which means it was only accessible to folks who have access to the internet. So we also partnered with Whose Corner Is It Anyway, the really amazing organization in Western Mass that Kendra was talking about earlier, to do the survey with their community at one of their meetings.

I learned a lot from that practice. We hired and paid Naomi to modify the surveys so that they were accessible to the community, using the language that that community uses. We also added on 40 questions that they were just interested in for grant purposes. We kept the core questions the same so that we could analyze them, and then gave them all of the raw data so that they can do whatever the fuck they want with it. That data is theirs to use, hopefully to help them get more money and be used for grants.

Something so interesting that came out of that is that folks who are working on the street had no fucking idea what FOSTA-SESTA was. What they did say is that they noticed, on their small stroll in Western Mass, 10 to 15 more street-based workers hanging around, and they had heard about FOSTA-SESTA from them. I think that helped contextualize the research: FOSTA-SESTA pushes people into unsafe working conditions, but the people who did not have access to those safer working conditions in the first place didn't know what it was.

What are good resources for sex work 101?

Mason Kortz: We have three questions in the queue. I'm going to read them out loud for people who are watching this after the fact. First one, from Rom, is a two-part question: how can technologists amplify the cause, and what is your favorite sex work 101 resource for non-sex workers?

Blunt: Sure. I think Melissa Gira Grant's Playing the Whore is an excellent book, and, along with Revolting Prostitutes, it's one of the two books that I would recommend diving into first. I would also suggest following sex workers and sex worker organizations online to see what they're talking about, and being in internet-space community with folks. Because I think often our social media followings are so siloed that we don't see how the communities who are directly impacted are responding. And the fact that platforms are literally erasing people is also part of the problem. So be intentional about making sure that you're seeing community responses too.

Kendra Albert: Can I just add something to that, Blunt? Is that okay? Rom, you asked how technologists can amplify this cause, and there are maybe two things I want to flag. One of the lessons I've learned as a lawyer in this space is actually letting go of my identity as a lawyer and asking, "What are the other capabilities that I have?" I have a Twitter account with followers who are probably mostly in tech policy. I have access to institutional spaces, right? Literally physical spaces at Harvard in non-pandemic times, and Zoom spaces at Harvard in pandemic times. Or access to other kinds of resources.

One question I would ask first is: how do I let go of my professional identity in doing this work and just show up as a person who wants to help? And then, as you get to understand people's experience better through doing the one-on-one work and through showing up as a person, you can think about: okay, how can I show up as a technologist? What's going on there? That feels like part of the answer to me as well. Blunt, you want to add anything?

Blunt: No, I'm just echoing that. So many sex workers are shadowbanned. Melissa Gira Grant was trying to @ me in something the other day and couldn't find me, because I'm name-suggestion banned. She literally couldn't find my account to tag in a post. And with the Berkman Klein posts, I'm not tagged in any of them; I'm not sure if that's just because people literally couldn't find my name to add when they typed it.

I think thinking about how sex workers do not have access to the same tools that other people take for granted, both academic power and the ability to be seen on social media, is part of that. Someone's asking a question which I feel is somewhat related: can you address how platforms', especially payment processors', fear of processing money related to sex work affects these direct giving efforts?

I'm going to drop a link right here to a harm reduction guide we put together on account shutdowns, which I think is interesting and helpful for understanding the way these often networked shutdowns of sex workers' financial accounts happen. And right now Hacking//Hustling is conducting research on content moderation in response to the police violence against Black folks, as well as on the intersection between sex workers and activists, because a lot of activism is actually funded by folks' direct labor in the sex trades, activism oftentimes being unpaid labor.

One of the common themes that we're finding, and one thing that we're focusing on, is how sex workers losing access to financial technologies disrupts movement work, and our ability to provide mutual aid to each other, especially in a pandemic where we're not as able to just hand people cash, which is how we paid people at the first Hacking//Hustling event. It was just handing people cash before they spoke, which felt very important, because sex workers always get paid before rendering a service, ideally.

None of us have access to the same financial technologies. It's really difficult to move money in a community when… I've lost access to three or four different financial services. And when we were paying folks for the Hacking//Hustling event, it was coming out of my personal account and then being reimbursed. I had to use four different payment processors in order to be able to provide everyone with stipends. Yeah. I think people also overlook how impactful it can be to provide someone who needs money with that money. That is a hugely radical act that is very effective.

Kendra Albert: Yeah. And the other thing I'll say there is that my experience of some of this from the Harvard side is that institutions often have no ability to meaningfully assess risks to the folks they're paying. The wallet name that you need to get paid under may be very different from the name you organize under or appear under as a sex worker. And so if you appear at an institution and they need to pay you an honorarium, but it has to go to the wallet name rather than the name you work under, even though the name you work under is the name you spoke under, that's a point of connection between two identities that you may not want connected as a sex worker.

Sometimes, as someone who works at the institution, part of my job is trying to navigate that, right? Being like, "Okay…" Blunt, can I talk about the Case Western thing? Okay. We were on a panel together at Case Western, and they were like, "Oh, we want to pay you." Which was great. But then it was, "Oh, we need you to fill out this W-9 or whatever," with all this personal information that Blunt didn't necessarily want to give. So what we did was, "Just pay Kendra, and Kendra will pass along the money." Right?

And that works because we have a pretty close relationship, where that was something we both trusted was going to work fine. Some institutions would definitely not do that, right?

To your point, and to Brianna's question about the way account shutdowns and financial shutdowns affect folks: I think institutions often take for granted access to financial infrastructure, whether it's certain kinds of bank accounts under whichever name, or Venmo or PayPal, or even stuff like paying folks two months after an event, or reimbursing people. That's something institutions take for granted, and people don't have a lot of space to be like, "This is not cool and this doesn't work for me." Yeah.

Blunt: Yeah. No. Truly. Something I think everyone in academia could learn is to pay people before they render a service. It's just mind-boggling to me that this is not common practice. And asking a marginalized community member for legally revealing information without understanding how that could expose them to potential harm is something that definitely needs to be considered, as well as how quickly they are getting those funds.

Blunt: I will often pay people either before or right after something ends with a direct transfer that lands directly in their accounts so they have access to that money the same day or the next day when it is processed. When I have had to postpone events, I have offered to pay people upfront when the event was supposed to occur in case people were relying on that payment to get through the month.

Kendra Albert: Yeah. I know we’re at 2:00 PM and I see there’s a couple of questions left, but I guess we want to sort of wrap up. Blunt, are there any other thoughts you want to share or things you want to say?

Blunt: I mean, I'm just really excited for the opportunity to have this conversation and to reflect on our relationship, because I feel we were both doing a lot of invisible work, or work that I wasn't super aware I was doing, it being radical just to ask for access to these spaces. I was just like, "I'm just trying to bridge these gaps." It's really interesting to deconstruct the power dynamics in our relationship, as well as what we've both learned from working with each other. Thank you for taking the time to invite me here to have this conversation.

Kendra Albert: The best way to ever have conversations is by force in front of a whole bunch of Zoom attendees. No. The feeling is so mutual. I hope it's clear to everyone attending, and I actually say this all the time, how much I've learned from working with Blunt and how grateful I am to be in community with her and the many other folks she works with.

Kendra Albert: I think movement lawyering can feel kind of abstract. The takeaway I would offer is that it felt so natural. Obviously, I'm getting to learn a ton; obviously, if I'm working with sex workers and their lives and livelihoods and community are on the line, they're calling the shots; obviously, I need to let go of some of my own ego around this stuff and get over it.

I think that in some ways, as Blunt said, movement lawyering is a framework I came to apply after the fact to the things that I was already doing, because they were what felt right at the time and felt responsive to the needs of our relationship and the folks we work with. And so as y'all go out into doing your work, whether it's movement lawyering or other kinds of movement work, I wish you some of the same ease, I guess, of finding folks who you click with and who you can grow together with. Mason, I know we need to mention our sponsors, so I'll stop there.

Mason Kortz: Thank you. And again, let me just echo: thank you both for giving us so much to think about and to bring back to our work. I also want to say thank you to Asana and Yumina for their amazing presentation yesterday. I think between the two days we have a lot that we can take back and reflect on, and hopefully really improve the way we all do our own work.

Apologies to those whose questions we didn't get to. And thank you to all the attendees for coming. And finally, thank you to our sponsors: the Berkman Klein Center for Internet and Society for providing the infrastructure to have this talk, and the Office of Clinical and Pro Bono Programs for helping us make sure that our panelists get paid.

Legal Literacy Training

Yves, Lorelei Lee, Kendra Albert, and Korica Simon will present on the First Amendment, Section 230, the Patriot Act, the ways in which fear creates a push for state surveillance, and the impact that this has on our community.

DANIELLE BLUNT: Hi, everyone. I am Danielle Blunt. I use she/her pronouns. I'm with Hacking//Hustling, a collective of sex workers and allies working at the intersection of technology and social justice to interrupt state surveillance and violence facilitated by technology. Do we want to go around and do introductions first?

KENDRA ALBERT: I can go. Hi, everybody. My name is Kendra Albert. I'm an attorney, and obliged to say that none of this is legal advice. I work at the Harvard Cyberlaw Clinic as a clinical instructor, and I also do some work with Hacking//Hustling. My pronouns are they/them. I'm super excited to be here with y'all.

YVES: Hi, I'm Yves. I use she and they pronouns. I'm an organizer with Survived & Punished New York, which is an abolitionist group, and with Red Canary Song, which organizes around supporting migrant sex workers.

KORICA SIMON: Hi. I'm Korica Simon. My pronouns are she/her. I'm a third-year law student at Cornell. I got involved with sex worker legal rights when I was in Seattle, where I worked for a nonprofit called Legal Voice, and also through a clinic at Cornell called the Gender Justice Clinic. So I'm excited to be here today and talk with you all more about the subject.

LORELEI LEE: Hi, everyone. My name’s Lorelei. I’m an activist and I work with Red Canary Song, as well as Hacking//Hustling, and have worked with these folks on issues of surveillance and tech‑related violence that’s impacting people in the sex trades for the last few years. So I’m really excited to be in conversation as well!

DANIELLE BLUNT: Awesome. And Yves, do you want to kick it off with a little community report back? That’d be a great place to start.

How does surveillance impact sex workers?

YVES: Hi. So, yeah! I'm just going to talk a little bit about how we got here, and then a little bit about how it's affected mainly the sex working community, but also people in general.

So what we're mainly talking about, or what I'll mainly be talking about, is contact tracing and public health uses of surveillance, and then SESTA/FOSTA and the EARN IT bills.

So these have all increased policing, and I don't just mean the police department; I also mean citizen surveillance, and the deputizing not just of people, but of non-police agencies and companies. Right? This includes a lot of different actors, and it really increases the scope of policing and criminalization in a way it wasn't previously. And the way this has happened is structural violence that has led to so much harm to already marginalized and stigmatized communities.

So part of the reason why this is happening the way it is is that a lot of the conversation around surveillance belies a powerful moralism that resists evidence and logic. We see this happening post-9/11, where you get this idea of "if you see something, say something," which creates these hypervigilant crusaders for both anti-terrorism and anti-trafficking, an issue that is often tied up with sex work.

So when we see this happening ‑‑ the surveillance has increased in such a way that we’ve seen before. Like, this predates all of these sorts of things. Right? They’ve used contact tracing to criminalize different marginalized communities and sex working communities for a really long time. We see HIV/AIDS as an example, right? They’ve also criminalized the spread of HIV, period, right? But when you go into a clinic, if you test positive, or if you think that you have any STI, right, but especially HIV/AIDS, they’re going to ask: who have you had sex with? Like, name those people. In what timeframe did you have sex with those people? We want to contact those people; how do we get in contact with them? Right? Whether that’s from someone directly, or from you yourself when you go in to get tested. They’re getting that information, and the truth is that information doesn’t just stay at the health clinic, doesn’t just stay where you think it stays. Right? That information gets passed on to the police, to these different agencies like CPS, right? And that leads to the criminalization of a lot of people, not just sex working people. Right? Also people who are profiled as sex workers. They use that information to be like, oh, you’re selling sex. So then the cops are going to come knocking on your door. And this similarly happens with COVID, right?

How is contact tracing used to surveil marginalized communities?

So what we’re seeing happen with COVID is they’re using contact tracing in a very similar way. And also, if you’ve seen the public rhetoric around this, right ‑‑ the way public health officials and government officials are talking about it ‑‑ they say that sex workers are high‑risk people to have COVID, right? So if you are working in a massage parlor or on the street and you come in to get tested for COVID, they’re going to be like, oh, how did you get contact ‑‑ how did you get this? Who were you in contact with? Or if someone you know has got COVID and goes to get tested, they ask, who were you in contact with? And if you tell them, oh, I went to this massage parlor, or I met them on the street, that’s a way of criminalization. The truth is they are not just out to treat you, right? They are going to turn over that information to the police. This happens to spread the scope of policing in so many different ways, not just in the sex working community. Right? In a lot of different communities, they use these exact same surveillance techniques. We should think of surveillance as a strategy that exists within the larger frame of policing.
Also contact tracing, like, these different methods of policing are used to marginalize other communities, who are not sex workers but are targeted as if they were anyway? We see this targeting protesters, recently. We see them surveilling a lot of communities in this way. How did you get this? Oh, you were at a protest last week? Who was there? This all happens in the same scope, and in all of these different agencies that have then been deputized. Right.

What happens when you have these situations ‑‑ with SESTA and FOSTA and EARN IT, we wouldn’t directly think of them as surveillance? Oh, they’re censoring and taking down these sites; that’s not directly surveillance. But that’s not exactly true. A lot of these bills are also, like, collecting information, right? Because they’re putting these sorts of laws in place to go against sex trafficking? So they tell you, these are the indicators of sex trafficking; look for these indicators. But really, those indicators aren’t indicators of sex trafficking most of the time, and they also sweep in sex workers and other people, and people who are profiled as sex workers. So it’s also a lot of data collection that is happening in order for them to censor, shadowban, and do all of these things anyway. And that data is also being turned over to the police to be used.

But also, we generally see the kind of attack that happens to sex work coming in from all ends. Where like, so SESTA and FOSTA, we saw this. Right? This is just what happened already. People are pushed to work in more dangerous ways because you can’t go on Backpage, you can’t go on Craigslist ‑‑

DANIELLE BLUNT: Can I stop you for one sec? That is amazing. Thank you so much for that. I just want to take a moment to ask for a definition of contact tracing and, from anyone who wants to jump in, a one‑sentence summary of FOSTA‑SESTA and EARN IT, just so folks are on the same page from the start.

What is contact tracing?

YVES: Well, I can, like, expand a little bit on contact tracing, right? So when an epidemic occurs ‑‑ I brought up HIV/AIDS ‑‑ it’s how they kind of figure out who might have it so that they can “treat them,” right. In the best case scenario, this information would not be used to police people. Right? But that’s not what happens. But, like, contact tracing is when they try to figure out who has had it, or who gave it to you, or who you could have possibly given it to, right, in order to stop the spread, so you can get those people into treatment centers or in treatment.

So if I, for example, was to go into a clinic, and I was like, I have chlamydia. Right. And they’re like, okay, so you have chlamydia. Who did you have sex with before this? Who did you have sex with after you thought you might have shown symptoms? And then you like list them off, like, okay, I saw Blunt! I saw Kendra! I saw Lorelei! I saw all of these people! And then they’re like, oh, did that person tell you that they have chlamydia? Did that person tell you that they have HIV/AIDS, da da da da da. We see the criminalization of HIV/AIDS like similarly with the criminalization that’s happened with COVID, where they’re criminalizing directly the spread of COVID and HIV, but also generally, right? They use this information to be like, oh, who did you get this from? And ideally, they wouldn’t police people, but what ends up happening is they ask you all of this information, and then they kind of pick out those indicators to be like, you’re a sex worker. Like, you’re selling sex. Right? So we’re gonna like show up, and we’re going to arrest you. Right.
I hope that that makes sense.
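[An illustrative aside, not part of the panel: mechanically, contact tracing is a graph traversal. The minimal sketch below ‑‑ with entirely hypothetical names and contact data ‑‑ shows how one person naming their contacts lets whoever holds the data sweep in everyone within a couple of hops.]

```python
from collections import deque

# Hypothetical contact graph: each person maps to the contacts they named.
contacts = {
    "patient_zero": ["A", "B"],
    "A": ["C"],
    "B": ["C", "D"],
    "C": [],
    "D": [],
}

def trace(start, max_hops=2):
    """Return everyone reachable from `start` within `max_hops` namings."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, hops = queue.popleft()
        if hops == max_hops:
            continue  # don't expand past the hop limit
        for contact in contacts.get(person, []):
            if contact not in seen:
                seen.add(contact)
                queue.append((contact, hops + 1))
    return seen - {start}

print(trace("patient_zero"))  # {'A', 'B', 'C', 'D'}
```

[The traversal works the same no matter who holds the graph ‑‑ which is the panel’s point: once the data leaves the clinic, police can walk it just as easily as epidemiologists can.]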

DANIELLE BLUNT: Yeah. And I also think, too, with the protests that are going on, I think that there’s a lot of contact tracing that’s being done with, like, Stingrays and cell phone tracking, which I hope some other folks can talk about in a little bit. And I would just love, like, a one or two‑sentence summary of FOSTA‑SESTA and EARN IT before we go into them in more detail.

KENDRA ALBERT: Um. I will do my best. Lorelei is watching me with an amused look on their face.

What is FOSTA-SESTA?

So, FOSTA and SESTA are laws that were passed in 2018 that greatly increased the incentive ‑‑ F‑O‑S‑T‑A and S‑E‑S‑T‑A. Thank you, Blunt. They greatly increased the incentive for online service providers, folks like Facebook or Twitter or Craigslist, to remove any content related to sex work, or possibly attributable in any way to sex trafficking, or related to sex trafficking. And we can talk a little bit more about how specifically they did that, but that’s, like, the one‑line top‑level summary.
EARN IT is a pending bill in front of Congress right now that is meant to do something similar ‑‑ basically, it engages in some similar legal moves around child sexual abuse material, incentivizing companies and online platforms to be more potentially invasive in their searches for child sexual abuse material by creating more liability if they’re found hosting it.
Lorelei, how was that?

LORELEI LEE: Great. I think that was great. It’s very ‑‑ I mean, FOSTA‑SESTA has a lot of parts, and so it’s… But I think what you are talking about is the most important part, which is the impact of it and what it incentivizes.
And the one thing I would add about EARN IT is I think what EARN IT will do that is similar to FOSTA‑SESTA is that it will incentivize companies to remove all information related to sexual health and anything that teaches youth about sexuality.

DANIELLE BLUNT: Thank you.

KENDRA ALBERT: Very upbeat.

DANIELLE BLUNT: And ‑‑ yeah. (Laughs) We’ll be getting into those a little bit more. I have one more question for Yves: Is turning health info over to the police doable via legal loopholes, like HIPAA, or is that happening in the shadows?

KENDRA ALBERT: I can also take that one, if you prefer. So HIPAA ‑‑ HIPAA, which is the U.S. health care privacy law, federal health care privacy law, explicitly has a carve‑out for so‑called “covered entities,” health care providers, turning over information to law enforcement. So it specifically says, you don’t need to get consent from people to turn their information over to law enforcement. So HIPAA doesn’t prevent that.

You know, another thing ‑‑ and actually I think this ties in really nicely to some of the stuff we want to talk about in a little bit, like the Patriot Act, which expanded the U.S. government’s surveillance powers and was passed in 2001, right after 9/11 ‑‑ is that a lot of times, even if there isn’t an explicit process for a law enforcement agent or even a public health entity to request information through appropriate legal process, there are often legal regimes that encourage what’s called “information sharing.” Which just basically means that they try to eliminate, like, privacy or other reasons that information might be siloed between different parts of the government, or different governments, like federal, state, local. So even if, you know, you don’t have law enforcement knocking on the health provider’s door with a request for information, like a subpoena or whatever, there are often these efforts to kind of standardize and collect and centralize these forms of information.
Yves, do you have anything ‑‑ do you want to add to that? Did that feel like an adequate summary?

YVES: I feel like that was very clear. Yeah.

What is the Patriot Act?

KENDRA ALBERT: So I think ‑‑ Blunt, do you mind if I keep going to talk about Patriot Act stuff for a second? So I think that, you know, one of the things worth noting is you see a couple of general trends in surveillance. And I’ll also let Korica talk more about some particular ways this plays out in particular communities. But just on a super high level. We can see this sort of movement from the Patriot Act to now, where, A, we see more requirements around information sharing. One big critique of the U.S. government’s… I hesitate to say intelligence‑gathering apparatus with a straight face, but! And what I mean by that is the CIA, the NSA, the sort of intelligence agencies, as opposed to more traditional law enforcement agencies like the FBI or police. The critique was that they were gathering all this data, but they weren’t sharing it in ways that were actionable across multiple agencies. So when the Patriot Act was passed after 9/11, one of the goals was to make it easier for agencies to share information. I think that’s a general trend that’s happened, post‑Patriot Act to the current moment, where we see things like fusion centers and other ways to collect surveillance data ‑‑ and Palantir’s databases, and ICE’s data collection… Data that’s collected across multiple methods of surveillance and put together to gain more information about the lives of individual people.

Obviously, this has dramatic effects on sex working populations ‑‑ A, because they’re often specifically criminalized and over‑surveilled? But also, you know… Often, information that is innocuous, sort of not raising a red flag on its own, when combined with other information can suggest more specifically what kind of activities folks are engaged in or who they’re spending time with.

The other thing I want to highlight about the Patriot Act is surveillance after ‑‑ well. Two more things. I’m trying to be brief, but the lawyer thing is everything comes in threes, so I have to have three things I want to highlight about the Patriot Act. Okay! So number two is that you see these particular surveillance tools originally deployed for what’s considered very, very important law enforcement activity. So originally, a lot of this stuff we talked about in the context of antiterrorism work, and that’s what the Patriot Act was about. But over time, these law enforcement tools sort of “trickle down,” for lack of a better term, into more day‑to‑day enforcement activities. So we’ve seen this actually a lot with something called the sneak‑and‑peek warrant. Which is actually a term that, at least Wikipedia tells me, the FBI coined, not anti‑surveillance activists? It’s kind of funny that the FBI thinks that’s a good description of what this thing is. But basically, traditionally, if someone gets a warrant for searching your property, it’s basically a document where you go before a judge and you say, here’s why we want to search, and the judge says, okay. You said where, you said why you’re allowed to. I’m going to sign off on this, and you police can go search that person’s house, for example.

So traditionally, you know, if you’re at the house, and police show up, you can ask them to see the warrant. And say, hey, I want to confirm that this is the warrant that allows you to search my house. What a sneak‑and‑peek warrant does ‑‑ this kind of sounds like bullshit, but this is what really happens. Right? It allows the police to set up a ruse? Like, oh, to get you out of the house, go into the house, and sort of search your stuff. And actually, one of the contexts in which we’ve seen this really recently, and one of the reasons I connect this back to the Patriot Act, is the surveillance of massage parlors in the Robert Kraft case in Florida. What the police actually did is get a sneak‑and‑peek warrant, claim there was a bomb threat or a suspicious package at a nearby building. All of the folks who worked in the massage parlors were sort of escorted ‑‑ had to be away from the building for their own safety, and the police went in and put cameras in.

And we know this because they tried to criminally prosecute Robert Kraft, and Robert Kraft had enough money to hire a legal team that was able to challenge the validity of the sneak‑and‑peek warrant that they used to surveil the massage parlors and the others working there.

So, when those warrants were included in the Patriot Act, there was nothing in there about human trafficking investigations, let alone the stuff that happened in Florida ‑‑ which, actually, there’s no human trafficking prosecution coming out of it. Right. From what I read, and others may know better than I do, so I’ll cede whatever claim I have to the truth there, but. It doesn’t look like human trafficking was involved.

So sneak‑and‑peek warrants weren’t written into the law for those sort of investigations, let alone surveillance of prostitution, but these information technologies ‑‑ and here, I include technologies in the computer sense but also in the ways governments do surveillance. Once they get written into the law, their use often gets broadly expanded to new populations, new circumstances. It’s sort of like, you might as well use it.

I had a third thing about the Patriot Act. But… I guess the third thing I’ll say quickly, before I stop, is that I think one thing you’ll see a lot of in discussions about surveillance reform, and especially how it fits into conversations like we might have here about sex work, is sort of an inherent trust that, like, procedures are going to save people. (chuckles) Which, I’m really skeptical of, just sort of personally? But you know, if you look at what happened, you know, between the Patriot Act and now… So, there’s a thing called Section 215 of the Patriot Act, which basically, functionally, allowed the National Security ‑‑ the NSA to search people’s call logs to see who was calling who. And there was a legitimately robust debate about this. But one of the reform methods that was actually put on the table and passed as part of the… USA Freedom Act, in, I want to say, 2015? Don’t quote me on the date. Was actually ‑‑ they were like, okay, great. The U.S. government can’t hold this giant database of call data anymore. They can’t see who you called and when. But they can go to the phone company and ask them.

And like, yes, that is better. I’m not gonna ‑‑ I would rather they have to go and ask Verizon nicely before they get the call data. But functionally, I’m like, that’s not ‑‑ that’s not safety. Right? And I think that, you know, when we see a lot of the surveillance ‑‑ some types of surveillance reform activity, especially post‑Patriot Act, we’re not even getting close to back where we were pre‑Patriot Act. We’re sort of, like, trying to kind of tinker around the margins, slash maybe add a teeny bit more process. That’s not going to help the folks who are most criminalized and most surveilled.

Anyway. That was a lot from me, so I’m going to stop. Blunt, do you have another question you want to tee up, or folks want to react to any of that?

LORELEI LEE: I have a question, actually. What is a fusion center?

What is a fusion center?

KENDRA ALBERT: Um. Well, so I’m ‑‑ I’m gonna do my best. It’s been a little while. But the… Ha. Despite the weird, kind of futuristic name, it’s basically where all the cops get together. (Laughing) Yeah. Different types of law enforcement often, like, have different beats. So one of the goals of fusion centers is to combine information and share policing and surveillance information from different law enforcement agencies. The one I’m actually most familiar with is the one outside of San Francisco. And there’s been a lot of ‑‑ I don’t want to erase it ‑‑ there’s been a lot of really amazing research and activism against fusion centers, often and primarily by communities of color. But usually, they’re like… It’s where the San Francisco PD and the Marin County PD, which is the county north of San Francisco, and BART ‑‑ which is one of the public transit organizations ‑‑ where the cops would all share information and sort of share tips. And they were a result of the attempt after 9/11 to deal with what people saw as this problem of all this information being siloed.

DANIELLE BLUNT: Awesome. Thank you so much, Kendra.
Korica, I would love to hear from you if you feel like now’s a good time to chime in.

The history of surveillance in marginalized communities

KORICA SIMON: Yeah. So I can speak a little bit on the history of surveillance. So, as Kendra stated earlier, marginalized communities have historically been affected, have been victims, of government surveillance. And surveillance does have roots in slavery. So in 1713, New York City passed lantern laws. These laws were used to regulate Black and Indigenous enslaved people at night. So if you were over the age of 14, you could not appear in the streets without some kind of lighted candle, so that the police could identify you.
And we’ve seen, like, this same thing recently with the NYPD, where they are shining surveillance lights ‑‑ like floodlights ‑‑ in Black communities. And we saw that increase after they received a lot of criticism over stop and frisk. And people in those neighborhoods were reporting that they could not sleep at night. Like, the lights were just blinding them.
And Simone Browne has written a lot about this subject, and about the ways in which light has been used to surveil people. And I believe they also write about technology as well, and how we’ve moved to that side of things.

What is COINTELPRO?

So in regards to technological surveillance, one of the most well‑known abuses of surveillance by the government is COINTELPRO. I think it stands for Counterintelligence Program. It was basically a series of illegal counterintelligence projects conducted by the FBI aimed at surveilling, discrediting, and disrupting political organizations. So the FBI in particular targeted feminist organizations, communist organizations, and civil rights organizations. And the government’s goal was basically to do whatever they could to just, like, disband them and get rid of them by any means necessary. And they mostly did this through, like, wiretaps, listening in on people’s phone calls, tracking them down, as well as having informants involved.

And as a result of this, quite a few people were murdered or put into prison. Some Black members of the Black Panther Party are still in prison. And… Two of the most talked about victims of this are Martin Luther King, Jr., as well as Fred Hampton, who was drugged by an FBI informant and then murdered by Chicago police. But Angela Davis has been a victim of this as well. And again, we know that these practices are still continuing today. So we kind of got into the protesters and how they’re being surveilled. And I think it came out in 2018, 2019, that Black Lives Matter activists were being watched ‑‑ their activity was being watched on the internet. And now we have seen recent reports that protesters today are being watched as well, either through body cameras, cell site simulators, license plate readers, social media, drones, as well as just cameras in the area that may use facial recognition technology that could help the police identify who a protester is and get access to their social media accounts.

So these are all, like, issues that are happening today. As technology increases, we’ve only seen it get worse. And we know that marginalized communities are the most affected by this. If they use this on Black, Native, Latinx, and immigrant communities, they’re also going to use this on others as well ‑‑ sex workers, and sex workers mostly fall into marginalized communities. So.

I don’t know if I should talk about the third part right now, or if I should wait? ‘Cause it’s a little bit different, but… Okay. I’ll just go ahead. (Laughs)

What is the Third Party Doctrine?

So, kind of transitioning a little bit. The Third Party Doctrine is a doctrine that comes out of two Supreme Court cases, United States v. Miller and Smith v. Maryland. And what they state is that if you voluntarily give your information to third parties, then you have no reasonable expectation of privacy. So third parties include your phone company, Verizon, Sprint; e‑mail servers, if you use Gmail; internet service providers; as well as banks. And so that means that the government can obtain your information from these companies without having a warrant. So they don’t have to have, like, probable cause that you’re doing something in order to get access to this information.

And the Supreme Court’s logic behind this decision was that, well, if you tell someone something, then you’re giving up your privacy, and like you can’t expect that that will stay private forever. What ‑‑ I should also back up and say that these cases were decided in the 70s? So. Not today, where like our whole life is on the internet, and we are constantly giving third parties our information. And actually Justice Sotomayor, she has suggested that she would like the Court to rethink the third party doctrine, because it’s just a completely different time today. A lot of us use our GPS, and we wouldn’t think that ‑‑ I don’t know. That they could just share all of our information without us knowing.

And I will say that if you’re ever curious about, like, how often the government is requesting access to this information, some companies ‑‑ Google, I think Facebook, and Sprint ‑‑ do report this. I know Google reports it under transparency reports. And you can see how often the government has asked them ‑‑

KORICA SIMON: Oh. Well, hopefully, they’re still doing it, and you can see. I think it’s roughly a hundred thousand people a year. But we don’t know, like, what the result of that is. It’s honestly probably a lot of people who aren’t doing anything at all.

And so we’ve also seen, like, some people starting to move their e‑mail accounts from using Gmail to e‑mail servers that care a little bit more about privacy and that are more willing to fight these requests from the government.

And then I’ll also say the last thing is that the government can also request that these companies, like, not tell you at all that they’ve requested this information. So… This could be done completely in secret, as well. So.

Privacy from law enforcement and the Fourth Amendment

KENDRA ALBERT: So. I think ‑‑ I want to just sort of flag some stuff that Korica said and highlight certain parts of it, and I want to contextualize a little bit of this. We’re sort of talking here about privacy from law enforcement, and the primary source of privacy from law enforcement in the U.S. is the Fourth Amendment, which is so obvious Korica didn’t say it, but I’m going to say it just in case it isn’t obvious for other folks. And, you know… One thing worth noting about the Fourth Amendment, for folks who are sort of concerned about the relationship between all of these legal doctrines and their actual lives? Often, I want to contextualize for folks that having Fourth Amendment protection, or, like, saying, oh, the U.S. government violated the Fourth Amendment, only gets you so far. Because if what you want is the government not to have access to that information, the horse has already left the barn, to use the right metaphor. Which is to say that most of the remedies that come from, you know, unconstitutional searches and seizures, or unconstitutional requests for information, are just about that information not being able to be used against you in court. Which is of very limited value if what you’re concerned about is, like, the safety of yourself or your community, or not getting folks arrested, or if you don’t have access to the kinds of representation and resources that would allow you to go through a legal battle and you’re going to plea out the second that you get arrested.

So, you know, I always want to caution any story I tell, or any story we tell, about the importance of constitutional rights in this area with a little sort of real politic about what does it mean, or real talk about what does it mean, to have access to these kinds of rights.

The other thing I want to flag is ‑‑ what Korica said is 100% correct, as a matter of doctrinal law. There is a weird thing that has happened with regards to government access to data, which is that a lot of the larger online service providers ‑‑ and in this I include Google and Twitter, Facebook ‑‑ actively will not provide certain types of information absent, like, appropriate legal process. So it’s actually legally contestable whether the government can get access to your e‑mail that’s stored on a server without a warrant. It has to do ‑‑ in certain contexts ‑‑ with how old the e‑mail is, and whether it looks like it’s been abandoned on the server? The statute that covers that, the Electronic Communications Privacy Act, was passed in 1986, and boy does it read like that! Like, good luck!

Does the government need a warrant to access your e-mail?

But point being, Gmail will require a warrant to get access to your e‑mail content. That’s great, except that, you know… If your e‑mail content gets turned over and you then want to challenge it, you’re still in the same place you were previously, which is that the government has access to the e‑mail, and that could mean serious consequences for you independent of whether it’s later admissible in court.

So part of what we’re talking about with legal literacy in this space ‑‑ what I want to encourage folks to think about is, okay, how do I keep me and my community safe independent of the legal remedies? Because oftentimes, those legal remedies aren’t accessible to everybody. Just realistically and very obviously. And/or they will sort of help you after the fact? Maybe it means you recover money. Or maybe it means the evidence isn’t used against you in court. That doesn’t help very much if what you’re concerned about is the safety of you and your people.

So making sure that, like, we don’t pretend these remedies will make people whole for the harms they experience from surveillance or from the government. But rather that, you know, some of these protective measures that folks can take are about sort of trying to prevent the types of harm that the surveillance might cause in the first place.

One more quick note is that a relatively recent Supreme Court decision has suggested that the government does need a warrant to access your cell phone location data. So that was one little bit of good in a sea of terribleness that is the third party doctrine.

LORELEI LEE: I think ‑‑ so I think I’ll respond to a little bit of that to say that I… you know, I’m thinking about the connections that are between what Yves has been talking about, and then what you folks are talking about, in terms of… the way that information gets used against you that isn’t really cognizable in the law, but once they have your information and have you on their radar, they use that information to get more information, to follow you, to trace your contacts, and happening in multiple different contexts. And… Something that I think is really important that people don’t think about all the time is that everyone breaks the law. And… (Laughs) And it is people who are criminalized. It is ‑‑ so, we think of criminalization as being about behavior. But it is really about people in communities.

Because everyone breaks the law, and the only folks who are targeted by police ‑‑ and I mean state police, federal police, et cetera ‑‑ like, those are the folks who get punished for breaking the law. And that punishment can be… you know, because they have followed you for a certain period of time in order to collect enough information, in order to make something cognizable in the law.

So I’m thinking about how one of the themes of what we’ve been talking about is the deputizing of folks who are not thought of as law enforcement agencies, but whose collection of your information becomes a way of enforcing behavioral norms. And that happening in a way that is ‑‑ goes beyond what criminal law can even do. And so… Thinking about sort of the modern history of how this has happened in law that ‑‑ in some of the laws that we’ve been talking about, some of the federal laws that we’ve been talking about. So, the Patriot Act, one thing to think about in terms of the Patriot Act is how prior to 9/11, Congress had been considering passing some form of regulation for internet companies and data ‑‑ regarding data collection. And Kendra, please jump in if I am messing this up. Or anyone, obviously. (Laughs) But… When 9/11 happened, that regulation sort of was pushed to the side. And it is during this time period that we have this sort of ‑‑ we have the increase in government surveillance, but we also have this sort of recognition by private corporations that data collection is something that can be monetized. And they are unregulated in doing this, and there is this idea that if you have given over your information voluntarily, it belongs to those people, regardless of whether you did it knowingly or not.
So we have this sort of rise of data collection ‑‑ the creation of surveillance tools by corporations ‑‑ and there’s a monetary incentive to keep creating stronger and stronger data collection tools that can be more and more invasive and do this kind of contact tracing, the thing that Yves has been talking about, that you folks have been talking about, where each piece of information looks innocuous, but when you put it together you have a map of who someone is. And that’s especially concerning for sex working people, because they identify sex working people based on this collection of seemingly innocuous information. And you have the incentive to create tools that are more and more effective at collecting that information. And then you have the partnership between government and private companies that then allows that information to be used in order to enforce norms that are… expected by the state, that are thought of as beneficial to the state. And, obviously, targeting people in the sex trades at a high rate. And especially people in the sex trades who are part of other marginalized communities.
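[An illustrative aside, not part of the panel: the “seemingly innocuous” problem fits in a few lines of code. Joining two unrelated data streams on a shared identifier ‑‑ every record and field below is invented ‑‑ yields a profile that neither stream reveals on its own.]

```python
# Hypothetical, invented records: a location data broker's pings and a
# payment processor's logs, each harmless-looking in isolation.
location_pings = [
    {"device_id": "X1", "place": "health clinic", "day": "Mon"},
    {"device_id": "X1", "place": "hotel", "day": "Tue"},
]
payment_records = [
    {"device_id": "X1", "merchant": "cash transfer app", "day": "Tue"},
]

def build_profile(device_id):
    """Merge every record sharing one identifier into a single profile."""
    merged = []
    for record in location_pings + payment_records:
        if record["device_id"] == device_id:
            merged.append({k: v for k, v in record.items() if k != "device_id"})
    return merged

# Two data streams, one join key, and a picture of someone's week emerges.
print(build_profile("X1"))
```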

And so FOSTA and EARN IT are just sort of… I think we talk about FOSTA a lot as though it is, like, a revolutionary law, as though it made huge changes in the law. And, you know, it did make a change to one specific law that I think people thought of as being sort of dramatic. That’s Section 230. However, it really was just an evolution out of stuff that had already been happening. So FOSTA’s purpose, and EARN IT’s purpose as well ‑‑ one of the main purposes of both of these laws is to decrease limits on civil liability for internet companies. And you can think of that as Congress sort of taking their responsibility for regulating these companies away from themselves and deputizing civilians to do that regulating for them ‑‑ also deputizing corporations to create… rules and collect data that is thought of… (Sighs) Or is publicized! As having some impact on trafficking and sexual exploitation and sexual violence? But all of that being simply… a show. And actually increasing sex workers’ vulnerability to exploitation. And when you decrease our ability to communicate with each other, when you decrease our ability to be visible online, when you decrease our ability to share information, health and safety information, you increase folks’ liability ‑‑ sorry, folks’ vulnerability ‑‑ to violence and exploitation.

And I noticed that someone asked earlier whether EARN IT had a piece about not prohibiting sexual health information for youth. And it doesn’t, at all. But what it does is increase civil liability so that it incentivizes companies to draw the line further than the law specifies. And that is the same thing that FOSTA does. So, these laws ‑‑ because civil liability can be ‑‑ right, anybody can sue. You know… So, think about it in terms of criminal law enforcement ‑‑ you need specific resources. Like, the police and the FBI ‑‑ policing happens on all of these different levels, state and federal. I mean, they do have ‑‑ obviously, this has been in public conversation, especially recently: how many resources these folks do have. And it is an obscene amount of resources. However! It is still less likely that you will be subject to criminal prosecution than that, as a company, you will be subject to a lawsuit. And lawsuits also have a lower requirement in court in order to find liability. Like, with civil liability, you have to show less in court than you do to prove criminal liability.

And so when you increase civil liability, you incentivize companies to draw the line much further than the law specifies… because they want to get rid of even the appearance of liability, and even, you know ‑‑ because also, lawsuits are expensive, regardless of whether the claims can be proven or not! So, that’s ‑‑ I’ll stop there.

DANIELLE BLUNT: I wanted to make sure that we take a few minutes to sort of talk about what FOSTA‑SESTA and what EARN IT are amending. So Kendra, I would love just like a two‑minute summary of Section 230, and then Lorelei, if you wanted to sort of continue with what ‑‑ like, what EARN IT is, and what EARN IT’s proposing, and why the over‑‑‑ how ‑‑ and the ways in which it’s so overly broad that things like queer sex ed could get caught up in it.

KENDRA ALBERT: Yeah. And I think actually I want to sort of tag on to the end of what Lorelei was saying, ’cause I think it ties perfectly into a discussion about Section 230, which is to say, the sort of thing we lawyers would call “commandeering” ‑‑ the use of private companies to do things that the government… isn’t sure it has the political capital or will to push forward? It’s not just because they, like, can make it happen using civil liability. It also is much harder to bring a First Amendment challenge to, like, companies deciding “voluntarily” to over‑enforce their own rules. Which, they’re not bound by the First Amendment. Versus the government making a particular rule, which would be susceptible to a First Amendment challenge.

So I can talk a little bit more about that, but I just want to make that point about what Lorelei is saying. Which is: delegating these responsibilities to private companies is not just better because, oh, you can kind of throw up your hands and claim no moral responsibility for what happens, but also because it limits the ability of individuals who are harmed by these sorts of changes to legal regimes to effectively challenge them.

What is Section 230 of the Communications Decency Act?

So let me talk about 230, which I think will help us conceptualize the stuff, and then we can jump back to SESTA and FOSTA and EARN IT.

So Section 230 of the Communications Decency Act was passed in 1996…? I’m really bad with years. Anyways! Passed in 1996. And it was originally part of an omnibus anti‑porn bill that was supposed to restrict minors from seeing porn on the internet. Everybody in the 90s was real worried about porn on the internet. And… It turned out that most of that bill was unconstitutional. It was struck down by the Supreme Court in a case called Reno v. ACLU. But what was left was this one provision that hadn’t gotten a ton of attention when it passed, called Section 230. And what Section 230 does is say that no online service provider can be held liable ‑‑ or, treated as the publisher ‑‑ for content that someone else, like, sort of spoke online.

Okay. What the fuck does that mean? So, let’s take a Yelp ‑‑ let’s take Yelp, for example. On Yelp, there are Yelp reviews, posted by third parties. So say I post a Yelp review of my local carpet cleaner. I always use carpet cleaners as the example, because there’s a funny case involving carpet cleaners. Um. Anyway! I post a review of a carpet cleaner who I have not used. They have cleaned zero of my carpets. I say, these people are scumbags. They ripped me off. They told me it would cost $100 to clean all my carpets, and it cost me $3,000, and I got bedbugs. So that’s inflammatory. They could potentially sue me, because it’s not true and it harms their business.

What Section 230 says is the carpet company can come after me, Kendra, for posting that lie, but they can’t sue Yelp. Or if they do, they’re going to lose. Because Yelp has no way of knowing if my claim is true or false.
So that’s the original meaning of Section 230. It’s gotten broadly interpreted, for lots of good reasons.

So right now, there are something like 20 lawsuits against Twitter for facilitating terrorism, all of them thrown out on Section 230 grounds. The one that is most relevant to our conversation right now is a case out of Massachusetts called Doe v. Backpage, which was brought by a number of folks ‑‑ survivors of trafficking ‑‑ against Backpage.com, for what they claimed was complicity and sort of knowledge of the ads that were placed on Backpage that they were harmed as a result of. And the First Circuit, which is sort of… one step below the Supreme Court, in terms of courts, said: That’s all very well and good, but Backpage isn’t the speaker of any of those ads. They didn’t write the ads. They don’t know anything about the ads. We’re throwing out this case. And in the aftermath of that, Congress was like, this shall not stand! And passed FOSTA and SESTA. And I’ll turn it over to Lorelei to talk more about that.

LORELEI LEE: I think I’m curious what the audience’s questions are about FOSTA‑SESTA, because I think there’s been quite a bit of information written about them, about that law, and I wouldn’t want to just talk on and on about it when it’s not focused to what people want to hear. Or should I talk about EARN IT? Or ‑‑ I don’t know, someone tell me ‑‑

DANIELLE BLUNT: I think we did a summary of FOSTA‑SESTA. I would like another one‑sentence summary of FOSTA‑SESTA, the impact. And then what the fuck EARN IT is and where it’s at, would be helpful.

LORELEI LEE: Yeah, so we can talk about why Section 230 matters to these laws.

DANIELLE BLUNT: Yeah.

Why is Section 230 important?

LORELEI LEE: So FOSTA‑SESTA does several things ‑‑ it does, like, six things in federal law, including creating a new crime under the Mann Act, which was originally the White Slave Traffic Act of 1910, and it’s been renamed, but it’s not better…? (Laughing) Oh, boy. So it creates new crimes. That’s one thing that it does. But it also changes Section 230 so that it no longer protects against lawsuits under federal law regarding, specifically, the federal anti‑trafficking law ‑‑ 18 USC 1591, in case anyone wants that number. Um. (Laughs)

And so, what that does… It’s not clear that Section 230 was actually preventing people from suing companies, specifically Backpage. Backpage was the center of congressional conversations and the center of media attention. And… The claim was that Backpage was ‑‑ that they were not able to be held accountable for trafficking that was happening on their website. But actually, the First Circuit case maybe… just didn’t have enough evidence yet to show how Backpage actually could have been held responsible regardless of Section 230, because Backpage was doing things like editing ads and that kind of thing that would have made them liable in a way that’s unlike Yelp.

And so… So, okay. But! People started talking about Section 230 because there was a documentary made about the First Circuit case, and it was very well‑publicized, and that documentary was shown in Congress. And people started talking about Section 230 as though that was the thing preventing lawsuits.
I mean, another important piece to remember about making civil liability the place where we enforce anti‑trafficking law and anti‑exploitation law is that it puts the onus on victims of trafficking and victims of exploitation to bring those lawsuits, which are very expensive ‑‑ lawyers for those claims are inaccessible. You have to spend years talking about your trauma. And! You know, it takes such a long time ‑‑ if you are even going to get compensation ‑‑ and then, at the end, you get monetary compensation if you win your lawsuit. But ‑‑ it doesn’t prevent trafficking! And it doesn’t prevent exploitation. And we know that there are other methods of doing that that are much more effective. And many of those methods involve the investment of resources ‑‑ I think this is one of the reasons that this is happening ‑‑ many of those solutions involve the investment of resources in marginalized communities. And instead, Congress wants to pass bills that don’t require the redistribution of wealth in this country.

So EARN IT does something similar to Section 230. The way that FOSTA makes a carve‑out in Section 230 around the federal anti‑trafficking laws, EARN IT makes a carve‑out in Section 230 around the child exploitation laws, specifically child sexual abuse material laws. And, similarly, when Kendra and I did this research, there haven’t been a lot of examples of Section 230 preventing those claims from being brought. So, again, it feels a little bit more like this is for show than anything else. But we can predict that the impact that EARN IT will have will be very similar to the impact that FOSTA had, in terms of the removal of information online and the censoring of people online. And ‑‑ not just the removal of information, but the prohibition on specific people talking.

And we think, based on our, like, analysis of EARN IT, that that impact will fall really on queer youth. Because that’s a specific community for whom there’s a lot of fear around sexual information being shared, and it’s also a specific community who is seeking that information out! Because, I mean, having been a queer youth once myself! I know that, like, you just don’t ‑‑ you just don’t necessarily have access to folks when you’re a kid who can tell you that you’re okay and you’re normal.

DANIELLE BLUNT: Yeah. Thank you for that, Lorelei. And I wonder, too, if Yves and Korica, if you have anything that you would like to add before we open it up to the Q&A.

YVES: I mean, I think that… you know, y’all covered it pretty well, right? Like, I think that like everybody covered a lot of stuff. I mean, I’ve been looking at the questions, and I also only really have, like, a little bit to say in terms of… you know, the way that we see a lot of this happen, right? We’ve obviously talked about criminalization; we’ve talked about censorship, and kind of what happens. But specifically, in talking on this panel around the impact on sex workers and marginalized communities, like the ways that we really see a lot of this happen, and the push for this, right? Like, whenever there’s an increase in surveillance, like that increases the scope, it’s going to increase the scope of policing, and also the general stigmatization. Right?

So we don’t ‑‑ like, I think everybody kind of knows that I’m, like, most knowledgeable in terms of the impact on in‑person sex work. But when we also look at the impact these things have on, like, digital sex work, which has gotten so much more popular during these times, right, we also see that all of these groups ‑‑ like, the groups that are behind these bills to begin with, right? ‑‑ are pushing for other forms of censorship or limitations being put not just on sex workers, but on other marginalized communities. But also, like, you know, we know that these intersect. We know that there are intersections here. They ask people like credit card companies to not accept payments for sex workers. We’ve seen that happen, right? And, like, in terms of in‑person sex work ‑‑ and Lorelei talked about this, right? People get pushed into the most dangerous forms of sex work, making them more vulnerable. In fact, making them more vulnerable to the human trafficking that these people are so against, and making everybody so much more vulnerable to all of these things, which we kind of talked about. And I think it’s kind of important, in talking about the questions that I’m kind of seeing about, you know, what do we do? Because I feel like a part of our conversation kind of scares everybody into being like, oh, my gosh, I should just not use social media! I shouldn’t even text! (Laughs) Which, I don’t want people to think that that is the case? Right? I think that, like, all of these different encryption methods, and these things, right ‑‑ although! Right? I do not think that they’re foolproof, which they’re not! Like, there are still many ways in which the police and, like, other agencies can get access to this information, one of those ways being, like, whomever you’re sending the information to, and wherever that information kind of ends up. If you, like, you know, sync it to your laptop, sync it to your phone. All of these different ways, right? But these are tools that can protect you.
So I think that if you want to use these encryption methods ‑‑ like ProtonMail to encrypt your mail, iMessage ‑‑ that’s a good thing. Take what you have. But know they’re not foolproof.
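[An illustrative aside, not part of the panel: a minimal sketch of what “end‑to‑end” means, using the PyNaCl library (`pip install pynacl`). It also shows the limit Yves describes: the relay only ever sees ciphertext, but plaintext still exists on both endpoints ‑‑ which is usually where it leaks.]

```python
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts to the receiver's public key. A relaying server only
# ever handles `ciphertext`, which it cannot read without a private key.
ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"meet at 8")

# The receiver decrypts on their own device -- plaintext exists here again,
# so a seized or cooperating endpoint defeats the encryption entirely.
plaintext = Box(receiver_key, sender_key.public_key).decrypt(ciphertext)
print(plaintext)  # b'meet at 8'
```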

I also want to talk about ‑‑ Kendra kind of talked about this ‑‑ the reforms and what they look like, and how we think we’ve solved this, or people think they’ve been solving it? Obviously, I came into this conversation, I told everybody I’m an abolitionist. Right? I work with abolitionist groups. I think at the end of the day, surveillance is a strategy that they use in policing ‑‑ before this technology existed, before they started doing this stuff, they always surveilled. It’s just an arm of policing. At the end of the day, the problem is policing ‑‑ policing that has always been used, and meant to be used, to disappear communities. Right?
So the bigger fight that we’re fighting is policing. So I don’t want people to think that the fear is, oh, you shouldn’t do anything. The truth is, if you’re a marginalized person, if you’re a sex working person, they are going to want to police you, no matter what, and we’re fighting against that.
(silent applause)

DANIELLE BLUNT: Thank you so much for that. Korica, did you have anything that you wanted to add?

KORICA SIMON: I will say I went to a conference. It was Law for… Black Lives? I think is what it’s called? And they do things around, like, the law and the liberation of Black people. And the speaker talked about how, like, have you ever noticed that if you live in a Black neighborhood, or a people of color neighborhood, police are everywhere? But if you live in white neighborhoods, police are not there. And it’s not because there is more crime in one area over another. In fact, like, police just make the crime worse. And I thought about that a lot, ’cause I’ve lived in Black neighborhoods. I’ve lived in white neighborhoods. And there is a stark difference. And there is still, like, “crime” happening in the white neighborhood, but, like, police weren’t there. So I think it is important to think about how policing is the problem.

And then one other thing I forgot to bring up is that before this talk, I was doing, like, research on what’s been happening lately, ’cause I feel like there’s just always so much happening. And something that I missed was that some police departments, like the NYPD and the Chicago Police Department, have been putting, like, ads… sex ads on websites, and people will text that number looking for services. And the NYPD will send them a message saying ‑‑ I have it pulled up. It’ll say, like: This is the New York Police Department. Your response to an online ad for prostitution has been logged. Offering to pay someone for sexual conduct is a crime and is punishable by up to seven years of incarceration. And… Yeah!
So, people in that article were talking about how the police have access to their name and their phone number, and they don’t know, like, what’s gonna happen to them. Like, are they just logged in some database? And I think it’s safe to assume that they probably are logged into some kind of database. And I think, as we think about how ‑‑ as Yves said ‑‑ sex work is becoming even more digital in the time that we’re in, the impact of this on sex workers, I’m guessing, is gonna be really large. So, yeah. I just wanted to bring up that extra way that surveillance is happening.

LORELEI LEE: I actually wanna add one more thing that I intended to say and forgot to say, which is just that, in terms of this question of what we do ‑‑ one other point to me is, like, when they’re passing these laws, something else they’re doing is deputizing us to police ourselves and to chill our own speech and to prevent us from organizing and to prevent us from using any tools at all to communicate with each other and to talk about these issues. So, I don’t know. I do think that it is a mistake for us to use this as a reason not to… speak to each other! I mean, that is really what we’re talking about, when we talk about not using online tools and other electronic tools of communication that are…

DANIELLE BLUNT: Yeah, it’s really interesting, too. Right now, some of Hacking//Hustling’s researchers are wrapping up the survey that we were doing on content moderation and how it impacts sex workers and activists who have been talking about Black Lives Matter over the last few months. And, like, we’re definitely noticing themes of speech being chilled, just like we did with our research on FOSTA‑SESTA ‑‑ as well as that the impact of, like, platform policing, on both social media and financial technologies, has just about doubled for people who both do sex work and are vocally protesting or identify as activists. And… The numbers are just… very intense.

So, like, I think… Being mindful about how we communicate, rather than not communicating, is a form of harm reduction and community care. And I also see this panel as a form of harm reduction and community care, in partnership with our digital literacy training. Because I do believe that the more that we know, the more we’re able to engage meaningfully when legislation like this comes up. And… Like… A lot of these laws aren’t written to be read by the communities that they impact ‑‑ they’re often intentionally written to be unintelligible to the communities that they impact ‑‑ and they think that they can just, like, get them signed into law without having to check in with the communities that are harmed by this legislation.

So I think that anything that we can do to better understand this and decrease that gap between the people who are writing this legislation, or the, like, tech lawyers who are opposing this legislation, and like bringing in our own lived experiences? Is incredibly important work.

KENDRA ALBERT: I also ‑‑ well, I know we want to ‑‑ well, I’ll stop. Blunt, you want to do Q&A?

DANIELLE BLUNT: Sure, we can do Q&A. If you had one thing to add, that’s fine.

KENDRA ALBERT: So one thought I had there is ‑‑ one, it’s totally right? But it can feel like, oh, my god, there’s so much? That’s one of the hard things with talking about surveillance. It’s like, yeah! You know, police and law enforcement have so many tools at their disposal, and… You know? But at the same time, like, we care for each other by, like, creating space to talk about what makes us feel safer, and how can we ‑‑ take risks that we all agree to be taking? Right? Risk‑aware consensual non‑encrypted information.

DANIELLE BLUNT: I love that! (Laughing)

KENDRA ALBERT: It rolls just right off the tongue.
But I think I want to highlight what Yves was saying, in terms of the problem is policing? And Korica also said the same thing, so, you know ‑‑ what we’re all saying, in terms of the problem being policing, and the solution not just being, like, finding more ways to slightly narrow the surveillance tools? I think one of the real problems around surveillance debates generally ‑‑ and I say this as somebody who comes out of a technology law tradition ‑‑ is that the folks who are doing work on, like, sort of high‑level surveillance tools, like the sneak‑and‑peek warrants or Section 215 of the Patriot Act, are often deeply disconnected from the communities who are most likely to be harmed once these surveillance tools are widely used. Right? Like, just like with the technology laws, right, there is this way in which, you know, the conversation around, like, mass surveillance is up here, and we’re supposed to be afraid of mass surveillance, because mass surveillance means surveillance of white folks like me and not communities of color. Right? But at the same time… So much of the rhetoric relies on, like, the idea that it’s okay to surveil some folks, but it’s not okay to surveil others. And part of how we fight back is by deconstructing the notion that it’s okay to use these tools on anybody. That, like, it’s not actually like, oh, there’s a bad enough set of crimes to make this okay. Right? And part of it is not getting sucked into the sort of, like, whirlpool of, well, you know, is terrorism worse than human trafficking? Because if they’re both equally bad, then we need to have the same tools to prosecute human trafficking as we do terrorism. And here we are, where they’re getting a sneak‑and‑peek warrant to go into a massage parlor in Florida.

And I don’t say that flippantly, because those are real folks’ lives, just like there are real folks’ lives impacted by surveillance of supposed terrorist communities. Looking at all of the mosques in New York and Detroit, where folks were under persistent surveillance after 9/11.

So I think part of what we do is we resist the idea that it’s okay if this happens to other people. Because, you know, that’s how the tools get built that will, like, eventually be used against all of us. That was what I wanted to say. I’ll stop there.

DANIELLE BLUNT: Thank you, Kendra. Okay. We’re going to open it up for Q&A. Someone asked if we could touch on the recent encryption legislation and how protected we are using services like Signal, WhatsApp, and Telegram.

KENDRA ALBERT: I can take that, and then if Lorelei and Blunt, if you want to jump in if I screw it up.

So, you know, EARN IT was one of the sort of pieces of legislation that was kind of proposed to… I don’t want to say “end” encryption, but it would have had the practical effect of making encrypted services more difficult to sort of produce. The other is the LAED Act ‑‑ “laid,” I think? That’s probably not how people have been pronouncing it. But.
(Laughter)
The ‑‑ that bill is way worse. I do not think it’s going to pass. It sort of all‑out tries to ban encryption.

EARN IT actually, sort of between the initial proposal and the version of the bill we’re currently on, got much better on encryption? So now it specifically says that, you know, using encryption won’t ‑‑ like, isn’t supposed to be able to be used against a service. Like, for purposes of figuring out whether they’re liable for child exploitation material. It’s really ‑‑ it turns out that that construction is not just complicated when I say it, but very complicated in the bill, and might do less.
In terms of what the impact is gonna be on like Signal, WhatsApp, and Telegram ‑‑ you know, what I’m hearing in this question is sort of end‑to‑end encrypted services, where the service provider doesn’t have access to your communications? You know, I think that it would be unlikely ‑‑ if, God forbid, EARN IT as currently existing passes, I think it would be unlikely to sort of result in Signal or WhatsApp going away. In fact, actually, some advocates ‑‑ like Mike Masnick in particular ‑‑ are currently arguing that the current EARN IT construction actually protects encryption? I’m a little more skeptical about that than he is. Happy to sort of ‑‑ you know, at me on Twitter if you want to talk about that.

But I don’t ‑‑ those services are not going away under the current version of EARN IT. However, the Justice Department has been trying to sort of get back doors into encrypted services for a long time, and they’re not going to stop. So it’s sort of… nothing to panic about right now, but stay vigilant on that front.

YVES: I just wanted to generally say, right, like I think… Like, if the question’s kind of getting at like in your personal life, like, what’s the danger ‑‑ or like if you’re doing some kind of criminalized work, or something that you are afraid of the police getting information about, right? Like, it’s not gonna do you harm to use an end‑to‑end encrypted service, like iMessage, like WhatsApp, like Signal, like Telegram. Right? But it’s not something that’s gonna protect you wholly? I should also note that, you know, the person asking this question is like an organizer or a whore, like, you know. So like, most of the time, when this information gets in the hands of police from like your texts or things like that, it’s not because they’ve like hacked the system. It’s not something like that. It’s usually because the police have gotten ahold of someone you’ve talked to, and they’ve given that information to them. And like ‑‑ I kind of talked about this in the beginning, right? When they deputize civilians, I don’t just mean that generally ‑‑ I literally mean there are also people who are just going to be, like, I think that there’s a sex worker at my hotel! Like, da da da da! I think there’s a sex worker in my Uber! Right? And like handing over that information.
So I don’t want people to be like, oh, I’m just like not safe anywhere. Because that’s not really what the scenario looks like in real life, when you’re like on the street and like working. Right? But these tools aren’t, like, fully safe either. It’s not like, oh, you can type anything into Signal, and it’s like Gucci.

DANIELLE BLUNT: Right. And I think, too, people can take screenshots. Oftentimes, that’s how information is shared even when you’re using encrypted channels. So I think also just being mindful about what you say, and who you’re saying it to. If you’re using Zoom, knowing that this is going to be a public‑facing document, and we’re not currently planning any political uprisings in this meeting? So it feels okay and comfortable to be using Zoom as the platform. But like… Personally, in my work, even if I’m using an encrypted platform, like, I do my best to avoid saying things that would, like, hold up in court as evidence, in the way that I use language.

KENDRA ALBERT: Yeah. I think in the immortal words of Naomi Lauren of Whose Corner Is It Anyway, people need to learn how to stop talking. Which it turns out is both solid advice generally, and my advice if the police want to talk to you. So, solid on many different fronts.

LORELEI LEE: I think it’s really important ‑‑ like, I think several people have said this already, but just to really emphasize that when we’re talking about this stuff, the intention is to have… you know, informed consent, you know, for lack of a better word, of using these tools. And that… You know, especially if we’re talking about, we’re talking about sex working people, we’re talking about ‑‑ Aaa! Caty Simon! (Laughing) I’m sorry, I had to interrupt myself to get excited that Caty Simon is here. Another expert on all of this stuff.

The thing I was going to say is I do think that sex workers and criminalized folks from many marginalized communities are really good already at risk assessment. Understanding what level of risk you are comfortable with. And using these tools with that in mind. And knowing that nothing ‑‑ there are no answers! Right? There’s no, there’s no system except abolition that is going to prevent these kinds of harms from happening. Abolition of policing, and capitalism, perhaps! So.
Oh, and the thing I was gonna say, which is maybe not that important, but the question I had for Kendra, is whether you think EARN IT is still a threat to encryption in terms of the best practices, and how those might inform what corporations do in the future. I’m not sure if that’s too far in the weeds?

KENDRA ALBERT: I think it could be. So one of the things we’ve been saying internally about EARN IT, like in Hacking//Hustling, that I want to emphasize here, is that the lack of clarity about what the bill is going to do is a feature for its creators, not a bug? It’s not that we’re failing at interpretation? ‘Cause we’re not. You know. But the… You know, I can say all I want what I think EARN IT means with regards to Signal and Telegram, but as Lorelei pointed out, one of the things EARN IT does is create this commission that creates best practices, and who the hell knows what’s going to be in there. And it’s really unclear even how the liability bits are going to shake out.

So even with a specific amendment to the current version of EARN IT that’s supposed to protect encrypted services, we don’t really know what’s going to happen. So really good point, Lorelei. Thank you.

DANIELLE BLUNT: “Other than being educated or somehow not using any technology, what can we do?” I feel like we touched on that a little bit, but if anyone wants to give a quick summary.

KENDRA ALBERT: Yeah, I mean, I think just to echo what Lorelei and Yves have already said, right. Engage in thoughtful conversations around how you’re using the technology, and be thoughtful around how you’re using it, and who you’re sharing what with. I think for me ‑‑ and actually, maybe this gets to the next question, which is sort of like… It doesn’t matter if you don’t break the law? Or ‑‑ but I don’t have anything to hide! Right? You know, the way I think about this is that, like, everyone has something that law enforcement could use to, like, make your life miserable. That’s just reality. Some folks have many more things! Like. But everyone has something. And so… My goal, like our goal I think here, is not to sort of suggest, like, paranoia, they could be listening to everything. Although, you know, yes! I’m pretty sure that there’s legal authority to do most things that law enforcement wants to do, and like I’m not under any illusions about that. But so part of how we think about this is, you know, how do we take care of the folks around us and be thoughtful around the risks that we’re taking and make sure that we’re taking risks that are aligned with our values and the things we need to do? Right? And those are gonna look different for everybody. But I welcome thoughts from other folks on the panel, ’cause I think I’ve said enough.

LORELEI LEE: I think I’d like to add something, which is that oftentimes when we talk about this stuff, we talk about it in terms of personal risk, as though risk belongs to us alone. I think it’s really important to recognize the communities and the people that you’re interacting with, and to understand that even if you feel as though law enforcement won’t do anything to you, that you’re not a likely target, it’s highly likely that there are people in your life who are likelier targets. Your refusal to talk to law enforcement, or your care around how you communicate with folks, is protective of the people around you and the people that you care about, whose levels of risk you might not even know.

And then the other thing I want to add, in terms of things to do, is to think about what actions you’re capable of taking to oppose the passage of anti‑encryption laws, to oppose the passage of laws that target sex working people and people of color and people in marginalized communities. And to think about: if you feel as though you have a lower level of risk of being targeted by law enforcement, that means that you have a greater capability for maybe going out and protesting! I have to tell you, I know a lot of criminalized people who do not feel safe protesting on the street, do not feel safe talking on un‑encrypted platforms, don’t feel safe talking on panels like this. And if you feel safe doing those things, then it’s your responsibility to do those things. So.

YVES: I mean, I don’t know if you were going to ask the next question, but like Lorelei and Kendra kind of talked about it a bit. I kind of just want to say, if someone asks you, or says, like, it doesn’t matter if you don’t break the law ‑‑ that’s not really the issue, right? Like, laws, crimes, the things that we define as crimes are entirely arbitrary. Right? So who gets arrested, who gets criminalized, all of these things are… just simply based on who, like, the system is against. Which we know means Black, Indigenous, people of color, especially trans people, gender nonconforming people, sex working people, anything that is outside the scope of what white supremacist culture would consider to be a good and appropriate person! Right?

So it’s not really about breaking laws. Or, you shouldn’t be afraid of anything if you haven’t done anything. Because it doesn’t matter. They’re going to criminalize people regardless of that. Right? They’re going to incarcerate people regardless of that. Like, all of these things are a death sentence to marginalized folks, which is why we kind of talk about it in this way. It’s not about, like ‑‑ well, I mean, it is about like surveillance is bad? It’s infringing on rights of people, right? But it’s also about the fact that surveillance is just like a tool that is used for policing, for incarceration, in order to just disappear whomever. Right?

So, when talking about that, surveillance is bad for that reason. For the reason, like, I talked about a little bit with contact tracing, right? That, in theory, should be a good thing. Should mean that we are keeping people safe. Should mean that people aren’t getting COVID, or are getting treated for COVID, are getting treated for HIV/AIDS. But we know that in a world where we have policing, that is simply not what happens! Right? It’s not a case of, will they use it? They will. They will use it, they will criminalize it, they will arrest people. So we want to get rid of it. Wholesale.

DANIELLE BLUNT: Yeah. And I think, too, when you talk about contact tracing in that capacity, I can’t help but think about the ways that data is scraped from escort ads to facilitate the de‑platforming across social media and financial technologies of sex workers and other marginalized communities, as well as activists. So I think both on the streets and on the platforms, like… This is not being used for good? And that it needs to end.

Okay. I’m gonna try and get one or two more questions in. Someone asked, do you think that EARN IT is going to be passed?

(all shrugging)

I think that’s our official comment!

KENDRA ALBERT: Yeah, for anyone that’s not watching the video or is not able to watch the video, there is just a lot of shrugging.

DANIELLE BLUNT: Yeah. But if you keep following Hacking//Hustling, we’ll keep talking about EARN IT and updates when they come, so. If you want to follow @hackinghustling on Twitter, that’s usually where our most up to date shit is.
Someone said, hypothetically, if someone wants to be a lawyer and is studying for the LSAT and hoping to apply in the fall, should they not post publicly about these things or attend protests where they could be arrested?

KENDRA ALBERT: I can take that one. So, you can absolutely post publicly about these things. The thing to worry about here, for lawyers, is this thing called character and fitness, which is basically: if you want to practice as a lawyer after you go to law school, you have to get admitted to one of the state bars, and state bars have particular requirements. I actually don’t know a ton about how those interact with, like, a past history of sex work? But the sort of watchword in terms of thinking about character and fitness is honesty, generally speaking. Pretty much most things can be overcome through character and fitness, if you explain sort of what happened. So getting arrested at a protest ‑‑ you can totally still pass the bar and become a lawyer after that. Same with posting publicly about like abolition or sex work or, you know, those kinds of things.

You know, where I would start to sort of think about whether you want to talk to someone who has more experience with this than me is, um, if you have felonies on your record, or if you are sort of worried that you have any behavior that folks might believe makes you less honest. So things like fraud convictions often come up. But I’ll stop there.

DANIELLE BLUNT: Awesome. And then I think this will be our last question, as we’re just at time. Any book recommendations along with Dark Matters: On the Surveillance of Blackness by Simone Browne? So it sounds like folks are interested and want to learn more.

KENDRA ALBERT: This isn’t a book, but Alvaro Bedoya recently wrote a piece on The Color of Surveillance. It’s really amazing. So I recommend that.

DANIELLE BLUNT: Will you tweet that out? If folks say things, will you tweet them?

KENDRA ALBERT: Yeah.

KORICA SIMON: I have a few books that I’ve ordered and need to read before the summer is over? Black Software: The Internet & Racial Justice, from the AfroNet to Black Lives Matter. It’s talking about how, like… Oh: digital racial justice activism is the new civil rights movement. There’s Automating Inequality: How High‑Tech Tools Profile, Police, and Punish the Poor. Have you read that?

KENDRA ALBERT: It’s really good. I really recommend Automating Inequality.

KORICA SIMON: The last one is Race After Technology: Abolitionist Tools for the New Jim Code.

LORELEI LEE: I think I would add The Age of Surveillance Capitalism, which talks a little bit about the history of the development of some of these data collection tools.

DANIELLE BLUNT: Yves, did you have one you were saying or typing?

YVES: I mean, I would recommend The Trials of Nina McCall, which is about sex work surveillance ‑‑ I think the subtitle is something like the decades‑long government plan to imprison promiscuous women. I would also recommend, if you’re interested in learning more about how public health is weaponized as surveillance against marginalized communities, Dorothy Roberts ‑‑ she writes a lot about this. So, yeah.

DANIELLE BLUNT: And ‑‑ that’s a beautiful place to end. Thank you, Lorelei, for sharing that. I feel like ‑‑ (Laughs) We’re all ‑‑ everyone’s crying. I’m crying. Speaking for everyone. (Laughs) If people want to be found online, or if you want to like lift up the work of the organizations that you work with, can you just shout out the @?

KENDRA ALBERT: @hackinghustling. It’s really great!

YVES: You’re fine. @redcanarysong, and @SurvivePunishNY.

DANIELLE BLUNT: Well, we are slightly overtime. Thank you so much to our panelists, and Cory, our transcriber, for sticking with us. I’m going to stop the livestream now, and stop the recording.

Panelists

Lorelei Lee (they/them) is a sex worker activist, writer, recent law school graduate, and 2020 Justice Catalyst Fellow. Their adult film work has been nominated for multiple AVN awards and won a 2015 XRCO award. Their essays, fiction, and poetry have been published or are forthcoming in The Establishment, Denver Quarterly, $pread Magazine, Salon, Buzzfeed, n+1, WIRED, The Believer, and elsewhere. They are a contributor to the anthologies Coming Out Like a Porn Star, The Feminist Porn Book, Hustling Verse, and others. They were a founding member of Survivors Against SESTA, are a researcher and analyst with Hacking//Hustling, and serve on the steering committee of Red Canary Song.

Yves (they/she) is a queer and disabled Viet cultural worker and sex worker whose organizing home is with Survived & Punished NY, Red Canary Song, and currently FTA4PH. Yves comes from a background in Rhetoric and focuses on the study of collective and public memory and uses it as a framework for their work in art and organizing for prison/police abolition and the decriminalization of sex work.

Kendra Albert (they/them) is a clinical instructor at the Harvard Cyberlaw Clinic at the Berkman Klein Center for Internet and Society, where they teach students how to practice technology law by working with pro bono clients. They also have held an appointment as a lecturer, teaching classroom courses on the First Amendment as well as transgender law. Kendra holds a B.H.A. from Carnegie Mellon University and a J.D. from Harvard Law School. They previously served on the board of Double Union, a feminist hackerspace in San Francisco, and run a side business teaching people how to use their power to stand in solidarity with those who are marginalized. Kendra’s research interests are broad, spanning constitutional law, queer theory, video games, and computer security. Their work has been published in Logic, WIRED, and the Green Bag, and covered in The New York Times.

Korica Simon (she/her) is a third year law student at Cornell University and a fellow for the Initiative for a Representative First Amendment through Harvard’s Cyberlaw Clinic. This past year, she worked as a graduate teaching assistant for an information science course at Cornell called Information Ethics, Law, and Policy, where she taught a course around the ethics of up and coming technology and engaged with students on how the law should respond to these innovations. In addition, she has had the pleasure of working on sex worker rights issues through an internship at Legal Voice in Seattle and through the Cornell Gender Justice Clinic. After graduation, she’s hoping to become a privacy lawyer focusing on freedom of expression issues that marginalized communities face in the age of technological surveillance.

EARN IT Act – Harm Reduction Guide

With the EARN IT Act on the horizon, we want to build power and community care. We compiled this quick harm reduction guide of things you can do right now to take care of yourselves and each other!

As we organize around our opposition to EARN IT, here are a few harm reduction tips you can act on right now to stay connected and make sure you don’t lose important information and community.

  1. Remember: We don’t all trade sex under the same circumstances, and not all of us have access to the same online tools. Not all of this may be applicable to you or how you work, so please take whatever is useful, ignore the rest, and take care of each other!
  2. Social media is not a stable platform for sex workers. We are shadowbanned and lose accounts at a significantly higher rate than our peers. Having multiple ways to stay in touch with clients and community can help in the event of losing an account.
  3. It is important to stay connected! Start collecting e-mail addresses and build a mailing list. Having a list of how to get in touch with clients and community is helpful in case you lose access to your social media accounts.
  4. Back your shit up! Download backups of everything! You can archive your social media posts, Google Drive docs, website text, and images. Keep multiple backups if you can, including one on a hard drive. (See the sketch after this list for one way to automate that.)
  5. Direct clients to your e-mail! Ask clients what the best way to stay in touch is. Make sure your favorite clients know how to get in touch with you too!
  6. If your sex work account is deleted, it’s not your fault!
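
If you’re comfortable running a little Python, here’s a minimal backup sketch along the lines of tip 4. It just zips a folder of exported archives into a timestamped file; the folder names are placeholders, not anything this guide prescribes, so adapt them to wherever your exports live.

    # Minimal local-backup sketch: zip a folder of exports (posts, docs,
    # images) into a timestamped archive. Folder names are placeholders.
    import shutil
    from datetime import datetime
    from pathlib import Path

    def back_up(source_dir: str, backup_root: str) -> Path:
        """Zip source_dir into backup_root under a timestamped name."""
        Path(backup_root).mkdir(parents=True, exist_ok=True)
        stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
        dest = Path(backup_root) / f"backup_{stamp}"
        # make_archive appends ".zip" and returns the final path as a string
        return Path(shutil.make_archive(str(dest), "zip", source_dir))

    if __name__ == "__main__":
        # Keep copies in more than one place, e.g. on an external drive too.
        print(back_up("my_exports", "backups"))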

Want to know more about why we oppose the EARN IT Act? Click here.

(These harm reduction tips, in response to the EARN IT Act, are visualized below on 7 tiles of light pink, blue and yellow background with the graphic of a web browser and a teal bar on the bottom with a heart on it)

Digital Literacy Training (Part 3) with Olivia McKayla Ross (@cyberdoula) and Ingrid Burrington (@lifewinning)

Platform Surveillance Digital Literacy Training

INGRID: So today, we were gonna talk about platform surveillance. And… In general, what we’re kind of focused on is both the ways that platforms surveil and the forms of state surveillance that utilize platforms.

So, this is sort of an outline of where we’re going today. First, kind of doing a little bit of clarifying some terms that we’re gonna use ‑‑ those two terms being state surveillance and surveillance capitalism. And then we’ll talk a little bit about mitigation strategies for both.

What is surveillance capitalism?

So, clarifying some terms. We wanted to kind of make sure we were clear about what we were talking about. So surveillance capitalism is a term used to describe sort of a system of capitalism reliant on the monetization of personal data, generally collected from people doing things online.

What is state surveillance?

State surveillance is a tactic used by governments to intimidate and attack. It can be digital? It can also be IRL. For many years, it was just people following people, or decoding paper messages. It’s more sophisticated now, but it takes many different forms.

And surveillance capitalism kind of emerges in part because of the demands of state surveillance. So in the early 2000s, after 9/11, the existence of a military industrial complex that had funded the development of surveillance technologies was already well established, but was amped up in some ways by the demands of the global war on terror. And there are… you know, deep interconnections between the history of Silicon Valley and the history of the military industrial complex.

The state will utilize the resources of surveillance capitalism. But we wanted to make it clear, like, surveillance capitalism’s negative impacts and risks to personal safety are not solely confined to state violence. They can perpetuate interpersonal violence; they can, you know, create greater harms for your ability to kind of, like, find a place to live, or limit your mobility. And those are, you know, the two distinctions we just wanted to kind of clarify as we go into this.

And I feel ‑‑ like, part of me wants to believe this is kind of already taken as a given, but it’s good to have it said: Neither surveillance capitalism nor state surveillance exists independent of racism and the legacy of the transatlantic slave trade. I found when I first started getting into this space, in 2012, 2013, that was an angle that was kind of neglected. But you don’t have an Industrial Revolution without the transatlantic slave trade, and you don’t have surveillance without the, you know, need to monitor people who were considered, you know ‑‑ property, basically. I would highly recommend Simone Browne’s Dark Matters as an introduction to the role of Blackness in surveillance. But, yeah. Just wanted to raise that up, and be really honest about who is affected by these systems the most.

And… Also wanted to reiterate something that we said on the first day of this series, which is: nothing is completely secure… and that’s okay. We’re learning how to manage risk instead. Right? The only way to be completely safe on the internet is to not be on the internet. Connection, you know, involves vulnerability by default. And… The work that we’re all trying to do is find ways to create space. To mitigate or, like, understand the risks that we’re taking, rather than just assuming everything’s a threat and closing off. Or saying, well, I’m doomed anyway; doesn’t matter.

What is threat modeling?

In the world of security studies and learning about, you know, building a security ‑‑ like, a way of thinking about keeping yourself safe, a term that comes from that space is threat modeling. Which is a practice of basically assessing your potential vulnerabilities with regard to surveillance. We’re not going to be doing threat modeling, per se, in this presentation. There are some really great resources out there that exist on how to do that that we can ‑‑ we’re happy to point you to. But we wanted to raise it up as something that can help in approaching managing risk and mitigation, because it’s sort of inventorying your own, you know, circumstances and understanding where you are and aren’t at risk, which can make it a little less overwhelming.

All right. So the concept of state surveillance, for this presentation, we’re going to be talking about it on and using platforms, which we talked about yesterday. There’s all kinds of other ways? (Laughs) That, you know ‑‑ as I said earlier, that the state can engage in surveillance. We’re, right this second, just gonna focus on platforms. We might be ‑‑ if there are questions specifically about non‑platform things, maybe in the Q&A part, we could talk about those, if there’s time.

What are platforms and why do they matter?

So, platforms are companies. I think we’ve said this a lot over the last three days. And what that generally means is that platforms have to, and will, comply with requests from law enforcement for user data. They don’t have to tell anyone that that happens ‑‑ though some of the big companies do. These are from ‑‑ the one on the top is Twitter’s annual transparency report, and the one below is Facebook’s. And these are just graphs ‑‑ visualizations they made of government requests for user data. But again, this is almost a courtesy? This is something that’s kind of done maybe for the brand, not necessarily because they have any obligation. But… It’s also just a reminder that they can’t actually necessarily say no to, like, a warrant. This also applies to internet service providers, like Verizon; mobile data providers; hosting services. Like, companies have to do what the law tells them, and most of the internet is run by companies.

Next slide… But governments don’t really always have to ask platforms to share private data, if there’s enough publicly available material to draw from. The method of using publicly accessible data from, you know, online sources is sometimes called open source investigation, in that the method is reproducible and the data is publicly available. When Backpage still existed, that was more or less how cops would use it to conduct raids. One older example of this: in 2014, the New York Police Department conducted a massive raid on a public housing project in Harlem to arrest suspected gang members. It was called ‑‑ oh, shoot. It had like some terrible name… Operation Crew Cut. That’s what it was called. (Laughs) And much of the evidence used in the raid was culled, basically, from cops lurking on social media, interpreting slang and inside jokes between teenage boys as gang coordination. Some of the young men ‑‑ I think they were as young as 16 and as old as 23 ‑‑ who were caught in that raid are still serving sentences. Some of them were able to challenge the case and be let out, but still ‑‑ it was a pretty terrible process.

A more recent example of police using publicly available data is this one on the left, from June. The woman in this photo was charged with allegedly setting a Philadelphia police vehicle on fire. And the police were able to figure out who she was based on a tattoo visible in this photo ‑‑ which you can’t really see in this image because it’s quite small; I couldn’t get a bigger one ‑‑ based on a T‑shirt she had previously bought from an Etsy store, and previous amateur modeling work on Instagram. So, you know, maybe only a handful of people had bought that one Etsy shirt. And they were able to kind of match her to these other images that were online, out in public.

What is open source investigation and why does it matter?

I want to note briefly that open source investigation, or digital investigation using publicly available resources, isn’t inherently a bad thing or technique. It’s research. Activists and journalists use it to identify misinformation campaigns and human rights violations when it’s not safe for them to be on the ground. I’ve used it in my own work. But you know, the point is don’t make it easier for the police to do their job.

What is metadata and why does it matter?

The next slide… Another source of information that can be pulled from publicly available sites, besides just sort of reading the material and the images, is metadata. And “metadata” is sort of just a fancy word for data about data. One way that this sometimes gets described is: if you have a piece of physical mail, the letter is sort of like the data, but the envelope ‑‑ who it’s mailed to, where it was mailed from, things like that ‑‑ that’s the metadata. That’s the container that has relevant information about the content.

So in an image, it’s encoded into the file with information about the photo. These are some screenshots of a photo of my dog, on my phone. (Laughs) She briefly interrupted the session yesterday, so I thought I’d let her be a guest in today’s presentation. And if I scroll down on my phone and look further below in the Android interface, I can see that it has the day and the time that the photo was taken, and then what I’ve blocked out with that big blue square right there is a Google Maps embed that has a map of exactly where I took the picture. You can also see what kind of phone I used to take this picture. You can see where the image is stored in my folder. You can see how big the image file is. These are examples of, like, metadata. That, combined with the actual information in the image, like a T‑shirt or a tattoo, is all really useful for law enforcement. And metadata is something that is stored ‑‑ if you use like Instagram’s API to access photos, you can get metadata like this from the app.

Is it possible to remove metadata from your phone’s camera?

OLIVIA: So, surveillance capitalism! Really big ‑‑ oh, there’s a Q&A. Is it possible to remove metadata from your phone’s camera?

INGRID: So there’s two things that you can do. One is ‑‑ I think that you can, in your settings, like you can disable location? Being stored on the photos? Depending, I think, on the make and model. Another thing, if you’re concerned about the ‑‑ you know, detailed metadata… you can ‑‑ like, taking a screenshot of the image on your phone is not gonna store the location that the screenshot was taken. It’s not gonna store ‑‑ like, it might ‑‑ I think that the screenshot data might store, like, what kind of device the screenshot was taken on. But given ‑‑ but that doesn’t necessarily narrow it down in a world of mostly iPhones and, you know, Samsung devices and Android devices. Like, it could be ‑‑ it’s like a bit less granular. Yeah.
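
(For readers who want to check this on their own photos: below is a minimal Python sketch using the Pillow imaging library ‑‑ installed with “pip install Pillow” ‑‑ that prints a photo’s EXIF tags and writes a copy with the metadata dropped. The file names are just examples, and this illustrates the general idea rather than any specific tool the trainers endorsed.)

    # Print a photo's EXIF tags, then save a copy that keeps only the pixels.
    from PIL import Image
    from PIL.ExifTags import TAGS

    def show_exif(path: str) -> None:
        """Print human-readable EXIF tags, e.g. Model, DateTime, GPSInfo."""
        exif = Image.open(path).getexif()
        for tag_id, value in exif.items():
            print(TAGS.get(tag_id, tag_id), value)

    def strip_metadata(src: str, dst: str) -> None:
        """Copy only the pixel data into a fresh image, leaving EXIF behind."""
        img = Image.open(src)
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst)

    show_exif("dog.jpg")        # before: may show GPSInfo, device model, date
    strip_metadata("dog.jpg", "dog_clean.jpg")
    show_exif("dog_clean.jpg")  # after: prints nothing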

OLIVIA: Awesome.

Surveillance Capitalism: How It Works and Why It Matters

So, surveillance capitalism! I don’t know if you guys have noticed… I’ve been seeing them a lot more often. But some advertisements in between YouTube videos are just kind of like multiple choice questions? Some of them ask how old you are; some of them might ask if you’ve graduated from school yet; et cetera. So like, in what world is a single‑question survey a replacement for, say, a 30‑second advertisement for Old Spice deodorant?

Our world! Specifically, our world under surveillance capitalism. So, to go further on Ingrid’s initial definition, surveillance capitalism occurs when our data is the commodity for sale on the market ‑‑ usually, almost always, created and captured through companies who provide free online services, like Facebook, Google, YouTube, et cetera.

We can’t really know for sure how much our data is worth? There’s no industry standard. Because, at the end of the day, information that’s valuable for one company could be completely useless for another company. But we do know that Facebook makes about $30 every quarter off of each individual user.

What are data brokers and why do they matter?

But social media sites aren’t the only ones with business models designed around selling information. We also have data brokers. And data brokers… If we go back to the private investigator example that we saw in the state surveillance section, thinking about the tools at their disposal, the level of openness that you have online, they could find out a lot of things about you. Like where you’ve lived, your habits, what you spend money on, who your family members and romantic partners are, your political alignments, your health status, et cetera. That’s like one person.

But imagine that rather than searching through your data and piecing together a story themselves, they actually just had access to a giant vacuum cleaner and were able to hoover up the entire internet instead. That is kind of what a data broker is!

I made up this tiny case study for Exact Data. They’re a data broker, and they’re in Chicago.

And Exact Data has profiles of about 20 ‑‑ not twenty. 240 million individuals. And they have about 700 elements associated with each of them. So some of the questions that you could answer, if you looked at a stranger’s profile through Exact Data, would be their name, their address, their ethnicity, their level of education, if they have grandkids or not, if they like woodworking. So it goes from basic data to your interests and what you spend time on.

So, potential for harm. You get lumped in algorithmically with a group or demographic when you would prefer to be anonymous. Your profile may appear in algorithmic recommendations because of traits about yourself you normally keep private. The advertisements you see might be reminders of previous habits that could be triggering to see now. And it’s also a gateway for local authorities to obtain extremely detailed information about you. I don’t know if Ingrid has any other points that might be potential for harm.

How to Mitigate Harm

But, luckily, there are ways to mitigate. Right? You can opt out of this. Even though it’s pretty hard? If you remember from our first workshop, where we talked about how websites collect data from us, we know that it’s captured mostly using scripts: trackers, cookies, et cetera. So you can use a script blocker! Also, read the terms of service ‑‑ it will probably mention the type of data an online service collects and what it’s for. They don’t always, but a lot of them do. So if you read it, you’re able to have a bit more agency over whether you agree to use that service or not, and you might be able to look for alternatives that have different terms of service.

Privacy laws in the United States are pretty relaxed when it comes to forcing companies to report things. So, one tip is also to try setting your VPN location to a country that has stronger privacy laws. And then you might get a lot more banners about cookies and other trackers that they’re required to tell you if you live somewhere else that’s not here.

You can also contact data brokers and ask to be put on their internal suppression list. And a lot of brokers have forms you can fill out online requesting that. But the only issue is that this is really hard? Because there are a lot of data broker companies, and we don’t actually know how many more there are, because this is an industry that’s pretty unregulated.

Another mitigation strategy is creating, essentially, a digital alter ego that’s difficult to trace to your other online accounts. So if you are behaving in a way that you don’t want to be conflated algorithmically with the rest of your life, you can create separate online profiles using different e‑mail addresses, potentially using them in other places, and just creating as much distance between you and one aspect of your life and you in the other aspect of your life, and compartmentalizing in a way that makes it difficult to connect the two of you.

And then of course you can use encrypted apps that don’t store metadata or actual data. This could include messaging apps, but this could also include… word processors like CryptPad; it could include video conferencing; it could include a lot of different apps.

So, to wrap everything up: Over the past three days…

Wrapping Up the Digital Literacy Series

INGRID: So, I guess we wanted to kind of try and provide some wrap up, because we covered a lot of things in three days. And that was, like, a very broad version of a very, like, deep and complicated subject. But we sort of ‑‑ you know. We went through, you know, the foundational kind of depths of the internet, how it’s, you know, made, what’s like the actual kind of technical aspects of how it works. The platforms that, you know, are built atop that foundation that extract value out of it. And systems of power that incentivize those platforms to exist and control and govern kind of how some of that value is used ‑‑ (Laughs) Or misused.

And I guess across all three of these, I had a couple of, like, bigger questions or things to think about that I wanted to put forward. One is that, I think, in some ways the neoliberal public/private structure of the internet ‑‑ as an infrastructure that everyone lives with ‑‑ shapes the way that everything else kind of follows. Right? When something that was originally sort of a government‑built property becomes a commodity that becomes the foundation of how anyone can, like, live in the world, it creates a lot of these aftereffects.

And I think ‑‑ I find internet history always really fascinating, because it’s a reminder that a lot of this is very contingent, and it could have gone different ways. Sometimes, people wanted it to go a different way? And thinking about what it looks like to build different networks, build different services and structures. And, you know, while living within surveillance capitalism. ‘Cause we haven’t built different internets and different structures quite yet. Our surveillance capitalism’s still pretty big. A big part of taking care of one another and ourselves is… taking care with where and how we speak and act online. Which is different from being afraid to say things? And more being kind of competent in where and how you, like, choose to speak, to protect yourself and to protect people you care about.

I think… that’s ‑‑ yeah, that went by really fast! (Laughs)

BLUNT: We’ll just shower you with questions. (Laughs)

How are companies making money off of data?

I have two questions in the chat. Someone says: How is it exactly that companies make money off of our data? Is it just through ads? Are there other processes?

OLIVIA: So, when it comes to making money off of it, sometimes… let’s say you’re a company that sells… let’s say you’re a company that sells headphones. And you are tracking data of the people who are using your headphones. They buy them, and then in order to use them, they have to download an app into their phone. Right? Through that app, they might record things like the songs you listen to, what time of day you listen to them, how long you’re using the app, where you are when you’re listening to it… And they might have this, like, select little package of data about you.

Now, they might find that that’s data that, like, a music… campaign ‑‑ the people who like do advertisement for musicians, I guess? I don’t remember what that job’s called. But it’s more like the idea that different companies collect data that’s useful for other companies in their… in their marketing practices, or in their business practices.

So Facebook collects data that a lot of different companies might want for a myriad of reasons, because the amount of data Facebook kind of siphons from people is so large. And so ‑‑ yeah, does that…? Do any of you guys have something to say around that, about other ways that companies might ‑‑

INGRID: Yeah. I mean, a lot of it bottoms out in ads and market research.

OLIVIA: Yeah.

INGRID: There ‑‑ I mean, another place where ‑‑ it’s not the most lucrative source of revenue, I think? Insofar as they’re not the biggest buyer? But like, police departments will buy from data brokers. And there’s no real regulation on whether or when they do that.

So. Like, you know, information in general is valuable. (Laughs) And I mean, ironically, I think what’s kind of so fascinating to me about the model of surveillance capitalism is that, like, ads don’t work. Or like, they kinda work, but… In terms of actually proving that, like, after I look at a pair of glasses once, and then I’m followed around on the internet by the same pair of glasses for like two and a half months ‑‑ the actual success rate, of whether I actually bought the glasses? I don’t think it’s that high. But there is just enough faith in the idea of it that it continues to make lots and lots of money. It’s, like, very speculative.

OLIVIA: I actually saw an article recently that said… instead of advertising ‑‑ like, say we all paid for, like, an ad‑free internet? It would cost about $35 a month, for each of us. In terms of, like, being able to maintain internet infrastructure and pay for things, without having advertisements.

If you have an alter ego for privacy, how can you ensure it remains separate? Is facial recognition something to worry about?

OLIVIA: “If you have an alter ego account and a personal account, how do you ensure your online accounts stay completely compartmentalized and aren’t associated through your device or IP address, et cetera?” And then they say, “Is there a way to protect your face from being collected on facial recognition if you post pictures on both accounts?”

INGRID: Yeah. So we didn’t talk about facial recognition. And I kind of falsely assumed that ‑‑ it’s been so heavily talked about in the news that maybe it was sort of the thing people already knew about? But I also didn’t want to overemphasize it as a risk, because there’s so much information beyond a face that can be used when trying to identify people.

In terms of if you’re posting pictures on two different accounts… I mean, if they’re similar photos, I think the answer is, like, your face will be captured no matter what? That’s sort of a piece of it. I don’t know. Olivia, can you think of any examples of, like, mitigation of face recognition, beyond ‑‑ I mean, CV Dazzle doesn’t really work anymore. But like, in the same way that people will avoid copyright bots catching them on YouTube by, like, changing the cropping, or subtly altering a video file?

BLUNT: I just dropped a link. Have you seen this? It’s from the SAND Lab at the University of Chicago, a tool called Fawkes, and it like slightly alters the pixels so that the photo is unrecognizable to facial recognition technologies. I’m still sort of looking into it, but I think it’s an interesting thing to think about when we’re thinking about uploading photos to, like, escort ads or something like that.

OLIVIA: I think what’s difficult when it comes to, like, facial recognition is that, depending on who the other actor is, they have access to a different level of technology. Like, the consumer‑facing facial recognition software ‑‑ the stuff that’s in Instagram face filters, and the stuff that’s in the Photo Booth app on your laptop ‑‑ is really different from the kinds of tools that, say, the state would have at their disposal.

So it’s kind of like a different… I don’t know if the word “threat model” is even a good way to phrase it, because we know that like, say, for instance, the New York Police Department definitely has tools that allow them to identify pictures of protesters with just their eyes and their eyebrows.

And so, normally… if I were talking to someone who has, like, two different accounts and is interested in not being connected to both of those accounts because of their biometric data, like their face, I would normally suggest that they wear a mask that covers their whole face, honestly. Because I don’t really know of a foolproof way to avoid it, digitally, without actively, like, destroying the file. Like, you’d have to put like an emoji over it ‑‑ you’d have to physically cover your face in a way that’s irreversible for someone else who downloads the photo. Because there’s a lot of tricks online when it comes to, like, changing the lighting, and like, putting glitter on your face, and doing a lot of different stuff?

And some of those work on consumer‑facing facial recognition technology. But we don’t actually know how ‑‑ if that even works at the state level, if that makes sense.

So depending on like, who you’re worried about tracking your account… you might just want to straight up cover your face, or leave your face out of photos.
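
(A minimal sketch of the “irreversible cover” idea Olivia describes, again using the Pillow library in Python: it flattens an opaque rectangle over the face region and saves a new file, so the original pixels under the box are gone for anyone who downloads it. The coordinates and file names are placeholders.)

    # Draw a solid box over a region of a photo before posting it.
    # Unlike blurs or stickers layered on in an app, the covered pixels
    # are discarded entirely, so the cover can't be peeled back off.
    from PIL import Image, ImageDraw

    def cover_region(src: str, dst: str, box: tuple) -> None:
        """box is (left, top, right, bottom) in pixels."""
        img = Image.open(src).convert("RGB")
        ImageDraw.Draw(img).rectangle(box, fill="black")
        img.save(dst)  # Pillow's default save also drops the old EXIF

    cover_region("ad_photo.jpg", "ad_photo_covered.jpg", (120, 40, 360, 300))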

What is gait analysis and why is it important?

BLUNT: I wonder, too, do you ‑‑ if you could talk a little bit about gait analysis, and how that’s also used? Are you familiar with that?

INGRID: I don’t ‑‑ I don’t know enough about gait analysis…

OLIVIA: I know that it exists.

INGRID: Yeah. And I think ‑‑ like, it is ‑‑ and this is another thing where, in trying to figure out what to talk about for this session, figuring out like what are things that we actually know where the risks are, and what are things that… may exist, but we, like, can’t necessarily like identify where they are?

OLIVIA: I have heard of resources for people who are interested ‑‑ like, for high‑risk people who are worried about being found via gait analysis? And gait analysis is literally being identified by the way that you walk, and the way that you move. And there are people who, like, teach workshops about how to mess with your walk in a way that makes you not recognizable. And how to, like, practice doing that.

BLUNT: It’s fascinating.

Does it matter if you use popular platforms in browsers or apps?

INGRID: “If you use popular platforms like Facebook and Instagram in browsers instead of apps, does that give you a little more control over your data, or does it not really matter?”

I ‑‑ so, Olivia and Blunt, if you have other thoughts on this, please jump in. I mean, my position is that it kind of doesn’t matter, insofar as the things that Facebook, like, stores about you are things you do on Facebook. Like, it’s still tied to an account. It’s not just, you know, passive trackers happening that you could maybe use a script blocker on, and that’s cool? Things you like, and things you click on, on Facebook in the browser, are still going to be stored in a database as attached to your profile. So it doesn’t necessarily change, I think, the concerns over either of those. But.

BLUNT: I’m not totally ‑‑ I have also heard things about having the Facebook app on your phone, that it gives Facebook access to more things. Like, the terms of service are different. I’m not totally sure about it. I don’t have it on my phone.

INGRID: That’s actually ‑‑ that’s a good point. I apologize. I guess it also depends on what you’re kind of concerned about. So, one thing that Facebook really likes having information on for individual users is who else they might want to be Facebook friends with. Right? The like “People You May Know” feature, I once read, uses up a very large percentage of, like, Facebook’s infrastructure compute. Because connecting people to other people is really, really hard. And the Facebook app being on your phone does give it the opportunity to be opened up to your phone contacts, and places you take your phone. Which can expand, like, the network of people that it thinks might be in your proximity, or in your social network. Because if a phone number in your phone has a Facebook account, maybe they’ll say, like, oh, you know this person, probably!

In 2017, Kashmir Hill and Surya Mattu did a feature for Gizmodo on how it works, that was inspired by Kashmir getting recommended a friend on Facebook that was a long‑lost relative, from her like father’s previous marriage or something. There was, like, no way they would have otherwise met. So her interest partly came out of trying to figure out how Facebook could have possibly even made that connection. And Facebook wouldn’t tell them! (Laughs) Because the “People You May Know” feature is also a very powerful tool that makes them, like, an app that people want to use, in theory. They also did some follow‑up stories about sex workers being outed on their alt accounts, because the “People You May Know” feature was recommending the alt account to friends who knew the sex worker from, like, other parts of their life. And there also were examples of, like, therapists and mental health professionals being recommended their clients as Facebook friends. People who met other people in, like, you know, 12‑step meetings being recommended as Facebook friends.

So, in terms of app versus browser: Facebook won’t say whether or not information it gathers from mobile devices goes into its “People You May Know” recommendations. But based on examples like this, it seems likely that it plays a role.

So that doesn’t ‑‑ I guess, in terms of control over your data, like… I think I misunderstood the framing of the question, ’cause I guess it’s more ‑‑ it gives you slightly more control over what Facebook does and doesn’t know about you. Because if Facebook doesn’t know what you’re doing with and on your phone, that’s probably like not a bad idea.

Did that all make sense, or was that…? I don’t know.

How secure is Facebook Messenger? How secure is Instagram?

BLUNT: No, I think that made sense. Someone was asking about the Facebook Messenger app. I think the same thing sort of applies to that, ’cause it’s all connected. I don’t know if anyone has anything else to say about that.

INGRID: This is the part where I admit that I’m not on Facebook. So, I’m actually terrible at answering a lot of Facebook questions, because I don’t…

BLUNT: Yeah, I ‑‑ I also think, like, Instagram is owned by Facebook, so also having the Instagram app on your phone, I feel like, might also bring up some of the same concerns?

INGRID: Yeah. I mean, I do use Instagram, so I can remember that interface slightly better? As my point of comparison, I had sort of a dummy, like, lurker Facebook account for some research a while ago. And the difference between its attempts to connect me and suggest follows, versus Instagram’s attempts ‑‑ Facebook seemed far more aggressive, and given far less information about me, was able to draw connections that, like, didn’t seem like they should make any sense ‑‑ that were, like, very accurate. So, you know. Don’t trust Instagram, because don’t trust Facebook. But in my experience, I don’t know if it’s as central to Instagram’s business model.

BLUNT: Yeah. And I think, too, just speaking from my own personal experience, like when I have had Facebook or Instagram, I use like a tertiary alias and lock the account and don’t use a face photo on the app, just so if it does recommend me to a client they’re much less likely to know that it’s me. And like, that has happened on my Instagram account. So. My personal Instagram account.

What is Palantir and why does it matter?

INGRID: Yeah. There are several follow‑ups, but I feel like this question “Can you explain about Palantir?” has been sitting for a while, so I want to make sure it gets answered, and then come back to this a little bit. So ‑‑

OLIVIA: I can explain a little bit about Palantir. So, it’s kind of the devil. We have ‑‑ I think it’s existed for… since, like, 2014? That might be the wrong ‑‑ no, not ‑‑ I think it was 2004, actually!

INGRID: Yeah, no, it’s 2004. I just ‑‑ I accidentally stumbled into a Wall Street Journal article about them from 2009 yesterday, while researching something else, and died a little? It was like, “This company’s great! I can’t see how anything could be problematic.”

BLUNT: It’s 2003, yeah. 17 years.

OLIVIA: It was started by Peter Thiel, who is a really strong Trump supporter and is linked to this really weird, like, anti‑democracy, pro‑monarchy movement in Silicon Valley that’s, like, held by a weird circle of rich people. And they are kind of the pioneers of predictive policing, and have also assisted ICE with tracking down and deporting immigrants. And they actually recently went public ‑‑ hmm?

INGRID: They did? Wow!

OLIVIA: Yeah. They haven’t, like, turned a profit yet, in 17 years. But it was initially funded, if I’m getting this right, I’m pretty sure they were initially funded by like the venture capital arm of the CIA.

INGRID: Okay, they actually, they haven’t gone public yet, but they are planning for an IPO…

OLIVIA: Soon. Is that it?

INGRID: Yeah.

OLIVIA: Okay.

INGRID: Sorry, just ‑‑ a thing about this company is that, like, every two years, there is a flurry of press about them maybe doing an IPO, and then they don’t. And… So, yeah, sorry. Their initial funding partially came from In‑Q‑Tel, which is a venture capital firm run by the CIA that funds companies that make products the CIA might need. So… Keyhole, which was a satellite imagery, like, software company, was initially funded by In‑Q‑Tel. And that company was later acquired by Google and became Google Earth. That’s just an example of the kind of things that they fund. It’s stuff like that.

OLIVIA: Oh, and to clarify, it’s like a datamining company. So they do the same kind of thing that the case study that I showed earlier does. But they have ‑‑ they’re really good at it. And they also create tools and technologies to do more of it.

INGRID: So ‑‑ and they’re kind of a good example of the point made at the beginning of this about surveillance capitalism and state surveillance being closely intertwined. Palantir has corporate and government contracts to do datamining services. I think they were working with Nestle for a while, and Coca‑Cola. They want to be providing as many tools to businesses as they do to ICE. And those, you know, kinds of services are seen as sort of interchangeable. (Laughs)

I mean, the funny thing to me about Palantir, too, is that ‑‑ it’s not like they’re ‑‑ in some ways, I feel like I’m not even sure it’s that they’re the best at what they do? It’s that they’re the best at getting contracts and making people think they’re cool at what they do?

OLIVIA: They market themselves as like a software company, but they really just have a lot of files.

INGRID: They’re kind of ‑‑ somebody in the tech industry once described them to me as like the McKinsey of datamining? That’s a firm that ‑‑ they work with lots of governments and corporations that do things that seem to just make everything worse, but they keep making money? (Laughs) Is the best way to describe it!

So I think, in terms of, like, explaining about Palantir… I guess, they are a high‑profile example of the kind of company that is rampant throughout the tech industry. They’ve had the most cartoonish accoutrements, in so far as, you know, one of their founders is literally a vampire. And ‑‑ you know, they took money from the CIA. And their name comes from a Lord of the Rings, like, magical seeing stone. There have been documented instances of them doing egregious things, such as working with the New Orleans Police Department to develop predictive policing tools without an actual contract ‑‑ so without any oversight from the City Council, without any oversight from the Mayor’s Office ‑‑ with the funding for the project coming through a private foundation. But in terms of, like, you personally in your day‑to‑day life, should you worry about this specific company any more than you would worry about a specific state agency? It’s going to depend on your particular risk levels, but… They’re kind of ‑‑ they’re a tool of the state, but not necessarily themselves going to, like, come for people.

OLIVIA: Also ‑‑

INGRID: Oh, literally a vampire? Peter Thiel is one of those people who believes in getting infusions of young people’s blood to stay healthy and young, so. As far as I’m concerned? A vampire.

What are Thorn and Spotlight?

BLUNT: I also just wanted to talk briefly about Thorn and Spotlight, ’cause I think that ‑‑ so, Spotlight is a tool used by Thorn, which I believe Ashton Kutcher is a cofounder of? The mission of Thorn is to, quote, stop human trafficking, and what they do is use their technology to scrape escort ads and build facial recognition databases off of those ads. And so I think it’s just interesting to think about the relationship between these tools and how they collaborate with the police and with ICE, in a way that could potentially facilitate the deportation of migrant sex workers, as well.

INGRID: Okay. Sorry, let’s ‑‑ (Laughs) Yes. Ashton. Fuck him.

Will deleting the Facebook or Instagram app help?

So, one question here… “Deleting the Facebook or Insta app won’t help, right, because the info on you will have already been collected?” Not necessarily. I mean, there won’t be any more data collected? And it’s true that what’s there won’t be erased, unless you delete your account. And like, go through the steps to actually‑actually delete it, because they’ll trick you and be like “Just deactivate it! You can always come back!” Never come back…

But like, yeah. There’s ‑‑ if it’s something that, like ‑‑ you know. As you continue to live your life and go places, although I guess people aren’t going places right now… They won’t have any more material. Yeah.

What data points does Facebook have?

BLUNT: Someone asked: “If you have an existing Facebook account that only has civilian photos and you haven’t touched it for four years, it only has those data points, right?” I think that’s a good follow‑up to the previous question.

INGRID: Yeah, that’s true. Well ‑‑ there’s also people you know who have Facebook accounts? And like, Facebook has this internal mechanism for, like, tracking non‑Facebook users as, like, air quote, like, “users,” or as like entities that they can serve ads to. And generally, it’s based on, like, those people being, like, tagged in other people’s images, even if they don’t have an account, or if they have a dead account. Like, if somebody ‑‑ if you have like a four‑year‑old untouched Facebook account, and somebody tags a contemporary photo of you, like, those data points are connected.

So, you know. Whatever other people do that could connect back to that original account, or whatever ‑‑ yeah. Whatever followers or friends you have on it… Updates that they produce could kind of be new data points about you.
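A minimal sketch of the kind of linkage Ingrid describes might look like the following; the account names, fields, and graph structure are all invented for illustration, and real platform graphs are vastly more elaborate:

```python
# Hypothetical sketch: how one tag connects fresh data to a dormant account.
profiles = {
    "old_account_2016": {"photos": ["grad_photo.jpg"], "last_login": "2016-08-01"},
}

def tag_photo(profiles, tagged_account, photo, tagged_by):
    """A friend tagging you links their new photo to your dormant profile."""
    entry = profiles.setdefault(tagged_account, {"photos": [], "last_login": None})
    entry["photos"].append(photo)                        # new data point on an old account
    entry.setdefault("tagged_by", []).append(tagged_by)  # plus a fresh social edge

# The account hasn't been touched in four years, but a friend tags a photo today:
tag_photo(profiles, "old_account_2016", "beach_party_2020.jpg", "friend_account")
print(profiles["old_account_2016"])  # old and new data points, now connected
```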

Can fintech platforms connect your pay app accounts to your social accounts?

“In terms of fintech, how easy is it for companies to link your pay app accounts to social accounts?” Blunt, you might have a more comprehensive answer on this than I will.

BLUNT: Okay… Let me take a look. So, I think that there are, like, databases built off of escort ads that then get shared on the back end to sort of facilitate, like, a network‑type effect of de‑platforming sex workers. So a lot of people report ‑‑ and some of the research that we’re doing sort of confirms this ‑‑ that once you experience, like, de‑platforming or shadowbanning on one platform, you’re significantly more likely to experience it or lose access on another. So, as like a form of harm reduction and being able to keep access to those financial technologies, I suggest just having like a burner e‑mail account that you only use for that, that’s not listed anywhere else publicly, and that sounds vanilla and civilian.

So that way, if they’re, like, running ‑‑ if they’re scraping e‑mail addresses from escort ads and then selling that data to facial ‑‑ to financial technologies, that your e‑mail won’t be in there. It’s just like adding one layer of protection. It might be connected in some other ways, but… just sort of as a form of harm reduction.

I don’t know if that answered… that question.
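To make the harm‑reduction logic concrete: cross‑platform “linking” is often just a database join on a shared identifier, like an e‑mail address. A hypothetical sketch, with invented data, of why a dedicated burner e‑mail breaks that join:

```python
# Hypothetical sketch: matching accounts across services on a shared e-mail.
scraped_ad_emails = {"workname@example.com"}  # e-mails scraped from escort ads

fintech_accounts = [
    {"user": "A", "email": "workname@example.com"},       # reused work e-mail
    {"user": "B", "email": "gardening.fan@example.com"},  # burner, listed nowhere else
]

# The "link" is just a join on the shared identifier:
flagged = [acct for acct in fintech_accounts if acct["email"] in scraped_ad_emails]
print(flagged)  # only user A is linkable; the burner e-mail breaks the join
```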

And I’m looking right now for resources that specifically ‑‑ resource specifically on protecting community members in regards to ICE centering trans sex workers. I know that… Red Canary Song does some work around this, specifically with massage parlor workers, and I’m looking up the name of the organization of trans Latinx sex workers in Jackson Heights. So I will drop that link so you can follow their work, as well.

And please feel free to drop any other questions in the chat, even if it wasn’t covered. We’ll see if we can answer them, and this is your time. So, feel free to ask away.

(Silence)

Tech Journals or Websites to Follow

INGRID: “What tech update journals or websites do we follow to stay up to date?” Oh! I want to know more about what other people do, too. I tend to, like ‑‑ I tend to follow specific writers, more than specific outlets, partly because… like, there are freelancers who kind of go to different places. But also, I kind of value seeing people who, like, have developed expertise in things. So… Julia Carrie Wong at The Guardian is someone I read a lot. Melissa Gira Grant, at The New Republic. (Laughing) Is not always writing about tech, but is probably one of the smartest and sharpest and most integrity‑filled writers I know.

BLUNT: Definitely one of the first to cover FOSTA‑SESTA, for sure.

INGRID: Yeah. Yeah. And… I’ve been… Let’s see. Motherboard, in general, Jason Koebler and Janus Rose, are very well‑sourced in the industry. So I usually trust things that they cover and find. Caroline Haskins is a young reporter who used to be at Motherboard and is now at BuzzFeed and does excellent work, along with Ryan Mac. And Kashmir Hill, who is now at The New York Times, but has also written for Gizmodo and others. And who else… Davey Alba is also with The New York Times, and is really great. Those are mine.

BLUNT: I sign up for, like, Casey Newton’s daily e‑mail newsletter? And I just read that to stay up to date on certain news, and then research more. I know that the Internet Freedom Festival also has good updates, and I’m also happy to drop links to other weekly mailing lists that I sign up to, as well.

OLIVIA: Oh, I was muted! Oops. I, I listen to a lot of podcasts. And I know the, like, Mozilla IRL podcast was really good, for me, in terms of like learning more about like the internet, and specifically like surveillance infrastructure. And they have a lot of episodes. So if you’re, like, idling, and you have time to listen rather than reading. They also ‑‑ Mozilla also has their transcripts out, which is pretty nice.

Can browser bookmarks be accessed?

INGRID: “Is there any way for bookmarks on my browser to be accessed?” I believe the answer for that is gonna depend on the browser. Because ‑‑ or, it can depend on the browser? So, I think in the case of a browser like Chrome, which is owned by Google, if you are like logged into your Google account as part of, like, your browser using, I think all of that information then gets associated with your Google account. So Google will have that data. In terms of access beyond that, I’m not sure.

And then I think for other browsers, I don’t believe that that would be something that’s stored on Firefox. I’m not sure about Microsoft Edge. Olivia, do you have any other thoughts on that one?

OLIVIA: I, I don’t know…

How secure and safe is Zoom?

INGRID: Talking about safety of Zoom! Okay. Yeah. We talked a little bit about this yesterday, I think. Zoom is, you know, software that was made for like workplace calls, and is designed for that setting. Which means ‑‑ (Laughs) In some ways, like, it is inherently a workplace surveillance tool. I mean, first of all, this call is being recorded, and Zoom calls can be broadcast to livestream, like this one is. But also, the actual calls aren’t, you know, end‑to‑end encrypted. Like, they kind of can just be logged onto if they’re public; there can kind of just be some URL hacking. There are, you know, different settings you can make in terms of letting people in or out of meetings. But at the end of the day, also, Zoom has access to the calls! (Laughs) And how much you trust Zoom with that material is, you know, at your discretion.

I… (Sighs) I mean, in general, like… When it comes to, like ‑‑ like, Zoom calls are not where I would discuss, like, sensitive topics, or anything I wouldn’t want to have on the record. And that’s generally just the protocol I take with it. And I think ‑‑ I mean, that being said, like, yeah. There are… it is in such common use at this point, in terms of like spaces for events, like this one! That I won’t, like, kind of outright boycott it, simply because it’s become such a ubiquitous tool. But I think compartmentalizing what I do and don’t use it for has been helpful.

BLUNT: And so if you’re interested in staying up to date with EARN IT, I would suggest following Hacking//Hustling and Kate Villadano. I can drop their handle. And also, on July 21st, we’ll be hosting with our legal team… a legal seminar, sort of similar to this with space to answer questions, and we’re providing more information as to where EARN IT is and how you can sort of plug in on that.

Is there government regulation surrounding data tracking?

INGRID: “Is there government regulation of data tracking, or not so much?” Not so much! Yes! In the United States, there’s very little regulation.

So, one of the reasons that if you use a VPN and set it to a place like Switzerland, you get a lot more information about what tracking is happening and you can make requests for data from platforms, is because of a European Union regulation called GDPR, the General Data Protection Regulation. And, yeah. The United States does not have an equivalent to that. In some ways, because the European Union is such a large market, I have seen some companies kind of just unilaterally become GDPR‑compliant for, like, all users, simply because it’s easier than having a GDPR version and an “other places” version. But, you know, when it comes to Facebook or Instagram, or like large platforms ‑‑ they don’t really have an incentive to create conditions where they collect less data. So I think there, it’s kind of like, well, sorry. It’s gonna be that way. (Laughs) Yeah.

And it’s not as though ‑‑ and I think it is a thing that, like, lawmakers have interest in? But I think part of the challenge is… both, like, these companies are, you know, very well‑funded and will, like, seek to ‑‑ and will like lobby against further regulation? And also like a lot of people in Congress are… old! And bad at computer? And… don’t necessarily have ‑‑ sometimes have trouble, I think, wrapping their heads around some of the concepts underlying this. And, you know, and are not necessarily ‑‑ and like, I think the general atmosphere and attitude around, like, free markets solve problems! Kind of further undermines pursuit of regulation.

What exactly contributes to shadowbanning?

“In terms of social media accounts following your activity, based on your research so far for shadowbanning et cetera, who do you follow and… to certain lists?” I think, Blunt, this question might be for you, because of the research.

BLUNT: Yeah. I think it’s less about who you follow, and more about who you interact with. So, like, we’re still analyzing this research, but there seems to be a trend that, like, if ‑‑ if you’re shadowbanned, the people that you know are more likely to be shadowbanned, and there might be some relationship between the way that you interact and the platform? But we’re still figuring that out? But just like one thing you can try and ‑‑ we talked about this in another one, but having a backup account for promo tweets, so your primary account with the most followers doesn’t exhibit quote‑unquote “bot‑like activity” of automated tweets. And just having, like, casual conversation about cheese, or nature…

(Laughs) We’re not totally sure how it works.

Oh, and also! I believe her name is Amber ‑‑ I’m going to drop the link to it. But someone is doing a training on shadowbanning, and it seems like she’s been collecting data on, like, multiple accounts. And it seems like there’s some interesting things to say. So I’m going to go grab a link to that. If you’re interested in learning more on shadowbanning, that’s on the 25th, at like 5:00 p.m. I think. So I’ll drop a link.

And just for the livestream: So, this is with Amberly Rothfield, and it’s called Shadowbanning: Algorithms Explained, on July 25th at 6:00 p.m. Still a few spots left. And it looks like she was pulling data on different accounts and playing with the accounts and analyzing the posts’ interaction and traction. So that should be really interesting, too.

Cool. Thank you so much for all of the amazing questions. I think we’ll give it a few more minutes for any other questions that folks have, or any other resources we can maybe connect people with, and then we’ll log off!

Can you request your information from datamining companies?

BLUNT: Oh. Someone’s asking, can I request my information from datamining companies?

OLIVIA: Yes, you can! Yes, you can. And a lot of them… Let me see if I can find a link? ‘Cause a lot of them have, like, forms, either on their website or available where you can make requests like that. You can request to see it, and I’m pretty sure you can also ‑‑ I know you can request that they stop collecting it and that they get rid of your file. But I think you can also request to see it.

BLUNT: I also just dropped a link to this. This is an old tool that I’m not sure if it still works, but it’s Helen Nissenbaum’s AdNauseam, which clicks every single advertisement, as well as tracking and organizing all of the ads targeted at you. It’s really overwhelming to see. I remember looking at it once, and I could track when I was taking what class in school, when I was going through a breakup, just based on my searches and my targeted ads.

Cool. So, is there anything else you want to say before we log off?

INGRID: I mean, thank you all so much for participating. Thank you, again, Cory, for doing the transcription. And… Yeah! Yeah, this has been really great.

Digital Literacy Training (Part 2) with Olivia McKayla Ross (@cyberdoula) and Ingrid Burrington (@lifewinning)

Platforms and You: Digital Literacy Training (Part 2) Transcript

OLIVIA: Hi, everyone. My name’s Olivia. My pronouns are she/her. I’m co‑facilitating with Ingrid. And some of the values that this particular digital literacy/defense workshop will be centered in include cyber defense, less as a form of military technology, right? Reframing cryptography as more of an abolitionist technology. Right? And cyber defense as an expression of mutual care and a way of accumulating community‑based power. And in that way, also thinking of ways to teach this type of material that are antiracist, but also anti‑binary and pro‑femme.

And so, we’re really ‑‑ we really care a lot about making sure that this is trauma‑informed and teaching from a place of gentleness, considering the previous digital harm people have experienced and trying not to relive it. So if you need to take a break, remember that this is being recorded and posted online so you will be able to access it later.

INGRID: Great. Thank you, Olivia. My name’s Ingrid. I use she/her pronouns. And welcome back to people who were here yesterday. Today, we are talking about platforms! And in this context, we primarily mean social media sites like Facebook and Instagram. Some of this, you know, can be applied to contexts where people kind of buy and sell stuff. But essentially, we’re talking about places where people make user accounts to communicate with each other ‑‑ with kind of more of a focus on the large corporate ones that many people are on!

There were four sort of key concepts we wanted to cover. There’s a lot in them, so we’ll try to move through them smoothly. The first is algorithmic curation and the way it can produce misinformation and content suppression. Then some of the laws and legal context that are defining decisions that platforms make. We talked a little bit about this yesterday, but, you know, reiterating again: Platforms are companies, and a lot of decisions they make come out of being concerned with keeping a company alive, more than taking care of people.

What is algorithmic curation and why does it matter?

So we’re going to start with algorithmic curation. And I think there’s a thing also that came up yesterday was a tendency for technical language to kind of alienate audiences that don’t know as much about computers or math, I guess. An algorithm is a long word that ‑‑ (Thud) Sorry. That’s the sound of my dog knocking on my door, in the background.

Broadly speaking, an algorithm is a set of rules or instructions ‑‑ (clamoring) Excuse me. One second. She just really wants attention. I’m sorry you can’t see her; she’s very cute!

But… An algorithm is a set of rules or instructions for how to do a thing. You could think of a recipe or a choreography. The difference between an algorithm used in the context of a platform and an algorithm that contains, you know, ingredients for a recipe is that there is a lot less flexibility of interpretation in a platform’s algorithm. And it’s usually applied on a much larger scale.

And what a lot of platforms… deploy algorithmic curation for, and what algorithmic curation is experienced as, is often recommendation algorithms? And algorithms that determine what content is going to show up in a social media timeline.

So I am ‑‑ you know, I have recently been watching Avatar: The Last Airbender on Netflix. I am 33 years old. And… (Laughs) I found that, you know, Netflix wants to make sure that I know they have lots of other things that I might like because I liked that show. Right? And you could kind of think of algorithms as kind of being if this, then that rules. Like, if somebody watches this show, look at all the other people who watched that show and the other shows that they watched, and suggest that, you know, you probably will like those.

And platforms give the rationale for deploying these kinds of algorithms as partly just trying to help people? Right? Like, discover things, because there’s so much content, and you’ll get overwhelmed, so we prioritize. What it actually means in practice is trying to keep you using a service. Right? Like, I’m probably going to cancel my Netflix account once I finish Avatar, so. But oh, like no, now I gotta watch The Dragon Prince. Right?
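A toy version of that “people who watched this also watched that” logic might look like the sketch below. The watch histories are invented, and real recommender systems are far more complicated, but the if‑this‑then‑that shape is the same:

```python
from collections import Counter

# Hypothetical watch histories: which shows each user has watched.
histories = {
    "user1": {"Avatar", "The Dragon Prince", "She-Ra"},
    "user2": {"Avatar", "The Dragon Prince"},
    "user3": {"Avatar", "Korra"},
}

def recommend(histories, user):
    """If this user watched X, suggest what other X-watchers also watched."""
    mine = histories[user]
    counts = Counter()
    for other, theirs in histories.items():
        if other != user and mine & theirs:  # any overlap in taste
            counts.update(theirs - mine)     # count shows this user hasn't seen
    return [show for show, _ in counts.most_common()]

print(recommend(histories, "user3"))  # ['The Dragon Prince', 'She-Ra']
```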

I think… Do I do this part, or Olivia?

OLIVIA: I can do it?

INGRID: Sorry! I couldn’t remember how we split up this section.

OLIVIA: I… So… In early social media, we didn’t really have super complicated algorithms like the ones we do now. You had the, like, find‑your‑friends algorithms that would basically show you, perhaps, the friends of your friends. But the people you follow were mostly the only people whose posts you would see.

But now that we’re able to collect more user data about how you’re using the platform, as well as your activities off the platform, now algorithms are able to become more complicated, because there’s so much more information that they’re able to use.

So some of the things that might be going into your algorithmic curation are listed here. It’s a really long list, and even this long list isn’t exhaustive of everything that might be factoring into the algorithm? ’Cause so few platforms actually disclose what the things are that contribute to the stuff that you see, and what you don’t see, and who’s seeing your own content, and who doesn’t see your own content. But one thing that we know for sure is that the way that these platforms are designed is specifically in order to make money. And so following that motive, you’re able to kind of map a lot of their predicted behavior.

And one of the really big consequences of these, like, algorithmic filter bubbles is misinformation. Right? So because we’ve all been inside for the past couple of weeks and months, we’re all really susceptible to seeing really targeted misinformation, because we’ve been online a lot. And so it’s quite possible that more data is being collected about you now than ever before. Platforms make money off of our content, but especially content that encourages, like, antisocial behaviors. And when I say antisocial behaviors, I mean antisocial as opposed to pro‑social behaviors. The pro‑social category encourages a healthy boundary with social media, like light to moderate use. Comforting people! Letting people know that they rock! Right? Cheering people up. Versus antisocial behaviors ‑‑ while they’re much less healthy, they encourage people to use social media like three times as much. Right? People spreading rumors; people posting personal information; people being ignored or excluded; editing videos or photos; saying mean things. Right? And so that makes an environment where misinformation does super well, algorithmically.

Through their design, especially platforms like Instagram and Twitter, they prioritize posts that receive lots of attention. We see this like how people ask others to “like” posts that belong to particular people so that they’ll be boosted in the algorithm. Right? They prioritize posts that get a lot of clicks and that get a lot of like feedback from the community. And it’s really easy to create misinformation campaigns that will take advantage of that.
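As a caricature of that design: if a feed (hypothetically) scored posts by raw engagement, the content that provokes the most reactions would win the timeline, whether or not it’s true or kind. The posts and scoring weights below are invented for illustration:

```python
# Hypothetical engagement-ranked feed; the weights are made up for illustration.
posts = [
    {"text": "you rock, friend!",  "likes": 12, "replies": 2,  "shares": 1},
    {"text": "outrageous rumor!!", "likes": 40, "replies": 90, "shares": 55},
    {"text": "photo of my lunch",  "likes": 5,  "replies": 1,  "shares": 0},
]

def score(post):
    # Replies and shares signal "attention," so this toy ranker weights them
    # heavily, regardless of whether the attention is kind or cruel, true or false.
    return post["likes"] + 3 * post["replies"] + 5 * post["shares"]

feed = sorted(posts, key=score, reverse=True)
print([p["text"] for p in feed])  # the rumor tops the timeline
```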

OLIVIA: Nice. That was a really quick video from the Mozilla Foundation. But I wanted to clarify that there’s this assumption that people who fall for misinformation are like kinda dumb, or they’re not like thinking critically. And this is like kind of a really ableist assumption, right? In truth, anyone could unknowingly share misinformation. That’s like how these campaigns are designed, right? And there’s so many different forms that misinformation takes.

It could be like regular ole lies dressed up as memes; fabricated videos and photos that look super real, even though they’re not; performance art and like social experiments? (Laughing) Links to sources that don’t actually point anywhere? And it could even be information that was originally true! But then you told it to your friend, who got the story kind of confused, and now it’s not true in a way that’s really, really important. And of course, there’s also conspiracy theories, and misleading political advertisements, as well.

But sometimes, misinformation is less about being not told ‑‑ being told a lie, and more about not being told the truth, if that makes sense.

So, the easiest way to avoid misinformation is to just get in the habit of verifying what you read before you tell someone else. Even if you heard it first from someone that you trust! Right? Maybe one of your friends shared misinformation. But my friend is a really nice, upstanding citizen! Right? There’s no way that… I don’t know; being a citizen doesn’t matter. My friend is a nice person! But people who share misinformation aren’t always doing it to stir the pot. They just got confused, or they just… ended up in a trap, really.

So, fact‑check the information that confuses you, or surprises you. But also fact‑check information that falls in line with your beliefs. Fact‑check all of it. Because you’re more likely to see misinformation that falls in line with your beliefs because of the algorithmic curation that we talked about before. Right? We have an internet that’s like 70% lies.

So, two sites that were pretty popular when I asked around how people fact‑checked were PolitiFact and Snopes.com. You could also use a regular search engine. There’s Google, but also try using DuckDuckGo at the same time. You could ask a librarian. But also, if you look at a post on Instagram or Twitter and scroll through the thread, there might be people saying, like, hey, this isn’t true; why’d you post it? So always be a little bit more thorough when you are interacting with information online.

How does algorithmic curation contribute to content suppression and shadowbanning?

INGRID: So the next sort of thing we wanted to talk about that’s a, you know, consequence of algorithmic curation and companies, like, platforms being companies, is suppression of content on platforms. Right? Platforms have their own terms of service and rules about what people can and can’t say on them. And those terms of service and rules are usually written in very long documents, in very dense legal language that can make it hard to understand when you break those rules, and are kind of designed to, you know, be scrolled through and ignored.

But because a lot of the decisions about what is, like, you know, acceptable content or unacceptable content are, again, being made by an algorithm looking for keywords, for example… the platforms can kind of downgrade content based on assumptions about what’s there.

So… shadowbanning is a concept that I imagine many of you have heard about or, you know, encountered, possibly even experienced. It actually originally is a term that came from like online message groups and forums. So not an automated algorithm at all. Basically, it was a tool used by moderators for, you know, forum members who liked to start fights, or kind of were shit‑stirrers, and would basically be sort of a muting of that individual on the platform. So they could, you know, still post, but people weren’t seeing their posts, and they weren’t getting interaction, so they weren’t getting whatever rise they wanted to get out of people.

Today, the more common kind of application of the term has been describing platform‑wide, essentially, muting of users from, like, the main timeline, or making it hard to search for that individual’s content, based on what is thought to be automated interpretation of content. I say “what’s thought to be automated interpretation of content,” because there is a lot that is only kind of known about what’s happening on the other side of the platform. Again, yeah, what it often looks like is not showing up in search unless someone types the entirety of a handle; even if you follow that person, that person’s content not showing up in the main timeline, like in their followers’ feeds; not showing up in a hashtag…

And, shadowbanning is like a really gaslighting experience? Because it’s hard to know: is it that people just don’t like what I’m saying, or people just don’t care anymore, or am I being actively suppressed and people just can’t see me? And if it’s something that has happened to you, or is happening to you, one thing that is important to remember is, like, you will feel very isolated, but you are in fact not alone. This is a thing that happens. It’s been, over time, kind of dismissed by platforms as myth, and I wonder if, in some ways, perhaps their aversion to it comes from associating it with this less automated context? Because it’s like, well, we’re not deliberately trying to mute anybody; it’s just our systems kind of doing something! But the systems are working ‑‑ you know, they designed them, and they’re working as designed. Right?

Instagram recently, in making an announcement about work that they want to do to address sort of implicit bias in their platform, sort of implicitly acknowledged that shadowbanning exists. They didn’t actually use the term? But it is interesting to see platforms acknowledging that there are ways that their tools will affect people.

In terms of the “what you can dos” and ‑‑ Blunt, if you have anything that you want to add to that, I’d totally be happy to hear because I’m far from an expert. It’s a lot of what the sort of like best practices tend to be based on what other people have shared as like working for them. So basically, I don’t want to tell you anything and say like this is a guarantee this will like work for you in any given context. One thing that I have seen a lot is, basically, posting really normie content? Like, just going very off‑script from whatever your normal feed is, and doing something like, I don’t know, talking about your pet, or having ‑‑ you know, talking about like cooking. Basically just like changing what you’re doing. Another approach is getting your friends and followers to engage with your content, so that it’s seen as popular, so that it will like return to the timeline.

Blunt, is there anything else that you would want to include in there?

BLUNT: Yeah, I think something that communities have found to be useful is that if you are going to be automating posts, to do it on a backup account, so that what’s flagged as bot‑like behavior is ‑‑ so your promo account might be shadowbanned, but you might have a wider reach to direct people to where to give you money. But it’s a really complex topic. I’ve been thinking about it a lot right now ‑‑ Hacking//Hustling is currently studying shadowbanning. So far, we’ve found our data backs up a lot of what sex workers know to be true about how shadowbanning sort of works, what seems to trigger it and what seems to undo it. But as I was making a thread about the research, which included both the words “sex worker” and “shadowban,” I was like, I don’t even know if I can say either of these words without being shadowbanned! So I write it with lots of spaces in it, so hopefully the algorithm won’t recognize it, which also makes it inaccessible to anybody using a screen reader.

So, I don’t know. I know there was a class on how to reverse a shadowban, but I also think that after the global protests started that the algorithm changed a little bit, because we were noticing a lot ‑‑ a higher increase of activists and sex worker content suppressed in the algorithm.
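A side note on the spacing trick: if a platform’s filter were a naive keyword match (an assumption ‑‑ platforms don’t disclose how their filters actually work), spaced‑out text would slip past it, but it also defeats screen readers, which is exactly the accessibility trade‑off described above:

```python
# Hypothetical naive keyword filter, to show why spaced-out text slips past it.
FLAGGED_TERMS = {"shadowban", "sex worker"}

def naive_filter(text):
    """Flag a post if it contains any flagged term as a plain substring."""
    lowered = text.lower()
    return any(term in lowered for term in FLAGGED_TERMS)

print(naive_filter("our research on sex worker shadowbans"))  # True: flagged
print(naive_filter("our research on s e x w o r k e r s"))    # False: slips past
# ...but a screen reader will also read the spaced version letter by letter.
```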

INGRID: Yeah. That’s ‑‑ do you know when you’re going to be putting out some of the research from ‑‑ that Hacking//Hustling’s been doing?

BLUNT: Yeah, we just tweeted out a few of our statistics in light of the recent Twitter shenanigans, and… (Laughs) Some internal screenshots being shared, where they say that they blacklist users? Which is not a term I knew that they used, to describe this process. We’re in the initial analysis of the data stages right now, and we’ll probably ‑‑ our goal is to share this information primarily with community, so we’ll be sharing findings as we are able to, and then the full report will probably come out in like two to three months.

Can algorithms judge video content?

INGRID: “Have you found that the algorithm can judge video content? I know nudity in photos are flagged.” I would defer to Blunt on this question, actually.

BLUNT: I would say, yeah. I’ve had videos take ‑‑ I have lost access to YouTube from videos. So I think anything that you post with a… either a link… for sex work, or just links in general and photos are more likely to be flagged. So, like, personally, I notice my posts that are just text‑based show up higher and more frequently in the algorithm and on the feed.

Which laws and politics surround content suppression?

INGRID: Mm‑hmm… yeah. So the other kind of form of suppression we wanted to mention and talk about is not as algorithmic. It’s when, you know, the state gets involved.

So platforms are companies; companies are expected to follow rules; rules are made by governments. Sometimes, it’ll kind of look like shadowbanning. So TikTok has been reported to basically down‑rank certain kinds of content on the site, or, like, not have it show up in a “For You” page, or on your follow page, depending on a country’s laws around homosexuality. Sometimes it’s, you know, a result of companies creating rules that are sort of presented as being about national security, but are actually about suppressing dissent. So in Vietnam and the Philippines, there have been rules made that can treat the contents of social media posts as, you know, potential threats against the state, basically. And sometimes the rules about protecting the vulnerable are actually about, you know, some moral majority bullshit. Which seems like a good time to start talking about sort of legal contexts!

And a lot of this is ‑‑ all of this particular section is really USA contexts. And I feel like I should ‑‑ I wanted to kind of give some explanation for that, because I feel weird doing this like broad sweep on, like, other kind of like countries’ approaches and focusing so much on the United States. But the reason for doing that is, basically, America ‑‑ as, you know, an imperialist nation! Tends to have an outsized impact on what happens on global platforms, overall. And there’s, you know, two reasons for that; one is that most of these companies are located in the United States, like their headquarters are here, so they are beholden to the laws of the place; but secondly, it’s also about sort of markets. Right? Like, the ‑‑ if you, you know. Like, if Facebook is like, we don’t need the American consumer base! Like, it’s probably going to affect their ability to make money.

And there are exceptions in terms of the ways that other law impacts platforms’, like, structure and decisions. We talked a little bit yesterday about European privacy laws, but we’ll try and bring a little more in tomorrow about those.

First kind of category is like ‑‑ this is a little bit of a tangent, but it came up yesterday, so I wanted to kind of mention it. This is an image from the account shutdown guide that Hacking//Hustling made, that I did some work on. And basically, platforms that, you know, can facilitate financial transactions, which can be something, you know, like Stripe, PayPal, or Venmo, but, you know… Basically, they have to work with banks and credit card companies. And banks and credit card companies can consider sex work‑related purchases to be like “high risk,” despite there being very little evidence that this is true? The reason sometimes given is the possibility of a charge‑back? Meaning, you know, hypothetically, heteronormative sitcom scenario, that I don’t want my wife to see this charge on my bill! So reports it, and it gets taken off. How much this is actually the case? Unclear. It’s also, like, they’re just kind of jerks.

But, you know, platforms don’t actually have a lot of ability to, like, actually argue with these companies? Because they control the movement of money. Around, like, everywhere? So, in some ways, they kind of just have to fall in line. I mean, that being said, companies themselves are also, like, kinda dumb. I wasn’t sure whether this needed to be included, but this Stripe blog post explaining why certain businesses aren’t allowed? They have a section on businesses that pose a brand risk! And they have this whole thing about, like, oh, it’s our financial partners who don’t want to be associated with them! It’s not us! But, you know, like, fuck out of here, Stripe.

What is section 230?

Back to other laws! (Laughing) So. Section 230 is a term that maybe you’ve heard, maybe you haven’t, that describes a small piece of a big law that has a very large impact on how platforms operate and, in fact, on the fact that platforms exist at all. So in the 1990s, lawmakers were very stressed out about porn on the internet. Because it was 1996, and everyone, you know, didn’t know what to do. And a bill called the Communications Decency Act was passed in 1996. Most of it was invalidated by the Supreme Court? Section 230 was not. It’s part 230 of it; it’s a very long bill. It’s really important for how platforms operate, because it says that platforms, or people who run hosting services, are not responsible when somebody posts something illegal or, you know, in this case, smut. I, I can’t believe that there was a newspaper headline that just said “internet smut.” It’s so silly… But the platform, the hosting service ‑‑ they’re not responsible for that content; the original poster is responsible. Like, if you wanted to sue someone for libel, you would not sue the person who hosted a libelous website; you would sue the creator of the libelous website.

And this was initially added to the Communications Decency Act because there was concern ‑‑ really because of capitalism! There was concern that if people were afraid of getting sued because somebody, you know, used their services to do something illegal, or used their services to post something that they could get sued for, that people would just not go into the business! They would not make hosting services. They would not build forums or platforms. And so removing that kind of legal liability… opened up more space for platforms to emerge. In some ways, it’s a fucked up compromise, in so far as it means that when Facebook does nothing about fascists organizing on their platforms and fascists actually go do things in the world, Facebook can’t be held responsible for it. Right? I mean, the Charlottesville rally in 2017 started on Facebook. Facebook obviously got some bad PR for it, but, you know. Then again, exceptions carving out where platforms are responsible for this or that… tend not to be made around meaningfully supporting people with less power, but usually around what powerful people think are priorities. Such as the first effort, in 2018, to change or create exceptions to Section 230. Which was FOSTA‑SESTA!

What is FOSTA-SESTA?

It was sold originally as fighting trafficking? The full ‑‑ FOSTA and SESTA are both acronyms. FOSTA is the Allow States and Victims to Fight Online Sex Trafficking Act. SESTA is the Stop Enabling Sex Traffickers Act. But the actual text of the law uses the term, “promotion or facilitation of prostitution and reckless disregard of sex trafficking.” So basically, it’s kind of lumping sex work into all sex trafficking. Which… Yeah. That’s ‑‑ not, not so wise.

And what it essentially creates is a situation where companies that allow prostitution, or facilitation of prostitution, and reckless disregard of sex trafficking to happen on their platform? Can be held legally responsible for that happening. The day that FOSTA and SESTA were signed into law, Craigslist took down the Personals section of its website. It has generally heightened scrutiny of sex worker content across platforms, and made it a lot harder for that work to happen online.

What is the EARN IT Act?

And in some ways, one of the scary things about FOSTA‑SESTA is the way in which it potentially emboldens further attempts to create more overreaching laws. The EARN IT Act is not a law, yet. It is one that is currently being… discussed, in Congress. The way that it’s been framed is as a response to an investigative series at the New York Times about the proliferation of sexual images of children on platforms. And this is a true thing. Basically, any service that allows uploading of images has this problem. Airbnb direct messages can be, and are, used. And it’s a real thing. But the actual bill is a very cynical appropriation of this problem, with a solution that really serves more to kind of control and contain how the internet, like, works.

It proposes creating a 19‑member committee of experts, headed by the Attorney General, who would issue best practices for companies and websites, and it would allow those that don’t follow the best practices to be sued. And what “best practices” actually means is currently very vague in the actual text of the bill. The word “encryption” does not actually appear in the text of the bill, but its authors have a long history of being anti‑encryption. The current Attorney General, Bill Barr, has expressed wanting back doors for government agencies so that they can look at encrypted content. And likely, you know, it’s thought it could include “best practice” things like making it easier for the government to spy on content.

This is ‑‑ you know. I know somebody who worked on this series, and it is so frustrating to me to see that effort turn into, how about we just kill most of what keeps people safe on the internet?

So I mention it because this is something that is good to pay attention to, and to write your Congress member about. Hacking//Hustling has done ‑‑

What is encryption?

Oh, Blunt would like me to define encryption. So it’s a mechanism for keeping information accessible only to people who know how to decode it. It is a way of keeping information safe, in a way! Encryption was not inherently part of the early internet, because the internet was originally created by researchers working for the government who thought it would just be government documents moving around it, so they were all public anyway. But it has since been kind of normalized into a part of just using the internet as we know it today. In this context, it basically means that if I want to send you a message, the only people who can read that message are, like, you and me ‑‑ and not the service that is moving the message around, or, like, the chat app that we’re using.

That was ‑‑ I feel like that was a little bit garbled, but… I don’t know if you like ‑‑ if, Olivia, is there anything that you would want to add to that? Or a better version of that? (Laughs)

OLIVIA: I think, I think you’ve mostly said it, in terms of it’s like a way of like encoding information so that ‑‑ someone might know the information is present, but they don’t know what it says. So, when we have things like end‑to‑end encryption on the internet, it means that something is encrypted on my side, and no matter, like, say what third party tries to look at the message that I sent to you while it’s in transit, it can’t be seen then, and it also can’t be seen by them on the other side, because the person who I sent the message to has their own, like, code that allows them to decode the message that’s specific to them. And this happens on a lot of platforms without our knowledge, in the sense that apps that are end‑to‑end encrypted, like Signal, they don’t really tell you what your key is. Even though you have one, and the person that you’re talking to has one, it’s not like you’re encoding and decoding yourself, because the math is done by other things.

But if the bill goes out of its way to exclude encryption, then it might make it potentially illegal for these services to exist, which would be a really bad thing for journalists and activists and sex workers and, like, everybody.

INGRID: Yeah. And additionally ‑‑ I mean, within the world of people who work on encryption and security tools, the idea of creating a back door, or some way to sneakily decrypt a thing without somebody knowing, is that it creates a vulnerability that essentially anyone else could exploit. Like, if it exists there, somebody will hack it and figure it out.

OLIVIA: There’s no such thing as a door that only one person can use.
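To ground those definitions, here is a minimal sketch of end‑to‑end encryption using the PyNaCl library; the library choice and the message are illustrative assumptions, not something named in the discussion. Each side holds a key pair, and whatever relays the message sees only ciphertext:

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at 7, usual place")

# Whatever service relays the message sees only opaque bytes:
print(ciphertext.hex()[:32], "...")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at 7, usual place'
```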

What’s the connection between the EARN IT Act and The New York Times?

INGRID: A question ‑‑ EARN IT is not solely a response to an article by the New York Times? It was a series of seven articles. And when I say “in response,” that is the statement made by the people who wrote the bill. I think that it was more that EARN IT was proposed by some Congress people who saw an opportunity to cheaply exploit outrage over, like, abuse of children, to put forward some policies that they would want to have happen anyway. And I think the reason I mention it is because it’s important to acknowledge that, yeah, it was an entire series from the New York Times. And honestly, the main takeaway from that series to me was more that, like, companies are dropping the ball? Not that we need the government to come in ‑‑ or, like, if there’s supposed to be government making rules about how companies address this issue, I don’t think that the solution is to create a committee that pursues, like, telling the companies what to do in this way that doesn’t actually seem to have anything to do with the actual problem they’re talking about.

BLUNT: Totally. And we actually ‑‑ I just want to also say that on the 21st, Hacking//Hustling will be hosting a legal literacy panel, where we will be talking about the ways that fear and threats to national security are used to pass… laws that police us further, that want to end encryption, that want to do away with our privacy. So if you check out HackingHustling.org slash events, I think, you should be able to find out more about that. Again, that’s at 7:00 p.m. on the 21st. You’ll be able to learn a lot more. We’ll do an update on EARN IT, where to look for updates, and similar legislation that’s being passed.

INGRID: I did see ‑‑ there was like a ‑‑ I saw an article that said a bill was being worked on, basically in response to EARN IT, trying to say, like, yes, this problem you’re claiming that you’re going to address, it’s bad, but this is not the way to do it, and trying to come up with an alternative. I think Ron Wyden was involved. Do you know anything about this?

BLUNT: Yeah, I think that’s ‑‑ yes. I mean, yes, we will talk about that on the 21st. I’m not ‑‑ we will have our legal team talk about that, so I don’t say the wrong thing.

INGRID: Okay, great. Moving forward!

What are some secure and private platform alternatives?

Olivia, do you want to do the platform alternatives? I feel like I’ve just been talking a lot!

OLIVIA: Sure! So, it kind of sucks that we’re all kind of stuck here using… really centralized social media platforms that we don’t control, and that kind of, in like nefarious and really complicated ways, sometimes control us. And so you might be thinking to yourself, gee, I wish there was something I could use that wasn’t quite Instagram and wasn’t quite Twitter that could let me control information.

So, we have some alternatives. One of these alternatives is called Mastodon. And… Essentially, it’s a independent ‑‑ is that the word? I think the word is ‑‑

BLUNT: An instance?

OLIVIA: It’s an instance! There you go. It’s an instance of… Oh, no, I don’t think that’s the word, either.

Basically, Mastodon is a very ‑‑ is a Twitter‑like platform that’s not Twitter, and instead of going on like a centralized place, you can set up your own Mastodon instance for your community. So instead of having ‑‑ like, you might have Mastodon instances that are called other names? Kind of like ‑‑ would a good analogy be like a subreddit?

INGRID: Maybe. I think, like ‑‑ so, Mastodon also comes from a project to create, like, open standards for social networking tools. I think we talked a little bit about the sort of standardizing of browsers and web content. And in the last decade, one standard that’s been in development is an open standard of what, like, a social network should do and could be. The protocol is actually called ActivityPub, and Mastodon is built on top of it. The term used for how they’re actually set up is, like… “fed‑rated.”

OLIVIA: Federated!

INGRID: Yeah. You set up one that’s hosted on your own. And it can connect to other Mastodon sites that other people run and host. But you have to decide whether or not you connect to those sites. And ‑‑ sorry, I can jump off from here, ’cause I think the next part was just acknowledging the, like, limitations. (Laughs) So, this is a screenshot of Switter, which had been set up as a sex work‑friendly alternative to Twitter, after FOSTA‑SESTA. And… It has run into a lot of issues with staying online because of FOSTA‑SESTA. I think Cloudflare was originally their hosting service, and they got taken down, because the company that was hosting it didn’t want to potentially get hit with, you know, like, liabilities, because FOSTA‑SESTA said you were facilitating sex trafficking or some shit.

So it’s, it’s not a, necessarily, like, obvious ‑‑ like, it’s not easy, necessarily, to set up a separate space. And whether setting up a separate space is what you want is also, like, a question.
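A caricature of that federation model, with invented class and instance names ‑‑ this is not ActivityPub’s actual protocol, just the shape of the idea:

```python
# Hypothetical sketch of federation: each instance chooses its peers.
class Instance:
    def __init__(self, name):
        self.name = name
        self.peers = set()   # instances this one has chosen to federate with
        self.timeline = []

    def federate_with(self, other):
        # Connecting is a deliberate, mutual choice, not a default.
        self.peers.add(other)
        other.peers.add(self)

    def post(self, text):
        self.timeline.append((self.name, text))
        for peer in self.peers:              # deliver only to chosen peers
            peer.timeline.append((self.name, text))

ours = Instance("our.community")
friendly = Instance("friendly.social")
hostile = Instance("hostile.example")

ours.federate_with(friendly)                 # hostile.example is never peered
ours.post("hello, fediverse")
print(friendly.timeline)                     # receives the post
print(hostile.timeline)                      # sees nothing
```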

OLIVIA: Another option is also… Say you have a community that’s on Instagram, or on Twitter, and you guys are facing a lot of algorithmic suppression, and you’re not able to, like, reliably communicate with the people who like your page. You could also split it both ways. You could try having an additional way of communicating with people. So you might have like a Twitter page where you have announcements, but then have a Discord server or something where you communicate with community members, or similar things.

And those types of interventions would essentially allow you to avoid certain types of algorithmic suppression.

INGRID: Yeah. And in a way, the construction of an alternative, it’s, I think… the vision probably is not to create, like, a new Facebook, or a new, you know, Twitter, or a new Instagram, because you will just have the same problems. (Laughs) Of those services. But rather to think about making sort of intentional spaces, like, either ‑‑ like, within, you know, your own space. This is a screenshot of RunYourOwn.social, which is a guide created by Darius Kazemi about ‑‑ you know, what it is to create intentional online spaces. I just find it really, really useful in thinking about all this stuff.

All right. Those were all our slides…

BLUNT: I actually just wanted to add one little thing about that, just to follow up on those previous two slides. I think it’s important to note, too, that while there are these alternatives on Mastodon and elsewhere, that’s often not where our clients are? So I think that it can be helpful for certain things, but the idea that entire communities and their clients will shift over to a separate platform… It isn’t going to, like, capture the entire audience that you would have had if you had the same access to these social media tools that your peers did. So one thing that I’ve been recommending for folks to do is mailing lists ‑‑ I think they can be really helpful in this, too ‑‑ to make sure that you have multiple ways of staying in touch with the people that are important to you, or the people that are paying you. Because we don’t know what the stability of a lot of these other platforms is, as well.

INGRID: Yeah.

OLIVIA: E‑mail is forever.

BLUNT: Yeah.

INGRID: Yeah, that’s a really, really good way to ‑‑ you know, point. And thank you for adding that.

Okay! So I guess… Should we ‑‑ I guess we’re open, now, for more questions. If there’s anything we didn’t cover, or anything that you want kind of more clarification on… Yeah.

I see a hand raised in the participant section, but I don’t know if that means a question, or something else, or if… I also don’t know how to address a raised hand. (Laughs)

BLUNT: Yeah, if you raise your hand, I can allow you to speak if you want to, but you will be recorded, and this video will be archived. So, unless you’re super down for that, just please ask the questions in the Q&A.

What is Discord and how secure is it?

Someone asks: Can you say more about Discord? Is it an instance like Switter or Mastodon? What is security like there?

OLIVIA: So Discord is a ‑‑ is not an instance like Switter and Mastodon. It’s its own separate app, and it originated as a way for gamers to talk to each other? Like, while they’re playing like video games. And so there’s a lot of, a lot of the tools that are currently on it still make kind of more sense for gamers than they do for people who are talking normally.

A Discord server isn’t really an actual server; it’s more so a chat room that can be maintained and moderated.

And security… is not private. In the sense that all chats and logs can be seen by the folks at, like, Discord HQ. And they say that they don’t look at them? That they would only look at them in the instance of, like, someone complaining about abuse. So, if you say, like, hey, this person’s been harassing me, then someone would look at the chat logs from that time. But it’s definitely not a secure platform. It’s not end‑to‑end encrypted, unless you use, like, add‑ons, which can be downloaded and integrated into a Discord experience. But it’s not out of the box. It’s mostly a space for, like, communities to gather.

Is that helpful…?

INGRID: “Is the information on the 21st up yet, or that is to come?” I think this is for the event ‑‑

BLUNT: Yeah, this is for July 21st. I’ll drop a link into the chat right now.

What are some tips for dealing with misinformation online?

INGRID: “How would you suggest dealing with misinformation that goes deep enough that research doesn’t clarify? Thinking about the ways the U.S. uses misinformation about current events in other countries to justify political situations.” (Sighs) Yeah, this is ‑‑ this is a hard one. The question of just ‑‑ yeah. The depths to which misinformation goes. I think one of the really hard things about distinguishing and responding to misinformation right in this current moment… is that it is very hard to understand who is an authoritative source to trust? Because we know that the state lies. And we know that the press follows lies! Right? Like, I imagine some of you were alive in 2003. Maybe some of you were born in 2003. Oh, my goodness.

(Laughter)

I ‑‑ again, I feel old. But… Like, the ‑‑ and you know, it’s not even ‑‑ like, you can just look at history! Like, there are… there are lots of legitimate reasons to be suspicious! Of so‑called authoritative institutions.

And I think that some of the hard thing with ‑‑ with, like… getting full answers, is finding a space to, like, kind of also just hold that maybe you don’t know? And that actually maybe you can’t know for sure? Which is to say ‑‑ okay, so one example of this. So, I live in New York. I don’t know how many of you are based near here, or heard about it, but we had this fireworks situation this summer? (Laughing) And there was a lot of discussion about, like, is this an op? Is this some sort of, like, psychological warfare being enacted? Because, like, there were just so many fireworks. And, you know, it’s also true that fireworks were really cheap, because fireworks companies didn’t have their usual jobs to do. I, personally, was getting lots of promoted ads to buy fireworks. But at the end of the day, the only way that I could safely manage my own sense of sanity with this was to say: I don’t know which thing is true. And neither of these things addresses the actual thing that I’m faced with, which is loud noise that’s stressing out my dog.

And so I think that some ‑‑ I think the question with, like, misinformation about sort of who to trust or what to trust, is also understanding, like… based on like what I assume, what narrative is true or isn’t true, what actually do I do? And… How do I kind of, like, make decisions to act based on that? Or can I act on either of these?

I guess that’s kind of a rambly answer, but I think ‑‑ like, there isn’t always a good one.

BLUNT: I just dropped a link to Yochai Benkler, Robert Faris, and Hal Roberts’ Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. I think it’s from 2018? I think it’s a really interesting read if you’re interested in learning more about that.

INGRID: There are two other questions, but I just want to quickly answer: What happened in 2003 is America invaded Iraq based on pretenses of weapons of mass destruction that didn’t exist. And news outlets reported that with no meaningful interrogation. (Laughs) Sorry.

What’s going on with TikTok and online privacy right now? Is it worse than the EARN IT Act?

OLIVIA: Re: TikTok… It’s a really confusing situation, because most places, especially a lot of cyber security experts on the internet, have been saying to delete TikTok? But a lot of the reasons being given kind of boil down to: it’s a Chinese app. Which is really xenophobic. But TikTok does track a lot of information about you. What it uses it for, mostly, is to send you really, really hyper‑specific TikToks. But that information is being collected about you, and it exists in their hands. So I think it’s mostly a decision for individuals to make about whether they’re going to trust TikTok with their information in that way. Because they absolutely know where you live, and they definitely know whatever things about you they’ve gathered in order to create the TikTok algorithm that shows up in your feed. Those things are true. So.

I think ‑‑ Ingrid, do you have anything to say on that?

BLUNT: You’re still muted, Ingrid, if you’re trying to talk.

INGRID: Oh, sorry. I… The question also asked, you know, if things like the data collection on platforms like TikTok are worse than things like EARN IT. And I think it kind of depends on where you think sources of harm are going to be? It’s just different! Like, you know, there’s a bunch of information that a company now has that it could choose to sell, that it could choose to utilize in other ways, that it might give to a law enforcement agency that gets a subpoena. But EARN IT and FOSTA‑SESTA are examples of a different kind of harm. That harm has less to do with collection of information, and more to do with suppression of content and information and of certain kinds of speech.

“Is it fair to say that social media companies can use your username alone to connect you to other accounts? Should we slightly modify our usernames to avoid being associated and shut down all at once?” So I think ‑‑ I mean, for the question of whether to modify your username or not, I think that’s also a risk assessment question, in so far as if you need people to be able to find you across multiple platforms, I would not want to tell you to not do that? Or to make it harder for you to reach clients or an audience. Whether social media companies are looking for you across platforms is not as clear to me. I think it depends on the agreements that exist between the platforms. So like, I know that Facebook and Instagram are owned by the same company. Right? So the sharing of those two identities is fairly likely to happen. But…

OLIVIA: Some might not be looking for your other accounts? But if you’re ever, like, being investigated by like an actual individual person, or like say your local police department, or the state in general, they probably would be.

INGRID: Yeah. And in that case, I think what may be more helpful is: if you have a public persona that you want to have a consistent identity across platforms, that’s a choice you can make. And then if there are alt accounts where, you know, maybe you have more personal communications, or are working more connected to community and less to business? Making those slightly harder to associate, or slightly more compartmentalized, helps. And we’ll talk a little bit more about compartmentalizing identities tomorrow. But I think, yeah, that’s one way to address that risk of being identified.

BLUNT: I think, too, I wanted to add that it’s not just using the same username, but where you post it, or what e‑mail is associated with an ad. One of the statistics that we found in the ongoing research project that Hacking//Hustling is doing right now on shadowbanning is that sex workers who linked their social media to an advertisement are significantly more likely to believe they’ve been shadowbanned, at 82%. Which suggests to me that linking might put you in… the bad girl bin, as I call it. (Laughs)

Do we have any other questions? We still have a good chunk of time. Or anything that folks want more clarity on?

What is DuckDuckGo and what is a VPN? Should we use them?

Okay, so we have one that says, “I heard DuckDuckGo mentioned. Do you personally use that search engine? Also, I recently started using ExpressVPN, as I just started sex work, and bad on my part, I did little research on which VPNs. Have you heard of ExpressVPN? Do you have another app that you personally use or have more knowledge about? I want to stay safe and of course share with others what would be the best app to use for VPN.”

INGRID: Olivia, do you want to take some of this one…?

OLIVIA: I was muted. So, I do use DuckDuckGo, most often. Sometimes I’ll use another one if I’m trying to test something ‑‑ like, my house computer uses Google, because my mom’s like, I don’t like DuckDuckGo! It’s not showing me the things I want to see! And that’s usually because Google, again, collects data about you and actively suggests results that it thinks are the things you’re searching for, whether or not they’re what you’re actually searching for.

For VPN use, I use ProtonVPN, mainly because it’s free and I don’t really have money to pay for a VPN right now. But I think ExpressVPN is one of the most popular ones. So I’d say it’s pretty trustworthy.

INGRID: Yeah, I’ve used ExpressVPN. It’s generally, I think, a well‑regarded one, and that’s partly why it costs the money it costs. (Laughs) So there are free options if you don’t want to have to keep paying for it; but if you’ve already paid for it, yeah, keep using it.

What are the alternatives for encryption?

Yeah. “Can we talk about some alternatives for encryption, assuming a back door is created?”

OLIVIA: This isn’t ‑‑ oop.

INGRID: Go ahead.

OLIVIA: This isn’t really an alternative to encryption, but I think one of the things that we could start doing is ‑‑ less trying to function without encryption, and instead encrypting our messages ourselves. Because technically, you could have end‑to‑end encryption over Instagram DM if you do the hand work of encrypting the messages that you send by yourself. Bleh! Tripped over my tongue there.

So there are a lot of apps, specifically for e‑mail, I’m thinking of? Like Enigmail, and Pretty Good Privacy, that are essentially tools that you can use to “hand encrypt,” in quotation marks, your e‑mails, so you don’t have to depend on someone doing that for you. Right, the government can’t knock on your door and say you’re not allowed to encrypt anymore. And encryption algorithms are mathematical things, so you couldn’t make one that’s only sort of broken. Signal, for instance, is very public about the algorithms that they use, and that’s how we know that we can trust them. Because other people can check them, and they’re like, yeah, it would take a computer about a thousand years to crack this. And so we’re able to use those same algorithms by ourselves without depending on other platforms to do that work for us. And it would suck that we’d have to interact with each other with that level of friction? But it is possible to continue to have safe communications.
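
To make the “hand encryption” idea concrete, here is a minimal sketch in JavaScript (Node.js), assuming the two people have already shared a secret key some other way. Key exchange is the hard part that tools like PGP and Signal actually solve, so treat this as an illustration, not a replacement for audited software.

```js
// Encrypt a message yourself before pasting it into an insecure channel
// (e.g., a DM). Assumes a pre-shared 32-byte key; names are illustrative.
const crypto = require('node:crypto');

const sharedKey = crypto.randomBytes(32); // stand-in for a pre-shared key

function encryptMessage(key, text) {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const ct = Buffer.concat([cipher.update(text, 'utf8'), cipher.final()]);
  // This base64 string is what you'd actually send over the channel.
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString('base64');
}

function decryptMessage(key, blob) {
  const buf = Buffer.from(blob, 'base64');
  const d = crypto.createDecipheriv('aes-256-gcm', key, buf.subarray(0, 12));
  d.setAuthTag(buf.subarray(12, 28));
  return Buffer.concat([d.update(buf.subarray(28)), d.final()]).toString('utf8');
}

const wire = encryptMessage(sharedKey, 'meet at noon');
console.log(decryptMessage(sharedKey, wire)); // "meet at noon"
```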

BLUNT: Yeah, and I think just in general, if you’re unsure about the security of the messaging system that you’re using ‑‑ like, right now, we’re using Zoom, and we had this conversation a bit yesterday ‑‑ I’m speaking on Zoom as if I were speaking in public. So if I wanted to talk about my personal experiences, one option is to phrase them as hypotheticals. So just slightly changing the ways that you speak. Yeah. I think that’s also an option. Go ahead, sorry.

OLIVIA: No, I agree. Just checking in with the people that you’re talking to, like, hey, we’re not going to talk about this here. And not being reckless. So in a public forum, don’t post about the direct action that’s happening on Sunday at city hall. Things like that ‑‑ it’s about using discretion, at that point.

What is the back door issue and how does it relate to encryption?

BLUNT: Someone says: “So the back door issue is for companies that encrypt for us?”

INGRID: Basically, yeah. The back door issue is not necessarily that all encryption would stop working. Right? It would be something like a government saying, hey, WhatsApp, we want access to conversations that currently we can’t have access to because WhatsApp communications are encrypted, and ordering WhatsApp to make that possible. And one would hope? (Laughs) That companies also know that they have a certain amount of brand liability when they remove security features. So it’s something that would probably be known about? I would hope it wouldn’t be done surreptitiously? But, yeah. It’s more about whether previously secure communications would become compromised. It wouldn’t necessarily end the possibility of ever deploying encryption again. It would be more of a service‑by‑service thing.

BLUNT: We still have some time for more questions, if anyone has any. Please feel free to drop them into the Q&A.

And maybe if Ingrid and Olivia, if you wanted to chat a little bit about what we’ll be talking about tomorrow, folks might have an idea of other things that they might want clarity on, or other things that they are really hoping might be covered.

What will be covered in part 3 of the digital literacy series?

OLIVIA: Yeah, tomorrow we’re gonna talk a lot about surveillance, more specifically. So like, surveillance that’s done on platforms ‑‑ talking both about surveillance capitalism and state surveillance, and the different ways that they might cause harm for someone who’s trying to use the internet. Yeah. I think those are the biggest points? But also thinking about, like, mitigation.

INGRID: Yeah. And in the context of state surveillance, we’re primarily talking about when the state utilizes platforms in the service of surveillance, or obtains information from platforms. There are a myriad of other ways that the state ‑‑ you know, police departments or federal or state governments ‑‑ can engage in surveillance of people, digitally or otherwise. But partly because the scale and scope of that topic is very, very large, and because we know people are coming from lots of different settings ‑‑ and we don’t personally know the ins and outs of the surveillance tools of every police department in the world ‑‑ we didn’t want to put forward examples of tools that would mostly just create greater anxiety, or that wouldn’t necessarily be an accurate depiction of threats or realities that people might face.

If there is interest in more of those things, we’re happy to take questions about them in the Q&A? But it’s not something that we’re doing a deep dive into, because, again, it seems like that might be better suited to more tailored questions in specific contexts.

BLUNT: I’m curious ‑‑ did you see the EFF launched the searchable database of police agencies and the tech tools that they use to spy on communities? Speaking of not spying on people! (Laughing)

INGRID: Yeah, but that’s the thing ‑‑ another thing is, like, well, those tools are out there. God bless the folks who put that work together.

BLUNT: Cool. So I’ll just give it like two or three more minutes to see if any other questions pop in… And then I’ll just turn off the livestream, as well as the recording, in case anyone would prefer to ask a question that’s not public.

How to Build Healthy Community Online

Okay. So we have two more questions that just popped in… “Could you speak to building healthy community online? How to do that, how to use platforms for positive information spread?”

OLIVIA: So, when it comes to building healthy communities, I think it really comes down to the labor of moderation. Like, it has to go to someone, I think. One of the problems with a lot of platforms online is that they’re built by people who don’t really see a need for moderation, if that makes sense? Like, one of the issues with Slack is that there was no way to block someone in Slack. And a lot of the people who originally were working on Slack couldn’t conceive of a reason why that kind of feature would be necessary. While someone who’s ever experienced workplace harassment would know immediately why it would be necessary, right?

And so when it comes to building healthy communities online, I think codes of conduct are honestly the thing that’s most necessary ‑‑ and having people, or creating an environment on that specific profile or in that specific space, that invites the people who are engaging in that space to do that moderation work, and to promote pro‑social interactions and demote antisocial interactions, and things like that.

BLUNT: I also think that Hacking//Hustling has, on its YouTube channel, a conversation between myself and three other folks talking about social media and propaganda, and a couple of harm reduction tips on how to assess the truthfulness of what you’re sharing and posting. And I think one thing that we can do is just take an extra second before re‑tweeting or sharing something ‑‑ actually opening up the article before sharing it, and making sure that it’s something that we want to share. It’s a simple thing that we can do. I know things move so fast in these online spaces that it’s sometimes hard to do, but if you’re able to assess that something is misinformation, or maybe it’s something that you don’t want to share, pausing like that slows down the spread of misinformation.

Thank you so much to everyone and their awesome questions. I’m just going to take one second to turn off the YouTube Live and to turn off the recording, and then see if folks have any questions that they don’t want recorded.

Okay, cool! So the livestream has stopped, and the recording is no longer recording. So if folks have any other questions, you’re still on Zoom, but we would be happy to answer anything else, and I’ll just give that two or three more minutes… And if not, we’ll see you tomorrow at noon.

(Silence)

Okay. Cool! Anything else, Ingrid or Olivia, you want to say?

INGRID: Thank you all for coming. Thank you, again, to Cory for doing transcription. Or, live captioning. Yeah.

BLUNT: Yeah, thank you, Cory. Appreciate you.

OLIVIA: Thank you.

CORY DOSTIE: My pleasure!

BLUNT: Okay, great! I will see you all on ‑‑ tomorrow! (Laughs) Take care.

INGRID: Bye, everyone.

OLIVIA: Bye, everyone!

Digital Literacy Training (Part 1) with Olivia McKayla Ross (@cyberdoula) and Ingrid Burrington (@lifewinning)

Part 1: OK But What Is The Internet, Really? In this three-day lunch series with Olivia McKayla Ross (@cyberdoula) and Ingrid Burrington (@lifewinning), we will work to demystify the tools and platforms that we use every day. It is our hope that through better understanding the technologies we use, we are better equipped to keep each other safe!

Digital Literacy Training (Part 1) Transcript

OLIVIA: Hi, everyone.

Just before we begin, some of the things ‑‑ the values that we’re trying to cement this workshop in, in terms of cyber defense, are: firstly, acknowledging cyber defense as a way of maintaining community‑based power, and cryptography as an abolitionist technology rather than a military one, or something that doesn’t come from us, right?

So, there have been ways of using techniques like cryptography for community defense ‑‑ it’s something that doesn’t have to be immediately associated with white supremacist, industrial technology.

Following that, we want to affirm that there can be a cyber defense pedagogy that is anti‑racist, anti‑binary, and pro‑femme. But also one that’s trauma informed, right? And doesn’t reinforce paranoia. Because we know there are white supremacist institutions. And teaching from a place of gentleness. And considering, because of our myriad identities, the previous harm people might have experienced, and trying not to replicate it or force people to relive it.

So if you need to take space at any point during this workshop, we want to honor that, and this will be recorded and available for viewing at a later time, as well.

INGRID: Thank you, Olivia. That was great.

My name is Ingrid. I go by she/her pronouns. And we are ‑‑ welcome, welcome to the internet! (Laughs) This is the first of a series of three digital literacy sessions where we’re gonna be walking through a few different concepts.

And this first one we wanted to start with was really getting into just some of the baseline, you know, technical things around what the internet actually is and how people experience it, or how it, you know, works.

And… We’ve sort of organized this into a couple of sections. We’re gonna start with a couple of things about our personal opinions ‑‑ some grounding perspectives we’re bringing to this. Then the internet, and how it works as an infrastructure.

Browsers, as a particular technology for interfacing with the internet. And the World Wide Web, which is… you know, basically the thing that the browser takes you to. (Laughs)

So, starting with our opinions… (Laughs) We got ‑‑ we got more, but these seem important to start with.

The first one that we wanted to convey is that some of this stuff around how the internet works gets treated like special knowledge, or like something only for smart people. But, you know, companies have a lot more resources to do things. The people who run, work in, and found tech companies often have had, you know, privileges like generational wealth! Or like early exposure to technology. That means some of this stuff was just more available to them.

And has been for a long time. And if there are things that are confusing, or unfamiliar, it is not because you can’t understand. It’s because the people who have a lot of control and power have had the resources to get past the confusing parts… Yeah.

We’ll come back to this point in other ways, I think, in this presentation today.

OLIVIA: The other point that we really want to hammer in is that nothing is completely secure online. And that’s due to the nature of how we connect to the internet, right? The only way you can really have a completely secure computer is to have a really, really boring computer! Right?

Computers are interesting because… computers and the internet are able to be interesting and fun things to use because we are able to connect to other computers. Right? Because it’s a form of a telecommunication device. And so it’s kind of okay! That our computers can’t be completely secure, because if they were, they’d just be kind of like brick boxes that don’t really do anything.

So instead of trying to chase a mythological security purity, what we do is learn to manage risk instead. Right? We create systems so that we put ourselves in as little danger as possible.

What is the internet?

INGRID: So, for our initial grounding point, we want to talk about what the internet is. And this is a hard question, sometimes, I find? Because the word “internet” comes to mean lots of different things. For me, the simplest summary I can ever provide is that the internet is just computers talking to computers. (Laughs)

It’s information going between computers. This image, which is, you know, one of many you can find when you Google image search “internet diagram” is a bunch of computers in, you know, a household, including a game machine and a few PCs. Who is this person? With all these devices? And they’re connecting to a router in their house, which has connected to a modem, which connects to the internet! Which is more computers. Not the ones that you’re seeing on the screen.

It’s kind of dorky, but this is a really goofy example of a computer talking to another computer. It’s from the movie Terminator 3. This also, I realize, is an Italian dub?

INGRID: So, I show this ‑‑ so what’s actually happening in this scene, which is, yes, very garbled, is the lady terminator, who is a robot, a very large sentient computer, is using a cell phone, like a dumb phone, to call another computer? And then she is making noises into the phone that are a translation of data into audio signal. And that is allowing her to hack into the LA School District’s database. It’s ‑‑ and it’s, you know, it’s very 2003? (Laughs) In that that was an era where, when people were getting online in their homes, they would have to connect to a modem that made sounds like that, too.

So I think, you know, it’s kind of a corny old example, but I like it because it also shows something that is hard to see in our day‑to‑day use of the internet, which is that for information to move from one computer to another computer, it has to be rendered into something material. In this case, it’s tones? It’s sound? On a home computer connected to a wi‑fi network, it would be radio waves. And kind of when you get to different layers of the internet, it’s going to be pulses of light traveling through fiberoptic cable.

So everything you type, every image you post, at some point it gets ‑‑ you know, that digital data gets transformed into a collection of, you know, arrangements of points of light, or, you know, a sound, or like a different material.

And it’s, you know, it’s much bigger! (Laughs) Than, like, than what we see on a screen! This is a map of the submarine cables that cross oceans that make it possible for the internet to be a global experience. It’s very terrestrial?

This is just for fun. This is just a video of a shark trying to eat one of the cables in the ocean… A cutie.

Rawrumph!! I just love his little… The point being, yeah. The internet is vulnerable to sharks! It is… it is very big, and it is complicated, and it is ‑‑ it is not just, you know, a thing on a screen. It needs a lot of physical stuff.

And when computers talk to computers, that doesn’t usually mean, like, a one‑to‑one connection? Right? So… I’m talking in this webinar to all of you right now, but, like, my computer is not directly connecting to your computer. What’s actually happening is that both of our computers are talking to the same computer… somewhere else.

There’s like a, you know, intermediary machine, that’s probably in a big building like this. This is an Amazon data center in Ashburn, Virginia. And that’s kind of the model that most of the internet takes; it’s usually, there’s kind of intermediary platforms, right?

And in a lot of technical language, this is called the client‑server model. The idea being that a server, which is a computer, holds things that are content on the internet, or applications like Zoom; and the client, which is also just a computer, requests things from the server, and the server serves them. This gets to the client computer through a routing process, which usually means that the information has to travel through multiple computers.

But! Again, these words just mean computer and computer? Technically, you could turn a home computer into a server: get a stable internet connection and make it something that just serves information to the internet, as in the sketch below. Or, you know, you could even think about the fact that because lots of information is taken from personal computers and sent to companies, in some ways we are all serving all of the time!
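
To make that concrete: a minimal sketch, in JavaScript with Node.js (an assumption here; the talk doesn’t prescribe a language), of an ordinary computer acting as a server. The port number is arbitrary.

```js
// Any computer can be a "server". Run with `node server.js`, then point
// a browser (the "client") at http://localhost:8080 and this machine
// will serve it a response.
const http = require('node:http');

http.createServer((request, response) => {
  response.writeHead(200, { 'Content-Type': 'text/plain' });
  response.end(`You asked for ${request.url}, and this computer served it.\n`);
}).listen(8080);
```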

And mostly, this is just a dynamic ‑‑ again, thinking about who controls the internet and how it is governed ‑‑ that I think is important to acknowledge? I mean, in some ways, the internet is not computers talking to computers so much as… computers owned by companies talking to computers owned by people?

The internet, you know, it began as a project funded by the U.S. military, but became the domain of private companies in the late 1990s. So all of that stuff that I was talking about earlier? You know, the submarine cables, the data centers ‑‑ they’re all private property owned by corporations. All of the technical infrastructure that makes the internet possible is a public good… but it’s all managed by private companies. So it’s more, you know, a neoliberal public‑private partnership. And it has been for a long time.

And I mention this mainly because it’s good to remember that companies are beholden to laws and markets, and it’s in a company’s interest to be compliant with laws and be risk‑averse, and that’s partly why a lot of decisions made by platforms or other companies are often, like, kind of harmful ‑‑ like, can be harmful to communities like sex workers.

And again, like, this doesn’t have to be the way the internet is? It’s just sort of how it has been for a very long time.

So, computers talking to other computers is our very simple summary of what the internet is. But computers can talk to each other in different kinds of languages or dialects, let’s say? Which, in internet speak, are called protocols. And a protocol is what it sounds like: it’s a set of rules about how something’s done. That’s why I find the dialect or language framing kind of useful.

Common Internet Protocols

So a few protocols that exist for the internet that you probably encounter in your daily life that you maybe don’t think that much about are Internet Protocol, wi‑fi, Address Resolution Protocol, Simple Mail Transfer Protocol, and HyperText Transfer Protocol. Maybe you haven’t heard as much, or it’s not as commonly talked about? But I’ll explain about these.

And I apologize; these screenshots are from my Mac. There are ways to access these same sorts of things from a Windows machine? I don’t have screenshots. (Laughs)
So Internet Protocol is basically the foundation of getting on the internet. It assigns a number called an IP address, an Internet Protocol address, to a computer when it’s connected to a network. And that is the ID that is used for understanding who a computer is and how you access it.

So when I want to go get content from a specific website, what I’m actually requesting under the hood is a set of numbers ‑‑ an IP address ‑‑ which is like the name or ID of the computer that I want to go to.
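
Here is a minimal sketch of that under‑the‑hood lookup, using Node.js’s DNS module (the domain is just an example):

```js
// Ask "what IP address does this website name point to?"
const dns = require('node:dns/promises');

dns.lookup('example.com').then(({ address, family }) => {
  // Prints something like: example.com lives at 93.184.216.34 (IPv4)
  console.log(`example.com lives at ${address} (IPv${family})`);
});
```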

I’m hoping this isn’t too abstract, and I hope, like ‑‑ yeah, please, if there are places where you have questions… please, add things to the Q&A.

So, Address Resolution Protocol and Media Access Control are a little different, but I wanted to talk about them because they’re related to understanding how your computer becomes a particular identity.

So ‑‑ there’s a question: Do all computers have their own IP address? They do, but they change, because basically, when you join a network, the address is assigned. It’s not a fixed ID. But there is a fixed ID that is connected to your computer, and it’s called a Media Access Control, or MAC, address.

And this is another screenshot from my machine. You can see this thing I circled here. That is my MAC address. And that is at the level of like my hardware, of my computer, an ID that has been… basically, like, baked into the machine. Everything that can connect to a network has one of these IDs.

And so Address Resolution Protocol is a mechanism for associating your temporary IP address with the MAC address, and it mainly exists so that if the network screws up and assigns the same IP address to two different devices, the MAC address can help resolve, like: oh, we actually mean this device, not that device.
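
A minimal sketch, in Node.js, of seeing both IDs side by side on your own machine:

```js
// List this machine's network interfaces, showing the fixed hardware
// (MAC) address next to the temporary IP address the current network
// has assigned.
const os = require('node:os');

for (const [name, addresses] of Object.entries(os.networkInterfaces())) {
  for (const a of addresses ?? []) {
    // `mac` is baked into the hardware; `address` changes per network.
    console.log(`${name}: MAC ${a.mac} -> IP ${a.address} (${a.family})`);
  }
}
```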

Oh, I realize I didn’t make a slide for wi‑fi. I think most of you probably know wi‑fi as the wireless ‑‑ the way that information gets transferred without wires.
Yes! Your IP address… will change when you connect, although it generally won’t change that much… It’s not like ‑‑ how am I answering this?

Like, if you’re ‑‑ if you’re connecting to the internet, in like your home? It’s probably ‑‑ you’re probably gonna get the same ID number, just ’cause it’s the same device you’re connecting to? But when you connect to a network at ‑‑ I guess no one goes to coffee shops anymore…

But in the time when you would go to a place with a different wireless network and connect to the internet! (Laughing) You would probably have a different IP address, because you’re connecting through a different device on a different network.

Oh, the other thing ‑‑ the only other thing about wi‑fi thing I will mention right now is that “wi‑fi” doesn’t actually mean anything. It’s not an acronym; it’s not an abbreviation. It’s a completely made‑up name… No one ‑‑ no one has a good answer for why it’s named that! (Laughs) I think like a branding consultant named it? It’s ‑‑ anyway.

So other protocols. So the Simple Mail Transfer Protocol, that underlies how e‑mail works.

So you encounter it a lot, but probably don’t think much about it ‑‑ that’s its own special kind of language for moving information, different from the HyperText Transfer Protocol, which may be familiar to all of you because it is the central protocol used for moving information in the browser!

Which is a nice segue, but I realized I also should mention that there is a variant of HTTP called HyperText Transfer Protocol Secure, or HTTPS. It’s an implementation of HTTP that encrypts the information transferred. So, that wasn’t adopted or implemented when browsers and HTTP were first being developed?

Because, again, these technologies were being developed with, you know, public funding and thought of as tools for scientific research, not for making purchases with credit cards or having, you know, private communications. So the implementation of security features and encryption into the internet is sometimes clumsy or frustrating because it was not designed into the original concept.

What’s an internet browser?

All right. So, we are next moving into the browser. I’m kind of a nerd about internet history things, so part of what I wanted to talk about with the browser is just its origin story?

The first example of a browser that was easy to use was created by researchers at the University of Illinois, including a guy named Marc Andreessen. That browser was called Mosaic, and Andreessen went on to co‑found Netscape, which made Netscape Navigator. It was a very important opening of the internet to the general public, and it changed a lot of people’s perception of, and ability to be part of, the internet.

Marc Andreessen became very rich because he did this, and he founded a venture capital company, or firm, called Andreessen Horowitz. Returning to the idea that a lot of these companies are not smart, they’re just rich? He worked on a thing that is very important… That is not a good reason that he gets to throw money at Airbnb and decide how, you know, urban planning and housing is going to be changed forever!

There’s something about that which I feel is kind of important to remember. It’s not that Marc Andreessen is a dumb guy; it’s that he’s been given a lot of authority through getting a lot of money through doing one clever thing.

A lot of the things that define the browser in the 1990s when it was first becoming an adopted thing were actually proprietary technologies made by different companies. So different companies had their own browsers that they had made. And they wanted to be The Browser everyone used. Right? And so they invented new things to make their browser cool? But they wouldn’t work on other ones.

So Olivia will talk a little bit more about these, I think, in the section on the web. But Cascading Style Sheets, which are a way of adding design aspects to a web page, were first implemented in Microsoft’s browser. Javascript, which is a programming language that works in browsers, was created by a guy at Netscape in 10 days? (Laughs) And, yeah, if you made a website and it had CSS in its layout, it would be visible in a Microsoft browser, but not in a Netscape browser.

This was a terrible way of doing things? And partly because companies got nervous about possibly getting regulated, and partly because it was just bad for business, they figured out how to put aside some of their differences and develop standards, basically.

So the standardization of browsers ‑‑ so that basically when I open something in Chrome and I open something in Firefox, it looks the same and it works the same ‑‑ kind of starts to be worked on in 1998. It really only starts to be implemented and widespread in 2007, and it continues to be worked on. There are entire committees of people, who mostly work at the tech companies that make these browsers, who come and talk to each other about what we’re all gonna agree on in terms of how this technology works.

And we’re looking at, and wanting to talk a little bit, about browsers also because they are really useful teaching tools. It’s really easy ‑‑ well, it’s not “really” easy. It is pretty easy to kind of look at what’s going on behind the scenes, using a browser. And that’s mainly because they’re very old.

You know, by 2007 the iPhone emerges, and the App Store follows in 2008, and it’s much harder to go on your phone and see, like, I wonder what kind of data Instagram is sending back to, you know, Facebook right now! Like, to actually try and look for that on your phone is almost impossible. But you can start to look for that in a web browser.

And that’s sort of a privileging of desktop technology, and a legacy of this being kind of an old technology, where transparency was treated as just inherently a good idea. And I think that if they were being built today, we probably wouldn’t have it.

So, we’re going to introduce you to some browser tools in this next section ‑‑ oh, wait, sorry, one more thing I wanted to acknowledge. This isn’t super detailed as far as comparing the privacy features of different browsers? But we are working on a list ‑‑ sort of a bibliography ‑‑ that we can share with everyone later.

The main thing I wanted to convey here is: different browsers are made by different companies, and they’re gonna all work more or less the same, but they do have underlying qualities that might not be great for user privacy. And, also, there are questions of, like: when one company controls the browser market, how does that change the way that people see the internet?

So, you know, doing some research, doing some comparison of, of what different browsers… you know, do and don’t do. Most of the screenshots for this were done in Firefox. If you use other browsers, that’s fine. But… Yeah.

All right. Now ‑‑ (Laughs) Now we will move to World Wide Web!

What are web pages and how do they work?

OLIVIA: Hi, everyone! So, this part is talking a lot about the actual content that you are able to look at using your browser. So we’ll be making use of a lot of the tools that Ingrid mentioned about looking deeper into the actual… web pages themselves.

Awesome. So, this is a web page. It’s the same page as the video that we showed at the beginning, of sharks biting undersea cables! (Laughs) And it’s accessible to anyone who can connect their computer to the World Wide Web. And so, a lot of times we use “the internet” and “the web” interchangeably?

But the internet itself is more of the infrastructure, and the actual place, if we can call it a place, that we’re going to logically… is called the World Wide Web. Right? That’s the whole WWW‑dot thing that we’ve all been doing.

So, web pages are hosted on computers! You can host a web page on your own computer; you can pay another company to host it for you; other companies host it themselves, if they have a lot of money. And if you are paying someone else to host your website for you, you have a lot less autonomy. Right?

So there’s a lot of movements for people to like start hosting things themselves to avoid things like censorship and surveillance. Because like we said in the beginning, companies are beholden to a lot stricter laws than individuals are. And individuals are able to kind of themselves say ‑‑

What’s the difference between VPN and TOR? If we have time at the end, we will cover that a little bit, briefly. But essentially, a VPN ‑‑ TOR is a browser, and a VPN is something that you can install into your computer.

TOR does things that are very similar to what VPNs do, in terms of routing? But they’re not the same. Like, you can use TOR to navigate the internet, or you can use a VPN and use your normal browser. Right.

To look at a web page’s source, oftentimes you can right‑click, or Ctrl‑click on a Mac, and click View Page Source, and you’ll be able to get a closer look at the actual web page itself.

And so when you view the source ‑‑ oh, you can go back. When you view the source, you end up seeing HTML. Right? So we told you earlier that the web uses HTTP, the HyperText Transfer Protocol, to send and receive data. The data that’s being sent and received is HyperText. Right? And that’s written in the HyperText Markup Language.

So… HTML isn’t a programming language, per se; it’s a markup language. So it defines the structure of your content. It displays things, like text and images and links to other web pages.

And there are two ways that HTML pages can exist: static and dynamic. So static would be a lot of the pages that we might code ourselves, right? Dynamic pages are generated on demand ‑‑ like Facebook and Instagram. The user requests a page, which triggers code that generates an HTML page, as in the sketch below.
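
To make the static/dynamic distinction concrete, here is a minimal sketch of a dynamic page, reusing the tiny Node.js server pattern from earlier. The HTML exists nowhere as a file; it’s generated fresh on every request (notice the timestamp changes on each reload).

```js
// A "dynamic" page: each request triggers this code, which builds the
// HTML on the spot. Run with Node.js and visit http://localhost:3000.
const http = require('node:http');

http.createServer((request, response) => {
  const html = `<!doctype html>
<html><body>
  <h1>Hello!</h1>
  <p>This page was generated just for you at ${new Date().toISOString()}.</p>
</body></html>`;
  response.writeHead(200, { 'Content-Type': 'text/html' });
  response.end(html);
}).listen(3000);
```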

So sometimes, if you try to look at the source code of a dynamic website, you won’t really see much of anything? Because that code, like, doesn’t exist yet. Unless you open an inspector, and you look at the code that’s visible on your side.

So, to make this content look better, it’s often styled. Right? ‘Cause otherwise, it would just be plain Arial size 12. So we add color, add shape, animation, layouts, italics. And we do that using Cascading Style Sheets, or CSS.

CSS is also not a programming language. It’s a way of representing information.
So this is what a static HTML file might look like. I grabbed this from a teaching resource, so that’s why you can see things like explanations of what HTML is, because I thought it would look a bit cleaner than the WIRED article.

And this is a CSS file! You see things like font size, font family, color, background color, position. Right? So those are the types of things that you can control using CSS. You can even make animations.

So, the point that we’re trying to make in saying this is that HTML can’t do anything with your data. Neither can CSS. They just display things that are coming from the other computer that you’re connecting to. So how are web pages collecting our data? Well, the code that actually does stuff in your browser is usually written in Javascript.
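
Here is a minimal sketch of what such a snippet might look like, runnable in a browser console; the analytics endpoint is made up for illustration, and real snippets (Google Analytics, etc.) are fancier but do essentially this:

```js
// The kind of analytics JavaScript a page might run about your visit.
const visit = {
  page: location.href,          // which article is being read
  referrer: document.referrer,  // where the visitor came from
  time: Date.now(),             // when they arrived
};
// sendBeacon fires off a small POST without slowing the page down.
navigator.sendBeacon('https://analytics.example/collect', JSON.stringify(visit));
```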

So… To see it in action, we can go into Tools, and Web Developer, and Inspector! And we can see some of the stuff that’s going on behind the scenes, right? This is how you do this in Firefox, and it’s similar but not identical in other browsers like Chrome and Safari. You ‑‑ I don’t think you can do this in Safari at all, but I might be wrong about that.

So if you check out the Inspector tab, we have an easier way of reading the HTML source than just pulling it all up in a really large, confusing doc. Right? We get syntax highlighting. We get little disclosure triangles. And we’re able to highlight things and see ‑‑ we’re able to hover over different parts of the HTML, and it’ll highlight that section in the actual web page. So it’s a really useful teaching tool.

In the Console tab, we’re able to see more of the Javascript activity that’s happening in the background of the page. So we’re able to see all of these jQuery calls and database calls and analytics. Right? So this is how a web page might try to get information about you, so that the company ‑‑ in this case WIRED ‑‑ can use that information to structure their own marketing practices. Like, how many people went to this article about sharks biting undersea cables? They would use Javascript to record the fact that you, one person, went to this website.

The Network tab shows data being sent and data being received by your browser. Right? So all of the requests marked “POST” ‑‑ you can only see the P‑O in this part ‑‑ are sending data, and all the ones marked “GET” are receiving data.
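
A minimal sketch of both directions, using the browser’s fetch API (the URLs are placeholders):

```js
// A GET request: ask the server for data.
fetch('https://example.com/article.html')
  .then((response) => response.text())
  .then((html) => console.log(`received ${html.length} characters`));

// A POST request: send data to the server.
fetch('https://example.com/collect', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ hello: 'server' }),
});
```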

And so some of this stuff is fairly, like, normal. It’s actual HTML stuff that’s being included on the page. You can see the different types. And then some of it, in other places, you would be able to see like actual trackers. Right?

And when you click on one of the items, you’re able to see more information about what’s being transferred.

INGRID: And this is not necessarily very helpful? Like, when you click the headers? It’s like: here is a bunch of words! I don’t know what’s going on! But the other tabs can give us a little more, and depending on the type of network request, you’ll get slightly easier‑to‑read data.

What are cookies and how do they work?

So, in this section, we’re going to talk a little bit about some of the tracking methods. Cookies are called cookies on the web because in a different, earlier technology, whose name I do not recall, this same thing was called a magic cookie.

And I don’t know why it was called that in the other one… It’s just a, it’s a… it’s a holdover from the fact that a small number of people working on the internet had inside jokes, as far as I can tell.

But a cookie is a text file that contains information. Usually it’s something like an ID. And it’s used for doing things like storing preferences, or kind of managing things like paywalls on news websites.

So in this case, the cookie that was handed off to me from this particular page gave me this ID number that’s just like a pile of letters and numbers. And my browser will store that cookie, and then when I ‑‑ if I go back to the WIRED website, it’ll see ‑‑ it’ll check to see, like, oh, do I already have a cookie assigned to this one?

And if it does, it will take note of how many other WIRED articles I’ve already read. And that’s how WIRED is able to say, hey, we noticed you’ve read all your free articles… Stop, stop doing that. You don’t get any more.

They can also be used for things like, say you have a log‑in with a particular website, and you don’t want to have to log in every time; the cookie can store some information for you.
But they’re also used for things like tracking ‑‑ just trying to see where people go online, to be able to figure out how to sell them things.
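
A minimal sketch of a cookie doing that “remember this visitor” job, runnable in a browser console; the cookie name is made up for illustration:

```js
// Read a cookie by name out of document.cookie.
function getCookie(name) {
  const match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? match[1] : null;
}

let visitorId = getCookie('visitor_id');
if (!visitorId) {
  // First visit: hand the browser an ID it will present on return trips.
  visitorId = Math.random().toString(36).slice(2);
  document.cookie = `visitor_id=${visitorId}; max-age=${60 * 60 * 24 * 365}`;
}
console.log('This browser is known here as', visitorId);
```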

Just a distinction note, if you look at things in the Network tab: a response cookie is a file that comes from a website to your computer; a request cookie is one that your computer sends back to that computer. And a lot of this is stuff that is encrypted or encoded or kind of arbitrary ‑‑ which is good, in so far as it’s not ‑‑ oh, sorry.

It’s not ‑‑ it’s not just giving, you know, information, passing information about you and storing it in the clear? You still probably don’t want it? (Laughs)

So cookies can also be used for, like, tracking. This website has like, you know, a lot of different scripts running on it, because media companies work with other, you know, companies that do this kind of audience tracking stuff.

So like, when I was looking at this one, the domain that the cookie was coming from is elsa.memoinsights.com. That’s a weird name, and I don’t know what any of this is. If I type that into the browser, it doesn’t produce a web page?

But when I Google “memo insights,” I find: A company that works with companies to give them, you know, competitive analysis and campaign summaries. I don’t know what these things are, but this is some boutique company that works with Conde Nast, which owns WIRED. Maybe they do something with what I read, and maybe we can learn that people who read WIRED also read the New Yorker, or something.

What are pixel trackers and how do they work?

There are other trackers on the web that are not based in cookies and are a little bit weirder. So, pixel trackers are basically just tiny image files. They’re called this because sometimes they’re literally just one pixel by one pixel. And the image is hosted on a server somewhere else, not on the WIRED website.

It’s hosted by whatever company, who knows, is doing this work. And because the image has to load from this other server, my computer makes a request to that server. And once that request is logged, that server can get information from my request about my computer: where I’m coming from, how long I spent on the page, what time I accessed it.

If you ever used, like, e‑mail marketing software, or like newsletter software, like MailChimp or TinyLetter, this is usually how those services are able to tell you how many people have opened your e‑mail. They’ll have like an invisible pixel tracker loaded into the, into the actual e‑mail, and will send the information about when that image loaded to the newsletter web service.

So pixel trackers are sort of sneaky, in that, again, you literally can’t see them on the web page. They’re not as transparently present? (Laughs) As other things?
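
A minimal sketch of how a pixel tracker gets loaded, runnable in a browser console; the tracker domain is made up for illustration:

```js
// A 1x1 image whose only purpose is to force the browser to contact
// another server, which then logs who loaded it and when.
const pixel = new Image(1, 1);
pixel.src = 'https://tracker.example/pixel.gif?page=' +
  encodeURIComponent(location.href) + '&t=' + Date.now();
pixel.style.display = 'none';
document.body.appendChild(pixel);
```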

What is browser fingerprinting and how does it work?

Another method of tracking users on the internet across different websites is something called browser fingerprinting, which is a bit more sophisticated than cookies. So in the last few years, browsers have become a lot more dependent on and intertwined with a computer’s operating system and hardware. For example, when you join a Google Hangout or a Zoom call! (Laughs)

The browser is gonna need to access your webcam and your microphone. Right? And those are parts of the hardware. So there need to be ways for the browser to talk to those parts of your computer? And that in and of itself isn’t a bad thing. But! It means that if some code is triggered that asks questions about those other parts of the hardware, that’s data that could get sent to another server.

So in this example, the collected information includes things like browser name and browser version. And that stuff will usually be in a typical request. Like, knowing what kind of browser you’re using isn’t that unusual? But then we get things like: what operating system am I on? What version of the operating system am I on?

I don’t ‑‑ like, I don’t know why this site needs that information! And I didn’t see any fingerprinting happening on the WIRED website, so I had to go to the YouTube page that the video was on. (Laughs)

There are a lot of more detailed sorts of things that can be pulled into fingerprinting. So like your camera. Is your camera on? What kind of camera is it? That can be something a browser fingerprint will want to collect. Your battery percentage, weirdly? And all of this is in the service of creating an ID to associate with you that is definitively your computer, basically.

As opposed to, like, you know, you can actually like erase cookies from your browser, if you want to. Or you can say, like, don’t store cookies. But it’s a lot harder to not have a battery.

In terms of knowing if fingerprinting’s happening, one way to do that in the Network tab is to look for the POST requests, which means that your computer is sending something to another computer. And one way that data can get sent is in a format called JSON, which is an abbreviation for JavaScript Object Notation ‑‑ basically a format for data that can be processed by the programming language that works in the browser.
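
A minimal sketch of fingerprint collection, runnable in a browser console; the POST endpoint is made up for illustration and left commented out:

```js
// Read details the browser exposes and bundle them as JSON, the way a
// fingerprinting script would before shipping them off.
async function fingerprint() {
  const fp = {
    userAgent: navigator.userAgent,        // browser + version + OS
    languages: navigator.languages,        // preferred languages
    screen: `${screen.width}x${screen.height}`,
    cores: navigator.hardwareConcurrency,  // CPU core count
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
  };
  if (navigator.getBattery) {
    fp.battery = (await navigator.getBattery()).level; // yes, really
  }
  console.log(JSON.stringify(fp, null, 2));
  // A tracker would then POST it, e.g.:
  // fetch('https://fp.example/collect', { method: 'POST',
  //   headers: { 'Content-Type': 'application/json' },
  //   body: JSON.stringify(fp) });
}
fingerprint();
```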

Another way, if the Network tab is a little overwhelming: there are browser extensions that can show you more detailed things about what’s going on with fingerprinting.

Additionally, just as a sidenote, browser extensions are another example of the throwbacks of the browser. The idea that anyone can build extra software for that piece of software? No one would ever let you do that to the Instagram app on your phone. It’s kind of a leftover thing ‑‑ Firefox started doing it in 2004, and then everyone copied them. (Laughs)

But, back to fingerprinting.

This is a Chrome extension called DFPM, Don’t FingerPrint Me, which just logs this in a slightly tidier way, so I thought I would show it. And it highlights a couple of examples of ways that this page is currently doing fingerprinting that I might want to know about.

So canvas fingerprinting is a method ‑‑ it sort of describes it here. It draws like a little hidden image on the page that then is kind of encoded to be, like, your fingerprint. I think Firefox actually blocks this by default, so I had to do this in Chrome! (Laughs)

WebRTC, that’s related to your camera and microphone. WebRTC stands for Web Real‑Time Communication. That’s basically the tool used for doing web calls. They’ll also look at what fonts you have on your computer, your screen resolution. You can see here the battery level stuff.

So I guess the point I wanted to bring across with the fingerprinting stuff is just that, like, there are lots of different things in play here.

Should we ‑‑ do you think we have time for our bonus round…? Oo, it’s almost 1:00. But I feel like there was ‑‑ I’m hoping, I think there was some interest in this. I don’t know, Olivia, what do you think?

OLIVIA: I just pasted in the chat an answer to the TOR versus VPN question? So we can skip those slides. But it might be useful to kind of rapid‑fire go through safer browsing techniques? Yeah, I just got a “yes please” in the Q&A.

What is a VPN and how does it work?

INGRID: Okay. Quick version of the VPN thing. This is how a normal connection, you know, logs data about you. I go to a website, and it logs this computer came to me! This computer over here.

A VPN basically means that you’re connecting to that computer through another computer. And so your request looks as though it’s coming from somewhere else. That being said, there’s still other data ‑‑ given the point I just made about fingerprinting, there’s other data that could be collected there that’s worth thinking about.

When data travels through TOR ‑‑ TOR is an acronym for The Onion Router ‑‑ the idea is that it wraps your request in multiple layers by sending it through multiple different computers, which are called relays.

So when you use TOR, which is a browser, to connect, it sends your request through this computer and this computer and this computer, and whatever is the last one your traffic passes through before it gets to the page you want to visit ‑‑ that’s the IP address the destination is going to log. This last hop in the routing is called the exit relay. That was my attempt at being quick. I apologize. (Laughs)
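
A toy sketch of that layering idea in JavaScript (Node.js). Real Tor negotiates a key with each relay and does much more; this only shows how wrapping and peeling layers works:

```js
// NOT real Tor: a toy of onion layering. The sender wraps the message
// once per relay; each relay peels exactly one layer, and only the
// exit relay sees the plaintext.
const crypto = require('node:crypto');

function wrap(key, data) {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const ct = Buffer.concat([cipher.update(data), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]);
}

function peel(key, blob) {
  const d = crypto.createDecipheriv('aes-256-gcm', key, blob.subarray(0, 12));
  d.setAuthTag(blob.subarray(12, 28));
  return Buffer.concat([d.update(blob.subarray(28)), d.final()]);
}

const relayKeys = [1, 2, 3].map(() => crypto.randomBytes(32)); // one per relay
let onion = Buffer.from('hello from the client');
for (const key of [...relayKeys].reverse()) onion = wrap(key, onion);

for (const key of relayKeys) onion = peel(key, onion); // each relay peels one
console.log(onion.toString()); // "hello from the client" at the exit
```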

OLIVIA: Fun fact about VPNs. If you ‑‑ because the United States has different privacy laws than other countries, if you were to connect to a VPN server that was in, for example, the European Union, you might get a lot more notifications from the sites that you normally go to about different cookies and different things that they do with your data. Because in Europe, they’re required to tell you, and in America, they’re not always required to tell you what they’re doing with your data.

What is private web browsing and how does it work?

Oh, I can take it. So this is how, in Firefox, you would open a private window. And private windows, I think we’re all a little bit familiar with them. They clear your search and browsing history once you quit. And it doesn’t make you anonymous to websites, or to your internet service provider. It just keeps your activity private from anyone else who uses that computer.

But that might be really useful to you if you are using a public computer, or if you’re using a computer that might be compromised for any other reason. Like say if you suspect that you’re going to protest and a cop might take your device from you.

What are script blockers and how do they work?

INGRID: So, script blockers. The tracking and the little analytics tools and stuff usually are written in Javascript, because that is the only programming language that works in a browser. So there are tools that will prevent Javascript from running in your browser. And that can be helpful for preventing some of those tracking tools from sending data back to some computer somewhere else. It can be a little bit frustrating, because Javascript is used for all sorts of things on websites. Sometimes it’s used for loading all of the content of the web page! (Laughs)

Sometimes it’s used to make things have fun UI! So it’s interesting to try, if only to see how much of your internet experience needs Javascript? But yeah. The Electronic Frontier Foundation has a cool extension called Privacy Badger that learns which scripts are trackers and which ones aren’t as you browse. These are extensions that you can install onto a browser.

And then firewalls!

What is a firewall and how does it work?

OLIVIA: So firewalls are kind of the first line of defense for your computer’s security. They basically prevent other computers from connecting directly to your computer unless you explicitly say yes. And so… They’re really easy to turn on? On your computers? But they’re not turned on by default.

So in a Mac computer, like I’ve shown here, you would literally just go to security and privacy, and go to the firewall tab, and it’s like one button. Turn off, or turn on. And you don’t really have to do much more than that.

And in Windows, there’s a similar process, if you go to the next slide, where you really just go into settings, go into security, and switch the “on” setting. It’s pretty… It’s pretty easy, and it’s kind of annoying that it’s not done for you automatically.

But I recommend everyone just check and see, like, hey, is my firewall turned on? Because it’s a really easy step to immediately make your computer much safer.

INGRID: All right! We went through all the slides! (Laughter)

BLUNT: That was perfectly timed! You got it exactly at 1:00.

What’s the difference between a VPN and TOR?

OLIVIA: Okay. So for the TOR versus VPN answer.

As we said just a while ago, TOR uses onion routing and sends your data through multiple computers called TOR nodes to obscure traffic and anonymize you, while a VPN just connects you to a single VPN server. Those servers are often owned by VPN providers; sometimes you have to pay to use them, and other ones are free.

So I described it as kind of like a condom? (Laughs) Between you and your internet service provider? So Verizon knows that you’re using a VPN, but it doesn’t know what you’re doing on it, because a VPN would encrypt all your traffic.

It’s really important that you use a VPN that you trust, because all of your internet traffic is being routed through their computer, which is another reason people like to pay. Because you can have a little bit more faith that it’s like a trusted service if you’re paying for it? Even though that’s of course not always true.

But there is ProtonVPN, which is one I use that’s free, which is run by the same people who run ProtonMail, which I use. I haven’t had any problems with it.

You can use a VPN and TOR at the same time, which is what the question directly asked. And I believe that your ISP would know that you’re using a VPN, but because you’re using a VPN it wouldn’t know that you’re using TOR. Ingrid, if that’s not true, you can like clap me on that.

Because TOR is super slow and it routes your traffic through a bunch of different things, it can break a lot of websites, including video streaming sites like YouTube and Netflix. A lot of people use VPNs, however, so they can access videos or things that are banned in different countries by making it look like they’re in a different place.

But if you’re doing something highly sensitive or illegal, you’d probably want to use TOR, and probably some other precautions, too.
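For readers who want to see this concretely, here is a minimal Python sketch (not from the discussion itself) of the difference routing through Tor makes to what a website logs. It assumes the Tor daemon is running locally (its SOCKS proxy defaults to port 9050; Tor Browser uses 9150 instead), that the requests and PySocks packages are installed (pip install requests pysocks), and it uses api.ipify.org as an example service that simply echoes back the IP address it sees:

    # Sketch: compare the IP a website sees with and without Tor.
    import requests

    # Direct connection: the site logs your real public IP address.
    direct_ip = requests.get("https://api.ipify.org").text
    print("IP the site sees without Tor:", direct_ip)

    # Same request routed through Tor's local SOCKS5 proxy. "socks5h"
    # tells requests to resolve DNS through the proxy too, so your ISP
    # never even sees which hostname you looked up.
    tor_proxies = {
        "http": "socks5h://127.0.0.1:9050",
        "https": "socks5h://127.0.0.1:9050",
    }
    tor_ip = requests.get("https://api.ipify.org", proxies=tor_proxies).text
    print("IP the site sees through Tor (an exit relay):", tor_ip)

Run both ways, the first request logs your real IP and the second logs the IP of whichever exit relay your circuit happened to use.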

BLUNT: Thank you so much. That was super helpful. Do folks have any questions? Is there anything that people would benefit from sort of like going back and going into in a little bit more detail?

Someone just said: Is there a way around TOR breaking websites? I’ve used it and it throws a lot of captcha tests on regular websites.

OLIVIA: So Cloudflare kind of hates TOR? (Laughs) It takes a really aggressive stance towards TOR users, actually? There was like an Ars Technica article I read that said Cloudflare said 90% of TOR traffic we see is, per se, malicious.

So I don’t know if there’s going to be a time when you can use TOR and not have captchas act up, because Cloudflare sees that kind of activity as malicious activity.

Can Apple see what you’re doing on your computer or phone?

INGRID: “This may be hardware‑related, but does Apple see what you’re doing on your computer because you connect to the internet, e.g. any photos, videos you store?”

Okay, to make sure I understand the question: Is the question whether, like, if you’re using an Apple device, whether Apple is able to see or collect anything if you’re connected to the internet from that device?

Okay. So I think ‑‑ I mean, the answer to that is you would need to kind of tell them to do that? (Laughs)

So like, if you are using something like iCloud to store photos and videos, then yes, they would be able to see and have all of those. But in terms of, like, just being on the internet doing things on an Apple device? Apple can’t, like, kind of peek in and see that. I mean, other computers will know that you’re on an Apple device.

But yeah, you have to be directly interfacing with Apple’s network for Apple to be able to have anything on or from your computer.

OLIVIA: And when it comes to things like iMessage and iCloud, they… say? That that information is encrypted. Of course, it’s like not open sourced, so we don’t actually know how they’re encrypting it or what they do. But Apple has said for a while that communications between, like say two iMessage users?

So not someone using it to speak to someone who has an Android; that’s SMS.

But two iMessage users speaking to each other, that’s technically an end‑to‑end encrypted conversation. Apple does collect some information from you when you are initially typing in someone’s number to text them, because it pings the server to find out if that number is associated with an iCloud account.

So for iPhone users, that little moment between when a number that you’re typing in turns either blue or green, in that moment it’s sort of pinging Apple’s servers. So they do have a list of the times that that ping has occurred.

But of course, that doesn’t tell you if you actually contacted the person whose number you typed in; it just knows that you made that query. And that’s the extent, so Apple says, of the information that they collect about your iMessage conversations.

So, yes, they do ‑‑ they can technically see that information? But they tell us that they don’t look at it. So.

Open-source vs. Closed-source Technology

BLUNT: Can you explain a little bit more about open source or closed source technologies?

OLIVIA: Yeah! So, open source technologies are… basically, they’re apps, websites, and tools where the code that’s used to write and run them is publicly available.

When it comes to security technologies, it’s really… best practice to try to use tools that are open source, because that means that they’re able to be publicly audited.

So like, regular security experts can like go in and like actually perform an audit on open source security tools, and know that they work. Versus, you have a lot of paid security tools that you basically assume that they work because people tell you that they work?

And the public can’t really hold them to any, like, public accountability for whether or not they work.

Versus you can actually, like, test the encryption algorithm, say, of Signal, which is a messaging app, and all of their code is public information.

INGRID: Open source, it’s also like a way of… kind of letting people developing software kind of support each other, in a way? Because the fact that Signal is open source, it’s not just like oh, we can be accountable if Signal says it’s doing something but it’s not; it’s also a way to be like, hey, I noticed something. Is it working? And you can actually directly contribute to improving that technology.

It’s complicated ‑‑ I mean, the world of open source is complicated in that it still has elements of the, like, you know, snobby culture of tech, sometimes? But in principle, it’s very useful for being able to have technologies that are accountable and that have some element of public engagement and understanding.

How to Choose a VPN

BLUNT: Awesome. Thank you. And so I have another question in the chat: What are some good ways to assess the trustworthiness of a VPN, as you were discussing before?

OLIVIA: The way most people do it, I think, Ingrid, you could check me on this, is kind of by reputation. If you look up how to find a good VPN, you’ll find a lot of articles where people talk about the pros and cons of different ones. And you’ll be kind of directed to the ones considered by the public to be the most trustworthy ones?

INGRID: Yeah. And I think one way I evaluate companies sometimes on this is looking at their level of engagement with the actual, like, issues of user privacy?

So like, one of the, you know, things I ended up using as a reference for this workshop, as a guide for, like, different browsers, was a blog post by ExpressVPN. And they’re a company that doesn’t have to tell me anything about which browser to use ‑‑ there’s no reason for them to generate that content.

I mean, it’s good PR‑ish? But they’re not going to get new customers because I’m using a different browser now.

So some of it’s thinking, you know, is it open source or not? What is the like business model? And are they kind of actively, you know, engaging with issues related to user privacy?

We’ll talk a little bit more tomorrow about legislative issues around privacy, and that’s also another way. Like, have they taken positions on particular, you know, proposed laws that could harm user privacy?

To me, those are sort of like, how are they kind of like acting on principles?

OLIVIA: It also might be good to check whether ‑‑ yeah! Whether they have produced logs in court proceedings, so you know whether they actually track traffic.

Also, to see like, say, certain companies might be funded by other companies that, like, are less concerned about… public safety or privacy or human rights.

So that might also be a good way of like checking to see, like, the integrity of a VPN company. ‘Cause at the end of the day, they’re all companies.

Is WordPress a reputable option for sex workers?

INGRID: All right. The next question: Would y’all consider WordPress reputable for housing a sex worker website?

This ‑‑ thank you for asking, because it lets us kind of talk about something I wanted to figure out how to include in that whole presentation but didn’t.

So… Just as like a point of clarification, and maybe this is understood by people, but maybe for the video it will be helpful… WordPress? (Sighs) Is both a, like, hosting company and a piece of software. WordPress, I think ‑‑ WordPress.org is the hosting one? Or WordPress.com? I can never remember. (Laughs)

I think it’s WordPress.com. But you can host a website on WordPress’s, like, platform, and when you do that you will be running a website that is built using WordPress’s software. Which is also called WordPress! This is confusing and annoying.

But… you can also use WordPress’s software on another web hosting service. Like, you can install WordPress onto a hosting service’s website. I think a fair amount of hosting services today actually offer sort of a one‑step click option, where they’ll set up a server with WordPress for you.

In terms of WordPress, like, as the host of a website? And as a host for sex worker websites… I don’t actually know. I would say ‑‑ I would, like, check ‑‑ I would need to go check their terms of service? (Laughs)

I think in general… Yeah. I think with all hosting companies, figuring out which ones are kind of the most reputable is partly about looking at any past incidents they’ve had in terms of takedowns, and also where they’re located?

So like, WordPress is a company based in the United States, so they’re beholden to United States laws and regulations. And I’m guessing part of the reason this question was asked is that this person ‑‑ that you probably know a little bit about FOSTA‑SESTA, which makes it harder for companies to allow any content related to sex work on their servers.

And as far as I know, WordPress wants to be compliant with it and hasn’t taken a radical stance against it.

Blunt, do you have any…?

BLUNT: Yeah, I can say I think hosting with any U.S.‑based company right now has a certain amount of risk, and you can decide if that works for you or not. If you are hosting on WordPress right now, I would just recommend making lots of backups of everything, as like a harm reduction tool. So if they decide to stop hosting your content, you don’t lose everything.

And I also just recommend that for most platforms that you’re working on. (Silence)

Cool. So we have around 15 minutes left. So if there are any other questions, now’s the time to ask them. And if not, I wonder if Ingrid and Olivia could chat a little bit about what y’all will be covering in the next two days!

Okay, we have two more questions.

Can you reverse browser fingerprinting?

“This may be a digital surveillance question, but once you get browser fingerprinted, is it reversible?”

INGRID: Hmm. That’s actually a question where I’m not sure I know the answer. Olivia, do you know…?

OLIVIA: No…

INGRID: I do know that… on some mobile devices, you can, like, spoof aspects of your identity?

So, like, I mentioned MAC addresses are sort of this hard‑coded thing ‑‑ that’s just the ID of your, like, device? A phone can actually generate sort of like fake MAC addresses? (Laughs)

That are the ones it presents to the world? So if that sort of was a piece of your fingerprinted identity, that’s one way to kind of, like ‑‑ you know. It’s like you wouldn’t be a perfect match anymore? But… Yeah, I don’t know if there’s sort of a way to completely undo a fingerprinting.

Yeah. I will also look into that and see if I can give you an answer tomorrow, if you’re going to be here tomorrow. If you’re not, it will be in the video for tomorrow.
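As an editorial illustration of the MAC-spoofing idea Ingrid describes (not something discussed on the call): a randomized MAC address only needs the “locally administered” bit set and the “multicast” bit cleared in its first byte, so it can never collide with a manufacturer-assigned address. Here is a minimal Python sketch of how one could be generated; actually applying it to a network interface is done by your operating system, and phones that randomize MACs handle this for you:

    # Sketch: generate a random "locally administered" MAC address.
    import secrets

    def random_mac() -> str:
        first = secrets.randbits(8)
        # Set the locally-administered bit (0x02) and clear the
        # multicast bit (0x01) in the first octet.
        first = (first | 0b00000010) & 0b11111110
        rest = [secrets.randbits(8) for _ in range(5)]
        return ":".join(f"{b:02x}" for b in [first] + rest)

    print(random_mac())  # e.g. "3a:1f:9c:07:b2:4e" -- different every run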

Additional Digital Literacy Resources

BLUNT: Great, thank you. And someone asked: Are there any readings that y’all would recommend? I’ve read Algorithms of Oppression and am looking for more. I love this question!

OLIVIA: That… the minute I heard that question, like, a really long list of readings just like ran through my brain and then deleted itself? (Laughs) We’ll definitely share like a short reading list in the bibliography that we’ll send out later.

BLUNT: Awesome. That’s great.

Okay, cool! This has been really amazing. Thank you so much. I’m just going to say, one more chance for questions before we begin to wrap up.

Or also, I suppose, things that you’re interested in for the next two days, to see if we’re on track for that.

How do fintech companies use digital surveillance?

Someone asks: This is a fintech‑related question for digital surveillance, but can you talk about how that kind of works internet‑wise?

INGRID: Fintech…

BLUNT: For financial technologies. And how they track you. Oh! So like, if you’re using the same e‑mail address for different things? Is that sort of on the…?

OLIVIA: Like bank tracking? Like money type of…?

INGRID: So… Depending on the, you know, financial servicer you’re working with, like PayPal or Stripe or whatever ‑‑ in order to work with banks and credit card companies, they are sort of expected to kind of know things about you.

These are related to rules called KYC, Know Your Customer. And so part of the tracking ‑‑ or, not tracking, but part of the information that is collected by those providers ‑‑ is a matter of them being legally compliant?

That doesn’t mean it produces great results; it’s simply true.

And in terms of the other layer ‑‑ I don’t know as much about whether or not companies like Venmo or Stripe or PayPal are sharing transaction data? I’m pretty sure that’s illegal! (Laughs) But… who can say. You know, lots of things happen. That would be capitalism.

BLUNT: I also just dropped the account shutdown harm reduction guide that Ingrid and Hacking//Hustling worked on last year, which focuses a lot on financial technologies and the way that, like, data is sort of traced between them and potentially your escorting website. So that was just dropped into the chat below, and I can tweet that out as well in a little bit.

Zoom vs. Jitsi: which is more secure?

OLIVIA: Privacy/security issues of Zoom versus Jitsi… I also prefer to use Jitsi when feasible? But I’ve also found that call quality kind of drops really harshly the more people log on. Like, I don’t think we could actually sustainably have a call of this many people on Jitsi without hosting it on a different server.

Concerning how I handle the privacy/security issues of Zoom, they’re saying they’re going to start betaing end‑to‑end encryption later this month. I don’t know what that actually even means for them, considering that they’re not open source, right?

But I will say that one of the things that I tend to try and practice when it comes to, like, using Zoom is maintaining security culture amongst me and the people I’m talking to. Right? So I’m never going to talk on Zoom about, like, any direct actions, right, that are going to happen in real life. Refrain from discussing activity that could get other people in trouble anyway.

Like, while it would be nice to have, like, say this kind of conversation that we’re all having now over an encrypted channel, I think it’s generally much safer and ‑‑ I don’t like using the word “innocent,” but that’s like the word that is popping into my head, to talk about ‑‑ to use Zoom for education, even if it is security education, than it would be to actually discuss real plans.

So… It might be really beneficial to you, if you are, say, using Zoom to talk to a large group of people about something that is kind of confidential, to talk over, like, Signal in a group chat, or some other encrypted group chat platform, and decide, okay, what are you allowed to say over Zoom and what are you not allowed to say. And to think of Zoom as basically you having a conversation in public.

Assume for all of your, like, Zoom meetings that someone’s recording and posting it to ‑‑ (Laughs)

YouTube later! And that would probably be… that would probably be the most… secure way to use it, in general? Is just to assume that all of your conversation’s in public.

BLUNT: Yeah. I totally agree, Olivia. And that’s why this is going to be a public‑facing document. So, Zoom felt okay for us for that.

INGRID: Yeah. I mean, I think another way I’ve thought about this with Zoom is like, just remembering what Zoom’s actually designed for, which is workplace surveillance? Right? It’s like, you know, its primary market, like when it was first created, and still, is corporations. Right?

So also, even if you’re going to a, you know, public Zoom thing that is about learning something ‑‑ like, whoever is managing that Zoom call gets a copy of all of the chats. Right?

And even if you’re chatting, like, privately with one other person, that message is stored ‑‑ like, someone gets access to that! And that’s mostly just something to keep in mind with, yeah, what you do and don’t say. Like, especially if you are not the person who is running the call.

Think about what you would or wouldn’t want someone you don’t know to kind of like have about you.

What’s to come in the digital literacy lunch series?

BLUNT: Awesome. Thank you so much. Do you want to start to wrap up and maybe chat briefly about what we’ll be seeing in the next two sessions?

OLIVIA: Sure, yeah. So the next two sessions are going to be one talking more about how platforms work and sort of the whole, like, algorithmic ‑‑ bleh! (Laughs)

Algorithmic curation, and how misinformation spreads on platforms, and security in the Twitter sphere, rather than just thinking about using the internet in general. And then the third will be talking more explicitly about internet surveillance.

So we’re going to be talking a little bit about surveillance capitalism, as well as like state surveillance, and the places where those intersect, and the places where you might be in danger and how to mitigate risk in that way.

Operations Security (OPSEC): An Introductory Overview

OPSEC operational security meme

We live in an age of increased surveillance and censorship. Social media is a bastion for fascism. Abusers target sex workers, queer users, and people of color and prey on them without fear. Whorephobes and bigots alike use our vulnerability to their advantage through social manipulation, doxxings, and swattings. It has never been a more dangerous time to be a marginalized person online. And our first line of defense is operations security.

“Operations security (OPSEC) is a process by which organizations assess and protect public data about themselves that could, if properly analyzed and grouped with other data by a clever adversary, reveal a bigger picture that ought to stay hidden,” CSO writes.

OPSEC is to online safety what sex education is to sex: a necessary part of modern life that is underfunded, underappreciated, and rarely discussed in an approachable way. This guide is our attempt to introduce OPSEC in an accessible way to sex workers, activists, marginalized users, and allies who may not necessarily have the tech literacy to know about these harm reduction practices.

(Please note that this is an introductory overview to digital and technical safety, and it may not provide the full protection you need in your specific circumstance. For more information, see the links at the end of this article.)

Why does operations security (OPSEC) matter?

Imagine you’re a sex worker from New York at a Black Lives Matter march. While you were spraypainting a statue, an NYPD officer grabbed you, stole your phone, and forced you to use your Face ID login to unlock your messages. He was able to browse through your photos and text messages in detail. Luckily, your fellow protesters came in, de-arrested you, and brought you and your phone back to safety. You’re shaken from the ordeal, but the worst is over, right?

Well, no. The NYPD officer saw signs in your messages that you were engaging in full-service work. You accessed a hacked public WiFi near the march, and officers were able to grab your Twitter and Instagram account names. The NYPD was able to identify your phone and track you on the walk home. The police now have your address and enough evidence of some kind to draft up a warrant, and they’re eager to exact revenge.

But instead of immediately arresting you, they break into your WiFi connection and keep tabs on your Facebook posts, Twitter DMs, and Instagram chats. It’s a gold mine for the cops: they know that you’re not just going to multiple protests, but that you played a key role in pulling down multiple racist monuments. Not just that, they also have corroborating evidence to arrest a few of your fellow full-service workers who joined you for the “vandalism.”

You didn’t know the cops were spying on you. How could you? The game was rigged against you from the start.

Or, imagine you were never arrested in the first place. You advertise on an escorting website where you had to upload your ID. The escorting website has been raided by the feds, and facial recognition technologies, such as Thorn’s SPOTLIGHT, build databases off of escort ads. When the cops are going through footage, they are able to link an image of your face from the protest to your escorting ad and have access to your ID and social media accounts.

This is not a dystopian future; this is now. This is not to instill fear; this is to encourage you to protect yourself, protect your data, and to protect each other.

So, what is operations security (OPSEC)?

It’s no secret that the government can track your online activity. But surveillance is actually much more prevalent than most people think. When you visit a website, your connection leaks a ton of information about where you are located, down to your country, state, city, and even a guesstimate of your latitude and longitude. Meanwhile at work, you’re forced to use surveillance software like Cocospy, which sends your boss information on your social media posts, text messages, call logs, and more. And if that isn’t enough, predators, police officers, and right-wing fascists can easily break into your WiFi network and snoop on your web traffic with a few apps and some tech knowledge. It doesn’t take much to steal your login information.

Good OPSEC grants you protection against hacking, data theft, doxing, and surveillance. OPSEC is preventative in nature: it requires you to understand your biggest threats and the potential ways they can harm you. Identifying and conceptualizing this is called threat modeling.

There are various design philosophies for threat modeling. The Electronic Frontier Foundation’s Surveillance Self-Defense project offers a great starting model based on five key questions:

  • What do I want to protect?
  • Who do I want to protect it from?
  • How bad are the consequences if I fail?
  • How likely is it that I will need to protect it?
  • How much trouble am I willing to go through to try to prevent potential consequences?

Ars Technica also offers a valuable guide to threat modeling based on these four questions:

  • Who am I, and what am I doing here?
  • Who or what might try to mess with me, and how?
  • How much can I stand to do about it?
  • Rinse and repeat.

Threat models require careful consideration about the trade-offs to different protections. If you’re an online sex worker with a popular Twitter presence, it may be incredibly difficult or outright impossible to stop using social media. However, communicating with your full-service clients over a burner phone connected to Signal may be a good option to evade police surveillance.

Data handed over to the U.S. government by Signal in response to a subpoena is minimal. For more information, read here.

What is encryption?

“Encryption is a process that encodes a message or file so that it can only be read by certain people,” Search Encrypt writes. “Encryption uses an algorithm to scramble, or encrypt, data and then uses a key for the receiving party to unscramble, or decrypt, the information.”

Let’s say you want to send an encrypted message to another user. The words you type in – or the “plaintext” – are algorithmically encoded into something called “ciphertext.” Ciphertext can only be decoded with the matching decryption key. When your message arrives, the other user uses that key to convert the ciphertext back to plaintext.
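To make the plaintext/ciphertext/key loop concrete, here is a minimal sketch using the symmetric Fernet recipe from Python’s widely used cryptography package (pip install cryptography). This is an illustration, not how any particular messenger works: real services layer key exchange on top of primitives like this, while in this simplified example one shared key both encrypts and decrypts:

    # Sketch: plaintext -> ciphertext -> plaintext with one shared key.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # the secret key
    f = Fernet(key)

    ciphertext = f.encrypt(b"meet at the usual place")
    print(ciphertext)             # unreadable without the key

    plaintext = f.decrypt(ciphertext)
    print(plaintext)              # b"meet at the usual place"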

End-to-end encrypted messaging

Some services offer encrypted messaging where the service holds the key to your messages. This means the site can choose to decrypt your messages and read them or send your messages to law enforcement upon request. This is why the best form of encrypted messaging is end-to-end encryption.

End-to-end encryption “means that messages are encrypted in a way that allows only the unique recipient of a message to decrypt it, and not anyone in between,” Wired reports. “In other words, only the endpoint computers hold the cryptographic keys, and the company’s server acts as an illiterate messenger, passing along messages that it can’t itself decipher.”
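Here is a hedged sketch of that idea using the PyNaCl library (pip install pynacl): each endpoint generates its own keypair, only public keys are ever shared, and the “illiterate messenger” in the middle only ever handles ciphertext. Real protocols such as Signal’s add authentication and forward secrecy on top of this basic pattern:

    # Sketch: only the endpoints hold the private keys.
    from nacl.public import PrivateKey, Box

    # Each party generates a keypair on their own device.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Alice encrypts using her private key and Bob's *public* key.
    sending_box = Box(alice_private, bob_private.public_key)
    ciphertext = sending_box.encrypt(b"see you at 8")

    # The server relays ciphertext it cannot read.
    # Bob decrypts with his private key and Alice's public key.
    receiving_box = Box(bob_private, alice_private.public_key)
    print(receiving_box.decrypt(ciphertext))  # b"see you at 8"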

Sex workers, privacy advocates, organizers, and journalists commonly rely on end-to-end encryption to respond to their threat model. Thanks to social media and smartphones, end-to-end encrypted messaging is as popular as it is accessible, and there are a number of services you can use to keep in touch with others.

Popular end-to-end encrypted messaging services include:

  • Signal
  • WhatsApp
  • Telegram
  • Dust
  • Wire
  • Keybase
  • iMessage

Among these, the following are generally considered the best for the most private and secure messaging:

  • Signal – Open-source, strong pro-privacy stance, data collection minimal, zero-access encryption. Most popular
  • Wire – Open-source with similarly strong pro-privacy stance, phone number not required
  • Dust – Automatic 24-hour message deletion, phone number kept private after creating username, based on the Signal protocol

Note that each of these platforms has its pros and cons. For example, Signal requires your phone number, which may put sex workers at risk of being identified.

Encrypted email

In terms of email services, end-to-end encryption and zero-access encryption are preferred. The latter is a form of encryption that prevents service providers from reading your emails in plaintext while “at rest,” or sitting in your inbox.

Two popular end-to-end encrypted email services include ProtonMail and Tutanota. Both offer end-to-end encrypted communication with fellow service users, such as a ProtonMail user emailing another ProtonMail user.

Be warned that ProtonMail does not encrypt subject lines, while Tutanota does. Additionally, no email service can provide end-to-end encrypted communication if one of the recipients does not use end-to-end encryption. A ProtonMail message sent to an @aol.com account, for example, will not be encrypted in the AOL user’s inbox. Your correspondence will be encrypted at rest within your own inbox, however. For more information, read this author’s overview and review of ProtonMail.

(One workaround for this issue is PGP. Short for “Pretty Good Privacy,” this involves a sender encrypting an email with the recipient’s public key, and the recipient decrypting it with their own private key. Mozilla Thunderbird users can easily navigate this with the Enigmail add-on.)

Hiding your internet footprint with a VPN

A virtual private network is a service that lets users connect to an off-site server to route traffic over the internet. This connection uses an encrypted tunnel to protect your privacy. This ensures your outbound and inbound web traffic alike are secure.

Screenshot from a ProtonVPN user taking advantage of an encrypted connection.

“When you browse the web while connected to a VPN, your computer contacts the website through the encrypted VPN connection. The VPN forwards the request for you and forwards the response from the website back through the secure connection,” Chris Hoffman writes for How-to Geek. “If you’re using a USA-based VPN to access Netflix, Netflix will see your connection as coming from within the USA.”

VPNs come with their trade-offs. Your ISP can see when you’re using a VPN, as can other websites. VPNs are much more common than in previous years, although simply using one may be enough to gain a company, police department, or state entity’s attention. Your information is in the hands of your VPN provider, and some companies are more trustworthy than others. Do your research before choosing a VPN, especially if you’re planning to engage in high risk activism work or full-service sex work.

Several popular, vetted VPN services include:

Privacy-friendly software alternatives

When corporations control the programs you use, they control access to the data you create with their platforms. There are plenty of privacy-friendly software alternatives to some of the most basic proprietary software out there, many of which are open-source. Microsoft Office, for instance, has a free, open-source alternative called LibreOffice. Here is a list of alternatives to some of the most popular websites and services out there:

Additional alternatives can be found on PRISM Break.

Switch to Linux and minimize data tracking

If you’re on a Windows or MacOS computer, your data is being tracked. Microsoft and Apple are notorious for collecting and storing an immense amount of information on their users. One of the few viable alternatives to these corporate tech giants is Linux.

A screenshot of the Linux Mint start menu.

Linux is not one operating system but a family of free, open-source OSes built off of the Linux kernel. In 2020 there are many distributions (or “distros”) built for user accessibility, and installing one is as easy as putting a boot image on a flash drive and installing the OS on your computer of choice. You can replace your current OS with Linux, create a “dual boot” option that keeps your current OS, or even install Linux on an external hard drive and use your distro across devices. Many distros support drive encryption, letting users protect their entire OS and all of its contents prior to boot-up.

Look into the following Linux distros for an accessible, privacy-friendly experience:

  • Debian – One of the most accessible secure distros available, relies entirely on free, open-source drivers and applications
  • PureOS – Security and privacy-based Linux distro
  • Linux Mint – Easy to use, similar in nature to Windows. Installation is easy, OS is highly stable, and overall a solid distro for newcomers
  • Manjaro – Like Linux Mint, user-friendly design and lightweight distro perfect for switching from Windows

For more information on Linux, visit FOSS Post’s beginner’s guide to the operating system family.

Conclusion

This guide goes over technical solutions sex workers and activists can take to protect their data. However, the role human error plays in OPSEC cannot be overstated. A trusted VPN, secure Linux distro, and end-to-end encrypted email account will not protect you if you set all of your account passwords to “password,” or if you happen to share your address on social media.
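If passwords are your weakest link, randomness is cheap. As one illustrative option (not part of the original guide), Python’s standard secrets module can generate strong passphrases and tokens. The tiny word list below is a stand-in; a real passphrase should draw from a large list such as EFF’s diceware list of 7,776 words, where each word adds roughly 12.9 bits of entropy:

    # Sketch: generating strong credentials with the stdlib.
    import secrets

    # Illustrative only -- use a large published wordlist in practice.
    WORDS = ["copper", "lantern", "orbit", "velvet", "quarry", "nomad",
             "thistle", "ember", "canyon", "drift", "mosaic", "harbor"]

    passphrase = "-".join(secrets.choice(WORDS) for _ in range(5))
    print(passphrase)                # e.g. "ember-quarry-drift-nomad-velvet"

    # Or a random token for accounts a password manager fills in:
    print(secrets.token_urlsafe(16)) # 16 bytes, ~128 bits of randomness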

Your OPSEC’s weakest link usually comes from an outside party: a client, a fellow organizer, a family member, or a friend. Ideally, you should send this guide to your trusted comrades and suggest they begin improving their digital security too. But you must meet your social network where it’s at. If your client does not understand why they need to use ProtonMail to communicate with you, it may be easier to simply purchase a burner phone for sex work and exchange numbers on Signal.

Always do your research before using any operating system, device, phone app, or communications platform. Services such as Telegram are not quite as secure as people assume, and products like ProtonMail are not fully upfront about their encryption features. You are as safe as the products you trust, so make them earn it.

At times, you may need to sacrifice convenience for privacy by taking certain conversations offline. Not all conversations can be had safely digitally.

There is no such thing as the perfect security system. The advice activists and tech freedom advocates provide is based on what we currently know and consider best practices. New laws, leaks, and technological innovations may introduce changes to your threat model. Stay connected to your local tech activist community to know more about contemporary OPSEC guidelines.

A closing note on privilege

Tech resources are a privilege. They are gatekept by white cishet men who assume their relationship with the world is the default. This not only drives women, trans people, sex workers, and Black activists from tech spaces; it cultivates exclusion. Poor OPSEC goes all the way back to the white men who get to decide who can access tech spaces, who cannot, and what issues the community cares about.

Your ability to successfully build a new computer, buy a new laptop, or even purchase a flash drive is dictated by your race, class, gender, and sex working status, among many other factors. It is the responsibility of the privileged to lend a hand and help the marginalized protect themselves. This can be done in numerous ways – running workshops, donating devices, volunteering one-on-one tech support, funding mutual aid projects, or directly giving your money to the most marginalized among us. No matter how you do it, it’s our responsibility to make sure digital safety is accessible to everyone.

Special thanks to Raksha Muthukumar and SX Noir for feedback on this post’s initial draft.

To read more from Ana Valens, click here!

Further Reading

EFF’s Surveillance Self-Defense Project – An in-depth overview of digital security and safety designed for new and experienced tech users alike

Attending a Protest: Surveillance Self-Defense – Digital safety guide by the EFF specifically for protesters, highly recommended

Protesting for Black Lives Matter? Follow these data privacy tips – For protesters attending Black Lives Matter marches or other events. Written by this guide’s author

How to Protest Safely in the Age of Surveillance – Additional overview for Black Lives Matter protesters

ProtonMail Review – Overview of ProtonMail, its features, and its weaknesses. Written by this guide’s author

GOP introduces bill that would give police easy access to encrypted data – Overview of a Senate bill targeting encryption. Would federally mandate “device manufacturers and service providers” to work with law enforcement in “accessing encrypted data if assistance would aid in the execution of [a] warrant”

How To Stop Instagram From Tracking Everything You Do – Overview of ways you can prevent Instagram from collecting personal data. The best option is, unfortunately, to delete Instagram from your phone

Threat Modeling


What is Threat Modeling?

“A way of narrowly thinking about the sorts of protection you want for your data. It’s impossible to protect against every kind of trick or attacker, so you should concentrate on which people might want your data, what they might want from it, and how they might get it. Coming up with a set of possible attacks you plan to protect against is called threat modeling. Once you have a threat model, you can conduct a risk analysis.” – EFF

What are Threat Modeling Questions To Ask?

1. What do I want to protect?

2. Who do I want to protect it from?

3. How bad are the consequences if I fail?

4. How likely is it that I will need to protect it?

5. How much trouble am I willing to go through to try to prevent potential consequences?

What are other Threat Modeling Concerns?

What are my assets?

Who are my adversaries?

What are the threats of my adversaries?

What is the risk of ___ happening?

What does a sample Threat Model look like?

Example: Sex Work Provider in NYC

Assets: Photos, legal ID, address, social media accounts, email, communications, texts, bank account, payment ledgers, contacts.

Adversaries: Cops, stalkers, family, exes, journalists, careless people, catfish, trolls, anti-sex-work ideologues, algorithms.

Threats: Location-tracking spyware, doxxing, blackmail, reports to police, stolen photos, intercepted communications, falsified charge or arrest reasons, reporting your status as a sex work provider to your “vanilla” job.

 

@babyfat.jpeg on Lesbians Who Tech

Last year, two organizers from Hacking//Hustling were rejected from speaking at the Lesbians Who Tech convening in San Francisco, which took place shortly after SESTA-FOSTA was signed into law. Hacking//Hustling provided a partial scholarship to Baby Fat (@babyfat.jpeg) to attend and make sure that there would be sex worker representation at the conference. Baby Fat’s reflections on her experience at Lesbians Who Tech as a sex working Femme are below.

A few months ago, I was able to attend my first Lesbians Who Tech summit thanks largely to the support of my community. At the time of attending, I was working as a digital media associate at a Queer healthcare nonprofit. Most of my 9-5 background has come from my work in Queer nonprofits, mostly in direct outreach. For the last three years I have worked in tech-specific positions within nonprofits, skills which I was able to acquire because of my hustling. I’m from a nontraditional background, but hustling has taught me everything I know about tech, marketing, and community management.

It’s worth mentioning I was able to attend the conference because I was awarded a partial scholarship. I attended the summit because I have always had a passion for social media and believe in its ability to connect community and provide accessible education, especially as it relates to Queer sexuality and wellness. From a hustling perspective, it’s the best way for me to engage and advertise to those who utilize the multitude of my services. Post-FOSTA/SESTA, I have had to rely even more heavily on social media and have since begun operating more discreetly.

While the conference was exciting and I was able to connect with some great folks, I often felt that some overall nuance was missing. There was a lack of intentional conversations around gentrification, sex work, and Queer complacency. Navigating the space as a fat femme sex worker was complex and exhausting at times, between being unable to fit in certain seating, being talked down to by masc attendees, and feeling uncomfortable disclosing the extent of my work. Because the bulk of my 9-5 career has been in nonprofits, a majority of the conferences I have attended have been specifically dedicated to Queer theory, resistance, and community building. However, these spaces often fail to see the importance of tech within these movements and have been slow to adapt to the changes tech has created in communities. I think LWT is doing better work than most other tech-specific conferences, but I do think they could benefit from adopting some of the approaches and topics Queer nonprofit conferences have.

Throughout the summit, I heard no mentions of gentrification from LWT leadership, which felt especially out of place considering that LWT seeks to empower the very people gentrification disproportionately affects. While gentrification has been a popular conversation in tech spaces, having been discussed at length, I can understand how it might feel like it doesn’t need as much attention. But I still feel it’s incredibly important to have some intentional dialogue and education around it. I’m from Chicago, and the city’s recent tech expansion and attempt at being a global city has reinvigorated the conversation of gentrification and tech. If LWT truly aims to create a more intersectional and diverse tech workforce, then they need to fully engage the communities that are being displaced by tech gentrification. LWT leadership needs to recognize they have a platform to educate and incite change. Choosing not to talk about gentrification is choosing to be complicit in it.

At the root of this complicity are respectability politics, which LWT engages in heavily in order to maintain funding, connections, and a respectable reputation. But with these politics comes the erasure of some folks who rely on tech for their safety and economic stability. Sex workers have always been at the forefront of using and building the popularity of tech platforms and services. Between navigating digital banking, advertising online, and censorship on social media, sex workers utilize tech at significant rates. Sex workers made Cash App and Venmo mainstream, and continue to be a driving force behind both banking systems’ growth. But both systems, as well as most social media platforms, have made it increasingly difficult for sex workers to continue using them.

I went to LWT knowing that there were no formal mentions of sex work in the programming, an oversight considering the historical connections between sex work and Queer folks. After all, pride was started by Marsha P. Johnson, a Black Trans woman and a sex worker. Countless other Queer revolutionaries, like Sylvia Rivera, Amber L. Hollibaugh, and Miss Major, have been on the front lines of Queer liberation. But as Queer folks have become more assimilated into mainstream culture, Queer sex workers have been pushed farther to the fringes by their own communities.

Whenever I was in casual conversation with other attendees, the mention of sex work would make them uncomfortable. When I disclosed my experiences navigating social media as a sex worker, I could feel them try to calculate what type of work I did. It felt like I had to prove my credentials and cleanliness to them. A few people inquired about what type of sex work I did, and I generally got the feeling from them that some forms were more acceptable than others. Oftentimes folks would withdraw from the conversation or, worse, explain to me how they knew things were “difficult” because they read a Vice article once. When I pressed them for ways that they were working to make their companies and products better for sex workers since they read that Vice article, they often said there wasn’t much they could do because they weren’t a decision-maker or programmer. But I think that’s just coded language for “I don’t want to do anything.”

I don’t think it’s a matter of people not understanding the difficulties sex workers face while trying to navigate tech. I think it’s an issue of respectability politics; additionally, those who are willing to make change are unsure where to start. Sex work, despite what sex positivity would have you think, is still incredibly stigmatized, especially within educated Queer spaces like LWT. Leadership at LWT has the power to educate attendees on the nuances of tech and sex work and can inspire attendees to do more within their positions, but once again, they choose not to.

The high point of the conference for me was being able to see Angelica Ross speak. Ross has been incredibly vocal about the importance of sex workers in tech and has provided visibility to the larger movement. I want to see more dialogue around sex work, and sex workers speaking and facilitating conversations, at LWT in the future. Additionally, I would like to see LWT engaging more with sex workers by partnering with sex worker-specific organizations and speaking about sex work more vocally on their digital platforms. I think engaging more sex worker-based organizations would encourage more sex workers to attend, and if anyone needs better tech, it’s sex workers.

Publicly talking about sex work not only educates civilians on the nuances of tech and sex work but also actively destigmatizes sex work in tech spaces, making it easier for folks to openly (and comfortably) talk about their narratives as sex workers. I’m critical of LWT because I want it to succeed; I want people to feel comfortable and for tech to be reclaimed.