For over six years, Hacking//Hustling has worked to build the capacity of sex workers and survivors to create new technologies and interventions that increase safety. Through programming, research, and convenings, we worked to explode the definition of technology to include harm reduction strategies, mutual aid, organizing, art, and any/all tools sex workers and survivors develop to mitigate state, workplace, and interpersonal violence. Through our interventions in the academy, hustling institutions, moving resources, and creating space to lean deeply into community care, Hacking//Hustling has leveraged its connection to tech spaces to support the safety and survival of people who trade sex.
Hacking//Hustling formed in 2018 as a community response to FOSTA-SESTA and the shuttering of Backpage. We grew into a network of sex workers, survivors, and accomplices working to redefine technologies toward uplifting survival strategies that build safety without prisons or policing. Born out of crisis organizing, we have spent the past few years learning to value slowing down and prioritizing care. We have prioritized rest and recovery, and continuously found out what those words look like in practice. With this space, we have continued to show up as our full selves, focused on healing, and made the intentional and thoughtful decision that the time has come to sunset Hacking//Hustling.
This is a difficult letter to compose, not only because when chapters come to a close there’s a grieving and sadness that can come with that, but because there’s been so much work and struggle by us and our extended communities over these years that it is overwhelming to try and hold. So we won’t try that here. Instead, we want to express our gratitude, collective love and rage.
We have, like so many sex worker-led and radical organizations, struggled amidst criminalization and crisis. We already know about mutual aid and care, but we had to significantly pivot when the COVID-19 pandemic hit. Six years after forming, our core collective members are graduating from PhD programs, pursuing new careers, fellowships, and acclimating to increased disability. Life moves, and capacity changes. Our goal with Hacking//Hustling was always to disrupt institutions, and part of that is knowing when to sunset, allow the work to take other forms, and support each other in new life pursuits.
It would not be possible to fully reflect here on all of the labor, the principles we moved with, the strategy and tactics we’ve experimented with, and the ways we’ve held each other. Instead, we hope our many communities will help us do this, in comments, in messages, in conversation with each other, in the ways you all may continue to call upon zines, toolkits, recorded workshops, art we’ve shared, and all the ways we will continue to carry lessons learned forward.
We have seen and felt major changes: shifts in popular media, where "sex work" and photos of sex workers protesting now accompany news articles more often than the word "prostitute" and disembodied leg photos; more sex worker research getting funded; fellowships; community-made media; new survivor-led mental health efforts; platforms reaching out for sex worker-competent consultancy; sex worker art and theoretical writing; new tech workarounds. A chorus of hopes and vision, and so much learned together. So much work we admire at the intersection of sex work and tech is being held and brought into being by these (and of course more) comrades!
As we reflect on our years of work: we produced some of the first research on the impact of FOSTA-SESTA; hosted a sex worker-led convening at Harvard and a seven-day conference, Sex Workers Organizing Against Barriers, at Cornell; piloted a Formerly Incarcerated Sex Worker Tech Support program; hosted community calls and digital and legal literacy workshops to break down shitty tech policy; produced programming on the history of sex work and technology; and, most importantly, did our best to get sex workers paid…in cash. But what we are most proud of is how we consistently showed up for each other, prioritized community care, and moved at the speed of trust and capacity.
A community organization should not be a capitalist formation that is always seeking expansion in the name of expansion, at the expense and exploitation of its members. Hacking//Hustling has created community, a body of work, and a space for disparate projects to be housed. We can create these spaces while remembering that ultimately those spaces exist for and are made up of people, and that there will come a time for that work to transform and iterate. The act of sunsetting prioritizes people over institutions and organizations. Sunsetting is a reminder of the value of our non-productive labor time, and ultimately our health, our joy, our value without titles. Sunsetting is a reminder that we cannot work without rest.
But we know that work never ends when an organization sunsets. It ripples outward, with people taking lessons learned into other formations and projects. We encourage everyone to follow Data 4 Black Lives, Digital Defense Fund, T4Tech, Decoding Stigma, Kink Out, Safer Movements Collective, the Support Ho(s)e Collective, and Veil Machine, and to continue making space for cross-movement conversations and skill sharing across causes for bodily autonomy. Our struggles are interconnected, and our freedom is bound up in one another's.
Our website (as well as our Instagram and Twitter) will remain a living archive of resources, research, and creative projects. We hope it will continue to be useful and support ongoing sex worker-led liberatory work. You may even see a few in-progress efforts shared out over the next year or so.
We wish to extend immense gratitude to our many communities: our fierce abortionists, our trans tech hacker family, sex working/trading/hustling co-conspirators, our incarcerated comrade-teachers, and so so many more.
This work would not have been possible without, and in spite of, the sex trades. Our labor, community labor, whore labor, moves mountains. And while a major accomplishment for us was securing modest funding from two endowments that have never before funded US sex worker organizing, the majority of our funding has come from our own labor in the sex trades.
Donna|Dante, an organizer with the Support Ho(s)e Collective, shared these words during a reflection and visioning session we held last year, about a pilot program we co-created: “I collaborated in SxHx with the Formerly Incarcerated Worker Support Program, doing that work around several other jobs, alongside others who believed in supporting people transition from prison. An entire group of people saw to compensation for my skill sets, saw my skills as valuable, my dream of being valued for my carework wasn’t just a dream, it really helped with my mental health at that time. It was really special to me, given my experiences with incarceration, this was pivotal and helpful for me. I was part of something bigger than myself.”
Our longtime collaborator, Lorelei Lee shared, “A place where my work is taken seriously, and that I’m taken seriously. It’s like, I always got angry at being called stupid & having that be assumed – that stigma that all sex workers are stupid. And the affirmation of being seen as smart, capable, serious – my whole life I was waiting to get that affirmation from civilians, but of course that never felt like a real part of my identity until whores reflected that back to me. We are essentially a school. If you ever take an art class, they talk about the new york school— it’s just group of people who get together and do work they feel is important, citation to each other, sex worker writing, data, created research, organizations, it is intentional. We have done that intentionally, built this collective way of imagining, creating out of imagining, elevating each other all together.”
One of our founders, Blunt, has said, “Community is the technology we need to invest in.” As our work evolves into other forms, we remain consistent in our call for abolition, in the here, in the now of us.
Where the name “Khady” appears, that is Juliana Friend, whose participation was supported by the Alternative Digital Futures Project at the Center for Long-Term Cybersecurity (CLTC) at UC Berkeley. Juliana Friend is an anthropologist and public health researcher focusing on technology and health. She first met San and Maya in Senegal in 2017.
Full English translations, followed by the original conversational dialogues of Wolof and French.
Tekki yu mat sëkk ci Àngle, topp ci waxtaani waxtaan yu njëkk ya ci Wolof ak Français.
Traductions complètes en anglais, suivies des dialogues conversationnels originaux en wolof et en français.
Interpreter/translator’s note:
1) I mark in red a few places where I want to check that I am hearing and/or translating the statements properly.
2) I hesitated about whether or not to include and translate what I said as an interpreter: either for clarity, transparency, or both.
———-
Kira (K): Khady do you want to do the honors of posing the first question in English, but also in Francais or Wolof for us?
Khady (KD): I would…I wonder, Kira, you said it so well. Would you want to do the English version and then I’ll translate?
Kira (K): Sure, yeah!
Fleur (F): I started a google doc.
KD: That’s so helpful, Fleur.
K: So the initial question, the first part of it: what are our most pressing concerns? That could also mean the most pressing threats that our communities are facing, both on and offline? We can take turns responding to that and then we can shift.
KD: Perfect.
Sana (S): OK, I will answer. Because what’s hardest about the internet and face to face, on the internet side. We aren’t completely integrated with the internet. For example, when COVID-19 came, many had problems finding work. Because at that moment, we didn’t know how Corona worked. To have contact with people, many people just stopped working. If you take someone like me, I stopped completely, like some other people did. You understand? But the internet, on the one hand, if it had security, it would suit us much better than contact. If it had security, I want to be clear, it would suit us much better. Because on the internet, you don’t risk anything. That’s my brief introduction. We can discuss.
Maya (M): Security, internet, it has security. It has more security compared with face to face. [S: that's what I was saying] in relation to COVID. Because COVID cannot infect you. [S: Exactly!] Even STIs, HIV, they cannot infect you. But the insecurity that the internet has, is what we were talking about. Your image can be recorded, and people can commit blackmail.
S: For example, video.
M: If it’s your voice, too, they can blackmail you
S: So in Senegal. Me, my voice was put on the internet. I showed it to you Khady, do you remember?
KD: Yes.
M: But still, we worked on the internet. But there is no security. If there was security, everyone would be working on the internet. Because contact is not safe. You can contract STIs, HIV/AIDS, COVID… you know, COVID even now is getting people. Currently, today we are at over 100 [cases] in Senegal.
KD: 100 cases today?
M: 100 cases in Senegal today. So that means that COVID is coming back. So if we can do video calls, that is better.
S: There’s less risk.
M: There’s less risk. But it’s not securitized.
S: It’s not securitized
M: There are some people who, who have means, they prefer to wait for a while, until COVID goes down again. But if there were security, if there were security, everyone would be working on the internet.
S: That is for sure.
M: That is for sure. Even me, who is talking with you right now, will work online.
S: Because there is less disease.
M: You have less disease, you don’t have anything. You stay at home. Your clients will have their pleasure too.
S: That’s right.
4:45 KD interprets.
Èmoussée (E): Thank you for that. That’s so interesting. I have a follow up question if that’s OK. I’m curious, part of what I was hearing was that it’s both the visibility of online sex work that causes a security problem. I’m curious to hear a little bit more about what they conceptualize as security or privacy that would make online sex work a more tenable option.
KD: So, what measures of security would make online sex work more viable? Got it. Yeah, that’s a great question.
E: And I think, what do they mean by security? Is it from the client? Is it just the visibility that makes it feel insecure? I guess I’m curious about elaborating a little bit.
8:15 KD interprets.
NOTE: [I didn’t adequately convey the “visibility” part of E’s question. My apologies. Something we could return to next time, if you like]
M: Insecurity, it’s in relation to the clients. They can capture video, your image. How they do it, they capture/record and commit blackmail, threaten to post it on the internet. Or they capture your voice, audio. Even on the telephone they can capture your voice. It’s about blackmail. The security that we need, for example, the person who owns Facebook also owns Messenger and Whatsapp. He could arrange it so that whoever uses them can’t capture an image.
S: To protect you.
M: You couldn’t capture an image. You couldn’t capture audio. Therefore you couldn’t do blackmail. That’s what I want to emphasize about security. That’s what I think.
S: That’s what both of us think.
9:50 KD interprets.
F: And we know that technology exists right?
K: Yes it does.
F: Like, Netflix and all the streaming companies made it so that you can no longer even share a video on Zoom. So that, it seems totally possible.
K: It seems almost as if, by design, they don’t care about individuals’ personal safety and security online, and only protecting their property rights, right? And that’s what these major corporations care about. They care about property; they don’t care about people. We know that they can freeze or capture images and use screen recording. We know that they can disable downloading whenever they see fit. We see this all the time internationally, in the United States – like, we know these corporations have the capability to offer us, just people, the same level of protection and privacy considerations that they themselves enjoy as corporate entities. And yet, those things are not extended to us, which means that, rightfully as Maya and Sana have pointed out, and as Fleur has mentioned, this is the thing that we’re dealing with, is this discrepancy of power and access, right, when it comes to digital and online security.
And I also think about access. And maybe I’ll end with this as the other thing that I’m thinking about is actual material technology to facilitate transitioning to online work. And I think about the barriers to doing some of that online work, that would make several of us on this call feel safer in transitioning, right, is actually having the means to do that, which means having a computer that can connect to high-speed internet, having high speed internet in the first place, having a stable phone line, having a light, having a phone or a camera device. These are things that can become so exorbitantly priced and out of our reach. And sex workers, everyone needs these things to communicate in our modern society, but sex workers rely on these technologies, and they’re kept so far out of our reach so often. So I’m thinking about access here too as part of this security conversation and personal safety conversation that we’re having.
F: And I would add to that: home. Having a home to work out of. And Khady I know you have a lot to translate so I won’t say more on that, but I will get back to it. [laughter]
KD: No thank you for, you know, marking that.
14:12-17:36 KD interprets.
S: In parenthesis I can share that me, I have two apartments. I’m forced to move between my apartment and where I work. That is to say that it comes back to resources, which is why I’m mentioning this. You know, in my apartment, I can’t talk about everything. I’m forced to travel to [neighborhood X : maybe we should omit the neighborhood name to protect anonymity.] You know? And all of that is part of security. Because here I am free to talk about whatever I want, but at my home, I can’t talk about these things. So as Kira was saying, you have to have somewhere, have your apartment, have this, and that, and that and that. You have to have so many things. And it requires money to obtain them. If you have the means, and a little bit of protection, it will be alright. You can work online, chill, without problems. You won’t have any problems, any worries. Like me, what I was saying. Like now, it’s recording me. For example, you know, zoom, if you record something it says, your call is being recorded. You have to ‘accept’ that. If they had this on the internet in general that would suit us. For example, Khady, if you record me, if there is a message that is sent to me, like, “Khady is recording you.” That way, in an instant, I could stop the call. Or “Khady is doing a screen capture” like Snap. Snapchat. If they do a capture, automatically I will know that Khady is doing a capture. Zoom, just a moment ago, when it started recording, it said, we are recording. Then I’m the one with the right to accept or not. But I accept. If there were security like that, that would suit us. When you record, when you capture, I would know all of that. After that, we can figure out our apartment, our devices, etc etc. Everything. That’s what it’s about.
M: Regarding security, there was one day when I told you [Khady], it’s a computer that I need. Because sometimes when you’re using a small phone, you can’t do what you want to do. But if it was a computer, I could set it here, and while I talk to someone, there it sits. Work would be easier and faster. If it’s a small cell phone, if it’s just this, you can’t do what you want to do. To hold it and do your work at the same time, it’s difficult. If I had a computer, I would have a way to set the screen aside, position it, and turn on the camera. Insecurity, there is something else that is part of the issue in Senegal, and that is the police. Because sometimes you are talking to someone you think is a client, but it’s actually the police.
S: They take our money. Like the day before yesterday, the day before yesterday the police came and stood in the stairway to take my clients, take their money. Anyway, Senegal, Africa, it’s really something.
M: Yes, that happens these days.
S: The police should make us more secure, however they do not make us secure.
M: Exactly
S: If you are a sex worker, they can humiliate you.
M: They can rip you off.
KD: So the police were standing in your house taking your clients, or? I didn’t get that.
S: The police were standing in the stairwell. A police officer came and stood in the stairwell. Waited for a client to enter to receive services, and when he came out again, argued with [the client]. I said, “Commissioner!” I tell him, “get out of the house.” I go up to the commissioner and tell him, you don’t have the right to enter a peaceful house at midnight without an arrest warrant. I talked with him, and he left.
KD: He left, eh?
S: Because I know what I’m talking about. I know my rights. I’ve been at this for a little while now. They have to have an arrest warrant. They should leave the house, they should not enter the house.
M: Good.
S: Just recently I was having an issue with a police officer, and that’s not normal. All of that is part of security. The police should give us security.
M: And you can be talking with a client, but it turns out it is not a client. You’re talking to a police officer.
21:25: KD interprets
E: I just wanted to respond to what I hear Sana talking about, is this culture of consent, and how it’s not built into tech. And I think it goes back into the previous conversation that we were having of how the platforms already have all of these tools. It’s just used to create further violence against marginalized communities. And I think of how when I was deplatformed or kicked off of Instagram for being a sex worker, my IP address was banned, so I wasn’t allowed to make a new account on [the/a ?] device. But for many years when someone was harassing me online or stalking me, I couldn’t block all of them. And I think Instagram just gives this option now, in the last year or two, when the technology has been around for many years. And so there’s this power dynamic where these technologies exist, but it’s who has access to them. And this idea that this culture of consent isn’t built into technology when so much of what tech does is just extract data. They’re almost dependent on our lack of consent to have this type of relationship. And I love what Sana was saying about the Zoom notification, which is really quite new. I think it came out, like, two years into the pandemic, asking people for consent to record, which is such a small thing, but it provides this space where you’re able to consent with how you’re interacting with a technology that you’re working with. I think there’s so much more that we could see built into technology to give users more choice of how they interact with technology. I guess I can stop there for translation, or if anyone had something they wanted to add.
KD: Maybe if you wouldn’t mind, I’ll just translate that. I think maybe in smaller chunks, I might be able to do a slightly better job. So thank you for that.
26:49 KD interprets
M: What you were talking about makes sense. Because already, those who own the technology seek money, as you were saying. They’re after money. If they know that if they did something, they wouldn’t get money, then that wouldn’t work for them. They wouldn’t take it down. They’d refuse. What brings money, that’s what they want out of that. So they’re looking out for their own interests. Their interests. Not our interests, but their own interests. They’re looking at what they can gain, rather than what we can lose. That’s the problem. That’s the disadvantage.
30:25 KD interprets
NOTE: In this one I made an interpretation of what Maya was saying when I added “…is consent in their financial interests?” Kira responds to this question.
K: Yeah, that’s the question. And I think as we can see, unless people collectively demand and organize, put pressure on and use literally every effort in our means to try and curtail the kind of, extremely violent profit-motivated, right, anti-people policies and practices and terms of service, etc., that these tech companies have, we’re not going to see change. And even still, I try to remain hopeful, that we can really create and imagine ways of utilizing technology. But it is very apparent that these companies, left to their own devices, we wouldn’t even have what we have currently, which is not enough. And so, I think that might transition us into talking about what we do in the meantime.
Hacking//Hustling + Sénégal Comrades Conversation “Grande Causerie”: Part Two
August 8th, 2022
Interpreter/translator’s note:
Maya and Sana indicated a few identifiable people and institutions, so these have been removed.
I’ve highlighted in red some other potential identifiers that we haven’t yet discussed together.
In blue are some notes or questions for you. I put these in-text rather than in comments because I thought it might be easier to read on a mobile device. Also in blue, I added a few context clues in parentheses that you can delete, keep, or change.
In green, these are words or sentences that I want to check with you. I’m not sure I heard, captured, or interpreted them correctly.
———————————————————————————————————————
Sana (S): Kii mooy Fleur. Fleur.
Sana: This is Fleur. Fleur.
Fleur (F): Yes?
F: Oui?
S: Very nice. Maangi commencer men de. Khady, maangi commencer wax Anglais.
S: Very nice. I’m starting to be able to. Khady, I’m starting to speak English.
F: Ah yah, I gotta learn some French. I was trying to learn Spanish over the summer but—
F: Ah oui, je dois apprendre le Francais. J’essayais d’apprendre l’espanol cet été mais –
Khady (KD): There’s some crossover
KD: Il y a quelques similarités.
F: Yah
F: Oui.
KD: Just French pronunciation is crazy but.
KD: La pronunciation en Francais est un peu fou mais.
S: Wolof moo gen a yomb
S: Wolof is easier.
Maya (M): Mu jang Wolof. Wolof moo gen a yomb.
Maya (M): She should learn Wolof. Wolof is easier.
F: Ah alright, good tip.
F: Oui ok. Bon conseil.
S: Langue Nationale u Sénégal.
S: The national language of Sénégal.
F: Mm Hmm
F: Oui oui.
KD: Anyway, would someone want to kick us off, maybe read the agenda?
KD: En tout cas, est-ce qu’il y a quelqu’un qui voudrait démarrer, peut-etre lisant l’agenda?
S: OK
S: OK
F: Do you want me to read it in English and then you read it in French and Wolof?
F: Voudrais-tu que je le lise en Anglais et puis tu peux le lire en Francais et en Wolof?
KD: Sure, yeah.
KD: Oui c’est bon.
F: Ok cool, so the three things that we said we were going to come back to: first, legal models. Second, talking through how we navigate our safety concerns, and especially collective demands. And we wanted to learn more about sutura. And then last, just talking about how sex workers use the internet.
F: OK cool, alors les trois points dont on voulait continuer de discuter sont d’abbord, les systems de loi. Deuxièmenet, discuter comment gérer la sécurité, et surtout le plaidoyer collectif. Et nous voulions apprendre plus à propos de sutura. Et finalement, discuter comment les professionels du sexe utilisent l’internet.
M: Première question, ci loi bi. Loi u PS yi, benn la. Parce-ce que su fekkee ne par exemple, à chaque fois da nu ne, avec la police, bu gniowee seen ker, amul droit dugg seen biir ker. Yaw en tant que PS, da nga wara am sa carnet sanitaire. Nga nekk en règles, nga respecter say rendez-vous, nekk en règles. Sinon police bu gniowee, parce-ce que loolu ci loi bi la bokk. Carnet sanitaire, ci loi bi la bokk. Respecter say rendez-vous. Dem say rendez-vous regulièrement. Su fekkee ne nga begg a baay, nga bind lettre ne da nga rajje. Nga ne pour quelle raison nak. Boo rajjee tamit, nga rajje un mois, walla quelques jours, walla deux mois, marier, walla voyager, dem nu [inaudible]. Boo gniowatee encore –
M: The first question, about the law. Law about sex workers is one thing. Because if there is for example, every time we say, with the police, when they enter your home, they don’t have the right to come inside your home. As a sex worker, you have to have your health notebook. You are following the rules. If not, when the police come, because that’s part of the law. The health notebook is part of the law. Attend your medical visits. Go to your medical visits regularly. If you want to leave sex work, you write a letter saying that you are leaving sex work. You give the reason. When you leave sex work also, you leave sex work for one month, or some days, or two months, you get married, travel, go where [inaudible]. When you come back again –
S: Def sa bilan en même temps
S: You do your checkup again.
M: Nga defat sa bilan. Pour l’inscription pour nga am carnet, dangay def bilan: VIH, ak IST, yooyu nga koy def. Première fois bi. Benn day dem Hopital Dantec. Sa photo, quatre photos.
KD: est-ce que tu voudrais anonymizer l’hopital?
M: You do your checkup again. For registration to get your notebook, you do a checkup: HIV and STIs, you do that. The first time. One copy goes to the Dantec Hospital. Your photo, four photos.
S: Quatre photos.
S: Four photos.
M: Ak sa carte domicile bi wone fi nga dekk.
M: With your residence card that shows where you live.
S: Pour nga men a legale.
S: So that you can be legal.
M: Pour nga men a légale am carnet. Soo amee carnet leggi nak, chaque mois dangay gniow rendez-vous. Boo gniowee di nanu la xol, def ay bilan. Chaque six mois def le bilan. Bilan surtout VIH bi. Tous les six mois. Soo amee carnet. Leggi nak, ci lois u Sénégal. Loi bi ba tey, avec le carnet, du la permettre taxaw dans la rue pour le racolage.
M: So that you can be legal and have your health notebook. Once you have your health notebook now, every month you come for your visit. When you come they will look at you, do the checkups. Every six months you do the checkup. Checkup especially for HIV. Every six months. If you have the notebook. Now then, in Sénégalese law. The law even now, with the notebook, doesn’t permit you to stand in the street and do solicitation.
S: Et dans la maison.
S: And in the house.
M: Non dans la maison sax c’est pas interdit. Mais pour sécurité. Parce-ce que un jour on a rencontré le commissaire centrale, man ak une collègue
[nom et titre omis]
M: No, in the house, well, that’s not prohibited. But for security. Because one day we met the commissioner, me and a colleague
[name and job title omitted]
S: Ok.
S: OK.
M: Da nu ne carrément, la loi n’interdit pas. Parce-ce que kenn amul sa droit dugg ca biir ker. Benn. Parce-ce que tu n’as pas agressé. Tu n’as rien fait d’illégal. Tu es dans ta maison. Tu exerces ton travail là-bas. Donc amul droit dugg ci sa biir ker. Mais problème u sécurité. Parce-ce que parfois ton client, nga fekk bandit la.
M: They say plainly, the law does not prohibit that. Because nobody has the right to enter your house. First, Because you haven’t committed violence. You have done nothing illegal. You’re in your house. You’re going about your work there. So they have no right to enter your home. But it’s a problem of security. Because sometimes your client, it turns out they’re a bandit.
S: Bandit la, waaw.
S: They’re a bandit, yes.
M: Am na ay cas yoo xam ne, danu gniow, mesoon xew ak femme bi. Mais danu ko ot, mu de, soti capote ci kawam, jel xalisam, ak femme bi da fa tejjuwoon, waaw. Da nu yobbu xalisam.
M: There are cases where, they come, have never been with the woman before. But they [??? KD: checking word with Maya] her. She dies. They throw condoms on top of her, take her money, and the woman was shut in her room, yes. They took her money.
S: Waaw!
S: Yes!
M: Fekk xale bi tatam, yaram u nen. Seeni dekkando danu xey gis ko noonu rek, fegg fegg fegg fegg. Parce-ce que danu baay ba tejju. Après nu fekk mu de.
M: The young woman was completely naked. Her neighbors woke up and found her like that. Knock, knock, knock, knock. Because they’d left her and shut her in her room. After they found her dead.
S: Loolu fan la?
S: Where was this?
M: Et puis il y a tant d’autres. Am na Sacre Coeur, am na fi Guediawaye, am na Pikine, tant d’autres. Meme Grand Yoff. Bere na ay cas. Benn Tali la dernière fois. Danu yew rek mu fekk mu de. Parce-ce que loolu mooy inconvénient. Boo gisee police bu gniow ci bir ker yi, mais police bi bu gniowee ci biir ker yi, c’est pas normale. Danuy leen woo en tant que client. Danu wax ak nyoom ba nu agsi, bu nu gniowee nak, da nu ne, c’est la police. Xam nanu amul droit gniow fi, mais danu begg lekk sa xalis. Danu begg traquer pour lekk ci yaw xalis. Bu nu xamee ne yaw danga nekk en règles, et puis xam nga say droits, danu naan ba jox leen nu jend gazoile. Nu jox leen.
M: And there are many others. There are some in Sacre Coeur, there is Guediawaye, there is Pikine, many others. Even Grand Yoff. There are so many cases. Benn Tali last time. They just woke up and found her dead. Because that’s the downside. When you see police coming into the home, but police that comes into the home, that’s not right. They’ll call you as a client. We talk with them until they arrive, until they come and they say, it’s the police. They don’t have a right to come here, but they want to eat up your money. They want to track you down and eat your money. When they find out that you, you’re following the rules, and then that you know your rights, they plead until we give them money for gas. We give it to them.
S: Waaw noonu lay deme.
S: Yes that’s how it goes.
M: Mais sinon, problème bi mooy, sunu loi bi, obsolète la. Parce-ce que loi boo xam ne–
M: But apart from that, the problem is our law. It’s obsolete. Because it’s a law that –
S: Nyoom nyoo nu wara protéger, mais lunuy def—
S: Them, they should protect, but what they do –
M: Kenn du ko appliquer. Te bu nu ko appliquee aussi tamit, nyoom nax lanuy def. Parce-ce que doo xam ne amoo droit racoler. Parce-ce que c’est depuis ‘68 ba leggi. Xam nga loolu yag na. Doo taxaw ci mbed mi racoler. Police bu la fekkee, mu yobbu la. Benn. Walla nga fekk ci biir ker, ou bien dans les bars et tu n’es pas en règles, mu yobbu la. Hotel dangay dugg hotel, gen, fekk police taxaw ci buntu hotel bi di la xar.
M: No one applies it. And when they apply it, they trick you. Because you may not know that you don’t have the right to solicit. Because it’s been that way since ’68 up until now. You know, that’s old. You can’t stand in the street soliciting. If the police find you there, they’ll take you away. First. Or if you’re inside a house, or at bars and you’re not following the rules, they’ll take you away. A hotel, you’ll enter a hotel, come out, and find police standing in the hotel doorway waiting for you.
S: Jel li nga yoor.
S: Take the money you have.
M: Jel li nga yoor. C’est pas normale. Parce-ce que finalement di nga naan yaw, carnet bi, pour lan la? Dangay naan, looy def ak carnet bi?
M: Take the money you have. It’s not normal. Because in the end, you’ll ask, the notebook, what is it for? You’ll ask, what do you do with the notebook?
S: Meme boo ligueyoo, dinanu yakar ne xalis bi—
S: Even if you’re not working, they will believe that the money –
M: Bu dee policer bi xamee, di nga xey parfois pour dem au marché, pour dem duggu. Nu ndaje ak yaw ci marché bi, lajj sa carnet. Alors que tu n’es pas au lieu de travail. Tu es venue au marché pour faire des achats, aller cuisiner pour manger. Mais danu xame la, jel sa carnet. Nga ne leen yoor uma carnet, marché laay dem. Nu ne [inaudible].
M: If a police officer knows you, you’ll wake up in the morning sometimes to go to the market, to go in. They meet you in the market, ask for your notebook. But you’re not even in a place of work. You came to the market to buy things, to go cook and eat. But they know you; they take your notebook. You tell them, I didn’t bring my notebook, I’m going to the market. They say [inaudible].
S: Alors que c’est pas normale.
S: But that’s not right.
M: Kon finalement loi bi da fa nekk obsolète dal.
M: So ultimately the law is obsolete then.
S: Amatoo vie privée. Loolu, amoo vie priveé parce-ce que bunu fekkee—
S: You don’t have a private life anymore. That, you don’t have a private life because if –
M: Sunu loi bi ca date depuis longtemps. Ancien loi bi, depuis 1968 jusqu’à présent. Boobu ba leggi nuungi liguey pour changer ko, mais kan mooy wax assemblé affaire u PS? Kan moo koy def? Parce-ce que la dernière fois amoon na déjà une personne ayant des opinions favorables aux populations marginalisées [titre et organisme omis]. Normalement war nga ko xam, yaw. Mais il était parlementaire. Mais il n’a jamais défendu les PS là-bas. Il ne l’a jamais fait. Et qui va maintenant défendre les PS?
JF : Maya, est-ce que cette mesure d’anonymisation est suffisante? En plus, est-ce que ça reflète ce que tu voulais dire ?
JF : Maya, is this anonymization measure sufficient? Also, does it reflect what you wanted to say?
M: Our law has been around a long time. It’s an old law, dating from 1968 until the present. All that time we’ve been working to change it, but who is going to talk about sex workers’ issues in the [national] assembly? Who’s going to do it? Because last time there was already a person with views favorable to marginalized populations [title and organization omitted]. Normally you, you should know. But they were in parliament. But they never defended sex workers there. They never did that. And now who will defend sex workers?
S: C’est ça.
S: That’s it.
M: On a actuellement pour la loi, on a aucune loi actuellement. Parce-ce que le policier fait ce qu’il veut.
M: Currently as for the law, we don’t have any law currently. Because the police officer does what he wants.
S: En fait kenn nemeul défendre parce-ce que da nu ruus défendre. Par rapport aux États-Unis.
S: In fact no one dares defend us because they are ashamed to defend us. In contrast to the United States.
M: Yaw yaay def rek. Ak policier, boo nekkee, dangay naxante ak moom, danga wara ko baay nga dem. Mais am na nu [inaudible 10:25-10:29 ]
M: You you’re doing this. And the police officer, if you’re there, you’re going back and forth with him. You should leave me be. But there are [inaudible]
S: Comme nyoom danuy plaidoyer, am unu loolu.
S: Like them, they do advocacy. We don’t have that.
M: Amul comme aux etats-unis. Mais fii, on a fait le plaidoyer dans tous les commissariats de Dakar, ici ici. A Dakar avec une ONG On a fait le plaidoyer. Mais rien. On ne peut rien. Meme le gouvernement ne peut rien. Finalement, ils ont maintenant fait le cadre de kii [inaudible 10:54-10:55].
M: Not like in the United States. But here, we’ve done advocacy in every police station in Dakar. Here, here. In Dakar with an NGO. We’ve done advocacy. But nothing. We can’t make anything happen. Even the government can’t do anything. In the end, they’ve made a kind of [inaudible].
S: Am de l’argent c’est tout. Ku soxla xalis dal, day dem ci PS. Ku sooxla, homme de tenu, ku sooxla, sécurité, ci PS lanuy dem. Mooy evenements, mooy Tabaski, mooy fete, sunu sooxla xalis ci PS lanuy dem.
S: Having money is everything. Whoever needs money like, they go to the sex worker. Whoever needs it, men in uniform, whoever needs it, security forces, they go to sex workers. On event days, on Tabaski, on holidays, if they need money they go to sex workers.
M: Ci PS lanuy gniow. Finalement c’est pas la peine nu amati carnet. Ku amul carnet –
M: It’s sex workers they come to. In the end it’s not worth having a notebook anymore. Whoever doesn’t have a notebook –
S: Est-ce que carnet di nanu eliminer?
S: Is the notebook going to be eliminated?
M: Déjà ku amul carnet, clandestine bi, normalement bu koy jappee, moom war na tudd 6 mois à un an de prison.
M: Already the person without a notebook, the clandestine person, normally when they catch them, they have to stay six months to a year in prison.
S: Ak amendes.
S: With fines.
M: Plus amendes. Mais actuellement c’est pas le cas. Clandestines elles sont plus libres que nous qui sont en règles.
M: With fines. But currently that’s not the case. The clandestines are freer than those of us who follow the rules.
KD: Ca c’est intéressant.
KD: That, that’s interesting.
M: Danuy graisser. Elles sont prètes à donner 50 mille ba ci kaw. Kon kenn du la japp. Alors que sunu coté nous, 5 milles lanu koy jox. Kon nyoom nyoo gen a nekk en règles que nous.[11:50] Loolu c’est la première question.
M: They grease palms. They are ready to give 50 thousand and above. So no one will arrest them. Whereas on our side, 5 thousand is what we give. So them, they are more in line with the rules than us. That’s the first question.
KD: Xanaa ma traduire loolu parce-ce que c’est très intéressant.
KD: Maybe I’ll translate that because it’s very interesting.
M: Danuy bokk gis gis rek.
M: We just share the same point of view.
[12:14-12:48 I’ll summarize this comprehension check for now, but am happy to also translate it directly if preferred, Kira. – I check that I understood right, that it is their colleague who died. Maya confirms, and says this is one example of many.]
[Je vais faire un résumé pour l’instant, mais si vous le préférez, je peux traduire le moment où je vous demande si j’ai bien compris. – Je vérifie que j’ai bien compris, qu’une collègue est morte. Maya dit que c’est le cas, et que cas yu mel nonu bere nanu]
Y : [23 :36] Da ma tit. Gisoo bi ma waxoon loolu da ma tit. Da ma tit sax. Parce-ce que mooy loolu, da fa triste.
Y : I was scared. Didn’t you see, when I said that I was scared. Even me, I was scared. Because that’s how it is. It’s sad.
N : Loolu bere na de. Da ma leen xam. Danuy dem hopital. Da ma leen…
N : It happens often. I know some of them. We go to the hospital. For them I…
K: Khady, thank you so much for leading us through the translation. I’m wondering if you can convey first, just thanks to both of our comrades for sharing. That really kind of paints a much clearer picture for me about the situation. But also if you could convey condolences and rage about the passing, the murder, of their comrade and colleague and fellow worker. I’d really appreciate that. I’m deeply saddened to hear that. And also, something that I just wanted to articulate before it left my brain, was just like how again, internationally, it is so clear that state-based regulation and legalization models do such immeasurable violence to sex working communities. And that you know, the decriminalization of all survival, including everyone’s labor in the sex trades, is just so underscored with every story that Maya and Sana have shared, you know? And coupled with that, the absolute, like, ah I’m losing my train of thought because I was very emotionally activated by that translation. Just the way in which it’s so clear that even if the law was updated, police are going to do what police are going to do, right? They will overstep. They will exert power and force. And we see that here, right, in all manner of, police are not supposed to destroy or seize condoms anymore, if they find condoms on people that they stop and yet that continues to happen and worse. Worse things than that continue to happen also and so it’s not just about making the laws up to date, but really about questioning who those laws serve in the first place. And the clandestine – I was picking up on that word a lot and I was very curious to see how that was going to factor in here. And yah, doing work kind of off the books, right, it makes sense. It makes sense because the health screenings don’t seem to actually serve the people. They’re these arbitrary mechanisms of control from the state, right? And I wanted to ask a clarifying question about those too. 
Because I think I might know the answer to it, but it might be doubly damning to underline it here. Do the workers have to assume the costs associated with such screenings and health book maintenance as well? Cause I can only imagine that that’s probably the case, and just the financial violence and kind of extortion by the state in that regard too. And I’ll stop there because I know that’s a lot. I really appreciate conveying that back Khady.
K: Khady, merci de nous avoir aidés avec la traduction. Je me demande si tu peux tout d’abord remercier nos deux camarades d’avoir partagé avec nous. Vraiment, cela me donne une image beaucoup plus claire de la situation. Mais de plus, si tu pouvais transmettre des condoléances et communiquer la colère à propos de la mort, le meurtre, de leur camarade et collègue et collègue de travail. Je l’apprécierais beaucoup. Je suis profondément triste d’avoir cette nouvelle. Et en plus, il y a quelque chose à dire avant que je l’oublie, c’est qu’encore une fois, sur le plan international, il est évident que la réglementation étatique et le modèle de légalisation font une violence incommensurable aux communautés de professionnels du sexe. Et tu sais, la décriminalisation de tout effort de survivre, y compris le travail de tous les PS, est mise en relief par chaque histoire que Maya et Sana racontent, tu sais ? Et en plus, l’extrême, c’est-à-dire, ah, je perds le fil de mes pensées parce que cette traduction a activé beaucoup de sentiments en moi. Mais en tout cas, la manière dont il est tellement évident que même si la loi était mise à jour, la police ferait ce que la police fait, n’est-ce pas ? Ils vont au-delà. Ils vont exercer leur pouvoir et leur force. Et on peut voir cela, par exemple, la police ne doit plus détruire ou saisir les préservatifs, s’ils trouvent que les gens ont des préservatifs, mais malgré tout, ça se passe toujours, et pire. Et on voit des choses pires que ça, donc il ne suffit pas de mettre à jour les lois, il faut demander, d’abord, à qui ces lois servent. Et la clandestine – j’ai entendu ce mot à plusieurs reprises et j’étais très curieuse de voir le rôle qu’elle joue. Et oui, travailler d’une manière non-enregistrée, c’est logique. C’est logique parce que les bilans de santé ne semblent pas servir le bien-être des gens. Il s’agit de mécanismes de contrôle arbitraires de l’état, n’est-ce pas ?
Et je voulais poser une question de clarification. Parce que peut-être je sais déjà la réponse, mais il est possible que ce serait encore plus accablant si elles le disent maintenant. Les travailleuses, est-ce que c’est à elles de payer les frais des bilans et des carnets aussi ? Parce que j’imagine que c’est le cas, et cela ajoute la violence financière et cette extorsion étatique dans ce contexte. Et je vais m’arrêter là parce que je sais que c’est beaucoup. Je l’apprécierais beaucoup, Khady, si tu pouvais transmettre ce message.
F: I just echo Kira’s condolences too, and also just rage. But just echo everything Kira said, that this is clearly not about protecting or keeping sex workers safe. So I have a question but I’ll wait until after you translate.
F: Je voudrais faire écho aux condoléances de Kira, et aussi à ma colère. Mais pour faire écho à tout ce que Kira a dit, il est évident qu’il ne s’agit pas de protéger les PS ou de protéger leur sécurité. Donc j’ai une question, mais je vais attendre jusqu’à ce que la traduction soit complète.
[Note sur la traduction : J’ai inclus ci-dessous une petite partie de la traduction parce que Sana répond directement à la question que je viens de traduire]
[Translation note: For now I put in the bit of translation below because it shows the question that Sana is directly answering]
KD: Et ne na, même su fekkee ne, elle a posé la question, même su fekkee ne loi bi da fa changer, est-ce que police di na soppi jikkoam? Meme su fekkee ne loi bi da fa changer, est-ce que police di na continuer def li muy def?
KD: And they said, even if, they posed the question, even if the law changes, will the police change their behavior? Even if the law changes, will the police continue to do what they do?
S: Bien sur. Parce-ce que bu nuy gen, seen chef dunu koy yeg. Boo demee ba ci chef bi, xam nga lan lanuy wax? Menoo nax, ngeen waxtaan ak nyoom, ngeen joxleen dara nu dem bala nu tarder seen liguey, chef bi loolu lanuy wax. Commissaire ci boppam. Xam nga lu may wax. Mes nanu daw, nu dem ba ci commissaire, da fa ne, doo leen jox moo gen tarder seen liguey. Imagine-toi parce-ce que commissaire bi moo nekkul police bu ndaw bi. Moo nekkul agent. Commissaire bi moo nekk gradé, nekk commissaire. Xam nu luy dox. Leggi danuy baay ba seen jour u Dimanche, nyoo dugg ci lieu bi fekk seen chef ne, duggleen.
S: Of course. Because who goes out, their chief won’t [checking comprehension w/ Sana]. If you go all the way to the chief, you know what they’ll say? You can’t fool them; you discuss with them, you give them something so that they go without delaying their work, that’s what the chief says. The commissioner himself. That’s what they tell me. Once we ran to the commissioner, and he said, why don’t you give them something so you don’t delay their work. Imagine that, because the commissioner is not a basic police officer. He’s not an agent. A commissioner is ranked; he’s become a commissioner. He knows how things work. So now they wait until their day of rest, they go find their chief, and give him some of the money.
M: Ba wedduwoon. [25:42]
M: And deny what happened.
S: Wedd bu sorree. En tout cas, ci mois bi di ngeen gis trois ou quatre fois. Di ngeen gis yu bere bere bere. Mais xolal, Khady. Liguey u Sénégal da fa meti. Surtout Afrique. C’est pas Sénégal seulement de. C’est partout en Afrique. Liguey bu metti, metti, metti. Da fa metti. Yaangi liguey, toonoo kenn. C’est ton corps. Mais policiers yi, da nu mel ne, nyoom nyoo moom sa corps. Yoo moomoo sa koor. Nyoom nyoo [26] moom sa corps.
S: Deny everything. In any case, within the month you’ll see it three or four times. You will see it a lot, a lot, a lot. But look Khady. Work in Sénégal is difficult. Especially in Africa. It’s not just Sénégal eh. It’s everywhere in Africa. Hard, hard, hard work. It’s hard. You do your work, hassling nobody. It’s your body. But police officers, they make it seem like it’s them who own your body. You don’t own your body. They have ownership over your body.
M: Dangay liguey di leen jox.
M: You work and then give them money.
S: Di leen jox. Pour quelle raison?
S: Give them money. For what reason?
M: Danuy liguey di faay ko fin du mois.
M: We work and then pay them at the end of the month.
S: Danuy liguey di faay ko fin du mois di ko yobbu seen famille. Bunu ligueyee da nu leen jox nyoom. Pour quelle raison? Xolal, gis nga femme bi di dee noonu noonu. Bu nu woowoon policiers yi leggi leggi, nu ne da nuy xol. Policiers yi nyo jekk a yaq. Au lieu nu sutural ko, da nu yaq. “Femme bi, Caaga la!”
S: We work and then pay them at the end of the month so they can bring it to their family. When we work we give money to them. For what reason? Look, you saw the woman who died like that. When they call the police right away, they say they’ll look into it. Police officers are the first ones to mess things up. Instead of giving her sutura, they ruin her. “This woman is a prostitute!”
M: Amul enquête.
M: There’s no investigation.
S: “Femme bi, Caaga la!”
S: “This woman is a prostitute!”
M: Cas bi du am benn suivie. Cas bi amul benn enquête.
M: This case there won’t be any follow up. This case there is no investigation.
S: Du am enquête. Du am suivie. Du am dara. Gis nga boo demee bi tribunal, président tribunal soo gisee day ne kii, jaaykat u boppam la. Nii lanu koy waxee, sans vergogne. “Lii la, PS la.” Carrément. Nit da fa wara ko sutural. Mais nun, day mel ku nuy humilier. Day mel ne kuy nyak sutura. Mooy loolu de.
S: There won’t be an investigation. There won’t be follow up. There won’t be anything. You see, when you go to the tribunal, the president of the tribunal, when you see him, will say, her, she sells herself. This is how they talk about her, without shame. “She’s a this, she’s a sex worker.” Outright. A person should be given sutura. But us, we’re like people they humiliate. Like people who don’t deserve sutura. That’s what it is.
M: Du am suivie. Du am dara. [inaudible].
M: There won’t be follow up. There won’t be anything. [inaudible].
S: Loolu baaxul. Nit bu deewee, même bu fekkee ne PS la, da fa wara am suivie. Nu topp ko, dem tejj ku ko rey. Mais du enquête du suivie. Pour quelle raison ? Danuy fatte ku ko rey, mais lu bonn rek lanuy wax. Da nu ne, xolal, kii da fa de ci neggam. Capotes yi. Li ak li. Danuy yaq nderam.Mais kan mooy genee la?
S: That is bad. When a person dies, even if they are a sex worker, there should be follow up. They should follow up, continue on until they lock up the person who killed her. But there’s no investigation, no follow up. For what reason? They forget the one who killed her; they only say bad things. They say, look, her, she died in her room. Condoms, this and that. They sully her name. Now who is going to help you?
M: Dans les journaux da nu ne, on a tué une prostituée dans sa chambre.
[27:28]
M: In the newspapers they say, a prostitute was killed in her room.
S: Leggi kan mooy genee la? Du police, walla sapeurs du gniow pour sauver. Kan mooy genee loolu? Su fekkee ne sapeurs yi, na lanu xamee? Da nu wax ne, nit ka da nu ko rey.
S: Now who is going to help you? It’s not the police, and the firefighters won’t come to save you. Who is going to help you? If it’s the firefighters, what are they going to say? They just say, this person was killed.
KD: Loolu lanu tek ci journal bi walla?
KD: That’s what they put in the newspaper or?
M: Waaw, waaw.
M: Yes, yes.
S: Bien sûr! Loolu lanuy tek. Loolu lanuy wax.
S: Of course! That’s what they put. That’s what they say.
M: Prostituée.
M: Prostitute.
S: Mais normalement, nit bu dee, nga gniow fekk ko ba ci biir neeg, yaw en tant que homme de tenue, ma nam sapeur , mooy policier walla gendarme, sutural ko. Sunu la lajjee, nga ne, jigeen lanu rey ci neegam. Mais danuy wax PS bu dee ci biir neegam. Client moo ko rey. Loolu lanu koy waxee.
S: But normally, when a person is dead and you come find them in their room, you as a man in uniform, that is a firefighter, a police officer, or a gendarme, you should give her sutura [“confidentiality” or “discretion”].[1] If you’re asked, you say, a woman was killed in her room. But they say a sex worker is dead in her room. The client killed her. That’s how they talk about it.
M: Noonu lanu koy waxee.
M: That’s how they talk about it.
S: Ci journal yi. Alors que ce n’est pas normale. Entre parenthèse, entre parenthèse. Nun ni policiers yi bu nu gniowee ci nun, da nu yaqampti pour dem denc sunuy préservatifs. Sinon bu nu gniowee lu nu jekk a def mooy sacc sunuy préservatifs. Am na benn jour boo xam ne da nu sacc sunuy préservatifs, sacc sunuy 70 mille francs.
S: In the newspaper. But that’s not normal. As an aside, as an aside. Us, when police officers come to us, we rush to go hide our condoms. Otherwise, when they come, the first thing they do is steal our condoms. One day they stole our condoms, stole 70 thousand francs.[2]
KD:70 mille francs?
KD:70 thousand francs?
S: 70 mille francs. Temps boobu nuungi [nom du quartier anonymisé]. Cette jour préservatifs yepp lanuy sacc. Loolu dal nanu [nom du quartier anonymisé]. Loolu mootax bunu gniowee automatiquement danuy denc sunuy baggages. Sinon buno ko gisee di nanu ko sacc, te doo ci men dara. Danuy gawante quoi. Danuy def teff teff, gaw. Boo xamee ne nyoom la rek, danuy nebb préservatifs yepp ak sunu xalis. Sinon men nanu ko sacc. Danu koy sacc. Am na bu mujj bi, sunu telephone lanu sacc. Telephone bu ndaw bi.
S: 70 thousand francs. At that time we were in [name of neighborhood removed]. That day they stole all of our condoms. That happened to us in [name of neighborhood removed]. That’s why when they come, we automatically hide our belongings. Otherwise, when they see it they will steal it, and you won’t be able to do anything about it. They’re fast, eh. They’re quick quick, fast. As soon as we know it’s them, we hide all the condoms and our money. If not they can steal it. They steal it. Recently, they stole our phone. That small phone.
M: Waaw
M: Yes
S: [Le son n’est pas clair à 32:49-52 ; inaudible.] Heureusement benn rek lanu yobbu. Loolu day arriver tamit de.
S: [Audio unclear 32:49-52; inaudible.] Thankfully they only took one. That also happens sometimes.
F : I’m sorry, fines for what exactly ? Fines for having condoms ?
[33:28-33:38 NOTE: Here Fleur is asking about fines because Khady had misunderstood Sana and mis-translated the part about the 70,000 francs. If useful for context, I can include that section of my translation.]
[Fleur demande des éclaircissements parce que Khady avait mal compris Sana et n’avait pas bien traduit la référence aux 70 mille. Si c’est utile, pour donner du contexte, je pourrais inclure cette portion de la traduction.]
S: Da nu ci deffoon ci lekket. Lekket. Lunuy liguey. Au nombre de 8 personnes, nu jel xalis bi.
S: What they did was eat money. Eating money, what we make. Eight people, they took our money.
KD: Ah OK. Thank you for clarifying that. I misunderstood. It wasn’t a fine but they took 140 dollars from them. Sorry, that was my misunderstanding.
KD: Ah OK. Merci d’avoir demandé, d’avoir éclairci ca. Je n’avais pas compris. Ce n’était pas une amende mais ils avaient pris 70 mille.
F: So it’s just theft. It’s just police, yah, OK.
F: Donc il s’agit du vol. C’est juste la police, oui, OK.
KD: So thank you for asking that Fleur.
KD: Merci d’avoir demandé, Fleur.
M: Leggi carnet bi –
M: Now, the notebook –
Y : Dangay faay
Y : You pay…
N : Carnet bi danu koy jend 1000 francs. Leggi ticket consultation 1000 francs
N : We buy the notebook for 1000 francs[3]. Now the ticket for an appointment is 1000 francs.
S: Chaque mois.
S: Every month.
M: Chaque mois. Loolu c’est chaque mois. Mais carnet bi su fesee walla bu reree dangay jend.
M: Every month. That’s every month. But when the notebook is full or it gets lost you buy another.
S: Ak lunuy xol sa tat. [35:12]
S: And for them to examine your genitals.
M: Non non loolu am na. loolu kenn du ko jend.
M: No no that’s provided, no one buys that.
S: Man bis ba leggi da ma koy jend 600.
S: Me the whole time I’ve bought it for 600.
M: Xanaa foof, fekk mu manquer.
M: Maybe in that case, there’s not enough.
S: Ah ok.
S: Ah ok.
M: Mais normalement di nanu la jox. Leggi nyoom, carnet bi da ma koy jend. [35:31]Pour [nom de l’institution omis] mille francs. Leggi nak boo amee carnet, chaque mois, dangay fay mille francs pour ticket. Carnet bi su reree dangay jendat beneen temps, pour jendat beneen carnet. Su fesee dangay jendat beneen carnet mille francs. Mais bilan moom doo ko fay.
M: But normally they will give it to you. Now them, I buy the health notebook. For [institution name removed] 1000 francs. Now then when you have your notebook, every month, you’ll pay 1000 francs for the ticket. If you lose the notebook you’ll buy another, to buy another notebook. If it gets full you’ll buy another notebook for 1000 francs. But you don’t pay for the exam.
S: C’est gratuit.
S: It’s free.
M: Loolu moom doo fay c’est gratuit. Mais carnet bi, sama xalat man, njerin bi actuellement, di nga men a xol sa biir u yaram. Mais bu fekkee ne police, agents, walla homme de loi, [36] ammul benn lien ci loi bi. Xanaa consultations bi ngay def pour xol sa yaram rek, soo amee IST walla VIH walla noonu, pour loolu rek la.
M: That, well, you don’t pay, it’s free. But the notebook, my own perspective is, currently its importance is that they can examine your health. But if it’s a matter of police, agents, or men of the law, there is not one connection to the law. Maybe the consultations you do to examine your health, to see if you have STIs or HIV or something. That is all that it’s for.
S: Les ARVs sont gratuit.
S: ARVs are free.
M: Les ARVs sont gratuit. Les ARVs sont gratuit.
M: ARVs are free. ARVs are free.
KD: Les ARVs sont gratuit
KD: ARVs are free.
M: Les ARVs sont gratuit. Le dépistage est gratuit.
M: ARVs are free. HIV testing is free.
S: Ticket bi c’est pas gratuit de.
S: The ticket is not free, eh.
M: Non ticket bi c’est mille francs, chaque mois mille francs.
M: No the ticket is 1000 francs, every month 1000 francs.
KD: Le ticket mooy lan?
KD: What does ticket mean?
M: Ticket consultation.
M: Ticket for the visit.
KD: Kon, consultation dangay faay mille francs, mais pour bilan, dépistage, ak ARVs, doo fay.
KD: So, you pay 1000 francs for the consultation. But for the exam, HIV testing, and ARVs, you don’t pay.
Y et M: Non non.
Y and M: No no.
M: Ticket, avant c’était 500 francs. Avant c’était 500 francs. Mais maintenant [inaudible] c’est mille francs chez nous. Mais il y a d’autres districts où c’est 500 francs.
M: The ticket, before it was 500 francs. Before it was 500 francs. But now [inaudible] it’s 1000 francs for us. But there are other districts where it’s 500 francs.
S: Fooy loolu?
S: Where is that?
[N et Y discutent entre eux doucement. Leurs paroles n’ont pas été captées par l’enregistreur.]
[N and Y discuss among themselves softly. Their words were not picked up by the recorder.]
KD: OK Loolu leer na merci beaucoup.
KD: OK that’s very clear, thank you very much.
K: Fleur did you want to ask your question?
K: Fleur ndax danga beggoon lajj sa question?
F: Thank you. And I appreciate that question too. Just knowing that like yah the costs of the notebook are, it’s not dissimilar from like, house arrest in the US. How like people have to pay for their own, to be monitored and surveilled. But the question I was going to ask was sort of connected to what we were discussing, just knowing that the police are not going to be engaged in like keeping people safe, and knowing that there are so many incidents of violence and even like exploitation. What are the ways that sex workers in Sénégal keep each other safe? What are the strategies being used?
F: Merci. J’apprécie cette question aussi. Juste le fait de savoir que les frais du carnet sont, ça ressemble à l’assignation à domicile aux Etats-Unis. C’est à dire, le fait que les gens doivent payer pour leurs propres, pour être suivis et surveillés. Mais la question que je voulais poser est liée à ce qu’on vient de discuter, sachant que la police ne s’engage pas à protéger la sécurité du peuple, et sachant qu’il y a tant d’exemples de violence et même d’exploitation. Quelles sont les stratégies à travers lesquelles les PS au Sénégal protègent leurs pairs ? Quelles stratégies sont utilisées ?
M: Mais ni nu men a jappalante, lenn rek lanu men a def. Parce ce que am na nu xamul, am na PS nu xam ne nyoom, nyoom xam unu, bunu demee police, xamunu seen droits. Danu nekk di patiner. Bunu lajjee xalis, fekk amunu xalis, police di nanu leen yoobu. Bu nu leen yoobu rek leggi, am na seen morom PS yu fa nekk. Nyoom yeneen topp. Bu nu leen toppee –
M: But as for how we protect and help each other, there’s only one thing we can do. Because there are some who don’t know, there are sex workers who don’t know, when they go to the police, they don’t know their rights. They struggle. If they’re asked for money, and it turns out they don’t have any money, the police can take them away. But when they’re taken, now, there are their sex worker peers who are there. The others follow them. When they follow them –
S: Nu proteger leen
S: They protect them.
M: Fekk nyoom, c’est à dire, montant bi nuy lajj, men nanu ko cotiser –
M: Because them, that is to say, the sum that’s demanded, they can pool their money together –
S: Jox leen nu laisse.
S: Give it to them so they leave them alone.
M: Jox leen ko, pour nu men a leen baay. Loolu lanu men a def. Mais bu fekkee ne dangay gniow, te nekkoo en règles, ou bien da nga defoon benn délit, boobu moom, di nanu cotiser pour l’avocat. Benn. Naari, le temps que da nu juger, di nanu togg, bokk, indi loxo. (40:45)
M: Give it to them, so they can be left alone. That’s what we can do. But if they come and are not following the rules, or else they commit some crime, in that case, we can pool money for the lawyer. First. Second, in the time it takes to make a judgement, we can cook, share, and extend a hand.
[Interpreter’s note – In my live interpretation/translation into English, I failed to include the part about pooling money to help comrades. My sincere apologies for that.]
Note : j’ai oublié de partager ton commentaire à propos de la cotisation quand j’ai fait la traduction. Je présente mes excuses.
S: Non danu manquer coté boobu.
S: No we’re kind of lacking on that front.
M: Coté boobu moom –
M: On that front, well –
S: Mais man –
S: But I –
M: Ku yeggee, parce-ce que nit ki men na am problème, te kenn du fa yeg. Mais par exemple, am na nyoo xam ne, ku am problème rek, automatiquement di na woote. Am na nu may woo, am na nuy woo Sana, am na nuy woo personne de resource [titre omis]. Solidarité. Danu wootante. Xol lu nuy def. Solidarité boobu am na.
M: Whoever finds out, because a person can have a problem and no one there knows of it. But for example, there are some who, whenever they have a problem, automatically they’ll call. There are some who call me, some who call Sana, some who call the resource person [title omitted]. Solidarity. We call each other. We see what we can do. We have that solidarity.
S: Comme man sama coté ni. Police bu gniowoon fi, xam nga annonce laa def. Nyepp nu xam sama numéro. Ma ne leen, li la sol, li la sol, li la sol. Da ma leen di proteger. Leggi nak, du coup, client buy gniow, da nu taxaw ci balcon bi nu xol client bi. Di nanu xam ne kii policier bi la. Parce-ce que moo joogé ci Sana leggi. Kon da nu communiquer sunu biir. Parce-ce que lekk na ci mann benn cinq mille walla cinquinte mille, exemple, ma beggul mu lekk ci sama bennen morom. Mais damay woote, ma ne, Khady, policier bi, muungi ci terrain bi. Leggi lii la sol, lii la sol. Fais attention. Khady day baay xelam ci loolu. Khady woo keneen, keneen woo keneen. Kon dunu yaq jour boobu noonu. Noonu lanuy defe.
S: Like me on my end. When the police came here, you know I had made an advertisement. Everyone knows my phone number. I told people, they’re wearing this, this and this. I protect them. Now then, the minute the client approaches, we stand out on the balcony and look at the client. Others will be able to know that this is a police officer. Because the person will have come from Sana’s place. We communicate among ourselves. So as an example, say they ate 5 thousand or 50 thousand of my money, and I don’t want them to eat my colleague’s money. So I call and say, “Khady, there’s a police officer on the soccer field. Now, he’s wearing this and this. Pay attention.” Khady will give thought to that. Khady calls this person, this person calls someone else. So they won’t ruin that day. That’s how we do it.
M: Noonu lanu defe. [42]
M: That’s how we do it.
S: Noonu lanu defe. Bunu gniowee ci man, noonu laay partager. Noonu lanuy defe.
S: That’s how we do it. When they come to me, that’s how I share information. That’s how we do it.
KD:Dangay woote yeneen PS.
KD:You call other sex workers.
S: Partage, waaw. Ma ne leen, li ak li, muungi ci terrain. Ki, Khady, yeere bu gris la sol. Sol na dal yu gris. Automatiquement su gniowee day xol ci balcon bi, boo gisee ku sol gris, daal yu gris, mu ne Policer baangi ni. Kon du ko recevoir. Parce-ce que day xam ne gniowul pour liguey. Da fa gniow pour jel xalis. Noonu lay deme.
S: Sharing, yes. I tell them, this and this, he’s on the soccer field. This person, Khady, is wearing gray clothes. He’s wearing gray shoes. Automatically if they come, you’ll look out the balcony, and when you see someone wearing gray, gray shoes, you’ll say to yourself, there’s the police officer. So you won’t let them in. Because you know that they didn’t come for work. They came to take money. That’s how it goes.
M: Sa, dangay faay sa portable. Bu la woowee doo ko jel.
M: Your, you’ll turn off your cell phone. When they call you, you won’t answer them.
S: Walla nga faay sa portable. Doo ko jel. Dangay xol, men nga xar. Day dem après.
S: Or you turn off your cell phone. You don’t pick up. You’ll look around, you can wait. After, they’ll leave.
K: I know that we’re coming up on time. And I know that it is a holiday and that they have some celebrations to prepare for. They talked about needing time to prepare meals and prepare for gatherings, so I want to respect that very much, but I was wondering if you could again Khady, convey appreciation and thanks, and that this again just feels like one more entry point into more conversation at another point for all of us. And that yah, I hear so much, there’s so much international similarity and so much difference. And it’s so incredibly, I feel very, very grateful to be learning, and to be sharing in this space. If you wouldn’t mind conveying that as my kind of like well wishes for their celebration [inaudible @ 46:08…
I think I heard “or something”].
K: Je sais que notre temps est presque écolé. Et je sais que c’est une journée de fête et qu’elles doivent préparer les célébrations Elles ont indiqué qu’elles ont besoin de temps pour préparer le repas et préparer la fête, donc je voudrais bien respecter cela. Mais je voudrais que Khady, encore une fois, tu puisses leur dire que je les apprécie et leur remercie, et que pour nous, cette conversation est un point d’entrée à plus de discussion la prochaine fois. Et en plus, ce qui me frappe est que, il y a tant de similarités internationales et tant de différences. Et c’est tellement, je me sens très, très reconnaissant de pouvoir apprendre et partager à travers cette discussion. S’il ne te dérange pas de communiquer cela et aussi les souhaiter une bonne fête de ma part.
KD: Kira ne na ne xam na ne leggi 18 hr moo jot et xam na ne yeen da ngeen wara preparer repas bi pour Tamxarit, kon moom da fa begg –
KD: Kira said that it’s almost 6 pm and they know that you need to prepare the meal for Tamxarit, so they want to –
M: Waaw, cere.
M: Yes, millet couscous.
S: Sunu nouvelle an la. Nouvelle an.
S: It’s our new year. New years.
KD: Waaw nouvelle an bi, voilà.
KD:Yes the new year, right.
M: Xam nga 31 Décembre c’est pour les Chrétiens. Mais Tamxarit, moom mooy fin d’année Musulmans yi. Mooy fin d’année.
M: You know December 31st is for the Christians. But Tamxarit, that is the end of the year for Muslims. It’s the end of the year.
M: Waaw parce-ce que benn question rek lanu répondre.
M: Yes because we only responded to one question.
KD:Waaw degg la.
KD:Yes it’s true.
M: Benn question.
M: One question.
KD: Waaw, waaw. Et xanaa begg ngeen xam seen situation tamit xanaa walla?
KD: Yes, yes. And maybe would you want to know about their situation too, or?
M: Waaw waaw. Nyoom sax, begg naa ko xam. Ba tey ci question u carnet. Seen loi bi ba tey. Foofu, par rapport à la loi.
M: Yes yes. I would like to know about them. Even now we have the question of notebooks. And for them, their laws now. Over there, regarding the law.
KD: So Maya said yah like we only got through one question [laughter].
KD: Alors Maya a dit, on n’a répondu qu’à une question [rires].
F: And the follow up question
F: Et la question complémentaire.
KD: And follow up questions yes.
KD: Et des questions complémentaires, oui.
[laughter]
[rires]
F: Not just [unintelligible] questions.
[Overlapped by laughter and cross talk, so I’m not positive I heard right here]
F: Pas seulement [inaudible] des questions.
M: Non par rapport à la loi rek. Ci première question bi lanu nekk ba leggi. Ci problèmes u loi bi lanu nekk ba leggi. Leggi nak, seen loi yi par rapport à sunuy bos. Ndax benn la walla du benn? Loolu laa begg a xam.
M: No, regarding the law though. We’re still on the first question. We’re still on problems with the law. Now then, your laws in relation to ours. Are they the same or are they not the same? That’s what I would like to know.
S: Foofu am na nu jel carnet u pas? Ak visites yi.
S: Over there, are there people who use a health notebook and those who don’t? And the visits.
M: Ak visites yi ak yooyu.
M: And the visits and such.
S: Nu xam, nu xol, échange bi nu muy mel.
S: Let’s know, let’s see, what this exchange brings up.
[49:30 discussion of schedule and time constraints]
[49:30 discussion de l’horaire et des contraintes de temps.]
F: I can also say very quickly that we don’t have health notebooks because it’s not, we have full criminalization in the United States, so there aren’t even like, there are no legal workers. That said, when Maya and Sana were talking about, you know, providing names and photos and all of that essentially registering with the government, it did remind me, and I think I’ve shared this before but not in one of our recorded conversations, about legislation that was proposed in Hawaii this past year to essentially, you know, if someone wanted to exit the sex industry. So that’s really the focus here. There are a lot of organizations and even governments [inaudible – I couldn’t make out 50:51-50:56] essentially convincing people to leave the sex industry, mostly not meeting their material needs. [inaudible – I couldn’t make out 51:02] this legislation in Hawaii was going to, and I think it failed, but would have paid people 2000 dollars a month to leave.
F: Je peux dire rapidement que nous n’avons pas de carnet sanitaire parce ce que il n’est pas, nous avons la criminalisation complète aux États-Unis, alors il n’y a même pas, il n’y a pas de travailleurs légaux. Cela étant dit, ce que Maya et Sana viennent de dire, tu sais, le fait de donner son nom et sa photo et tout ça, en effet enregistrer auprès du gouvernement, il m’a fait penser, et je pense que j’avais déjà partagé ça mais ce n’était pas dans une de nos conversations enregistrées, à propos de la législation qui a été proposée à Hawaii l’année dernière pour, en effet, tu sais, si quelqu’un veut quitter l’industrie du sexe. Ça porte essentiellement sur cet aspect. Il y a beaucoup d’organisations et même de gouvernements [inaudible] en effet pour convaincre les gens de quitter l’industrie du sexe, sans satisfaire leurs besoins pour la plupart. [Inaudible] cette législation à Hawaii visait à, et je pense qu’elle a échoué, mais elle aurait payé des gens 2000 dollars par mois pour quitter.
KD: Oh sorry, I didn’t hear that last bit. You were saying, I heard you say, not meeting material needs. What did you say after that?
KD: Je m’excuse, je n’ai pas entendu la dernière partie. Tu disais, j’ai entendu que tu disais, ne satisfaire pas les besoins matériaux. Et qu’est-ce que tu as dit après ?
F:Oh I don’t remember exactly but –
F: Ah, je ne me souviens pas exactement mais –
KD: Sorry.
KD: Désolée.
F: This particular legislation did propose a 2000 dollar a month stipend and the catch was essentially very similarly registering with the government and having to share a name, essentially being on the list, which is very dangerous, especially knowing that laws do change, and that the police and the state don’t have our interest, or aren’t looking to protect people. It’s again similar as what Sana and Maya described. Like, it’s about control. How do we get you to do what we want you to do, rather than about keeping people safe. There are harm reduction organizations. So there are some organizations that work with folks who are engaged in criminalized work, whether that’s like drug using, or selling sex. But for the most part, yeah it’s just criminalized here.
F: Cette législation a proposé une bourse de 2000 dollars par mois et le piège était, en principe, pareil à l’enregistrement auprès du gouvernement dans la mesure où tu donnes ton nom, c’est-à-dire, tu es inscrit sur la liste, ce qui est très dangereux, surtout étant donné que les lois changent, et que la police et l’état ne fonctionnent pas dans notre intérêt, ou ne visent pas à protéger les gens. Encore une fois ça rappelle à ce que Sana et Maya discutaient. C’est à dire, il s’agit du contrôle. Comment te forcer à faire ce que nous voulons que tu fasses. Il ne s’agit pas de la protection ou de la sécurité. Il y a quelques organisations dans le domaine de la réduction des risques. Alors il y a des organisations qui travaillent avec des gens qui sont engagés dans le travail criminalisé, que ce soit l’usage des drogues ou la vente du sexe. Mais pour la plupart, oui, c’est criminalisé ici.
K: Khady I would also add with regard to registries, there are several states in the United States that participate in including sex workers who are typically, I don’t want to misquote this, but typically it’s similar to like, three strikes arrests for prostitution-related charges or trafficking-related charges, and I think it’s seven states now, but I can look that up for you, that participate in including offending prostitutes right, so sex workers who have been entrapped by or criminalized by, or caught up in the carceral system, that are then put on sex offender registries. And that, Fleur maybe you can remember if it’s, I think it’s still seven states, have those registries. But that comes with all sorts of other kind of impactful, harmful, dangerous – not just having your name, right, and your address, but you can’t move certain places. If you have children, this is like, you can imagine the impact of this. So I can look that up if they are interested in knowing that aspect of criminalizing law. That’s something to make note of as well, that even if we don’t have, kind of codified in law these health notebook type things, there are other mechanisms that states will put on their books to kind of regulate the movement and bodies of sex working people. And that our government usually equates sex working people either as helpless victims in the law or perpetrating, like, mastermind criminals. Like we have this dichotomy in our law, and so it plays out like that, like either you’re a sex offender who is like a predator, or you are only ever a woman or a girl in need of saving, because, you know, you don’t have agency or autonomy, right, to be making these decisions. So I would add, just to color in some of everything that Fleur said, like that too manifests in practice in our legal system.
K: Khady j’ajouterais aussi à propos des registres, il y a beaucoup d’états aux États-Unis qui participent à l’inclusion des PS qui sont, je ne veux pas mal citer, mais ça ressemble aux arrestations des trois fautes pour des accusations liées à la prostitution ou à la traite d’être humain, et je pense qu’il y a sept états maintenant, mais je peux chercher cela pour vous, qui participent à l’inclusion des prostituées délictueuses, donc les travailleuses du sexe qui ont été piégées ou criminalisées par, ou entrées dans le système carcéral, qui ont été ensuite inscrits sur les registres des délinquants sexuels. Et ça, Fleur peut-être tu peux te souvenir s’il s’agit de, je pense ce qu’il y a toujours sept états qui ont ces registres. Mais il y a tant d’effets percutants, néfastes, dangereux, pas seulement le fait d’inscrire ton nom et adresse, mais tu ne peux pas déménager à certains endroits. Si tu as des enfants, c’est comme, tu peux imaginer l’impact. Alors je peux le chercher si elles s’intéressent à connaitre cet aspect de la criminalisation. C’est quelque chose à noter aussi, que même si nous n’avons pas cette loi codifiée, ces genres de carnets sanitaires, il y a d’autres mécanismes que les états vont employer pour contrôler le mouvement et les corps des gens qui travaillent dans l’industrie du sexe. Et que notre gouvernement dit que les PS sont soit ay victimes qui n’ont pas d’autonomie, ou bien des criminelles tout puissantes. C’est à dire, nous avons cette dichotomie dans notre loi, comme si soit tu es un délinquant sexuel qui ressemble à un predateur, ou bien tu es une femme ou une jeune fille qui doit être sauvée, tu sais, parce-ce que tu n’as pas de pouvoir ou d’autonomie pour faire ces décisions. Alors j’ajouterais, pour ajouter à ce que Fleur vient de dire, que ça aussi se voit dans notre système juridique.
59:23
S: Trois délits, même [inaudible] bi, boo nekkee PS, trois délits quoi.
S: Three strikes, even [inaudible], if you’re a sex worker, it’s three strikes.
M: Trois délits boo xam ne, tej nanu.
M: Three strikes and you know, they lock you up.
S: Trois délits bu fekkee ne en tant que PS walla drogue bi ak yepp? J’ai pas compris trois délits.
S: Three strikes as in sex workers or drugs and everything? I didn’t understand three strikes.
K: Three strikes laws, I think my partner is right here and knows this for a fact. One second – three strikes laws come out of the 1990s, like, and that’s even though they’re not codified necessarily everywhere, it’s become like an elective policing, court-based – the like three strikes laws.
[Voice of Kira’s partner] Started in California. [Voice of Kira’s partner] Automatic life sentence in California, that’s right. So, three strikes laws kind of got phrased, did get its start in California, and that was kind of a policing escalation, and a court-based escalation of sentencing. And we’ve seen that, like for instance the way that that plays out in Cook county where I am just right now in Chicago, around prostitution-related charges, is that you know the first two arrests that you might have around a prostitution-related charge, they might make you do the like, the day school, the hooker day school, what we S workers [KD: or was it “as workers”?] call it.
Where it’s like, they give you cold fried chicken, and they make you watch a video, and they tell you to like turn your life around. And they usually have some kind of like Christian-based ministry group come and give you pamphlets, and they give you like, they make you stay in jail until you can like get out, or they give you the slap on the wrist and then kind of, we call it Catch and Release. And that will happen like your first two times. But your third time, you’re probably going to do an extended amount of time. You’re probably going to pay a larger fine and/or that option, the like quote unquote, what is it called, the charitable option, right, that the county sees itself giving you, of like, the exit school or exit program, that’s not available to you anymore. And now you’re deemed like, a real criminal. And so that’s an example of like, it’s not technically a three strikes policy, but it plays out in this really strange way, all across the United States. And it’s just a way of escalating sentencing. Does that make more sense?
K: Les lois des trois fautes, je crois que mon partenaire est juste à côté et le sait sans aucun doute. Un moment – les lois de trois fautes viennent des années 1990s, te, même si elles ne sont pas nécessairement codifiées partout, elle est devenue une façon de maintien de l’ordre électif, basé sur les tribunaux – comme les lois des trois fautes.
[Voix du partenaire de Kira] A commencé en Californie. [La voix du partenaire de Kira]. Une peine à vie automatique en Californie, c’est ça. Donc, les lois des trois fautes ont été en quelque sorte formulées, ont commencé en Californie, et c’était une sorte d’escalade policière et une escalade judiciaire des peines. Et on a vu que, par exemple la manière dans laquelle ça marche à Cook county ou j’habite actuellement à Chicago, autour d’accusations liées à la prostitution, c’est que, tu sais, les deux premières arrestations que vous pourriez rencontrer autour d’une accusation liée à la prostitution, ils peuvent demander que tu fasses l’école de jour, l’école de jour des prostitués, d’après ce que les travailleuses S l’appellent.
Où c’est comme, ils te donnent du poulet frit froid, et ils te font regarder un vidéo, et ils vous disent qu’il faut changer ta mode de vivre. Et ils ont une sorte de groupe de ministère Chrétien qui vient te donner des brochures, et ils vous donnent comme, ils te font rester en prison jusqu’à ce que tu voulusses sortir, ou ils te donnent une tape sur les doigts et puis, nous l’appelons “Catch and Release.” Et cela s’applique aux deux premières fois. Mais la troisième fois, vous allez probablement passer plus de temps en prison. Vous allez probablement payer une amende plus lourde et/ou cette possibilité, entre guillemets, comment s’appelle-t-elle, l’option caritative, tu vois, que le district pense qu’il te donne, l’école de sortie, ou bien le programme de sortie, tu n’auras plus accès. Et maintenant, on te considère un vrai criminel. Et donc c’est un exemple, ce n’est pas exactement une politique des trois fautes, mais ça se passe de cette manière tellement bizarre, partout aux États-Unis. Et ce n’est qu’un moyen d’augmenter les peines. Est-ce que ça fait plus de sens ?
M: Da nu ne bandit nga.
M: They say that you’re a criminal.
S: Ci booba, danu utiliser église bi.
S: They use the church for that.
KD:Waaw voilà.
KD:Yes that’s it.
S: Danuy utiliser église bi.
S: They use the church.
KD:Waaw voilà.
KD:Yes that’s right.
S: Danuy utiliser église bi. Danuy xam ne kii PS la.
S: They use the church. They know that she, she’s a sex worker.
KD:Voilà, c’est comme église day gniow, day jox ay fiches ak information, “li nga def baaxul, da nga war a –
KD:Right, it’s like, the church comes and gives flyers with messages, “what you’re doing is bad, you have to –
S: Même Musulmans day def noonu parfois de. Bunu xamee li danga mel ni ku lay moytu.
S: Even Muslims do that sometimes. When they know this about you, you become someone they avoid.
KD: Waaw, waaw.
KD: yes, yes.
M: Nyoom, trois délits la. D’abord danu lay conseiller. Mais après, comme si bandit nga. [inaudible]
M: Then, it’s three strikes. First they give you advice. But later, it’s as if you’re a bandit. [inaudible]
M: Bu de nun la, bu dee Sénégal, soo defee benn délit, men nanu la tejj. Parce-ce que loi PS ak loi population générale benn la. La loi, benn la, par rapport à la loi population générale. Par rapport à la population générale, mooy PS, UD yi, ak neenen yi, seen loi bi benn la.
M: If it was us, if it was Sénégal, if you commit one offense, they can lock you up. Because the law for sex workers and the law for the general population is the same. The law is the same, in relation to the law for the general population. In relation to the general population, for sex workers and drug users and others, the law is the same.
KD:Kon boo defee benn délit, men nanu la tejj.
KD:So if you commit one offense they can lock you up.
M: Waaw ça dépend. Soo amee avocat bu baax, mais men na bun a lourd peine. Men nga am benn mois, ak ceci
M: Yes it depends. If you have a good lawyer, you can avoid a harsh sentence. You could have one month or something.
S: Walla 15 jours
S: Or 15 days.
M: Walla 15 jours fermé. Walla un mois, six mois.
M: Or 15 days in custody. Or one month, six months.
S: Mais peine bi du lourd.
S: But the sentence won’t be severe.
M: Du nekk peine bu lourd. Avertissement lay donne. Du nekk peine bu lourd. Mais boo amatee, peine bi day gen a –
M: It won’t be a severe sentence. It’s like a warning. It won’t be a severe sentence. But if you have another, the sentence will be more –
S: Ok Khady, boo leen waxee loolu, ne ko nun tamit, nuungi content chaque fois ci seen waxtaan di nanu ci jang. Xamuma ba xam sunu waxtaan, nyoom tamit nyuungi ci jang. Da nu am partage bu baax. Surtout Kira. Kira day partager bu baax. C’est très important. Par rapport ak moom, xam na lu bere ci États-unis lu nekk foofu, pourtant dem unu foofu. Du degg? Chaque waxtaan danu ci jang lu bere. Personellement danuy jang lu bere. Contaan nanu torop. Fleur aussi baaxne. Mais Kira moom c’est le meilleur.
S: Ok Khady, when you tell them that, tell them also that we too, we’re happy with the conversation. Every time, we always learn something. I don’t know whether with our conversation, they also learn something. We have a great exchange. Especially Kira. Kira shares really well. That’s very important. Regarding them, they know a lot about the United States and what it’s like, whereas we haven’t been there. Right? Every conversation we learn a lot. Personally we learn a lot. We’re very pleased. Fleur is also great. But Kira is the best.
KD : Begg nga ma wax loolu de ?
KD : Do you want me to say that ?
S: Non jow uma de. Point de vue lanuy wax rek. Ay points de vue.
S: No I’m not gossiping. We’re just sharing points of view. Just points of view.
M: Da nu xame, Fleur moo gen a ki,
M: We’re saying that Fleur is more, what is it,
S: Non da ma remarquer naari causéries. Kira moo gen a participer. Parce-ce que ci moom lanu gen a jang. Kee, Fleur, danuy jang, mais c’est un peu lent. Mais kii da fay dugg, day xot dal. Day xotal ba ci biir. Kon du coup di nanu xam realités u Amerik. Voilà. Peut-etre kii moo gen a agé. Moo gen a xam milieu bi. Peut-etre loolu la tamit. Parce-ce que am na questions yoo xamantane duma ko xam. Kii da fa ko xam. Comprends nga? No différence bi loolu la moom. Loolu mootax ma expliquer.
S: No I noticed in the two conversations. Kira participates more. Because we learn more from them. The other one, Fleur, teaches us things too, but it’s a little slow. But the other really dives in, goes deep. They go all the way to the core of it. So all of a sudden we know America’s realities. There it is. Maybe they’re older, they know the environment more. Maybe it’s that as well. Because there are questions that I don’t know. But she [Maya] knows. You understand? No that’s the difference. That’s what I wanted to explain.
M: Mais c’est bon.
M: But it’s good.
S: C’est bon.
S: It’s good.
M: Content nanu. Parce-ce que normalement am na trois questions. Mais benn question lanu répondre.
M: We’re pleased. Because normally there are three questions. But we only answered one.
S: Benn question rek lanu wax.
S: We only talked about one question.
M: Xanaa beneen date bi inshallah. Comme ça nu kii, bu dee am satisfaction, répondre nanu bu baax. Te nyoom tamit repondre nanu ci, ba nu gis ko.
M: Maybe on the next date, if Allah wills it. This way we, if they’re satisfied that we responded well. And they also responded well so we see things more clearly.
S: Waaw. Content nanu ci réponses yi torop.
S: Yes. We’re so happy with the answers.
M: Waaxleen lepp te remercier leen bu baax. Parce-ce que trois questions la. Waaye benn question lanu répondre. Des na deux questions après. Les dernières deux questions rek, men nanu japp beneen jour, nu wax nu, après nu dajewat inshallah.
M: Tell them everything and thank them so much. Because there are three questions. But we only answered one. Two questions are left now. The last two questions, we can pick another day. They can let us know. After we will meet again if Allah wills it.
S: Sincèrement waxtaan –
S: Sincerely, the conversation –
M: Vraiment content nanu dal. J’espère bien qu’on a bien répondu aux questions aussi.
M: We’re truly pleased. I hope also that we too answered the questions well.
KD:Waaw waaw dégg la.
KD:Yes, yes it’s true.
M: Questions yi nu posées répondre nanu bu baax.
M: They responded really well to the questions we asked.
S: Content nanu waaw.
S: Yes, we’re pleased.
M: Nu def close.
M: So we can conclude.
S: Thank you very much. Sincèrement. Ne ko content nanu torop. Ne ko nu jang Wolof nu gniow Sénégal. Kira mu jang Wolof gniow Sénégal vacances.
S: Thank you very much. Sincerely. Tell them that we’re so pleased. Tell them they should learn Wolof and come to Sénégal. Kira should learn Wolof and come to Sénégal for vacation.
M: Ak Fleur yepp.
M: And Fleur and everybody.
S: Ne ko nu jang Wolof nu gniow Sénégal. OK?
S: Tell them to learn Wolof and come to Sénégal. OK?
F:Yes, retreat in Sénégal!
F:Oui, vacances au Sénégal!
[laughter]
[rires]
K: I accept this invitation. [KD: I’m 99% sure this was Kira’s voice!]
K: J’accepte cette invitation.
S: Inshallah
S: If Allah wills it.
[overlapping voices]
[les voix se superposent]
S: Merci beaucoup.
S: Thank you very much.
M: Merci.
M: Thank you.
F:Happy new year.
F:Bonne année.
[overlapping voices]
[les voix se superposent]
S: Ah Khady, da fa begg nu def invitation? OK amul probleme. Dinanu waxtaan. Nu jang Wolof après nu gniow. Sinon nu jang Francais après nu gniow. Su fekkee ne Anglais la day difficil torop.
S: Ah Khady, they want an invitation? Ok no problem. After we’ll discuss. They’ll learn Wolof and come, or learn French and then come. If they speak only English that would be too difficult.
F: Got it.
F:Bien compris.
KD: Sana doo ma inviter? [rires] OK, anyway.
KD: Sana you’re not inviting me too? [laughs] OK, anyway.
S: Ma inviter la yaw, ma inviter la?
S: Shall I invite you, shall I invite you?
KD:Waaw waaw
KD:Yes yes.
M: Gniowal nu lekk cere.
M: Come and we’ll eat millet couscous.
S: Waaw gniowal.
S: Yes come.
M: Gniowal nu lekk cere.
M: Come and we’ll eat millet couscous.
S: Men nga tokk ba annee prochaine, nuungiy gniow nu degg Wolof moo gen. Walla nu and ak yaw gniow vaccances.
S: You can wait a year so that they come when they understand Wolof, that would be better. Or you come with them and come for vacation.
F:How do I say happy new year in Wolof?
F:Comment dit-on Bonne Année en Wolof?
M: Gniowal nu lekk cere. Gniowal nu lekk cere.
M: Come over and eat millet. Come over and eat millet.
KD:Danu ne deweneti.
KD:They say happy new year.
[overlapping voices, goodbyes]
[Les voix se superposent et disent au revoir]
M: Bonne année. Merci.
M: Happy new year. Thank you.
S: Naka lanuy waxe bonne année en Anglais?
S: How do we say happy new year in English?
KD:Nu ne, Happy New Year.
KD:They say, Happy New Year.
S: Happy New Year.
M: Happy New Year.
Voices overlap, saying goodbye and sending good wishes.
Les voix se superposent, disant au revoir et envoyant des bons voeux.
[1] A couple thoughts about how to present the word “sutura” – We could translate sutura directly, or we could put the wolof word sutura first and then put the English translation in parenthesis. // on peut traduire le mote sutura directement, ou bien on peut utiliser le mot “sutura” en wolof d’abord et après on met la traduction Anglaise en parenthèse.
Despite widespread moral panic about sex trafficking, most ordinary people don’t understand how their responses to street-based sex work might be contributing to violence, exploitation, and trafficking in the sex trade. The reality is that many communities, and especially Black and brown trans women, are pushed out of traditional forms of work, turn to underground economies like the sex trade to meet their needs, and then face criminalization and homelessness (and worse: the criminalization of homelessness) that makes them vulnerable. 90% of street-based sex workers have experienced violence in the course of their work, and often, that violence comes directly at the hands of law enforcement.
At the same time, with the rise of gentrification and the influx of more affluent, often white new residents, neighbors are increasingly turning to the police to make complaints about their Black, brown, and poor neighbors occupying public spaces. 2019 data from the Community Service Society of New York shows that quality of life complaints increased by 166% in long-standing communities of color that had had large influxes of more affluent, especially white new residents. Consistently, wealthier (mostly white) residents move into neighborhoods and use their complaints to police the behavior of longtime residents of color as a way to establish new neighborhood norms that make communities of color feel less welcome and safe. Instead of advocating for increased resources to address the root causes, neighbors use their complaint calls to police to displace marginalized communities.
Street-based sex workers and their more stably housed neighbors have a shared goal: community safety. For Black, brown, and poor communities, policing is a primary source of violence. And for new neighbors, calling the police is often all they know. So, how can we equip communities with new strategies for building safety without calling the police?
While some gentrifiers might truly intend to push out neighbors of color that they deem “undesirable” by calling the cops, we can still reduce harm by providing neighbors with alternatives, understanding, and education. As organizers, we need to educate communities about the reasons that people trade sex on the street and the risk factors that make them vulnerable to the violence, trafficking, and exploitation that we have a shared interest in disrupting. As we’re innovating new ways to use technology to better support people in the sex trades in the wake of FOSTA and SESTA, we also need to get creative about fighting the traditional ways that technology has been weaponized against those who are most marginalized in the sex industry.
Ending the criminalization of sex work is an essential part of the strategy to curb violence against people trading sex and people profiled as sex workers, and in places like Washington, DC and New York, sex workers, advocates and lawmakers are working to pass comprehensive legislation that seeks to decriminalize the buying and selling of sex among adults, while also targeting the many other laws that criminalize sex workers for surviving, like the #WalkingWhileTrans ban that acts as “Stop and Frisk” for Black and Latinx trans and cis women, and the criminalization of housing sex workers that puts people at increased risk of trafficking.
But this policy change won’t happen overnight, and it’s not enough on its own. Apart from decriminalization and decarceration, we have to change the social norms that stigmatize and harm sex workers, which disproportionately impact those working on the street. And we have to fight stigma with education.
Research shows that art, when paired with education, can change behavior. With support from Hacking//Hustling, Decrim NY worked with the NYC-based artist M on new art to tell our stories: the stories of street-based sex workers and sex workers who have experienced police violence, and to urge neighbors to stop calling the police on their neighbors. We’ve plastered the art throughout neighborhoods like Jackson Heights, Prospect Park, and Hunts Point, and our canvassers have engaged in dialogue with neighbors about the real problems of trafficking, violence, and exploitation in the sex trades and in public spaces, and helped them understand that our communities need more resources, not more policing.
Danielle Blunt: Welcome everyone. Today we are here for a panel on The Coding of Risk: From Sex Work to Sanctions.
Before we get started, we wanted to let you know that it’s totally okay to take screenshots or tag us on Twitter: @hackinghustling or @article19org.
So hello, I am Danielle Blunt. I am a sex worker and co-founder of Hacking//Hustling, a collective of sex workers and accomplices working at the intersection of tech and social justice to interrupt state surveillance and violence facilitated by technology. I’m also a public health researcher, currently researching the financial discrimination of sex workers.
Today’s panel explores the overlap in financial discrimination and censorship of citizens of sanctioned countries and sex workers. The goal of the session is to explore how algorithms expansively code and therefore produce categories of marginalization and the processes that control the flow of information and capital. How are communities coded as high risk by financial institutions? How does a high risk assessment impact people’s lived experiences, increase vulnerability and disrupt mutual aid?
Joining us for this conversation today, we have with us Mahsa Alimardani, an internet researcher at Article 19 focusing on political communication and freedom of expression online, who is also a doctoral candidate at the University of Oxford.
We also have Afsaneh Rigot, a researcher focusing on law, technology, LGBTQ, refugee, and human rights issues. She’s a senior researcher at Article 19 on the Middle East and North Africa team focusing on LGBTQ rights, police tactics, and corporate responsibility. Afsaneh is a fellow at Harvard Kennedy School’s Technology and Public Purpose program, furthering her work on corporate responsibility and her Design From the Margins framework. She is an affiliate at the Berkman Klein Center, where she continues work and research on the use of digital evidence in prosecutions against MENA LGBTQ communities.
And we also have Lorelei Lee, a writer, sex worker, activist, organizer, juris doctor, justice catalyst fellow, a co-founder of the Disabled Sex Workers’ Coalition and the Upstate New York Sex Workers’ Coalition, a founding member of Decrim Massachusetts, and a researcher with Hacking//Hustling. Their writing appears in n+1, The Establishment, Spread, Denver Quarterly, The Feminist Porn Book, Coming Out Like a Porn Star, We Too, Hustling Verse, and elsewhere. Their book, Anything of Value: Looking at sex work through legal history, memoir and cultural criticism, is anticipated in 2023. So to start off, I’d love for all of you to introduce yourself a little bit further and what perspectives and insights you’re bringing to this conversation. Lorelei, I’ll go ahead and start with you.
Lorelei Lee: Yeah. Hi everyone. I’m so happy to be having this conversation. Just the few meetings that we’ve had have been so generative. So I’m really thrilled to be talking about this here.
I started to think about some of the comparisons between the treatment of people profiled as being in the sex trades and the treatment of people profiled as being Arab or Muslim in the United States when I was working at the Center for Constitutional Rights and noticed in a bunch of their work that they were seeing folks being de-platformed for doing Palestine solidarity work, for example. By de-platformed I mean kicked off of tech platforms and kicked off of financial tech platforms. They’re, you know, not being allowed to accept donations, et cetera. And I just noticed that that is the same thing that happens within the sex worker community.
We all know, you know, that it happens to all of our friends. So I started to just think about this similarity in what I think a lot of people think of as two very different groups of people. Now, of course, that’s not true. We are overlapping groups of people, but maybe something else that is an important point to come out of this is how, to the government, we are very similar types of people. So in the United States, we have legal regimes that target trafficking, we have legal regimes that target terrorism, and we have sanctions regimes. And each of those regimes is made up of a group of laws. They claim to target a very specific and horrifying type of violence, but they do so through vaguely worded statutes that are very broad. And they’re intentionally broad in order to allow for discretion in their enforcement. It’s good for government officials, it’s good for prosecutors, and it’s good for Congress if these laws are thought of as targeting this bad thing, and the bad thing could be everywhere. And so they’re pretending that there are clear lines in the language they use to discuss it while simultaneously making the laws very ambiguous so that they can define it as they go. And that has caused all kinds of problems in relationship to tech companies specifically, but also any companies really. And we’ve seen this kind of profiling, actually what’s happening is an over-compliance with these laws. So because these laws are vague, companies don’t want to take legal risk, of course. And so they will make their rules be the furthest they can be, out on the edge of where the rule could be. So, you know, like the behavior that the law is targeting is way over here, and then things that are thought of as adjacent to that behavior, but really it’s people who are thought of as adjacent to that behavior, are like all the way over here.
And this is where they’ll draw the line. And so they’ll develop policies that are actually meant to profile and exclude. So I think that this is an intentional scheme on the part of the U.S. government, as well as on the part of corporations, in which the government can claim that these problems, these types of violence, are external to the United States or external to the social body that is thought of as the good American citizen. And so with trafficking, I mean, it’s very explicitly framed as being about people who are socially deviant in a way. And then the folks who get profiled under that regime are sex workers, regardless of whether trafficking has happened in their place of work or not. And then of course under the anti-terrorism regime, Muslim and Arab people are targeted. And then under the sanctions regime specifically, Iranians are targeted. And so all of these are folks who the government feels don’t need to be part of the social body. And in fact, it’s better for them if we are not. And corporations, similarly, if you externalize these harms, then you can claim that your actions are always good, that this doesn’t happen inside of my organization. It only happens outside of it. And so I only have to focus outside of it. And so specifically for anti-trafficking protocols, there have been a lot of instances where companies have horrifying worker treatment and like no labor rights within their company, but that’s not important. That’s pushed to the side in order to focus on what is thought of as the most extreme form of labor violation, which is trafficking. And, you know, so this kind of division is happening through this scheme. And I think I had one more point that I wanted to make, which is really just to note again that these are schemes that criminalize people while they pretend to criminalize behaviors.
Danielle Blunt: Thank you, Lorelei. Mahsa, would you share a little bit about your perspectives and insights that you’re bringing to this conversation?
Mahsa Alimardani: Definitely. So, I mean, I want to start from a point that Lorelei just made, which was the intentionality of these laws, which I think is very interesting because my work centers on access to the internet and freedom of expression online in Iran, and this is very much a value that the U.S. government (inaudible) to be behind. And, you know, they have a lot of, I mean, the State Department constantly talks about their belief in freedom of the internet; even under the Trump administration we’ve seen these values reiterated. And when we’re looking at Iran, it is naturally a country, you know, that has a repressive government, a repressive system in play that is causing a lot of hurdles to access to information and freedom of information online.
But on the other hand, there are restrictions put into place by the U.S. government and many will argue, well, you know, if the U.S. government didn’t have these restrictions, it would still be repressive in Iran for, you know, internet users in Iran, which is true. It would be. But if we’re, you know, concentrating on policies in the U.S. this is, you know, the low hanging fruit we can change, which is to ease access through these policies. And so as Lorelei mentioned, there is a series of restrictions placed on how Iranian users inside Iran, even sometimes outside of Iran, have access to certain services. How this manifests itself for freedom online is very interesting.
So we have sanctions, and sanctions in place typically means that U.S. entities cannot provide services or do business with Iran. But there have been provisions, again, because of the value of freedom online, where there is actually a license given by OFAC, the Office of Foreign Assets Control at the U.S. Treasury. It’s called General License D, which is supposed to allow for services that enable personal communications. How this gets misinterpreted, and how we see platforms like Slack being blocked for Iranians, has a strong component to do with the topic at hand today, which is financial discrimination. So the laws get interpreted in a way to argue that certain services being provided to Iran or Iranian entities are breaking sanctions because they are involved in financial transactions. Oftentimes they are not. And so I guess that’s where this comes into play.
This is the intersection between what U.S. values are, how they’re being put into play, how companies, a word that we’ll probably use a lot today, are over-complying with these laws, and how the U.S. government is not necessarily making it easier. Part of the work that I’m doing at Article 19 is working on different ways to persuade the companies to, A, stop over-complying, and, B, for the U.S. government to make a more rigid General License D, one that won’t be subject to, you know, the interpretations that companies like Apple, Google, and Slack have been making, which have misidentified certain services that are vital for access to the internet or freedoms online as financial transactions. I mean, there is a wider discussion about why Iranian users have to be subjected to these financial hurdles, and that is a very valid conversation. And I think we’ll get into it as well, but I just wanted to put into context how this is a major issue for freedom online.
And I think I’m getting a message that I, there’s some issues with my microphone sound, and I will try to fix that. And I apologize if that was getting in the way.
Danielle Blunt: Thank you so much. And Affi, I’d love to hear from you about what you’re bringing to this conversation.
Afsaneh Rigot: Yeah. Hi. Just quickly, apologies, no video here, so it might be a very dark screen, but it’s RightsCon, we’ll deal with it.
So it’s a fascinating conversation, and I’m really glad to be part of it in this sort of bridging of the different work and activism and advocacy that different communities have been doing around this concept of financial discrimination. I think having conversations with yourself and having conversations with Mahsa and Lorelei and folks like Kendra, our mutual friend at the (inaudible) clinic, and so on has highlighted this concept of the same sort of patterns and systems and methodologies used to target individuals who are deemed high risk. And, you know, we’ll get into this conversation about how high risk is presented and who gets to choose what high risk means and how the sort of methodologies employed throughout these different procedures, whether it’s trafficking, whether it’s money laundering, whether it’s terrorism, whether it’s sanctions, they tend to use the same sort of methodology for all of them and, you know, as we’ll get into, apparently the same databases.
So when we’re discussing these sort of topics and bringing in consideration of the impact of tech on marginalized communities, which is something I deeply focus on in my work, focusing on the impact on communities, specifically queer communities in the MENA region, I see some of the same things often: the use of morality framing to target those most marginalized through law enforcement, and the justification of targeting those most marginalized, those seen as low risk in terms of prosecutions because the prosecutions can happen so quickly. It kind of hits a nerve similarly when it comes to these concepts around who gets blocked off from access to financial institutions and who gets blocked off from access to platforms and so on, and the use of this sort of morality framing that Lorelei was really beautifully outlining, how they try and frame who is high risk in these situations.
I think whoever has done any sort of studies into enforcement and police enforcement sees the overwhelming similarities in how they use these sort of frames and who becomes a target of them. So I come at it from the perspective of, one, being an Iranian who has been subject to these issues, from family to friends, but also as someone who’s been working as part of Article 19, supporting Mahsa’s work on sanctions, and also looking at the way these sort of blockages affect the queer communities that I look at, not in the way of getting thrown off, but in the way of just not having any access, knowing that you’re not welcome. And that’s something we’ll talk about later.
But yeah, I’ll leave it at that: I come to this conversation with just a lot of solidarity and joy that these advocacy projects and movements and work are happening, including from Hacking//Hustling and the work that has been happening on Iran sanctions. We should be bringing that all together because it’s one common methodology used around the most marginalized within these contexts. And it makes a lot of sense for us to be clamoring together, to highlight how it’s affecting those folks that fall under their risk appetite. So I’ll leave it there.
Danielle Blunt: Thank you, Affi, and thank you for bringing up the importance of cross movement work and building solidarity across movements so we can work together and build knowledge together and build power together.
So in our conversations, we’ve been talking about some of the parallels in anti-trafficking and antiterrorism laws and sanctions and Lorelei, in your intro you were talking a little bit about how these laws are really about policing people, not behavior. And I was wondering if you could talk a little bit about some of the parallels in anti-trafficking laws and antiterrorism laws and how these can lead to the production of high-risk communities and the subsequent financial discrimination of marginalized communities.
Lorelei Lee: Yeah. So I think the parallels, and I’m going to apologize, I may repeat something that I have already said, but I think the parallels are really in the enforcement. So there’s parallels in the language in that both- So, okay. So if we’re talking about trafficking in the United States, we’re talking about a couple of different major laws: the Trafficking Victims Protection Act, which has been modified by FOSTA, which a lot of people have heard of recently. It passed in 2018. That’s the Allow States and Victims to Fight Online Sex Trafficking Act. And then FOSTA also modified something called the Mann Act, which was passed in 1910 and was originally called The White Slave Traffic Act. And each of these are laws that claim to target trafficking. However, for example, the Mann Act, in the decades after it was passed, started to be used to prosecute things like adultery, you know, having a mistress and crossing state lines with her. It was always a “her” at that time, because the Mann Act was specifically about crossing state lines with women for immoral purposes. And that was the language of the law at that time. And in fact, a Mann Act case was also brought against a woman who traveled by herself on a train across state lines. And they said that even though the language of the law made it sound as though you would have to be transporting someone else and not yourself, they needed to prosecute her in order to be sure of getting at the entirety of the evil the law was trying to target. (laughs) So, similarly, we have FOSTA, which we don’t know how it’s going to look, but we do know that FOSTA expands the federal trafficking statutes to also put prostitution for the first time into federal law, a criminal statute against prostitution.
What FOSTA actually says is that you can’t use a computer service, there’s a bunch of legal language around what a computer service is, but internet technology, to promote or facilitate the prostitution of another person. And so that’s the first time that we have prostitution really becoming a federal crime. And then the TVPA is the law that was passed in 2000, around the same time that we have the Palermo Protocols internationally, and it very specifically creates a definition for what trafficking is and what sex trafficking is. However, it defines sex trafficking as the exchange of sexual behavior for anything of value. And so it encompasses any kind of trading sex, not just coerced or forced sex trading. However, it only criminalizes severe forms of sex trafficking, and so they have a different definition there that does include force, fraud, and coercion. And then in the anti-terrorism statutes, they’re called the material support statutes, and they prohibit the provision of material support to foreign designated terrorist organizations. The United States keeps a list on which they designate organizations as FTOs, and the definition of what it means to provide material support has become very broad. It’s similar to these other laws. There was a case out of the Ninth Circuit involving HLP, the Humanitarian Law Project, an American organization that wanted to do some work with organizations in other countries that had been designated on the foreign terrorist organization list. But the work that they wanted to do was human rights work. They wanted to specifically provide training for nonviolent dispute resolution methods, such as filing complaints with the U.N., et cetera.
And the court decided that providing nonviolence training was providing material support because it was going to be fungible with resources that the designated organization could then put toward buying weapons, for example. So if they don’t have to pay for peaceful mediation services, they can use money to buy weapons. So the idea here is that this thing that is pure speech is now fungible with money. And it made me think of that when you were talking, Mahsa, because I do think that’s what’s happening with these tech platforms also. We have a lot of blurring. Is this about money? Is this about speech? And is it really just about excluding folks from everything, from, you know, having access to financial technologies, having access to platforms at all? And I, you know, I also mentioned that one of these laws was originally called the White Slave Traffic Act, and I think that’s really also important to bring up because I do think that these laws, all of them, are rooted in racial capitalism, in systems of racial capitalism in the United States, in the criminalizing of some behaviors, but specifically behaviors by some people. And when you have something like the White Slave Traffic Act, where what you’re doing is trying to remove women, white women, you know, women who are assigned to this thing called whiteness, you’re trying to preserve that thing called whiteness by removing them from situations where they might be having interracial sex, and simultaneously continuing to criminalize, for example, Black women in the sex trades, which is what happened after the passage of that law. Really sort of a building of a rescue industry and simultaneously a criminalization industry.
And so I know these are, I’m sort of going on these tangents that aren’t direct to the topic today, but I think you can probably see how all of this leads to a kind of exclusion online that is just a progression from an exclusion that’s been going on for hundreds of years.
Danielle Blunt: Thank you, Lorelei. Yeah. And so I, I hear what you’re talking about and how broad these laws are, which can lead to like a gross over enforcement of them.
So Affi, this question is for you, how does the coding of risk happen at financial institutions? And could you talk a bit about the process by which individuals and communities are deemed to be high risk?
Afsaneh Rigot: Sure. Also, thank you, Lorelei and Mahsa, for these outlines. And I mean, just following up on this notion of the sort of vague and broad scope that these institutions are being provided with in terms of implementation, we’re going to be seeing it in terms of risk profiling, and you don’t see my fingers here, but I’m doing “risk,” you know, air quotes. (laughing) Risk profiling.
So I think it’s interesting to do it with a case study of a particular bank that I managed to get some sort of insight from in terms of how they’re doing some of their risk profiling.
This is a bank based in the UK, but the sort of work and methodology they use is replicated throughout. Different banking systems have different risk appetites, so some of these percentages might differ and some of the software might differ, but the notion remains the same. And one thing to keep in mind about risk profiling, and this is something that, Blunt, you and I were talking about in other conversations, is this deputization of people to take on the profiling and methodology. So let’s say for financial profiling they’re doing that sort of online profiling of accounts. There’s this methodology of using third-party due diligence software. If you’re a banking system, you will often be using your own databases and checking your accounts against broader, larger international databases, such as, I’ll give you a few names here, LexisNexis, World-Check, and NetReveal. So depending on what software the bank itself has, it’ll usually check its data against something like LexisNexis and World-Check.
A little bit of background, because I went down the black hole looking at the rubbish these companies get into. LexisNexis and World-Check are owned by RELX and Thomson Reuters respectively, who are doing data brokering on the side of their journalism, which is always nice to hear. So in terms of this data brokering and selling of data, what happens is that there are millions of profiles: the last I checked, World-Check had 2.5 million profiles, with 25,000 new profiles added monthly. These are profiles seen as high risk, whether it’s on terrorism as well as on trafficking and so on. So checking this, the bank will look at whether or not a profile is matching, depending on their risk appetite, at a 25, 30, 50, 75, or 100 percent threshold. And then you bring in the sort of human element, that is, the employees who will go through these profiles and check them.
What triggers off these sort of risk alarms in different situations? With sanctions and countries, as Mahsa has explained and will continue to explain, it’s really not that complicated. Literally the name of the country coming up will trigger it, the name coming up in any sort of format, whether it’s the name of a human rights organization or the name of a restaurant; it will trigger the sanctions flag and a ban right there. And often the sort of framework the banks will have is that you can open up accounts for Iranians, but you can’t open up accounts for anyone who has any sort of direct or indirect dealings with Iran. What does indirect mean? Who knows? It’s up to them to decide. Indirect, does it mean having the name in your organization’s title? Yes, apparently it does. So that sort of sanctioning and risk assessment is quite automatic. You also have countries like Cuba that fall into this. The U.S. is the only country that has sanctions regimes against Cuba; however, international regimes around the world and banking regimes abide by the U.S. regimes on this. When it comes to Syria and Iran, it’s more international, but it kind of shows you where the priority is set when it comes to risk assessments. So places like North Korea, Iran, Syria: immediate ban, and then they check later. Then there are the terrorist designations. This insider of mine has given me some really nice quotes and tidbits to understand it. Literally a name like Mohammad will be flagged every time, because that’s the sort of risk assessment they’ve put out there. And then it has to be the human element that goes in and looks at that profile. They were also mentioning that the prefix Al, like Al Hammadi and so on, a very popular name in the Arabic-speaking regions, will pop up as a risk for terrorism.
And, you know, it depends on the bank and the sort of threshold they put on, but well you can see the pure racism and the discriminatory nature of this quite clearly as it comes.
When it comes to sex workers, very frankly, I was told that any Vietnamese names get flagged as potential victims of exploitation, because they have sort of associated Vietnamese names with trafficking in these contexts. But the interesting part of this is also that sort of vague method that they use around this. For example, when it comes to sex workers, there’s absolutely zero to no distinction between actual sex work, trafficking, or exploitation. It’s all under the same umbrella. And the sort of slides that I was seeing were all about how to identify victims of exploitation. No differentiation there, and no training there that there’s anything outside these realms. And it’s that sort of framing within the algorithms and the deputization that creates this risk assessment for sex workers. Any sort of words will pop up: sugar, darling, emojis you’ve mentioned to me before, although it didn’t come up in that conversation. Very little things come up as seen as a trigger for potential trafficking and closing of these accounts. So, you know, these sort of vague concepts around what is risk not only revolve around very racist, discriminatory frameworks and geopolitics, but they’re also being passed on to these (inaudible) that we’re talking about, who are being trained in the same way.
One of the other things that was really fascinating to see in this is that there’s this moralization in the training that’s happening for folks who are enforcing sanctions, terrorism, or trafficking regulations within the bank, where, you know, this insider in fact was telling me it felt like pure brainwashing, because you’re being told you’re enforcing sanctions regimes on X, Y, and Z countries because you’re helping them abide by human rights provisions. You’re enforcing anti-terror legislation this firmly because if you don’t, you’re going to be responsible for terror acts in specific situations. You’re doing this profiling of sex workers because you’re trying to stop exploitation. So this moralization is really problematic in the context. And you know, one of the questions was, do you ever get asked or told about the folks that get stuck in the middle, who have been marginalized and thrown out of these services and don’t fall under any of these categories? And the answer was clearly no.
So obviously, when it comes to risk assessments, I can very, very deeply see that this over-compliance, rather than compliance, as the incentive is a huge deal, because it becomes this notion of ensuring that there are no fines against a company. Even though they have these moralization presentations, they also talk about these fines and revenues: if you don’t do it, we’re going to get fined big time, and then they show them these huge cases. The final point I’m going to make is that it also really depends on the institution. If the customers of an institution are from a particular country or a particular demographic, they’ll have more robust systems in place, because again, it becomes a revenue-based incentive for them to maybe not enforce further. I’ll stop there. It’s all kind of trash, but there we are. These are the systems we work with.
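[Editors’ note: the threshold-based watchlist screening described above can be sketched very roughly in code. This is an illustration only, not any bank’s or vendor’s actual system; the watchlist entries, function name, and the use of simple string similarity are all invented for the example. Real products like World-Check run proprietary matching over millions of profiles.]

```python
from difflib import SequenceMatcher

# Hypothetical watchlist entries, invented for this sketch. Real
# due diligence databases hold millions of "high risk" profiles.
WATCHLIST = ["Ali Hammadi", "Tehran Trading Co"]

def screen(customer_name: str, risk_appetite: float = 0.75) -> dict:
    """Flag a customer if any watchlist entry matches above the
    bank's chosen threshold (its "risk appetite")."""
    score, entry = max(
        (SequenceMatcher(None, customer_name.lower(), e.lower()).ratio(), e)
        for e in WATCHLIST
    )
    # Above the threshold, the account goes to manual review by an
    # employee: the "human element" described in the conversation.
    return {"flagged": score >= risk_appetite, "score": score, "match": entry}
```

Lowering `risk_appetite` toward the 25 or 30 percent end of the scale reproduces the over-broad matching described here, where a common name or a country name anywhere in an account triggers a flag.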
Danielle Blunt: Thank you so much, Affi. And I think it’s such an important point about the human role in these automated flagging systems, the process by which companies are deputizing employees as an extension of policing, and how policing is deputizing companies in the same way, so that the biases of the employees, which are shaped by these propaganda-like trainings, are impacting what’s happening here.
And so Mahsa, my question for you is, how do these processes impact people living in sanctioned countries? And how do these processes parallel issues of internet access in sanctioned countries?
Mahsa Alimardani: So there’s two ways to answer this. One I’ve already touched upon, which is the very tangible access to platforms and services that is hindered by sanctions and these systems. And then the second is kind of a question of content moderation, which I know has been very popular and much talked about throughout this year’s RightsCon, but how Iranians are able to freely express themselves is somewhat subject to these systems as well. And I’ll start with this because Affi did this really great job of framing the kind of, I guess, systems in place to do this. And similarly, there are systems in place within platforms like Facebook or Instagram that are doing this.
I mean, today a coalition that I’ve been part of just launched a big campaign asking Facebook to do a public audit of the content it has on Palestine. And this is a big step forward, just because, you know, the faults in the content moderation Facebook has been implementing recently have been so egregious for Palestine, but hopefully this will be a first step for the wider region.
There’s some really great research that Dia Kayyali has been doing with Mnemonic, and I think previously with WITNESS, where they were actually examining the algorithms in place and how these algorithms were flagging content based on just the IP of the users within the Middle East region, you know, the use of Arabic script within the region, which is, you know, used for Persian, Urdu, and Arabic. And these things automatically flag the content for their systems to, you know, do extra moderation and extra removals within, you know, their policies on terrorist content, terrorist organizations, and individuals, which is an actual Facebook policy. And so we saw this kind of mass removal of content about terrorist individuals and organizations back in April of 2019, in particular in Iran, because the Trump administration decided to designate the Revolutionary Guards as part of the FTO list. And so naturally, I’m not here to argue that the Revolutionary Guards are by any means a good actor, but they are such a pervasive presence within Iran.
I mean, men have to do mandatory military service in Iran, and there’s a one in four chance that mandatory military service will land you in the IRGC. And so, you know, I’ve known of human rights activists who had to serve within, you know, a division of the Revolutionary Guards, who have been subject to certain sanctions or checks or denials of boarding flights because of this past they had no control over. And the Revolutionary Guards have a pervasive influence over the banking sector and a lot of the private sector in Iran. So it’s kind of hard to avoid them if you’re Iranian, and, you know, everyday politics, politicians, and news in Iran very much revolve around the Revolutionary Guards. And so if journalists and human rights activists want to just freely talk about and express, you know, day-to-day activities, they find massive hurdles. I mean, for example, in January of 2020, a major figure from the Revolutionary Guards was assassinated by the United States. And we saw, you know, activists and journalists having their content removed for just mentioning his name, Qasem Soleimani, because he was part of this terrorist designation, even when they were talking about benign things and even criticizing him. So this has been a hindrance in the ways that these platforms, you know, make use of these sanctions designations and FTO lists.
On the other hand, users, you know, naturally do not have access to the same platforms because of certain things: if a VPN or a service is hosted on the Google Cloud Platform, it won’t work in Iran. And what this often means is that Iranians will end up using less secure platforms, platforms that are being hosted by the Iranian government, which is something that this U.S. policy is playing into very nicely. You know, the Iranian censorship regime wants users to be reliant on national infrastructure, and the sanctions policy is encouraging that kind of migration and has been encouraging it for the past 10 years. And so fixing the implementation of these regulations, whether on the side of the companies hosting the content of Iranians or of those allowing access to online services, is a very simple thing that could potentially make freedom of expression online and access to the internet much easier. I mean, it’s not the cure-all for freedom of expression or access, but it is kind of a low-hanging fruit that, you know, the United States could potentially help with.
Danielle Blunt: Thank you so much. And yeah, I think that point that when people don’t have access to these tools, don’t have access to online spaces or access to financial technologies or banking institutions, the impact isn’t necessarily an unintentional consequence; it’s often by design, an intended result of those rules. So we’ve talked a bit about how financial discrimination happens, but I’d really like to shift the conversation for these last 15 or so minutes to the real-world impacts of financial discrimination and of denying groups of people access to capital.
And so Affi, I’ll start with you. Could you talk a little bit about how the broadness and vagueness of these rules leads to over-compliance and over-enforcement, and a little bit about the ripple effect of how this can impact people’s larger social networks?
Afsaneh Rigot: Yeah. I mean, as an example of this from the work I do: if I talk to sex workers within the MENA region about how they would use financial platforms, or any sort of platform, for their work, the answer is that it’s just non-existent.
There is quite a clear line that, one, we’re not welcome on these platforms, and two, we can’t use these platforms. Which is a big issue because in the sort of study of prosecutions I do of queer and trans folks and sex workers in the MENA region, one of the main things police look for is monetary transactions, like physical monetary transactions. And in somewhere like Egypt, for example, without that sort of monetary transaction the case prosecutors have against individuals is a lot weaker. So, you know, in my sort of utopian mind, if there were a method by which folks could safely use a platform to do these transactions, without having different currencies and so on on them on the streets when they are arrested or entrapped, these laws, which in Egypt, for example, are called the laws against prostitution, and this situation of higher charges, would be much harder to enforce. Not foolproof, but the reality is people don’t have this access.
Again, the sanctions regimes that we’re looking at are so wide and broad in terms of reach. And the fact is that the folks we’re talking about are so easy to throw under the bus in these situations, for these companies and financial institutions, that the platforms and tools folks need can be cut off from them quite immediately. We’re seeing this right now with regard to queer dating apps in Iran, where a large number of queer dating apps have basically over-complied with sanctions regimes and cut off queer users in Iran from using them. They were not legally required to do this; there are provisions in the framework, which Mahsa can go into much better than any of us, that do not require this over-compliance. However, as we talked about, the incentive is to over-comply rather than comply, because losing Iranian users isn’t that big of a deal; they’re not going to be able to use financial banking systems to get premium features anyway. There’s also the, you know, moralization and fear-mongering that comes with all of these situations when advocating for these changes. And obviously we are having these conversations and trying to ensure there’s a method of kind of reversing this over-compliance, but it spills over in this context. In Iran, there are so many legal and policy frameworks that criminalize the community.
These platforms become fundamental for connecting. These platforms become fundamental for community. They have their problems, they have their many problems, but they’re fundamental. Cutting that access off is just like quadruple discrimination against a community that’s already discriminated against. And why?
Because you thought it’s probably better to over-comply than just comply. So the ripple effect is huge.
I’ll pass it on to Mahsa to continue about the sanctions issues when it comes to different platforms and the effects on individual groups. But one thing I want to add, if there’s any global enforcement listening, I don’t know, maybe: if your criteria or framework for protecting vulnerable groups against prostitution or trafficking means throwing whole vulnerable groups under the bus, it’s not working. And it’s not about vulnerable groups. We all notice. So. (laughing) Let’s stop it there and I’ll pass on to Mahsa from there.
Mahsa Alimardani: Yeah. So I think Affi mentioned something really important, which is what, I guess, qualifications or standards are applied when sanctions are applied to certain technologies or services. And I mean, tomorrow there’s going to be a session hosted by GitHub that I’m part of, and the really interesting thing about GitHub as a platform is that it offers both paid and unpaid services. Yet both the paid and the unpaid, essentially the free, services stopped being served inside of Iran.
So you had Iranian users with, you know, free repositories having those repositories removed off of GitHub. And this was a very strange thing. And I know Affi and I have had many conversations with different platforms like this where we’re like, but you know, this is personal communication; there’s no financial aspect to open source code in a GitHub repository. How is this being applied? I mean, this is happening with a lot of platforms where this judgment is being made. GitHub is one of the few that actually went the extra mile and got a general license for their product to be available in Iran. And this was something that took a lot of time. And if you come to the event tomorrow, they’ll explain to you exactly the kind of grueling process that only a big corporation (Microsoft is behind GitHub) could really, I guess, support. And they did do this, and it has become kind of a model for the rest of the tech industry.
And Affi has been working with other, you know, companies through her work in the MENA region, and we’ve been trying to kind of apply the GitHub model. But the thing is that, you know, General License D for personal communications should be strong enough to stop these companies from having to fear those kinds of repercussions from the government. But unfortunately, those assurances aren’t there from the government, even though the government will come out and say they believe in freedom of the internet and access. And so you have, I guess, the companies and the government passing the responsibility back and forth between each other. So at this point, the solution seems to be the goodwill of the few companies that are willing to take those extra steps.
Danielle Blunt: Thank you. Thank you Affi and Mahsa. I feel like both of you touched on how lack of access to tech can actually increase vulnerability and violence. And when Affi was talking about dating apps shutting down because of sanctions, I wanted to note that this also impacts sex workers who hustle on those apps, which is a good reminder that the communities we’re talking about often hold intersecting identities; it’s not that we’re talking about two discrete communities. So I’ll end with this question to Lorelei: how does the financial discrimination of sex workers impact community? And I’m thinking specifically around the destruction of mutual aid and organizing.
Lorelei Lee: Yeah. So I mean, as Mahsa and Affi have already talked about, there are such broad and multiple effects of this kind of discrimination. And, you know, we’re talking about everything from being excluded from public places, like the Marriott not allowing people they profile as sex workers to be in their bars, or Uber driving suspected sex workers to police stations, to the direct impact of not being able to use an app on which you would screen clients, such as a dating app, which would be a harm reduction tactic for people in the sex trades, so that your work becomes more dangerous, or of not being able to take money from clients.
And so your economic precarity becomes more precarious. And then there are the community effects that Blunt has mentioned, which is that, you know, sex workers are just a group of people who are facing trauma and precarity all the time. And there really aren’t good resources provided to us socially, whether from family networks or the government or civil society at large; there’s just not anything. So what we’ve done is create these networks, and create them across the country, oftentimes using apps like Twitter, which continues to try to shut down our accounts. And yet somehow we have at least some of this in place, where we then are able to say, okay, who needs money this week? Who’s struggling this week? And we can try to redistribute some of our funds and do this mutual aid work.
However, in order to do that, we have to be like, well, my Cash App is shut down this week, but maybe I can find someone who has a PayPal who can, you know, then switch it to their Venmo, then switch it to Cash App so that they can get it to you. And as I was saying to Rachel yesterday, it just feels like we’re passing the same $50 around the country all the time, you know, struggling through this maze to try to do that. So that is very frustrating. And I think it points to something that I would describe as a systemic effect. So we have the individual impact, we have the community impact, and then we have the systemic impact. And the systemic impact is that our community networks can never get more resourced, can never become stronger, because we’re constantly being shut out of financial systems as well as public dialogue. And so we can’t be visible online, which means we can’t tell our true stories online, which means the narratives that folks get about sex work are very specific and very narrow and often build a myth around all of us, which means that we are then viewed through that narrow lens and constantly seen in a stigmatizing, dehumanizing, profiling kind of way. And because of that, when we actually get a chance to use our voices, people don’t take us seriously.
And just to go back to Facebook specifically, and then I will wrap up, I know we’re nearing the end: Facebook has a policy that you can’t talk about sex work unless you are talking about experiences of violence. And this is exactly what I’m talking about. When you only allow one narrative to be the narrative that people hear, then you stop seeing us as complex and complete people. And I think I actually have to say this, which is absurd, but I know when I’ve had these conversations before, people think that I’m trying to downplay experiences of violence. I am a trafficking survivor myself. I don’t believe that our experiences of violence are unimportant, but I think that if you hear them only in a two-dimensional way, without hearing anything else about our lives, you stop seeing us as human beings. So that’s like- (laughing) I just think these impacts are so broad, and the oppression that comes at that systemic level prevents us from doing the kind of organizing that we are trying to do in order to push back against that very oppression.
Danielle Blunt: Thank you so much, Lorelei. And I think what you’re talking about, that when we’re not able to share our own stories, another narrative is created in that space, is also a really good point of intersection between what’s happening with these two communities and with internet shutdowns as well. So I want to give a minute for everyone to share a last thought, and I’ll start with Mahsa as we close out.
Mahsa Alimardani: I mean, I’m just kind of constantly listening to Lorelei and learning, because they frame things so well, the bigger picture of what we’re doing, and I completely agree. And there is, you know, there is this inherent discrimination.
It’s discrimination against the Middle East region. It’s discrimination against sex workers. And I think, you know, the stronger the voices of these communities, the better. There’s some momentum right now, you know, with the focus on content in the Middle East, and if there’s any way to intersect and do the movement building across these communities, I feel like these changes might be stronger and more profound.
Danielle Blunt: Thank you. Affi, did you have any last thoughts?
Afsaneh Rigot: Yeah, I think I echo Mahsa in saying that, you know, Lorelei frames everything so beautifully and perfectly that I think most people listening should just remember those last words. But I just wanted to add the importance of connecting the work that’s being done on these issues of discrimination and censorship across communities, from sex workers to communities in the Middle East to folks affected by terrorism designations.
And as Blunt mentioned, these aren’t distinct groups; we’re all overlapping and intersectional, amongst and within each community. One thing I wanted to mention is a little note about how banking systems also use this notion of reputational risk, predominantly around sex practices and anti-terror and high-risk profiles. And reputational risk is such a broad and flimsy concept. So, when asked how OnlyFans comes in: banks block folks that have a connection to OnlyFans under this notion of reputational risk. And finally, if you’re looking at the issue around Thomson Reuters and their huge data brokering, look up the different situations that have come up in terms of the blocking, under terrorism associations, of everyone from the Palestine Solidarity Campaign to the Finsbury Park Mosque. So one of the things I hope we can do is hold these seemingly neutral data brokers that are festering amongst us accountable for the work they’re doing and the data (inaudible) Thomson Reuters and LexisNexis also, obviously, notoriously sold a whole bunch of information on immigrants, undocumented folks, and refugees in the U.S. to ICE. So that’s another thing in our campaign. (laughing) To add to it anyway.
Danielle Blunt: Thank you Affi. And if you want to follow any of our work, you can follow at Hacking//Hustling, hackinghustling.org, and at article19.org. And Lorelei, I would love to offer you the one minute we have remaining for any final thought.
Lorelei Lee: Oh, I just wanted to say thank you all so much. I mean, I also learned so much from listening to you Mahsa and Affi, and I’m just like really thrilled that we have had the opportunity to bring our knowledges together and keep doing it. Yeah.
Danielle Blunt: Thank you so much. And I look forward to continuing the conversation with you all, and thank you so much to everyone who tuned in to join for this session.
INGRID: So today, we’re gonna talk about platform surveillance. And, in general, what we’re kind of focused on is both the ways that platforms surveil and the forms of state surveillance that utilize platforms.
So, this is sort of an outline of where we’re going today. First, we’ll do a little bit of clarifying of some terms that we’re gonna use, those two terms being state surveillance and surveillance capitalism. And then we’ll talk a little bit about mitigation strategies for both.
What is surveillance capitalism?
So, clarifying some terms. We ‑‑ we wanted to kind of make sure we were clear about what we were talking about. So surveillance capitalism is a term used to describe sort of a system of capitalism reliant on the monetization of personal data, generally collected by people doing things online.
What is state surveillance?
State surveillance is a tactic used by governments to intimidate and attack. It can be digital? It can also be IRL. For many years, it was just people following people, or decoding paper messages. It’s more sophisticated now, but it takes many different forms.
And surveillance capitalism kind of emerges in part because of the demands of state surveillance. So in the early 2000s, after 9/11, the existence of a military industrial complex that had funded the development of surveillance technologies was already well established, but was amped up in some ways by the demands of the global war on terror. And there are, you know, deep interconnections between the history of Silicon Valley and the history of the military industrial complex.
The state will utilize the resources of surveillance capitalism. But we wanted to make it clear, like, surveillance capitalism’s negative impacts and risk to personal safety are not solely confined to state violence. They can perpetuate interpersonal violence; they can, you know, create greater harms for your ability to kind of, like, find a place to live, or change your mobility. And those are ‑‑ you know, those two distinctions, we just wanted to kind of clarify as we go into this.
And I feel ‑‑ like, part of me wants to believe this is kind of already taken as a given, but it’s good to kind of have it said: Neither surveillance capitalism nor state surveillance exist independent of racism and the legacy of the transatlantic slave trade. Sometimes ‑‑ I found when I first started getting into this space, in 2012, 2013, that was kind of an angle that was kind of neglected. But you don’t have an Industrial Revolution without transatlantic slave trade, and you don’t have surveillance without the, you know, need to monitor people who are considered, you know ‑‑ property, basically. I would highly recommend Simone Browne’s Dark Matters as an introduction to the role of Blackness in surveillance. But, yeah. Just wanted to raise that up, and be really honest about who is affected by these systems the most.
And… also wanted to reiterate something that we said on the first day of this series, which is: nothing is completely secure… and that’s okay. We’re learning how to manage risk instead. Right? The only way to be completely safe on the internet is to not be on the internet. Connection, you know, involves vulnerability by default. And… the work that we’re all trying to do is find ways to create space. To mitigate or, like, understand the risks that we’re taking, rather than just assuming everything’s a threat and closing off. Or saying, well, I’m doomed anyway; doesn’t matter.
What is threat modeling?
In the world of security studies and learning about, you know, building a security ‑‑ like, a way of thinking about keeping yourself safe, a term that comes from that space is threat modeling. Which is a practice of basically assessing your potential vulnerabilities with regard to surveillance. We’re not going to be doing threat modeling, per se, in this presentation. There are some really great resources out there that exist on how to do that that we can ‑‑ we’re happy to point you to. But we wanted to raise it up as something that can help in approaching managing risk and mitigation, because it’s sort of inventorying your own, you know, circumstances and understanding where you are and aren’t at risk, which can make it a little less overwhelming.
All right. So the concept of state surveillance, for this presentation, we’re going to be talking about it on and using platforms, which we talked about yesterday. There’s all kinds of other ways? (Laughs) That, you know ‑‑ as I said earlier, that the state can engage in surveillance. We’re, right this second, just gonna focus on platforms. We might be ‑‑ if there are questions specifically about non‑platform things, maybe in the Q&A part, we could talk about those, if there’s time.
What are platforms and why do they matter?
So, platforms are companies. I think we’ve said this a lot over the last three days. And what that generally means is that platforms have to, and will, comply with requests from law enforcement for user data. They don’t have to tell anyone that that happens. Some of them do; some big companies do. These are from ‑‑ the one on the top is Twitter’s annual transparency report, and the one below is Facebook’s. And these are just graphs, visualizations they made of government requests for user data. But again, this is almost a courtesy? This is something that’s kind of done maybe for the brand, not necessarily because they have any obligation. But… it’s also just a reminder that they can’t actually necessarily say no to, like, a warrant. This also applies to internet service providers, like Verizon; mobile data providers; hosting services. Companies have to do what the law tells them, and most of the internet is run by companies.
Next slide… So, governments don’t really always have to ask platforms to share private data if there’s enough publicly available material to draw from. The method of using publicly accessible data from, you know, online sources is sometimes called open source investigation, in that the method is reproducible and the data is publicly available. When Backpage still existed, that was more or less how cops would use it to conduct raids. One older example of this: in 2014, the New York Police Department conducted a massive raid on a public housing project in Harlem to arrest suspected gang members. It was called ‑‑ oh, shoot. It had like some terrible name… Operation Crew Cut. That’s what it was called. (Laughs) And much of the evidence used in the raid was culled, basically, from cops lurking on social media, interpreting slang and inside jokes between teenage boys as gang coordination. Some of the young men ‑‑ I think they were as young as 16 and as old as 23 ‑‑ who were caught in that raid are still serving sentences. Some of them were able to challenge the case and be let out, but it was still a pretty terrible process.
A more recent example of police using publicly available data is this one on the left, from June. The woman in this photo was charged with allegedly setting a Philadelphia police vehicle on fire. And the police were able to figure out who she was based on a tattoo visible in this photo ‑‑ which you can’t really see in this image because it’s quite small; I couldn’t get a bigger one ‑‑ based on a T‑shirt she had previously bought from an Etsy store, and based on previous amateur modeling work on Instagram. So, you know, maybe only a handful of people had bought that one Etsy shirt, and they were able to kind of match her to these other images online, out in public.
What is open source investigation and why does it matter?
I want to note briefly that open source investigation, or digital investigation using publicly available resources, isn’t inherently a bad thing or technique. It’s research. Activists and journalists use it to identify misinformation campaigns and human rights violations when it’s not safe for them to be on the ground. I’ve used it in my own work. But you know, the point is don’t make it easier for the police to do their job.
What is metadata and why does it matter?
The next slide… So another source of information that can be pulled from publicly available sites, besides just reading the material and the images, is metadata. And “metadata” is sort of just a fancy word for data about data. One way this sometimes gets described is: if you have a piece of physical mail, the letter is sort of like the data, but the envelope ‑‑ who it’s mailed to, where it was mailed from, things like that ‑‑ that’s the metadata. That’s the container that has relevant information about the content.
So in an image, metadata is encoded into the file with information about the photo. These are some screenshots of a photo of my dog, on my phone. (Laughs) She briefly interrupted the session yesterday, so I thought I’d let her be a guest in today’s presentation. And if I scroll down on my phone and look further below in the Android interface, I can see that it has the day and the time that the photo was taken, and then what I’ve blocked out with that big blue square right there is a Google Maps embed with a map of exactly where I took the picture. You can also see what kind of phone I used to take this picture, where the image is stored in my folders, and how big the image file is. These are examples of metadata. That, combined with the actual information in the image, like a T‑shirt or a tattoo, is all really useful for law enforcement. And metadata is something that is stored ‑‑ if you use, like, Instagram’s API to access photos, you can get metadata like this from the app.
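To make the idea concrete: in JPEG photos, camera metadata (timestamps, device model, GPS coordinates) lives in an EXIF block inside the file. Here's a minimal, stdlib-only Python sketch, not mentioned in the talk, that checks whether a JPEG file carries such a block; the file path is a placeholder.

```python
def has_exif(path):
    """Return True if a JPEG file contains an EXIF (APP1) segment,
    i.e. embedded metadata such as timestamps, device model, or GPS."""
    with open(path, "rb") as f:
        data = f.read(64 * 1024)  # EXIF lives near the start of the file
    # JPEG files start with the SOI marker FF D8; an EXIF block is an
    # APP1 segment (marker FF E1) whose payload begins with b"Exif\x00\x00".
    if not data.startswith(b"\xff\xd8"):
        return False
    return b"\xff\xe1" in data and b"Exif\x00\x00" in data
```

A dedicated EXIF library would parse the individual tags; this sketch only detects that the metadata block exists at all, which is often all you need to know before sharing a file.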
Is it possible to remove metadata from your phone’s camera?
OLIVIA: So, surveillance capitalism! Really big ‑‑ oh, there’s a Q&A. Is it possible to remove metadata from your phone’s camera?
INGRID: So there’s two things that you can do. One is ‑‑ I think that you can, in your settings, like you can disable location? Being stored on the photos? Depending, I think, on the make and model. Another thing, if you’re concerned about the ‑‑ you know, detailed metadata… you can ‑‑ like, taking a screenshot of the image on your phone is not gonna store the location that the screenshot was taken. It’s not gonna store ‑‑ like, it might ‑‑ I think that the screenshot data might store, like, what kind of device the screenshot was taken on. But given ‑‑ but that doesn’t necessarily narrow it down in a world of mostly iPhones and, you know, Samsung devices and Android devices. Like, it could be ‑‑ it’s like a bit less granular. Yeah.
OLIVIA: Awesome.
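As a footnote to that exchange: because JPEG metadata lives in its own segments, those segments can be dropped without touching the image data itself. Below is a simplified, stdlib-only Python sketch of that idea, written for this transcript as an illustration, not a replacement for a vetted metadata-stripping tool or the settings changes Ingrid describes.

```python
def strip_exif(jpeg_bytes):
    """Return JPEG bytes with APP1 (EXIF/XMP) segments removed.

    A toy illustration: each JPEG segment starts with an FF marker byte
    followed by a two-byte big-endian length, so we can walk the segments
    and simply skip the ones where metadata is stored.
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")  # keep the start-of-image marker
    i = 2
    while i < len(jpeg_bytes) - 1:
        marker = jpeg_bytes[i + 1]
        if jpeg_bytes[i] != 0xFF or marker == 0xDA:
            # Start of scan (0xDA): the compressed image data follows,
            # so copy everything from here through unchanged.
            out += jpeg_bytes[i:]
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker != 0xE1:  # keep every segment except APP1, where EXIF lives
            out += jpeg_bytes[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

Real files have edge cases (multiple APP1 blocks, metadata in other segments), which is why purpose-built tools are safer in practice.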
Surveillance Capitalism: How It Works and Why It Matters
So, surveillance capitalism! I don’t know if you guys have noticed… I’ve been seeing them a lot more often, but some advertisements in between YouTube videos are just kind of multiple choice questions? Some of them ask how old you are; some of them might ask if you’ve graduated from school yet; et cetera. So like, in what world is a single‑question survey a replacement for, say, a 30‑second advertisement for Old Spice deodorant?
Our world! Specifically, our world under surveillance capitalism. So, to go further on Ingrid’s initial definition, surveillance capitalism occurs when our data is the commodity for sale on the market. And it’s usually, almost always, created and captured through companies who provide free online services, like Facebook, Google, YouTube, et cetera.
We can’t really know for sure how much our data is worth? There’s no industry standard. Because, at the end of the day, information that’s valuable for one company could be completely useless for another company. But we do know that Facebook makes about $30 every quarter off of each individual user.
What are data brokers and why do they matter?
But social media sites aren’t the only ones with business models designed around selling information. We also have data brokers. If we go back to the private investigator example from the state surveillance section ‑‑ thinking about the tools at their disposal and the level of openness you have online ‑‑ a single investigator could find out a lot of things about you: where you’ve lived, your habits, what you spend money on, who your family members and romantic partners are, your political alignments, your health status, et cetera. And that’s one person.
But imagine that rather than searching through your data and piecing together a story themselves, they actually just had access to a giant vacuum cleaner and were able to hoover up the entire internet instead. That is kind of what a data broker is!
I made up this tiny case study for Exact Data. They’re a data broker, and they’re in Chicago.
And Exact Data has profiles of about 240 million individuals, with about 700 data elements associated with each of them. So some of the questions you could answer, if you looked at a stranger’s profile through Exact Data, would be their name, their address, their ethnicity, their level of education, whether they have grandkids, whether they like woodworking. So it goes from basic data to your interests and what you spend time on.
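To make the "vacuum cleaner" idea concrete, here's a toy Python sketch, with all names and fields invented for illustration, of the basic move a broker's pipeline performs: merging records from unrelated sources into one profile by joining on a shared identifier, such as an email address.

```python
# Two unrelated data sources, each keyed by the same identifier (an email).
# All of this data is invented for illustration.
purchases = {"jane@example.com": {"bought": ["woodworking kit"], "store": "hobby site"}}
public_records = {"jane@example.com": {"name": "Jane Doe", "city": "Chicago"}}

# Merge every source into a single profile per identifier.
profiles = {}
for source in (purchases, public_records):
    for identifier, fields in source.items():
        profiles.setdefault(identifier, {}).update(fields)

# One join later, shopping habits and identity sit in the same record.
```

With hundreds of sources instead of two, this same join is how a profile grows to hundreds of elements per person.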
So, potential for harm. You get lumped in algorithmically with a group or demographic when you would prefer to be anonymous. Your profile may appear in algorithmic recommendations because of traits about yourself you normally keep private. The advertisements you see might be reminders of previous habits that could be triggering to see now. And it’s also a gateway for local authorities to obtain extremely detailed information about you. I don’t know if Ingrid has any other points that might be potential for harm.
How to Mitigate Harm
But, luckily, there are ways to mitigate. Right? You can opt out of this. Even though it’s pretty hard? But if you remember from our first workshop where we talked about how websites collect data from us, we know that it’s captured mostly using scripts: trackers, cookies, et cetera. So you can use a script blocker! Also, reading the terms of service, it will probably mention the type of data an online service collects and what it’s for. They don’t always, but a lot of them do. So if you read it, you’re able to have a bit more agency over when you agree to use that service or not, and you might be able to look for alternatives that have different terms of service.
Privacy laws in the United States are pretty relaxed when it comes to forcing companies to report things. So, one tip is to try setting your VPN location to a country that has stronger privacy laws. Then you might get a lot more banners about cookies and other trackers, which companies are required to disclose to you if you appear to live somewhere that’s not here.
You can also contact data brokers and ask to be put on their internal suppression list. And a lot of brokers have forms you can fill out online requesting that. But the only issue is that this is really hard? Because there are a lot of data broker companies, and we don’t actually know how many more there are, because this is an industry that’s pretty unregulated.
Another mitigation strategy is creating, essentially, a digital alter ego that’s difficult to trace to your other online accounts. So if you are behaving in a way that you don’t want to be conflated algorithmically with the rest of your life, you can create separate online profiles using different e‑mail addresses, potentially using them in other places, and just creating as much distance between you and one aspect of your life and you in the other aspect of your life, and compartmentalizing in a way that makes it difficult to connect the two of you.
And then of course you can use encrypted apps that don’t store metadata or actual data. This could include messaging apps, but this could also include… word processors like CryptPad; it could include video conferencing; it could include a lot of different apps.
So, to wrap everything up: Over the past three days…
Wrapping Up the Digital Literacy Series
INGRID: So, I guess we wanted to try and provide some wrap-up, because we covered a lot of things in three days, and that was a very broad version of a very deep and complicated subject. But we went through the foundational depths of the internet ‑‑ how it’s made, the actual technical aspects of how it works; the platforms that are built atop that foundation and extract value out of it; and the systems of power that incentivize those platforms to exist, and that control and govern how some of that value is used ‑‑ (Laughs) Or misused.
And across all three of these, I had a couple of bigger questions or things to think about that I wanted to put forward. One is that the neoliberal public/private structure of the internet, as an infrastructure that everyone lives with, shapes the way everything else follows. Right? When something that was originally a government‑built property becomes a commodity ‑‑ and then becomes the foundation of how anyone can live in the world ‑‑ it creates a lot of these aftereffects.
And I find internet history really fascinating, because it’s a reminder that a lot of this is very contingent, and it could have gone different ways. Sometimes, people wanted it to go a different way? And it’s worth thinking about what it looks like to build different networks, different services and structures, while living within surveillance capitalism ‑‑ ’cause we haven’t built different internets quite yet, and surveillance capitalism is still pretty big. A big part of taking care of one another and ourselves is taking care with where and how we speak and act online. Which is different from being afraid to say things? It’s more being competent in where and how you choose to speak, to protect yourself and the people you care about.
I think… that’s ‑‑ yeah, that went by really fast! (Laughs)
BLUNT: We’ll just shower you with questions. (Laughs)
How are companies making money off of data?
I have two questions in the chat. Someone says: How is it exactly that companies make money off of our data? Is it just through ads? Are there other processes?
OLIVIA: So, when it comes to making money off of it ‑‑ let’s say you’re a company that sells headphones. And you are tracking data on the people who use your headphones. They buy them, and in order to use them, they have to download an app to their phone. Right? Through that app, you might record things like the songs they listen to, what time of day they listen, how long they use the app, where they are when they’re listening… And you might have this, like, select little package of data about them.
Now, you might find that that’s data a music marketing team ‑‑ the people who do advertising for musicians, I guess? I don’t remember what that job’s called ‑‑ would want. The idea is that different companies collect data that’s useful to other companies in their marketing practices, or in their business practices.
So Facebook collects data that a lot of different companies might want for a myriad of reasons, because the amount of data Facebook kind of siphons from people is so large. And so ‑‑ yeah, does that…? Do any of you guys have something to say around that, about other ways that companies might ‑‑
INGRID: Yeah. I mean, a lot of it bottoms out in ads and market research.
OLIVIA: Yeah.
INGRID: There’s ‑‑ I mean, another place where ‑‑ it’s not the most lucrative source of revenue, I think, insofar as they’re not the biggest buyer? But, like, police departments will buy from data brokers. And there’s no real regulation on whether or when they do that.
So, like, you know ‑‑ information in general is valuable. (Laughs) And, ironically, what’s kind of so fascinating to me about the model of surveillance capitalism is that ads don’t really work. Or they kinda work, but ‑‑ in terms of actually proving that, after I look at a pair of glasses once and then get followed around the internet by the same pair of glasses for two and a half months, I actually bought the glasses? The success rate, I don’t think, is that high. But there is just enough faith in the idea of it that it continues to make lots and lots of money. It’s very speculative.
OLIVIA: I actually saw an article recently that said, instead of advertising ‑‑ if we all paid for, like, an ad‑free internet? It would cost about $35 a month, for each of us, to be able to maintain internet infrastructure and pay for things without advertisements.
If you have an alter ego for privacy, how can you ensure it remains separate? Is facial recognition something to worry about?
OLIVIA: “If you have an alter ego account and a personal account, how do you ensure your online accounts stay completely compartmentalized and aren’t associated through your device or IP address, et cetera?” And then they say, “Is there a way to protect your face from being collected on facial recognition if you post pictures on both accounts?”
INGRID: Yeah. So we didn’t talk about facial recognition. And I kind of assumed that it’s been so heavily talked about in the news that maybe it was already on people’s minds ‑‑ but I also didn’t want to overemphasize it as a risk, because there’s so much information beyond a face that can be used when trying to identify people.
In terms of posting pictures on two different accounts… if they’re similar photos, I think the answer is that your face will be captured no matter what; that’s sort of a given. I don’t know ‑‑ Olivia, can you think of any examples of mitigation of face recognition? I mean, Dazzle doesn’t really work anymore. But something in the same way that people avoid copyright bots catching them on YouTube, by changing the cropping or subtly altering a video file?
BLUNT: I just dropped a link. Have you seen this? It’s from the SAND Lab at the University of Chicago, a tool called Fawkes, and it slightly alters the pixels so that the photo is unrecognizable to facial recognition technologies. I’m still sort of looking into it, but I think it’s an interesting thing to think about when we’re thinking about uploading photos to, like, escort ads or something like that.
OLIVIA: What’s difficult when it comes to facial recognition is that, depending on who the other actor is, they have access to a different level of technology. Like, consumer‑facing facial recognition software ‑‑ the stuff that’s in Instagram face filters, or in the Photo Booth app on your laptop ‑‑ is really different from the kinds of tools that, say, the state would have at their disposal.
So it’s kind of like a different… I don’t know if the word “threat model” is even a good way to phrase it, because we know that like, say, for instance, the New York Police Department definitely has tools that allow them to identify pictures of protesters with just their eyes and their eyebrows.
And so, normally… if I were talking to someone who has, like, two different accounts and is interested in not being connected to both of those accounts because of their biometric data, like their face, I would normally suggest that they wear a mask that covers their whole face, honestly. Because I don’t really know of a foolproof way to avoid it digitally, without actively, like, destroying the file. Like, you’d have to put an emoji over it ‑‑ you’d have to physically cover your face in a way that’s irreversible for someone else who downloads the photo. Because there are a lot of tricks online when it comes to, like, changing the lighting, and putting glitter on your face, and doing a lot of different stuff?
And some of those work on consumer‑facing facial recognition technology. But we don’t actually know how ‑‑ if that even works at the state level, if that makes sense.
So depending on like, who you’re worried about tracking your account… you might just want to straight up cover your face, or leave your face out of photos.
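As a rough illustration of the “slightly alters the pixels” idea mentioned above, here’s a toy sketch in plain Python that makes small, bounded random changes to an image. To be clear, this is not how the actual Fawkes tool works ‑‑ Fawkes optimizes a targeted “cloaking” perturbation against face-recognition feature extractors, which requires a trained model ‑‑ and, as discussed, random noise like this would not defeat a serious recognition system. It only shows the general principle: changes small enough to be invisible to a human while still altering the file.

```python
import random

def perturb(image, epsilon=8, seed=0):
    """Return a copy of `image` (a 2D grid of 0-255 grayscale values)
    with each pixel shifted by at most +/- epsilon.

    Toy illustration only: tools like Fawkes compute targeted
    perturbations against recognition models, not random noise.
    """
    rng = random.Random(seed)
    out = []
    for row in image:
        new_row = []
        for px in row:
            shifted = px + rng.randint(-epsilon, epsilon)
            new_row.append(max(0, min(255, shifted)))  # clamp to valid range
        out.append(new_row)
    return out

image = [[120, 121], [119, 118]]  # a tiny fake "photo"
cloaked = perturb(image)
```

Every output pixel stays within `epsilon` of the original, which is what keeps the change imperceptible to a human viewer.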
What is gait analysis and why is it important?
BLUNT: I wonder, too, do you ‑‑ if you could talk a little bit about gait analysis, and how that’s also used? Are you familiar with that?
INGRID: I don’t ‑‑ I don’t know enough about gait analysis…
OLIVIA: I know that it exists.
INGRID: Yeah. And this is another thing where, in trying to figure out what to talk about for this session, we had to figure out: what are the things where we actually know where the risks are, and what are the things that may exist, but that we can’t necessarily locate?
OLIVIA: I have heard of resources for people who are interested ‑‑ like, for high‑risk people who are worried about being identified via gait analysis? And gait analysis is literally being identified by the way that you walk, and the way that you move. And there are people who teach workshops about how to, like, mess with your walk in a way that makes you not recognizable, and how to practice doing that.
BLUNT: It’s fascinating.
Does it matter if you use popular platforms in browsers or apps?
INGRID: “If you use popular platforms like Facebook and Instagram in browsers instead of apps, does that give you a little more control over your data, or does it not really matter?”
I ‑‑ so, Olivia and Blunt, if you have other thoughts on this, please jump in. I mean, my position is that it kind of doesn’t matter, insofar as the things that Facebook stores about you are things you do on Facebook ‑‑ it’s still tied to an account. It’s not just, you know, passive trackers that you could maybe use a script blocker on, and that’s cool? The things you like, and the things you click on, on Facebook in the browser, are still going to be stored in a database attached to your profile. So I don’t think it necessarily changes the concerns either way. But.
BLUNT: I’m not totally ‑‑ I have also heard things about having the Facebook app on your phone, that it gives Facebook access to more things. Like, the terms of service are different. I’m not totally sure about it. I don’t have it on my phone.
INGRID: That’s actually ‑‑ that’s a good point. I apologize. I guess it also depends on what you’re concerned about. So, one thing that Facebook really likes having information on, for individual users, is who else they might want to be Facebook friends with. Right? The “People You May Know” feature, I once read, uses up a very large percentage of Facebook’s compute infrastructure, because connecting people to other people is really, really hard. And the Facebook app being on your phone does give it the opportunity to be opened up to your phone contacts, and to the places you take your phone. Which can expand the network of people it thinks might be in your proximity, or in your social network. Because if a phone number in your phone has a Facebook account attached, maybe they’ll say: oh, you probably know this person!
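A crude sketch of the mechanism described here ‑‑ with invented names and numbers ‑‑ goes like this: once a phone’s contact list is uploaded, any account registered with a number in that list becomes a candidate suggestion. The same lookup is also how an alt account tied to a real phone number can surface to people who know you:

```python
# Hypothetical data: phone numbers in each person's uploaded contacts.
uploaded_contacts = {
    "alice": {"+15550001", "+15550002"},
    "bob":   {"+15550003"},
}
# The platform's mapping from phone number to account holder.
number_to_account = {
    "+15550001": "carol",
    "+15550002": "dave",
    "+15550003": "alice_alt",  # an alt account sharing a real number
}

def people_you_may_know(user):
    """Suggest accounts whose numbers appear in `user`'s contacts."""
    suggestions = set()
    for number in uploaded_contacts.get(user, set()):
        account = number_to_account.get(number)
        if account and account != user:
            suggestions.add(account)
    return suggestions

print(sorted(people_you_may_know("alice")))  # ['carol', 'dave']
print(sorted(people_you_may_know("bob")))    # ['alice_alt']
```

Note the second lookup: `alice_alt` gets surfaced to Bob purely because the alt account was registered with a number already in Bob’s contacts ‑‑ no activity on the alt itself was needed.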
In 2017, Kashmir Hill and Surya Mattu did a feature for Gizmodo on how it works, inspired by Kashmir getting recommended a friend on Facebook who was a long‑lost relative from her father’s previous marriage, or something like that ‑‑ someone there was no way she would have otherwise met. So her interest partly came out of trying to figure out how Facebook could have possibly made that connection. And Facebook wouldn’t tell them! (Laughs) Because the “People You May Know” feature is also a very powerful tool that makes Facebook an app that people want to use, in theory. They also did some follow‑up stories about sex workers being outed because the “People You May Know” feature was recommending their alt accounts to friends who knew them from other parts of their lives. And there were also examples of therapists and mental health professionals having their clients recommended to them as Facebook friends, and of people who met in, like, 12‑step meetings being recommended to each other as Facebook friends.
So, in terms of app versus browser: Facebook won’t say whether or not the information it gathers from mobile devices goes into its “People You May Know” recommendations. But based on examples like this, it seems likely that it plays a role.
So that doesn’t ‑‑ I guess, in terms of control over your data, like… I think I misunderstood the framing of the question, ’cause I guess it’s more ‑‑ it gives you slightly more control over what Facebook does and doesn’t know about you. Because if Facebook doesn’t know what you’re doing with and on your phone, that’s probably like not a bad idea.
Did that all make sense, or was that…? I don’t know.
How secure is Facebook Messenger? How secure is Instagram?
BLUNT: No, I think that made sense. Someone was asking about the Facebook Messenger app. I think the same thing sort of applies to that, ’cause it’s all connected. I don’t know if anyone has anything else to say about that.
INGRID: This is the part where I admit that I’m not on Facebook. So, I’m actually terrible at answering a lot of Facebook questions, because I don’t…
BLUNT: Yeah, I ‑‑ I also think, like, Instagram is owned by Facebook, so also having the Instagram app on your phone, I feel like, might also bring up some of the same concerns?
INGRID: Yeah. From what I ‑‑ I mean, I do use Instagram, so I can remember that interface slightly better. As my point of comparison, I had sort of a dummy, like, lurker Facebook account for some research a while ago. And the difference between its attempts to connect me and suggest follows, versus Instagram’s attempts ‑‑ Facebook seemed far more aggressive, and, given far less information about me, was able to draw connections that shouldn’t have made any sense ‑‑ that were, like, very accurate. So I think that’s just a good rule: don’t trust Instagram, because don’t trust Facebook. But in my experience, I don’t know that it’s as central to Instagram’s business model.
BLUNT: Yeah. And I think, too, just speaking from my own personal experience, like when I have had Facebook or Instagram, I use like a tertiary alias and lock the account and don’t use a face photo on the app, just so if it does recommend me to a client they’re much less likely to know that it’s me. And like, that has happened on my Instagram account. So. My personal Instagram account.
What is Palantir and why does it matter?
INGRID: Yeah. There are several follow‑ups, but I feel like this question “Can you explain about Palantir?” has been sitting for a while, so I want to make sure it gets answered, and then come back to this a little bit. So ‑‑
OLIVIA: I can explain a little bit about Palantir. So, it’s kind of the devil. We have ‑‑ I think it’s existed for… since, like, 2014? That might be the wrong ‑‑ no, not ‑‑ I think it was 2004, actually!
INGRID: Yeah, no, it’s 2004. I just ‑‑ I accidentally stumbled into a Wall Street Journal article about them from 2009 yesterday, while researching something else, and died a little? It was like, “This company’s great! I can’t see how anything could be problematic.”
BLUNT: It’s 2003, yeah. 17 years.
OLIVIA: It was started by Peter Thiel, who is a really strong Trump supporter and is linked to this really weird, like, anti‑democracy, pro‑monarchy movement in Silicon Valley that’s held by a weird circle of rich people. And they are kind of the pioneers of predictive policing, and have also assisted ICE with tracking down and deporting immigrants. And they actually recently went public ‑‑ hmm?
INGRID: They did? Wow!
OLIVIA: Yeah. They haven’t, like, turned a profit yet, in 17 years. But they were initially funded ‑‑ if I’m getting this right, I’m pretty sure they were initially funded by, like, the venture capital arm of the CIA.
INGRID: Okay, they actually, they haven’t gone public yet, but they are planning for an IPO…
OLIVIA: Soon. Is that it?
INGRID: Yeah.
OLIVIA: Okay.
INGRID: Sorry, just ‑‑ ’cause a thing about this company is that every two years, there is a flurry of press about them maybe doing an IPO, and then they don’t. And… yeah, sorry. So their initial funding partially came from In‑Q‑Tel, which is a venture capital firm run by the CIA that funds companies making products the CIA might need. Keyhole, for example, which was a satellite imagery software company, was initially funded by the CIA; that company was later acquired by Google and became Google Earth. So that’s an example of the kind of thing they fund. It’s stuff like that.
OLIVIA: Oh, and to clarify, it’s like a datamining company. So they do the same kind of thing that the case study that I showed earlier does. But they have ‑‑ they’re really good at it. And they also create tools and technologies to do more of it.
INGRID: So ‑‑ and they’re kind of a good example of the point made at the beginning of this about surveillance capitalism and state surveillance being closely intertwined. Palantir has corporate and government contracts to do datamining services. I think they were working with Nestlé for a while, and Coca‑Cola. They want to provide as many tools to businesses as they do to ICE. And those kinds of services are seen as sort of interchangeable. (Laughs)
I mean, the funny thing to me about Palantir, too, is that ‑‑ it’s not like they’re ‑‑ in some ways, I feel like I’m not even sure it’s that they’re the best at what they do? It’s that they’re the best at getting contracts and making people think they’re cool at what they do?
OLIVIA: They market themselves as like a software company, but they really just have a lot of files.
INGRID: They’re kind of ‑‑ somebody in the tech industry once described them to me as like the McKinsey of datamining? That’s a firm that ‑‑ they work with lots of governments and corporations that do things that seem to just make everything worse, but they keep making money? (Laughs) Is the best way to describe it!
So I think, in terms of explaining about Palantir… they are a high‑profile example of the kind of company that is rampant throughout the tech industry. They’ve had the most cartoonish accoutrements, insofar as, you know, one of their founders is literally a vampire, they took money from the CIA, and their name comes from a Lord of the Rings, like, magical seeing stone. There have been documented instances of them doing egregious things, such as working with the New Orleans Police Department to develop predictive policing tools without an actual contract ‑‑ so without any oversight from the City Council or the Mayor’s Office ‑‑ with the funding for the project coming through a private foundation. But in terms of, like, you personally, in your day‑to‑day life: should you worry about this specific company any more than you would worry about a specific state agency? It’s going to depend on your particular risk levels, but… they’re a tool of the state, not necessarily themselves going to, like, come for people.
OLIVIA: Also ‑‑
INGRID: Oh, literally a vampire? Peter Thiel is one of those people who believes in getting infusions of young people’s blood to stay healthy and young, so. As far as I’m concerned? A vampire.
What are Thorn and Spotlight?
BLUNT: I also just wanted to talk briefly about Thorn and Spotlight. So, Spotlight is a tool used by Thorn, which I believe Ashton Kutcher is a cofounder of? The mission of Thorn is to, quote, stop human trafficking, and what they do is use their technology to scrape escort ads and build facial recognition databases off of those ads. And so I think it’s just interesting to think about the relationship between these tools and how they collaborate with the police and with ICE, in a way that could potentially facilitate the deportation of migrant sex workers, as well.
So, one question here… “Deleting the Facebook or Insta app won’t help, right, because the info on you has already been collected?” Not necessarily. I mean, there won’t be any more data collected? And it’s true that what exists won’t be erased, unless you delete your account ‑‑ and, like, go through the steps to actually‑actually delete it, because they’ll trick you and be like, “Just deactivate it! You can always come back!” Never come back…
But like, yeah ‑‑ as you continue to live your life and go places (although I guess people aren’t going places right now), they won’t have any more material. Yeah.
What data points does Facebook have?
BLUNT: Someone asked: “If you have an existing Facebook account that only has civilian photos and you haven’t touched it for four years, it only has those data points, right?” I think that’s a good follow‑up to the previous question.
INGRID: Yeah, that’s true. Well ‑‑ there’s also people you know who have Facebook accounts? And like, Facebook has this internal mechanism for, like, tracking non‑Facebook users as, like, air quote, like, “users,” or as like entities that they can serve ads to. And generally, it’s based on, like, those people being, like, tagged in other people’s images, even if they don’t have an account, or if they have a dead account. Like, if somebody ‑‑ if you have like a four‑year‑old untouched Facebook account, and somebody tags a contemporary photo of you, like, those data points are connected.
So, you know. Whatever other people do that could connect back to that original account, or whatever ‑‑ yeah. Whatever followers or friends you have on it… Updates that they produce could kind of be new data points about you.
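To picture what “new data points” means here (all names hypothetical): a profile can be modeled as a growing list of dated records, and a tag by someone else appends to that list whether or not the account owner is active:

```python
from collections import defaultdict

# Each account's data points; a dormant account starts with old ones.
profile_data = defaultdict(list)
profile_data["dormant_user"] = [("photo", 2016), ("like", 2016)]

def tag_in_photo(tagged_account, year):
    """A friend tagging you attaches a fresh data point to your profile,
    regardless of whether you've touched the account yourself."""
    profile_data[tagged_account].append(("tagged_photo", year))

tag_in_photo("dormant_user", 2020)  # someone tags a contemporary photo
print(profile_data["dormant_user"])
# [('photo', 2016), ('like', 2016), ('tagged_photo', 2020)]
```

The four-year-old account now carries a 2020 data point it never generated itself, which is the sense in which “untouched” doesn’t mean “frozen.”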
Can fintech platforms connect your pay app accounts to your social accounts?
“In terms of fintech, how easy is it for companies to link your pay app accounts to social accounts?” Blunt, you might have a more comprehensive answer on this than I will.
BLUNT: Okay… Let me take a look. So, I think that there are, like, databases built off of escort ads that then get shared on the back end, to sort of facilitate a network‑type effect of de‑platforming sex workers. A lot of people report ‑‑ and some of the research that we’re doing sort of confirms this ‑‑ that once you experience de‑platforming or shadowbanning on one platform, you’re significantly more likely to experience it, or lose access, on another. So, as a form of harm reduction, and to keep access to those financial technologies, I suggest having, like, a burner e‑mail account that you only use for that, that’s not listed anywhere else publicly, and that sounds vanilla and civilian.
That way, if they’re scraping e‑mail addresses from escort ads and then selling that data to financial technologies, your e‑mail won’t be in there. It’s just adding one layer of protection. It might be connected in some other ways, but… it’s sort of a form of harm reduction.
I don’t know if that answered… that question.
And I’m looking right now for resources specifically on protecting community members in regards to ICE, centering trans sex workers. I know that Red Canary Song does some work around this, specifically with massage parlor workers, and I’m looking up the name of the organization of trans Latinx sex workers in Jackson Heights. So I will drop that link so you can follow their work, as well.
And please feel free to drop any other questions in the chat, even if it wasn’t covered. We’ll see if we can answer them, and this is your time. So, feel free to ask away.
(Silence)
Tech Journals or Websites to Follow
INGRID: “What tech update journals or websites do we follow to stay up to date?” Oh! I want to know more about what other people do, too. I tend to, like ‑‑ I tend to follow specific writers, more than specific outlets, partly because… like, there are freelancers who kind of go to different places. But also, I kind of value seeing people who, like, have developed expertise in things. So… Julia Carrie Wong at The Guardian is someone I read a lot. Melissa Gira Grant, at The New Republic. (Laughing) Is not always writing about tech, but is probably one of the smartest and sharpest and most integrity‑filled writers I know.
BLUNT: Definitely one of the first to cover FOSTA‑SESTA, for sure.
INGRID: Yeah. Yeah. And… I’ve been… Let’s see. Motherboard, in general, Jason Koebler and Janus Rose, are very well‑sourced in the industry. So I usually trust things that they cover and find. Caroline Haskins is a young reporter who used to be at Motherboard and is now at BuzzFeed and does excellent work, along with Ryan Mac. And Kashmir Hill, who is now at The New York Times, but has also written for Gizmodo and others. And who else… Davey Alba is also with The New York Times, and is really great. Those are mine.
BLUNT: I sign up for Casey Newton’s daily newsletter? And I just read that to stay up to date on certain news, and then research more. I know that the Internet Freedom Festival also has good updates, and I’m happy to drop links to other weekly mailing lists that I sign up to, as well.
OLIVIA: Oh, I was muted! Oops. I listen to a lot of podcasts. And the, like, Mozilla IRL podcast was really good, for me, in terms of learning more about the internet, and specifically surveillance infrastructure. And they have a lot of episodes ‑‑ so if you’re, like, idling, and you have time to listen rather than read. Mozilla also publishes their transcripts, which is pretty nice.
Can browser bookmarks be accessed?
INGRID: “Is there any way for bookmarks on my browser to be accessed?” I believe the answer to that is going to depend on the browser. So, in the case of a browser like Chrome, which is owned by Google: if you are logged into your Google account as part of your browsing, I think all of that information gets associated with your Google account. So Google will have that data. In terms of access beyond that, I’m not sure.
And then I think for other browsers, I don’t believe that that would be something that’s stored on Firefox. I’m not sure about Microsoft Edge. Olivia, do you have any other thoughts on that one?
OLIVIA: I, I don’t know…
How secure and safe is Zoom?
INGRID: Talking about safety of Zoom! Okay. Yeah. We talked a little bit about this yesterday, I think. Zoom is, you know, software that was made for workplace calls, and is designed for that setting. Which means ‑‑ (Laughs) In some ways, it is inherently a workplace surveillance tool. It’s not secure in the sense that ‑‑ I mean, first of all, this call is being recorded, and Zoom calls can be broadcast to a livestream, like this one is. But also, the actual calls aren’t end‑to‑end encrypted. Public meetings can kind of just be logged onto; there can be some URL hacking. There are different settings you can use in terms of letting people in or out of meetings. But at the end of the day, Zoom has access to the calls! (Laughs) And how much you trust Zoom with that material is, you know, at your discretion.
I… (Sighs) I mean, in general, like… When it comes to, like ‑‑ like, Zoom calls are not where I would discuss, like, sensitive topics, or anything I wouldn’t want to have on the record. And that’s generally just the protocol I take with it. And I think ‑‑ I mean, that being said, like, yeah. There are… it is in such common use at this point, in terms of like spaces for events, like this one! That I won’t, like, kind of outright boycott it, simply because it’s become such a ubiquitous tool. But I think compartmentalizing what I do and don’t use it for has been helpful.
BLUNT: And so if you’re interested in staying up to date with EARN IT, I would suggest following Hacking//Hustling and Kate Villadano. I can drop their handle. And also, on July 21st, we’ll be hosting with our legal team… a legal seminar, sort of similar to this with space to answer questions, and we’re providing more information as to where EARN IT is and how you can sort of plug in on that.
Is there government regulation surrounding data tracking?
INGRID: “Is there government regulation of data tracking, or not so much?” Not so much! Yes! In the United States, there’s very little regulation.
So, one of the reasons that if you use a VPN and set it to a place like Switzerland, you get a lot more information about what tracking is happening ‑‑ and you can make requests for your data from platforms ‑‑ is a European Union regulation called the GDPR, the General Data Protection Regulation. The United States does not have an equivalent to that. Because the European Union is such a large market, I have seen some companies just unilaterally become GDPR‑compliant for all users, simply because it’s easier than maintaining a GDPR version and an “other places” version. But when it comes to Facebook, or Instagram, or other large platforms ‑‑ they don’t really have an incentive to create conditions where they collect less data. So there, it’s kind of like: well, sorry. It’s gonna be that way. (Laughs) Yeah.
And I think it is a thing that lawmakers have interest in? But part of the challenge is, both, that these companies are very well‑funded and will lobby against further regulation? And also that a lot of people in Congress are… old! And bad at computer? And sometimes have trouble wrapping their heads around some of the concepts underlying this. And, you know, the general atmosphere and attitude of “free markets solve problems!” kind of further undermines the pursuit of regulation.
What exactly contributes to shadowbanning?
“In terms of social media accounts following your activity, based on your research so far for shadowbanning et cetera, who do you follow and… to certain lists?” I think, Blunt, this question might be for you, because of the research.
BLUNT: Yeah. I think it’s less about who you follow, and more about who you interact with. So, like, we’re still analyzing this research, but there seems to be a trend that, like, if ‑‑ if you’re shadowbanned, the people that you know are more likely to be shadowbanned, and there might be some relationship between the way that you interact and the platform? But we’re still figuring that out? But just like one thing you can try and ‑‑ we talked about this in another one, but having a backup account for promo tweets, so your primary account with the most followers doesn’t exhibit quote‑unquote “bot‑like activity” of automated tweets. And just having, like, casual conversation about cheese, or nature…
(Laughs) We’re not totally sure how it works.
Oh, and also! I believe her name is Amber ‑‑ I’m going to drop the link to it. But someone is doing a training on shadowbanning, and it seems like we’re collecting data on like multiple accounts. And it seems like there’s some interesting things to say. So I’m going to go grab a link to that. If you’re interested in learning more on shadowbanning, that’s on the 25th, at like 5:00 p.m. I think. So I’ll drop a link.
And just for the livestream: So, this is with Amberly Rothfield, and it’s called Shadowbanning: Algorithms Explained, on July 25th at 6:00 p.m. Still a few spots left. And it looks like she was pulling data on different accounts and playing with the accounts and analyzing the posts’ interaction and traction. So that should be really interesting, too.
Cool. Thank you so much for all of the amazing questions. I think we’ll give it a few more minutes for any other questions that folks have, or any other resources we can maybe connect people with, and then we’ll log off!
Can you request your information from datamining companies?
BLUNT: Oh. Someone’s asking, can I request my information from datamining companies?
OLIVIA: Yes, you can! Yes, you can. And a lot of them… Let me see if I can find a link? ‘Cause a lot of them have, like, forms, either on their website or available where you can make requests like that. You can request to see it, and I’m pretty sure you can also ‑‑ I know you can request that they stop collecting it and that they get rid of your file. But I think you can also request to see it.
BLUNT: I also just dropped a link to this. This is an old tool that I’m not sure if it still works, but it’s Helen Nissenbaum’s AdNauseam, which clicks every single advertisement, as well as tracking and organizing all of the ads targeted at you. It’s really overwhelming to see. I remember looking at it once, and I could track when I was taking what class in school, when I was going through a breakup, just based on my searches and my targeted ads.
Cool. So, is there anything else you want to say before we log off?
INGRID: I mean, thank you all so much for participating. Thank you, again, Cory, for doing the transcription. And… Yeah! Yeah, this has been really great.
OLIVIA: Hi, everyone. My name’s Olivia. My pronouns are she/her. Co‑facilitating with Ingrid. And some of the values that this particular digital literacy/defense workshop will be centered in include reframing cyber defense, less as a form of military technology, right? Reframing cryptography as more of an abolitionist technology. Right? And cyber defense as an expression of mutual care and a way of accumulating community‑based power. And in that way, also thinking of ways to teach this type of material in ways that are antiracist, but also anti‑binary and pro‑femme.
And so, we’re really ‑‑ we really care a lot about making sure that this is trauma‑informed and teaching from a place of gentleness, considering the previous digital harm people have experienced and trying not to relive it. So if you need to take a break, remember that this is being recorded and posted online so you will be able to access it later.
INGRID: Great. Thank you, Olivia. My name’s Ingrid. I use she/her pronouns. And welcome back to people who were here yesterday. Today, we are talking about platforms! And in this context, we primarily mean social media sites like Facebook and Instagram. Some of this can be applied to contexts where people buy and sell stuff, but essentially, we’re talking about places where people make user accounts to communicate with each other ‑‑ with more of a focus on the large corporate platforms that many people are on!
There were four sort of key concepts we wanted to cover. There’s a lot in them, so we’ll try to move through them smoothly. First kind of being algorithmic curation and the way that can produce misinformation and content suppression. And some of the laws and legal context that are defining decisions that platforms make. We talked a little bit about this yesterday, but, you know, reiterating again: Platforms are companies, and a lot of decisions they make come out of being concerned with keeping a company alive, more than taking care of people.
What is algorithmic curation and why does it matter?
So we’re going to start with algorithmic curation. And I think there’s a thing also that came up yesterday was a tendency for technical language to kind of alienate audiences that don’t know as much about computers or math, I guess. An algorithm is a long word that ‑‑ (Thud) Sorry. That’s the sound of my dog knocking on my door, in the background.
Broadly speaking, an algorithm is a set of rules or instructions ‑‑ (clamoring) Excuse me. One second. She just really wants attention. I’m sorry you can’t see her; she’s very cute!
But… An algorithm is a set of rules or instructions for how to do a thing. You could think of a recipe or a choreography. The difference between an algorithm used in the context of a platform and an algorithm that contains, you know, ingredients for a recipe is that there is a lot less flexibility in interpretation, and it’s usually applied on a much larger scale.
And the way that a lot of platforms deploy algorithmic curation ‑‑ what algorithmic curation is experienced as ‑‑ is often recommendation algorithms, and algorithms that determine what content is going to show up in a social media timeline.
So I am ‑‑ you know, I have recently been watching Avatar: The Last Airbender on Netflix. I am 33 years old. And… (Laughs) I found that, you know, Netflix wants to make sure that I know they have lots of other things that I might like because I liked that show. Right? And you could kind of think of algorithms as kind of being if this, then that rules. Like, if somebody watches this show, look at all the other people who watched that show and the other shows that they watched, and suggest that, you know, you probably will like those.
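The “if this, then that” co‑watching idea described here can be sketched as a toy recommender. This is a hypothetical illustration, not any platform’s actual algorithm; the watch histories and show choices are made up:

```python
from collections import Counter

# Made-up watch histories, for illustration only
histories = {
    "ingrid": {"Avatar: The Last Airbender"},
    "user_a": {"Avatar: The Last Airbender", "The Dragon Prince"},
    "user_b": {"Avatar: The Last Airbender", "The Dragon Prince", "She-Ra"},
    "user_c": {"Breaking Bad"},
}

def recommend(user: str) -> list[str]:
    """If someone watched what you watched, suggest their other shows."""
    seen = histories[user]
    counts = Counter()
    for other, shows in histories.items():
        if other != user and shows & seen:  # overlapping taste
            counts.update(shows - seen)     # tally their other shows
    return [show for show, _ in counts.most_common()]

print(recommend("ingrid"))  # ['The Dragon Prince', 'She-Ra']
```

Real recommendation systems use far more signals (watch time, ratings, learned embeddings), but the shape is the same: your behavior is matched against everyone else’s, and the overlaps become suggestions.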
And platforms give the rationale for deploying these kinds of algorithms partly just trying to help people? Right? Like, discover things, because there’s so much content, and you’ll get overwhelmed, so we prioritize. What it actually kind of in practice means is trying to keep you using a service. Right? Like, I’m probably going to cancel my Netflix account once I finish Avatar, so. But oh, like no, now I gotta watch The Dragon Prince. Right?
I think… Do I do this part, or Olivia?
OLIVIA: I can do it?
INGRID: Sorry! I couldn’t remember how we split up this section.
OLIVIA: I… So… In early social media, we didn’t really have super complicated algorithms like the ones we do now. You had the, like, find‑your‑friends algorithms that would basically show you the friends of your friends. But the people you followed were mostly the only people whose posts you would see.
But now that we’re able to collect more user data about how you’re using the platform, as well as your activities off the platform, now algorithms are able to become more complicated, because there’s so much more information that they’re able to use.
So some of the things that might be going into your algorithmic curation are listed here. It’s a really long list, and even this long list isn’t exhaustive of everything that might be factoring into the algorithm? ’Cause so few platforms actually disclose what contributes to the stuff that you see and what you don’t see, and who’s seeing your own content and who isn’t. But one thing that we know for sure is that the way that these platforms are designed is specifically in order to make money. And so following that motive, you’re able to kind of map a lot of the predicted behavior of some of them.
And one of the really big consequences of these algorithmic filter bubbles is misinformation. Right? So because we’ve all been inside for the past couple of weeks and months, we’re all really susceptible to seeing really targeted misinformation, because we’ve been online a lot. And so it’s quite possible that more data is being collected about you now than ever before. Platforms make money off of our content, but especially content that encourages antisocial behaviors. And when I say antisocial behaviors, I mean antisocial as opposed to pro‑social behaviors. Pro‑social behaviors encourage a healthy boundary with social media, like light to moderate use. Comforting people! Letting people know that they rock! Right? Cheering people up. Versus antisocial behaviors ‑‑ while they’re much less healthy, they encourage people to use social media like three times as much. Right? People spreading rumors; people posting personal information; people being ignored or excluded, editing videos or photos, saying mean things. Right? And so that makes an environment where misinformation does super well, algorithmically.
Through their design, platforms like Instagram and Twitter especially prioritize posts that receive lots of attention. We see this in how people ask others to “like” posts that belong to particular people so that they’ll be boosted in the algorithm. Right? They prioritize posts that get a lot of clicks and a lot of feedback from the community. And it’s really easy to create misinformation campaigns that take advantage of that.
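The engagement‑first ranking described above can be sketched as a toy scoring function. The posts and weights below are invented for illustration; no platform discloses its real scoring:

```python
# Invented posts and weights, purely illustrative
posts = [
    {"text": "you rock, feel better soon", "likes": 12,  "replies": 2},
    {"text": "inflammatory rumor",         "likes": 240, "replies": 85},
    {"text": "promo post",                 "likes": 30,  "replies": 3},
]

def engagement_score(post: dict) -> int:
    # Guessed weights: replies treated as a stronger signal than likes,
    # since they keep people on the platform longer
    return post["likes"] + 3 * post["replies"]

# Rank the timeline purely by engagement, highest first
timeline = sorted(posts, key=engagement_score, reverse=True)
print([p["text"] for p in timeline])
```

Under any engagement‑only scoring like this, the outrage‑bait post wins the top slot, which is exactly the dynamic that makes misinformation do well.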
OLIVIA: Nice. That was a really quick video from the Mozilla Foundation. But I wanted to clarify that there’s this assumption that people who fall for misinformation are like kinda dumb, or they’re not like thinking critically. And this is like kind of a really ableist assumption, right? In truth, anyone could unknowingly share misinformation. That’s like how these campaigns are designed, right? And there’s so many different forms that misinformation takes.
It could be regular ole lies dressed up as memes; fabricated videos and photos that look super real, even though they’re not; performance art and, like, social experiments? (Laughing) Links to sources that don’t actually point anywhere? And it could be information that was originally true! But then you told it to your friend, who got the story kind of confused, and now it’s not true in a way that’s really, really important. And of course, there’s also conspiracy theories, and misleading political advertisements, as well.
But sometimes, misinformation is less about being not told ‑‑ being told a lie, and more about not being told the truth, if that makes sense.
So, the easiest way to avoid misinformation is to just get in the habit of verifying what you read before you tell someone else. Even if you heard it first from someone that you trust! Right? Maybe one of your friends shared misinformation. But my friend is a really nice, upstanding citizen! Right? There’s no way that… I don’t know; being a citizen doesn’t matter. My friend is a nice person! And not always… are the people ‑‑ people who share misinformation aren’t always doing it to stir the pot. They just got confused, or they just… ended up in a trap, really.
So, fact‑check the information that confuses you, or surprises you. But also fact‑check information that falls in line with your beliefs. Fact‑check all of it. Because you’re more likely to see misinformation that falls in line with your beliefs because of the algorithmic curation that we talked about before. Right? We have an internet that’s like 70% lies.
So, two sites that were pretty popular when I asked around about how people fact‑check were PolitiFact and Snopes.com. You could also use a regular search engine ‑‑ there’s Google, but also try DuckDuckGo at the same time. You could ask a librarian. But also, if you look at a post on Instagram or Twitter and scroll through the thread, there might be people saying, like, hey, this isn’t true; why’d you post it? So always be a little bit more thorough when you are interacting with information online.
How does algorithmic curation contribute to content suppression and shadowbanning?
INGRID: So the next sort of thing we wanted to talk about that’s a, you know, consequence of algorithmic curation and companies, like, platforms being companies, is suppression of content on platforms. Right? Platforms have their own terms of service and rules about what people can and can’t say on them. And those terms of service and rules are usually written in very long documents, in very dense legal language that can make it hard to understand when you break those rules, and are kind of designed to, you know, be scrolled through and ignored.
And we wanted to ‑‑ but because a lot of the decisions about what is, like, you know, acceptable content or unacceptable content are, again, being made by an algorithm looking for keywords, for example… the platforms can kind of downgrade content based on assumptions about what’s there.
So… shadowbanning is a concept that I imagine many of you have heard about or, you know, encountered, possibly even experienced. It actually originally is a term that came from like online message groups and forums. So not an automated algorithm at all. Basically, it was a tool used by moderators for, you know, forum members who liked to start fights, or kind of were shit‑stirrers, and would basically be sort of a muting of that individual on the platform. So they could, you know, still post, but people weren’t seeing their posts, and they weren’t getting interaction, so they weren’t getting whatever rise they wanted to get out of people.
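The original moderator‑style shadowban described here is easy to sketch: the muted user’s posts are accepted normally but hidden from everyone else. This is a toy reconstruction of that old forum behavior, not any real forum’s code:

```python
muted = {"shit_stirrer_99"}  # set by a moderator
posts = []

def submit(user: str, text: str) -> None:
    posts.append((user, text))  # no error: the poster thinks it worked

def timeline_for(viewer: str) -> list[str]:
    # Everyone sees non-muted posts; a muted user still sees their own,
    # so from their side nothing looks wrong
    return [text for user, text in posts
            if user not in muted or user == viewer]

submit("alice", "hello forum")
submit("shit_stirrer_99", "flame bait")

print(timeline_for("alice"))            # ['hello forum']
print(timeline_for("shit_stirrer_99"))  # ['hello forum', 'flame bait']
```

The point of the design is visible in the two outputs: the shit‑stirrer gets no reaction from anyone, but never receives a ban notice either.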
Today, the more common application of the term has been describing platform‑wide, essentially, muting of users from the main timeline, or making it hard to search for that individual’s content, based on what is thought to be automated interpretation of content. I say “what’s thought to be,” because there is a lot that is only partially known about what’s happening on the other side of the platform. What it often looks like is: not showing up in search unless someone types the entirety of a handle; that person’s content not showing up in the main timeline or in their followers’ feeds, even for people who follow them; not showing up in a hashtag…
And, shadowbanning is like a really gaslighting experience? Because it’s hard to know: is it that people just don’t like what I’m saying, or people just don’t care anymore, or am I being actively suppressed and people just can’t see me? And if it’s something that has happened to you, or is happening to you, one thing that is important to remember is that you will feel very isolated, but you are in fact not alone. This is a thing that happens. It’s often been, over time, dismissed by platforms as myth ‑‑ and I wonder if, in some ways, perhaps their aversion to the term comes from associating it with this less automated context? Because it’s like, well, we’re not deliberately trying to mute anybody; it’s just our systems kind of doing something! But the systems are working ‑‑ you know, they designed them, and they’re working as designed. Right?
Instagram recently, in making an announcement about work that they want to do to address sort of implicit bias in their platform, sort of implicitly acknowledged that shadowbanning exists. They didn’t actually use the term? But it is interesting to see platforms acknowledging that there are ways that their tools will affect people.
In terms of the “what you can dos” and ‑‑ Blunt, if you have anything that you want to add to that, I’d totally be happy to hear because I’m far from an expert. It’s a lot of what the sort of like best practices tend to be based on what other people have shared as like working for them. So basically, I don’t want to tell you anything and say like this is a guarantee this will like work for you in any given context. One thing that I have seen a lot is, basically, posting really normie content? Like, just going very off‑script from whatever your normal feed is, and doing something like, I don’t know, talking about your pet, or having ‑‑ you know, talking about like cooking. Basically just like changing what you’re doing. Another approach is getting your friends and followers to engage with your content, so that it’s seen as popular, so that it will like return to the timeline.
Blunt, is there anything else that you would want to include in there?
BLUNT: Yeah, I think something that communities found to be useful is that if you are going to be automating posts, to do it on a backup account, so that what’s flagged as bot‑like behavior stays separate ‑‑ your promo account might be shadowbanned, but your primary account keeps a wider reach to direct people to where to give you money. But it’s a really complex topic. I’ve been thinking about it a lot right now ‑‑ Hacking//Hustling is currently studying shadowbanning. So far, we’ve found our data backs up a lot of what sex workers know to be true about how shadowbanning sort of works, what seems to trigger it and what seems to undo it. But as I was making a thread about the research, which included both the words “sex worker” and “shadowban,” I was like, I don’t even know if I can say either of these words without being shadowbanned! So I write it with lots of spaces in it, so hopefully the algorithm won’t recognize it ‑‑ which also makes it inaccessible to anybody using a screen reader.
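The spacing trick works because a naive keyword filter does simple substring matching. The flag list and logic below are guesses for illustration ‑‑ platforms don’t disclose whether or how they do this ‑‑ but they show both why the workaround functions and why it breaks screen readers, since the spaced‑out text is no longer a word at all:

```python
# Hypothetical flag list; real lists, if they exist, are not public
FLAGGED_TERMS = {"shadowban", "sex worker"}

def is_flagged(text: str) -> bool:
    # Naive substring matching, case-insensitive
    lowered = text.lower()
    return any(term in lowered for term in FLAGGED_TERMS)

print(is_flagged("our shadowban research is out"))          # True
print(is_flagged("our s h a d o w b a n research is out"))  # False
```

The second call slips past the filter, but a screen reader will read it letter by letter, which is exactly the accessibility cost being described.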
So, I don’t know. I know there was a class on how to reverse a shadowban, but I also think that after the global protests started that the algorithm changed a little bit, because we were noticing a lot ‑‑ a higher increase of activists and sex worker content suppressed in the algorithm.
INGRID: Yeah. That’s ‑‑ do you know when you’re going to be putting out some of the research from ‑‑ that Hacking//Hustling’s been doing?
BLUNT: Yeah, we just tweeted out a few of our statistics in light of the recent Twitter shenanigans, and… (Laughs) Some internal screenshots being shared, where they say that they blacklist users? Which is not a term I knew that they used, to describe this process. We’re in the initial analysis of the data stages right now, and we’ll probably ‑‑ our goal is to share this information primarily with community, so we’ll be sharing findings as we are able to, and then the full report will probably come out in like two to three months.
Can algorithms judge video content?
INGRID: “Have you found that the algorithm can judge video content? I know nudity in photos are flagged.” I would defer to Blunt on this question, actually.
BLUNT: I would say, yeah. I’ve had videos take ‑‑ I have lost access to YouTube from videos. So I think anything that you post with a… either a link… for sex work, or just links in general and photos are more likely to be flagged. So, like, personally, I notice my posts that are just text‑based show up higher and more frequently in the algorithm and on the feed.
Which laws and politics surround content suppression?
INGRID: Mm‑hmm… yeah. So the other kind of form of suppression we wanted to mention and talk about is not as algorithmic. It’s when, you know, the state gets involved.
So platforms are companies; companies are expected to follow rules; rules are made by governments. Sometimes, it’ll kind of look like shadowbanning. So TikTok has been reported to basically down‑rank certain kinds of content on the site, or not have it show up in a “For You” page or on your follow page, depending on a country’s laws around homosexuality. Sometimes it’s a result of governments creating rules that are presented as being about national security, but are actually about suppressing dissent ‑‑ so in Vietnam and the Philippines, there have been rules made that treat the contents of social media posts as, you know, potential threats against the state, basically. And sometimes rules about protecting the vulnerable are actually about, you know, some moral majority bullshit. Which seems like a good time to start talking about legal contexts!
And a lot of this is ‑‑ all of this particular section is really USA contexts. And I feel like I should ‑‑ I wanted to kind of give some explanation for that, because I feel weird doing this like broad sweep on, like, other kind of like countries’ approaches and focusing so much on the United States. But the reason for doing that is, basically, America ‑‑ as, you know, an imperialist nation! Tends to have an outsized impact on what happens on global platforms, overall. And there’s, you know, two reasons for that; one is that most of these companies are located in the United States, like their headquarters are here, so they are beholden to the laws of the place; but secondly, it’s also about sort of markets. Right? Like, the ‑‑ if you, you know. Like, if Facebook is like, we don’t need the American consumer base! Like, it’s probably going to affect their ability to make money.
And there are exceptions in terms of, like, the ways that other law, like, law kind of impacts platforms’, like, structure and decisions. And we talked a little bit yesterday about European privacy laws, but we’ll try and bring a little more in tomorrow about those.
First kind of category is like ‑‑ this is a little bit of a tangent, but it came up yesterday, so I wanted to kind of mention it. This is an image from the account shutdown guide that Hacking//Hustling made, that I did some work on. And basically, platforms that, you know, can facilitate financial transactions, which can be something, you know, like Stripe, PayPal, or Venmo, but, you know… Basically, they have to work with banks and credit card companies. And banks and credit card companies can consider sex work‑related purchases to be like “high risk,” despite there being very little evidence that this is true? The reason sometimes given is the possibility of a charge‑back? Meaning, you know, hypothetically, heteronormative sitcom scenario, that I don’t want my wife to see this charge on my bill! So reports it, and it gets taken off. How much this is actually the case? Unclear. It’s also, like, they’re just kind of jerks.
But, you know, platforms don’t actually have a lot of ability to argue with these companies, because they control the movement of money, like, everywhere. So, in some ways, platforms kind of just have to fall in line. That being said, the companies themselves are also kinda dumb. I wasn’t sure whether this needed to be included, but there’s this Stripe blog post explaining why certain businesses aren’t allowed? They have a section on businesses that pose a brand risk! And they have this whole thing about, like, oh, it’s our financial partners who don’t want to be associated with them! It’s not us! But, you know, like, fuck out of here, Stripe.
What is section 230?
Back to other laws! (Laughing) So. Section 230 is a term that maybe you’ve heard, maybe you haven’t, that describes a small piece of a big law that has a very large impact on how platforms operate and, in fact, on platforms existing at all. So in the 1990s, lawmakers were very stressed out about porn on the internet. Because it was 1996, and, you know, no one knew what to do. And a bill called the Communications Decency Act was passed in 1996. Most of it was invalidated by the Supreme Court? Section 230 was not. It’s part 230 of it ‑‑ it’s a very long bill. It’s really important for how platforms operate, because it says that platforms, or people who run hosting services, are not responsible when somebody posts something illegal or, you know, in this case, smut. I can’t believe that there was a newspaper headline that just said “internet smut.” It’s so silly… But the platform, the hosting service ‑‑ they’re not responsible for that content; the original poster is responsible. Like, if you wanted to sue someone for libel, you would not sue the person who hosted a libelous website; you would sue the creator of the libelous website.
And this was initially added to the Communications Decency Act because there was concern ‑‑ really because of capitalism! There was concern that if people were afraid of getting sued because somebody used their services to do something illegal, or to post something that they could get sued for, people would just not go into the business! They would not make hosting services. They would not build forums or platforms. And so removing that legal liability opened up more space for platforms to emerge. In some ways, it’s a fucked up compromise, insofar as it means that when Facebook does nothing about fascists organizing on their platforms and fascists actually go do things in the world, Facebook can’t be held responsible for it. Right? I mean, the Charlottesville rally in 2017 started on Facebook. Facebook obviously got some bad PR for it, but, you know. Then again, the exceptions that make platforms responsible for this or that tend not to be written to meaningfully support people with less power, but around what powerful people think are priorities. Such as the first effort, in 2018, to change or create exceptions to Section 230. Which was FOSTA‑SESTA!
What is FOSTA-SESTA?
It was sold originally as fighting trafficking? The full ‑‑ FOSTA and SESTA are both acronyms. FOSTA is the Allow States and Victims to Fight Online Sex Trafficking Act. SESTA is the Stop Enabling Sex Traffickers Act. But the actual text of the law uses the term, “promotion or facilitation of prostitution and reckless disregard of sex trafficking.” So basically, it’s kind of lumping sex work into all sex trafficking. Which… Yeah. That’s ‑‑ not, not so wise.
And what it essentially creates is a situation where companies that allow prostitution, or facilitation of prostitution, and reckless disregard of sex trafficking to happen on their platform? Can be held legally responsible for that happening. The day that FOSTA‑SESTA was signed into law, Craigslist took down the Personals section of its website. It has generally heightened scrutiny of sex worker content across platforms, and made it a lot harder for that work to happen online.
What is the EARN IT Act?
And in some ways, one of the scary things about FOSTA‑SESTA is the way in which it potentially emboldens further attempts to create more overreaching laws. The EARN IT Act is not a law, yet. It is one that is currently being… discussed, in Congress. The way that it’s been framed is as a response to an investigative series at the New York Times about the proliferation of sexual images of children on platforms. And this is a true thing ‑‑ basically, any service that allows uploading of images has this problem. Even Airbnb direct messages are used? It’s a real thing. But the actual law is a very cynical appropriation of this problem, with a solution that really serves more to control and contain how the internet works.
It proposes creating a 19‑member committee of experts, headed by the Attorney General, who would be issuing best practices for companies and websites, and allow those that don’t follow the best practices to be sued. And what “best practices” actually means is currently ‑‑ is like very vague in the actual text of the bill. The word “encryption” does not actually appear in the text of the bill, but its authors have a long history of being anti‑encryption. The current Attorney General, Bill Barr, has expressed wanting back doors for government agencies so that they can look at encrypted content. And likely, you know, it’s thought it could include “best practice” things like make it easier for the government to spy on content.
This is ‑‑ you know. I know somebody who worked on this series, and it is so frustrating to me to see that effort turn into, how about we just kill most of what keeps people safe on the internet?
So I mention, this is something that is more good to pay attention to. Write your Congress member about. Hacking//Hustling has done ‑‑
What is encryption?
Oh, Blunt would like me to define encryption. So it’s a mechanism for keeping information accessible only to people who know how to decode it. It is a way of keeping information safe, in a way! Encryption was not inherently part of the early internet, because the internet was originally created by researchers working for the government who thought it would just be government documents moving around on it, so they were all public anyway. But it has since been normalized into part of just using the internet as we know it today. In this context, it basically means that if I want to send you a message, the only people who can read that message are you and me ‑‑ not the service that is moving the message around, and not, like, the chat app that we’re using.
That was ‑‑ I feel like that was a little bit garbled, but… I don’t know if you like ‑‑ if, Olivia, is there anything that you would want to add to that? Or a better version of that? (Laughs)
OLIVIA: I think, I think you’ve mostly said it, in terms of it’s like a way of like encoding information so that ‑‑ someone might know the information is present, but they don’t know what it says. So, when we have things like end‑to‑end encryption on the internet, it means that something is encrypted on my side, and no matter, like, say what third party tries to look at the message that I sent to you while it’s in transit, it can’t be seen then, and it also can’t be seen by them on the other side, because the person who I sent the message to has their own, like, code that allows them to decode the message that’s specific to them. And this happens on a lot of platforms without our knowledge, in the sense that apps that are end‑to‑end encrypted, like Signal, they don’t really tell you what your key is. Even though you have one, and the person that you’re talking to has one, it’s not like you’re encoding and decoding yourself, because the math is done by other things.
But if the bill goes out of its way to exclude encryption, then it might make it potentially illegal for these services to exist, which would be a really bad thing for journalists and activists and sex workers and, like, everybody.
INGRID: Yeah. And additionally, within the world of people who work on encryption and security tools, the consensus is that creating a back door, or some way to sneakily decrypt a thing without somebody knowing, creates a vulnerability that essentially anyone else could exploit. Like, if it exists, somebody will hack it and figure it out.
OLIVIA: There’s no such thing as a door that only one person can use.
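A toy sketch of the idea in Python: each side keeps a private number, shares only a public value, and both still derive the same shared secret ‑‑ the kind of key that end‑to‑end encrypted apps compute for you behind the scenes. The parameters and names here are invented for illustration and far too small for real use; actual messengers rely on vetted implementations, never hand‑rolled code.

```python
# Toy Diffie-Hellman key agreement, for illustration only.
# Each side keeps a private number and shares only a public value,
# yet both end up computing the same shared secret.
import secrets

p = 2**127 - 1   # toy-sized prime modulus; real systems use far larger parameters
g = 3            # public generator

alice_secret = secrets.randbelow(p - 2) + 2   # never leaves Alice's device
bob_secret = secrets.randbelow(p - 2) + 2     # never leaves Bob's device

alice_public = pow(g, alice_secret, p)  # safe to send over the network
bob_public = pow(g, bob_secret, p)

# Each side combines the *other's* public value with their own secret:
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

assert alice_shared == bob_shared  # same key, never transmitted
```

A service relaying the two public values can’t feasibly recover the shared secret from them, which is why a mandated “back door” has to weaken the software itself ‑‑ and that weakness is then there for anyone.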
What’s the connection between the EARN IT Act and The New York Times?
INGRID: A clarification ‑‑ EARN IT is not solely a response to an article by the New York Times; it was a series of seven articles. And when I say “in response,” that is the argument ‑‑ that is the statement made by the people who wrote the bill. I think that it was more that EARN IT was proposed by some Congress people who saw an opportunity to cheaply exploit outrage over, like, abuse of children, to put forward some policies that they would want to have happen anyway. And I think the reason I mention it is because it’s important to acknowledge where these narratives come from ‑‑ in this case, entirely from the New York Times. And honestly, I think that the main takeaway from that series, to me, was more that, like, companies are dropping the ball? Not that we need the government to come in ‑‑ or, if there’s supposed to be government making rules about how companies address this issue, I don’t think that the solution is to create a committee that pursues telling the companies what to do in this way that doesn’t actually seem to have anything to do with the actual problem they’re talking about.
BLUNT: Totally. And we actually ‑‑ I just want to also say that on the 21st, Hacking//Hustling will be hosting a legal literacy panel, where we will be talking about the ways that fear and threats to national security are used to pass… laws that police us further, that want to end encryption, that want to do away with our privacy. So if you check out HackingHustling.org slash events, I think, you should be able to find out more about that. Again, that’s at 7:00 p.m. on the 21st. You’ll be able to learn a lot more. We’ll do an update on EARN IT, where to look for updates, and similar legislation that’s being passed.
INGRID: I did see ‑‑ there was like a ‑‑ I saw an article that said a bill was being worked on, basically in response to EARN IT, trying to say, like: yes, this problem you’re claiming that you’re going to address is bad, but this is not the way to do it, and trying to come up with an alternative. I think Ron Wyden was involved. Do you know anything about this?
BLUNT: Yeah, I think that’s ‑‑ yes. I mean, yes, we will talk about that on the 21st. I’m not ‑‑ we will have our legal team talk about that, so I don’t say the wrong thing.
INGRID: Okay, great. Moving forward!
What are some secure and private platform alternatives?
Olivia, do you want to do the platform alternatives? I feel like I’ve just been talking a lot!
OLIVIA: Sure! So, it kind of sucks that we’re all kind of stuck here using… really centralized social media platforms that we don’t control, and that kind of, in like nefarious and really complicated ways, sometimes control us. And so you might be thinking to yourself: gee, I wish there was something I could use that wasn’t quite Instagram and wasn’t quite Twitter that could let me control my information.
So, we have some alternatives. One of these alternatives is called Mastodon. And… Essentially, it’s an independent ‑‑ is that the word? I think the word is ‑‑
BLUNT: An instance?
OLIVIA: It’s an instance! There you go. It’s an instance of… Oh, no, I don’t think that’s the word, either.
Basically, Mastodon is a Twitter‑like platform that’s not Twitter, and instead of going on, like, a centralized place, you can set up your own Mastodon instance for your community. So you might have Mastodon instances that are called other names? Kind of like ‑‑ would a good analogy be like a subreddit?
INGRID: Maybe. I think, like ‑‑ so, Mastodon also comes from a project to create, like, open standards for social networking tools. I think we talked a little bit about the sort of standardizing of browsers and web content. And in the last decade, one that’s been in development is an open standard for what, like, a social network should do and could be. The protocol is actually called ActivityPub, and Mastodon is built on top of it. It’s kind of like… the term used for how they’re actually set up is, like, “fed rated.”
OLIVIA: Federated!
INGRID: Yeah. You set up one that’s hosted on your own. And it can connect to other Mastodon sites that other people run and host. But you have to decide whether or not you connect to those sites. And I think the next part was just acknowledging the, like, limitations. (Laughs) ’Cause ‑‑ so, this is a screenshot of Switter, which had been set up as a sex work‑friendly alternative to Twitter, after FOSTA‑SESTA. And… It has run into a lot of issues with staying online because of FOSTA‑SESTA. Like, I think Cloudflare was originally their hosting service, and they got taken down, because the company that was hosting it didn’t want to potentially get hit with, you know, liabilities, because FOSTA‑SESTA said you were facilitating sex trafficking or some shit.
So it’s, it’s not a, necessarily, like, obvious ‑‑ like, it’s not easy, necessarily, to set up a separate space. And whether setting up a separate space is what you want is also, like, a question.
OLIVIA: Another option is also… Say you have a community that’s on Instagram, or on Twitter, and you guys are facing a lot of algorithmic suppression, and you’re not able to reliably communicate to the people who like your page. You could also split it both ways. You could try having an additional way of communicating to people. So you might have like a Twitter page where you have announcements, but then have a Discord server or something where you communicate with community members, or similar things.
And those types of interventions would essentially allow you to avoid certain types of algorithmic suppression.
INGRID: Yeah. And with the construction of an alternative, I think… the vision probably is not to create, like, a new Facebook, or a new Twitter, or a new Instagram, because you will just have the same problems (Laughs) of those services. But rather to think about making sort of intentional spaces, like, within your own space. This is a screenshot of RunYourOwn.social, which is a guide created by Darius Kazemi about, you know, what it is to create intentional online spaces. I just find it really, really useful in thinking about all this stuff.
All right. Those were all our slides…
BLUNT: I actually just wanted to add one little thing, just to follow up on those previous two slides. I think it’s important to note, too, that while these alternatives like Mastodon exist, that’s often not where our clients are? So I think that it can be helpful for certain things, but the idea that entire communities and their clients will shift over to a separate platform… isn’t going to, like, capture the entire audience that you would have had if you had the same access to these social media tools that your peers did. So one thing that I’ve been recommending for folks to do is, like, mailing lists ‑‑ I think those can be really helpful in this, too ‑‑ to make sure that you have multiple ways of staying in touch with the people that are important to you, or the people that are paying you. Because we don’t know what the stability is of a lot of these other platforms, as well.
INGRID: Yeah.
OLIVIA: E‑mail is forever.
BLUNT: Yeah.
INGRID: Yeah, that’s a really, really good way to ‑‑ you know, point. And thank you for adding that.
Okay! So I guess… Should we ‑‑ I guess we’re open, now, for more questions. If there’s anything we didn’t cover, or anything that you want kind of more clarification on… Yeah.
I see a hand raised in the participant section, but I don’t know if that means a question, or something else, or if… I also don’t know how to address a raised hand. (Laughs)
BLUNT: Yeah, if you raise your hand, I can allow you to speak if you want to, but you will be recorded, and this video will be archived. So, unless you’re super down for that, just please ask the questions in the Q&A.
What is Discord and how secure is it?
Someone asks: Can you say more about Discord? Is it an instance like Switter or Mastodon? What is security like there?
OLIVIA: So Discord is not an instance like Switter and Mastodon. It’s its own separate app, and it originated as a way for gamers to talk to each other? Like, while they’re playing video games. And so a lot of the tools that are currently on it still make kind of more sense for gamers than they do for people who are talking normally.
A Discord server isn’t really an actual server; it’s more so a chat room that can be maintained and moderated.
And security… is not private. In the sense that all chats and logs can be seen by the folks at, like, Discord HQ. And they say that they don’t look at them? That they would only look at them in the instance of, like, someone complaining about abuse. So, if you say, like, hey, this person’s been harassing me, then someone would look at the chat logs from that time. But it’s definitely not a secure platform. It’s not end‑to‑end encrypted, unless you use, like, add‑ons, which can be downloaded and integrated into a Discord experience. But it’s not out of the box. It’s mostly a space for, like, communities to gather.
Is that helpful…?
INGRID: “Is the information on the 21st up yet, or that is to come?” I think this is for the event ‑‑
BLUNT: Yeah, this is for July 21st. I’ll drop a link into the chat right now.
What are some tips for dealing with misinformation online?
INGRID: “How would you suggest dealing with misinformation that goes deep enough that research doesn’t clarify? Thinking about the ways the state uses misinformation about current events in other countries the U.S. uses to justify political situations.” (Sighs) Yeah, this is ‑‑ this is a hard one. The question of just ‑‑ yeah. The depths to which misinformation goes. I think one of the really hard things about distinguishing and responding to misinformation right in this current moment… is that it is very hard to understand who is an authoritative source to trust? Because we know that the state lies. And we know that the press follows lies! Right? Like, I imagine some of you were alive in 2003. Maybe some of you were born in 2003. Oh, my goodness.
(Laughter)
I ‑‑ again, I feel old. But… Like, the ‑‑ and you know, it’s not even ‑‑ like, you can just look at history! Like, there are… there are lots of legitimate reasons to be suspicious! Of so‑called authoritative institutions.
And I think one of the hard things with getting full answers is finding a space to, like, kind of also just hold that maybe you don’t know? And that actually maybe you can’t know for sure? Which is to say ‑‑ okay, so one example of this. So, I live in New York. I don’t know how many of you are based near here, or heard about ‑‑ we had this fireworks situation this summer? (Laughing) And there was a lot of discussion about, like, is this an op? Is this some sort of, like, psychological warfare being enacted? Because, like, there were just so many fireworks. And, you know, it’s also true that, like, fireworks were really cheap, because fireworks companies didn’t have their usual fireworks jobs to do. I, personally, was getting lots of promoted ads to buy fireworks. But at the end of the day, the only way that I could, like, safely manage my own sense of sanity with this was to say: I don’t know which thing is true. And neither of these things addresses the actual thing that I’m faced with, which is, like, loud noise that’s stressing out my dog.
And so I think the question with, like, misinformation, about sort of who to trust or what to trust, is also understanding: based on what I assume, or which narrative is true or isn’t true, what actually do I do? And… How do I make decisions to act based on that? Or can I act on either of these?
I guess that’s kind of a rambly answer, but I think ‑‑ like, there isn’t always a good one.
INGRID: There are two other questions, but I just want to quickly answer: What happened in 2003 is America invaded Iraq based on pretenses of weapons of mass destruction that didn’t exist. And companies ‑‑ like, news outlets reported that with no meaningful interrogation. (Laughs) Sorry.
What’s going on with TikTok and online privacy right now? Is it worse than the EARN IT Act?
OLIVIA: Re: TikTok… It’s a really confusing situation, because most places, especially a lot of cyber security experts on the internet, have been saying to delete TikTok? But a lot of the reasons being given kind of boil down to: it’s a Chinese app. Which is really xenophobic. But TikTok does track a lot of information about you. What it uses it for, mostly, is to send you really, really hyper‑specific TikToks. But that information is being collected about you, and it exists in their hands. So I think it’s mostly a decision for individuals to make about whether they’re going to decide to trust TikTok with their information in that way. Because they absolutely know where you live, and they definitely know whatever things about you that you feel like they’ve gathered in order to create the TikTok algorithm that shows up in your feed. Those things are true. So.
I think ‑‑ Ingrid, do you have anything to say on that?
BLUNT: You’re still muted, Ingrid, if you’re trying to talk.
INGRID: Oh, sorry. I… The question also asked, you know, if things like the data collection on platforms like TikTok are worse than things like EARN IT. And I think it kind of depends on where you think, like, the sources of harm are going to be? It’s ‑‑ you know, it’s just different! Like, there’s a bunch of information that a company now has that they could choose to sell, that they could choose to utilize in other ways, that they might give to a law enforcement agency that gets a subpoena. But EARN IT and FOSTA‑SESTA are examples of, I guess, a different kind of harm? That harm has less to do with collection of information, and more about suppression of content and information and of certain kinds of speech.
“Is it fair to say that social media companies can use your username alone to connect you to other accounts? Should we slightly modify our usernames to avoid being associated and shut down all at once?” So for the question of whether to modify your username or not, I think that’s also a risk assessment question, insofar as if you need people to be able to find you across multiple platforms, I would not want to tell you to not do that? Or to make it harder for you to, like, reach clients or an audience. Whether social media companies are looking for you across platforms is not as clear to me. I think it depends on the, like, agreements that exist between the platforms. So like, I know that Facebook and Instagram are owned by the same company. Right? So the sharing of those two identities, like, is fairly likely to happen. But…
OLIVIA: Some might not be looking for your other accounts? But if you’re ever, like, being investigated by like an actual individual person, or like say your local police department, or the state in general, they probably would be.
INGRID: Yeah. And in that case, I think that what may be more helpful is: if you have sort of a public persona that you want to kind of have a similar identity… That’s a choice you can make. And then if there are, like, alt accounts ‑‑ you know, maybe where you have more personal communications, or that are kind of more connected to community and less business? ‑‑ making those slightly harder to associate, or making those slightly more compartmentalized. And we’ll talk more a little bit about sort of compartmentalizing identities tomorrow. But I think, yeah, that’s one way to address that ability of being identified.
BLUNT: I think, too, I wanted to add that it’s not just, like, using the same username, but where you post it, or, like, what e‑mail is associated with an ad. One of the statistics from the ongoing research project that Hacking//Hustling is doing right now on shadowbanning is that sex workers who linked their social media to an advertisement are significantly more likely to believe they’ve been shadowbanned, at 82%. Which suggests to me that linking might put you in… the bad girl bin, as I call it. (Laughs)
Do we have any other questions? We still have a good chunk of time. Or anything that folks want more clarity on?
What is DuckDuckGo and what is a VPN? Should we use them?
Okay, so we have one that says, “I heard DuckDuckGo mentioned. Do you personally use that search engine? Also, I recently started using ExpressVPN, as I just started sex work, and bad on my part, I did little research on which VPNs. Have you heard of ExpressVPN? Do you have another app that you personally use or have more knowledge about? I want to stay safe and of course share with others what would be the best app to use for VPN.”
INGRID: Olivia, do you want to take some of this one…?
OLIVIA: I was muted. So, I do use DuckDuckGo, most often. Sometimes I’ll compare it against another engine ‑‑ like, my house computer uses Google, because my mom’s like, I don’t like DuckDuckGo! It’s not showing me the things I want to see! And that’s usually because Google, again, collects data about you and actively suggests results that it thinks are the things you’re searching for, whether or not they’re what you’re actually searching for.
For VPN use, I use ProtonVPN, mainly because it’s free and I don’t really have money to pay for a VPN right now. But I think ExpressVPN is one of the most popular ones. So I’d say it’s pretty trustworthy.
INGRID: Yeah, I’ve used ExpressVPN. It’s generally, I think, a well‑regarded one. I think that’s partly why it costs the money it costs. (Laughs) So a free option makes sense if you don’t want to have to keep paying for it; but if you’ve already paid for it, yeah, keep using it.
What are the alternatives for encryption?
Yeah. “Can we talk about some alternatives for encryption, assuming a back door is created?”
OLIVIA: This isn’t ‑‑ oop.
INGRID: Go ahead.
OLIVIA: This isn’t really an alternative for encryption, but I think one of the things that we could start doing ‑‑ less trying to function without encryption, and more encrypting our messages ourselves. Because technically, you could have end‑to‑end encryption over Instagram DM if you do the hand work of encrypting the messages that you send by yourself. Bleh! Tripped over my tongue there.
So there are a lot of apps ‑‑ specifically for e‑mail, I’m thinking of? ‑‑ like Enigmail and Pretty Good Privacy, that are essentially tools that you can use to “hand encrypt,” in quotation marks, your e‑mails, so you don’t have to depend on someone doing that for you. Right, the government can’t knock on your door and say you’re not allowed to encrypt anymore. And encryption algorithms are mathematical things, so you couldn’t make one that’s, like, kind of broken. The ones that we have now ‑‑ like, Signal, for instance, is very public about the algorithms that they use, and that’s how we know that we can trust them. Because other people can test them, and they’re like, yeah, it would take a computer about a thousand years to crack this. And so we’re able to use those same algorithms by ourselves without depending on other platforms to do that work for us. And it would suck that we’d have to interact with each other with that level of friction? But it is possible to continue to have safe communications.
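As a toy illustration of that “hand encrypting” step ‑‑ this one‑time‑pad sketch is a teaching aid, not the actual PGP machinery, and the function names are invented for the example ‑‑ the message can be scrambled before any platform ever sees it:

```python
# Toy one-time pad: XOR each message byte with a random key byte.
# Illustrates encrypting *before* a platform ever sees the message;
# NOT a substitute for vetted tools like GnuPG or Signal.
import secrets

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    if len(key) != len(message):
        raise ValueError("one-time pad key must match message length")
    return bytes(m ^ k for m, k in zip(message, key))

otp_decrypt = otp_encrypt  # XOR is its own inverse

plaintext = b"meet at noon"
key = secrets.token_bytes(len(plaintext))  # shared out-of-band, used once

ciphertext = otp_encrypt(plaintext, key)   # this is all the platform sees
recovered = otp_decrypt(ciphertext, key)
assert recovered == plaintext
```

Here the service relaying `ciphertext` learns only that a message exists, not what it says ‑‑ which is the property that end‑to‑end encrypted apps give you automatically.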
BLUNT: Yeah, and I think just in general, if you’re unsure about the security of the messaging system that you’re using? Like, right now, we’re using Zoom, and we had this conversation a bit yesterday. But I’m speaking on Zoom as if I were speaking in public. So if I wanted to talk about my personal experiences, potentially I would phrase it as a hypothetical ‑‑ that’s also one way. So just slightly changing the ways that you speak, or… Yeah. I think that’s also an option. Go ahead, sorry.
OLIVIA: No, I agree. Just checking in with the people that you’re talking to, like: hey, we’re not going to talk about this. And not being, like, reckless. So, like, in a public forum, don’t post about the direct action that’s happening on Sunday at city hall. Things like that ‑‑ just, in that sense, using discretion at that point.
What is the back door issue and how does it relate to encryption?
BLUNT: Someone says: “So the back door issue is for companies that encrypt for us?”
INGRID: Basically, yeah. The back door issue is not necessarily that, like, all encryption would stop working. Right? It would be something like… a government saying, hey, WhatsApp, we want access to conversations that currently we can’t have access to because WhatsApp communications are encrypted, and ordering WhatsApp to do that. And one would hope? (Laughs) That ‑‑ like, companies also know that they have a certain amount of, like, brand liability… when they remove security features. So it’s something that would probably be known about? I don’t think it would be done surreptitiously ‑‑ like, I would hope it wouldn’t be? But, yeah. It’s more about whether certain previously‑considered‑secure communications would become compromised. It wouldn’t necessarily end the possibility of ever, you know, deploying encryption ever again. It would be more of a service by service thing.
BLUNT: We still have some time for more questions, if anyone has any. Please feel free to drop them into the Q&A.
And maybe if Ingrid and Olivia, if you wanted to chat a little bit about what we’ll be talking about tomorrow, folks might have an idea of other things that they might want clarity on, or other things that they are really hoping might be covered.
What will be covered in part 3 of the digital literacy series?
OLIVIA: Yeah, tomorrow we’re gonna talk a lot about surveillance, like, more specifically. So like, surveillance that’s done on platforms ‑‑ talking both about surveillance capitalism and state surveillance, and the different ways that they might cause harm for someone who’s, like, trying to use the internet. Yeah. I think those are the biggest points? But also thinking about… like, mitigation.
INGRID: Yeah. And we’re ‑‑ and in the context of state surveillance, we’re primarily talking about when the state utilizes platforms in the service of surveillance, or obtains information from platforms. There are a myriad of other ways that the state can ‑‑ that, you know, police departments or federal or state governments can engage in surveillance of people, digitally or otherwise. But partly because the scale and scope of that topic is very, very large, and because we know people are coming from lots of different settings, and maybe like ‑‑ and we don’t personally know the ins and outs of the surveillance tools of every police department in the world? We didn’t want to kind of put forward, like, examples of tools that might just be, like ‑‑ that would mostly just create, like, greater like anxiety or something, or that wouldn’t necessarily be an accurate depiction of threats or realities that people might face.
If there is interest in more of those things, we’re happy to take questions about them? But it’s not something that we’re doing a deep dive into, because… again, it seems like that might be better suited to more tailored questions in specific contexts.
BLUNT: I’m curious ‑‑ did you see the EFF launched the searchable database of police agencies and the tech tools that they use to spy on communities? Speaking of not spying on people! (Laughing)
INGRID: Yeah, but that’s the thing ‑‑ another thing is like, well, those tools are here. God bless these agencies for putting that work together.
BLUNT: Cool. So I’ll just give it like two or three more minutes to see if any other questions pop in… And then I’ll just turn off the livestream, as well as the recording, in case anyone would prefer to ask a question that’s not public.
How to Build Healthy Community Online
Okay. So we have two more questions that just popped in… “Could you speak to building healthy community online? How to do that, how to use platforms for positive information spread?”
OLIVIA: So, when it comes to building healthy communities, I think… it really comes down to, like, the labor of moderation. Like, it has to go to someone, I think. One of the problems with a lot of platforms online is that they’re built by people who don’t really, like, see a need for moderation, if that makes sense? Like, one of the issues with Slack is that there was no way to block someone. And a lot of the people who originally were working on Slack couldn’t conceive of a reason why that would be necessary. While someone who’s ever experienced workplace harassment would know immediately why that kind of thing would be necessary, right?
And so when it comes to, like, building healthy communities online, I think codes of conduct are really honestly the thing that’s most necessary ‑‑ and creating an environment on that specific profile or in that specific space that invites the people who are engaging in that space to do that moderation work, and to also, like, promote pro‑social interactions and demote antisocial interactions, and things like that.
BLUNT: I also think that Hacking//Hustling, on the YouTube channel, has… a conversation between myself and three other folks talking about sort of social media and propaganda, and a couple of harm reduction tips on how to assess the, like, truthfulness of what you’re sharing and posting. And I think that’s one thing that we can do: just take an extra second before re‑tweeting or sharing something ‑‑ actually opening up the article before sharing it, and making sure that it’s something that we want to share. It’s a simple thing that we can do. I know things move so fast in these online spaces that it’s sometimes hard to do, but if you’re able to assess that something is misinformation, or maybe something that you don’t want to share, then that slows down the spread of misinformation.
Thank you so much to everyone and their awesome questions. I’m just going to take one second to turn off the YouTube Live and to turn off the recording, and then see if folks have any questions that they don’t want recorded.
Okay, cool! So the livestream has stopped, and the recording is no longer recording. So if folks have any other questions, you’re still on Zoom, but we would be happy to answer anything else, and I’ll just give that two or three more minutes… And if not, we’ll see you tomorrow at noon.
(Silence)
Okay. Cool! Anything else, Ingrid or Olivia, you want to say?
INGRID: Thank you all for coming. Thank you, again, to Cory for doing transcription. Or, live captioning. Yeah.
BLUNT: Yeah, thank you, Cory. Appreciate you.
OLIVIA: Thank you.
CORY DOSTIE: My pleasure!
BLUNT: Okay, great! I will see you all on ‑‑ tomorrow! (Laughs) Take care.
Part 1: OK But What Is The Internet, Really? In this three-day lunch series with Olivia McKayla Ross (@cyberdoula) and Ingrid Burrington (@lifewinning), we will work to demystify the tools and platforms that we use every day. It is our hope that through better understanding the technologies we use, we are better equipped to keep each other safe!
Digital Literacy Training (Part 1) Transcript
OLIVIA: Hi, everyone.
Just before we begin, some of the things that ‑‑ the values that we’re trying to cement this workshop in terms of cyber defense is firstly acknowledging cyber defense as a way of maintaining community‑based power, and cryptography as an abolitionist technology rather than military or something that doesn’t come from us, right?
So, there have been ways of using techniques like cryptography for community defense ‑‑ something that doesn’t have to be immediately associated with white supremacist, military‑industrial technology.
So following that, we want to affirm that there can be a cyber defense pedagogy that can be anti‑racist, anti‑binary, and pro‑femme. But also one that’s trauma informed, right? And doesn’t reinforce paranoia. Because we know there are white supremacist institutions. And teaching from a place of gentleness. And considering, because of our myriad identities, the previous harm people might have experienced, and trying not to replicate it or force people to relive it.
So if you need to take space at any point during this workshop, we want to honor that, and this will be recorded and available for view at a later time, as well.
INGRID: Thank you, Olivia. That was great.
My name is Ingrid. I go by she/her pronouns. And we are ‑‑ welcome, welcome to the internet! (Laughs) This is the first of a series of three digital literacy sessions where we’re gonna be walking through a few different concepts.
And this first one we wanted to start with was really getting into just some of the baseline, you know, technical things around what the internet actually is and how people experience it, or how it, you know, works.
And… We’ve sort of organized this into a couple of sections. We’re gonna, you know, talk ‑‑ start kind of with a couple things about our personal kind of opinions about how to talk about some of these things, some grounding perspectives we’re bringing to it. The internet and kind of how it works as an infrastructure.
Browsers? Which is, like, a particular technology for interfacing with the internet. And the World Wide Web, which is… you know, basically the thing that the browser takes you to. (Laughs)
So, starting with our opinions… (Laughs) We got ‑‑ we got more, but these seem important to start with.
The first one that we wanted to convey is that, you know, some of this stuff around what ‑‑ around how the internet works gets treated like this sort of special knowledge, or like something only for smart people. But, you know, companies have a lot more resources to do things. The people who run, work in, found tech companies often have had, you know, privileges like generational wealth! Or like early exposure to technology, that mean that some of this stuff was just more available to them.
And has been for a long time. And if there are things that are confusing, or unfamiliar, it’s ‑‑ you know, it is not because you’re not smart enough. It’s because the people who kind of have a lot of control and power, like, have had the resources to overcome things that are confusing… Yeah.
We’ll come back to this point in other ways, I think, in this presentation today.
OLIVIA: The other point that we really want to hammer in is that nothing is completely secure online. And that’s due to the nature of how we connect to the internet, right? The only way you can really have a completely secure computer is to have a really, really boring computer! Right?
Computers are interesting because… computers and the internet are able to be interesting and fun things to use because we are able to connect to other computers. Right? Because it’s a form of a telecommunication device. And so it’s kind of okay! That our computers can’t be completely secure, because if they were, they’d just be kind of like brick boxes that don’t really do anything.
So instead of trying to chase like a mythological, like, security purity, what we do is we learn to manage risk instead. Right? We create systems so that we put ourselves in as little danger as possible.
What is the internet?
INGRID: So, for our initial grounding point, we want to just start with what the internet is. And this is a hard question, sometimes, I find? Because… The word “internet” comes to mean lots of different things. For me, the simplest summary I can ever provide is that the internet is just computers talking to computers. (Laughs)
It’s information going between computers. This image, which is, you know, one of many you can find when you Google image search “internet diagram” is a bunch of computers in, you know, a household, including a game machine and a few PCs. Who is this person? With all these devices? And they’re connecting to a router in their house, which has connected to a modem, which connects to the internet! Which is more computers. Not the ones that you’re seeing on the screen.
It’s kind of dorky, but this is a really goofy example of a computer talking to another computer. It’s from the movie Terminator 3. This also, I realize, is an Italian dub?
INGRID: So, I show this ‑‑ so what’s actually happening in this scene, which is, yes, very garbled, is the lady terminator, who is a robot, a very large sentient computer, is using a cell phone, like a dumb phone, to call another computer? And then she is making noises into the phone that are a translation of data into audio signal. And that is allowing her to hack into the LA School District’s database. It’s ‑‑ and it’s, you know, it’s very 2003? (Laughs) In that that was an era where, when people were getting online in their homes, they would have to connect to a modem that made sounds like that, too.
So I think, you know, it’s kind of a corny old example, but I like it because it also shows something that is hard to see in our day‑to‑day use of the internet, which is that for information to move from one computer to another computer, it has to be rendered into something material. In this case, it’s tones? It’s sound? On a home computer connected to a wi‑fi network, it would be radio waves. And kind of when you get to different layers of the internet, it’s going to be pulses of light traveling through fiberoptic cable.
So everything you type, every image you post, at some point it gets ‑‑ you know, that digital data gets transformed into a collection of, you know, arrangements of points of light, or, you know, a sound, or like a different material.
And it’s, you know, it’s much bigger! (Laughs) Than, like, than what we see on a screen! This is a map of the submarine cables that cross oceans that make it possible for the internet to be a global experience. It’s very terrestrial?
This is just for fun. This is just a video of a shark trying to eat one of the cables in the ocean… A cutie.
Rawrumph!! I just love his little… The point being, yeah. The internet is vulnerable to sharks! It is… it is very big, and it is complicated, and it is ‑‑ it is not just, you know, a thing on a screen. It needs a lot of physical stuff.
And when computers talk to computers, that doesn’t usually mean, like, a one‑to‑one connection? Right? So… I’m talking in this webinar to all of you right now, but, like, my computer is not directly connecting to your computer. What’s actually happening is that both of our computers are talking to the same computer… somewhere else.
There’s like a, you know, intermediary machine, that’s probably in a big building like this. This is an Amazon data center in Ashburn, Virginia. And that’s kind of the model that most of the internet takes; it’s usually, there’s kind of intermediary platforms, right?
And in a lot of technical language, this is called the client‑server model. The idea being that a server, which is a computer, holds things that are, you know, content on the internet, or applications like Zoom, and the client, which is just a computer, requests things from the server. You know, the server serves that. This goes ‑‑ this gets to the client computer through a routing process, that usually means that the information has to travel through multiple computers.
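If you’re curious what “client” and “server” look like at the code level, here’s a toy Python sketch ‑‑ not how Zoom actually works, just the smallest version of one computer serving another, with both ends running on your own machine:

```python
import queue
import socket
import threading

port_box = queue.Queue()  # lets the client learn which port the server chose

# A toy "server": one computer that listens and serves a response.
def run_server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
        srv.listen(1)
        port_box.put(srv.getsockname()[1])
        conn, _addr = srv.accept()   # wait for a client to connect
        with conn:
            conn.recv(1024)                  # read the client's request
            conn.sendall(b"hello, client")   # "serve" a response

server = threading.Thread(target=run_server)
server.start()

# A toy "client": another computer that requests something from the server.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port_box.get()))
    cli.sendall(b"GET /shark-video")
    reply = cli.recv(1024)

server.join()
print(reply)  # b'hello, client'
```

On the real internet the routing between the two ends passes through many intermediary machines, but the request/response shape is the same.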
But! Again, these words just mean computer and computer? Technically, you could turn a home computer into a server: get a stable internet connection and make it something that just serves information to the internet. Or, you know, you could even think about the fact that because lots of information is taken from personal computers and sent to companies, in some ways we are serving all of the time!
And mostly, this is just a dynamic ‑‑ again, thinking about who controls the internet and how it is governed ‑‑ that I think is important to acknowledge? I mean, in some ways, the internet is not computers talking to computers so much as… computers owned by companies talking to computers owned by people?
The internet, you know, it began as a project funded by the U.S. military, but became the domain of private companies in the late 1990s. So all of that stuff that I was talking about earlier? You know, the submarine cables, the data centers, they’re all private property owned by corporations. All of the technical infrastructure that makes the internet possible is a public good… but it’s all managed by private companies. So it’s kinda, it’s more, you know, a public‑private partnership. And it has been for a long time.
And I mention this mainly because it’s good to remember that companies are beholden to laws and markets, and it’s in a company’s interest to be compliant with laws and be risk‑averse, and that’s partly why a lot of decisions made by platforms or other companies are often, like, kind of harmful ‑‑ like, can be harmful to communities like sex workers.
And again, like, this doesn’t have to be the way the internet is? It’s just sort of how it has been for a very long time.
So, computers talking to other computers is our very simple summary of what the internet is. But computers can talk to each other in different kinds of languages or dialects, let’s say? Which, in, you know, internet speak, are called protocols. Which, you know, a protocol is what it sounds like: It’s a set of rules about how something’s done. And so that’s why I find the dialect or language comparison kind of useful.
Common Internet Protocols
So a few protocols that exist for the internet that you probably encounter in your daily life that you maybe don’t think that much about are Internet Protocol, wi‑fi, Address Resolution Protocol, Simple Mail Transfer Protocol, and HyperText Transfer Protocol. Some of these you maybe haven’t heard of as much, or they’re not as commonly talked about? But I’ll explain them.
And I apologize; these screenshots are from my Mac. There are ways to access these same sorts of things from a Windows machine? I don’t have screenshots. (Laughs) So Internet Protocol is basically the foundation of getting on the internet. It assigns a number called an IP address, Internet Protocol address, to a computer when it’s connected to a network. And that sort of ‑‑ that is the ID that is used for understanding, like, who a computer is and how do you access it.
So when I want to go get content from a specific website, what I’m actually requesting under the hood is… is a set of numbers that is an IP address, which is like the name or ID of the computer that I want to go to.
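To make that lookup concrete, here’s a tiny Python sketch of turning a name into an IP address. It uses “localhost”, the name every computer has for itself, so it works without a network connection; public websites resolve the same way, but the number you get back can vary:

```python
import socket
import ipaddress

# Under the hood, a human-friendly name is resolved to a numeric IP address.
ip = socket.gethostbyname("localhost")
print(ip)  # 127.0.0.1

# The ipaddress module can tell you things about that number:
addr = ipaddress.ip_address(ip)
print(addr.is_loopback)  # True: this address points back at your own machine
```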
I’m hoping this isn’t too abstract, and I hope, like ‑‑ yeah, please, if there are places where you have questions… please, add things to the Q&A.
So, Address Resolution Protocol and Media Access Control are a little different, but I wanted to talk about them because they’re sort of related to understanding how your computer becomes a particular identity.
So ‑‑ there’s a question: Do all computers have their own IP address? They do, but they change, because basically, when you join a network, the address is assigned. It’s not a fixed ID. But there is a fixed ID that is connected to your computer, and it’s called a Media Access Control, or MAC, address.
And this is another screenshot from my machine. You can see this thing I circled here. That is my MAC address. And that is at the level of like my hardware, of my computer, an ID that has been… basically, like, baked into the machine. Everything that can connect to a network has one of these IDs.
And when ‑‑ and so Address Resolution Protocol is a mechanism for associating your temporary IP address with the MAC address, and it mainly exists so that if there’s, like ‑‑ like, if the network screws up and assigns the same IP address to two things, to like two different devices, the MAC address can help resolve like, oh, we actually mean this device, not that device.
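As a small aside for anyone following along in code, Python’s standard library can read this hardware ID ‑‑ a sketch:

```python
import uuid

# uuid.getnode() asks the operating system for one of the machine's MAC
# addresses and returns it as a 48-bit integer. (If no MAC can be read,
# Python substitutes a random number instead.)
mac = uuid.getnode()

# Format it the way MAC addresses are usually written: six pairs of hex digits.
pretty = ":".join(f"{(mac >> shift) & 0xff:02x}" for shift in range(40, -8, -8))
print(pretty)  # e.g. a4:83:e7:2f:11:9c -- yours will differ
```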
Oh, I realize I didn’t make a slide for wi‑fi. I think most of you probably know wi‑fi as, like, the way that information is transferred to something wirelessly. Yes! Your IP address… will change when you connect, although it generally won’t change that much… It’s, it’s not like ‑‑ how am I answering this?
Like, if you’re ‑‑ if you’re connecting to the internet, in like your home? It’s probably ‑‑ you’re probably gonna get the same ID number, just ’cause it’s the same device you’re connecting to? But when you connect to a network at ‑‑ I guess no one goes to coffee shops anymore…
But in the time when you would go to a place with a different wireless network and connect to the internet! (Laughing) You would probably have a different IP address, because you’re connecting through a different device on a different network.
Oh, the other thing ‑‑ the only other thing about wi‑fi thing I will mention right now is that “wi‑fi” doesn’t actually mean anything. It’s not an acronym; it’s not an abbreviation. It’s a completely made‑up name… No one ‑‑ no one has a good answer for why it’s named that! (Laughs) I think like a branding consultant named it? It’s ‑‑ anyway.
So other protocols. So the Simple Mail Transfer Protocol, that underlies how e‑mail works.
So you encounter it a lot, but probably don’t think much about what ‑‑ like, that’s its own special kind of language for moving information, that’s different from the HyperText Transfer Protocol, which is one that may be familiar to all of you because it is the central protocol used for moving information in the browser!
Which is a nice segue, but I realized I also should mention that there is a variant of HTTP called HyperText Transfer Protocol Secure, or HTTPS. It’s an implementation of HTTP that encrypts the information transferred. So, that wasn’t adopted or implemented when browsers and HTTP were first being developed?
Because, again, these technologies were being developed with, you know, public funding and thought of as tools for scientific research, not for making purchases with credit cards or having, you know, private communications. So the implementation of security features and encryption into the internet is sometimes clumsy or frustrating because it was not designed into the original concept.
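To demystify HTTP a little: a request really is just lines of text. Here’s a sketch of (roughly) what a browser sends when you visit a page ‑‑ the User‑Agent value is made up:

```python
# An HTTP request: a request line, then headers, then a blank line
# that marks the end. That's the whole format.
request = (
    "GET / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "User-Agent: my-browser/1.0\r\n"
    "\r\n"
)
print(request)

# HTTPS sends this exact same text, but wrapped in an encrypted (TLS)
# connection, so anyone watching the wire sees scrambled bytes instead.
```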
What’s an internet browser?
All right. So, we are next moving into the browser. I’m kind of a nerd about internet history things, so part of what I wanted to talk about with the browser is just its origin story?
The first example of a browser that was easy to use was created by researchers at the University of Illinois, including a guy named Marc Andreessen. That browser was called Mosaic, and he went on to co‑found Netscape, which made Netscape Navigator. It was a very important opening of the internet to the general public, and it changed a lot of people’s perception of, and ability to be part of, the internet.
Marc Andreessen became very rich because he did this, and he founded a venture capital company, or firm, called Andreessen Horowitz. Returning to the idea that a lot of these companies are not smart, they’re just rich? He worked on a thing that is very important… That is not a good reason that he gets to throw money at Airbnb and decide how, you know, urban planning and housing is going to be changed forever!
There are fundamental kind of reasons ‑‑ like, there’s something about that which I feel is kind of important to remember. It’s not that Marc Andreessen is a dumb guy; it’s that he’s been given a lot of authority through getting a lot of money through doing one clever thing.
A lot of the things that define the browser in the 1990s when it was first becoming an adopted thing were actually proprietary technologies made by different companies. So different companies had their own browsers that they had made. And they wanted to be The Browser everyone used. Right? And so they invented new things to make their browser cool? But they wouldn’t work on other ones.
So Olivia will talk a little bit more about these, I think, in the section on the web. But Cascading Style Sheets, which is a way of designing, you know, visual aspects of a web page, was first implemented by Microsoft’s browser. Javascript, which is a programming language that works in browsers, was created by a guy at Netscape in ten days? (Laughs) And, yeah, if you made a website and it had, you know, CSS in its layout, it would be visible in a Microsoft browser, but not in a Netscape browser.
This was a terrible way of doing things? And partly because companies got nervous about possibly getting regulated, and partly because it was just bad for business, they figured out how to put aside some of their differences and develop standards, basically.
So the standardization of browsers ‑‑ so that basically when I open something in Chrome and I open something in Firefox it looks the same and it works the same ‑‑ kind of starts to be worked on in 1998. It really only starts to be implemented and widespread in 2007, and it continues to be worked on. There are entire committees of people, who mostly work at the tech companies that make these browsers, who come and talk to each other about what are the things we’re all gonna agree on, in terms of how this technology works.
And we’re looking at, and wanting to talk a little bit, about browsers also because they are really useful teaching tools. It’s really easy ‑‑ well, it’s not “really” easy. It is pretty easy to kind of look at what’s going on behind the scenes, using a browser. And that’s mainly because they’re very old.
You know, by 2007 when the iPhone emerges ‑‑ and I think the App Store comes in 2008 ‑‑ it’s much harder to go on your phone and see, like, I wonder what kind of data Instagram is sending back to, you know, Facebook right now! Like, to actually try and look for that on your phone is almost impossible. But you can kind of start to look for that in a web browser.
And that’s sort of a privileging of desktop technology, and a legacy of this being kind of an old technology, where transparency was treated as just inherently a good idea. And I think that if they were being built today, we probably wouldn’t have it.
So, we’re going to introduce you to some browser tools in this next section ‑‑ oh, wait, sorry, one more thing I wanted to acknowledge. This isn’t super detailed as far as comparing the privacy features of different browsers? But ‑‑ and we are working on a list of sort of, like, a bibliography that we can share with everyone later.
The point being ‑‑ the main thing I just wanted to convey here is: different browsers made by different companies are gonna all work more or less the same, but they do have underlying qualities that might not be great for user privacy. And, also, there’s, you know, questions of like… when one company kind of controls the browser market, how does that change the way that people see the internet?
So, you know, doing some research, doing some comparison of, of what different browsers… you know, do and don’t do. Most of the screenshots for this were done in Firefox. If you use other browsers, that’s fine. But… Yeah.
All right. Now ‑‑ (Laughs) Now we will move to World Wide Web!
What are web pages and how do they work?
OLIVIA: Hi, everyone! So, this part is talking a lot about the actual content that you are able to look at using your browser. So we’ll be making use of a lot of the tools that Ingrid mentioned about looking deeper into the actual… web pages themselves.
Awesome. So, this is a web page. It’s the same page as the video that we showed at the beginning, of sharks biting undersea cables! (Laughs) And it’s accessible to anyone who can connect their computer to the World Wide Web. And so, a lot of times we use “the internet” and “the web” interchangeably?
But the internet itself is more of the infrastructure, and the actual place, if we can call it a place, that we’re going to logically… is called the World Wide Web. Right? That’s the whole WWW‑dot thing that we’ve all been doing.
So, web pages are hosted on computers! You can host a web page on your own computer; you can pay another company to host it for you; companies host their own websites, if they have a lot of money. And… If you are paying someone else to host your website for you, you have a lot less autonomy. Right?
So there’s a lot of movements for people to like start hosting things themselves to avoid things like censorship and surveillance. Because like we said in the beginning, companies are beholden to a lot stricter laws than individuals are. And individuals are able to kind of themselves say ‑‑
What’s the difference between a VPN and Tor? If we have time at the end, we will cover that a little bit, briefly. But essentially ‑‑ Tor is a network that you usually get to through the Tor Browser, and a VPN is a service that you set up on your computer.
Tor does things that are similar to what VPNs do, in terms of hiding where you’re coming from, but it adds onion routing? They’re not… they’re not the same. Like, you can use the Tor Browser to navigate the internet, or you can use a VPN and use your normal browser. Right.
To look at a web page’s source, right, oftentimes you can right‑click, or Control‑click on a Mac? And you click View Page Source, and you’ll be able to get a closer look at the actual web page itself.
And so when you, when you view the source, you ‑‑ oh, you can go back. When you view the source, you end up seeing HTML. Right? So we told you earlier that the web uses HTTP, which is the HyperText Transfer Protocol, to send and receive data. The data that’s being sent and received is HyperText. Right? That’s written in the HyperText Markup Language.
So… HTML isn’t a programming language, per se; it’s a markup language. So it defines the structure of your content. It displays things, like text and images and links to other web pages.
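Since HTML is just structure, a program can walk that structure. A small Python sketch, using the standard library’s parser and an invented one‑line page:

```python
from html.parser import HTMLParser

# HTML marks up structure: tags around content. Here we walk that structure
# and pull out the targets of any links on the page.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # an <a> tag is a link
            self.links.append(dict(attrs).get("href"))

page = '<p>Read the <a href="https://example.com/sharks">shark article</a></p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['https://example.com/sharks']
```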
And there are two ways that HTML pages can exist: Static and dynamic. So static would be a lot of the pages that we might code ourselves, right? Dynamic is more of the… the web pages that are generated dynamically are like Facebook and Instagram. The user requests a page, which triggers code that generates an HTML page.
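Here’s a toy Python sketch of the dynamic case ‑‑ the function below stands in for the server‑side code a big site runs, and the names are invented:

```python
# A static page is a file sitting on disk. A dynamic page is built the moment
# you ask for it: each request triggers code that produces fresh HTML.
def render_profile(username, post_count):
    return (
        "<html><body>"
        f"<h1>{username}</h1>"
        f"<p>{post_count} posts</p>"
        "</body></html>"
    )

# Two different requests, two different pages -- no HTML file existed
# for either of them until the code ran.
print(render_profile("olivia", 42))
print(render_profile("ingrid", 7))
```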
So sometimes ‑‑ if you try to look at the source code of a dynamic website, you won’t really see much of anything? Because that code, like, doesn’t exist yet. Unless you open an inspector, and you look at the code that’s been generated on your side.
So, to make this content look better, it’s often styled. Right? ‘Cause otherwise, it would just be plain Arial size 12. So we add color, add shape, animation, layouts, italics. And we do that using Cascading Style Sheets, or CSS.
CSS is also not a programming language. It’s a way of representing information. So this is what a static HTML file might look like. I grabbed this from a teaching resource, so that’s why you can see things like explanations of what HTML is, because I thought it would look a bit cleaner than the WIRED article.
And this is a CSS file! You see things like font size, font family, color, background color, position. Right? So those are the types of things that you can control using CSS. You can even make animations.
So, knowing that ‑‑ the point that we’re trying to make in saying this is that HTML can’t do anything with your data. Neither can CSS. They just display things that are coming from the other computer that you’re connecting to. So how are web pages collecting our data? Well, the code that actually does stuff in your browser is usually written in Javascript.
So… To see it in action, we can go into Tools, and Web Developer, and Inspector! And we can see some of the stuff that’s going on behind the scenes, right? This is how you do this in Firefox, and it’s similar but not identical in other browsers like Chrome and Safari. In Safari, I believe you have to turn on the Develop menu in the preferences before you can do this.
So if you check out the Inspector tab, we have an easier way of reading the HTML source than just pulling it all up in a really large, confusing doc. Right? We get syntax highlighting. We get little disclosure triangles. And we’re able to highlight things and see ‑‑ we’re able to hover over different parts of the HTML, and it’ll highlight that section in the actual web page. So it’s a really useful teaching tool.
The Console tab, we’re able to see more of the Javascript activity that’s happening in the background of the page. So we’re able to see all of these, like, jQuery calls and database calls and analytics. Right? So this is how a web page might try to get information about you so that they ‑‑ the company, in this case WIRED, can use that information to structure their own marketing practices. Like, how many people went to this article about sharks biting undersea cables? They would use Javascript in order to record the fact that you, one person, went to this website.
In the Network tab, it shows data being sent and data being received by your browser. Right? So all of the requests marked “POST” ‑‑ or, you can only see the P‑O, in this part ‑‑ are sending data, and all the ones marked “GET” are requesting data to be sent back.
And so some of this stuff is fairly, like, normal. It’s actual HTML stuff that’s being included on the page. You can see the different types. And then some of it, in other places, you would be able to see like actual trackers. Right?
And when you click on one of the items, you’re able to see more information about what’s being transferred.
INGRID: And this is not necessarily ‑‑ I mean, some of this is not very helpful? Like, when you click the headers? It’s like, here is a bunch of words! I don’t know what’s going on! But the other tabs can give us a little more, and depending on the type of network request, you’ll get slightly easier‑to‑read data.
What are cookies and how do they work?
So, in this section, we’re going to talk a little bit about some of the tracking methods. Cookies… are called cookies on the web because in a different, older technology, whose name I do not recall, this same kind of thing was called a magic cookie.
And I don’t know why it was called that in the other one… It’s just a, it’s a… it’s a holdover from the fact that a small number of people working on the internet had inside jokes, as far as I can tell.
But a cookie is a text file that contains information. Usually it’s something like an ID. And it’s used for doing things like storing preferences, or kind of managing things like paywalls on news websites.
So in this case, the cookie that was handed off to me from this particular page gave me this ID number that’s just like a pile of letters and numbers. And my browser will store that cookie, and then when I ‑‑ if I go back to the WIRED website, it’ll see ‑‑ it’ll check to see, like, oh, do I already have a cookie assigned to this one?
And if it does, it will take note of how many other WIRED articles I’ve already read. And that’s how WIRED is able to say, hey, we noticed you’ve read all your free articles… Stop, stop doing that. You don’t get any more.
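For anyone who wants to see the mechanics, here’s a toy sketch in Python of that cookie‑and‑counter logic ‑‑ not WIRED’s actual system, just the general shape:

```python
import secrets

# A toy cookie-based paywall. The "server" keeps a table of how many articles
# each cookie ID has read; the "browser" just stores and replays whatever ID
# it was handed.
article_counts = {}   # server-side: cookie ID -> articles read
FREE_ARTICLES = 4

def visit(browser_cookies):
    # If the browser shows up without our cookie, mint a fresh random ID.
    if "reader_id" not in browser_cookies:
        browser_cookies["reader_id"] = secrets.token_hex(8)
    reader = browser_cookies["reader_id"]
    article_counts[reader] = article_counts.get(reader, 0) + 1
    if article_counts[reader] > FREE_ARTICLES:
        return "Paywall: you've read all your free articles"
    return "Here's your article"

my_cookies = {}  # what my browser stores for this site
for _ in range(5):
    result = visit(my_cookies)
print(result)  # Paywall: you've read all your free articles
```

Note that if you empty `my_cookies`, the next visit mints a brand-new ID with a fresh count ‑‑ which is why clearing your cookies tends to reset paywalls like this.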
And they can also be used for things like ‑‑ say you have a log‑in with a particular website, and you don’t want to have to log in every time; the cookie can store some information for you. But they’re also used for things like tracking ‑‑ just trying to see where people go online, to, you know, be able to figure out how to sell them things.
Just a distinction note, like, if you look at things in the Network tab: A response cookie is a cookie that comes from, like, a website to your computer; a request cookie is one that your browser sends back to that website along with its request. And a lot of this is stuff that is encrypted or encoded or kind of arbitrary ‑‑ which is good, in so far as it’s not ‑‑ oh, sorry.
It’s not just passing information about you and storing it in the clear? You still probably don’t want it? (Laughs)
So cookies can also be used for, like, tracking. This website has like, you know, a lot of different scripts running on it, because media companies work with other, you know, companies that do this kind of audience tracking stuff.
So like, when I was looking at this one, the domain that one of the requests was going to was elsa.memoinsights.com. That’s a weird name, and I don’t know what any of this is. If I type that into the browser, it doesn’t produce a web page?
But when I Google “memo insights,” I find: A company that works with companies to give them, you know, competitive analysis and campaign summaries. I don’t know what these things are, but this is some boutique company that works with Conde Nast, which owns WIRED. Maybe they do something with what I read, and maybe we can learn that people who read WIRED also read the New Yorker, or something.
What are pixel trackers and how do they work?
There are other trackers on the web that are not based in cookies and are a little bit weirder. So, pixel trackers are basically just tiny image files. They’re called this because, you know, sometimes they’re literally just one pixel by one pixel. And the image is hosted on a server somewhere else ‑‑ not on the WIRED website.
It’s hosted by whatever company, who knows, is doing this work. And because the image has to load from this other server, my computer makes a request to that server. And once that request is logged, that server can, you know, get information from my request about my computer: where I’m coming from, how long I spent on the page, what time I accessed it.
If you’ve ever used, like, e‑mail marketing software, or like newsletter software, like MailChimp or TinyLetter, this is usually how those services are able to tell you how many people have opened your e‑mail. They’ll have like an invisible pixel tracker loaded into the actual e‑mail, and will send the information about when that image loaded to the newsletter service.
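A sketch of the mechanic in Python ‑‑ the tracker hostname and parameter names here are invented, but this is the general shape of the image tag such services embed:

```python
from urllib.parse import urlencode

# A tracking pixel is just a 1x1 image whose URL carries data back to the
# tracker's server. When the email or page loads, the browser requests this
# URL, and the server on the other end logs who asked for it, and when.
def tracking_pixel(campaign_id, recipient_id):
    params = urlencode({"c": campaign_id, "r": recipient_id})
    url = f"https://tracker.example.net/pixel.gif?{params}"
    return f'<img src="{url}" width="1" height="1" alt="">'

print(tracking_pixel("spring-newsletter", "user-8841"))
```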
So, and so pixel trackers, they’re sort of sneaky in that they’re like… Again, like, you literally can’t see them on the web page. And they’re not as transparently kind of present? (Laughs) As other things?
What is browser fingerprinting and how does it work?
Another method of tracking users on the internet across different websites is something called browser fingerprinting, which is a bit more sophisticated than cookies. So in the last few years, browsers have become a lot more dependent on and intertwined with a computer’s, like, operating system and hardware. For example, when you join a Google Hangout or a Zoom call! (Laughs)
You ‑‑ the browser is gonna need to access your webcam and your microphone. Right? And those are, you know, parts of the hardware. So there needs to be ways for the browser to talk to those parts of your computer? And that in and of itself isn’t a bad thing. But! It means that if some code is triggered that asks questions about those other parts of the hardware, that’s data that could get sent to another server.
So in this example, the loaded information we’re looking at includes things like browser name, browser version. And that’s stuff that will usually be in a typical request. Like, knowing what kind of browser it is isn’t that unusual? But then we get things like: what operating system am I on? What version of the operating system am I on?
I don’t ‑‑ like, I don’t know why this site needs that information! And I didn’t see any fingerprinting happening on the WIRED website, so I had to go to the YouTube page that the video was on. (Laughs)
There are a lot of more detailed sorts of things that can be, like, pulled into fingerprinting. So like your camera. Like, is your camera on? What kind of camera is it? That can get ‑‑ that can be something that a, you know, browser fingerprint will want to collect. Your, like, your battery percentage, weirdly? It’s ‑‑ and all of this is in the service of creating, like, an ID to associate with you that is definitively your computer, basically.
As opposed to, like, you know, you can actually like erase cookies from your browser, if you want to. Or you can say, like, don’t store cookies. But it’s a lot harder to not have a battery.
In terms of knowing if fingerprinting’s happening, one way to do that in the Network tab is you’re looking for the POST requests, which means that your computer is sending something to another computer. And one way that it can get sent is in a format called JSON, which is an abbreviation for JavaScript Object Notation. Which is basically a format for data that can be processed by the programming language that works in the browser.
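To make the POST/JSON idea concrete, here’s a toy Python sketch of how a pile of harmless‑looking attributes becomes a stable ID. The attribute values below are invented, and real fingerprinting scripts are far more elaborate:

```python
import hashlib
import json

# Each attribute on its own is common; the combination narrows you down.
attributes = {
    "browser": "Firefox 78",
    "os": "macOS 10.15",
    "screen": "1440x900",
    "fonts": ["Arial", "Helvetica", "Comic Sans MS"],
    "battery_level": 0.87,
}

# Serialized as JSON, this is the kind of payload you'd see in a POST
# request in the Network tab...
payload = json.dumps(attributes, sort_keys=True)

# ...and hashing that payload yields a compact ID for "this machine":
# same machine, same attributes -> same ID every time.
fingerprint = hashlib.sha256(payload.encode()).hexdigest()[:16]
print(fingerprint)
```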
Another way, if the Network tab is a little overwhelming: there are browser extensions that can show you more detailed things about what’s going on with fingerprinting.
Additionally, just as a sidenote, browser ‑‑ like, browser extensions are another example of like throwbacks of the browser. The idea that anyone can build, like, extra software for that piece of software? It’s like, no one would ever let you do that to the Instagram app on your phone. And it’s sort of a, it’s kind of a leftover thing from something ‑‑ like, Firefox started doing it in 2004, and then everyone copied them. (Laughs)
But, back to fingerprinting.
Just as far as ‑‑ this is a Chrome extension called DFPM, Don’t FingerPrint Me, which just logs this in a slightly tidier way. So I thought I would show it. And it highlights a couple of examples of ways that this page is currently doing fingerprinting that I might want to know about.
So canvas fingerprinting is a method ‑‑ it sort of describes it here. It draws like a little hidden image on the page that then is kind of encoded to be, like, your fingerprint. I think Firefox actually blocks this by default, so I had to do this in Chrome! (Laughs)
WebRTC, that’s related to your camera and microphone. WebRTC stands for Web Real-Time Communication. That’s basically the tool used for doing web calls. They’ll also look at what fonts you have on your computer, your screen resolution. You can see here the battery level stuff.
So I guess the point I wanted to bring across with the fingerprinting stuff is just that, like, there are lots of different things in play here.
Should we ‑‑ do you think we have time for our bonus round…? Oo, it’s almost 1:00. But I feel like there was ‑‑ I’m hoping, I think there was some interest in this. I don’t know, Olivia, what do you think?
OLIVIA: I just pasted in the chat an answer to the TOR versus VPN question? So we can skip those slides. But it might be useful to kind of rapid‑fire go through safer browsing techniques? Yeah, I just got a “yes please” in the Q&A.
What is a VPN and how does it work?
INGRID: Okay. Quick version of the VPN thing. This is how a normal connection, you know, logs data about you. I go to a website, and it logs this computer came to me! This computer over here.
A VPN basically means that you’re connecting to that computer through another computer. And so your request looks as though it’s coming from kind of somewhere else. That being said, like, it’s ‑‑ you know, there’s still other data. Like, given the point I just made about fingerprinting, there’s other data that could be collected there that’s worth thinking about.
TOR is an acronym for The Onion Router, and the idea is that it wraps your request in multiple layers by sending it through multiple different computers, which are called relays.
So when you use TOR, which is a browser, to connect, it sends your request through this computer and this computer and this computer, and whatever is the last one you were on before you get to the page you want to visit, that’s the IP address that this device is going to log. This last hop in the routing is called the exit relay. Yeah. That was my attempt at being quick. I apologize. (Laughs)
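The layering can be sketched like this; it is purely illustrative (real Tor encrypts each layer rather than wrapping labels, and the relay names here are made up):

```javascript
// Toy illustration of onion routing: the request is wrapped once
// per relay, and each relay peels off exactly one layer, so only
// the exit relay sees the destination, and only the first relay
// sees you.
const relays = ["relayA", "relayB", "exitRelay"];

let packet = "GET example.com";
for (const r of [...relays].reverse()) {
  packet = `${r}(${packet})`; // wrap one layer per relay
}
console.log(packet); // relayA(relayB(exitRelay(GET example.com)))
```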
OLIVIA: Fun fact about VPNs. If you ‑‑ because the United States has different privacy laws than other countries, if you were to connect to a VPN server that was in, for example, the European Union, you might get a lot more notifications from the sites that you normally go to about different cookies and different things that they do with your data. Because in Europe, they’re required to tell you, and in America, they’re not always required to tell you what they’re doing with your data.
What is private web browsing and how does it work?
Oh, I can take it. So this is how, in Firefox, you would open a private window. And private windows, I think we’re all kind of a little bit familiar with them. They clear your search and browsing history once you quit. And it doesn’t make you anonymous to websites, or to your internet service provider. It just keeps it private from anyone else.
But that might be really useful to you if you are using a public computer, or if you’re using a computer that might be compromised for any other reason. Like say if you suspect that you’re going to protest and a cop might take your device from you.
What are script blockers and how do they work?
INGRID: So script blockers, so like the tracking and the little analytic tools and stuff usually are written in Javascript, because that is the only programming language that works in a browser. So there are tools that will prevent Javascript from running in your browser. And that can be helpful for preventing some of those tracking tools from sending data back to, back to some, you know, computer somewhere else. It can be a little bit frustrating, because Javascript is used from all ‑‑ for all sorts of things on websites. Sometimes it’s used for loading all of the content of the web page! (Laughs)
Sometimes it’s used to, you know, make things kind of have like fun UI! So it’s worth ‑‑ it’s interesting to try, if only to see how much of your internet experience needs Javascript? But yeah. There are some tools ‑‑ the Electronic Frontier Foundation has a cool extension called Privacy Badger that sort of learns which scripts are trackers and which ones aren’t as you browse. But yeah, these are extensions that you can install onto a browser.
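The core decision a script blocker makes can be sketched like this; the blocklist entries are invented for illustration, and real extensions like Privacy Badger build their lists heuristically rather than from a fixed array:

```javascript
// Toy sketch of a script blocker's core check: compare the host
// of each script a page tries to load against a list of known
// tracker domains, and block on a match.
const blocklist = ["tracker.example", "analytics.example"];

function shouldBlock(scriptUrl) {
  const host = new URL(scriptUrl).hostname;
  // match the domain itself or any of its subdomains
  return blocklist.some((b) => host === b || host.endsWith("." + b));
}

console.log(shouldBlock("https://tracker.example/fp.js")); // true
console.log(shouldBlock("https://cdn.example/app.js"));    // false
```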
And then firewalls!
What is a firewall and how does it work?
OLIVIA: So firewalls are kind of the first line of defense for your computer’s security. It would prevent, basically, other computers from connecting directly to your computer, unless you like say yes or no. And so… They’re really easy to turn on? On your computers? But they’re not that way by default.
So in a Mac computer, like I’ve shown here, you would literally just go to security and privacy, and go to the firewall tab, and it’s like one button. Turn off, or turn on. And you don’t really have to do much more than that.
And in Windows, there’s a similar process, if you go to the next slide, where you really just go into settings, go into security, and switch the “on” setting. It’s pretty… It’s pretty easy, and it’s kind of annoying that it’s not done for you automatically.
But I recommend everyone to just check out and see, like, hey, is my firewall turned on? Because it’s a really easy step to immediately make your computer much safer.
INGRID: All right! We went through all the slides! (Laughter)
BLUNT: That was perfectly timed! You got it exactly at 1:00.
What’s the difference between a VPN and TOR?
OLIVIA: Okay. So for the TOR versus VPN answer.
As we said just a while ago, TOR uses onion routing and sends your data through multiple computers called TOR nodes to obscure traffic and anonymize you, while a VPN just connects you to a VPN server, often owned by a VPN provider; sometimes you have to pay to use them, and others are free.
So I described it as kind of like a condom? (Laughs) Between you and your internet service provider? So Verizon knows that you’re using a VPN, but it doesn’t know what you’re doing on it, because a VPN would encrypt all your traffic.
It’s really important that you use a VPN that you trust, because all of your internet traffic is being routed through their computer, which is another reason people like to pay. Because you can have a little bit more faith that it’s like a trusted service if you’re paying for it? Even though that’s of course not always true.
But there is Proton VPN, which is one I use that’s free, which is run by the same people who run Proton Mail, which I use. I haven’t had any problems with it.
You can use a VPN and TOR at the same time, which is what the question directly asked. And I believe that your ISP would know that you’re using a VPN, but because you’re using a VPN it wouldn’t know that you’re using TOR. Ingrid, if that’s not true, you can like check me on that.
Because TOR is super slow and it routes your computer through a bunch of different things, it can break a lot of websites, including video streaming like YouTube and Netflix. A lot of people use VPNs, however, so they can access videos or things that are banned in different countries by making it look like they’re in a different place.
But if you’re doing something highly sensitive or illegal, you’d probably want to use TOR, and probably some other precautions, too.
BLUNT: Thank you so much. That was super helpful. Do folks have any questions? Is there anything that people would benefit from sort of like going back and going into in a little bit more detail?
Someone just said: Is there a way around TOR breaking websites? I’ve used it and it throws a lot of captcha tests on regular websites.
OLIVIA: So Cloudflare kind of hates TOR? (Laughs) It takes a really aggressive stance towards TOR users, actually? There was like an Ars Technica article I read that said Cloudflare said 90% of TOR traffic we see is, per se, malicious.
So I don’t know if there’s going to be a time that you can use TOR and not have captchas act up, because Cloudflare sees that kind of activity as malicious activity.
Can Apple see what you’re doing on your computer or phone?
INGRID: “This may be hardware‑related, but does Apple see what you’re doing on your computer because you connect to the internet, e.g. any photos, videos you store?”
Okay, to make sure I understand the question: Is the question whether, like, if you’re using an Apple device, whether Apple is able to see or collect anything if you’re connected to the internet from that device?
Okay. So I think ‑‑ I mean, the answer to that is you would need to kind of tell them to do that? (Laughs)
They’re ‑‑ so like, if you are using something like iCloud to store photos and videos, then yes, they would be able to see and have all of those. But in terms of, like, just being on the internet doing things on an Apple device? Apple can’t personally, like, kind of peek in and see that. I mean, they, like ‑‑ there are, you know, other computers will know that you’re on an Apple device.
But yeah, you have to be directly interfacing with Apple’s network for Apple to be able to have anything on or from your computer.
OLIVIA: And when it comes to things like iMessage and iCloud, they… say? That that information is encrypted. Of course, it’s like not open sourced, so we don’t actually know how they’re encrypting it or what they do. But Apple has said for a while that communications between, like say two iMessage users?
So not someone using it to speak to someone who has an Android; that’s SMS.
But two iMessage users speaking to each other, that’s technically an end‑to‑end encrypted conversation. Apple does collect some information from you when you are initially typing in someone’s number to text them, because it pings the server to find out if that number is associated with an iCloud account.
So for iPhone users, that little moment between when a number that you’re typing in turns either blue or green, in that moment it’s sort of pinging Apple’s servers. So they do have a list of the times that that ping has occurred.
But of course, that doesn’t tell you if you actually contacted the person whose number you typed in; it just knows that you made that query. And that’s the extent, so Apple says, of the information that they collect about your iMessage conversations.
So, yes, they do ‑‑ they can technically see that information? But they tell us that they don’t look at it. So.
Open-source vs. Closed-source Technology
BLUNT: Can you explain a little bit more about open source or closed source technologies?
OLIVIA: Yeah! So, open source technologies are… basically, they’re apps, websites, tools that they’re ‑‑ the code that’s used to write them and run them is publicly available.
When it comes to security technologies, it’s really… best practice to try to use tools that are open source, because that means that they’re able to be publicly audited.
So like, regular security experts can like go in and like actually perform an audit on open source security tools, and know that they work. Versus, you have a lot of paid security tools that you basically assume that they work because people tell you that they work?
And you can’t really, like ‑‑ the public can’t really hold them to any, like, public accountability for whether or not they work or not.
Versus you can actually, like, test the encryption algorithm, say, of Signal, which is a messaging app and all of their code is public information.
INGRID: Open source, it’s also like a way of… kind of letting people developing software kind of support each other, in a way? Because the fact that Signal is open source, it’s not just like oh, we can be accountable if Signal says it’s doing something but it’s not; it’s also a way to be like, hey, I noticed something. Is it working? And you can actually directly contribute to improving that technology.
It’s complicated ‑‑ I mean, the world of open source, it’s complicated in that it’s like, it still has elements of the like… you know, snobby, like, like culture of tech, sometimes? But it, it’s kind of ‑‑ in principle, it’s very like useful for being able to have technologies that are accountable and that kind of have some element of like public engagement and understanding.
How to Choose a VPN
BLUNT: Awesome. Thank you. And so I have another question in the chat: What are some good ways to assess the trustworthiness of a VPN, as you were discussing before?
OLIVIA: The way most people do it, I think, Ingrid, you could check me on this, is kind of by reputation. If you look up how to find a good VPN, you’ll find a lot of articles where people talk about the pros and cons of different ones. And you’ll be kind of directed to the ones considered by the public to be the most trustworthy ones?
INGRID: Yeah. And I think one way I guess I evaluate companies sometimes on this is like looking at their level of engagement with the actual, like, issues that they… of like user privacy?
So like, one of the, you know, things I ended up using as a reference for this workshop as a guide to making ‑‑ as a guide for, like, different browsers, was like a blog post by Express VPN. And they’re a company that, they don’t have to tell me anything about like which browser ‑‑ there’s no reason for them to generate that content.
I mean, it’s good PR‑ish? But they’re not going to get new customers because I’m using a different browser now.
So some of it’s thinking, you know, is it open source or not? What is the like business model? And are they kind of actively, you know, engaging with issues related to user privacy?
We’ll talk a little bit more tomorrow about legislative issues around privacy, and that’s also another way. Like, have they taken positions on particular, you know, proposed laws that could harm user privacy?
To me, those are sort of like, how are they kind of like acting on principles?
OLIVIA: It also might be a good way of checking to see if ‑‑ yeah! If they produce logs in court proceedings, so you know that they don’t track traffic.
Also, to see like, say, certain companies might be funded by other companies that, like, are less concerned about… public safety or privacy or human rights.
So that might also be a good way of like checking to see, like, the integrity of a VPN company. ‘Cause at the end of the day, they’re all companies.
Is WordPress a reputable option for sex workers?
INGRID: All right. The next question: Would y’all consider WordPress reputable for housing a sex worker website?
This ‑‑ thank you for asking, because it lets us kind of talk about something I wanted to figure out how to include in that whole presentation but didn’t.
So… Just as like a point of clarification, and maybe this is understood by people, but maybe for the video it will be helpful… WordPress is both a, like, hosting company and a piece of software. WordPress.com is the hosted platform, and WordPress.org is where the open source software is distributed.
So you can host a website on WordPress’s, like, platform, and when you do that you will be running a website that is built using WordPress’s software. Which is also called WordPress! This is confusing and annoying.
But… you can also use WordPress’s software on another web, like, hosting service. Like, you can install WordPress onto a like hosting service website. I think a fair amount today, like of hosting services, actually do sort of a one‑step click, like they’ll set up a server with WordPress for you option.
In terms of WordPress, like, as the host of a website? And as a host for sex worker websites… I don’t actually know. I would say ‑‑ I would, like, check ‑‑ I would need to go check their terms of service? (Laughs)
I think in general… Yeah. I don’t totally ‑‑ I think with all hosting companies, it’s hard ‑‑ like, they’re, like, figuring ‑‑ figuring out which ones are kind of the most reputable is partly about looking at any past incidents they’ve had in terms of takedowns, or like what their ‑‑ also like where they’re located?
So like, WordPress is a company based in the United States, so they’re beholden to United States laws and regulations. And I’m guessing part of the reason this question was asked is that this person ‑‑ that you probably know a little bit about FOSTA‑SESTA, which makes it harder for companies to allow any content related to sex work on their servers.
And as far as I know, WordPress wants to be compliant with it and hasn’t taken a radical stance against it.
Blunt, do you have any…?
BLUNT: Yeah, I can say I think hosting anywhere on a U.S.‑based company right now has a certain amount of risk, which you can decide if that works for you or not. If you are hosting on WordPress right now, I would just recommend making lots of backups of everything, as like a harm reduction tool. So if they decide to stop hosting your content, you don’t lose everything.
And I also just recommend that for most platforms that you’re working on. (Silence)
Cool. So we have around 15 minutes left. So if there are any other questions, now’s the time to ask them. And… I don’t ‑‑ and if not, I wonder if just chatting Ingrid and Olivia a little bit about what y’all will be covering in the next two days!
Okay, we have two more questions.
Can you reverse browser fingerprinting?
“This may be a digital surveillance question, but once you get browser fingerprinted, is it reversible?”
INGRID: Hmm. That’s actually a question where I’m not sure I know the answer. Olivia, do you know…?
OLIVIA: No…
INGRID: I do know that… you can sort of ‑‑ I know on some, on mobile devices, you can like spoof aspects of your identity?
So, like ‑‑ so I mentioned MAC addresses are sort of this hard coded thing. That’s basically the ID of your device. A phone can actually ‑‑ like, you can actually generate sort of like fake MAC addresses? (Laughs)
That are the one that’s presenting to the world? So if that sort of was a piece of your fingerprinted identity, that’s one way to kind of, like ‑‑ you know. It’s like you wouldn’t be a perfect match anymore? But… Yeah, I don’t know if there’s sort of a way to completely undo a fingerprinting.
Yeah. I will also look into that and see if I can give you an answer tomorrow, if you’re going to be here tomorrow. If you’re not, it will be in the video for tomorrow.
Additional Digital Literacy Resources
BLUNT: Great, thank you. And someone asked: Are there any readings that y’all would recommend? I’ve read Algorithms of Oppression and am looking for more. I love this question!
OLIVIA: That… the minute I heard that question, like, a really long list of readings just like ran through my brain and then deleted itself? (Laughs) We’ll definitely share like a short reading list in the bibliography that we’ll send out later.
BLUNT: Awesome. That’s great.
Okay, cool! This has been really amazing. Thank you so much. I’m just going to say, one more chance for questions before we begin to wrap up.
Or also, I suppose, things that you’re interested in for the next two days, to see if we’re on track for that.
How do fintech companies use digital surveillance?
Someone asks: This is a fintech‑related question for digital surveillance, but can you talk about how that kind of works internet‑wise?
INGRID: Fintech…
BLUNT: For financial technologies. And how they track you. Oh! So like, if you’re using the same e‑mail address for different things? Is that sort of on the…?
OLIVIA: Like bank tracking? Like money type of…?
INGRID: So… Depending on the, you know, like financial servicer you’re working with, like PayPal or Stripe or whatever, they’re going to have ‑‑ like, they ‑‑ like, in order to work with banks and credit card companies, they are sort of expected to kind of know things about you.
These are like related to rules called KYC, Know Your Customer. And so part of the tracking or like ‑‑ or, not tracking, but part of information that is collected by those providers is a matter of them being legally compliant?
That doesn’t mean it produces great results; it’s simply true.
And I think the ‑‑ in terms of the layer ‑‑ I’m trying to think of what’s ‑‑ I don’t know as much about whether or not companies like Venmo or… Stripe or PayPal are sharing transaction data? I’m pretty sure that’s illegal! (Laughs) But… Who can say. You know, lots of things happen. That would be capitalism.
BLUNT: I also just dropped the account shutdown harm reduction guide that Ingrid and Hacking//Hustling worked on last year, which focuses a lot on financial technologies and the way that, like, data is sort of traced between them and potentially your escorting website. So that was just dropped into the chat below, and I can tweet that out as well in a little bit.
Zoom vs. Jitsi: which is more secure?
OLIVIA: Privacy/security issues of Zoom versus Jitsi… I also prefer to use Jitsi when feasible? But I also found that call quality kind of drops really harshly the more people log on. Like, I don’t think we can actually sustainably have a call of this many people on Jitsi without using like a different ‑‑ without hosting Jitsi on a different server.
Concerning how I handle the privacy/security issues of Zoom, they’re saying they’re going to start betaing end‑to‑end encryption later this month. I don’t know what that actually even means for them, considering that they’re not open source, right?
But I do say that one of the things that I tend to try and practice when it comes to, like, using Zoom, is kind of maintaining security culture amongst me and people who we’re talking to. Right? So I’m never going to talk about, like, any direct actions, right, that are going to happen in real life on Zoom. Refrain from just, like, discussing activity that could get other people in trouble anyway.
Like, while it would be nice to have, like, say this kind of conversation that we’re all having now over an encrypted channel, I think it’s generally much safer and ‑‑ I don’t like using the word “innocent,” but that’s like the word that is popping into my head, to talk about ‑‑ to use Zoom for education, even if it is security education, than it would be to actually discuss real plans.
So… It might be really beneficial to you if you are, like, say, having ‑‑ using Zoom to talk to a large group of people about something that is kind of confidential? To talk over, like, Signal in a group chat, or some other encrypted group chat platform, and decide like, okay, what are you allowed to say over Zoom, and what you’re not allowed to say. And to think of Zoom as basically you having a conversation in public.
Assume for all of your, like, Zoom meetings that someone’s recording and posting it to ‑‑ (Laughs)
YouTube later! And that would probably be… that would probably be the most… secure way to use it, in general? Is just to assume that all of your conversation’s in public.
BLUNT: Yeah. I totally agree, Olivia. And that’s why this is going to be a public‑facing document. So, Zoom felt okay for us for that.
INGRID: Yeah. I mean, I think another way I’ve thought about this with Zoom is like, just remembering what Zoom’s actually designed for, which is workplace surveillance? Right? It’s like, you know, its primary market, like when it was first created, and still, is corporations. Right?
So there’s lots of ‑‑ so like also, when you’re going into like ‑‑ even if you’re going to a, you know, public Zoom thing that is, you know, about learning something. Like, whoever is managing that Zoom call gets a copy of all of the chats. Right?
And even if you’re chatting like privately with one other person, that message is stored by ‑‑ like, someone gets access to that! And… Mostly just that’s something to… like, thinking ‑‑ like, just keep in mind with, yeah, what you do and don’t say. Like, especially if you are not the person who is running the call.
Think about what you would or wouldn’t want someone you don’t know to kind of like have about you.
What’s to come in the digital literacy lunch series?
BLUNT: Awesome. Thank you so much. Do you want to start to wrap up and maybe chat briefly about what we’ll be seeing in the next two sessions?
OLIVIA: Sure, yeah. So the next two sessions are going to be one talking more about how platforms work and sort of the whole, like, algorithmic ‑‑ bleh! (Laughs)
Algorithmic curation, and how misinformation spreads on platforms, and security in the Twitter sphere, rather than just thinking about using the internet in general. And then the third will be talking more explicitly about internet surveillance.
So we’re going to be talking a little bit about surveillance capitalism, as well as like state surveillance, and the places where those intersect, and the places where you might be in danger and how to mitigate risk in that way.
Two organizers from Hacking//Hustling were rejected from speaking at last year’s Lesbians Who Tech convening in San Francisco, which took place shortly after SESTA-FOSTA was signed into law. Hacking//Hustling provided a partial scholarship to Baby Fat (@babyfat.jpeg) to attend and make sure that there would be sex worker representation at the conference. Baby Fat’s reflections on her experience at Lesbians Who Tech as a sex working Femme are below.
A few months ago, I was able to attend my first Lesbians Who Tech summit thanks largely to the support of my community. At the time of attending I was working as a digital media associate at a Queer healthcare nonprofit. Most of my 9-5 background has come from my work in Queer nonprofits, working mostly in direct outreach. For the last three years I have worked in tech specific positions within nonprofits, skills which I was able to acquire because of my hustling. I’m from a nontraditional background, but hustling has taught me everything I know about tech, marketing, and community management.
It’s worth mentioning I was able to attend the conference because I was awarded a partial scholarship. I attended the summit because I have always had a passion for social media and believe in its ability to connect community and provide accessible education, especially as it relates to Queer sexuality and wellness. From a hustling perspective, it’s the best way for me to engage and advertise to those who utilize the multitude of my services. Post-FOSTA/SESTA, I have had to rely even more heavily on social media and have since begun operating more discreetly.
While the conference was exciting and I was able to connect with some great folks, I often felt that some overall nuance was missing. There was a lack of intentional conversations around gentrification, sex work, and Queer complacency. Navigating the space as a fat femme sex worker was complex and exhausting at times, between being unable to fit in certain seating, being talked down to by masc attendees, or feeling uncomfortable disclosing the extent of my work. Because the bulk of my 9-5 career has been in nonprofits, a majority of the conferences I have attended have been specifically dedicated to Queer theory, resistance, and community building. However, these spaces often fail to see the importance of tech within these movements and have been slow to adapt to the changes tech has created in communities. I think LWT is doing better work than most other tech specific conferences, but I do think they could benefit from adopting some of the approaches and topics Queer nonprofit conferences have.
Throughout the summit, I heard no mentions of gentrification from LWT leadership, which felt especially out of place considering that LWT seeks to empower the very people gentrification disproportionately affects. While gentrification has been a popular conversation in tech spaces, having been discussed at length, I can understand how it might feel like it doesn’t need as much attention. But I still feel it’s incredibly important to have some intentional dialogue and education around it. I’m from Chicago, and the city’s recent tech expansion and attempt at becoming a global city has reinvigorated the conversation around gentrification and tech. If LWT truly aims to create a more intersectional and diverse tech workforce, then they need to fully engage the communities that are being displaced by tech gentrification. LWT leadership needs to recognize they have a platform to educate and incite change. Choosing not to talk about gentrification is choosing to be complicit in it.
At the root of complicity are respectability politics, something LWT engages in heavily in order to maintain funding, connections, and a respectable reputation. But with these politics comes the erasure of some folks who rely on tech for their safety and economic stability. Sex workers have always been at the forefront of using and building the popularity of tech platforms and services. Between navigating digital banking, advertising online, and censorship on social media, sex workers utilize tech at significant rates. Sex workers made Cash App and Venmo mainstream, and continue to be a driving force behind both platforms’ growth. But both, as well as most social media platforms, have made it increasingly difficult for sex workers to continue using them.
I went to LWT knowing that there were no formal mentions of sex work in the programming, an oversight considering the historical connections between sex work and Queer folks. After all, pride was started by Marsha P. Johnson, a Black Trans woman, and a sex worker. Countless other Queer revolutionaries like Sylvia Rivera, Amber L. Hollibaugh, and Miss Major among numerous others have been on the front lines of Queer liberation. But as Queer folks have become more assimilated into mainstream culture, Queer sex workers have been pushed farther to the fringes by their own communities.
Whenever sex work came up in casual conversation with other attendees, the mention of it would make them uncomfortable. When I disclosed my experiences in navigating social media as a sex worker, I could feel them try to calculate what type of work I did. It felt like I had to prove my credentials and cleanliness to them. A few people inquired what type of sex work I did, and I generally got the feeling from them that some forms were more acceptable than others. Oftentimes folks would withdraw from the conversation or, worse, explain to me how they knew things were “difficult” because they read a Vice article once. When I pressed them for ways they were working to make their companies and products better for sex workers since they read a Vice article, they often said there wasn’t much they could do because they weren’t a decision-maker or programmer. But I think that’s just coded language for “I don’t want to do anything.”
I don’t think it’s a matter of people not understanding the difficulties sex workers face while trying to navigate tech. I think it’s an issue of respectability politics; additionally, those that are willing to make change are unsure where to start. Sex work, despite what sex positivity would have you think, is still incredibly stigmatized, especially within educated Queer spaces, like LWT. Leadership at LWT has the power to educate attendees on the nuances of tech and sex work and can impact attendees to do more within their positions, but once again, they choose not to.
The high point of the conference for me was being able to see Angelica Ross speak. Ross has been incredibly vocal about the importance of sex workers in tech and has provided visibility to the larger movement. I want to see more dialogue around sex work, and sex workers speaking and facilitating conversations, specifically at LWT in the future. Additionally, I would like to see LWT engage more with sex workers by partnering with sex worker specific organizations and speaking about sex work more vocally on their digital platforms. I think engaging more sex worker based organizations would encourage more sex workers to attend, and if anyone needs better tech, it’s sex workers.
Publicly talking about sex work not only educates civilians on the nuances of tech and sex work but also actively destigmatizes sex work in tech spaces, making it easier for folks to openly (and comfortably) talk about their narratives as sex workers. I’m critical of LWT because I want it to succeed, I want people to feel comfortable and for tech to be reclaimed.