The Coding of Risk: From Sex Work to Sanctions

Danielle Blunt: Welcome everyone. Today we are here for a panel on The Coding of Risk: From Sex Work to Sanctions. 

Before we get started, we wanted to let you know that it’s totally okay to take screenshots or tag us on Twitter: @hackinghustling or @article19org.

So hello, I am Danielle Blunt. I am a sex worker and co-founder of Hacking//Hustling, a collective of sex workers and accomplices working at the intersection of tech and social justice to interrupt state surveillance and violence facilitated by technology. I’m also a public health researcher, currently researching the financial discrimination of sex workers. 

Today’s panel explores the overlap in financial discrimination and censorship of citizens of sanctioned countries and sex workers. The goal of the session is to explore how algorithms expansively code and therefore produce categories of marginalization and the processes that control the flow of information and capital. How are communities coded as high risk by financial institutions? How does a high risk assessment impact people’s lived experiences, increase vulnerability and disrupt mutual aid? 

Joining us for this conversation today, we have Mahsa Alimardani, an internet researcher at Article 19 focusing on political communication and freedom of expression online, who is also a doctoral candidate at the University of Oxford.

We also have Afsaneh Rigot, a researcher focusing on law, technology, LGBTQ, refugee, and human rights issues. She’s a senior researcher at Article 19 on the Middle East and North Africa team, focusing on LGBTQ rights, police tactics, and corporate responsibility. Afsaneh is a fellow at Harvard Kennedy School’s Technology and Public Purpose program, furthering her work on corporate responsibility and her Design From the Margins framework. She is an affiliate at the Berkman Klein Center, where she continues work and research on the use of digital evidence in prosecutions against MENA LGBTQ communities.

And we also have Lorelei Lee, a writer, sex worker, activist, organizer, juris doctor, justice catalyst fellow, a co-founder of the Disabled Sex Workers’ Coalition and the Upstate New York Sex Workers’ Coalition, a founding member of Decrim Massachusetts, and a researcher with Hacking//Hustling. Their writing appears in n+1, The Establishment, Spread, Denver Quarterly, The Feminist Porn Book, Coming Out Like a Porn Star, We Too, Hustling Verse, and elsewhere. Their book, Anything of Value: Looking at sex work through legal history, memoir and cultural criticism, is anticipated in 2023. So to start off, I’d love for all of you to introduce yourself a little bit further and what perspectives and insights you’re bringing to this conversation. Lorelei, I’ll go ahead and start with you.

Lorelei Lee: Yeah. Hi everyone. I’m so happy to be having this conversation. Just the few meetings that we’ve had have been so generative. So I’m really thrilled to be talking about this here. 

I started to think about some of the comparisons between the treatment of people profiled as being in the sex trades and the treatment of people profiled as being Arab or Muslim in the United States when I was working at the Center for Constitutional Rights and noticed a bunch of their work where they were seeing folks being de-platformed for doing Palestine solidarity work, for example. By de-platformed, I mean kicked off of tech platforms and kicked off of financial tech platforms: not being allowed to accept donations, et cetera. And I just noticed that that is the same thing that happens within the sex worker community.

We all know, you know, that it happens to all of our friends. So I started to think about this similarity in what I think a lot of people think of as two very different groups of people. Now, of course, that’s not true. We are overlapping groups of people, but maybe something else that is an important point to come out of this is how, to the government, we are very similar types of people. So in the United States, we have legal regimes that target trafficking, we have legal regimes that target terrorism, and we have sanctions regimes. Each of those regimes is made up of a group of laws. They claim to target a very specific and horrifying type of violence, but they do so through vaguely worded statutes that are very broad. And they’re intentionally broad in order to allow for discretion in their enforcement. It’s good for government officials, it’s good for prosecutors, and it’s good for Congress if these laws are thought of as targeting this bad thing, and the bad thing could be everywhere. So they’re pretending that there are clear lines in the language they use to discuss it while simultaneously making the laws very ambiguous so that they can define it as they go. And that has caused all kinds of problems in relationship to tech companies specifically, but also any companies really. And we’ve seen this kind of profiling; actually, what’s happening is an over-compliance with these laws. Because these laws are vague, companies don’t want to take legal risk, of course. And so they will make their rules be the furthest they can be, out on the edge of where the rule could be. So, you know, the behavior that the law is targeting is way over here, and then things that are thought of as adjacent to that behavior, but really people who are thought of as adjacent to that behavior, are all the way over here.

And this is where they’ll draw the line. And so they’ll develop policies that are actually meant to profile and exclude. So I think that this is an intentional scheme on the part of the U.S. government, as well as on the part of corporations, in which the government can claim that these problems, these types of violence, are external to the United States, or external to the social body that is thought of as the good American citizen. With trafficking, it’s very explicitly framed as being about people who are socially deviant in a way, and the folks who get profiled under that regime are sex workers, regardless of whether trafficking has happened in their place of work or not. Under the anti-terrorism regime, Muslim and Arab people are targeted. And under the sanctions regime specifically, Iranians are targeted. All of these are folks who the government feels don’t need to be part of the social body, and in fact, it’s better for the government if we are not. And corporations, similarly: if you externalize these harms, then you can claim that your actions are always good, that this doesn’t happen inside of my organization, it only happens outside of it, and so I only have to focus outside. Specifically with anti-trafficking protocols, there have been a lot of instances where companies have horrifying worker treatment and no labor rights within the company, but that’s not important; that’s pushed to the side in order to focus on what is thought of as the most extreme form of labor violation, which is trafficking. So this kind of division is happening through this scheme. And I had one more point that I wanted to make, which is really just to note again that these are schemes that criminalize people while they pretend to criminalize behaviors.

Danielle Blunt: Thank you, Lorelei. Mahsa, would you share a little bit about your perspectives and insights that you’re bringing to this conversation?

Mahsa Alimardani: Definitely. So I want to start from a point that Lorelei just made, which was the intentionality of these laws, which I think is very interesting, because my work centers on access to the internet and freedom of expression online in Iran, and this is very much a value that the U.S. government (inaudible) to be behind. And, you know, the State Department constantly talks about their belief in freedom of the internet; even under the Trump administration we’ve seen these values reiterated. And when we’re looking at Iran, it is naturally a country that has a repressive government, a repressive system in play, that is causing a lot of hurdles to access to information and freedom of information online.

But on the other hand, there are restrictions put into place by the U.S. government, and many will argue, well, if the U.S. government didn’t have these restrictions, it would still be repressive for internet users in Iran, which is true. It would be. But if we’re concentrating on policies in the U.S., this is the low-hanging fruit we can change: easing access through these policies. And so, as Lorelei mentioned, there is a series of restrictions placed on how Iranian users inside Iran, and sometimes even outside of Iran, have access to certain services. How this manifests itself for freedom online is very interesting.

So we have sanctions, and sanctions in place typically mean that U.S. entities cannot provide services or do business with Iran. But there have been provisions, again because of the value of freedom online, where there is actually a license given by OFAC, the Office of Foreign Assets Control at the U.S. Treasury, called General License D, which is supposed to allow for services that enable personal communications. How this gets misinterpreted, and how we see platforms like Slack being blocked for Iranians, has a strong component to do with the topic at hand today, which is financial discrimination. The laws get interpreted in a way that argues that certain services being provided to Iran or Iranian entities are breaking sanctions because they are involved in financial transactions. Oftentimes they are not. And so I guess that’s where this comes into play.

This is the intersection between what U.S. values are, how they’re being put into play, how companies are over-complying with these laws, a word that we’ll probably use a lot today, and how the U.S. government is not necessarily making it easier. Part of the work that I’m doing at Article 19 is working on different ways to persuade the companies to, A, stop over-complying, and, B, for the U.S. government to make a more rigid General License D, one that won’t be subject to the interpretations that companies like Apple, Google, and Slack have been making, which have misidentified certain services that are vital for access to the internet and freedoms online as financial transactions. I mean, there is a wider discussion about why Iranian users have to be subjected to these financial hurdles at all, and that is a very valid conversation, and I think we’ll get into it as well, but I just wanted to put into context how this is a major issue for freedom online.

And I think I’m getting a message that I, there’s some issues with my microphone sound, and I will try to fix that. And I apologize if that was getting in the way.

Danielle Blunt: Thank you so much. And Affi, I’d love to hear from you about what you’re bringing to this conversation.

Afsaneh Rigot: Yeah. Hi. Just quickly, apologies – narrow video here so it might be a very dark screen, but it’s RightsCon, we’ll deal with it. 

So it’s a fascinating conversation, and I’m really glad to be part of it in this sort of bridging of the different work and activism and advocacy that different communities have been doing around this concept of financial discrimination. I think having conversations with yourself and having conversations with Mahsa and Lorelei and folks like Kendra, our mutual friend at the (inaudible) clinic, and so on has highlighted this concept of the same sort of patterns and systems and methodologies used to target individuals who are deemed high risk. And, you know, we’ll get into this conversation about how high risk is presented and who gets to choose what high risk means and how the sort of methodologies employed throughout these different procedures, whether it’s trafficking, whether it’s money laundering, whether it’s terrorism, whether it’s sanctions, they tend to use the same sort of methodology for all of them and, you know, as we’ll get into, apparently the same databases. 

So when we’re discussing these sorts of topics and bringing them into consideration of the impact of tech on marginalized communities, which is something I deeply focus on in my work, specifically the impact on queer communities in the MENA region, I see some of the same things often: the use of morality framing to target those most marginalized through law enforcement, and the justification of targeting those most marginalized, those seen as low risk in terms of prosecutions, because the prosecutions can happen so quickly. It hits a nerve similarly when it comes to these concepts around who gets blocked off from access to financial institutions, who gets blocked off from access to platforms, and so on, and the use of this sort of morality framing that Lorelei was really beautifully outlining: how they try and frame who is high risk in these situations.

I think whoever has done any sort of study into enforcement and police enforcement sees the overwhelming similarities in how they use these sorts of frames and who becomes a target of them. So I come to this from the perspective of, one, being an Iranian who has been subject to these issues, from family to friends, but also as someone who’s been working as part of Article 19, supporting Mahsa’s work on sanctions, and also looking at the way these sorts of blockages affect the queer communities that I look at, not in the way of getting thrown off, but in the way of just not having any access, knowing that you’re not welcome. And that’s something we’ll talk about later.

But yeah, I’ll leave it at that: I come to this conversation with just a lot of solidarity and joy that these advocacy projects and movements and work are happening, including from Hacking//Hustling and the work that has been happening on Iran sanctions. We should be bringing that all together, because it’s one common methodology used around the most marginalized within these contexts, and it makes a lot of sense for us to be clamoring together to highlight how it’s affecting those folks that fall under their risk appetite. So I’ll leave it there.

Danielle Blunt: Thank you, Affi, and thank you for bringing up the importance of cross movement work and building solidarity across movements so we can work together and build knowledge together and build power together. 

So in our conversations, we’ve been talking about some of the parallels in anti-trafficking laws, anti-terrorism laws, and sanctions, and Lorelei, in your intro you were talking a little bit about how these laws are really about policing people, not behavior. I was wondering if you could talk a little bit about some of the parallels in anti-trafficking laws and anti-terrorism laws and how these can lead to the production of high-risk communities and the subsequent financial discrimination of marginalized communities.

Lorelei Lee: Yeah. So I think the parallels, and I’m going to apologize, I may repeat something that I have already said, but I think the parallels are really in the enforcement. There are parallels in the language too. So if we’re talking about trafficking in the United States, we’re talking about a couple of different major laws: the Trafficking Victims Protection Act, which has been modified by FOSTA, which a lot of people have heard of recently. It passed in 2018; that’s the Allow States and Victims to Fight Online Sex Trafficking Act. And then FOSTA also modified something called the Mann Act, which was passed in 1910 and was originally called the White Slave Traffic Act. Each of these is a law that claims to target trafficking. However, the Mann Act, for example, in the decades after it was passed, started to be used to prosecute things like adultery, you know, having a mistress and crossing state lines with her. It was always a “her” at that time, because the Mann Act was specifically about crossing state lines with women for immoral purposes; that was the language of the law at that time. And in fact, a Mann Act case was also brought against a woman who traveled by herself on a train across state lines. And they said that even though the language of the law made it sound as though you would have to be transporting someone else and not yourself, they needed to prosecute her in order to be sure of getting at the entirety of the evil the law was trying to target. (laughs) Similarly, we have FOSTA, which we don’t know how it’s going to look, but we do know that FOSTA expands the federal trafficking statutes and also puts prostitution, for the first time, into federal law as a criminal statute.
What FOSTA actually says is that you can’t use a computer service, and there’s a bunch of legal language around what a computer service is, but internet technology, to promote or facilitate the prostitution of another person. And so that’s the first time that we really have prostitution becoming a federal crime. And then the TVPA is the law that was passed in 2000, around the same time as the Palermo Protocols internationally, and it very specifically creates a definition for what trafficking is and what sex trafficking is. However, it defines sex trafficking as the exchange of sexual behavior for anything of value, and so it encompasses any kind of trading sex, not just coerced or forced sex trading. It only criminalizes “severe forms” of sex trafficking, for which they have a different definition that does include force, fraud, and coercion. And then there are the anti-terrorism statutes, called the material support statutes, which prohibit the provision of material support to foreign designated terrorist organizations. The United States keeps a list on which it designates organizations as FTOs, and the definition of what it means to provide material support has become very broad. It’s similar to these other laws. In a case out of the Ninth Circuit, HLP, the Humanitarian Law Project, an American organization, wanted to do some work with organizations in other countries that had been designated on the foreign terrorist organization list. But the work that they wanted to do was human rights work. They wanted to specifically provide training in nonviolent dispute resolution methods, such as filing complaints with the U.N., et cetera.
And the court decided that providing nonviolence training was providing material support because, for the organization that had been designated, it was going to be fungible with resources that they could then put toward buying weapons, for example. If they don’t have to pay for peaceful mediation services, they can use money to buy weapons. So the idea here is that this thing that is pure speech is now fungible with money. And it made me think, Mahsa, of what you were saying, because I do think that’s what’s happening with these tech platforms also. We have a lot of blurring. Is this about money? Is this about speech? And is it really just about excluding folks from everything, from having access to financial technologies, from having access to platforms at all? And I mentioned that one of these laws was originally called the White Slave Traffic Act, and I think that’s really important to bring up, because I do think that all of these laws are rooted in racial capitalism, in systems of racial capitalism in the United States, in the criminalizing of some behaviors, but specifically behaviors by some people. And when you have something like the White Slave Traffic Act, what you’re doing is trying to remove women who are assigned to this thing called whiteness, trying to preserve that thing called whiteness by removing them from situations where they might be having interracial sex, while simultaneously continuing to criminalize, for example, Black women in the sex trades, which is what happened after the passage of that law. It was really a building of a rescue industry and simultaneously a criminalization industry.
And so I know I’m sort of going off on tangents that aren’t direct to the topic today, but I think you can probably see how all of this leads to a kind of exclusion online that is just a progression from an exclusion that’s been going on for hundreds of years.

Danielle Blunt: Thank you, Lorelei. Yeah, I hear what you’re saying about how broad these laws are, which can lead to a gross over-enforcement of them.

So Affi, this question is for you, how does the coding of risk happen at financial institutions? And could you talk a bit about the process by which individuals and communities are deemed to be high risk?

Afsaneh Rigot: Sure. Also, thank you Lorelei and Mahsa for those outlines. And, I mean, just following up on this notion of the vague and broad scope that these institutions are being provided with in terms of implementation, we’re going to be seeing it in terms of risk profiling. And you can’t see my fingers here, but I’m making air quotes: “risk” profiling. (laughing)

So I think it’s interesting to do this with a case study of a particular bank that I managed to get some insight from in terms of how they’re doing some of their risk profiling.

This is a bank based in the UK, but the sort of work and methodology they use is replicated throughout. Different banking systems have different risk appetites, so some of these percentages might differ and some of the software might differ, but the notion remains the same. And one thing to keep in mind about risk profiling, and this is something that Blunt, you and I were talking about in other conversations, is this deputization of people to take on the profiling and methodology. So, say, for financial profiling, they’re doing that sort of online profiling of accounts. There’s this methodology of using third-party due diligence software. If you’re a banking system, you will often be using your own databases and checking your accounts against broader, larger international databases, such as, to give you a few names here, LexisNexis, World-Check, and NetReveal. So depending on what software the bank itself has, it’ll usually check its data against something like LexisNexis and World-Check.

A little bit of background, because I went down the black hole looking at the rubbish these companies get into. LexisNexis and World-Check are owned by RELX and Thomson Reuters respectively, who do data brokering on the side of their publishing and journalism, which is always nice to hear. So in terms of this data brokering and selling of data, what happens is that there are millions of profiles; the last I checked, World-Check had 2.5 million profiles, with 25,000 new profiles added monthly. These are profiles seen as high risk, whether on terrorism, on trafficking, and so on. Checking against this, the bank will look at whether or not a profile matches, depending on their risk appetite, on a 25, 30, 50, 75, or 100 percent matching scale. And then you bring in the human element, the employees who will go through these profiles and check them.
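The matching process described here can be sketched in a few lines. This is an illustrative simplification, not any bank’s actual software: the watchlist entries, the threshold, and the `screen_name` function are all hypothetical, with generic fuzzy string similarity standing in for whatever proprietary scoring the real due diligence tools use.

```python
from difflib import SequenceMatcher

# Hypothetical watchlist entries, standing in for the millions of
# "high risk" profiles aggregated by third-party due diligence databases.
WATCHLIST = ["Mohammad Example", "Al Hammadi Trading Co"]

# The bank's "risk appetite": the match score above which an account is
# escalated to human review (the panel describes 25-100 percent scales).
RISK_THRESHOLD = 0.50

def screen_name(customer_name):
    """Return (entry, score) pairs whose fuzzy match exceeds the threshold."""
    hits = []
    for entry in WATCHLIST:
        score = SequenceMatcher(None, customer_name.lower(), entry.lower()).ratio()
        if score >= RISK_THRESHOLD:
            hits.append((entry, round(score, 2)))
    return hits

print(screen_name("Mohammad Example"))  # a common name matches a listed profile
print(screen_name("Quixotic Zebra"))    # no match, account passes
```

The point of the sketch is that the score is computed on the name alone: anyone sharing a common name with a listed profile crosses the threshold and lands in front of a reviewer, regardless of whether they are the same person.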

What triggers these risk alarms differs between situations. With sanctions and countries, as Mahsa has explained and will continue to explain, it’s really not that complicated. Literally the name of the country coming up will trigger it; the name coming up in any sort of format, whether it’s the name of a human rights organization or the name of a restaurant, will trigger this sanctions flag and ban. And often the sort of framework the banks will have is that you can open up accounts for Iranians, but you can’t open up accounts for anyone who has any sort of direct or indirect dealings with Iran. What does indirect mean? Who knows? It’s up to them to decide. Does indirect mean having the name in your organization’s title? Yes, apparently it does. So that sort of sanctioning and risk assessment is quite automatic. You also have countries like Cuba that fall into this. The U.S. is the only country that has a sanctions regime against Cuba; however, banking regimes around the world abide by the U.S. regime on this. When it comes to Syria and Iran, it’s more international, but it kind of shows you where the priority is set when it comes to risk assessments. So places like North Korea, Iran, Syria: immediate ban, and then they check later. Then there are the terrorist designations. This insider of mine has given me some really nice quotes and tidbits to understand it. Literally, a name like Mohammad will be flagged every time, because that’s the sort of risk assessment they’ve put out there, and then it has to be the human element that goes in and looks at that profile. They were mentioning that the prefix Al, as in Al Hammadi and so on, very common in names across Arabic-speaking regions, will pop up as a risk for terrorism.
And, you know, it depends on the bank and the sort of threshold they put on, but you can see the pure racism and the discriminatory nature of this quite clearly.

When it comes to sex workers, very frankly, I was told that any Vietnamese names get flagged as potential victims of exploitation, because they have associated Vietnamese names with trafficking in these contexts. But the interesting part of this is also the vague method they use around it. For example, when it comes to sex workers, there’s absolutely no distinction between actual sex work, trafficking, or exploitation; it’s all under the same umbrella. The slides that I was seeing were all about how to identify victims of exploitation. No differentiation there, and no training that there’s anything outside these realms. And it’s that sort of framing within the algorithms and the deputization that creates this risk assessment for sex workers: any sort of words will pop up, sugar, darling, emojis, as you’ve mentioned to me before, although it didn’t come up in that conversation; very little things are seen as a trigger for potential trafficking and the closing of these accounts. So, you know, these vague concepts around what is risk not only revolve around very racist, discriminatory frameworks and geopolitics, but they’re also being passed on to these (inaudible) that we’re talking about, who are being trained in the same way.
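The kind of trigger rule described above, where a country name or a word like “sugar” appearing in any format flags the account, amounts to simple substring matching. A minimal sketch, with entirely invented trigger lists and account names, of how such a rule sweeps in a restaurant or a friendly payment memo exactly as it would a sanctioned entity:

```python
# Invented trigger lists illustrating the categories described above.
SANCTIONS_TRIGGERS = ["iran", "syria", "cuba", "north korea"]
TRAFFICKING_TRIGGERS = ["sugar", "darling"]

def flag_account(account_name, memo=""):
    """Flag an account if any trigger appears anywhere in its name or memo."""
    text = f"{account_name} {memo}".lower()
    flags = []
    if any(t in text for t in SANCTIONS_TRIGGERS):
        flags.append("sanctions")
    if any(t in text for t in TRAFFICKING_TRIGGERS):
        flags.append("trafficking-risk")
    return flags

print(flag_account("Iranian Cultural Restaurant"))      # tripped by "iran"
print(flag_account("Payment", memo="thanks darling!"))  # tripped by "darling"
print(flag_account("Acme Bakery"))                      # passes
```

Note that the rule has no concept of context: the substring “iran” inside “Iranian Cultural Restaurant” is indistinguishable, to this logic, from a direct dealing with a sanctioned state.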

One of the other things that was really fascinating to see in this is that there’s a moralization in the training that’s happening for folks who are enforcing sanctions, terrorism, or trafficking regulations within the bank. This one insider, in fact, was telling me it felt like pure brainwashing, because you’re being told you’re enforcing sanctions regimes on X, Y, and Z countries because you’re helping them abide by human rights provisions. You’re enforcing anti-terror legislation this firmly because, if you don’t, you’re going to be responsible for terror acts in specific situations. You’re doing this profiling of sex workers because you’re trying to stop exploitation. So this moralization is really problematic in this context. And, you know, one of the questions was, do you ever get asked or told about the folks who get stuck in the middle, who have been marginalized and thrown out of these services, and who don’t fall under any of these categories? And the answer was clearly no.

So obviously, when it comes to risk assessments, I can very, very deeply see that this incentive toward over-compliance rather than compliance is a huge deal, because it becomes about ensuring that there are no fines against the company. Even though they have these moralization presentations, they also talk about the fines and revenues: if you don’t do it, we’re going to get fined big time, and then they show them these huge cases. The final point I’m going to make is that it also really depends on the institution. If the customers of an institution are from a particular country or particular demographic, they’ll have more robust systems in place, because again, it becomes a revenue-based incentive for them to maybe not enforce further. I’ll stop there. It’s all kind of trash, but there we are. These are the systems we work with.

Danielle Blunt: Thank you so much, Affi. I think it’s such an important point about the human role in these automated flagging systems: the process by which companies are deputizing employees as an extension of policing, and how the state is in turn deputizing companies as an extension of policing as well. So the biases of the employees, which are shaped by these propaganda-like trainings, are impacting what’s happening here.

And so Mahsa, my question for you is, how do these processes impact people living in sanctioned countries? And how do these processes parallel issues of internet access in sanctioned countries?

Mahsa Alimardani: So there’s two ways to answer this. One I’ve already touched upon, which is the very tangible access to platforms and services that is hindered by sanctions and these systems. And the second is a question of content moderation, which I know has been very popular and much talked about throughout this year’s RightsCon; how Iranians are able to freely express themselves is somewhat subject to these systems as well. And I’ll start with this, because Affi did this really great job of framing the kind of systems in place to do this. Similarly, there are systems in place within platforms like Facebook or Instagram that are doing this.

I mean, today a coalition that I’ve been part of just launched a big campaign asking Facebook to do a public audit of its content moderation on Palestine. And this is a big step forward, just because the failures Facebook has been making in content moderation recently have been so egregious for Palestine, but hopefully this will be a first step for the wider region.

There’s some really great research that Dia Kayyali has been doing with Mnemonic, and I think previously with WITNESS, where they were actually examining the algorithms in place and how these algorithms were flagging content based on just the IP addresses of users within the Middle East region, or the use of Arabic script within the region, which is used for Persian, Urdu, and Arabic. These things automatically flag the content for their systems to do extra moderation and extra removals under their policies on terrorist content and terrorist organizations and individuals, which is an actual Facebook policy. And so we saw this kind of mass eradication of “terrorist individuals and organizations” back in April of 2019, in particular in Iran, because the Trump administration decided to designate the Revolutionary Guards as part of the FTO list. Naturally, I’m not here to argue that the Revolutionary Guards are by any means a good actor, but they are such a pervasive presence within Iran.

I mean, men have to do mandatory military service in Iran, and there’s roughly a one-in-four chance that mandatory military service will land you in the IRGC. And so I’ve known of human rights activists who had to serve within a division of the Revolutionary Guards and who have been subject to certain sanctions or checks or denials of boarding flights because of this past they had no control over. The Revolutionary Guards also have a pervasive influence over the banking sector and a lot of the private sector in Iran. So it’s kind of hard to avoid them if you’re Iranian, and everyday politics, politicians, and news in Iran very much revolve around the Revolutionary Guards. So if journalists and human rights activists want to just freely talk about day-to-day activities, they find massive hurdles. For example, in January of 2020, a major figure from the Revolutionary Guards, Qasem Soleimani, was assassinated by the United States. And we saw activists and journalists having their content removed for just mentioning his name, because he was part of this terrorist designation, even when they were talking about benign things or even criticizing him. So this has been a hindrance in the ways that these platforms make use of these sanctions designations and FTO lists.

On the other hand, users naturally do not have access to the same platforms, because of things like this: if a VPN or a service is hosted on the Google Cloud Platform, it won’t work in Iran. And what this often means is that Iranians will end up using less secure platforms, platforms that are being hosted by the Iranian government, which is something that this U.S. policy is playing into very nicely. The Iranian censorship regime wants users to be reliant on national infrastructure, and the sanctions policy has been encouraging that kind of migration for the past ten years. And so easing the implementation of these regulations on the side of the companies, whether they’re hosting the content of Iranians or allowing access to online services, is a very simple thing that could potentially make freedom of expression online and access to the internet much easier. It’s not the cure-all for freedom of expression or access, but it is low-hanging fruit that the United States could potentially help with.

Danielle Blunt: Thank you so much. And yeah, I think that’s the point: when people don’t have access to these tools, to online spaces, or to financial technologies or banking institutions, the impact isn’t necessarily an unintentional consequence; it’s often by design, an intended result of those rules. So we’ve talked a bit about how financial discrimination happens, but I’d really like to shift the conversation for these last 15 or so minutes to the real-world impacts of financial discrimination and of denying groups of people access to capital.

And so Affi, I’ll start with you. Could you talk a little bit about how the broadness and vagueness of these rules leads to over-compliance and over-enforcement, and a little bit about the ripple effect of how this can impact people’s larger social networks?

Afsaneh Rigot: Yeah. I mean, an example of this from the work I do: if I talk to sex workers within the MENA region about how they would use financial platforms, or any sort of platform, for their work, it’s just non-existent.

There is quite a clear line: one, we’re not welcome on these platforms; two, we can’t use these platforms. Which is a big issue, because in the study I do of prosecutions of queer and trans folks and sex workers in the MENA region, one of the main things police look for is monetary transactions, like physical monetary transactions. In somewhere like Egypt, for example, without that sort of monetary transaction the case prosecutors have against individuals is a lot weaker. So in my sort of utopian mind, if there were a method by which folks could safely use a platform to do these transactions, without having different currencies and so on on them on the streets when they are arrested or entrapped, this situation of higher charges under these laws, which in Egypt, for example, are called the Laws Against Prostitution, would be much harder to enforce. Not foolproof, but the reality is people don’t have this access.

Again, the sanctions regimes that we’re looking at are so wide and broad in terms of reach. And the fact is that the folks we’re talking about are so easy for these companies and financial institutions to throw under the bus that the platforms and tools folks need can be cut off from them quite immediately. We’re seeing this right now with queer dating apps in Iran, where a number of major queer dating apps have basically over-complied with sanctions regimes and cut off queer users in Iran from using them. They were not legally required to do this. There are provisions, which Mahsa can go into much better than any of us, in the framework that do not require this over-compliance. However, as we talked about, the incentive is to over-comply rather than comply, because losing Iranian users isn’t that big of a deal; they’re not going to be able to use financial banking systems to get premium features anyway. There’s also the moralization and fear-mongering that comes with all of these situations when advocating for these changes. And obviously we are having these conversations and working to ensure there’s a method of reversing this over-compliance, but it spills over in this context. In Iran, there are so many legal and policy frameworks that criminalize the community.

These platforms become fundamental for connecting. These platforms become fundamental for community. They have their problems, they have their many problems, but they’re fundamental. Cutting that access off is just quadruple discrimination against a community that’s already discriminated against. And why?

Because you thought it was probably better to over-comply than just comply. So the ripple effect is huge.

I’ll pass it on to Mahsa to continue about the sanctions issues when it comes to different platforms and their effects on individual groups. But one thing I want to add, if there’s any global enforcement body listening: if your criteria or framework for protecting vulnerable groups against prostitution or trafficking means throwing whole vulnerable groups under the bus, it’s not working. And it’s not about vulnerable groups; we all notice. So. (laughing) Let’s stop it there and I’ll pass on to Mahsa.

Mahsa Alimardani: Yeah. So I think Affi mentioned something really important, which is the question of what qualifications or standards are applied when sanctions are applied to certain technologies or services. Tomorrow there’s going to be a session hosted by GitHub that I’m part of, and the really interesting thing about GitHub is that it’s a platform that offers both paid and unpaid services. Yet both the paid and the unpaid, essentially free, services stopped being served inside of Iran.

So you had Iranian users with free repositories having those repositories removed from GitHub. And this was a very strange thing. I know Affi and I have had many conversations with different platforms like this where we’re like, but this is a personal communication; there’s no financial aspect to an open-source GitHub repository. How is this being applied? This is happening with a lot of platforms where this judgment is being made. GitHub is one of the few that actually went the extra mile and got a general license for their product to be available in Iran. And this was something that took a lot of time. If you come to the event tomorrow, they’ll explain to you exactly the kind of grueling process that only a big corporation (Microsoft is behind GitHub) could really support. And they did do this, and it has become kind of a model for the rest of the tech industry.

And Affi has been working with other companies through her work in the MENA region, and we’ve been trying to apply the GitHub model. But the thing is that General License D, for personal communications, should be strong enough to stop these companies from having to fear those kinds of repercussions from the government. Unfortunately, those assurances aren’t there from the government, even though the government will come out and say it believes in freedom of the internet and access. And so they’re passing the responsibility back and forth between each other. At this point, the solution seems to be the goodwill of the few companies that are willing to take those extra steps.

Danielle Blunt: Thank you. Thank you, Affi and Mahsa. I feel like both of you touched on how lack of access to tech can actually increase vulnerability and violence. And when Affi was talking about dating apps shutting down because of sanctions, I wanted to note that this also impacts sex workers who hustle on those apps, which is a good reminder that the communities we’re talking about often hold intersecting identities; we’re not necessarily talking about two discrete communities. So I’ll end with this question to Lorelei: how does the financial discrimination of sex workers impact community? And I’m thinking specifically around the destruction of mutual aid and organizing.

Lorelei Lee: Yeah. So as Mahsa and Affi have already talked about, there are such broad and multiple effects of this kind of discrimination. We’re talking about everything from being excluded from public places, like the Marriott not allowing people they profile as sex workers to be in their bars, or Uber driving suspected sex workers to police stations, to the direct impact of not being able to use an app on which you would screen clients, such as a dating app, which would be a harm reduction tactic for people in the sex trades; not being able to do that means your work becomes more dangerous. Or not being able to take money from clients.

And so your economic precarity becomes more precarious. And then there are the community effects that Blunt has mentioned, which is that, you know, sex workers are a group of people who are facing trauma and precarity all the time, and there really aren’t good resources provided to us socially, either from family networks or the government or civil society at large; there’s just not anything. So what we’ve done is create these networks, and created them across the country, oftentimes using apps like Twitter that continue to try to shut down our accounts. And yet somehow we have at least some of this in place, where we are then able to say, okay, who needs money this week? Who’s struggling this week? And we can try to redistribute some of our funds and do this mutual aid work.

However, in order to do that, we have to be like, well, my Cash App is shut down this week, but maybe I can find someone who has a PayPal, who can then switch it to their Venmo, then switch it to Cash App so that they can get it to you. As I was saying to Rachel yesterday, it just feels like we’re passing the same $50 around the country all the time, struggling through this maze to try to do that. So that is very frustrating.

And I think it points to something that I would describe as a systemic effect. So we have the individual impact, we have the community impact, and then we have the systemic impact. And the systemic impact is that our community networks can never get more resourced, can never become stronger, because we’re constantly being shut out of financial systems as well as public dialogue. We can’t be visible online, which means we can’t tell our true stories online, which means the narratives that folks get about sex work are very specific and very narrow and often build a myth around all of us. We are then viewed through that narrow lens, constantly seen through a stigmatizing, dehumanizing, profiling kind of aspect. And because of that, when we actually get a chance to use our voices, people don’t take us seriously.

And just to go back to Facebook specifically, and then I will wrap up, I know we’re nearing the end: Facebook has a policy that you can’t talk about sex work unless you are talking about experiences of violence. And this is exactly what I’m talking about. When you only allow one narrative to be the narrative that people hear, then you stop seeing us as complex and complete people. And I think I actually have to say this, which is absurd, but I know from having these conversations before that people think I’m trying to downplay experiences of violence. I am a trafficking survivor myself. I don’t believe that our experiences of violence are unimportant, but if you hear them only in a two-dimensional way, without hearing anything else about our lives, you stop seeing us as human beings. So that’s like… (laughing) I just think these impacts are so broad, and the oppression that comes at that systemic level prevents us from doing the kind of organizing we are trying to do in order to push back against that very oppression.

Danielle Blunt: Thank you so much, Lorelei. And I think what you’re talking about, that when we’re not able to share our own stories another narrative is created in that space, is also a really good point of intersection between what’s happening to these two communities, with internet shutdowns as well. So I want to give a minute for everyone to share a last thought, and I’ll start with Mahsa as we close out.

Mahsa Alimardani: I mean, I’m just constantly listening to Lorelei and learning, because they frame things so well, the bigger picture of what we’re doing, and I completely agree. There is this inherent discrimination.

It’s discrimination against communities in the Middle East. It’s discrimination against sex workers. And I think the stronger the voices of these communities, the better. There’s some momentum right now with the focus on content on the Middle East, and if there’s any way to intersect and do movement building across these communities, I feel like these changes might be stronger and more profound.

Danielle Blunt: Thank you. Affi, did you have any last thoughts?

Afsaneh Rigot: Yeah, I echo Mahsa in saying that Lorelei frames everything so beautifully and perfectly that most people listening should just remember those last words. But I wanted to add the importance of connecting the work that’s being done on these issues around discrimination and censorship, from sex worker communities to communities in the Middle East to folks affected by terrorism designations.

And as Blunt mentioned, these aren’t distinct groups; we’re overlapping and intersectional, among and within each community. One thing I wanted to mention is a little note about how banking systems also use this notion of reputational risk, predominantly around sex work and anti-terror and high-risk profiles. And reputational risk is such a broad and flimsy concept. So when asked how OnlyFans comes in: they block folks that have a connection to OnlyFans under this notion of reputational risk. And finally, if you’re looking at the issue of Thomson Reuters and their huge data brokering, look up the different situations that have come up in terms of blocking, from the Palestine Solidarity Campaign to the Finsbury Park Mosque, under terrorism associations. One of the things I hope we can do is hold these seemingly neutral data brokers that are festering among us accountable for the work they’re doing and the data (inaudible). Thomson Reuters and LexisNexis also notoriously sold a whole bunch of information about immigrants, undocumented folks, and refugees in the U.S. to ICE. So that’s another thing in our campaign. (laughing) To add to it, anyway.

Danielle Blunt: Thank you, Affi. And if you want to follow any of our work, you can find Hacking//Hustling at hackinghustling.org and Article 19 at article19.org. And Lorelei, I would love to offer you the one minute we have remaining for any final thoughts.

Lorelei Lee: Oh, I just wanted to say thank you all so much. I also learned so much from listening to you, Mahsa and Affi, and I’m just really thrilled that we’ve had the opportunity to bring our knowledges together, and to keep doing it. Yeah.

Danielle Blunt: Thank you so much. And I look forward to continuing the conversation with you all, and thank you so much to everyone who tuned in to join us for this session.
