#SEGGSED: SEX, SAFETY, AND CENSORSHIP ON TIKTOK

Imagining Liberated Digital Worlds


Through one of Hacking//Hustling’s panels, “Decoding Stigma: Designing for Sex Worker Liberatory Futures,” with Chibundo Egwatu, Yin Q, and Gabriella Garcia, I became acquainted with the affiliated group Decoding Stigma. Decoding Stigma is a “coalition of laborers, futurists, advocates, artists, designers, technologists, researchers, teachers, and students who want to compel conversations about sex and sex work at the tech school” (Decoding Stigma, 2022). It is through the collective organizing, teaching, writing, and art of Hacking//Hustling and Decoding Stigma that this research has been possible, and it is to them that I look for visions of a liberated digital world. In the aforementioned panel, Garcia explores the question “What would an internet by sex workers look like?” She describes the possibility of sex workers playing a symbiotic role in tech design and calls on us to think of creating technologies rooted in liberation rather than thinking solely of the abolition of current carceral technologies. This reminds me of other technofeminist calls to action, such as in Data Feminism. D’Ignazio and Klein ask, “How might we teach a data science that is grounded in values of equity and coliberation?” In response, they write, “Listening and engaging is the first step towards coliberation. And the only way to work respectfully with those most affected by a problem is to develop a sophisticated understanding of structural oppression and how your own identity factors into that” (D’Ignazio and Klein, 2020). Sex workers, especially more marginalized sex workers, are among those most affected by overmoderation and deplatforming. Including sex workers in tech design is a necessary intervention, one that allows individuals and communities to shape the terms of what data is collected about them and by what means, as opposed to the current ease with which data is extracted by powerful institutions with no regard for the best interests of their users (Blunt and Stardust, 2021; Garcia, D’Ignazio and Klein, 2019; Garcia, 2021).


Our world is not becoming any less data-driven or reliant on technology, nor are those in power suddenly going to stop using tech and data to their advantage regardless of social harm. The way to combat encoded bias and digital violence, then, is through a feminist movement for data justice. The more that marginalized people are involved in the creation of new tech and the auditing of existing technologies, the more equitable the data-driven world has the chance to become. In the case of representations of sex, sexuality, and sex education, the inclusion of sex workers in technology and education is essential for progress. As Garcia has said, “To big tech, the sex worker is as indispensable as they are disposable” (Garcia, 2021). As has been argued, sex workers played a fundamental role in the development of telecommunications technology, so it would be foolish not to include sex workers in the creation of our digital futures, especially given the significance of sex and sexuality in ongoing debates about the protection of free speech online. Section 230 of the Communications Decency Act (CDA) is considered one of the most important laws protecting freedom of speech online. It declares that providers and users of interactive computer services shall not be treated as the publisher or speaker of information provided by another information content provider, stating, 

No provider or user of an interactive computer service shall be held liable on account of— (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1). (Legal Information Institute)


This policy previously allowed the internet to be a place of cyberlibertarianism (Garcia, 2021), where users could communicate freely on virtual platforms with few exceptions. The protections afforded by Section 230 have since been eroded by legislation like SESTA-FOSTA and, most recently, the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT Act). The EARN IT Act of 2022 is the most recent attack on online privacy, arguably a key component of how ‘safety’ is conceptualized in digital spaces. Policies like SESTA-FOSTA and EARN IT eviscerate Section 230, targeting the privacy of sex workers, activists, organizers, and protesters by expanding platforms’ liability for content produced by their users and by targeting encryption (Blunt and Stardust, 2021). Laws targeting Section 230 “created the internet’s first US government-sanctioned censoring instrument” (Garcia, 2021), which has caused platforms like TikTok to broaden their definitions of obscenity to protect themselves from the expanding scope of SESTA-FOSTA and other laws that could render them culpable for their users’ behavior. As mentioned before, Hacking//Hustling came into existence in response to SESTA-FOSTA, so it is unsurprising that, as the EARN IT Act resurfaces from its 2020 iteration, sex workers are once again at the forefront of organizing for internet freedom and safety.

Different conceptualizations of safety have been and will continue to be key to my research, so it is important to identify what safety means outside of a hegemonic, legislation-oriented context. What does safety mean for sex workers and other marginalized groups? Safety is not monolithic, but creating safe space is a goal held by organizers, teachers, politicians, and corporations alike. Borrowing from feminist pedagogies of safety, an ethical approach to creating a so-called safe space may look like centering those who are typically made the least safe by society at large: principally attending to the needs of people who are made the least safe through carceral logics and technologies that disproportionately target Black people, Indigenous people, and other people of color. Decoding Stigma asks us to “Imagine an internet built by those who innovated out of a need for collective safety, rather than by those driven to conquer a global economy. Imagine a cybernetic future founded by those for whom the creative functions of both mind and body have never been severable” (Garcia, 2021).

Vickery explores the risks versus rewards of teens’ access to online spaces and argues that the assumption of risk without an equal assumption of the possibility of growth is harmful to young people and prevents them from being able to navigate situations of risk safely in the future. bell hooks similarly argues that safety is “knowing how to cope in situations of risk.” She writes, “True safety lies in knowing how to discern when one is in a situation that is risky but where there is no threat” (hooks, 1994). This is a learned skill, facilitated through support in encountering and handling situations of risk. It means having the tools to understand the difference between risk and danger. With risk often comes potential reward, which is why Vickery argues for a shift from harm-driven expectations to opportunity-driven expectations when it comes to youth moving through online spaces.

Harm-driven expectations continually construct the internet as a dangerous space and effectively construe all risk as harmful. Rather than enabling young people and empowering adults to help youth navigate risks, they attempt to prevent risky encounters and behaviors altogether–often with unequal consequences for different youth populations. (Vickery, 2017)


Constructing all risk as inherently harmful limits opportunities for growth through problem-solving, conflict resolution, boundary setting, and decision-making. Without preparation for their inevitable encounters with potentially hurtful, dangerous, or conflict-inducing language, images, and ideologies online, young people are left to navigate risk alone, either in secret as youths or at eighteen, when presented with their new “adult” identity. 

Turning eighteen and/or leaving home for the first time for school or work can be incredibly disorienting, and all the more shocking the more sheltered you were from risk as a child. I remember my urge to text my parents from the bottom bunk of my dorm to ask permission to go out at night. I remember the anxiety bubbling up in my chest as I was offered a drink or a hit, wondering what would happen if my parents found out. As I inhaled sharply and began to cough, I thought, “They must know.” As I stumbled back to my dorm with my arms intertwined with my roommate’s, I apologized to her profusely for my behavior. Though I had not done anything wrong, or even surprising for a college freshman, I did not know how to react to my adult choices other than through apologetic shame. I had spent so much of my youth averting my eyes from the unknown horrors of sex and drugs. I was firmly prohibited from viewing R-rated movies. I self-censored from pornography, often frantically closing pop-ups of animated nude women dancing seductively on the edges of my screen. I did not hang out with people whom my parents perceived as “bad influences” (they were usually boys with painted nails and outfits from Hot Topic, who smoked weed by the age of thirteen). I was completely unprepared for the world of college parties, hypersexuality, and normalized binge drinking that I entered as a teenager. My parents could not have prepared me for this either; what came closest to preparing me for this landscape of new opportunities attainable through risk-taking were the peer communities I found online. 

The internet was a place where I could learn about myself and create myself through the media I consumed and the communities I was a part of. Growing up, there were next to no spaces where I could learn about sex, sexuality, and the substances so often present at parties, where sex is frequently at the forefront of people’s thoughts. I was almost constantly under direct adult supervision or in public spaces where these conversations did not feel invited or separable from shame and stigma. Vickery notes, “Young people are hanging out together more online precisely because physical spaces designated for young people are increasingly being diminished or heavily regulated and structured by adults (thus diminishing peer-centric aspects of those spaces)” (Vickery, 2017). Similarly, the regulation and surveillance of physical spaces have made the internet the primary place for sex workers to build peer-centric spaces where they can share information about staying safe and navigating risk, both in person and in digital space. This serves as another example of how the shame and stigma surrounding sex and sexuality, which produce whorephobia, impact both youth and sex workers, uniting these groups as inhabitants of social media who rely on the digital world for peer support networks. 

Though paternalistic and carceral ideologies of safety may beg to differ, neither young people nor sex workers are made safer by increased surveillance and policing. Normative expectations of childhood innocence are a tool of the state used to justify increasingly invasive technologies of surveillance rooted in capitalistic desires to both commodify data and reproduce criminality.

Laws like SESTA-FOSTA and the EARN IT Act infringe upon adults’ rights to free speech and autonomy of choice, as well as upon youth’s access to vital information about identity, sexuality, relationships, and intimacy that can assist them in their transition into adulthood. As I have been arguing, these attacks are not about children’s safety. As Hacking//Hustling highlights, “The root issues these bills are allegedly seeking to address include extremism, trafficking, and child exploitative material. If that were true, we would be talking about directly resourcing communities, not amending section 230” (Blunt and Stardust, 2021). Those taking direct action in the name of harm reduction, community healing, and forging safer spaces online are sex workers, survivors, and allies. 


While groups like Hacking//Hustling and Decoding Stigma more broadly address whorephobia’s influence on the digital world and the need for collective organizing against carceral technologies and for inclusive, liberatory tech, sex worker organizers are also working towards safety within our current digital landscape. Within sex worker organizing there is ongoing support for improved media literacy, and more specifically porn literacy, among all people. Recognizing that pornography and sex education are often conflated with one another, prolific adult filmmaker Erika Lust started an ethical porn company of her own, accompanied by the non-profit project The Porn Conversation. Lust’s work is a feminist intervention in mainstream pornography, or free online porn, that combats the stereotype that all pornography is inherently misogynistic and damaging to women. She has been leading the revolution in adult entertainment that resists traditionally gendered stereotypes since 2004, when she made her first indie film, The Good Girl. Lust’s films “represent a wide range of human body shapes, identities, and sexualities,” prioritizing pleasure by focusing on the eroticism of sexuality and relationships outside of the male gaze (Lust).
 
