#SEGGSED: SEX, SAFETY, AND CENSORSHIP ON TIKTOK

Feminist Hacking

Ellie Botoman proposes that we consider chilled speech/algospeak on TikTok as linguistic hacking: an intentional subversion of self-moderation that demonstrates resistance to TikTok’s algorithmic panopticon. “For all of their virtual infallibility, these algorithms can still be hacked, tricked, misled,” and in doing so, users disrupt the widespread belief that technology is god-like, all-knowing, and beyond human comprehension (despite being entirely man-made). [See “UNALIVING THE ALGORITHM,” Cursor Mag Issue 1.] Hacking can be a feminist intervention, disrupting and subverting technologies of empire like surveillance, policing, and criminalization. In the Hacking//Hustling interview “Automating whorephobia: sex, technology and the violence of deplatforming,” Danielle Blunt asserts that “Sex workers hack a capitalist system by choosing a type of work that often enables us to hack the nine-to-five system.” Many sex workers are unable to meet the demands of traditional nine-to-five work “due to disability, barriers to the formal economy, and caregiver responsibilities” (Blunt and Stardust, 2021). Hacking, in this case, means surviving capitalism by extracting value from a system that intends to extract value from all of us. Blunt argues that when we conceptualize hacking this way,

“We are then better able to conceptualize technology not just as these high-tech forms, like a computer. We can understand mutual aid and community-building as a technology…It is a matter of figuring out what technologies we want to use and what technologies support our survival, safety, and well-being” (Blunt and Stardust, 2021).


While hacking is often a masculine-coded activity, femme people and other people with marginalized identities are made to become skilled hackers in order to survive, build community solidarity, and resist everyday violence.

Botoman’s use of linguistic hacking and Hacking//Hustling’s conceptualization of hacking as a strategy for surviving technologies of structural violence like capitalism, policing, and criminalization both echo McKenzie Wark’s A Hacker Manifesto [version 4.0], where Wark offers that “to hack is to refuse representation, to make matters express themselves otherwise. To hack is always to produce a difference, if only a minute difference, in the production of information” (Wark, 2006). In using chilled speech, a user is simultaneously aware of their experience of algorithmic oppression and combatting TikTok’s future attempts to restrict their content. People who have been othered by TikTok have strategically and creatively adapted to digital violence. Of the act of self-censoring and blurring the relationships between visual content and captions, language and its meaning, Botoman adds, “There’s this splintering of gazes where the human eye can recognize the content of these videos for what they actually are, while the AI struggles to fit the text and the image into its programmed patterns of recognition, a parallel world where one’s video content is simultaneously legible and illegible to its human and artificial audiences” (Botoman, 2022). For example, I have seen some sex workers and fat people on the app using the tag “#fakebody,” typically used for depictions of gore, so that their content is not automatically removed for being perceived by the algorithm as showing too much skin. When they label themselves as having a fake body, the AI often fails to register the humanity on display.

Though TikTok and other social networking sites ban specific words and phrases to disrupt the production of content on topics labeled as inappropriate or threatening to users' safety, what often happens instead is the development of online communities that connect through the newly coded language (Day, 2021). Some of the hashtagged terms sex workers organize under on TikTok include #Accountant and #AccountantsofTikTok, which emerged from a trending sound, as well as #Strippa, #StripTok, #Swork/#Sworker, #Spicetok, and #SellingContent.

The title “TikTok Accountant” became a popular code for sex workers on the app when a song written by Rocky Paterra, about telling people he was an accountant when asked about his work, went viral among sex workers. The song was originally about Paterra avoiding conversations with strangers about being a struggling actor, but sex workers on TikTok embraced the sound, using it as a tool to promote and destigmatize their work through relatable humor (Haasch, 2021). The use of “accountant” as a synonym for sex worker crossed into the real world as a safety device and opened conversations about what sex workers do or don’t disclose to strangers to protect themselves and their privacy. On TikTok, people began suggesting other job titles that do not invite many questions, such as jobs involving confidential information that cannot be discussed in detail. A trend that primarily functioned as a meme became a source of community support and a means of exchanging tactics for staying safe in public and at work.

The other tags represent a small selection of the coded language sex workers use to identify themselves, advertise their work, and educate outsiders about different aspects of the erotic labor industry online. While these terms can represent how sex workers hack TikTok’s algorithm in order to participate on the platform, many of the modified terms in use are also examples of chilled speech. Some came about as fun memes that unite sex workers with other stigmatized laborers; others exist out of fear of penalization for using language that could be used to target their accounts.

Part of the issue with algorithmically censoring all references to sex is that survivors of sexual violence and advocates against interpersonal violence are censored as well. Sex worker and survivor are not mutually exclusive categories. Many sex workers are survivors of domestic violence, and some relied on sex work to escape abusive living situations. In addition to shadowbanning or removing sex education content and videos created by sex workers or about sex work, TikTok censors vital information about interpersonal violence prevention and resources for survivors. A recent example of chilled speech I encountered is the term “Sessgual @ssault,” indicating that users fear penalization when discussing experiences of sexual violence. The phrase “sexual assault” may be uncomfortable to read, and the subject may be challenging, triggering, or heartbreaking, but the term itself is not dangerous. In fact, to prevent the very real risk of social media playing a role in sexual assault, sex education that includes media literacy is a necessary harm reduction tool.
 
