#SEGGSED: SEX, SAFETY, AND CENSORSHIP ON TIKTOK

Methods

This work seeks to untangle the relationships among sex, sexuality, and sex education, and their connections to socially constructed definitions of safety. If we do not control what is considered safe for us to consume, how much power do we have over our consent to what we engage with online? The notion of safe/ty is used as a tool for controlling media consumption, and therefore access to unregulated, informal education that may offer counter-narratives to what is commonly taught in schools.

In this study, I focus on the digital space of TikTok and perform a discourse analysis of TikTok's community guidelines alongside an analysis of how those guidelines have been applied to sex ed-related content. I pay particular attention to conversations about sex education, pornography, youth risk and safety, morality, innocence, and the role of technology in shaping our understanding of sexuality. I read TikTok's community guidelines against instances of overmoderation of sex ed content and of sexuality and relationship educational materials.

TikTok’s terms state that users may not “promote sexually explicit material” or post “any material which is defamatory of any person, obscene, offensive, pornographic, hateful or inflammatory” (TikTok, 2021). I focus on the “Adult Nudity and Sexual Activities” section, which is divided into two subsections: “Sexual Exploitation” and “Nudity and sexual activity involving adults.” Among the list of content that qualifies as sexual exploitation, TikTok asks that creators not share “Content that depicts, promotes, or glorifies sexual solicitation, including offering or asking for sexual partners, sexual chats or imagery, sexual services, premium sexual content, or sexcamming” (TikTok, 2022). The conflation of sex work with sexual exploitation is not new, and TikTok’s guidelines reify that belief by classifying participation in aspects of sex work on the platform as sexual exploitation.

TikTok states that the purpose of these guidelines is to “protect children,” and maybe there is truth to that, but children are not being protected. Instead, sex workers and sex educators experience digital violence when trying to provide youth with resources to understand their sexuality and their bodies and to learn safer sex practices as they grow. The challenge is that policies that harm sex workers will always, by extension, harm children: children who are politically objectified and treated as a symbol of innocence, purity, and the future of maintaining the status quo. The symbol of the child represents a binary opposite to some sex workers, while for others it serves as a reminder of lost innocence, victimhood, and the potential for rehabilitation. The symbolic child possesses virginity and cis-heterosexual potential, whereas sex workers are often positioned either as intentionally deviant, morally corruptive, and predatory, or as victims of violence and/or moral corruption. These narratives maintain the fear that innocent children are at high risk of being seduced or forced into the sex industry, the belief that sex workers are sexual predators and therefore deserving of carceral punishment, and the desire to protect children and “rescue” sex workers from the sex industry.

On TikTok, it is common to see users manipulate the spelling of search terms, captions, and hashtags that are censored or are speculated to be targets for censorship. Examples I have seen of self-censored language used in sex ed discussions include 0rg@sm, Pr0n, v!br@t0r, cond00ms, v@g!na, p3nis, ar0usa1, and other similarly altered terminology important to communicating topics in SRE. In addition to analyzing TikTok’s community guidelines, I use the modified terms that sex educators organize their content with as a way of theorizing the types of censorship that occur and the strategies creators employ to resist overmoderation.

Online, safety is often used to justify censorship, even when that censorship has the potential to make marginalized communities less safe. In the following chapters, I provide background on the platform TikTok, the history of sex education in the United States, and discourses on pornography. I use the historic conflation of sex education with pornography and obscenity to set up competing discourses of safety that are rooted in anti-sex work attitudes. In chapter three, I go into more detail about how digital violence that targets sex workers also impacts sex educators’ ability to disrupt the perceived risk of youth being exposed to sex online. Digital violence on social media takes place through surveillance, shadowbanning, content deletion, and deplatforming that aim to silence and erase non-normative, non-hegemonic groups. These forms of suppression lead to self-censorship that represents both the chilling effects of digital violence and opportunities for resistance. Sex educators and sex workers have similarly used coded language to discuss sex online, demonstrating a shared need for more liberated digital spaces that do not simply reproduce biases upholding existing systems of power and oppression. I use the concluding chapter to discuss some of the liberatory digital futures sex worker activists have imagined. The censorship of sex education, and of sexuality more broadly, on social media platforms like TikTok does not only impact educators and the communities of virtual students they reach; it is part of a larger web of digital violence rooted in anti-sex work policies. Since this issue rests at the nexus of sex and tech, I turn to sex workers, a group of people whose modern history of sexual labor is intertwined with that of telecommunications technology, as experts on the treatment of sex by tech.
 