#SEGGSED: SEX, SAFETY, AND CENSORSHIP ON TIKTOK

Competing Discourses of Safety

The most cited reason for content removal in pop-sex-ed TikToks is the community guidelines section on “Adult nudity and sexual activities.” Despite the clear harm done by the selective implementation of these guidelines, their text opens with the statement: “We strive to create a platform that feels welcoming and safe.” This raises the question: welcoming and safe for whom? What does welcoming and safe look and feel like? The platform has created a hostile environment for sex-ed content creators, especially sex workers, even though there are demonstrated links between teens’ access to comprehensive sex education and their internet literacy and safety. Additionally, the app is not welcoming to fat people, disabled people, and people of color, notably Black people, who are largely uncredited for their key role in TikTok’s success. In response to violence and harassment toward LGBTQ+ users of the app, TikTok is implementing new community guidelines that will ban deadnaming trans people. However, this is a band-aid solution and a distraction from larger issues of harassment on the platform, particularly for transgender people with multiple marginalized identities, including the many trans sex workers. The language of “welcoming and safe” implies that there is a hegemonic way to create those feelings that meets the needs of a majority of the community. It also highlights how sexuality continues to be viewed as a threat to safety, particularly as it relates to young people. TikTok is welcoming and safe for people who are white, cisgender, thin, able-bodied, and wealthy, much like the “real world” off the internet. On the app, we see a reproduction of dominant values that positions discussion of sex and sexuality as a threat to our collective well-being. 

After its declaration of creating a safe and welcoming environment, the guidelines read, “We do not allow nudity, pornography, or sexually explicit content on our platform.” As I scroll through the app, I see hundreds of videos of young, thin, white women wearing bikinis or crop tops and dancing to viral TikTok sounds. The most popular dances usually involve body rolls, hip-thrusting, twerking, and other moves often considered overtly sexual. Many of the songs used for these sounds contain explicit lyrics that are often about sex. Take Cardi B and Megan Thee Stallion’s “WAP,” for example: the WAP challenge went viral as dancers dropped it low, slid into the splits, and twerked. There is also the “Silhouette Challenge,” in which the creator usually starts in casual, baggy clothing; then, when the beat drops, the background flips to dark red lighting illuminating the creator’s silhouette in a doorway, often posing in lingerie. Both are examples of trends recreated en masse by users without any response from the platform about the “sexually explicit” content being promoted on the app, nor was this content deleted en masse for being labeled pornography. Sex educators, on the other hand, will be fully clothed, with the camera focused on their faces (rather than full-body shots), and will still have their content removed under the “Adult nudity and sexual activities” guidelines. Supporting, and even promoting, sexualized content from creators who are often normative teenage girls does not seem to be an issue for TikTok, which provides an instructive counterpoint to its own discourse about safety in relation to sexuality. 
Rather than reflecting an actual commitment to safety, TikTok’s policing of sex education reflects the wider moral panic about providing young people with comprehensive sex and relationships education (SRE), and the impetus to sanction education that strays from school curricula that are conservative at best and false and fear-mongering at worst. 

As teachers of stigmatized knowledge, sex educators have long been under attack by parents, school boards, and legislatures due to deeply rooted cultural anxieties about sex in the United States. A common tactic used to disrupt access to sex education in schools, almost always engineered by moral conservatives, has been to deploy explicit and politically charged language in community debates around sex education policies. Irvine names some examples of this tactic, such as “Calling a curriculum a ‘sodomy curriculum,’ or describing it as pornographic,” in addition to “[stigmatizing] sex educators themselves, calling them perverts and pedophiles” (Irvine et al., 2016). Similar language is used to describe those who engage in the erotic labor industry, linking sex education, sex workers, and consumers of the sexual market through shared insults. Earlier, I mentioned Rubin’s observation that the stigma attached to sexually explicit materials is reinforced through the use of terms like “obscene” and “pornographic” to signify horror or repulsion in non-sexual situations (Rubin, 1993; Smith, 2010). Stigma formed and reinforced by fear has created popular desires to either save or destroy sex workers (Peepshow Podcast, 2021). The association of sex work, sex education, and pornography with the production and promotion of obscenity justifies dehumanization of and violence against sex workers, and results in the sweeping removal of NSFW content that sex educators rightfully complain of being targeted by. This conflation has permeated SNS: sex educators now experience forms of overmoderation, including content removal, shadowbanning, and account deletion, under the guise of their breaking TikTok’s community guidelines against “Adult nudity and sexual activities.” 

Sex educators who receive notice that they have violated the community guidelines for “Adult nudity and sexual activities” are left to speculate on what TikTok’s algorithm saw in their work that rendered it sexually violent or pornographic. These guidelines demonstrate the conflation of sexual solicitation, sexual violence, pornography, and sex education. Although the guidelines are meant to create a “welcoming and safe” space online, they further perpetuate the sexual stigma that prevents sex workers and educators from using the most popular social media platform of our current moment. In speculating about which aspects of their content trigger TikTok’s overmoderation, sex educators on the app have taken to modifying how they talk about sex and relationships online. In response to the guideline that states, “Do not upload, stream, or share: Content that contains sexually explicit language for sexual gratification,” pop sex educators use coded language not yet flagged by the app’s system as associated with sexually explicit or pornographic content in order to reach other users. Other methods of avoiding overmoderation, like shadowbanning, include labeling posts with banners of text across the top that read something like “For Educational Purposes Only,” with some creators, like sexual health educator @yes.tess, using a screenshot of the community guidelines itself to resist content removal (Tess, n.d.). Highlighted at the top of many of her videos is the now-outdated community guidelines statement, “We do allow exceptions around nudity and sexually explicit content for educational, documentary, scientific, or artistic purposes” (TikTok Community Guidelines, 2021). This exception policy is referenced by many enraged pop sex educators in response to their content being removed or their pages being shadowbanned. 
However, as mentioned earlier, the only categories in the current community guidelines that are given explicit exceptions for educational contexts are “Illegal activities and regulated goods” and “Hateful behavior” (TikTok Community Guidelines, 2022). Ironically, in order to offer helpful sex ed content that can make people safer, content creators have to use strategies to get around shadowbanning and content deletion. In the next section, I take this up in relation to the concept of digital violence. 
 
