#SEGGSED: SEX, SAFETY, AND CENSORSHIP ON TIKTOK

CHAPTER 3. DIGITAL VIOLENCE

Notably, the hashtag #safersex is one of TikTok’s banned terms. While TikTok boasts about its commitment to safety, the term safe in relation to sex is censored on its platform. This implies that the inclusion of sex is always a threat, even in the context of encouraging safety and providing free risk reduction tools. The exclusion of #safersex and other sex education tags from easily surfacing in searches or being promoted to users of the app illustrates the discrepancy between what TikTok allegedly believes in and what its algorithms actually execute. Deactivating a term’s viability as a means of organizing information, because of its association with a high volume of content deemed risky, does not allow for a nuanced assessment of risk. This is a form of digital violence in which users and their content are wrongfully flagged as inappropriate, while content that is often more vulgar, but is perhaps presented in a way or by a person that appeals to TikTok’s desires for its platform, exists freely.

Lydia Collins (@lacollins) is a sex educator whose work centers on HIV prevention in Black communities. Her page serves as a pop sex ed resource that covers topics like STI transmission, birth control, boundary-setting in relationships, PrEP myths, and self-advocacy when interacting with medical professionals. She is a fashionable, fat, Black woman with short hair and large glasses with clear frames who writes and advocates for decolonizing and undoing anti-Blackness in sex education. With 10.2k followers and 709.5k likes, the discrepancy between her follower and like counts highlights TikTok’s algorithm at work (Collins, n.d.). One of her most popular videos, with over 2.7 million views, shows Collins bobbing her head to the beat of a sped-up version of “Another One Bites the Dust” with text across the bottom of the screen reading “When you go to get tested before a hookup and see your date at the clinic” (Collins, 2021). The camera turns to show both Collins and presumably her date bobbing their heads in sync as their straight faces break into smiles. The comments on this video range from appreciation for seeing testing normalized, to people sharing their date stories, to questions and confusion. Some of the comments include, “Tested for what though,” “YASSSS!!! Greenest flag ever!!!” and “Thank you for encouraging this!! People say it’s ‘expecting too much’... ok.” The comments are overwhelmingly positive. The TikToks with the lowest view counts on Collins’s page, on the other hand, cover more controversial topics like sexual pleasure and abortion. Where her video that serves as a neutral message about going to the doctor’s office has over two million views, her video offering a writing activity on prioritizing pleasure currently has only 140 views, 20 likes, and no comments. Similarly, Collins’s TikTok titled “Advocating for your sexual health pt. 5” has 200 views, 32 likes, and no comments.
In this TikTok, she offers “Questions to ask your healthcare provider before getting an @bortion” such as: “What are the side effects of both? (Pills vs Surgery),” “When can I resume s*xual activity?” and “What other medications will negatively interact with the process?” (Collins, 2022). There are a number of potential factors as to why the algorithm prioritizes some content over others from Collins’s page, including what music is used and whether it was trending at the time of posting, whether the backgrounds or outfits shown are viewed as differently worthy of promotion, or the content itself. It seems that the content receiving the least attention is content that pushes the boundaries of what is appropriate sex education, and therefore what is safe to promote on the app to its largely young audience. Where content that speaks only to neutral, medicalized health advice is promoted, content that could be considered to have a political agenda or that suggests enjoying sex beyond its reproductive purpose is not.

Other examples of TikTok censoring language seemingly deemed by the platform as inappropriate, pornographic, or unsafe are labels used to describe non-cis-hetero subjects. Some examples are lesbian, which is typically written on TikTok as lesbean, le$bean, or ledollarbean, and the term intersex, which has incited a public battle for social media representation. These search terms have since been unbanned by the app after pushback from LGBTQ+ users and activists, but self-censoring previously banned and precariously permitted terms is still a common practice among users, which is in itself a telling example of how pervasive digital violence can be. In February of 2021, Pidgeon Pagonis (@pidgeo_n) and other intersex activists took to Instagram and Twitter to rally their followers to pressure TikTok to unban intersex as a hashtag and search term on the platform (Pagonis, n.d.). Intersex activists and allies began to flood TikTok with videos marked with #UnBanIntersex and were encouraged to give TikTok a one-star review in the app store to detract from the app’s high ratings. Technology historian Mar Hicks (@histoftech on Twitter) tweeted, “This is an escalation of all the banning and shadowbanning that platform has been doing to curate the type of people they want – and drive away the types of people they don’t want,” speaking to the digital violence enacted by TikTok’s erasure of intersex people and their experiences on the platform. Overmoderation, like banning terms associated with queer and trans identities, reflects societal attitudes towards people who deviate from cisgender and heterosexual standards of legibility and desirability.
“‘My community is erased with a scalpel, and with words and linguistics,’ says Pagonis, ‘but this time they’re literally erasing the word.’” TikTok told The Verge that the multiple instances of the intersex tag being banned were mistakes, but there was never a public statement made about the allegedly accidental removal, leaving users like Pagonis to speculate about whether it was truly an accident or an intentional act of censorship (Sanchez, 2021). #UnbanIntersex is one of many examples of a pattern recognized by activists and users with marginalized identities on TikTok: first, a creator identifies an issue with the way certain terms are moderated and recognizes potential harmful impacts to the community. Then a video is made revealing the error or intentional censorship taking place, which quickly accumulates thousands or millions of views. From that point, the issue may be taken up by a journalist who publicly asks TikTok to explain itself; the company then fixes what it insists was an error and emphasizes how it supports the community that called out the problem and its associated political issues (Ohlheiser, 2021). This approach from TikTok has been seen repeatedly, but where the company seems unwilling to consider potential harmful impacts to a marginalized community is when it comes to sex workers. Censoring references to sex is rooted in anti-trafficking and anti-sex work policies, but it impacts many other communities, including youth – whom the guidelines are presumably set out to protect.

Public sex educator and podcast host @sexedwithdb uses “Duet” videos to highlight how her experience of overmoderation for sharing information pertinent to SRE compares with that of the typical user of the app. On TikTok, creators have the option to make their video collaborative by allowing Duets. This usually looks like the original creator’s video appearing on one side of a split-screen with the responding creator reacting or adding to the video on the other. In one duet posted by @sexedwithdb, the video is captioned “Literally I try to teach people about condoms and this video flies!?!?” as the exasperated DB stares in shock while the original creator cuts into a pastry saying “I would fucking bust my pussy wide fucking open” (Bezalel, 2021). While sex educators struggle to have their content seen because of the assumption seemingly present on the platform that all sexual knowledge is pornographic, and therefore dangerous, casual creators can often speak in overtly explicit language without their content being flagged. This is partly because the more a user is implicated in violating the community guidelines, the more moderation intensifies; when sex educators are frequently “caught” breaking the app’s rules, they run a greater risk of being de-platformed. In another video that has been reposted after removal, @sexedwithdb frowns looking into the camera, framed by white text reading, “Me trying to get medically accurate health education videos to the masses.” The video quickly cuts to her unenthusiastically dancing as the white text above her head says, “My page getting sh@d0wb@nn3d bc the T0k wants to censor health educators.” The short video was originally captioned “Can we stop censoring health education pls???” (Bezalel, 2021). Though TikTok’s alibi for shadowbanning her content is keeping users safe, the practice instead punishes an educator for providing a public health service on the app.
Not only that, but TikTok follows in the footsteps of other social media and tech company giants by attempting to limit sex workers’ ability to exist online by increasing surveillance and virtual forms of punishment. 

Targeting the removal of sex workers, typically under the guise of preventing sex trafficking and protecting children’s safety, leads to the removal of community-based, accessible harm reduction tools, including sex education. Pop sex-ed content creators have also turned to other platforms or mediums, such as Instagram and podcasting, to share their experiences of being shadowbanned on TikTok. Katie Haan (@Itskatiehaan) is a self-described peer-led sex educator who identifies herself as “your TikTok big sis” to young, queer users of the app seeking sexual health advice in a judgment-free, non-professional space (Haan, n.d.). She is young herself at the age of twenty-five, and she is a thin white woman with Eurocentric beauty features – features that reflect what TikTok’s algorithm typically promotes as aesthetically pleasing, making her a desirable user of the app. This is important to specify because of reports that TikTok has instructed its moderators to “Suppress posts by ‘Ugly’ people and the poor to attract new users” (Biddle et al., 2020). On her Instagram account, she posts a screenshot from her TikTok account showing a previously posted video that had received over twelve thousand likes replaced by a blank black screen, except for the text in the center stating, “Removed for: Adult nudity and sexual activity.” Haan captions the screenshot, “This video (of my face speaking to the camera) was removed for ‘adult nudity and sexual activity’? This video is a literal safety warning, ER nurses were chiming in to vouch for the importance of this info. @tiktok do better” (Haan, 2021). She reposted the video in retaliation and received the same message from TikTok within four hours of posting. In a follow-up post, Haan asks what was so different about the removed videos compared to others that were not flagged as community guidelines violations, and shares that she believes she has been shadowbanned.
Later she shares that younger followers informed her that the spelling she had been using, “s3x,” was added to the list of terms triggering TikTok’s guidelines for “sexually explicit content.” Haan writes, “I guess it’s ‘seggs’ for now. People deserve to be educated with REAL terms and without confusing euphemisms.” This response represents both an impact of automated censorship as digital violence and the resistance of coding language to continue engaging with the subject despite receiving a message to stop.

Haan is right. People do deserve to be informed of medically accurate terms for their bodies. In fact, it is an important feature of the first key concept of SIECUS’s third edition Guidelines for Comprehensive Sex Education (CSE). Key concept one is “Human Development” for level one (children ages five to eight years old). The first topic in this chapter is “Reproductive and Sexual Anatomy and Physiology,” which encourages educators to begin informing elementary school students about their own bodies, including the often sexualized and stigmatized body parts. TikTok sex educators, on the other hand, need to use edited versions of these terms for an audience at least eight years older than the one these CSE guidelines are intended for. This practice is identified by the collective of sex workers and accomplices, Hacking//Hustling, as “chilled speech.” In Hacking//Hustling’s community report “Posting into the Void,” the collective identifies issues of internet surveillance, shadowbanning, and other forms of digital violence perpetrated against sex workers as well as Activists, Organizers, and Protesters (AOPs). Their definition of chilled speech comes from Jonathan Penney and reads as follows: “When an individual’s speech or conduct is suppressed by fear of penalization at the interests of a party in power” (hacking//hustling, 2021; Penney, 2017). Parties in power include not only the state but social media platforms as well. TikTok’s community guidelines serve as an authority that suppresses the voices of sex workers and AOPs, which include sex educators on the app. “P3n!s” and “V@g!na” are some of the common examples of chilled speech used in-text within posts and in captions by pop sex educators to avoid triggering censorship.
Another, more colloquial term for this form of self-censorship is “algospeak”: language intentionally changed in response to algorithmic censorship of content the platform considers NSFW.