CHAPTER 1. INTRODUCTION
cc: got a lot of new friends joining the convo here :)
cc: So here's the definitions of some terms I use a lot!
cc: PIV= pen!s in v@g!na
cc: PIA= pen!s in anu$
cc: manual s3x= hand stuff
cc: When discussing anatomy, it will always be VULV@ OWNERS & PEN!S OWNERS here
cc: cause gender is IRRELEVANT to the s3x ed conversation (Haan, 2021).
These are common representations of Popular Sex Education (pop sex ed) that circulate on the social media platform TikTok. Sex ed is typically categorized in a binary as either formal or informal: formal sex ed is conducted by trained sex educators and health professionals, while informal sex ed is taught by parents and peers in everyday conversations (Döring 1, 2021). In all contexts, this binary can shape who and what content is censored, but on social media, and on TikTok in particular, the algorithm targets content creators all along the spectrum of this binary. Sex educators of all backgrounds share the experience of navigating the linguistic and visual codes necessary to produce content that the algorithm does not swiftly flag as harmful. A range of actors, including trained educators, parents, queer and trans people, sex workers, doctors, nurses, therapists, and sex-positive users, all engage in producing ‘sex edutainment’ online, creating a digital learning ecology of sex education resources in a digestible popular media format (Johnston 76-77, 2017). Incorporating edutainment, like social media engagement and gamified modes of interactive learning, has been a pedagogical strategy used in many fields to support in-class formal learning (Isacsson and Gretzel, 2011), but in the case of sex education, edutainment has often been a necessary stand-in for formal learning rather than an informal supplement to it. This approach to sex education, which creates and reproduces popular narratives about gender, sex, sexuality, and relationships on highly trafficked social media platforms, is what I call pop sex ed.
Noticeably, both of the videos I have mentioned thus far are about sex ed rather than actively distributing it, because an overwhelming amount of sex-positive and educational media on TikTok does not get far enough past the site’s content moderation protocols to make its way to my feed or to generic hashtag searches. The two examples of pop sex ed that successfully circumvented deletion have several things in common: both creators benefit from having white, Eurocentric features and small frames, making them hegemonically attractive cisgender women, and both are US-based and college-educated without visible disabilities. These vectors of privilege seemingly allow their videos to soar to the top of TikTok searches for common sex ed hashtags such as #seggstok and #seggsed, over content that appeals less to TikTok’s standards of desirability. Additionally, these videos demonstrate the common practice of self-censorship that TikTok’s peer-led sex educators engage in to share invaluable information about sex, sexuality, and relationships. Ironically, sex workers and educators alike have had to adopt this form of self-policing as a necessary tool to continue sharing knowledge on the app. Despite attempts to suppress sex ed conversations online, pop sex educators continue navigating these exhausting systems of moderation to provide information for the safety and support of people who did not receive comprehensive sex education in school or at home growing up.
Embarking on this research, I was interested in how social media algorithms, such as the one used by TikTok, work to construct the digital identities of their user base, and how that identity construction operates to disappear users from the platform through content suppression. Throughout this paper I will frequently use the term “user/s” to describe people interacting with online interfaces, since this language is common in platform documents referring to the people who use their sites. I want to stress that before any of us are users, we are people. Platforms turn us into “users,” rendering the diverse individuals and communities existing online into a homogeneous, abstract group. Behind each user is a person with complex life experiences based on positionality. Even though “user/s” is constructed to label all people interacting with a particular website or app, the experiences of users mirror the real world, as existing biases are coded into our online worlds.
Social Networking Sites (SNS) continue to become increasingly hostile toward sex workers and any content deemed Not Safe For Work (NSFW), which often includes comprehensive sex education. The inclusion of sex education in the sweeping removal of NSFW content on social media platforms like Tumblr, Instagram, and TikTok reveals an association between all representations of sex, even educational ones, and risk. Sex education is a harm reduction tool; removing it by associating it with harm itself disregards education’s power to keep people safe by enabling informed decisions about their bodies and their beliefs.
I will typically use the terms sex work and sex workers to describe people in the erotic labor industry, but sex workers are by no means one homogeneous group. Just as “woman” is not an identity category that can describe a universal experience, neither can “sex worker.” The experiences of people who do sex work vary widely depending on race, gender, class, and sexuality. Within the erotic labor industry, there is a plethora of jobs that are met with different levels of stigma, which disrupts sex worker solidarity. The classist, hierarchical order of respectability, privilege, and stigma among sex workers is referred to as the “whorearchy” (Knox, 2014; Sciortino, 2018; Lynn, 2019). The whorearchy typically positions cam girls at the top of the pyramid with the most privilege, followed by strippers and sugar babies, and positions street-based, full-service sex workers, who are the most marginalized, at the bottom. Belle Knox writes, “The whorearchy is arranged according to the intimacy of contact with clients and police. The closer to both you are, the closer you are to the bottom” (Knox, 2014). This socially constructed hierarchy is damaging to all sex workers and distracts from possibilities for solidarity among people participating in different kinds of labor within the sex industry. I use the term sex worker to describe both the symbolic sex worker who is the imagined target of whorephobic legislation and tech design, and people who do sex work, many of whom identify themselves with that title.
Anti-sex work, or whorephobic, policies written into community guidelines, terms of service, and other moderation policies have detrimental side effects on young people’s online access to sex education that is body-positive, sex-positive (or sex-neutral), anti-racist, queer- and trans-inclusive, and attentive to ability and class. The policing of sex workers’ online presence through methods like shadowbanning, content removal, and account deletion (deplatforming) is also felt by sex educators, parents, sexologists, sex therapists, doctors, and activists fighting for and contributing to young people’s access to comprehensive sex education. I suspect that this policing stems from a relationship, both real and imagined, between sex work and sex education: both are characterized as obscene by digital interfaces, which disrupts sex educators’ and sex workers’ ability to share important information about sex, sexuality, and relationships, including a balanced view of the realities of sex work and of the harm done by the state through expanding anti-trafficking legislation.
Despite people of all ages engaging with the app, TikTok as a platform is marketed toward people ages 13 and up. Presumably because the intended audience includes minors, TikTok upholds strict internet safety guidelines, including comprehensive Parent Safety Center guidelines for personally restricting visual content, messaging features, the ability to be found through public searches, and more. Given these copious self-moderation options, it can be argued that there is no need for additional boundaries to be enforced on teen and adult users who can navigate social media on their own terms. The fact that such boundaries do exist opens up questions about how socially constructed ideas about safety are used to limit, restrict, and censor online content related to sex and sex ed. TikTok’s Community Guidelines construct one version of what safety is and what it feels like in relation to the regulation of sex and sexuality on the platform. When read against other definitions of safe/ty, entanglements of the categories of sexuality, youth, innocence, obscenity, and pornography reveal themselves. Dominant ideologies of acceptable gendered and racialized performance construct innocence and sexuality/sexual knowledge as mutually exclusive. In the United States, these ideologies position any variance from white cisgender heterosexuality as a queer deviance from approved sexual behavior.
I ground this research in examples of censorship expressed publicly by users on the app TikTok who self-organize and identify their content through hashtags related to sexuality and relationship education (SRE). Popular SRE-related hashtags currently include #Sexedforall, #Seggsed, #Seggstok, and #Sexualhealth/#Seggsualhealth. Over the last two years, these terms have cycled erratically through phases of being banned and unbanned. That even these workaround terms are frequently banned suggests that the seemingly mercurial implementation of the community guidelines, banning or shadowbanning hashtags or users, may be more intentional than it appears at first glance. TikTok’s algorithmic moderation thus disrupts sex workers’ and sex educators’ ability to organize, connect, and share pertinent information relating to sex, sexuality, and relationships, including media literacy and, more specifically, porn literacy.
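The dynamic between keyword moderation and “algospeak” workarounds can be illustrated with a minimal sketch of blocklist-based flagging. The blocklist terms, function name, and example captions below are hypothetical, written only to show why exact-match filtering invites misspellings like “seggs”; TikTok’s actual moderation system is proprietary and far more complex than this:

```python
# Hypothetical sketch of naive blocklist moderation -- NOT TikTok's actual system.
# Illustrates why exact keyword matching pushes creators toward "algospeak" spellings.

BLOCKLIST = {"sex", "sexed", "sextok"}  # assumed terms, for illustration only

def is_flagged(caption: str) -> bool:
    """Flag a caption if any of its tokens matches the blocklist exactly."""
    tokens = caption.lower().replace("#", " ").split()
    return any(token in BLOCKLIST for token in tokens)

print(is_flagged("comprehensive #sexed for all"))    # True: exact match, content suppressed
print(is_flagged("comprehensive #seggsed for all"))  # False: the misspelling slips through
```

Once a moderator adds “seggsed” to such a list, the workaround term is banned too, mirroring the erratic ban/unban cycles described above.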
In this thesis, I offer a selection of videos whose creators have acknowledged issues with TikTok removing their content, as well as videos where the creator comments (all too common on the app) read along the lines of “This is the third time I’m posting this, so let’s try using terms even the community guidelines will like.” I analyze the imagery, screen text, captions, and comment sections of a representative selection of SRE-related posts that offer a discourse of safety around sex education from the users’ perspective, one that often conflicts with the app’s construction of what is safe for its dominant audience to consume.
The app often censors or removes content under the guise of breaching TikTok’s policy on “Adult nudity and sexual activities,” which states: “We do not allow nudity, pornography, or sexually explicit content on our platform. We also prohibit content depicting or supporting non-consensual sexual acts, the sharing of non-consensual intimate imagery, and adult sexual solicitation” (TikTok, 2021). These guidelines leave much open to interpretation about what counts as pornography or “sexually explicit” content, which allows flexibility in how the policies are enforced. Defining these terms is not an easy task and involves grappling with hierarchies of sexual value established by hegemonic cultural ideas “that erotic variety is dangerous, unhealthy, depraved, and a menace to everything from small children to national security” (Rubin 152, 1984). The ambiguity of what social media platforms’ guidelines determine to be pornographic mirrors the ambiguity of the law at large. In 1964, US Supreme Court Justice Potter Stewart famously used the phrase “I know it when I see it” to describe his threshold for what counts as “hard-core pornography” (Obscenity Case Files, 2021), demonstrating the often arbitrary authority with which such terms are defined and then mobilized in moral panics.
The repeated consequences faced by creators attempting to deliver peer-led sex education on social media raise issues of criminalization, visibility, surveillance, and the shielding of people (especially children) from overt sexuality, particularly when it is taught or modeled by sex workers, women of color, fat people, and queer and trans* TikTok users. Further, because of the stigma around sex work, TikTok’s algorithm is selectively implemented to disrupt sex workers’ opportunities to promote their work or build community online. Such practices negatively impact young people’s access to comprehensive sex education because of the conflation of sex education materials with pornography.
When I began to pursue this research, I found that the TikTok algorithms that censor peer-led sex education raise the question: are sex educators pornographers? I also wondered about the reverse: are pornographers sex educators? Should they be? I was curious about the relationship between sex education and pornography, between sex educators and sex workers. I wanted to know where the parallels, limits, and boundaries between sex ed, sexually explicit content, and pornography lay, and what those boundaries could reveal about common perceptions of youth and innocence in relation to sexuality. The answers are subjective: as my research carried me through TikTok, YouTube videos, Zoom conferences, Instagram, Twitter threads, blogs, podcasts, and more, I found a spectrum of opinions about these relationships. Central to all of the discussions I have found about sex education and sex work online is the social construction of what is and is not safe. Safety is often weaponized by authorities (the state, police, religious institutions, bosses, social media terms of service) to further assert power under the guise of protection. Online, particularly on SNS, policies like community guidelines and terms of service, and laws that regulate the internet such as the Communications Decency Act and SESTA-FOSTA, use the idea or feeling of safety for some as a tool for enacting digital violence on others. Forms of digital violence include shadowbanning, overmoderation, content deletion, and deplatforming. How does the presence of these censorship tactics impact users’ ability to consent to what they see and engage with online? What kinds of control do we or don’t we have over what we can learn about sex, sexuality, our bodies, and ourselves online?
As I spent the last couple of years considering all of this in relation to the accounts of sex workers, sex educators, sexologists, and sex education advocates, my central questions crystallized. In this thesis, I explore: when it comes to sex, sexuality, and sex education, what is safe/ty? How do we navigate our consent? How are our safety and consent mediated in digital spaces? I argue that the social construction of safety is a political rhetorical tool used to assert oppressive, sex-negative ideologies through education and legislation as a means of increasing surveillance and carcerality of all bodies, but most significantly those of people with multiple marginalized identities. Through surveillance technologies, including legislation threatening privacy and online security, digital violence occurs, constructing bodies that do or do not threaten the collective safety of a given platform’s community. This produces limits on access to knowledge about one’s body/mind and impedes bodily autonomy through carceral logics that seek to reproduce white supremacist and capitalist notions of what makes a person valuable and therefore deserving of freedom and respect.