#SEGGSED: SEX, SAFETY, AND CENSORSHIP ON TIKTOK

Cyberfeminist Design Justice

Investigations of bias embedded in digital algorithms have shown how bigoted technology, constructed by society in the hands of oppressors, cyclically reinforces harmful stereotypes that AI technologies then naturalize. For example, in Algorithms of Oppression, Safiya Noble reminds us that race is itself a technology created in service of white supremacy. Noble describes another example of bigoted technology from a 2010 Ms. Magazine blog post, in which women’s rights advocate Veronica Arreola called out the absence of terms such as “Latinas,” “Lesbian,” and “Bisexual” from the then-new Google Instant search-enhancement tool. These terms, along with others associated with gender, race, and sexuality, were initially excluded because of their “X-rated” search results. Arreola wrote, “You’re Google… I think you could figure out how to put porn and violence-related results say, on the second page?” (Noble, 2018). Arreola’s concern is valid: what does it mean for these marginalized identities to be deeply associated with pornography in a society that weaponizes “pornographic” to mean inappropriate, disrespectful, and, as Arreola notes, violent? Censoring terms because they return high volumes of pornographic results amounts to an attempt to disappear identifiers around which widely held cultural anxieties about identity cluster. Many young people who are curious about their sexuality and gender identities turn to porn for answers and for validation of their fantasies, desires, and identities. Pornography also allows people who have been taught that parts of them are undesirable, or that they are unworthy of sexual attention, to see themselves and their bodies unabashedly desired on-screen. It can empower fat people, disabled people, and older people to view themselves as ripe with sexual possibilities.
Although mainstream pornography is often thought of as creating and upholding fetishistic and racist representations of different groups of people, there is also pornography made with the intention of being a space for Black, Brown, and Indigenous people to be positively sexually represented as desirable outside of the white, western gaze. Many people use the internet with the specific intent of consuming pornography, and that is not an inherently bad or violent thing.  

In fact, porn was one reason many people interacted with the internet in the first place, and the sex industry can be largely thanked for the accessibility of the internet within the home (Horn et al., 2022). In Pornography and Pedagogy: Teaching Media Literacy, Tarrant argues for pornography’s role in media literacy and its centrality to the internet, as terms like “sex,” “porn,” “free porn,” and “porno” are the most searched online. “Ghostery.com reports that 25 percent of all Web searches are for pornography and 891 million unique users were tracked on the top adult websites during the summer of 2013. Data reveals that these four terms were collectively searched nearly twenty-three million times each month in 2011” (Tarrant, 2015). Though it is difficult to gauge the accuracy of online statistics surrounding pornographic searches and consumption, ten years later it is safe to say that pornography remains an active component of the internet’s landscape. Covenant Eyes is a website that claims to offer educational resources and “Screen Accountability and Filtering software,” developed by founder and CEO Ron DeHaas, who has “directed over 1.5 million man-hours of battling pornography and sex trafficking.” His strong opposition to pornography and sex trafficking (a common pairing used to posture as providing a safety service, despite the dangers that anti-sex-trafficking measures often pose to people in the adult industry) has yielded more recent data on young people’s porn consumption.

Some highlights from the unnamed 2018 study include the following findings: 
  • Nearly 27% of teens receive sexts.
  • Around 15% are sending them.
  • 57% of teens search out porn at least monthly.
  • 51% of male students and 32% of female students first viewed porn before their teenage years.
  • The first exposure to pornography among men is 12 years old, on average.
  • 71% of teens hide online behavior from their parents.
  • Teens and young adults ages 13–24 believe not recycling is worse than viewing pornography (https://www.covenanteyes.com/pornstats/, 2021).
These statistics are leveraged as evidence of the dangerous pornification of culture; more importantly, however, they reflect a need to educate young people about what exactly pornography is, and how it does and does not reflect realistic sexual encounters. A 2021 survey, Preliminary Insights from a U.S. Probability Sample on Adolescents’ Pornography Exposure, Media Psychology, and Sexual Aggression, indicated an even higher rate of teenage consumption of “adult” media, finding that 84.4% of 14-to-18-year-old males and 57% of 14-to-18-year-old females have viewed pornography. These high estimates of teenage pornography consumption support Tarrant’s claim that “In light of the data, it is easily argued that improving media literacy skills provide important foundations for navigating the sex-tech nexus,” specifying that “Including pornography as a specific component of media literacy is crucial to promoting sexual health, pleasure, and safety in a media-saturated culture” (Tarrant, 2015). Two interventions are necessary for mending the relationship between self, sexuality, and technology on the individual and collective levels: porn literacy and design justice. On a micro scale, developing porn literacy can be facilitated in formal and informal learning environments as part of larger conversations about technology, social media, advertisements, relationships, sex education, and so on. There are websites and other online learning tools designed to make porn literacy, or “the porn conversation,” accessible to anyone with an internet connection, but the availability of these resources may become more limited.

This leads me to design justice. While porn literacy would help demystify and normalize pornography and sex work on the individual level, for systemic change to occur, encoded bias must be decoded, and digital violence must be taken seriously as an oppressive tool of social control. Although design justice is a hot topic in digital humanities research, the input of sex workers and sex worker organizers is critically missing. Catherine D’Ignazio and Lauren Klein’s Data Feminism discusses the role of data science and data justice in seeking liberation rooted in cross-movement solidarity. In their book, D’Ignazio and Klein demonstrate how who collects data, what data does and does not count, who inputs that data, and how it is then interpreted and used all significantly shape what is eventually communicated as objective numbers. They poke holes in the infallibility of tech by chronicling the very human errors and biases that impact design. In their call for educators to “Teach data like an intersectional feminist,” the authors assert that “In general, those who wield their data from a position of power tend to use that technology to preserve a status quo in which they are on top. This is true even when the people in power think of themselves as being anti-racist and anti-sexist because ‘privilege is blind to those who have it’” (D’Ignazio and Klein, 2020). Where D’Ignazio and Klein excuse some of the harm caused by technology as part of this “privilege hazard,” I would argue that harm done by technology to criminalized communities is never accidental. The optimistic belief that automated content suppression and deletion, which censor the very language entire communities organize around, happen because of harmless oversights is not supported by the history of surveillance and criminalization of marginalized communities on the land occupied by the United States.

Sasha Costanza-Chock introduces her book, Design Justice: Community-Led Practices to Build the Worlds We Need, as being about the relationship between design and power and the communities working to transform that relationship so that tech does not continue to reinforce interlocking systems of inequality. Costanza-Chock invites us to imagine building “a better world, a world where many worlds fit; linked worlds of collective liberation and ecological sustainability” (Costanza-Chock, 2020), by grounding the exploration of tech design and social justice in the experiences of activists and community organizers. I was surprised to find no reference to sex workers or sex work in this book about community-led approaches to design justice. Sex workers and sex worker organizers are experienced with digital violence and skilled at circumventing whorephobic design where possible. Not only that, but many other instances of digital violence directed at people who have never engaged in sex work can be linked to encoded anti-sex work bias.

For example, a friend of mine, a trans-masculine non-binary person, recently had a video they posted to mark a moment in their transition removed from TikTok for “Minor Safety.” In the flagged video, he is shown shirtless, only from the shoulders up, lip-syncing to Olivia Rodrigo’s “good 4 u.” With the removal came a warning that their “next violation could result in being blocked from some features,” leaving my friend questioning whether, based on the data TikTok has collected about them, the platform is repeatedly misgendering them and interpreting their bare shoulders as a sexual threat to minor safety, or whether TikTok perceives his trans-masc body as dangerous for children to see. Although the algorithm sought to remove pornographic content, it instead participated in digital transphobia. Rather than removing the imagined corruptive, violent, hard-core pornography supposedly being advertised to children, TikTok removed fifteen seconds of a twenty-something-year-old Brown trans person sharing a moment of joy as they captured their gender euphoria on camera for his friends, and for young trans people, to see and feel less lonely. Retelling this story, I am reminded of the ongoing public debates about whether knowledge of the existence of transgender and queer identities is appropriate for children. Having queer and trans identities has also been historically constructed as a perverse form of sexual deviance. Queer people and people in the sex industry have been accused of being “perverts,” “child predators,” and “criminals,” so it is unsurprising that members of LGBTQ+ communities may experience anti-sex work biases as a part of, and in addition to, transphobic and homophobic tech design. However, it is more common to hear outrage against obvious anti-LGBTQ+ biases in social media content moderation than to encounter widespread demands to address whorephobia.
I think this is partly because most people outside of sex work do not see the removal of their artwork, advertisements, or educational content as part of anti-sex work digital violence. Artists and activists will describe platforms as sex-negative, conservative, and sexist. They will fiercely post and re-post the “female nipple,” rallying followers to call out platforms’ sexism and racism. I have seen watercolor painters, fat models, and trans performance artists have their Instagram accounts shadowbanned, deleted, and recreated, interpreting being censored, shadowbanned, or deplatformed as a reaction to their representations of radical body acceptance. They often point toward users who have shared similar content, but who hold more privileged identities (closer in proximity to the thin, able-bodied, cis-gender, heterosexual, white ideal), and who did not face consequences from the app. Censored users may identify their experiences as connected to encoded sexism, fatphobia, or transphobia, but the thing that connects them all is sex, and how they have been interpreted as promoting sex.
 
