Sex educators practice self-censorship online to avoid content bans. Does it work?

“Did you know that sex shouldn’t hurt?” read the words on one Instagram reel posted by sexual health and wellness company The Pelvic People in early 2025.

At first glance, the missing “e” looks like a typo. But it isn’t. Scroll through The Pelvic People’s Instagram page and other similar pages on the platform, and you’ll see videos littered with words like “s*x,” “lub3” and “c00chie.”

This popular form of online slang, commonly referred to as “algospeak,” exists to circumvent content-moderation algorithms used by tech companies like Meta and TikTok.

Posts perform better with “a little bit of censorship,” says Emily Tran, social media manager at the Pelvic Health and Rehabilitation Center, who uses the center’s Instagram, @pelvichealth, to share evidence-based information about sexual health.

“If I were to write something about the female anatomy and use the correct terms, [the algorithm] could flag it and say it’s inappropriate,” Tran added.

The algorithm would then reduce the visibility of the post, leading to a decrease in engagement, something Tran said happened to the account before she started using algospeak.

Creators who discuss sensitive topics often see this type of self-censorship as essential to posting freely without losing engagement. But, experts said in interviews with Rewire News Group, censored language such as algospeak can fuel stigma around these topics and even pose a threat to sexual health. And recent research has cast doubt on whether using algospeak actually improves posts’ visibility on social media.

Censorship breeds censorship

Researchers from the University of Ss. Cyril and Methodius in Slovakia collected the fifty most recent Instagram posts from nine sex education accounts that post in English, Slovak or Czech. They analyzed the likes, comments and shares on a total of 450 posts to understand overall engagement.

The resulting study, published in the journal Media Literacy and Academic Research in June 2025, found no statistically significant difference in the number of likes and shares between posts that used algospeak and posts that did not.

Researchers also found a slightly lower number of comments on posts that did not use algospeak, which they attributed to other potential factors, such as uncensored videos having a more academic tone.

Another limiting factor, said the study’s lead author, Michal Kabát, in an interview with Rewire News Group, was that “the accounts do not use algospeak consistently.” But because a given account often posts content both with and without algospeak, it’s unlikely that his study’s findings can be explained simply by more popular accounts using less algospeak.

Ultimately, algospeak is probably an unnecessary precaution for sex education, Kabát said.

“[People] feel this obligation to somehow fit into some unwritten rules,” Kabát said.

Promoting censorship of sexual language reinforces taboos about sex and frames sexual organs as inappropriate to acknowledge, Kabát said.

“This does not contribute to establishing open, clear and taboo-free communication,” Kabát added.

If a topic is portrayed online as one that should not be talked about openly, Kabát said, “[viewers] won’t talk about it, they won’t use the words, or they’ll self-censor themselves.”

It’s a phenomenon he says he’s already noticed in comment sections, where users, despite not having to worry about engagement, repeat the same algospeak used in a post.

Why language matters

Sex education is missing from many school systems in the United States and around the world. Often the curriculum focuses on teaching people how to avoid sexually transmitted infections and unplanned pregnancies, without much discussion of anatomy and sexual health.

Social media can fill these gaps for many young people by making reliable information about sexual health accessible. However, censoring language around these topics online can hinder clear communication and lead to feelings of shame around uncensored sexual language.

This practice can be especially harmful to those assigned female at birth. People with female anatomy report experiencing feelings of shame about their genitals more often than those with male anatomy, and studies show that parents often use euphemisms such as “down there” and “private parts” when talking to their children about female genitalia. As a result, even young children are more likely to know the actual names for male genitalia than for female genitalia.

This linguistic discomfort with female anatomy is visible even on TV. For example, a 2023 review of censorship of the word “vagina” found that network broadcast officials required the show Grey’s Anatomy to replace the word “vagina” with “vajayjay.” Meanwhile, the word “penis” was said 17 times in a single episode.

That kind of censorship can have real consequences for sexual health: “Cryptic language can inadvertently reinforce the idea that sexuality and genital anatomy are shameful,” says Taylor Roebotham, a gynecologist at the London Health Sciences Centre in Ontario, Canada. Many of her patients already struggle to explain their symptoms, she added, often because they don’t have the language to do so.

“I’ve heard patients say they had a problem with their vagina, using the only word they know for that part of the body,” Roebotham said. “But upon examination, they actually had more of a musculoskeletal problem with their pubic bone.”

Communication problems like these can delay care, because doctors may refer patients to the wrong specialist or examine the wrong part of the body.

Sometimes stigma can prevent patients from seeking care in the first place.

A 2024 study that interviewed patients with vulvar lichen sclerosus, a chronic skin condition that causes pain, itching and discoloration of the genitals, found that many participants experienced diagnostic delays because they felt uncomfortable talking to healthcare providers about their genitals or didn’t even notice there was a problem.

“Some women in my study said they didn’t even think they had to acknowledge the existence of their vulva,” says Sophie Rees, a social scientist at the University of Bristol and co-author of the 2024 study.

How to move away from algospeak

It may seem as if Kabát’s study points to abandoning algospeak altogether as the obvious solution.

But for many sex education creators, doing away with algospeak completely may not feel like a real possibility. Social media companies are often vague about their content restriction policies, and it’s never entirely clear what kinds of posts get flagged, restricted, or even banned. For those who make a living creating content, the threat of financial loss can make uncensored language too much of a risk without clearer content guidelines.

Meta, the company that owns Instagram and Facebook, has been criticized for inconsistently enforcing its policies and for failing to provide explanations when posts are restricted. The company has promised to disclose the reasons for content restrictions, Kabát said, but users claim this does not always happen. Research shows that “shadow banning,” where a platform reduces the visibility of a user’s posts without notifying them, leaves users aware only of the sudden drop in engagement that results, and the practice is widely reported on Instagram.

Tran, whose job is to provide evidence-based sexual health information to the public, tries to keep her algospeak as similar as possible to the language it replaces, she said, so the censorship doesn’t become distracting. “We just want to put out information that we think should be accessible to everyone,” Tran said, “while creating a community that wants to engage and have conversations.”

Still, experts warn that in the long term, the risks of using algospeak may outweigh the benefits.

“If creators collectively returned to medically correct terminology, would sexual health content disappear completely? Probably not,” Roebotham said. “Even if algorithms prefer censored language, they still need content, and we should inundate them with thoughtful information.”
