2021 Year In Review: Sex Online - EFF

We don’t entrust Internet companies to be arbiters of morality, and we shouldn’t hand them the responsibility of making broad cultural decisions about the forms sexuality is allowed to take online. And yet, this past year brought a steady drumbeat of Internet companies doing just that. Rather than seeking consensus among users, Apple, Mastercard, Amazon, eBay, and others chose to impose their own values on the world, negatively affecting a broad range of people.
The ability to express oneself fully—including the expression of one’s sexuality—is a vital part of freedom of expression, and without that ability, free speech is an incomplete and impotent concept.
To be clear, we are talking here about legal sexual expression, fully protected in the US by the First Amendment, and not the limited category of sexual expression, called “obscenity” in US law, the distribution of which may be criminalized.
Here is a tiring and non-exhaustive list of the ways Internet platforms have taken it upon themselves to undermine free expression in this way in 2021.
It’s apt to look back at the events leading up to SESTA/FOSTA, the infamously harmful carveout to Section 230, as it set the stage for platforms to adopt severe moderation policies around sexual content. Just before SESTA/FOSTA passed, the adult classified listing site Backpage was shut down by the US government, and two of its executives were indicted under pre-SESTA/FOSTA legal authority.
FOSTA/SESTA sought to curb sex trafficking by removing legal immunity from platforms that hosted content that could be linked to it. In a grim slapstick routine of blundering censorship, platforms responded by broadly disappearing any questionable content, since the law set no clear and granular rules about what is and isn’t covered. Another notable casualty was Craigslist’s Personals section, whose shutdown, unlike Backpage’s, was a direct result of FOSTA. As many predicted, the bill itself did not prevent any trafficking but actually increased it; enough so that a follow-up bill, the SAFE SEX Workers Study Act, was introduced to fund federal research into the harms that befall people in the sex industry when online resources and platforms are removed.
The culture of fear around hosting sexual expression has only intensified since, and we continue to track the ramifications of SESTA/FOSTA. Jump ahead to December 2020:
Late last year, Pornhub announced that it would purge all unverified content from its platform. Shortly after the news broke, Visa and Mastercard revealed they were cutting ties with Pornhub. As we have detailed, payment processors are a unique infrastructural chokepoint in the ecosystem of the Internet, and it sets a dangerous precedent when they decide what types of content are allowed.
A few months after the public breakup with Pornhub, Mastercard announced it would require “clear, unambiguous and documented consent” for all content on the adult sites it partners with, as well as age and identity verification for all performers in that content. Mastercard claimed this was an effort to protect us, showing little regard for whether smaller companies could meet these demands while their revenue stream is held ransom.
Not long after that, Mindgeek, parent company to Pornhub, closed the doors for good on its other property, Xtube, which was burdened with the same demands.
[Image: Banner message on XTube announcing that the site is now closed for good]
As documented by the Daily Beast and other publications, the campaign against Mindgeek was a concerted effort by the evangelical group Exodus Cry and its supporters, though that detail was omitted from the preceding New York Times piece, which seemed to garner support from the general public.
This moral policing by financial institutions would set a precedent for the rest of 2021.
Just this month, December 2021, AVN Media Network announced that, under pressure from banking institutions, it will discontinue all monetization features on its sites AVN Stars and GayVN Stars, both platforms where performers sell video clips. As of January 1, 2022, all content on those platforms will be free, and performers cannot be paid directly for their labor on them.
In May, Twitch revoked hot tub streamers’ ability to make money from advertisements. These streamers weren’t in clear violation of any point in Twitch’s community guidelines; it was more a “you know it when you see it” type of violation. It’s not a far leap to connect this to the “brand safety score” that a cybersecurity researcher discovered in Twitch’s internal APIs. The company responded that the mechanism was simply a way to ensure advertisements were “appropriately matched” with the right communities, then said in its follow-up statement: “Sexually suggestive content—and where to draw the line—is an area that is particularly complex to assess, as sexual suggestiveness is a spectrum that involves some degree of personal interpretation of where the line falls.” After conceding its inconsistent enforcement and unclear policies, Twitch added a special category for hot tub streamers. No word yet on its follow-up promise to publish clearer community standards.
During this year’s WWDC conference, where Apple unveils new features and changes to its products, a quiet footnote to the announcements was a change to the App Store Review Guidelines: “hookup apps” that include pornography would no longer be allowed on the App Store. Following outcries that this would have a disproportionate impact on LGBTQ+ apps, Apple told reporters that such apps, like Grindr and Scruff, wouldn’t be affected; only apps that featured pornography would be banned. It did not comment on whether, or how, it had cracked the code of determining what is and isn’t porn.
Discord describes itself as “giving the people the power to create space to find belonging in their lives”—that is, unless Apple thinks it’s icky. Discord now prohibits all iOS users from accessing NSFW servers, regardless of age. Much like Tumblr’s 2018 purge, this is likely a response to the strict Apple App Store policies mentioned above. Adult users are no longer allowed to view entirely legal NSFW content on Discord via iOS, though these servers remain accessible on Android and desktop.
In August, OnlyFans declared that it would ban explicit content starting in October. Given that explicit content is the mainstay of the platform’s business, the announcement was baffling. Following significant pushback and negative press, OnlyFans reversed the decision just a week later.
With just a month’s notice, eBay revised its guidelines to ban adult items starting in June. Offending material includes movies, video games, comic books, manga, magazines, and more. Confusing matters further, eBay took care to note that nudist publications (also known as naturist publications: usually non-sexual media representing the lifestyle of those who choose not to wear clothes) would not be allowed, while risqué sexually explicit art from before 1940 and some pin-up art from before 1970 would be.
Many have pointed out that this change endangers the archiving and preservation of LGBTQ history.
Instagram, a platform often criticized for its opaque restrictions on sexual content, stands out in this list as the only example that puts the choice of what to see in users’ hands.
Its new “Sensitive Content Control,” released in July, lets users choose how restrictively the content they view in the app is moderated.
[Image: Instagram’s Sensitive Content Control menu]
Although Instagram still has many, many, many issues when it comes to regulating sexual content on its platform, a feature like this, or at the very least this type of interface, is a step in the right direction. Perhaps the company is paying attention to the petition with over 120,000 signatures pleading with it to stop unfairly censoring sexuality.
Given that no two user groups will agree on what material is too sexually explicit for social media, and that Instagram itself can’t agree with the professional art world on what is pornography versus art, the obvious choice is to let users decide.
Let this “Sensitive Content Control” be a proof of concept for how to appropriately implement a moderation feature. Anything beyond what is already illegal should be up to users—and not advertisers, shareholders, developers, or biased algorithms—to decide whether or not they want to see it.
The Internet is a complex amalgam of protocols, processes, and patchwork feature-sets constructed to accommodate all users. Like scar tissue, the layers are grown out of a need to represent us, a reflection of our complexities in the real world. Unfortunately, the last few years have been regressive to that growth; what we’ve instead seen is a pruning of that accommodation, a narrowing of the scope of possibility. Rather than a complex ecosystem that contains edges, like in the offline world, those edges are being shaved down and eliminated to make the ecosystem more child-proof. 
Research shows that overly restrictive moderation is discriminatory and destructive to non-normative communities: communities that, because of their marginalized status, may occupy some of the very edges these platforms deem too dangerous to exist. Laying unnecessary restrictions on how marginalized people get to exist online, whether intentional or not, has real-world effects. It widens the margins that keep people from living with safety, dignity, free expression, and autonomy.
If we take the proof of concept from the above Instagram example, we can imagine a way to accommodate more possibilities, without sanding down the edges for all. 
And if we’ve learned anything from the few positive changes made by companies this year, it’s that these platforms occasionally do listen to their user base. They’re more likely to listen when reminded that people, not the homogenous monopolies they’ve constructed for themselves, hold the power. That’s why we’ll continue to work with diverse communities to hold their feet to the fire and help ensure a future where free expression is for everyone.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2021.