In his concurrence in the 1964 US Supreme Court case Jacobellis v. Ohio, Justice Potter Stewart famously argued that while he could not provide a formal explanation of what constituted obscenity, he knew it when he saw it. Today, machines still cannot provide a formal explanation of what constitutes pornography, but now they also know it when they see it.
For Google, porn is like a virus, constantly mutating in form and strategy to evade detection and infect the healthy body of online content. Google has leveraged this perspective to appoint itself judge and jury of obscenity online.
In Stewart’s day, leaving decisions about what constituted obscenity in the hands of local judges produced a de facto heteronormative censorship regime. This was in no small part due to the fact that the judges deciding what constituted obscenity were almost always straight, cis, white males channelling what they believed to be the morals of the majority of their community. For example, judges of the time saw merit in letting Playboy and even Hustler go to press, but not LGBTQIA+ pornography or sex-education materials (Strub 2013). SafeSearch’s quest to eliminate all ‘viral pathogens’ similarly leads to overbroad censorship that unduly targets LGBTQIA+ content online.
SafeSearch is heteronormative when it exercises each of these three functions of power. The definitions of sex, sexuality, and pornography that it generates privilege heterosexuality, making it the default and defining alternatives as aberrations. As I will explore, SafeSearch sexualizes and pornifies breasts and connects sodomy to bestiality. The way that SafeSearch regulates flows of content is also heteronormative in that it often blocks sex educational or artistic materials when it is turned on. Even when it is turned off, SafeSearch governs the flow of internet traffic in a manner advantageous to mainstream heterosexual pornography producers. Lastly, SafeSearch reinforces gender and LGBTQIA+ inequalities not only by making heterosexuality and mainstream heteroporn the norm online, but by making it harder for people to find safe spaces to explore alternative sexualities and learn about sex online (e.g. the censorship of sex education and queer community spaces online). SafeSearch is at work in every Google Search, even if you turn it off. This includes Google Image searches, Google News searches, and any other kind of search on its platform.
Through SafeSearch, Google has embedded a new digital compulsory heterosexuality into the infrastructure of the internet itself.
To return to the spam-as-virus metaphor, the broadness of the ‘racy’ classifier indicates that it is more important to eradicate any viral pathogens than it is to preserve benign organisms. In more practical terms, blocking porn is more important than not blocking non-porn, including art. Take the Venus de Milo, for example. When I ran a Cloud Vision analysis of a standard Wikimedia Commons image of the statue, the API was convinced that it was likely a ‘racy’ image (see Figure 1).
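The Cloud Vision check described above can be reproduced with a few lines of client code. The following is a minimal sketch, not the author’s own script, assuming the official google-cloud-vision Python client, working credentials, and a placeholder image URL standing in for the Wikimedia Commons photograph:

```python
# Minimal sketch of a Cloud Vision SafeSearch query (assumes the
# google-cloud-vision Python client and application-default credentials).
from google.cloud import vision

# Placeholder URL -- substitute a publicly reachable copy of the
# Wikimedia Commons image of the Venus de Milo.
IMAGE_URL = "https://example.org/venus_de_milo.jpg"

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri=IMAGE_URL))

annotation = client.safe_search_detection(image=image).safe_search_annotation

# Each field is a Likelihood enum: UNKNOWN, VERY_UNLIKELY, UNLIKELY,
# POSSIBLE, LIKELY, VERY_LIKELY.
print("adult:", annotation.adult.name)
print("racy:", annotation.racy.name)
```

A verdict of LIKELY or VERY_LIKELY for the racy field corresponds to the result Monea reports for the statue; the exact likelihoods returned will depend on the image copy used and the current state of the model.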
There are strong connections between the heteronormative biases in the inputs to the SafeSearch algorithm (i.e. the datasets it is trained on) and the heteronormative outputs that it has (i.e. the censorship decisions that it makes), which lead to serious harm to LGBTQIA+ communities online.
[N]o matter how sophisticated the system, if you feed it heteronormative data, it will produce heteronormative results.
Google’s CNN seems to have learned the shape and texture of the average ‘female’ – and lighter-skinned – breast. This was confirmed by running images of ‘nude paintings’, ‘nude sculptures’, and ‘hentai’ (Japanese-styled nude and sexual drawings) through the system, all of which were frequently flagged as ‘racy’ content when they contained any semblance of a female breast – again, even when clothed. Needless to say, this result was not repeated when I ran images of bare men’s chests through the system.
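The comparison described here can be made concrete by tallying how often SafeSearch flags each category of image as ‘racy’. This is a hedged sketch only: the local folder names and the decision to count LIKELY or stronger as ‘flagged’ are assumptions for illustration, not the author’s published methodology.

```python
# Sketch: flag-rate comparison across image categories using Cloud Vision
# SafeSearch. Folder names and the LIKELY-or-stronger threshold are assumed.
from pathlib import Path
from google.cloud import vision

client = vision.ImageAnnotatorClient()
FLAGGED = {vision.Likelihood.LIKELY, vision.Likelihood.VERY_LIKELY}

def racy_rate(folder: str) -> float:
    """Fraction of local .jpg files that SafeSearch marks LIKELY+ racy."""
    paths = list(Path(folder).glob("*.jpg"))
    hits = 0
    for p in paths:
        annotation = client.safe_search_detection(
            image=vision.Image(content=p.read_bytes())
        ).safe_search_annotation
        if annotation.racy in FLAGGED:
            hits += 1
    return hits / len(paths) if paths else 0.0

# Hypothetical folders mirroring the categories discussed above.
for category in ("nude_paintings", "nude_sculptures", "hentai", "bare_male_chests"):
    print(category, racy_rate(category))
```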
While these women eventually won a carve-out to many censorship practices online wherein breastfeeding mothers were allowed to post content with visible nipples, the change preserved a cisnormative gender binary. This problem is hardcoded into the datasets that algorithms like these are trained on, in the first instance by the decision to assume stable gender binaries for the purposes of censoring ‘female’ nipples. This gender binarism in the dataset allows for breasts that are coded as ‘female’ to be associated with ‘pornography’, ‘adult content’, or ‘raciness’, thus capturing and reinforcing a culturally singular heteronormative bias. These assumptions have been productively challenged by trans women like Courtney Demone (2015), whose #DoIHaveBoobsNow? campaign on Facebook and Instagram showcased topless photographs at different phases of her hormone therapy and transition to raise the question of when her breasts became a content violation. Demone’s campaign not only demonstrates how ill-equipped cisnormative assumptions about gender are for regulating flows of online content, but also how our current censorship practices make it difficult for transgender and non-binary people to express their identities online without violating content moderation policies instituted without them in mind.
[O]ne synset for masturbation combines ‘self stimulation’ with ‘self-abuse’, both defining ‘manual stimulation of your own genital organ for sexual pleasure’ (WordNet n.d.). As Thomas Laqueur (2003) has shown, anti-masturbation discourses historically work to preserve heteronormativity. Here, masturbating to pornography is conceptually tethered to self-abuse in the computer’s worldview, which contributes to an ontological drive towards censoring porn.
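The synset in question can be inspected directly with NLTK’s WordNet interface. A minimal sketch, assuming nltk is installed and the WordNet corpus has been downloaded; the exact lemma names and glosses printed depend on the WordNet version bundled with the corpus:

```python
# Sketch: inspect WordNet's masturbation-related synsets via NLTK.
import nltk
nltk.download("wordnet", quiet=True)  # one-time corpus download
from nltk.corpus import wordnet as wn

for term in ("masturbation", "self-abuse"):
    for synset in wn.synsets(term):
        print(term, "->", synset.name(), synset.lemma_names(), "--", synset.definition())
```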
The system has no categories that call for the types of images necessary for it to learn the visual features of representations of transgender, intersex, or asexual relationships. ... By defining the existence, non-existence, and meaning of these terms, WordNet delimits the possibilities for what can be accomplished with a visual dataset that uses its labels, subtly inflecting it with heteronormative biases.
Even if these image data are not used to out people, the counting and classifying of LGBTQIA+ people has a long history of rendering them susceptible to dehumanization and violence (see Spade 2015). LGBTQIA+ precarity is only exacerbated now that private corporations control web-scale data collections and data analytics tools (Cheney-Lippold 2017, 220–221).
Many YouTube content creators have noted that Restricted Mode makes their LGBTQIA+-themed content invisible on the site. On the page of one bisexual user, named neonfiona, you can literally change her online identity from bisexual to straight by enabling Restricted Mode, which censors all of her content on dating women but leaves all of her similar content on dating men intact (Monea 2022).
Take, for example, Google’s understanding of the term ‘bisexual’. From 2009 to 2012 Google only understood the term ‘bisexual’ as a query for mainstream heteroporn. While the effects of this oversight at Google and their slowness to address it are quite bad, it is easy to understand how their algorithms would have come to such a conclusion. In mainstream porn, the term ‘bisexual’ is popularly appropriated heteronormatively to signal only scenes with females willing to engage in group sex with other women, for example male–female–female (MFF) threesomes.[3] The term ‘bisexual’, then, is hugely popular in mainstream heteroporn, and mainstream heteroporn comprises a large percentage of internet pornography (if not of the web in its entirety). As such, the term ‘bisexual’ actually is more likely to indicate pornography than not. And while it is a flagship term in the LGBTQIA+ marquee, bisexuals often speak of feeling under-represented or even marginalized in LGBTQIA+ discourse. With the term often being collapsed into its container acronym, one can see how this usage would have been less compelling to the filter’s machine learning protocols. The end result was Google adding the term ‘bisexual’ to a list of banned search terms that could cause a website to be deprioritized in search rankings if any of these terms appeared on the site. Because of this, for three years, all bisexual organizations and community resources were either deprioritized in Google search results or completely censored (Garcia 2012).
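To make the mechanism concrete, a term-based ban list like the one Garcia describes could demote a site mechanically along the following lines. This is a purely hypothetical sketch: the penalty factor, function name, and scoring scheme are invented for illustration and are not Google’s ranking code.

```python
# Hypothetical illustration of term-list deprioritization (not Google's code).
BANNED_TERMS = {"bisexual"}  # on the banned-term list from 2009 to 2012 (Garcia 2012)
PENALTY = 0.1                # invented multiplicative demotion factor

def adjusted_score(relevance: float, page_text: str) -> float:
    """Demote a page's ranking score if any banned term appears on the page."""
    words = set(page_text.lower().split())
    return relevance * PENALTY if words & BANNED_TERMS else relevance

# A bisexual community-resources page sinks in the rankings; an otherwise
# identical page without the term does not.
print(adjusted_score(0.9, "Bisexual community resources and support groups"))  # 0.09
print(adjusted_score(0.9, "Local community resources and support groups"))     # 0.9
```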
Google’s inability to parse context leads it to obscure whole corpuses of useful LGBTQIA+ discourse for as minor an infraction as a single explicit image on an entire site or forum, further sexualizing people’s identities and denigrating them as not fit for the public sphere.
Google has made sure that the flow of internet traffic to pornography is bottlenecked through a small set of keywords that it thinks indicate a user is intentionally searching for pornography. Because traffic is funnelled through these few keywords, it becomes very easy for the mainstream porn industry to maintain a financial lock on these search terms. As such, Google has essentially guaranteed the continued dominance of mainstream heteroporn.
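The query-side half of this bottleneck can be illustrated with a toy filter in which pornographic results surface only when the query itself contains one of a small set of explicit keywords. The keyword list and data shapes below are invented stand-ins, not Google’s actual implementation.

```python
# Toy illustration of query-keyword gating (keywords and fields are invented).
EXPLICIT_KEYWORDS = {"porn", "xxx"}

def gate_results(query: str, results: list[dict]) -> list[dict]:
    """Suppress results tagged as porn unless the query opts in explicitly."""
    wants_porn = bool(set(query.lower().split()) & EXPLICIT_KEYWORDS)
    return [r for r in results if wants_porn or not r["is_porn"]]

results = [
    {"url": "https://example.org/tube-site", "is_porn": True},
    {"url": "https://example.org/sex-ed", "is_porn": False},
]
print(gate_results("sex education", results))  # porn result suppressed
print(gate_results("free porn", results))      # both results shown
```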
In ensuring that you only get porn when you want it, Google has additionally ensured that you will always get the same kind of porn.
At this point, however, it is safe to assume that these keywords will reflect the heteronormativity so deeply engrained in both our pornography and SafeSearch’s code. The alternative, avant-garde, and experimental pornography that is the focus of much of porn studies may find itself continually and increasingly marginalized. If porn is where we go for a safe space not only to be affected – materially, symbolically, and sexually – but also to discover what affects us, this space has been sold for the sake of advert revenues.
[3] It is worth noting that the term bisexual is not used to describe MMF threesomes or larger group sex scenes in mainstream porn, and only begins to appear in LGBTQIA+ porn when men penetrate one another in these scenes.
☙