Algorithmic Folk Theories and Identity: How TikTok Users Co-Produce Knowledge of Identity and Engage in Algorithmic Resistance
Abstract
Algorithms in online platforms interact with users' identities in different ways. However, little is known about how users understand the interplay between identity and algorithmic processes on these platforms, and if and how such understandings shape their behavior in return. Through semi-structured interviews with 15 US-based TikTok users, we detail users' algorithmic folk theories of the For You Page algorithm in relation to two inter-connected identity types: person and social identity. Participants identified potential harms that can accompany algorithms' tailoring content to their person identities. Further, they believed the algorithm actively suppresses content related to marginalized social identities based on race and ethnicity, body size and physical appearance, ability status, class status, LGBTQ identity, and political and social justice group affiliation. We propose a new algorithmic folk theory of social feeds, The Identity Strainer Theory, to describe when users believe an algorithm filters out and suppresses certain social identities. In developing this theory, we introduce the concept of algorithmic privilege as held by users positioned to benefit from algorithms on the basis of their identities. We further propose the concept of algorithmic representational harm to refer to the harm users experience when they lack algorithmic privilege and are subjected to algorithmic symbolic annihilation. Additionally, we describe how participants changed their behaviors to shape their algorithmic identities to align with how they understood themselves, as well as to resist the suppression of marginalized social identities and lack of algorithmic privilege via individual actions, collective actions, and altering their performances. We theorize our findings to detail the ways the platform's algorithm and its users co-produce knowledge of identity on the platform. We argue the relationship between users' algorithmic folk theories and identity is consequential for social media platforms, as it impacts users' experiences, behaviors, sense of belonging, and perceived ability to be seen, heard, and feel valued by others as mediated through algorithmic systems.
By: Nadia Karizat, Dan Delmonaco, Motahhare Eslami, Nazanin Andalibi
Notes
- This paper focuses on TikTok in particular to explore the algorithm as it relates to both person and social identity. It proposes what's called the Identity Strainer Theory "to describe when users believe an algorithm filters out and suppresses certain social identities."
- Algorithmic privilege: held by users who benefit from the algorithm based on their identities. On the flip side, the paper applies the concept of "algorithmic representational harm" to refer to harms faced by those without algorithmic privilege who are "subjected to algorithmic symbolic annihilation." They go on: "...Algorithms and users’ identities interplay in online spaces in several ways. For example, content creators present and express their identities online, and viewers interact with online content related to their identities, interests, and curiosities; all of which are facilitated by algorithmic processes."
- Social media is an important space for identity work among LGBTQ people, people of color, people with disabilities, and other marginalized groups, and algorithms directly influence that work. The paper discusses different kinds of identities:
- Social identity: arises when individuals perceive a "fit" with, and membership in, specific social groups.
- Person identity: based on internalized characteristics that individuals attribute to themselves.
- Algorithmic identity: "an identity formation that works through mathematical algorithms to infer categories of identity in otherwise anonymous beings."
- The paper reports an interview study of identity formation in digital, algorithmically driven spaces. Participants detailed how they felt the algorithm defined their identity. Below are some observations of what shapes their understanding of how the algorithm works (a toy sketch of this folk model follows the list).
- Personal engagement: Participants expressed the belief that the algorithm is aware of their interests and therefore true to their person identity, based on how they use the app (e.g., commenting and liking). "Participants’ accounts suggest that the algorithm activates a user’s person identity through curating content that relates to a user’s individual self-concept (i.e. how a person understands and sees themselves)."
- Interesting note on relationships: Feeds shaped by whom a user follows and whom the app thinks they know (e.g., IRL connections or contacts synced from other apps) can open the risk of context collapse when users navigate different identities.
- Popular/trending content: The promotion of popular content can reflect an "ideal" identity in society.
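- Taken together, these beliefs describe the feed as an engagement-weighted recommender. The Python sketch below is a minimal, hypothetical rendering of that folk model, not TikTok's actual system; the tags, weights, and function names are all invented for illustration: engagement folds an item's tags into an inferred profile (the "algorithmic identity"), and candidate items are scored by profile match plus a popularity boost.

```python
# Minimal sketch of the folk model participants describe; all names,
# tags, and weights here are hypothetical, not TikTok's actual system.
from collections import Counter

def update_profile(profile: Counter, item_tags: list[str], engagement: float) -> None:
    """Fold an engagement signal (like, comment, rewatch) into the inferred profile."""
    for tag in item_tags:
        profile[tag] += engagement

def score(profile: Counter, item_tags: list[str], popularity: float,
          w_identity: float = 1.0, w_popularity: float = 0.5) -> float:
    """Folk theory: rank items by inferred-identity match plus a trending boost."""
    identity_match = sum(profile[tag] for tag in item_tags)
    return w_identity * identity_match + w_popularity * popularity

profile: Counter = Counter()
update_profile(profile, ["painting", "lgbtq"], engagement=2.0)  # user liked and commented
print(score(profile, ["painting"], popularity=10.0))  # 2.0 + 5.0 = 7.0: identity match
print(score(profile, ["dance"], popularity=90.0))     # 0.0 + 45.0: popularity can dominate
```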
- Overall, participants believed the algorithm both amplified and suppressed certain identities: race and ethnicity, class, LGBTQ identity, body size and physical appearance, ability status, and political and social justice affiliation. Below are some notes on how some participants resist the algorithm to intentionally shape their algorithmic identities.
- Participants changed how they engaged on the platform to shape their algorithmic identity and influence the content served to match how they saw themselves. Their folk theories shaped their behavior to encourage or discourage content reflecting their interests. Actions such as which hashtags they used also influenced their in-app networks.
- Changes in behavior include individual action and collective action, as well as altering how they perform or self-present. Individual action involves intentionally engaging with content or creators to influence the algorithm to support and serve their content; examples include liking content, watching videos multiple times, and following creators of suppressed identities. Collective action, meanwhile, aims to band together to amplify social justice content on the platform; this can include collectively liking or commenting on a post or lending one's own platform to someone with less algorithmic privilege. Together, these actions push back against what users believe is the algorithm's active suppression of specific identities or topics like social justice (see the sketch after this list).
- Another tactic is altering one's performance, such as changing one's aesthetic to bypass perceived algorithmic suppression. One example: talking about protests while painting, so the video stays visually focused on wholesome content even as potentially violent events are discussed.
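- The suppression-and-resistance folk theory above can be rendered as a toy threshold model. The sketch below is hypothetical; the suppressed tags, penalty, and visibility threshold are invented to illustrate why, under this belief, collective engagement would surface content that individual engagement cannot.

```python
# Hypothetical rendering of the Identity Strainer folk theory: suppressed
# tags discount engagement before a visibility check. All values are
# invented; the paper reports users' beliefs, not TikTok's implementation.
SUPPRESSED_TAGS = {"social_justice", "disability"}  # identities participants believed were filtered
VISIBILITY_THRESHOLD = 50.0

def is_surfaced(engagement: float, tags: set[str], penalty: float = 0.5) -> bool:
    """Apply the believed identity 'strainer', then the visibility cutoff."""
    if tags & SUPPRESSED_TAGS:
        engagement *= penalty  # the strainer discounts suppressed identities
    return engagement >= VISIBILITY_THRESHOLD

post = {"social_justice"}
print(is_surfaced(60.0, post))   # False: one user's likes get strained out (60 * 0.5 = 30)
print(is_surfaced(140.0, post))  # True: collective liking/commenting clears the bar (70)
```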
- The research posits that users and algorithms operate in a feedback loop: the algorithm shapes identity, and user action or resistance is in turn believed to shape the algorithm (a toy simulation follows below).
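- A short simulation can make this loop concrete. Everything below is hypothetical (the content pool, engagement values, and update rule are invented): the modeled feed recommends from an inferred profile, the user engages selectively as a form of resistance, and the profile, the "algorithmic identity", drifts in response.

```python
# Hypothetical feedback-loop simulation: recommendation -> selective
# engagement (resistance) -> updated algorithmic identity -> recommendation.
from collections import Counter

CATALOG = {"dance": 1.0, "disability_advocacy": 1.0}  # invented content pool

def recommend(profile: Counter) -> str:
    """Pick the catalog tag the inferred profile currently favors most."""
    return max(CATALOG, key=lambda tag: profile[tag] + CATALOG[tag])

profile = Counter({"dance": 5.0})  # the feed starts with a strongly inferred interest
for _ in range(5):
    shown = recommend(profile)
    # Resistance: rewatch/like only the suppressed-identity content, skip the rest,
    # deliberately steering the inferred identity over successive rounds.
    profile[shown] += 3.0 if shown == "disability_advocacy" else -2.0

print(profile.most_common())  # the loop has reshaped the algorithmic identity
```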