Abstract
A recent trend across online forums, social media posts, and personal essays credits TikTok’s For You Page recommendation algorithm with revealing users’ sexual identities to the users themselves, framing the social media platform less as a tool for self-discovery than as a mechanism of algorithmic categorization. As a result, popular articles with titles like “TikTok Thinks I’m Gay” and “TikTok Made Me Gay” are on the rise. These accounts lean into an algorithmic culture that emphasizes the role of technology without accounting for a broader assemblage of meaning-making. By setting claims about TikTok’s ability to “determine” sexual orientation within a longer lineage of coming out, and by deconstructing both the algorithms and the online rhetoric surrounding them, this article illustrates how critical engagement and algorithmic auditing can recover layers of meaning that this shift in social media rhetoric abstracts away.
"Tik Tok Made Me Gay"
Sarah Barrios is a singer-songwriter for the digital age. During the COVID-19 pandemic, Barrios built a fanbase across social media platforms, leveraging viral trends and staying highly responsive to her online followers while establishing a music career. 1 On TikTok, the video-sharing social media site owned by ByteDance, Barrios has accumulated over 766,000 followers. One of her frequent engagement tactics has been to poll these followers as a means of crowdsourcing lyrical inspiration, with the catchphrase “You ask, I write.” Replying to one follower’s request, Barrios uploaded a short, original song in mid-2021, the lyrics of which recounted her then-recent experience of coming out as bisexual. These are the lyrics of the song in their entirety:
Oh my god / Think I made a mess
Cause I got this app / To relieve some stress
And I always knew / that I wasn’t straight
But now I can honestly say / TikTok made me gay. 2
The song drew more than 5,000 comments, the majority of which amount to viewers’ nods of agreement—replies like “relatable,” “honestly true,” “where’s the lie?,” and even “I’m questioning everything because of TikTok.”
Yet, mixed into these positive responses are less enthusiastic ones. Several comments on Barrios’s post oppose this new approach to expressing sexual identity, framed here as something that the app—or, more specifically, the app’s algorithm—has revealed to or caused in the user. One such comment asks, “How can TikTok make you gay?” Another writes, “[T]hat’s not how it works.” Comments like these illustrate how some users are frustrated by Barrios’s framing of coming out and how her claims about TikTok’s agentiality seem to disrupt the ways in which coming out narratives have historically operated. One difficulty in this debate over how coming out “works,” however, is that both sides of the conversation have been simplified, reduced, and abstracted into quick missives that stake out positions without fully addressing the broader implications. In writing, “[T]hat’s not how it works,” for instance, what is the “it” being referenced: coming out or the algorithm? Increasingly, to understand the former requires engaging with the latter.
As works like Pedro Domingos’s The Master Algorithm, Virginia Eubanks’s Automating Inequality, Safiya Umoja Noble’s Algorithms of Oppression, and Cathy O’Neil’s Weapons of Math Destruction have demonstrated, nearly every aspect of our lives is now affected by an increasingly algorithmic culture. This article positions claims like “TikTok made me gay” as a case study that illuminates a broader cultural shift, one that embraces algorithmic agency while abstracting the role of other actors in an assemblage of meaning-making. By drawing on assemblage theory and the work of Jill Walker Rettberg, this analysis also recursively highlights a longer lineage in which questions of power, agency, and control predate the alarmist tones that so often accompany contemporary discourse around algorithms broadly, and coming out narratives specifically. As this work highlights, there is plenty to critique in our algorithmic culture, yet engaging critically with an algorithm like TikTok’s For You Page (or FYP, the app’s recommendation algorithm that personalizes each user’s content) can ultimately promote a sense of empowerment, not fear, reminding users of their own role in the assemblage. 3
This is especially important when considering the sheer amount of attention that TikTok’s algorithms have received in recent years. TikTok was launched globally in 2017, and it quickly gained both followers and media interest thanks in large part to its advanced algorithms. Writing for The Guardian, Alex Hern refers to the platform’s personalization algorithms as the “company’s secret sauce,” as they have proved “uncannily good” at anticipating individual users’ interests. 4 One result of the recommendation algorithm’s efficacy is that many queer or proto-queer users find that their FYP quickly corrals them into what has colloquially come to be known as the “gay side” of TikTok—full of content created by and for the app’s queer users. As illustrated by the trend noted above, the algorithm’s categorization of users has, at times, preceded the users’ own solidified sense of queer identity.
At the outset, it should be noted that this trend is just one of many that exist among queer communities online. Much of the scholarship on the topic of coming out highlights how this common rite of passage looks different for different individuals. One experience—or one emerging online trend—cannot be expected to represent all queer individuals and their coming out acts. Yet these claims appear not only across TikTok videos and comments, and in Reddit forums dedicated to those TikTok uploads, but also in a growing number of online articles published in the past five years, with titles like “TikTok Thinks I’m Gay,” “TikTok Made Me Gay,” “TikTok’s Algorithms Knew I was Bi Before I Did,” and “TikTok’s Algorithm is Making Women Who’ve Only Dated Men Realize They’re Queer.” 5 Headlines like these adopt the same rhetoric as Barrios, positioning TikTok and its recommendation algorithm as the subject of a coming out statement. It would be easy to accuse such headlines of technological determinism, the belief that technology drives cultural change rather than resulting from human decision and innovation, but something more complicated is going on here. As the following analysis illustrates, claims that advance a cause-and-effect relationship between algorithms and their users’ sexual identities are mere abstractions; deconstructing them gives way to much larger assemblages.
Coming Out: Agency and Assemblage
Scholarship on the subject of coming out has—for decades—emphasized the individual’s role in first declaring one’s sexual orientation. Many definitions of the act sound similar to what queer theory and rhetoric scholar Jen Bacon proposes in her work, which frames “coming out” as the “act of articulating one’s gay or lesbian identity to the non-queer world.” As she writes, coming out “entails, quite simply, saying ‘I’m gay’ to someone who did not previously know.” 6 Although this offers a useful entry point into discourse on the topic, Bacon’s definition is perhaps too narrow for many queer individuals. It is worth noting, for instance, that Bacon focuses specifically on gay and lesbian identities, thereby neglecting a range of other LGBTQ+ identities. Furthermore, some—like myself—have “come out” to family and friends only to hear that their listeners already knew, or at least suspected, the speaker’s sexual orientation. This chronology also complicates Bacon’s definition, yet responses of “already knowing” have become increasingly common on social media platforms such as YouTube, TikTok, and Instagram, where comment sections often turn coming out into a competition over who “knew” first. 7
Beyond ascribing an order to events, Bacon’s definition is also firmly situated in time, marking how coming out might function at the turn of the millennium—at the time of her writing in 1998. That is to say, it depicts a fairly contemporary understanding of how coming out works, not taking into account how these speech acts operated in the first half of the twentieth century. Prior to the gay liberation movement that began in the 1960s, for instance, coming out was often framed not as a step out of heteronormative society (i.e., as a means of marking one’s difference) but rather as a step into the gay community (expressing a sense of belonging). As Nicholas A. Guittar writes, coming out was once a means of announcing one’s entrance into queer society—as being “part of ‘the club’ among gay circles.” 8
Then again, Bacon’s definition might not be contemporary enough. Coming out is still often framed through an active stepping out, away, or into, yet “speaking” is no longer as vital on our multimodal Web 2.0 platforms as it was in the past. Consider Eugene Lee Yang’s coming out video uploaded to YouTube in 2019. 9 He never utters the words so often associated with coming out (“I am gay,” “I am bisexual,” etc.), but through a dance performance, he manages to effectively convey his sexual identity to viewers. Through movements choreographed to instrumental music, Yang embraces another male dance partner, surrounds himself in queer community, and pays homage to those who have been—and, indeed, continue to be—persecuted for their sexual identity.
These different definitions and examples highlight the range of coming out narratives that exist. Yet, importantly, they also unanimously position the queer individual performing the act of coming out as the agential power, as the actor controlling the narrative, and as the subject performing the action. Through the framing of buzzy headlines and TikTok tags, however, this role shifts; the recommendation algorithm becomes the subject, the actor, that “makes” its user gay while the user is regarded as the object that receives this action. This shift—between active voice and passive, performative utterance and constative—is what the negative comments on Barrios’s post seem to be addressing.
Then again, to say that this is an unprecedented shift in coming out rhetoric would be inaccurate. Michel Foucault’s early and oft-cited work on coming out, for instance, framed these speech acts in complex, nuanced ways—blurring the distinction between active subject and passive object nearly fifty years ago. As he observed in his first volume of The History of Sexuality, individuals in the nineteenth century who deviated from heterosexual norms were initially compelled to confess their deviations through the combined influence of powerful institutions related to medicine, law, and religion. As Foucault observes, those who complied with these institutional pressures were regarded as sexual deviants, not liberated subjects. Here, coming out ascribes none of the power, agency, and control to the queer speaker. However, Foucault goes on to identify the “reverse discourse” that this early form of coming out produced, a discourse more in sync with those twentieth-century definitions and examples referenced above. Through “reverse discourse,” marginalized subjects adopted the very language that was once used to oppress them and repurposed it to describe their own developing sense of identity and community. According to Foucault, “homosexuality began to speak in its own behalf, to demand that its legitimacy or ‘naturality’ be acknowledged, often in the same vocabulary, using the same categories by which it was medically disqualified.” 10
In this discursive shift, Foucault still avoids placing the queer individual as the object of his sentence. He does not write that “the homosexual,” as dated as that term might be, “began to speak on his or her own behalf” but rather that “homosexuality began to speak in its own behalf.” 11 Perhaps this is to emphasize the way in which—through Foucault’s biopolitical framework—the concept of homosexuality, as society has collectively constructed it, wields a power of its own. This concept, this construct, is also active. Through Foucault’s phrasing, this identity category is perhaps even more active than the individual who adopts the identity. In applying this dynamic to rhetoric on TikTok, to say that a recommendation algorithm has “made” users gay (has revealed to them some truth about their sexual identity) is not such a drastic leap in discourse. It continues to ascribe power and agency to cultural constructs—shifting from discursive technologies (speech and speech acts) to algorithmic ones.
This understanding offers a useful foundation for the way Jill Walker Rettberg has discussed our relationship to technology—more broadly—through assemblage theory, building on earlier scholarship by theorists like Deleuze, Guattari, and Latour. Through this lens, Rettberg observes that our everyday technologies—smartphones, digital assistants, streaming services, and so on—might be fruitfully regarded as “companion[s],” as things that neither control us nor are entirely controlled by us but, instead, as objects with “agency” of their own. “When we use technology,” she observes, “we enter in a relationship with it. That relationship affects us, it affects the technology, and it affects the people and environment around us.” 12 For this reason, Rettberg offers assemblages as a way to reexamine a shared sense of power, agency, and control as they’re spread across a network of actors (of people, technologies, and social institutions).
Such a viewpoint allows us to maneuver around the technologically determinist nature of claims that position algorithms as “making” their users gay. There is a social benefit to this reframing as well, as more nuanced understandings of how queer identities operate might squelch the type of conservative rhetoric that has so often accused media representations and social media platforms alike of “grooming” young viewers and users into being gay, as if sexual orientation were the result of some singular influence. 13 At the same time, assemblage theory can correct a knee-jerk reaction that might refuse these technologies any agency at all. Users can, for instance, acknowledge that TikTok’s FYP recommendations have affected them and their understanding of queer identity both on- and offline without letting this be the entire narrative.
This phenomenon of co-producing meaning and identity is one that a range of other studies have found to be the new norm. In a paper titled “Algorithmic Folk Theories and Identity,” for instance, Nadia Karizat et al. observe that social media users engaging with recommendation algorithms are likely to adapt and shape their behavior to their understanding of how an algorithm operates. The titular phrase “algorithmic folk theories” refers to the ways in which users’ knowledge of how social media algorithms work is often only partial and unconfirmed. Rather than relying on concrete knowledge, users often turn to theoretical understandings and guesswork. Still, these theories allow users to “make sense of what they see and experience on these platforms.” 14
All of this seems only to extend Foucault’s earlier claims that queer individuals are never fully or solely the authors of their own coming out narratives—that there is a network of actors, an assemblage of subjects co-authoring the act. The primary difference, however, is that through our new algorithmically informed formations of the self, who these actors are, what their motives might be, and how much of the assemblage falls to them have all become difficult and, at times, even impossible to determine due to advancements in deep learning algorithms that rely on layers and layers of abstraction—the process by which complex information is simplified as elements deemed irrelevant or unimportant are removed.
ERR_NETWORK_CHANGED: Troubling the Assemblage
On its own website, TikTok offers a glimpse of its algorithm’s inner workings through promotional materials that it curates and makes public. According to these materials, a user’s video recommendations are determined by a wide range of factors that include the user’s previously liked or shared videos, the accounts that the user follows, the accounts that the user comments on, video content that is similar to what the user uploads, the user’s own language preference, and the user’s device type. 15 This is by no means an exhaustive list, but even in this small glimpse, the For You algorithm’s complexity becomes apparent.
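To make that complexity concrete, consider a deliberately toy sketch of signal-weighted ranking built only from the factors named above. Everything in it is hypothetical: the weights, the field names, and the linear sum are inventions for illustration, since the actual For You model is closed and almost certainly nonlinear.

```python
# Hypothetical weights over the publicly named signals; the real For You
# model is closed and nonlinear, so this linear sum only illustrates the
# kind of inputs involved, not how TikTok actually combines them.
WEIGHTS = {
    "liked_or_shared_similar": 3.0,  # videos like ones the user liked or shared
    "follows_creator": 2.0,          # accounts the user follows
    "commented_on_creator": 1.5,     # accounts the user comments on
    "resembles_own_uploads": 1.0,    # content similar to the user's uploads
    "language_match": 0.5,           # the user's language preference
    "device_match": 0.25,            # the user's device type
}

def score(signals: dict[str, bool]) -> float:
    """Sum the weights of whichever named signals fire for a video."""
    return sum(w for key, w in WEIGHTS.items() if signals.get(key))

candidates = {
    "video_a": {"liked_or_shared_similar": True, "language_match": True},
    "video_b": {"follows_creator": True, "device_match": True},
}
feed = sorted(candidates, key=lambda v: score(candidates[v]), reverse=True)
print(feed)  # ['video_a', 'video_b']: video_a scores 3.5, video_b 2.25
```

Even this caricature hints at why auditing the real system is hard: the factors interact, the weights are hidden, and the list of signals is incomplete.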
In contrast, consider a much simpler algorithm. A popular WikiHow quiz titled “Am I Gay?” poses sixteen multiple-choice questions to online users. Developed by Stanford University therapist Deb Schneider, the quiz has been taken by nearly ten million users. Its questions are straightforward. The first query, for instance, asks users, “Have you ever had feelings for a same-gender close friend?” before listing a series of options: (A) “I think so. That’s why I’m taking this quiz,” (B) “Wait, what’s the difference between friendship and a crush?” (C) “Don’t think so, but we’re so close people joke that we’re dating,” and (D) “Nope. We’re just friends.” 16 Here, no corporate explanation of the algorithm is needed, as discovering how the quiz works becomes a fairly easy task. If the respondent answers A for the majority of questions, they are informed, “You might be gay!” If they choose mostly B answers, they receive the message, “You could be bisexual or somewhere on the LGBTQ+ spectrum”; C answers produce the message, “You may experience a small amount of attraction to people of the same gender,” and D answers conclude, “You’re probably straight.” Taking the quiz only a handful of times allows users to confidently predict how this algorithm operates, and in doing so, users are taking part in what’s come to be known as algorithmic auditing—or, more specifically, automated auditing—a method employed by many journalists, lawyers, and academics to deconstruct how an algorithm functions, gauging its “possible points of failure” in the process. 17 While this methodology can help users think more critically about the algorithms they encounter, it also has several limitations.
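Because the quiz’s logic is so legible, it can be restated in a few lines of code. The sketch below is an inference from the results quoted above, not WikiHow’s actual implementation: it assumes a simple majority rule and then “audits” itself by replaying synthetic answer patterns, which is automated auditing in miniature.

```python
from collections import Counter

# Result strings paraphrased from the quiz as quoted above; the
# majority-rule scoring is an inference, not WikiHow's source code.
RESULTS = {
    "A": "You might be gay!",
    "B": "You could be bisexual or somewhere on the LGBTQ+ spectrum",
    "C": "You may experience a small amount of attraction to people of the same gender",
    "D": "You're probably straight",
}

def score_quiz(answers: str) -> str:
    """Return the result keyed to the respondent's most frequent choice."""
    most_common_choice, _ = Counter(answers).most_common(1)[0]
    return RESULTS[most_common_choice]

# Automated auditing in miniature: replay synthetic answer patterns
# and infer the rule from how the output varies.
for pattern in ["A" * 16, "B" * 16, "A" * 9 + "D" * 7]:
    print(pattern[:4] + "...", "->", score_quiz(pattern))
```

A handful of such replays recovers the whole rule, which is precisely what the next paragraph argues cannot be done for TikTok.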
Algorithms have often been defined as simply a series of steps, a logical progression, or “a procedure or set of rules” (OED), but such simple definitions—like the above definitions of “coming out”—are quickly complicated. As Louise Amoore points out, defining an algorithm as a mere series of steps “seriously overlooks the extent to which algorithms modify themselves in and through their nonlinear iterative relations to input data.” 18 This is to say that our algorithmic folk theories are often extremely limited. WikiHow’s “Am I Gay?” quiz contains a clear, linear process that’s easy to comprehend; TikTok’s recommendation algorithm, however, is much more difficult to grasp. How can one grapple with the scale and dimensions of such a corporately guarded algorithm? According to Amoore, today’s deep learning algorithms rarely take input along a linear journey, and as a result, they make algorithmic auditing an impossible—or at least, an impractical—task. 19 Instead, these advanced algorithms transform input through a network of layers, abstractions, and recursions—feedback loops in which the algorithm’s outputs are fed back as inputs and its procedure executes again and again. This all means that if a user ends up on the “gay side” of TikTok, it is difficult to determine the steps they took to get there. The process, the algorithm, is not nearly so easy to understand. Algorithmic auditing remains useful, but its limitations here become evident. 20 Even algorithmic folk theories—already based on guesswork and rumors—become less reliable.
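A toy simulation can make this recursive quality visible. In the sketch below, every name and number is invented; the point is only that when output feeds back into input, small early differences compound until the path a feed took becomes difficult to reconstruct.

```python
import random

# Invented starting affinities for three topics; only the feedback
# structure matters here, not the numbers.
affinity = {"queer_content": 0.05, "cooking": 0.04, "sports": 0.04}

random.seed(7)
for step in range(50):
    topic = max(affinity, key=affinity.get)            # recommend the current leader
    engaged = random.random() < 0.5 + affinity[topic]  # noisy engagement
    if engaged:
        affinity[topic] += 0.02                        # output becomes the next input
print(affinity)  # the early leader has typically run away with the feed
```

After fifty rounds, the topic that led by a hair at the start dominates the feed; reading the final state alone, one cannot tell whether the user “chose” that feed or the loop did.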
Deep-learning algorithms have become so complex that even programmers behind these technologies may find it difficult to grasp what their algorithms are doing—the “steps” that they are taking in converting input to output. Beyond the layers of abstraction that result from recursive feedback loops, projects are likely to be split up across a number of teams or individuals. This means that, depending on the scale of the project, programmers might only have access to one small piece of a larger algorithmic puzzle. As Nick Seaver points out, today’s algorithms are “massive, networked” systems “with hundreds of hands reaching into them, tweaking and tuning, swapping out parts and experimenting with new arrangements.” 21
Tristan Ferne, Henry Cooke, and David Man have similarly highlighted this unknowable quality of deep learning algorithms. As they observe, “AI works quite differently to the human brain, in some ways it is an alien non-human intelligence.” 22 This is why such systems are so often regarded as “black boxes”: technologies with identifiable inputs (data related to user profiles and behaviors, for instance) and knowable outputs (content and recommendations), yet with unknown or unknowable processes and functions taking place between these two poles. Seaver refers to this lack of transparency as “algorithmic secrecy,” a result of several factors including the complexity of the code (discussed above) and “the legal boundary of the corporation.” 23 After all, it is in TikTok’s best interest to guard its recommendation algorithms, hiding them from competitors who are similarly interested in engaging as many users as possible for as long as possible on their own platforms. In the attention economy, an effective recommendation algorithm is prime currency, appeasing the corporation, its advertisers, and its users simultaneously while serving up new content continuously. These algorithms are alien not only in their way of thinking but also in their frequency of updating, refreshing, adapting.
Again, this unknowability, this secrecy, checks the user’s ability to understand their relationship to algorithms like TikTok’s For You recommendations through assemblage theory, as some of the actors (and the agency they might wield) are obscured by black box technologies. While we might rely on algorithmic folk theories to fill the gaps in our knowledge, this inability to know exactly what the algorithm’s role is in the assemblage tends to produce claims that diminish our own agency, claims common in both journalistic and academic writing. Consider, for instance, the way journalist Kyle Chayka frames algorithms as “dominating” their users in his recent publication Filterworld. As he observes, deep learning algorithms “manipulat[e] our perceptions and attention” while flattening human culture at large. 24 In We Are Data, John Cheney-Lippold makes a similar claim, writing, “We are effectively losing control in defining who we are online, or more specifically we are losing ownership over the meaning of the categories that constitute our identities.” 25 As Foucault illustrates, pressures to categorize the self have always existed, yet they have traditionally come from institutions that we can see and identify. Accordingly, concerns surrounding the influence of different actors—the control and manipulation they might wield—are amplified as the actors themselves become decentralized, even invisible, and spread across the Web.
While abstraction blurs our understanding of how much agency users retain, it is worth reiterating that users certainly do retain some amount of control in the assemblage. Users can still accept or reject what a recommendation algorithm serves them, for example. Similarly, users tend to make choices about whom they follow and what type of content they are interested in. As a personal example, my social media feeds are full of artists carving out linocuts and designers trying to maximize space in their small apartments. My partner’s feed is full of tennis players and travel tips. Neither of these feeds is solely the result of what an algorithm has determined for us. In other words, we still have choice, and in that, we remain actors or subjects in the assemblage. Returning to the article headlines above, none of the writers claiming that “TikTok made me gay” do so begrudgingly—as if, despite their own personal preferences, TikTok’s won out and the users were now required to align their sexual identities with what the algorithm determined. To the contrary, these writers’ claims suggest that they have found meaning (active and agential) in the videos that TikTok has recommended.
In reflecting on this collaborative mode of meaning-making, Rettberg observes, “[We] act in the way we think the algorithms expect.” 26 In other words, users develop literacies around the ways in which an algorithm functions—sometimes through algorithmic auditing, more often through algorithmic folk theories—in order to better collaborate with it. Furthermore, users retain some control in deciding whether what the algorithm offers them fits their sense of self. When a recommendation algorithm continually promotes a genre, a user, a video, or an idea that does not fit, users are likely, at some point, to cease their attempts at collaboration. As Rettberg states, “Often I fail and give up in disgust, frustrated that machine vision doesn’t see me the way I want to be seen.” 27 In an assemblage with TikTok’s recommendation algorithm, the user retains this right—to reset the app, to alter its settings, to take a break, to delete it, and so on.
The fan studies scholar Jessica Hautsch has highlighted a similar process in her writing about online personality quizzes, a return to simpler forms of algorithms. As with the WikiHow example above, this digital genre of multiple-choice offerings promises to reveal something about respondents’ identities when they engage with quizzes like “Everyone is one of these indoor plants, which one are you?” and “Customize your dream cottage and we’ll reveal which Jane Austen character you belong with.” 28 According to Hautsch, the actual results of these quizzes do not matter as much as the meaning users ascribe to them. “They are a tool through which we play with who we could be, potentially integrating that identity into how we understand ourselves. It is not that they ‘reveal’ our traits,” Hautsch writes, “but rather that we can use their results to shape how we conceptualize our personalities,” even our identities at large. 29
The quizzes, the algorithms, that Hautsch addresses are simple enough that users might audit them with ease and precision on their own, but her conclusions about active meaning-making map easily onto many TikTok users’ relationship to the recommendation algorithm. In this light, claims that “TikTok made me gay” indicate how users conceptualize the self, abstracting a network of meaning-making into a much simpler claim. It might be useful to parse out this process even further. For instance, one might regard the “me” being recommended content by an algorithm like TikTok’s For You as an abstraction as well. The target to which TikTok directs videos is not exactly who “I” am in my AFK (Away From Keyboard) life. Rather, as Cheney-Lippold argues, it is an amalgamation of data about “me” that is easily measured and categorized. 30 The algorithm is not predicting that a person is queer; rather, it is compiling a collection of videos, based on user activity and collected data, that it deems likely to appeal to that user. In this, the user comes alongside the algorithm and maps meaning onto the resulting recommendations, the collection of data: TikTok recommends queer content; the algorithm thinks I want queer content; the algorithm thinks I’m gay. Through abstraction, this logic—this rhetoric—is simplified. 31
Nevertheless, this process, this understanding, remains abstracted in the actual rhetoric of users who claim, “TikTok made me gay.” Instead of considering their own role in constructing an understanding of their sexual identities through or with or alongside TikTok, these writers are often more likely to regard the recommendation algorithm with a religious zeal. A writer for the site Mashable, for instance, regards the algorithm as a “divine digital oracle” capable of “dark magic.” 32 An essay on the site The Cut begins similarly before taking a turn. “By way of sorcery,” the writer begins, “TikTok learns your every interest, tendency, and pattern based on how you interact with its content, even if that’s just watching a video mostly through. What that means is TikTok knows you better than you know yourself.” 33 What interests me about this claim is how the author ignores her own actions—“watching” videos and “interacting” with content—to instead focus on the algorithm’s role. Yet, by the author’s own account, she is an active participant, engaged in the creation—or rather, the curation—of the videos that the recommendation algorithm offers her. Yes, the algorithm is shaping her, but in this assemblage, she is simultaneously shaping the algorithm.
Perhaps the author’s role in this assemblage is ignored because of a broader lack of critical engagement. The author is not thinking about her own actions but is instead letting them unfold in a subconscious state amid an endless scroll. Social media apps are constantly vying for our attention, and they often seem most effective when we lose ourselves in them. Accordingly, it is easy to embrace analogies like those above, which liken TikTok’s recommendations to a divine power, one to which we—as users—have willingly submitted, subordinated ourselves. In Philip Pullman’s His Dark Materials, the young Lyra gives voice to a similar dynamic whenever she makes use of her Alethiometer, a fictional divination technology featured throughout the book series. Describing how she reads and interprets this device, Lyra states, “I just make my mind go clear.” 34 Unsurprisingly, Lyra’s use of the Alethiometer and her trust in the technology often result in undesirable consequences, events that never would have occurred had Lyra not given up her agency so fully.
As Peter Nagy and Gina Neff have observed, framing algorithms as magical is often a strategy invoked by those within the tech industry, as such rhetoric works to “deflect attention and calls for accountability away from their own already immense power.” 35 Meghan O’Gieblyn makes a similar claim, proposing that by framing the algorithm as magical, divine, or mystical, users are encouraged to view these technologies as “completely neutral and objective.” 36 To round out these claims, Kristian Lum and Rumman Chowdhury have further concluded that the term algorithm has become a shorthand way of “deflect[ing] accountability for human decisions.” 37 In Data Feminism, Catherine D’Ignazio and Lauren Klein offer an alternative to these trajectories, what is—perhaps—an obvious solution to a problem so rooted in a lack of critical engagement. As they observe, examining and challenging power imbalances in the technologies that structure our daily lives is a necessary step in moving towards more equitable futures. 38 This requires us to be more aware of what is lost in our abstractions, a task more easily said than done.
Coming Out Abstracted
“[W]hen a human looks at a tree it translates the intricately complex mass of leaves and branches into this thing called ‘tree.’ To be a human was to continually dumb the world down into an understandable story that keeps things simple.” 39 This claim comes from Nora, the protagonist at the center of Matt Haig’s The Midnight Library. Throughout the novel, Nora offers insights like these, drawing from her philosophy background, while maneuvering through a series of connected lives, stepping in and out of multi-dimensional versions of herself as if they were avatars in a virtual reality game. According to Nora, “everything humans see is a simplification . . . Humans are fundamentally limited, generalising creatures, living on auto-pilot, who straighten out curved streets in their minds, which explains why they get lost all the time.” 40 Indeed, we do have a tendency to get lost in our abstractions.
Clearly, claims that compress the process of interacting with TikTok’s recommendation algorithms through an assemblage of meaning-making into something like “TikTok made me gay” engage in a process of abstraction. Like the algorithms themselves, such claims simplify the processes of meaning-making, coming out, and sexual identity. Reworked, the abstracted statement might more accurately run: “I can easily map meaning onto the recommended TikTok videos I’ve found on my feed, and this meaning supports my own growing understanding of my sexual orientation, which I believe fits into a category like ‘gay,’ which—in turn—has been shaped by society and (through society) TikTok itself.” While such a claim is not nearly as buzzy (and likely wouldn’t fit into a headline, let alone a short song uploaded to TikTok), it does highlight the recursive element of coming out—of understanding identity—more broadly through a cyclical loop. Our personal choices affect our personal feeds, which affect our understanding of self—and this loops on and on, over and over again.
Still, all this roundabout logic is worth teasing out, since in the midst of technological change and its attendant concerns, it is essential that we remember the agency we do have—the fact that we are still subjects, working alongside or against or through our algorithmic companions in an assemblage of power, control, and meaning. Remaining aware of and active in this assemblage becomes increasingly important in a culture where other actors are effectively framing “the algorithm” as a divine authority. The historian Yuval Noah Harari regards the success of this rhetorical messaging as marking a shift from humanism to dataism: “Whereas humanism commanded: ‘Listen to your feelings!’ Dataism now commands: ‘Listen to the algorithms! They know how you feel.’” 41 Harari is rightfully critical of this shift, and although he does not look to assemblages specifically, this theoretical framework is one means of amplifying more critical engagement.
In computer science, abstraction refers to the process of simplification, of removing unnecessary detail. What is lost in the simplification is regarded as lossy detail: inessential, unnecessary, and able to be discarded. In the age of algorithms, resisting a narrative that regards our own decision-making abilities as lossy, our own role in the assemblage as lossy—as things to be discarded—becomes a central concern, not only for understanding sexual identity but for all aspects of the self. While we may not know to what extent our own agential role shapes the TikTok assemblage, a surefire way of short-circuiting that agency is to lose sight of the self in the recursion.
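One last sketch can ground the metaphor. Here a rich record of a hypothetical session is reduced to the few fields a system can measure; the field names and values are invented, but the operation, keeping what is measurable and discarding the rest, is the lossy abstraction described above.

```python
# A hypothetical session record reduced to its measurable fields; all
# field names and values are invented for illustration.
session = {
    "videos_watched": 212,
    "queer_creators_followed": 3,
    "watch_reason": "a friend sent a link and I kept scrolling",
    "mood": "bored, a little lonely",
}

MEASURABLE = {"videos_watched", "queer_creators_followed"}
data_double = {k: v for k, v in session.items() if k in MEASURABLE}
discarded = sorted(set(session) - MEASURABLE)  # deemed inessential, "lossy"
print(data_double)  # {'videos_watched': 212, 'queer_creators_followed': 3}
print(discarded)    # ['mood', 'watch_reason']
```

The data double is legible to a model; the discarded remainder is where the self, and its agency, actually lives.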
Footnotes
Chyenne Tatum, “Artist Spotlight: Sarah Barrios Learns to Bare Her Soul Through Music,” EnVi Media, May 31, 2021, https://www.envimedia.co/artist-spotlight-sarah-barrios-learns-to-bare-her-soul-through-music.
Sarah Barrios [@sarahbarrios], “And that’s on pandemic,” TikTok, April 24, 2021, https://www.tiktok.com/@sarahbarrios/video/6954796275535465733?lang=en.
This claim that equates critical inquiry to empowerment is an extension of Catherine D’Ignazio and Lauren Klein’s second principle from Data Feminism: to challenge power. I return to this principle at the end of this work. Catherine D’Ignazio and Lauren Klein, Data Feminism (MIT Press, 2020), https://www.data-feminism.mitpress.mit.edu/pub/ei7cogfn.
Alex Hern, “How TikTok’s Algorithm Made It a Success: ‘It Pushes the Boundaries,’” The Guardian, October 24, 2022, https://www.theguardian.com/technology/2022/oct/23/tiktok-rise-algorithm-popularity.
Sensitive-Cow-1268, “Did TikTok help anyone else realize they’re gay?,” Reddit, 2020, https://www.reddit.com/r/latebloomerlesbians/comments/jaqw3u/did_tiktok_help_anyone_else_realize_theyre_gay; David Oliver, “TikTok Thinks I’m Gay. How Could It Know Before I Knew?,” USA Today, December 14, 2021, https://www.usatoday.com/story/life/health-wellness/2021/12/14/tiktok-social-media-apps-guess-if-people-lgbtq-how/6466755001/?gnt-cfr=1; Emma Turetsky, “TikTok Made Me Gay,” The Cut, August 27, 2021, https://www.thecut.com/2021/08/tiktok-helped-me-come-out-gay-lesbian.html; Jess Joho, “TikTok’s Algorithms Knew I Was Bi Before I Did. I’m Not the Only One,” Mashable, September 18, 2022, https://www.mashable.com/article/bisexuality-queer-tiktok; Julia Naftulin, “TikTok’s Algorithm Is Making Women Who’ve Only Dated Men Realize They’re Queer,” Insider, March 22, 2022, https://www.insider.com/tiktok-lesbians-queer-women-coming-out-algorithm-why-2022-3.
Jen Bacon, “Getting the Story Straight: Coming Out Narratives and The Possibility of a Cultural Rhetoric,” World Englishes 17, no. 2 (1998): 205.
For a recent example, consider the response garnered by singer Shawn Mendes’s public coming out during one of his concerts in 2024. His speech performance was filmed and uploaded to social media by countless fans, and across many of these posts, top comments relay fans’ claims of “already knowing” Mendes was bisexual, gay, etc. Ironically, Mendes, himself, had not used any of these labels in his statement. For more on this trend as it relates to coming out videos on YouTube, see Jon Heggestad, “Paranoid Viewers & the ‘Already Knowing’ of YouTube’s Coming Out Genre,” Queer Studies in Media & Popular Culture 6, no. 2 (2021): 105-122.
Nicholas A. Guittar, Coming Out: The New Dynamics (First Forum Press, 2014), 7.
The Try Guys, “I’m Gay—Eugene Lee Yang,” YouTube, June 15, 2019. https://www.youtube.com/watch?v=qpipLfMiaYU.
Michel Foucault, The History of Sexuality: Volume 1: An Introduction (Vintage Books, 1990), 101.
Even the original French grants agential power to “l'homosexualité,” not to “l'homosexuel.” Michel Foucault, Histoire de la Sexualité: 1: La Volonté de Savoir (Gallimard, 1976), 134.
Jill Walker Rettberg, Machine Vision: How Algorithms are Changing the Way We See the World (Polity Press, 2023), 27-28.
The national LGBTQ media advocacy organization GLAAD has categorized the “groomer” trope as a “violation of the hate speech policies of all social media platforms.” As they observe, this rhetoric, which scapegoats queer folks as “pedophiles, sexual predators, and threats to children” is both baseless and dangerous. GLAAD pinpoints social media platforms as the current arena in which this fear-mongering speech most often occurs at present, but the trope certainly predates hateful online rhetoric. GLAAD, “Online Anti-LGBTQ Hate Terms Defined: ‘Groomer,’” GLAAD, 2024, https://www.glaad.org/groomer-definition-meaning-anti-lgbt-online-hate.
Nadia Karizat, Dan Delmonaco, Motahhare Eslami, and Nazanin Andalibi, “Algorithmic Folk Theories and Identity: How TikTok Users Co-Produce Knowledge of Identity and Engage in Algorithmic Resistance,” Proceedings of the ACM on Human-Computer Interaction 5 (2021): 2.
TikTok, “How TikTok Recommends Videos #ForYou,” TikTok, June 18, 2020, https://www.newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you.
One question from this quiz stands out in particular. “If you scroll through your feed or FYP,” the question begins, “do you see content from queer creators?” Here, this online quiz designed to identify one’s sexuality relies on other algorithms, creating its own feedback loop of sorts. Deb Schneider, “Am I Gay?,” WikiHow, July 21, 2023, https://www.wikihow.com/Relationships/Am-I-Gay-Quiz.
Meredith Broussard, More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech (MIT Press, 2023), 161.
Louise Amoore, Cloud Ethics: Algorithms and the Attributes of Ourselves and Others (Duke University Press, 2020), 11.
Amoore, Cloud Ethics, 18.
In 2022, The Wall Street Journal conducted an algorithmic audit of TikTok’s FYP. While they were able to discover certain patterns—for instance, users were likely to be shown videos about suicide and depression as a way to increase engagement—how these patterns formed, whose feeds they appeared on, and for what purpose all remain somewhat unclear. The Wall Street Journal, “How TikTok’s Algorithm Figures You Out,” YouTube, July 21, 2022, https://www.youtube.com/watch?v=nfczi2cI6Cs.
Nick Seaver, “Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems,” Big Data & Society 4, no. 2 (2017): 10.
Tristan Ferne, Henry Cooke, and David Man, “Explaining Artificial Intelligence. Part 3: What Does AI Look Like?,” BBC, August 16, 2021, https://www.bbc.co.uk/rd/blog/2021-08-explaining-artificial-intelligence-part-3-what-does-ai-look-like.
Seaver, “Algorithms as Culture,” 5.
Kyle Chayka, Filterworld: How Algorithms Flattened Culture (Bonnier Books UK, 2024), 4.
John Cheney-Lippold, We Are Data: Algorithms and the Making of Our Digital Selves (New York University Press, 2017), 178.
Rettberg, Machine Vision, 127.
Rettberg, Machine Vision, 127.
Ruby Selwood-Thomas, “Everyone Is One of These Indoor Plants—Which One Are You?,” BuzzFeed, July 29, 2019, https://www.buzzfeed.com/rubyatthemovies/what-indoor-plant-are-you-aa4sp6sgdj; Lordannascott, “Customize Your Dream Cottage and We’ll Reveal Which Jane Austen Character You Belong With,” BuzzFeed, February 10, 2021, https://www.buzzfeed.com/lordannascott/build-a-house-and-well-tell-you-which-jane-austen-7h15ct8bgn.
Jessica Hautsch, “Casting the Self into Fictional Worlds Through Online Personality Quizzes,” Response: The Journal of Popular and American Culture 9, no. 2 (2024), https://www.responsejournal.net/issue/2024-11/feature/casting-self.
Cheney-Lippold, We Are Data.
See also Ellen Simpson and Bryan Semaan, “For You, or For ‘You’?: Everyday LGBTQ+ Encounters with TikTok,” Proceedings of the ACM on Human-Computer Interaction 4, no. CSCW3 (2021): 1–34, https://dl.acm.org/doi/10.1145/3432951.
Joho, “TikTok’s Algorithms Knew I Was Bi Before I Did.”
Turetsky, “TikTok Made Me Gay.”
Philip Pullman, The Golden Compass (Alfred A. Knopf, 1995), 174.
Peter Nagy and Gina Neff, “Conjuring Algorithms: Understanding the Tech Industry as Stage Magicians,” New Media & Society 26, no. 9 (2024): 4939.
Meghan O’Gieblyn, God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning (Vintage Books, 2022), 208.
Kristian Lum and Rumman Chowdhury, “What Is an ‘Algorithm’? It Depends Whom You Ask,” MIT Technology Review, February 26, 2021, https://www.technologyreview.com/2021/02/26/1020007/what-is-an-algorithm.
D’Ignazio and Klein, Data Feminism.
Matt Haig, The Midnight Library (Viking Press, 2020), 148-149.
Haig, The Midnight Library, 149 (italics in original).
Yuval Noah Harari, Homo Deus: A History of Tomorrow (Random House, 2016), 345.
References
Amoore, Louise. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press, 2020.
Bacon, Jen. “Getting the Story Straight: Coming Out Narratives and The Possibility of a Cultural Rhetoric.” World Englishes 17, no. 2 (1998): 249-258. https://doi.org/10.1111/1467-971X.00098.
Barrios, Sarah [@sarahbarrios]. “And that’s on pandemic.” TikTok, April 24, 2021. https://www.tiktok.com/@sarahbarrios/video/6954796275535465733?lang=en.
Broussard, Meredith. More Than a Glitch: Confronting Race, Gender, and Ability Bias in Tech. MIT Press, 2023.
Chayka, Kyle. Filterworld: How Algorithms Flattened Culture. Bonnier Books UK, 2024.
Cheney-Lippold, John. We Are Data: Algorithms and the Making of Our Digital Selves. New York University Press, 2017.
D’Ignazio, Catherine, and Lauren Klein. Data Feminism. MIT Press, 2020. https://www.data-feminism.mitpress.mit.edu/pub/ei7cogfn.
Ferne, Tristan, Henry Cooke, and David Man. “Explaining Artificial Intelligence. Part 3: What Does AI Look Like?” BBC, August 16, 2021. https://www.bbc.co.uk/rd/blog/2021-08-explaining-artificial-intelligence-part-3-what-does-ai-look-like.
Foucault, Michel. The History of Sexuality: Volume 1: An Introduction. Vintage Books, 1990.
GLAAD. “Online Anti-LGBTQ Hate Terms Defined: ‘Groomer.’” GLAAD, 2024. https://www.glaad.org/groomer-definition-meaning-anti-lgbt-online-hate.
Guittar, Nicholas A. Coming Out: The New Dynamics. First Forum Press, 2014.
Haig, Matt. The Midnight Library. Viking Press, 2020.
Harari, Yuval Noah. Homo Deus: A History of Tomorrow. Random House, 2016.
Hautsch, Jessica. “Casting the Self into Fictional Worlds Through Online Personality Quizzes.” Response: The Journal of Popular and American Culture 9, no. 2 (2024). https://www.responsejournal.net/issue/2024-11/feature/casting-self.
Heggestad, Jon. “Paranoid Viewers & the ‘Already Knowing’ of YouTube’s Coming Out Genre.” Queer Studies in Media & Popular Culture 6, no. 2 (2021): 105-122. https://doi.org/10.1386/qsmpc_00048_1.
Hern, Alex. “How TikTok’s Algorithm Made It a Success: ‘It Pushes the Boundaries.’” The Guardian, October 24, 2022. https://www.theguardian.com/technology/2022/oct/23/tiktok-rise-algorithm-popularity.
Joho, Jess. “TikTok’s Algorithms Knew I Was Bi Before I Did. I’m Not the Only One.” Mashable, September 18, 2022. https://www.mashable.com/article/bisexuality-queer-tiktok.
Karizat, Nadia, Dan Delmonaco, Motahhare Eslami, and Nazanin Andalibi. “Algorithmic Folk Theories and Identity: How TikTok Users Co-Produce Knowledge of Identity and Engage in Algorithmic Resistance.” Proceedings of the ACM on Human-Computer Interaction 5 (2021): 1-44. https://doi.org/10.1145/3476046.
Lordannascott. “Customize Your Dream Cottage and We’ll Reveal Which Jane Austen Character You Belong With.” BuzzFeed, February 10, 2021. https://www.buzzfeed.com/lordannascott/build-a-house-and-well-tell-you-which-jane-austen-7h15ct8bgn.
Lum, Kristian, and Rumman Chowdhury. “What Is an ‘Algorithm’? It Depends Whom You Ask.” MIT Technology Review, February 26, 2021. https://www.technologyreview.com/2021/02/26/1020007/what-is-an-algorithm.
Naftulin, Julia. “TikTok’s Algorithm Is Making Women Who’ve Only Dated Men Realize They’re Queer.” Insider, March 22, 2022. https://www.insider.com/tiktok-lesbians-queer-women-coming-out-algorithm-why-2022-3.
Nagy, Peter, and Gina Neff. “Conjuring Algorithms: Understanding the Tech Industry as Stage Magicians.” New Media & Society 26, no. 9 (2024): 4938-4954. https://doi.org/10.1177/14614448241251789.
O’Gieblyn, Meghan. God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning. Vintage Books, 2022.
Oliver, David. “TikTok Thinks I’m Gay. How Could It Know Before I Knew?” USA Today, December 14, 2021. https://www.usatoday.com/story/life/health-wellness/2021/12/14/tiktok-social-media-apps-guess-if-people-lgbtq-how/6466755001/?gnt-cfr=1.
Pullman, Philip. The Golden Compass. Alfred A. Knopf, 1995.
Rettberg, Jill Walker. Machine Vision: How Algorithms are Changing the Way We See the World. Polity Press, 2023.
Schneider, Deb. “Am I Gay?” WikiHow, July 21, 2023. https://www.wikihow.com/Relationships/Am-I-Gay-Quiz.
Seaver, Nick. “Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems.” Big Data & Society 4, no. 2 (2017): 1–12. https://doi.org/10.1177/2053951717738104.
Selwood-Thomas, Ruby. “Everyone Is One of These Indoor Plants—Which One Are You?” BuzzFeed, July 29, 2019. https://www.buzzfeed.com/rubyatthemovies/what-indoor-plant-are-you-aa4sp6sgdj.
Sensitive-Cow-1268. “Did TikTok help anyone else realize they’re gay?” Reddit, 2020. https://www.reddit.com/r/latebloomerlesbians/comments/jaqw3u/did_tiktok_help_anyone_else_realize_theyre_gay.
Simpson, Ellen, and Bryan Semaan. “For You, or For ‘You’?: Everyday LGBTQ+ Encounters with TikTok.” Proceedings of the ACM on Human-Computer Interaction 4, no. CSCW3 (2021): 1–34. https://doi.org/10.1145/3432951.
Tatum, Chyenne. “Artist Spotlight: Sarah Barrios Learns to Bare Her Soul Through Music.” EnVi Media, May 31, 2021. https://www.envimedia.co/artist-spotlight-sarah-barrios-learns-to-bare-her-soul-through-music.
TikTok. “How TikTok Recommends Videos #ForYou.” TikTok, June 18, 2020. https://www.newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you.
The Try Guys. “I’m Gay—Eugene Lee Yang.” YouTube, June 15, 2019. https://www.youtube.com/watch?v=qpipLfMiaYU.
Turetsky, Emma. “TikTok Made Me Gay.” The Cut, August 27, 2021. https://www.thecut.com/2021/08/tiktok-helped-me-come-out-gay-lesbian.html.
The Wall Street Journal. “How TikTok’s Algorithm Figures You Out.” YouTube, July 21, 2022. https://www.youtube.com/watch?v=nfczi2cI6Cs.