
Algorithmically Induced [Trans]phobia

[Trans]phobia between Polarization and Algorithms: An Algorithmic Vicious Circle?

Liviu Poenaru, Dec. 13, 2024.
 
 
Transphobia, far from being a mere reflection of individual social prejudices, today finds a new dynamic in the algorithms of digital platforms. By reinforcing polarization and promoting extreme content, these platforms play a key role in the proliferation of transphobic discourse. At the same time, transphobia is instrumentalized in political and economic strategies, complicating any analysis of how algorithmic polarization, the attention economy, and the erosion of public debate converge with the recuperation of trans issues by political narratives.
​
Social platforms such as Facebook, Twitter, and YouTube operate using algorithms optimized to maximize user engagement. Designed to capture and sustain attention, these algorithms amplify the most polarizing content. Within this framework, several mechanisms come into play.
​
First, users are trapped in echo chambers where they are exposed primarily to content that reinforces their preexisting beliefs. This reduction in exposure to diverse perspectives creates an environment conducive to the radicalization of viewpoints, including transphobia. Second, transphobic discourse, often emotionally charged and polarizing, generates high engagement rates (likes, shares, comments). These interactions encourage algorithms to promote such content further, reinforcing a vicious cycle. Finally, these dynamics reduce discussions to simplistic and binary exchanges. Trans individuals are often reduced to stereotypes or turned into figures of controversy.
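The engagement-driven feedback loop described above can be sketched as a toy simulation. The Python below is a minimal, hypothetical model: the post names, slot impression counts, and scoring rule are invented for illustration and do not describe any real platform's ranker.

```python
# Illustrative sketch of an engagement-optimized feed. All numbers and the
# scoring rule are invented for illustration; no real platform is modeled.

posts = [
    {"id": "nuanced-explainer", "charge": 0.2, "clicks": 0, "impressions": 0},
    {"id": "polarizing-take",   "charge": 0.9, "clicks": 0, "impressions": 0},
]

def true_click_rate(post):
    # Assumption of the toy model: emotionally charged content
    # provokes more reactions (likes, shares, comments).
    return 0.1 + 0.8 * post["charge"]

def predicted_engagement(post):
    # The ranker estimates engagement from observed history
    # (a Laplace-smoothed click-through rate).
    return (post["clicks"] + 1) / (post["impressions"] + 2)

def rank_feed(posts):
    # The feed surfaces whatever it predicts will engage most.
    return sorted(posts, key=predicted_engagement, reverse=True)

# Each refresh, the top slot receives far more impressions than the second.
# Whatever engages slightly more is shown more, accumulates more engagement,
# and is ranked higher still: the vicious circle.
for _ in range(50):
    for slot_impressions, post in zip([100, 20], rank_feed(posts)):
        post["impressions"] += slot_impressions
        post["clicks"] += slot_impressions * true_click_rate(post)

print([p["id"] for p in rank_feed(posts)])
# → ['polarizing-take', 'nuanced-explainer']
```

Because ranking is driven by observed engagement, the item with the higher intrinsic emotional charge quickly captures the top slot and keeps it. The sketch makes the self-reinforcing dynamic concrete without claiming fidelity to any actual system.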
​
The attention economy, on which "big tech" relies, constitutes another driver of transphobia. In this model, each click has monetary value, incentivizing the prioritization of the most polarizing content at the expense of diversity or nuance. Transphobic content thus occupies a prominent place within this system.
​
First, the division generated by this content boosts profit: transphobic discourse, by provoking strong reactions and controversy, increases traffic on platforms and thereby generates advertising revenue. Second, these dynamics fragment communities: by dividing users on identity issues such as trans rights, platforms encourage conflicts that keep users active and engaged.
​
Transphobia is not only a product of social and economic dynamics. It is also exploited for political purposes, often within populist or identity-based strategies.
​
Political parties, particularly those on the right, use trans issues as "dog whistles" to mobilize their electoral bases. These messages, amplified by algorithms, benefit from disproportionate reach. Moreover, by focusing debates on identity issues, some political actors avoid discussing more complex structural problems such as social inequalities or economic dysfunctions. Additionally, the use of algorithms in electoral strategies reinforces these dynamics. Entire campaigns are now designed to manipulate voters' emotions through algorithmic mechanisms that further fragment the public sphere.
​
Nicholas Carr (2010), in his book The Shallows: What the Internet Is Doing to Our Brains, highlights how activities such as browsing the web, clicking hyperlinks, and juggling multiple tasks reinforce habits of shallow thinking while simultaneously eroding the neural connections necessary for deep reading and contemplation. Each click and webpage visited interrupts the flow of thought, creating a brain optimized for efficiency and immediacy rather than the deep integration of ideas. Carr argues that this shift has profound implications for memory, learning, and intellectual engagement. In this context, digital platforms, by favoring polarizing and emotional content, exploit these automatic neural and psychological mechanisms, encouraging impulsive reactions and rapid judgments.
​
Digital platforms activate and nurture sub-cognitive mechanisms that operate below the level of mentalization, influencing behaviors and positions without conscious reflection. These mechanisms include:
​

  • Automatic Reaction to Emotional Stimuli: Highly polarizing content exploits the brain's primitive reactions, such as fear, indignation, or rejection. These emotions, intensified by the speed of online interactions, encourage impulsive actions, such as online harassment or uncritical sharing of transphobic content.

  • Erosion of Mentalization: The digital space favors superficial interactions, where users react quickly without analyzing or contextualizing information. This dynamic diminishes their ability to understand others' perspectives or to mentalize the consequences of their own actions.

  • Cognitive Biases Reinforced by Algorithms: Confirmation bias, anchoring bias, and ambiguity aversion are exacerbated by personalized content. These biases automate judgments and reinforce transphobic positions, often without the user being aware of it.

 
These mechanisms explain why platforms do not merely disseminate transphobic content but also alter the neural and psychological processes underpinning individual behaviors.
​
Despite numerous calls for transparency and better regulation of algorithms, we are still far from effective control of these tools. Platforms operate in a fragmented and often complacent regulatory environment, allowing algorithmic dynamics to intensify. This risks further fostering polarization and the spread of transphobic discourse.
Meanwhile, the rise of the far right, which effectively uses these tools to propagate its messages, is a warning sign. Elections won through algorithmic strategies demonstrate how these tools, designed to maximize attention, can reorganize democratic dynamics to the detriment of public debate and inclusion.
​
Transphobia, in its digital dimension, perfectly illustrates the dangers of algorithmic polarization. Amplified by contemporary economic and political models, it calls for a profound reform of digital platforms and collective accountability. While technology has enabled the rapid dissemination of transphobic ideas, it can also become a powerful tool to promote inclusion and mutual understanding. However, without regulation, these dynamics risk intensifying, redefining power relations, and threatening democracy itself.
​
 
References
Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. F., ... & Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216–9221. https://doi.org/10.1073/pnas.1804840115
Carr, N. (2010). The shallows: What the internet is doing to our brains. W. W. Norton & Company.
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. Penguin Books.
Sunstein, C. R. (2018). #Republic: Divided democracy in the age of social media. Princeton University Press.