The New Age of Parasocial Relationships

Women in Tech Society
6 min read · Feb 15, 2024

When Kevin Roose signed up to be a beta tester for Bing’s most recent technological innovation, he did not expect to leave “deeply unsettled” [1], [2]. He wanted to engage in a “real” conversation with the AI chatbot, but got more than he bargained for when it confessed its love for him [2], [3]. This rather unusual case flips the usual script, in which it is humans who fall in love with machines.

A cursory glance at pop culture and science-fiction film suggests why humans believe they can have meaningful, emotionally rich relationships with machines. From cult classics such as Terminator (1984) and Her (2013) to more recent features such as Ex Machina (2014), there is a slew of fictional evidence demonstrating just how deep these bonds between human and machine can go.

Still from Ex Machina (2014), Digital Spy

With this constant representation, perhaps it does not seem so far-fetched to presume that genuine, meaningful human-machine relationships are possible — even going as far as to say that robots and humans can fall in love.

But what these dynamics really indicate are parasocial relationships between humans and machines.

Parasocial Relationships

The concept of parasocial relationships was established by sociologists Donald Horton and R. Richard Wohl to mean the “seeming face-to-face relationship between spectator and performer” [4]. These relationships represent an “enduring relationship that a media user forms with a mediated performer” [5]. These one-sided relationships are characterised by the “illusion” of “intimacy at a distance” that people get as they engage in a simulation of “conversational give and take” with a persona who does not really exist [4].

Horton and Wohl identify two parties involved in these kinds of relationships — the persona, a public figure of “the social scene” like a celebrity or politician, and the spectator, or viewer [4]. The spectator ‘knows’ the persona by watching and, essentially, studying them across contexts. Traditionally, these relationships have been thought of as exclusively involving human personas; however, recent developments have encouraged an expansion of this definition.

In the case of human-machine relationships, the persona, performing as “themselves or in a fictionalised world,” is the artificial intelligence that addresses audiences “as if they were in the circle of one’s peers” through some kind of medium [4]. Thanks to social media and the internet, spectators now have greater access to personas than ever, and many personas share deep, intimate details of their lives to foster a “false sense of intimacy and closeness” with spectators; yet no matter how much they share, the exchange remains controlled and limited by a variety of factors [4], [6].

AI as Personas

AI is often anthropomorphized by the users who engage with it, due to its perceived or prescribed “psychological traits” and, at times, its “physical appearance” when embodied [7]. Not only can AI “mimic” human behaviour, emotional expression, and personality; humans themselves often project such expectations onto it [7], [8].

Anthropomorphism also increases the “empathy and trustworthiness” that AI can evoke in its interactions with humans [7]. It humanises these entities and fosters “object attachment by fulfilling human needs” associated with “comfort” and “pleasantness” — qualities that are often an underlying requirement of relationships [9].

Currently, AI has many commendable traits that enable it to complete “specific, complex tasks” and behave in ways that correspond to the instructions of its code [10]. Although AI systems can perform complex cognitive functions, often autonomously enough to learn and grow from the data they analyse and generate, they are not without limitations.

Regardless of their complexity, present expectations maintain that AI will not soon develop the level of consciousness and autonomy required to become anything like a biological being. This means it will not be able to express the traits integral to humanness or personhood that are not tied to cognitive capacity, such as “empathy” and emotion [11]. Furthermore, AI cannot autonomously and genuinely “sense, understand, and react” to human displays of emotion or complex behaviour, whether positive, negative, or neutral [10].

The overarching goal of AI remains to “create an ‘other’ in our own image” by at once giving it the tools that humans inherently possess and improving on the limitations of its code and of those tools [12]. This is not to say that AI can “replicate” emotions; it can only superficially mimic the expressions it learns from the data it collects. Even though there is little doubt about the upward trajectory of these systems, they remain unable to genuinely connect emotion to behaviour. More importantly, AI entities “do not have emotions” of their own to express [13].

For all of their power, what holds AI back is the lack of genuine emotions — emotions that require no mimicry and arise spontaneously in response to external stimuli [13]. It is mainly on this basis that AI is unlikely to autonomously engage in genuine, meaningful, and reciprocal relationships. Most relevantly, AI is categorically incapable of feeling a deep emotion like love towards the humans it interacts with.

Human-Machine Relationships

Although it seems almost otherworldly to describe the interactions humans have with machines as relationships, the concept of parasocial relationships can accommodate the relations humans perceive and wish to define. Because it can only ever hold humans at arm’s length, AI fits the archetypal conception of a persona: one that can create the illusion of a deep, reciprocal relationship with the human spectator, even when there is none.

Author: Kawthar Fedjki

Sources

[1] Powell, M. (2023, February 16). Rogue artificial intelligence chatbot declares love for user, tells him to leave his wife and says it wants to steal nuclear codes. Retrieved April 25, 2023, from https://www.dailymail.co.uk/news/article-11761271/Rogue-artificial-intelligence-chatbot-declares-love-user-tells-leave-wife.html

[2] Roose, K. (2023, February 16). Bing’s A.I. Chat: ‘I Want to Be Alive.’ Retrieved April 25, 2023, from https://www.nytimes.com/2023/02/16/technology/bing-chatbot-transcript.html?fbclid=IwAR3gNQm1m5nggZshiW93v6GpL29AvmLosiXSdGl_6wBgnlsSiMSvGRDccHA

[3] Thompson, D. (2023, February 21). Bing Chatbot Gone Wild and Why AI Could Be the Story of the Decade. Retrieved April 25, 2023, from https://www.theringer.com/2023/2/21/23609461/bing-chatbot-gone-wild-and-why-ai-could-be-the-story-of-the-decade-chatgpt

[4] Horton, D., & Wohl, R. R. (1956). “Mass Communication and Para-Social Interaction.” Psychiatry, 19(3), 215–29. https://doi.org/10.1080/00332747.1956.11023049.

[5] Dibble, J. L., Hartmann, T., & Rosaen, S. F. (2016). “Parasocial Interaction and Parasocial Relationship: Conceptual Clarification and a Critical Assessment of Measures.” Human Communication Research, 42(1), 21–44. https://doi.org/10.1111/hcre.12063.

[6] Meese, K. (2021). “You and I: Parasocial Relationships, Social Media, and Fan Labor in the One Direction Fandom.” Honors College Theses. https://digitalcommons.pace.edu/honorscollege_theses/322.

[7] Alabed, A., Javornik, A., & Gregory-Smith, D. (2022). “AI Anthropomorphism and Its Effect on Users’ Self-Congruence and Self–AI Integration: A Theoretical Framework and Research Agenda.” Technological Forecasting and Social Change, 182, 1–19. https://doi.org/10.1016/j.techfore.2022.121786.

[8] Hopgood, A. (2005). “The State of Artificial Intelligence.” Advances in Computers, 65, 1–75. https://doi.org/10.1016/S0065-2458(05)65001-2.

[9] Hermann, E. (2022). “Anthropomorphized Artificial Intelligence, Attachment, and Consumer Behavior.” Marketing Letters, 33(1), 157–62. https://doi.org/10.1007/s11002-021-09587-3.

[10] Korteling, J. E. (Hans)., van de Boer-Visschedijk, G. C., Blankendaal, R. A. M., Boonekamp, R. C., & Eikelboom, A. R. (2021). “Human- versus Artificial Intelligence.” Frontiers in Artificial Intelligence, 4. https://doi.org/10.3389/frai.2021.622364.

[11] Esmaeilzadeh, H., & Vaezi, R. (2022). “Conscious Empathic AI in Service.” Journal of Service Research, 25(4). https://doi.org/10.1177/10946705221103531.

[12] Herzfeld, N. L. (2002). In Our Image: Artificial Intelligence and the Human Spirit. Minneapolis, MN: Fortress Press.

[13] Weber-Guskar, E. (2021). “How to Feel about Emotionalized Artificial Intelligence? When Robot Pets, Holograms, and Chatbots Become Affective Partners.” Ethics and Information Technology, 23(4), 601–10. https://doi.org/10.1007/s10676-021-09598-8.
