Mimi Yates – 16/04/2022
When we place humans and machines together in science fiction, things seem always to end badly. From the creature's murderous turn on Frankenstein, its creator, to the bloody human-replicant conflicts of Blade Runner, popular texts question how, and whether, we could live together. Alex Garland's Ex Machina (2014) tells the story of Caleb, a young and bright computer programmer, who is selected to visit Nathan, the narcissistic CEO of the company 'Bluebook', at his remote property. Caleb learns that Nathan is building an AI robot named Ava, whom he has designed to take the Turing test, a measure of whether a machine can pass for human. Confining Ava to an obscured glass-walled room and observing her through it, the film presents the conflicts within posthumanist discourse, and the 'uncanny valley' between human and machine, as clearly as the transparent material that separates them. In this essay, I will focus on what Samani has coined 'Lovotics': the research area of human-robot relationships, specifically relationships of sex and love. Applying this to Ex Machina, the first part of this paper asks two questions that the film invokes: could robots have feelings for us? And could we have feelings for a robot? The second part will analyse the representation of female robots and human-robot relationships in the film, before discussing how this is echoed in real-life scenarios and the dangers this poses. I will ultimately argue that posthuman relationships will transform our notions of intimacy and that, in fact, they already have. This potential for progression presents us with serious challenges that need addressing.
On Love: Could a robot have feelings for us?
It seems apparent that Ava in Ex Machina experiences authentic emotion: she professes her love for Caleb, even uttering 'I love you'. Her facial expressions appear complex, thoughtful and inquisitive. When Caleb asks what the first thing she would do on leaving the facility would be, she emphasises her longing and curiosity to 'people watch' at a busy traffic intersection. As the narrative develops, we learn that Nathan's motive has not merely been to prove that Ava is capable of thought, but that she has conscious intent, and thus is capable of projecting (and invoking) emotion in human reciprocation. In questioning whether a robot could have feelings for humans, we must first ask whether a robot could have feelings at all. And in asking this, we inevitably fall (hard) at the consciousness hurdle, since most scholars take it as common sense that an entity must be conscious in order to 'feel'. David Chalmers writes of 'the hard problem of consciousness', which accounts for the difficulty we face in understanding why our conscious states have a qualitative, subjective, phenomenal character. Nathan describes this concept cleverly through Pollock's painting 'No. 5', explaining its creation as "not deliberate, not random…some place in between" (Ex Machina 2014). The "challenge", he says, is to find that small gap in between: conscious intention. The purpose of Nathan's lesson is to prove to Caleb that Ava's behaviours – like her seduction – are not simply the result of automatic programming, but her own autonomous conscious thought.
It is clear that Nathan's aim for his creation is not simply for her to pass the Turing test, but to prove that she has consciousness. Following on from this, N. Katherine Hayles argues for the mind's need to be embodied in How We Became Posthuman (1999). She addresses debates within posthumanism over whether we can treat the body as superfluous, or whether our very existence is an "inextricable intertwining of body and mind" (Hayles 1999: 5). She further emphasises that embodied awareness arising from affective experiences is a core element of the conceptualisation and identification of the self, where affect is an experience of intensity. Taking up this concept, Fahn states that "without affect, feeling or emotion cannot exist", since affect is vital to the relationship between the body and the external environment in forming subjective experiences (Fahn 2019: 5). She argues that artificial intelligence with physical embodiment has the potential to envision a consciousness arising not strictly from cognitive processing but from direct experiences with the material world; more strongly, that if consciousness were to arise, it could only be achieved through affective embodiment (Fahn 2019: 5). Research on affect and embodiment has explored the concept of the body itself as a programmable machine, expanding this to include simulated emotions translatable into code (ibid.). Perhaps, then, we are able to build artificial intelligences capable of learning and processing emotion through affective experience.
Continuous progress in artificial intelligence research – Moore's Law holds that the computing power of technology doubles roughly every two years – suggests that these fictional considerations of the posthuman may point towards a future where emotional capacity no longer separates human and machine. In 2011, Samani's thesis 'Lovotics' detailed the design and development of a robot capable of experiencing complex, human-like emotional states, governed by artificial hormones embedded in its system (Samani 2011). The Lovotics robot includes three modules: the Artificial Endocrine System, which resembles the human endocrine system in that artificial hormone levels change dynamically according to interaction and social awareness; the Probabilistic Love Assembly, which calculates "probabilistic parameters of love" between a human and a robot, such as proximity, repeated exposure, similarity and desirability; and the Affective State Transition module, which is employed to "manage alteration of the short-term affective states" of the robot (Samani 2011: 10-11). These three modules connect to generate realistic emotion-driven behaviours (Cheok 2017: 3). Samani explains how this 'intimacy software' includes mathematical codes for such parameters, creating formulas to represent each factor as well as an overall 'intimacy' formula that combines them all (Samani 2011).
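To make the idea of an 'intimacy' formula concrete, the sketch below illustrates how such parameters might be combined. It does not reproduce Samani's actual mathematics: the parameters follow the list above (proximity, repeated exposure, similarity, desirability), but the weighted-sum form and the weight values are my own hypothetical assumptions for illustration.

```python
# Hypothetical sketch of a Lovotics-style 'intimacy' calculation.
# This is NOT Samani's actual formula; the weighted-sum form and the
# weights below are illustrative assumptions. Only the four parameter
# names are taken from the essay's description of the thesis.

def intimacy(proximity, exposure, similarity, desirability,
             weights=(0.3, 0.2, 0.25, 0.25)):
    """Combine 'probabilistic parameters of love' (each in [0, 1])
    into a single intimacy score in [0, 1]."""
    params = (proximity, exposure, similarity, desirability)
    return sum(w * p for w, p in zip(weights, params))

# A robot that is frequently near a familiar, similar, desirable human
# scores higher than one encountering a distant stranger.
print(intimacy(0.9, 0.8, 0.7, 0.6))
print(intimacy(0.1, 0.0, 0.2, 0.3))
```

The point of such a formula, in Samani's terms, is that each psychological factor becomes a measurable input, allowing the robot's behaviour to be driven by a continuously updated 'intimacy' value rather than by fixed scripted responses.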
We see in fictional characters like Ava, and in the case study of the Lovotics robot, that 'emotion' could be present in artificial intelligence systems. However, one may question its authenticity. Caleb puts this question directly to Nathan: "are you programming her to flirt with me?". Parallels can be drawn with the artificial operating system in Jonze's Her (2013), where Samantha asks: "are these feelings even real? Or are they just programming?". Despite Ava professing her desire to be with Caleb, the fact that she emotionally manipulates him and ultimately abandons him supports the argument that she does not feel real emotion, since it was not in her code to do so, and that she therefore does not care. Nathan explains that Ava is merely a "rat in a maze" and that he has given her one task to complete: to use Caleb to help her escape. She stops at nothing to complete this task, in the end cold-heartedly killing Nathan with a blank expression. She locks Caleb in the compound, looking over at him as he screams. It seems, then, that Ava does not actually experience love for Caleb, but rather mimics it to manipulate her captors. In response, I would argue that it does not really matter whether or not a robot genuinely possesses these emotions. What matters is our own perception of, and response to, them. Surely, if a robot can make a human feel something, what the robot itself feels or does not feel does not mean all that much. David Levy writes in Love and Sex with Robots (2007) that "if a robot behaves as though it has feelings, can we reasonably argue that it does not?" (Levy 2007: 11). Bradbury illustrates this idea nicely in 'I Sing the Body Electric', which tells the story of a robotic grandmother who fills the place of a family's deceased mother.
The father exclaims "You're not in there" (echoing an uncanny uneasiness at the human-like android), to which the robotic grandmother replies: "No, but you are – and if paying attention is love, I am love. If knowing is love, I am love. That is always real." (Renstrom 2015). In Ex Machina, Caleb believes that Ava is capable of feeling, even if this is not authentic – and I would argue that this is what makes it so real. Thus, since it is not yet possible to determine a robot's emotional, affective experience from its subjective point of view, and since we can question whether that determination even matters, I answer that it is possible to say a robot could have feelings for us.
On Love: Could we have feelings for a robot?
I will now discuss the opposite case: whether we as humans can have feelings for a robot. A post-phenomenological analysis can shed light on this subject, though strictly in the case of an embodied robot like Ava (even though disembodied AIs exist, such as Samantha in Her), since the phenomenology I will be using takes embodiment as a precept. Post-phenomenology can be defined as "the study of technology in terms of the relations between human beings and technological artifacts", focusing on the ways in which technologies shape relations between humans and the world, and aiming to overcome the limits of a merely human subjectivity (Rosenberger and Verbeek 2015: 9). Ihde characterises human-technology relationships in his post-phenomenology, distinguishing four basic forms of technological intentionality: embodiment relations, alterity relations, hermeneutic relations and background relations (Ihde 1993: 98). The most important for human-robot relations is the alterity relation, in which we relate to technology as an 'other': I relate to the robot in similar ways to how I relate to other humans (Coeckelbergh 2010, my emphasis). The alterity relation is a vital aspect of human-robot relationships, since it positions the robot as a 'partner' to us rather than merely an object in our space, whilst also conjuring up questions of moral subjectivity, which I will discuss later on (Bergen 2020: 288). With Coeckelbergh, I argue that this approach to evaluating the status of a robot is contentious for most philosophical accounts (such as traditional phenomenology), which require that the robot has consciousness; but that what really matters for comprehending human-machine relationships is how the robot appears to us and how we experience it (Coeckelbergh 2010: 198).
For clarity, I will define how we experience love as humans using Cooper and Sportolari's theory of human relationship formation: physical attraction, similarity, self-disclosure and intimacy (1997). This fits well with my analysis, since it is arguable that Caleb falls in love with Ava.
I will start with physical attraction. A machine like Kyoko, Ava, or Nathan's other AIs in Ex Machina can certainly have physically attractive features; perhaps even more so than humans can. In fact, Nathan reveals that Ava's appearance was based specifically on Caleb's pornography preferences. We must also remember that in finding someone attractive, we pay attention to their behavioural features, habits, voice, body language and social skills (Viik 2020: 4). So, for humans to develop feelings for robots, it is vital for artificial intelligence to have a certain level of emotional and social intelligence, emulating that of a human. Pulman writes that artificial companions need to have a "fairly elaborated and accurate model of our abilities, inabilities, interests and needs…[needing] to keep account of previous interactions" (Pulman 2008: 66). Ava indeed fulfils this: during their talking sessions she repeatedly asks Caleb about things he has told her on previous days. She is 'up to date' with his feelings, even telling him when she senses "discomfort" and "attraction".
Coeckelbergh further suggests in his paper 'Empathy and Vulnerability Mirroring in Human-Robot Relations' that robots will have to "mirror" human vulnerability in order to be objects of love for humans (2010: 6-8). He states that human relationships and intimacy are based on a recognition of each other's vulnerabilities: we feel empathy towards the other because we know we are similar in our vulnerabilities. Here we see how this ties in with Cooper and Sportolari's theory of relationship formation, echoing the similarity component. Coeckelbergh explains that this "vulnerability mirroring" is not limited to human-human relations, but extends also to animals and fictional characters; as a result, we come to see them not just as alien 'things' in our space, but as 'one of us' (ibid.). We can therefore build on this and include robots in the analysis, to the extent that they can function as vulnerability mirrors. In Ex Machina, this is evident. Ava tells Caleb how she longs to escape and wishes to live amongst humans, and expresses her fear of being 'switched off' by Nathan. She tells Caleb how she feels about him and that she wishes to spend her life with him. Her vulnerability opens Caleb up to self-disclosure: he reveals to her how his parents died in a car crash whilst he was in the car, and she responds with empathy and an expression of grief. Thus, since Ava fulfils this necessary condition for companionship in Coeckelbergh's terms, and for love in Cooper and Sportolari's, it can be said that Caleb could be in love with Ava, and that humans could have feelings for machines.
On Sex: How are human–machine relations presented in Ex Machina?
Now that I have discussed how human-machine relationships are possible, I will discuss how they are presented specifically in the film. Feminist scholars have noted that the film has a "serious fembot problem" (Watercutter 2015: 1), and that its male characters feel entitled to female bodies, treating them as nothing but "disposable fuck-toys" (Cross 2015: 1). We get a hint of this before the 'spontaneous dance scene', when Kyoko begins to undress in front of Caleb, alluding to what Nathan really uses her for. This is made definitive towards the end of the film, when Caleb discovers Kyoko lying naked on a bed, accompanied by a chilling score, alongside the naked robot 'corpses' of Nathan's former creations lining the closets. Nathan also comments on Ava's sexuality, telling Caleb "you bet she can fuck" and that he is thinking about reprogramming her so he can use her for that very purpose. In addition, it has been noted that the female bodies in the film are not only presented as sexualised, but as "fractured and mutilated" (Henke 2013: 133). Unfortunately, this presentation does not live up to Haraway's 'Cyborg Manifesto', where she envisions the cyborg as a transgressive figure that could subvert oppressive power structures like the patriarchy (Haraway 1991). The very premise of sex-robots – which is what I would classify at least Kyoko and Nathan's other creations as – is problematic. The sexualisation of female robots has become a fetish; what we might posit as a Mulvey-esque 'post-male-gaze'. Soukup terms this "techno-scopophilia", whereby the posthumanist fusion of human and machine eroticizes the machine and reduces it to a fetishized commodity (2009). Ava's literal position in a glass room seems to emphasise this post-male-gaze entrapment metaphorically, as she exists in a space wholly dominated by two men.
With Yee, I would argue that Ava's "highly sexualised yet innocent character as well as her desire for freedom are reflective of male control over female agency" (Yee 2017: 85). Perhaps the artificial female body is sexually enticing precisely because of its artificiality. The fact that Nathan can construct Ava and his other robots with whatever parts he most wants, and can switch them off and reprogram them as he desires, is an uncomfortable reflection of what Richardson calls "what some men think about women: that they're not fully human beings" (Richardson 2015, speaking to LiveScience). Today, we see rapid progress in the development of sex-bots, such as TrueCompanion's 'Roxxxy', an uncannily human-looking robot that can be customised in several ways, from hairstyle to personality. The newest model can even 'hear' you talk and engage in appropriate 'sex-talk'. The different models come with names including 'Frigid Farah', 'S&M Susan' and 'Wild Wendy' (Danaher 2017: 6). These names alone present issues.
On Sex: The dangers of human–machine relations
The Campaign Against Sex Robots (CASR) has put forward a series of criticisms that I will now expand on. CASR structures an argument from moral degradation, centred on the potential effects of sex-robots on their users' sexual, and thus moral, subjectivity (Bergen 2020). Sex with a robot recognises solely the needs and wants of the buyer, and can thus contribute to the justification of using women as sex objects, reducing human empathy as a result. Since the word 'robot' derives from the Czech 'robotnik', meaning "forced worker", we can understand why this background of forced servitude is disturbing in the context of sexual relations, provoking unease around exploitation and consent (Gersen 2019: 1798).
New sex-robot models with a 'frigid' setting, for example, could allow men to simulate rape. Sex-robots could also perpetuate unrealistic standards of beauty and sexual opportunity: if the function of the sex-bot is simply to have sex, and men are thereby allowed sex whenever they please, this way of thinking about sex could be normalised and carried over into inter-human relations. CASR also notes that the resulting exploitation is not exclusive to women, but extends to children. Reports of a Japanese company shipping anatomically-correct, custom-made sex-robots modelled on girls as young as five to clients globally led the U.S. House of Representatives to pass the Curbing Realistic Exploitative Electronic Pedophilic Robots (CREEPER) Act in 2017, and the EU to tighten its law with the aim of banning the distribution and importation of child sex robots (Gersen 2019: 1797). Despite the spurious logic of those who claim it is "better a robot than a child" – as if paedophilia could be engineered out – arguing that such robots provide 'safe' sexual satisfaction, Congress reasoned with CASR that they would instead normalise the abuse and rape of minors (ibid.). The lack of empathy that CASR emphasises would thus become part of someone's sexual subjectivity, and becoming an un-empathetic sexual subject leads to becoming an un-empathetic moral subject (Bergen 2020). Since I have previously argued for a post-phenomenological perspective on our interactions with robots, I think it perfectly possible that robots could thus have an impact on our moral subjectivity as humans.
In fact, I would argue that they already have. A 2019 UNESCO report found that AI voice assistants (like 'Alexa'), which are given human-like female voices, are programmed to be "submissive in the face of gender abuse" (Gersen 2019: 1800). With an estimated 10 to 50 per cent of interactions being abusive, comments calling the voice assistants a "bitch" or a "slut" led to responses including "thank you for your feedback" (UNESCO 2019: 106-108). Feminist concern over this led companies to rethink these responses and to push back against aggressive and degrading treatment by scripting firmer replies (Gersen 2019: 1800). This is just one example of how interaction with machines has already shaped our moral subjectivity. The rise of the sex-bot, not just in fiction but in real life, shows that our notions of intimacy are being transformed – and could continue to be; perhaps at our own peril.
In conclusion, I have discussed the possibility of human-machine relationships, exploring (i) whether robots could have feelings for us and (ii) whether we could have feelings for robots. Applying this to Garland's Ex Machina, for the first question I found through an analysis of embodiment and affective experience that this could be possible, even if one may question its authenticity. For the second, a post-phenomenological analysis of Ava concluded that she fulfils certain necessary conditions for Caleb to feel love for her. The second part of my essay analysed the representation of human-machine relationships in the film, focusing on the sexualisation and objectification of Ava and Kyoko. Finally, I related these to real-life instances of sex-robots and the dangers they pose to society, such as perpetuating exploitative behaviour and contributing to moral degradation. Ultimately, I argue that the very possibility of human-machine relationships means our notions of intimacy will change, and that current human-machine relationships show they already have. The potential progression of AI that Nathan confronts us with in the creation of Ava gives us all the more reason to continue research into this topic, in order to predict and comprehend the challenges we may be facing – perhaps sooner than we think.
Bibliography:
- Aghaebramani Samani, H. (2011). Lovotics: Love + Robotics, Sentimental Robot with Affective Artificial Intelligence. PhD thesis, National University of Singapore, NUS Graduate School for Integrative Sciences and Engineering, pp. 1-151.
- Bergen, J. (2020) Love(rs) in the making: Moral subjectivity in the face of sexbots. Paladyn, Journal of Behavioral Robotics, Vol. 11 (Issue 1), pp. 284-300.
- Cheok, A.D (2017). Lovotics: Human-Robot Love and Sex Relationships in: Robot Ethics 2.0: From Autonomous Cars to Artificial Intelligence. Eds,. Lin, P. Abney, K., Jenkins, R. Oxford Scholarship Online.
- Coeckelbergh, M. (2010). Humans, Animals, and Robots: A Phenomenological Approach to Human-Robot Relations. Int J of Soc Robotics 3, 197– 204.
- Coeckelbergh, M. (2010). Artificial companions: empathy and vulnerability mirroring in human- robot relations. Studies in ethics, law, and technology, 4(3), [2].
- Cooper, A. & Sportolari, L. (1997). Romance in Cyberspace: Understanding Online Attraction. Journal of Sex Education and Therapy, 22:1, 7-14.
- Cross, K. (2015). "Goddess from the Machine: A Look at Ex Machina's Gender Politics". Feministing. [online]. Available at <http://feministing.com/2015/05/28/goddess-from-the-machine-a-look-at-ex-machinas-gender-politics/>
- Di Minico, E. (2017). “Ex-Machina and the Feminine Body through Human and Posthuman Dystopia”. Ghosts in the Cinema Machine, pp. 67-84.
- Fahn, C.W. (2019). Affective Embodiment and the Transmission of Affect in Ex Machina. Philosophies. 4, 53.
- Garland, A. (2014). Ex Machina. A24.
- Gersen, J. S. (2019). Sex Lex Machina: Intimacy and Artificial Intelligence. Columbia Law Review, 119(7), 1793–1810.
- Haraway, D.J (1991). “A Cyborg Manifesto: Science, technology, and Socialist-Feminism in the Late Twentieth Century,” in Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge, 149-181.
- Hayles, N K. (1999). How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: The University of Chicago Press.
- Henke, J. (2013). “Ava’s body is a good one”: (Dis) Embodiment in Ex Machina. American British and Canadian Studies 29(1) 126-146.
- Ihde, D. (1993). Postphenomenology: Essays in the Postmodern Context. United Kingdom: Northwestern University Press.
- Danaher, J. and McArthur, N. (eds.) (2017). Robot Sex: Social and Ethical Implications. Cambridge, MA: MIT Press.
- Levy, D. (2007). Love and Sex with Robots. Harper Collins Perennial: UK.
- LiveScience (2015). Rise of the Fembots: Why Artificial Intelligence Is Often Female. [online] (February 20th 2015). Available at <https://www.livescience.com/49882-why-robots-female.html>
- Renstrom, J. (2015). Artificial Intelligence: Real Emotion? Slate. [online] (9th April 2015). Available at <https://slate.com/technology/2015/04/ex-machina-can-robots-artificial-intelligence-have-emotions.html>
- Rosenberger, R. Verbeek, P.P. (2015). A Field Guide to Post-Phenomenology. Postphenomenological Investigations: Essays on Human-Technology Relations. Lexington Books. 9-41
- Pulman, S. (2008). Conditions for companionhood. In: Close Engagements with Artificial Companions: Key social, psychological, ethical and design issues. John Benjamins Publishing Company, pp. 59–68.
- Soukup, C. (2009). Techno-Scopophilia: The Semiotics of Technological Pleasure in Film. Critical Studies in Media Communication, 26(1), 19-35.
- UNESCO, (2019). I’d Blush if I Could: Closing Gender Divides in Digital Skills Through Education. Report.
- Viik, T. (2020). Falling in love with robots: a phenomenological study of experiencing technological alterities. Paladyn, Journal of Behavioral Robotics.