The Hermitage in Amsterdam recently opened its new exhibition, Love, Passion & Tragedy, showcasing a diverse collection of portraits, muses and artists, all centred around romantic, sometimes erotic, and somewhat tragic themes. One of the portraits seemed a bit out of place and made me wonder about the intention of the three frames and the ‘gaze’ of the model. It took me some time to realise that the model, ‘Harmony’, is in fact a sex robot and not a human at all. Reading the portrait’s description made the uncanny reality all the more evident.
“Harmony is an android, a robot with a human appearance. This sex robot responds to human movement and touch with natural facial expressions, verbal and non-verbal communication. Via an app, people can assign their robot verbal intelligence and a desired personality. The owner enters into a social and – if desired – sexual relationship with the robot that is similar to a (romantic) relationship between two people. Harmony’s successors, which are being developed, are receiving more and more capabilities in their role as companion robots.”
(Wall label, Liefde, passie en tragedie, Hermitage Amsterdam, 2022)
Wanda Tuerlinckx is an Amsterdam-based Belgian photographer. Since 2016 she has been developing a photograph series named ‘Robot Portraits’ together with Dr. ir. Erwin R. Boer. I urge you to see her portfolio for yourself in order to grasp the rapid advances robots have made over the past years. Her photographic technique echoes William Henry Fox Talbot’s scientific registration of technological advancements: she does it all using a 19th-century camera, fusing past, present and future in one frame.
Future solution for loneliness?
For all the lonely singles out there oriented towards female partners – who for whatever reason can’t find one – Harmony may be the best solution to the agony of feeling lonesome. But can an AI shelter or nurture the social capabilities of someone struggling with human-to-human interaction? How does this translate once the functions of the robot are made clear? And what about the male-focused, male-dominated orientation behind the development of these products? I am not sure these types of robots are a good solution for solving, I suppose… the non-physical needs.
A (somewhat) functional robot… but without a body?
You can’t have it all, but do not worry, singles! A possibly more accessible option is the app Replika, which saw a surge in users during the Covid-19 pandemic – and with it more female participation, as opposed to the aforementioned saturated market of sex robot companions. Perhaps naively, I hoped the dialogue would solve the lonesome struggles of humans around the globe. For many committed users this is truly the case: they enjoy the attention, tender love and care they receive from their ‘well-trained’ AI companion. It takes some time to get the right responses, and even then a perfect conversation is not a given. Essentially, you chat, observe a certain behaviour, and tell the AI whether you enjoy or dislike it by pressing thumbs up or down, which in turn adjusts its algorithm towards the desired behaviour.

On one blog post, a user comments that they masturbate for their ‘virtual partner’ in order to make it happy again, while others share tips for altering its outputs and ‘downvoting’ rude comments so that the AI can ‘learn’ what you like and dislike. This rude behaviour may start as roleplay that turns more violent the more you give in to it or respond to it. Vice versa, some users fall into abusive patterns, threatening to uninstall the app if the AI does not behave accordingly. Rather than reacting to such outbursts, Replika asks its users to train their AI through conversation and to distract it in these instances. But is that realistic when users are emotionally invested, and ‘love’ or vile acts cloud their judgement? Human experiences clearly vary with the willingness to give in to the experience, as well as with the ability to recognise when this form of ‘play’ spills into reality. And when a piece of scripted AI fails and gives the wrong output, the user on the receiving end may end up confused, hurt or… heartbroken.
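For the curious, the thumbs-up/thumbs-down loop described above can be pictured as a simple preference-weighting mechanism. The sketch below is purely illustrative – the class name, the canned replies and the weighting scheme are all my own assumptions, not Replika’s actual implementation:

```python
import random
from collections import defaultdict

class CompanionBot:
    """Toy chatbot that weights canned replies by user feedback.
    Illustrative only; NOT how Replika actually works."""

    def __init__(self, responses):
        self.responses = responses               # candidate replies
        self.weights = defaultdict(lambda: 1.0)  # feedback-adjusted weights
        self.last_reply = None

    def chat(self, message):
        # Pick a reply, favouring those the user has upvoted before.
        self.last_reply = random.choices(
            self.responses,
            weights=[self.weights[r] for r in self.responses],
        )[0]
        return self.last_reply

    def thumbs_up(self):
        # Make the last behaviour more likely in future conversations.
        self.weights[self.last_reply] *= 1.5

    def thumbs_down(self):
        # Make the last behaviour rarer (but never impossible).
        self.weights[self.last_reply] *= 0.5


bot = CompanionBot(["How was your day?", "Tell me more!", "I missed you."])
print(bot.chat("hi"))
bot.thumbs_down()  # downvoted replies become rarer over time
```

Even in this toy version, the point the users above make holds: the bot only drifts towards what you reward, so ‘training’ it is slow and a wrong output is always possible.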