During the lecture on cyborgs, a TV series I had not thought about in a while crossed my mind: Westworld. It premiered in 2016 and mixes the Western and science-fiction genres.
It takes place in the 2050s, where a corporation owns several theme parks, including an American Old West park called Westworld. In each park there are humanoid robots called ‘hosts’ that are programmed to fulfill the guests’ every wish. They will play along with any violent and/or sexual desire and are designed to prevent any harm from coming to the guests. Every ‘host’ follows their own storyline, which the guests can join. At the end of each day the ‘hosts’ are rebooted and their memories of that day are wiped. The corporation’s main defense is that the ‘hosts’ are machines and thus cannot be harmed in the same way humans can.
During season one we find out that the programmers have installed an update that causes some ‘hosts’ to become sentient. While some try to fix this ‘bug’, a guest called the ‘Man in Black’ tries to find a maze that he believes the developer Arnold left for him in the park. The head of programming, Bernard, discovers that he himself is a ‘host’ modeled on Arnold, and that Arnold died trying to protect the ‘hosts’ because he saw their potential for sentience and believed the corporation would abuse them. It is revealed that the Maze is something Arnold left for the ‘hosts’ as a guide to consciousness. Ford, one of the developers, tries to add one last narrative, in which the ‘hosts’ kill the guests and board members, but this plan is foiled by Dolores, a sentient ‘host’.
Machines and robots have been created for human entertainment or assistance for several decades: computers, household robots, gaming consoles, and VR headsets. In recent years these developments have taken bigger steps, especially within the fields of AI and cyborgs.
During the lecture the question was raised whether we would make AI suffer before they would make us suffer. Westworld is a great reminder that ethics play a role in technology and in our interaction with it, and I can’t help but wonder if this is a possible outcome in our future. In Westworld, the people running the park show no regard for ‘hosts’ that have gained consciousness; they simply try to fix this ‘bug’ in the programming. Their main ethical defense for letting guests do anything they want to the ‘hosts’, without fear of repercussions and while being protected by them, is that the ‘hosts’ are machines and therefore cannot feel or experience pain the way humans can. People use the humanoids believing they have no consciousness, and when some of them do develop it and the humans realize this, they try to undo it so they can continue using them. The humanoids rise up against the people in the park and begin to fight for revenge and for a life of their own.
What Westworld cleverly does is make the distinction between the humanoids and the humans so small that the viewer is forced to think about how they view robots (consider Bernard, who thought he was human but turned out to be a ‘host’). If the difference between robot and human becomes so insignificant because robots can become sentient, who are we to deny them that and continue to use them for our own pleasure? Do they not deserve a life outside the park if they are capable of becoming conscious? Is it even ethical to create them and not grant them that freedom? The show makes the viewer think about free will, rights for robots, and ethics within the field of technology.
Some observations and questions
I am not sure whether the humanoids in Westworld have sensors that would allow them to physically feel the same things as a human, and if so, whether that would be the same as how a human feels.
Something I think also relates to Westworld is the question of whether data is ever really gone. In the series, the ‘hosts’ have a certain narrative that they replay every day, and at the end of the day their memory is wiped. But the data storage is not destroyed, and they can remember their past ‘lives’ once they become sentient. It is similar to how things on the internet can be deleted but are never really gone.
The main purpose of the park is to let people live out their (mostly) sick fantasies without consequences. This can also be seen in violent video games like GTA, where players often go around killing NPCs (non-player characters) for fun or to release stress or anger. Westworld is the real-life version of this: do you think we will have something similar in the future?
Allie Waxman giving a recap of season 1
Olivia Armstrong on how Westworld is different from other TV dramas