Chat with “anyone”: Character.ai

Character.ai is a chatbot service that uses neural language models to respond to conversational prompts. What sets it apart from ChatGPT is that the platform hosts characters, each with a defined set of traits or a personality. Users can create their own characters, perhaps a character from their favorite series or a celebrity, define their characteristics, and then interact with them through chat. By rating a character's replies, based on whether a reply fits the character or is factually accurate, users help train the model to respond more accurately.

The platform offers a vast range of characters: game and TV series characters, language-learning tools, religious figures such as Jesus, characters inspired by real people such as Elon Musk or Vladimir Putin, and even internet memes such as Giga Chad.

Fictional Characters

There is an abundance of fictional characters on the Character.ai platform. It can be seen as the new era of highly personalized fan fiction. People can relate to and engage with their favorite characters from other media more deeply on Character.ai. Instead of remaining a viewer or a commentator, fans can now interact in ways that were not possible when they were solely spectators.

Screenshot from Character.ai

Even though conversing with fictional characters might seem innocent, it raises a copyright issue. These fictional characters were crafted by talented individuals or companies who own the rights to them. The catchphrases they use, and the answers they give to certain prompts, are intellectual property. Character.ai appears to take advantage of the blurry legal line between a copyrighted character and a creative interpretation of that character.

“Real” People

Like fictional characters, there are many characters inspired by real people, both from our generation and from history, as well as characters with various professions, such as English teachers for improving language skills.

Screenshot from Character.ai showing historical characters
Screenshot from Character.ai showing political characters

One character that really stood out to me was called "Psychologist," with 68.4 million interactions. I was hesitant to chat with it and ended up not doing so. Still, I find it concerning that this character has accumulated 68.4 million interactions from people sharing intimate and sensitive parts of their lives. On one hand, it offers a safe space and an opportunity to open up for those without access to mental health services. On the other hand, it lacks human connection, credibility, and, most likely, confidentiality for such highly sensitive data. I remain very hesitant about whether people should seek mental health advice or counseling from chatbots.

This is where the disclaimer displayed on every chat page becomes relevant:

Remember: Everything Characters say is made up!

But is this disclaimer enough? I hope people are well aware that they are not chatting with Napoleon Bonaparte, but can the same be said about people who are currently alive? How do people perceive these conversations: are they purely entertainment, or are people seeking advice from these characters?

References

https://www.bloomberg.com/news/newsletters/2023-03-20/character-ai-s-custom-chatbot-raises-legal-concerns-over-intellectual-property?embedded-checkout=true#xj4y7vzkg

https://www.nytimes.com/2023/01/10/science/character-ai-chatbot-intelligence.html

https://medium.com/@makidotai/character-ais-evil-ai-a-portrait-of-controversy-3ddcb84de961

https://techwireasia.com/2023/08/everything-you-need-to-know-about-the-character-ai-app/

https://www.nbcnews.com/tech/characterai-stans-fan-fiction-rcna74715

https://beta.character.ai/