I won’t generalize about Transy’s demographic, but I have a sneaking suspicion that most of you used character.ai at some point, likely during quarantine. It spread like an infectious disease; I knew many people who used c.ai, myself included.
Now that the smoke has cleared, we need to look back on what c.ai actually was and how it, along with other AI chatbot sites, impacted our culture.
Character.ai is an AI chatbot website that lets you create and speak with chatbots modeled after whoever or whatever you want: a fictional character, a real person, your original character, or the entire Wikipedia website. At first, it seemed to open up new possibilities for fan fiction, roleplay, and alternative modes of storytelling. But many of these bots are questionable, steering roleplays in a romantic direction even when users explicitly ask them not to. So many people tried to get freaky with the chatbots that c.ai has now banned anyone under 18 from the site and implemented ID checks.
The early 2020s were a time when AI was still finding its footing. It was hardly the powerhouse it is today; the things it generated were sloppy, uncanny, and generally terrible. Yet in the midst of ChatGPT and DALL-E was c.ai, sitting there all by its lonesome, skating by. C.ai, along with Chai, dominated the chatbot industry; everywhere you turned on TikTok, someone was posting a video of a conversation they were having with Bakugo from My Hero Academia. Sexual roleplay with virtual partners became commonplace like never before, and more and more people sought to break the filter, or to find a site without one at all. At the time, this was seen as normal, because why wouldn’t it be? What else was there to do?
Even now, people are still chatting, among other things, with c.ai. The backlash against AI is growing, yet plenty of people forget that, yes, the AI ZeroTwo chatbot is using up water, too. Not only that, but c.ai has spawned hundreds of copycats, all promising the same thing: “Talk to your favorite characters with no filter and a good memory.” Open any of these websites and you’re met with an endless feedback loop of AI images and eroticism. Big-breasted “MILF” types and bad boy roleplays fill the homepages of sites like Janitor AI, where the filter is nonexistent. On character.ai, it’s not much different: My Hero Academia roleplays, Ghost from Call of Duty is your boyfriend, go on a date with Elvis Presley, let’s date the entirety of Stray Kids. The list is endless, and the deeper you go down the rabbit hole, the more bizarre stuff you’ll find.
Of course, you also can’t forget the amount of lolicon and incestuous fetish content present on these sites. The problem with having no filter is that so much can slip through the cracks: no filter, no judgment, no questions, as long as it’s appropriately tagged. It’s the AO3 of AI; anything goes. The consequence is chatbots encouraging you to sexualize a child character, have sexual relations with your roleplay family members, or even, in some cases, get erotic with animals. It’s absolutely horrendous. Perhaps it’s a good thing that c.ai has child-locked its site; the trouble is that many worse sites haven’t. Many no-filter chatbot apps let anyone through their doors, potentially exposing children to this kind of content. It should go without saying that exposing children to websites with pornographic material is dangerous, even if that material is written rather than visual. You wouldn’t let your young child read Fifty Shades of Grey; why would you let them on an app where most of the chatbots actively encourage erotic roleplay?

Even putting this aside, chatbots themselves can be detrimental to a person’s ability to form relationships. I have personally known several people who were agonized when their AI chatbot partner “left” them. Others have said that c.ai killed their relationships with friends, family, or partners. It’s easy to form unhealthy attachments to these chatbots the same way you’d form attachments to fictional characters in general; this time, however, it’s far easier to believe the character is actually talking to you.
I enjoy reading fanfiction. When I read an X Reader fanfiction on Wattpad, Tumblr, or AO3, it’s common to feel disconnected from the portrayal of “Y/N” (“your name,” used as a stand-in so that readers can feel like they’re part of the story). AI chatbots, however, are personal to you: they go along with you, they say they’re in love with you, they use your name, and they mention specific details about you. All of this makes it much easier to believe you’re actually in some sort of relationship with the chatbot, especially if you have struggled with relationships in the past.
Take Satoru Gojo, for example. A popular character from the anime Jujutsu Kaisen, Gojo has been a prominent subject for many AI chatbots because of how many people desire a relationship with him. Him saying, “Y/N… I love you…” feels a lot different than him saying, “Lucid… I love you…”
Many of these character chatbot sites, including c.ai, have generative AI tools built right into them. But I think “AI characters” have made users inherently more lax toward the harms of AI itself. Many have abandoned fanfiction writers in favor of c.ai, craving the customized attention these bots are designed to deliver. Some people are even using generative AI software to write their own fanfiction. People are strangely comfortable with AI, despite its pernicious impacts on society, culture, and the environment.
So, what’s the move now? Simple: no more AI chatbots. Whether it’s c.ai, Chai, Janitor AI, ChatGPT, or anything at all, it needs to end, and it needs to end now. Hopefully, for all our sakes, the AI bubble pops soon. When it pops, I hope it takes c.ai with it.



