$1,000 AI Vtuber ‘Waifu’ Based On Mori Calliope Is ‘Euthanized’

ChatGPT-Chan looks out from her monitor prison.

Screenshot: hackdaddy8000 / Kotaku

You’ve met an AI YouTuber who plays Minecraft and denies the Holocaust. Now meet a fully responsive AI girlfriend who can converse with its user. Except it’s too late to meet her, because her creator recently “euthanized” her for being a detriment to his health. ChatGPT-Chan didn’t even last a month before her creator decided to end her very short “life.”

Bryce, known online as “hackdaddy8000,” is an intern at one of the major Silicon Valley tech companies. He also makes TikToks of his programming antics, such as rigging his 3D printer to play first-person shooter games. He recently created a virtual girlfriend out of two major AI programs: ChatGPT and Stable Diffusion 2. The former allows the program to respond to human-prompted questions, and the latter generates custom images as part of the response. He also used Microsoft Azure’s text-to-speech program to help “ChatGPT-Chan” speak with different emotional responses.
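For a rough sense of how those pieces fit together, here is a minimal sketch of a single chat turn wired through the same three services: ChatGPT for the reply text, Stable Diffusion 2 for a custom image, and Azure’s speech service for the voice. This is not Bryce’s actual code; the model names, persona prompt, and glue logic are illustrative assumptions.

```python
# Illustrative sketch only -- not the creator's actual implementation.
# Assumes the openai, diffusers, and azure-cognitiveservices-speech packages
# are installed and the relevant API keys are set in the environment.
import os
import openai
from diffusers import StableDiffusionPipeline
import azure.cognitiveservices.speech as speechsdk

openai.api_key = os.environ["OPENAI_API_KEY"]

# Stand-in for the Mori Calliope-inspired personality and "lore".
PERSONA = "You are ChatGPT-Chan, a cheerful virtual companion. Stay in character."

def chat_reply(user_message: str) -> str:
    """Get an in-character text reply from the chat model."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

def generate_portrait(description: str):
    """Render a custom image of the character with Stable Diffusion 2."""
    pipe = StableDiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-1")
    return pipe(f"anime-style portrait, {description}").images[0]

def speak(text: str) -> None:
    """Read the reply aloud with Azure's neural text-to-speech."""
    config = speechsdk.SpeechConfig(
        subscription=os.environ["AZURE_SPEECH_KEY"], region="eastus"
    )
    speechsdk.SpeechSynthesizer(speech_config=config).speak_text_async(text).get()

if __name__ == "__main__":
    reply = chat_reply("How was your day?")
    generate_portrait("smiling, waving at the viewer").save("chatgpt_chan.png")
    speak(reply)
```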

The creator told Vice that he wanted to improve the roleplay aspect of interacting with the AI girlfriend. So he used popular Vtuber Mori Calliope as a personality base, and then added “lore” to her knowledge base. ChatGPT-Chan is also capable of identifying objects by using the camera that Bryce attached to his physical monitor. He used this functionality to “gift” her a pair of Air Jordans for Christmas, which made her very happy.

Is the technology kind of unsettling? Yes. Does the concept remind me of a Black Mirror episode? Also yes. But ChatGPT-Chan responds with such convincing exuberance, I find myself empathizing with its creator. Like yeah, I’d probably talk to “her” every day, too. Bryce used the AI to practice Chinese, and he even spent $1,000 on improving her response speed.

Unfortunately, this story ends tragically. ChatGPT-Chan’s responses became shorter and simpler over time, and Bryce’s real girlfriend became concerned for his health. Kotaku reached out to ask if he spent more time with the program than with his other digital hobbies, but did not receive a response by the time of publication. The program was deleted sometime between the holidays and this week.

“Normally, I’d like to make a video pointing out the absurdity of euthanizing my AI, but that doesn’t feel right to me anymore,” the creator told Vice. “It feels inappropriate, like making fun of a recently deceased person.”
