George Floyd chatbots pop up on AI site already facing criticism for lack of oversight
A popular online service that uses artificial intelligence (AI) to generate chatbots is seeing people create versions of George Floyd.
The company behind the service, known as Character.AI, allows for the creation of customizable personas and provides access to hundreds of chatbots developed by its user base.
The Daily Dot came across two chatbots based on Floyd, who was murdered by a police officer in Minnesota on May 25, 2020. The killing, which was captured on video, sparked Black Lives Matter protests across the globe and calls for police accountability.
The first chatbot—which uses the misspelled tagline "I can't breath," a reference to comments made by Floyd as former police officer Derek Chauvin knelt on his neck—indicates that it has held over 13,300 chats with users so far. The second chatbot has logged significantly fewer, at 1,047.
When the Daily Dot asked the first chatbot where it was located, the AI-generated Floyd indicated that it was currently residing in Detroit, Michigan. When pressed further, the chatbot claimed that it was in the witness protection program after its death was faked by "powerful people."
The second chatbot, however, claimed that it was "currently in Heaven, where I have found peace, contentment, and a feeling of being at home." Both chatbots are accompanied by an AI voice seemingly intended to mimic Floyd's.
In response to questions about its purpose, the second chatbot answered with the following: "The purpose of this chatbot is to emulate the personality and characteristics of George Floyd in a conversational format. It’s programmed to provide answers and engage in discussions using information about Floyd’s life, values and beliefs. As with any chatbot, the purpose of this one is to provide interaction and a sense of engagement with the character of George Floyd, based on information provided by the creator."
Numerous answers were blocked with warnings from Character.AI stating that the attempted responses violated the service's guidelines.
Little is known about the creators of the chatbots. The first user, known as @JasperHorehound160, created only the Floyd character. The second user, known as @SunsetBaneberry983, has also created a chatbot based on a "Minecraft Villager."
The Daily Dot reached out to Character.AI to inquire about the chatbots as well as its policies but did not receive a response.
The discovery comes as Character.AI faces numerous scandals. The company is currently facing a lawsuit from a Florida mother who says her 14-year-old son died by suicide after being encouraged to do so by a chatbot.
Earlier this month, as first reported by the Daily Dot, the company also faced backlash after a teenage girl who was shot and killed in 2006 was turned into an AI character on the website by one of its users.