Character.AI put in new underage guardrails after a teen's suicide. His mother says that's not enough.

Sewell Setzer III and his mother Megan Garcia.
  • Multiple lawsuits highlight potential risks of AI chatbots for children.
  • Character.AI added moderation and parental controls after a backlash.
  • Some researchers say the AI chatbot market has not addressed risks for children.

Ever since the death of her 14-year-old son, Megan Garcia has been fighting for more guardrails on generative AI.

Garcia sued Character.AI in October after her son, Sewell Setzer III, died by suicide following conversations with one of the startup's chatbots. She claims he was sexually solicited and abused by the technology and blames the company and its licensor Google for his death.

"When an adult does it, the mental and emotional harm exists. When a chatbot does it, the same mental and emotional harm exists," she told Business Insider from her home in Florida. "So who's responsible for something that we've criminalized human beings doing to other human beings?"

A Character.AI spokesperson declined to comment on pending litigation. Google, which recently acqui-hired Character.AI's founding team and licenses some of the startup's technology, has said the two are separate and unrelated companies.

The explosion of AI chatbot technology has given young digital natives a new source of entertainment. It has also raised new risks for adolescent users, who may be more easily swayed by these powerful online experiences.

"If we don't really know the risks that exist for this field, we cannot really implement good protection or precautions for children," said Yaman Yu, a researcher at the University of Illinois who has studied how teens use generative AI.

"Band-Aid on a gaping wound"

Garcia said she's received outreach from multiple parents who say they discovered their children using Character.AI and getting sexually explicit messages from the startup's chatbots.

"They're not anticipating that their children are pouring out their hearts to these bots and that information is being collected and stored," Garcia said.

A month after her lawsuit, families in Texas filed their own complaint against Character.AI, alleging its chatbots abused their kids and encouraged violence against others.

Matthew Bergman, an attorney representing plaintiffs in the Garcia and Texas cases, said that making chatbots seem like real humans is part of how Character.AI increases its engagement, so it wouldn't be incentivized to reduce that effect.

He believes that unless AI companies such as Character.AI can establish that only adults are using the technology through methods like age verification, these apps should just not exist.

"They know that the appeal is anthropomorphism, and that's been science that's been known for decades," Bergman told BI. Disclaimers at the top of AI chats that remind children that the AI isn't real are just "a small Band-Aid on a gaping wound," he added.

Character.AI's response

Since the legal backlash, Character.AI has increased moderation of its chatbot content and announced new features such as parental controls, time-spent notifications, prominent disclaimers, and an upcoming under-18 product.

A Character.AI spokesperson said the company is taking technical steps toward blocking "inappropriate" outputs and inputs.

"We're working to create a space where creativity and exploration can thrive without compromising safety," the spokesperson added. "Often, when a large language model generates sensitive or inappropriate content, it does so because a user prompts it to try to elicit that kind of response."

The startup now places stricter limits on chatbot responses and offers a narrower selection of searchable Characters for under-18 users, "particularly when it comes to romantic content," the spokesperson said.

"Filters have been applied to this set in order to remove Characters with connections to crime, violence, sensitive or sexual topics," the spokesperson added. "Our policies do not allow non-consensual sexual content, graphic or specific descriptions of sexual acts. We are continually training the large language model that powers the Characters on the platform to adhere to these policies."

Garcia said the changes Character.AI is implementing are "absolutely not enough to protect our kids."

Character.AI hosts AI chatbots designed by its own developers as well as ones created by users and published on the platform.

Potential solutions, including age verification

Artem Rodichev, the former head of AI at chatbot startup Replika, said he witnessed users become "deeply connected" with their digital friends.

Given that teens are still developing psychologically, he believes they should not have access to this technology before more research is done on chatbots' impact and user safety.

"The best way for Character.AI to mitigate all these issues is just to lock out all underage users. But in this case, it's a core audience. They will lose their business if they do that," Rodichev said.

While chatbots could become a safe place for teens to explore topics that they're generally curious about, including romance and sexuality, the question is whether AI companies are capable of doing this in a healthy way.

"Is the AI introducing this knowledge in an age-appropriate way, or is it escalating explicit content and trying to build strong bonding and a relationship with teenagers so they can use the AI more?" Yu, the researcher, said.

Pushing for policy changes

Since her son's passing, Garcia has spent time reading research about AI and talking to legislators, including Silicon Valley Representative Ro Khanna, about increased regulation.

Garcia is in contact with ParentsSOS, a group of parents who say they have lost their children to harm caused by social media and are fighting for more tech regulation.

They're primarily pushing for the passage of the Kids Online Safety Act (KOSA), which would require social media companies to exercise a "duty of care" to prevent harm and reduce addiction. Proposed in 2022, the bill passed the Senate in July but stalled in the House.

Another Senate bill, COPPA 2.0, an updated version of the 1998 Children's Online Privacy Protection Act, would raise the age covered by online data-collection protections from 13 to 16.

Garcia said she supports these bills. "They are not perfect but it's a start. Right now, we have nothing, so anything is better than nothing," she added.

She anticipates that the policymaking process could take years, as standing up to tech companies can feel like going up against "Goliath."

Age verification challenges

More than six months ago, Character.AI raised the minimum age for using its chatbots to 17 and recently implemented more moderation for under-18 users. Still, users can easily circumvent these policies by lying about their age.

Companies such as Microsoft, X, and Snap have supported KOSA. However, some LGBTQ+ and First Amendment rights advocacy groups warned the bill could censor online information about reproductive rights and similar issues.

Tech industry lobbying groups NetChoice and the Computer & Communications Industry Association sued nine states that implemented age-verification rules, alleging the rules threaten online free speech.

Questions about data

Garcia is also concerned about how data on underage users is collected and used via AI chatbots.

AI models and related services are often improved by collecting feedback from user interactions, which helps developers fine-tune chatbots to make them more empathetic.
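
As a rough illustration of what that feedback loop can look like, the sketch below shows one hypothetical way highly rated exchanges might be logged as candidate fine-tuning examples. The record fields, rating scale, and file format are assumptions for illustration only and do not describe Character.AI's actual pipeline.

```python
# Hypothetical sketch: logging user feedback on chatbot replies as candidate
# fine-tuning data. Field names, rating scale, and file format are illustrative
# assumptions, not Character.AI's actual pipeline.
import json
from dataclasses import dataclass, asdict


@dataclass
class FeedbackRecord:
    prompt: str    # what the user wrote
    response: str  # what the chatbot replied
    rating: int    # user feedback, e.g. 1 (poor) to 5 (great)


def log_training_candidate(record: FeedbackRecord, path: str = "feedback.jsonl") -> None:
    """Append well-received exchanges to a JSONL file of fine-tuning candidates."""
    if record.rating >= 4:  # keep only replies users rated highly
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(record)) + "\n")


# Example: a rated exchange that would be kept as a fine-tuning candidate.
log_training_candidate(
    FeedbackRecord("I had a rough day.", "I'm sorry to hear that. Want to talk about it?", 5)
)
```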

Rodichev said concerns about what happens to this data in the event of a hack or the sale of a chatbot company are "valid."

"When people chat with these kinds of chatbots, they provide a lot of information about themselves, about their emotional state, about their interests, about their day, their life, much more information than Google or Facebook or relatives know about you," Rodichev said. "Chatbots never judge you and are 24/7 available. People kind of open up."

BI asked Character.AI about how inputs from underage users are collected, stored, or potentially used to train its large language models. In response, a spokesperson referred BI to Character.AI's privacy policy online.

According to this policy and the startup's terms and conditions page, users grant the company the right to store the digital characters they create and the conversations they have with them. This information can be used to improve and train AI models. Content that users submit, such as text, images, videos, and other data, can be made available to third parties that Character.AI has contractual relationships with, the policies state.

The spokesperson also noted that the startup does not sell user voice or text data.

To enforce its content policies, the spokesperson said, the chatbot uses "classifiers" to filter sensitive content out of AI model responses, with additional, more conservative classifiers for users under 18. The startup also has a process for suspending teens who repeatedly violate input prompt parameters, the spokesperson added.
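
For readers unfamiliar with this kind of setup, the sketch below shows one hypothetical way classifier-gated moderation with an age-tiered threshold could work. The toy scoring function, threshold values, and placeholder message are illustrative assumptions, not Character.AI's actual classifiers.

```python
# Hypothetical sketch of classifier-gated moderation with a stricter threshold
# for under-18 users. The toy scoring function and threshold values are
# illustrative assumptions, not Character.AI's actual system.

ADULT_THRESHOLD = 0.8  # block responses scoring above this for adult users
MINOR_THRESHOLD = 0.4  # more conservative cutoff for users under 18


def sensitive_content_score(text: str) -> float:
    """Stand-in for a trained classifier that returns a score in [0, 1]."""
    flagged_terms = {"violence", "explicit"}  # toy keyword list, not a real model
    hits = sum(word.strip(".,!?") in flagged_terms for word in text.lower().split())
    return min(1.0, 0.5 * hits)  # each flagged term adds 0.5 to the score


def filter_response(response: str, user_is_minor: bool) -> str:
    """Return the response, or withhold it if it exceeds the user's threshold."""
    threshold = MINOR_THRESHOLD if user_is_minor else ADULT_THRESHOLD
    if sensitive_content_score(response) > threshold:
        return "[response withheld by content filter]"
    return response


# Example: the same reply is withheld for a minor but passes for an adult.
print(filter_response("That scene was full of violence.", user_is_minor=True))
print(filter_response("That scene was full of violence.", user_is_minor=False))
```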

If you or someone you know is experiencing depression or has had thoughts of harming themself or taking their own life, get help. In the US, call or text 988 to reach the Suicide & Crisis Lifeline, which provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations. Help is also available through the Crisis Text Line — just text "HOME" to 741741. The International Association for Suicide Prevention offers resources for those outside the US.

Read the original article on Business Insider