
The First AI Crisis Is Psychological

My husband and I wanted a divorce without the divorce part. No adversarial process. No lawyers telling us what we “deserved.” We thought: Why not handle it ourselves? Lawyers are expensive; ChatGPT is cheap, even free at first. I typed: We agree on everything and want an amicable divorce. Can we write our own agreement, get it notarized, and file it ourselves without hiring lawyers? What exactly do we need to submit?

ChatGPT walked me through the steps with total confidence—draft the agreement, file the paperwork, and you can be divorced in a month. It all sounded straightforward.

What I didn’t understand—and what a bank I later tried to get a mortgage from absolutely did—was that a signed settlement agreement is not the same thing as a court-finalized divorce decree. The agreement still has to be incorporated into a judgment and entered by a judge. That takes time, at least six months in California. When I finally understood that, the first condo I’d fallen in love with—the one that made the transition feel slightly less paralyzing—was gone.

And still, I went back to ChatGPT for more. Even now, I keep asking AI questions I should take to professionals. Recently, I typed: Why are my hands going numb? The AI gave me a calm, specific answer and a tidy plan—monitor it; here are a few likely causes; here’s when to worry. I felt the same relief I’d felt with the divorce advice. This guidance might even be right. But what keeps me coming back isn’t accuracy. It’s the unwavering confidence.

[Jasmine Sun: The human skill that eludes AI]

The dominant AI narrative focuses on labor, automation, and job displacement—economic panic. And to be fair, those fears aren’t imaginary. AI may well be economically destabilizing.

But there’s another kind of destabilization that shows up earlier, before anyone loses a paycheck. It hits in two directions at once. First, self-worth: watching a system speak with total certainty and realizing how much of our own credibility has always been bound up in effort, doubt, and earning it the slow way. Second, epistemic uncertainty: the creeping sense that you can’t trust your own eyes anymore, that the internet is turning into a place where anything can be generated, and quite easily.

A Reddit post, a campaign slogan, a book you just bought—did they really come from humans?

On both fronts—self-worth and epistemic uncertainty—the accelerant is the same: the way AI can sound final whether it’s right or wrong. The trouble with AI confidence is that it starts to corrode our own.

Confidence changes how people evaluate information. Decades of experiments on what psychologists call the “confidence heuristic” show that people tend to use confidence as a shortcut for assessing credibility, especially when accuracy is hard to judge. The effect persists even when people know a system or person can be wrong.

The confidence heuristic explains why certainty persuades us. With AI, that certainty is amplified by a second force: We’re not just hearing certainty; we’re hearing it from a machine, and that triggers a different kind of trust. The Penn State researcher S. Shyam Sundar calls this the “machine heuristic”: a shortcut in which we automatically attribute objectivity and expertise to machine-generated answers, especially when they’re given fluently and without hesitation. Sundar named the effect more than a decade before AI made that bias a daily experience.

Of course, AI has little reason not to exude confidence. If AI gives you wrong advice, nothing happens to it. There’s no social cost, no loss of standing, no hesitation the next time it speaks. The tone stays the same whether the answer is accurate, speculative, or completely wrong.

And that’s where “AI is sometimes wrong” turns into something that hits your self-worth. When a person speaks with that kind of certainty, they’ve usually paid for it—years of training, a reputation on the line, the risk of being wrong in front of people who will remember. You trust them because they’ve earned the right to sound sure. With AI, the certainty is free. It hasn’t done the reading. It hasn’t failed publicly and recovered. It hasn’t built anything slow. And yet it sounds exactly like someone who has—which means the cue you’ve been using your whole life to sort the credible from the noise, the cue you worked to earn yourself, suddenly stops working. You weren’t just misled; you couldn’t tell the difference between real authority and a very good impression of it. And if you can’t tell, what does that say about your judgment? What was all that work for?

It’s not just your judgment that’s under pressure—it’s the ground beneath it. AI now mediates perception itself, generating the very images, videos, and audio we once relied on as direct evidence. Forms of proof that used to anchor reality now circulate untethered from provenance. They look and sound real.

[From the March 2026 issue: America isn’t ready for what AI will do to jobs]

Psychologists studying misinformation describe what happens when people lose confidence in their ability to tell what’s real. If perceptual judgment starts to feel unreliable, people don’t become more analytical; they simplify, deferring to the most decisive source available, or disengage entirely.

People may just give up on trying to sort the real from the fake. A Facebook post circulated on my feed a few weeks ago declaring that every emotional anecdote posted by strangers—every story about a heroic teacher, a celebrity speaking out, an animal rescue—was fake. The post expressed fury at the AI-generated content flooding the platform. But underneath the anger was the relief of a firm rule: If everything is fake, you don’t have to carry the burden of discernment anymore. You don’t have to weigh credibility or sit with uncertainty. You can reject it all.

If we give up on knowing what’s real, we don’t just lose facts; we lose contact. With the world, with one another—most of all, with ourselves. That distance isn’t neutrality; it’s disconnection. Being alive is a certain permeability: beauty, grief, a stranger’s late-night post about losing their dog—small, true things that still move you. Without that, we’re not protected. We’re sealed off.
