Anthropic’s Ethical Stand Could Be Paying Off

At first glance, last week looked like a catastrophe for Anthropic.

The AI company refused to let the U.S. government use its products to surveil the American public or direct autonomous weapons without human oversight. In response, the Department of Defense canceled its $200 million contract. On Truth Social, President Trump called the company “leftwing nut jobs” and ordered every federal agency to immediately stop using its products. Defense Secretary Pete Hegseth went a step further, designating Anthropic as a “Supply-Chain Risk to National Security.” OpenAI, Anthropic’s chief rival, quickly signed its own deal with the Pentagon.

Anthropic’s principled stand continues to pose enormous risks for the company. But some early indications suggest that it just might pay off.

The company’s confrontation with DOD has proved more effective than some of the world’s most expensive advertising—at least according to one metric. After a Super Bowl campaign earlier this year, Anthropic’s AI model, Claude, became one of the top 10 most-downloaded free apps in America, per Apple’s charts. The day after Hegseth announced that the government was severing ties, it took the No. 1 spot, a position it still holds as of this writing. Downloads have topped 1 million a day, according to Anthropic’s chief product officer. A spokesperson told me that the company “has broken its own sign-up record every day since early last week, across every country where Claude is available.”

[Read: Inside Anthropic’s killer-robot dispute with the Pentagon]

Users aren’t just signing up for Claude—they are also abandoning OpenAI (which has a corporate partnership with The Atlantic). Uninstalls of ChatGPT, OpenAI’s flagship app, spiked 295 percent on February 28, as details of OpenAI’s deal with the Pentagon emerged. One-star reviews surged nearly 800 percent, and five-star reviews fell by half.

Perhaps more consequential, Anthropic has gained the trust and admiration of engineers across the AI industry. Letters of support for the company are circulating among its competitors’ employees. One such letter had some 850 signatures as of Monday. Many of these employees are demanding that their companies show solidarity with Anthropic and honor the same red lines. Some have reportedly threatened to leave if those demands are not met.

Anthropic has won admiration outside Silicon Valley too. Before the company’s clash with DOD, former Republican Representative Denver Riggleman, who now leads a cybersecurity firm, was preparing to pick an AI firm to partner with. He was considering a range of options; Anthropic’s stand narrowed them to one. Riggleman has since directed his company to work with Anthropic on all future projects. “Anthropic had its nonnegotiables,” he told me, and “we have ours.”

Drawing from his experience on a congressional AI task force focused on foreign adversaries, Riggleman thinks that Hegseth’s decision to label Anthropic a supply-chain risk will likely be overturned in court. The U.S. government has never applied the label to an American company, typically reserving it for corporations run by hostile foreign actors, such as Huawei. Moreover, this is the first time that the label appears to have been used in retaliation for a business declining contract terms. “To say it rests on shaky legal ground,” Riggleman said, “would be generous.”

The former congressman once trusted his country to regulate technologies that had the power to reshape Americans’ lives. “These days,” Riggleman said, “the government is no longer creating those safeguards—it’s destroying them.” He continued, “I don’t think we appreciate yet, as a society, what it means to have private firms controlling this amount of information about citizens.”

The Department of Defense has said that the contract it offered Anthropic contained adequate safeguards, in part because the text limited AI’s uses to “all lawful purposes.” Anthropic argued that this clause wasn’t sufficient—that a new executive order or reinterpretation of statute could shift the existing legal boundaries. “We don’t want to sell something,” Anthropic CEO Dario Amodei said, “that could get our own people killed, or that could get innocent people killed.”

OpenAI has contended that its subsequent deal with the Pentagon is safer than Anthropic’s. Its contract does appear to prohibit mass surveillance and autonomous weapons. But it retains the “all lawful purposes” language, rendering that prohibition dependent on DOD’s willingness to respect legal norms. Even Sam Altman, OpenAI’s CEO, conceded that the deal was “definitely rushed” and that “the optics don’t look good.” On Monday, the company said it had added restrictions to the contract regarding surveillance, but critics are skeptical that they will prove any more binding.

[Read: OpenAI is opening the door to government spying]

The events of the past week reminded me of my early days as a Navy pilot nearly three decades ago. One of my first tasks was to sign a document pledging never to surveil American citizens. By the time of the 9/11 attacks, I was an aircraft commander, leading combat-reconnaissance aircrews that gathered large-scale intelligence and informed battlefield targeting decisions. I took for granted that somewhere along those decision chains, a human being was in the loop.

I could not have defined artificial intelligence then, but I understood instinctively that a person, not a machine, would bear the weight of life-and-death choices. This was not a bureaucratic consideration. It was a hard line that those of us in uniform were expected to hold.

In the standoff between Anthropic and the Pentagon, a private company was forced to hold the line against its own government. In doing so, Anthropic may have earned something more valuable than the contract it lost. In an industry where trust is the scarcest resource, Anthropic just banked a substantial deposit.
