Anthropic Takes a Stand

This is an edition of The Atlantic Daily, a newsletter that guides you through the biggest stories of the day, helps you discover new ideas, and recommends the best in culture. Sign up for it here.

Earlier this week, Secretary of Defense Pete Hegseth sat down with Dario Amodei, the CEO of the leading AI firm Anthropic, for a conversation about ethics. The Pentagon had been using the company’s flagship product, Claude, for months as part of a $200 million contract—the AI had even reportedly played a role in the January mission to capture Venezuelan President Nicolás Maduro—but Hegseth wasn’t satisfied. There were certain things Claude just wouldn’t do.

That’s because Anthropic had instilled in it certain restrictions. The Pentagon’s version of Claude could not be used to facilitate the mass surveillance of Americans, nor could it be used in fully autonomous weaponry—situations where computers, rather than humans, make the final decision about whom to kill. According to a source familiar with this week’s meeting, Hegseth made clear that if Anthropic did not eliminate those two guardrails by Friday afternoon, two things could happen: The Department of Defense could use the Defense Production Act, a Cold War–era law, to essentially commandeer a more permissive iteration of the AI, or it could label Anthropic a “supply-chain risk,” meaning that anyone doing business with the U.S. military would be forbidden from associating with the company. (This penalty is typically reserved for foreign firms such as China’s Huawei and ZTE.)

This evening, Anthropic said in a public statement that it “cannot in good conscience accede” to the Pentagon’s request. What happens next could mark a crucial moment for the company, and for the American government’s approach to AI regulation more broadly. In refusing to bow to an administration that has been intent on bullying private companies into submission, Amodei and his team are taking a bold stand on ethical grounds, and risking a censure that could erode Anthropic’s long-term viability.

During the first year of Donald Trump’s second term, the White House had a more relaxed attitude toward AI regulation; an AI Action Plan from July stresses that the administration will “continue to reject radical climate dogma and bureaucratic red tape” to encourage innovation. Hegseth is now, in effect, threatening to partially nationalize one of the biggest AI players in the private sector—and force the company to go against its own principles. “This is the most aggressive AI regulatory move I have ever seen, by any government anywhere in the world,” Dean Ball, who helped write some of the Trump administration’s AI policies, told me.

The Pentagon has already reportedly been reaching out to other defense contractors to see if they’re connected to Anthropic, a sign that officials are preparing to designate the company a supply-chain risk. Now that Anthropic has defied Hegseth, the contract is likely in peril. The firm doesn’t really need the $200 million—it reportedly pulls in $14 billion a year, and it said it raised $30 billion in venture capital just weeks ago—but being blacklisted could affect its ability to scale up in the future. (“We are not walking away from negotiations,” an Anthropic spokesperson told The Atlantic in a statement. “We continue to engage in good faith with the Department on a way forward.” The Pentagon told CBS on Tuesday that “this has nothing to do with mass surveillance and autonomous weapons being used,” and that “the Pentagon has only given out lawful orders.”)

As AI firms around the world jockey for dominance, Anthropic has distinguished itself by emphasizing safety. OpenAI’s ChatGPT has been criticized for playing up some users’ delusions, leading to cases of “AI psychosis,” and just last month, xAI’s Grok was spinning up nearly nude images of almost anyone without consent. (xAI has said it is restricting Grok from generating these kinds of images, and OpenAI has said it is working to make ChatGPT better support people in distress.) Meanwhile, Anthropic’s consumer-facing chatbot doesn’t generate images at all. By refusing to cave to government pressure, it may have just averted another crisis: a major public backlash from consumers, some of whom see the company as a more principled player in the AI wars. Anthropic recently faced some pushback over changing its policies—Time reported on Tuesday that, in a seemingly unrelated move, the company dropped a core safety pledge concerning its broader approach to AI development.

Weeks before Hegseth issued his ultimatum, Amodei opined on his website about the risks involved with precisely the two guardrails the Pentagon is targeting. “In some cases,” he wrote, “large-scale surveillance with powerful AI, mass propaganda with powerful AI, and certain types of offensive uses of fully autonomous weapons should be considered crimes against humanity.”

The Trump administration doesn’t seem to know what it wants from AI. On one hand, it’s deeply suspicious of certain kinds of models. The White House’s designated AI czar, David Sacks, has criticized Anthropic for “running a sophisticated regulatory capture strategy based on fear-mongering,” essentially accusing the firm of pushing for unnecessary, innovation-squashing limitations and jeopardizing the future of American tech. The administration has also criticized AI bots for sometimes spitting out “woke” replies. On the other hand, Claude is apparently valuable enough that it’s on the cusp of being commandeered by the federal government.

Ball told me that the Department of Defense may have a point—that there’s an argument to be made about reining in Silicon Valley’s control over the government’s use of new technologies. Although the concentration of power among the technocratic elite is certainly troubling, Hegseth’s proposed punishments for Anthropic are misguided and plainly contradictory. The Defense Production Act does allow the government to intervene in domestic industries in the interest of national security (the Biden administration invoked it in a 2023 executive order on AI regulation). But is Claude so important for U.S. national security that the government needs to compel Anthropic to create an untethered new version? Or is it so dangerous that it needs to be shunned—not just by the Pentagon, but by any business connected to the military? A third, even-more-bewildering option is also on the table: Hegseth could decide to simultaneously commission a modified Claude and sanction the company that stewards it.

All of this ignores a much simpler solution: Hegseth could just start a partnership with a different firm. It’s a good time for his department to be in business with tech, since the mood of Silicon Valley has lately become much more Pentagon-friendly. Palantir’s Alex Karp has touted that his software is used “to scare our enemies and, on occasion, kill them”; the technologist and entrepreneur Palmer Luckey is already building autonomous weaponry for the government; and Andreessen Horowitz’s American Dynamism funds are helping funnel the country’s top young minds into defense tech. But rather than look elsewhere, Hegseth is threatening to crush Anthropic—implying that if he can’t control Claude, no one can.

As the defense secretary looks to make an example of the company, he’s taking a cue from Trump, who has used legal and extralegal pressure to effectively force other private businesses, particularly big law firms, banks, and universities, into submission. These acts of coercion have the potential to reshape American capitalism: We are beginning to see a market where winners and losers are decided less by the quality of their products and more by their seeming fealty to the White House. How that will affect the success of businesses and the economy is uncertain.

The Pentagon created this ultimatum precisely because it understands Anthropic’s world-altering potential. The administration just can’t decide if it’s an asset, a liability, or both.

Today’s News

  1. A Columbia University student detained this morning by federal immigration agents has been released. The arresting officers reportedly misrepresented themselves as looking for a missing child in order to gain entry to the student’s residential building.
  2. Hillary Clinton told the House Oversight Committee that she has no new information about Jeffrey Epstein and maintained that she had no knowledge of his crimes; she criticized congressional Republicans’ handling of the probe as partisan. Bill Clinton is scheduled to give his deposition tomorrow.
  3. Cuban forces killed four people and wounded six when they fired on a Florida-registered speedboat that, according to Cuban authorities, entered the country’s waters yesterday and opened fire on a patrol vessel. Cuba claims that the U.S.-based passengers were armed and planning a “terrorist” infiltration.

Evening Read

Illustration by The Atlantic

This Looks Like an Insider Bet on Aliens

By Ross Andersen

On Monday night, someone placed a peculiar bet on the prediction market Kalshi. At 7:45 p.m. eastern time, a single trader put down nearly $100,000 on the claim that, by the end of December, the Trump administration will confirm that alien life or technology exists elsewhere in our universe. According to The Atlantic’s review of Kalshi’s trading data, about 35 minutes after this bet was executed, it was followed by another that was almost twice as large (possibly from the same person). These were market-moving events: For one brief stretch, the market appeared to think that there was at least a one-in-three chance that the U.S. government will announce the existence of aliens this year. Perhaps this was just some overexcited UFO diehard with a hunch and money to burn. Or maybe, as some observers quickly noted, it was a trader with inside knowledge.

Read the full article.


Culture Break

Illustration by Alisa Gao / The Atlantic

Explore. When did literature get less dirty? A puritan strain is manifesting in realist novels as a marked absence of straight sex, Lily Meyer writes.

Read. Casey Schwartz on two new books that demonstrate how Martha Gellhorn, Janet Flanner, and other female reporters took journalism in directions that men could not.

Play our daily crossword.

Rafaela Jinich contributed to this newsletter.

When you buy a book using a link in this newsletter, we receive a commission. Thank you for supporting The Atlantic.
