
OpenAI sweeps in to ink deal with Pentagon as Anthropic is designated a ‘supply chain risk’—an unprecedented action likely to crimp its growth

OpenAI announced late Friday it reached a deal for the Pentagon to use its AI models in classified systems, just hours after the U.S. government designated OpenAI arch-rival Anthropic a “supply chain risk” in a move that threatens to deal a serious blow to Anthropic’s business.

Legal and policy experts said the government’s unprecedented decision raises profound questions about the relationship between government and business in the U.S. It is the first time the U.S. has ever designated an American company a supply chain risk, and the first time the designation has been used in apparent retaliation for a business declining to agree to certain contractual terms. Anthropic said in a statement Friday that it would take legal action to try to overturn the Pentagon’s designation.

In a statement announcing its deal, OpenAI CEO Sam Altman said that its agreement with the Pentagon contains the same two limitations on how the military can use its technology that Anthropic had been insisting on and which the government has said it could not accept.

But OpenAI seems to have sought to enshrine these in the agreement in a different way than Anthropic. While Anthropic tried to have the limits spelled out explicitly in the contract, OpenAI agreed that the Pentagon could use its tech for “any lawful purpose,” while Altman also says of the limitations that OpenAI “put them into our agreement.”

It is unclear exactly how both these things could be true or how the limitations are stated in the agreement. But it may simply be that the contract language highlights that current U.S. law prohibits the Pentagon from deploying AI for mass surveillance of Americans and that current U.S. military policy states that humans must retain “appropriate levels of human judgment” over the use of lethal force.

OpenAI also said that the Pentagon agreed that the company could build technical solutions into its AI models intended to prevent them from being used for either mass surveillance of U.S. citizens or deployed in lethal autonomous weapons.

“We are asking the [Department of War] to offer these same terms to all AI companies, which in our opinion we think everyone should be willing to accept,” Altman said. Some commentators interpreted Altman’s remark as a veiled criticism of Anthropic, which had not agreed to these terms previously and instead insisted on explicit contractual restrictions on how its models could be used.

Altman had previously publicly supported Anthropic’s position on the limitations it was seeking. Numerous OpenAI employees also signed an open letter supporting Anthropic CEO Dario Amodei’s insistence that its models not be used for mass surveillance or autonomous weapons.

The potential impact of a ‘supply chain risk’ designation

The extent of the damage the “supply chain risk” designation will do to Anthropic’s business remained unclear over the weekend. Anthropic had a $200 million contract with the Pentagon that has now been cancelled. But that is not a huge blow to a company that is reportedly on track to generate at least $18 billion in revenue this year.

Instead, the larger concern is the extent to which other enterprises will have to stop using Anthropic’s technology. President Trump said on Truth Social that all federal departments were being ordered to stop using Anthropic’s AI immediately, but with a six-month phase-in of the order to prevent disruption. Total federal technology spending is about $140 billion per year, but the amount the U.S. government currently spends on AI is a fraction of that.

The greatest danger, though, is posed by how Pete Hegseth, Secretary of War, has interpreted the supply chain risk designation and its impact. Hegseth said in a social media post that “effective immediately, no contractor, supplier, or partner that does business with the United States military may conduct any commercial activity with Anthropic.”

If that interpretation stands, it would do potentially catastrophic damage to Anthropic’s business, because many of the large enterprises that have been rapidly adopting Anthropic’s Claude models for software coding and other use cases also do some business with the U.S. military. It might also mean that companies such as Amazon, Google, and Nvidia that have invested billions of dollars into Anthropic would have to divest from the company, potentially leaving it with a large funding hole and making it difficult to raise further funds from U.S. investors.

Anthropic earlier this month announced it had closed a new $30 billion venture capital funding round that valued the company at $380 billion. It has reportedly been hiring financial and legal advisors for a potential IPO that could come late this year or early next. But its fight with the Pentagon now casts a pall over that prospect.

Many legal analysts and AI policy experts questioned Hegseth’s broad interpretation of the “supply chain risk” designation. Peter Harrell, a former Biden administration National Security Council official and a visiting scholar at Georgetown University Law School, posted on X that DoW’s supply chain risk designation applies only to work on Department of War contracts. “DoW can’t, legally, tell its contractors ‘don’t use Anthropic even in your private contracts,’” Harrell said.

Dean Ball, a senior fellow at the Foundation for American Innovation and a former AI policy advisor to the Trump administration, said in a post on X that Hegseth’s interpretation of the supply chain risk designation was “almost surely illegal” and amounted to “attempted corporate murder.” He said Hegseth’s actions—which he called “a psychotic power grab”—sent a terrible message to any business about whether it should ever risk doing business with the U.S. government.

Several legal experts noted that even a more narrowly interpreted decision to designate Anthropic a supply chain risk may not survive a legal challenge. Charlie Bullock, a senior research fellow at the Institute for Law & AI, told Wired that the government cannot make the designation without having completed a risk assessment—something it is unclear the government conducted—and without notifying Congress prior to taking action, something that also does not appear to have occurred.

Amos Toh, a senior counsel at the Brennan Center for Justice at New York University, was also among several legal experts who said that the supply chain risk designation requires the government to prove that there is a risk of sabotage, subversion, or manipulation of operations by an adversary. “It is not at all clear how adversaries could exploit Anthropic’s usage restrictions on Claude to sabotage military systems,” Toh told the defense news site DefenseScoop. The statute also requires that the Pentagon have exhausted any alternative, less intrusive courses of action to mitigate the risk prior to making the supply chain risk finding. Toh questioned whether the Pentagon could reasonably claim to have made a “good faith effort” to pursue less intrusive measures, given how quickly the Anthropic dispute escalated over the past few days.

Even if Anthropic ultimately prevails in challenging the supply chain risk designation in court, the damage to its business may be done. “It will take years to resolve in court. And in the meantime, every general counsel at every Fortune 500 company with any Pentagon exposure is going to ask one question: is using Claude worth the risk?” Shenaka Anslem Perera, an independent analyst with a large social media following, posted on X.

This story was originally featured on Fortune.com
