Exclusive: Anthropic left details of an unreleased model and an invite-only CEO retreat sitting in an unsecured data trove in a significant security lapse

AI company Anthropic has inadvertently revealed details of an upcoming model release, an exclusive CEO event, and other internal data, including images and PDFs, in what appears to be a significant security lapse. 

The not-yet-public information was made accessible via the company’s content management system (CMS), which is used by Anthropic to publish information to sections of the company’s website.

In total, this publicly accessible data cache appeared to contain close to 3,000 assets linked to Anthropic’s blog that had not previously been published to the company’s public-facing news or research sites, according to Alexandre Pauwels, a cybersecurity researcher at the University of Cambridge, whom Fortune asked to review the material.

After Fortune informed Anthropic of the issue on Thursday, the company took steps to secure the data so that it was no longer publicly accessible.

Prior to taking these measures, Anthropic stored all the content for its website—such as blog posts, images, and documents—in a central system that was accessible without a login. Anyone with technical knowledge could send requests to that public-facing system, asking it to return information about the files it contained.

While some of this content had not been published to Anthropic’s website, the underlying system would still return the digital assets it was storing to anyone who knew how to ask. This means unpublished material—including draft pages and internal assets—could be accessed directly.

The issue appears to stem from how the CMS used by Anthropic works. All assets—such as logos, graphics, or research papers—uploaded to the central data store were public by default unless explicitly set as private. The company appears to have forgotten to restrict access to some documents that were not supposed to be public, leaving the large cache of files available in the company’s public data lake, cybersecurity professionals who analyzed the data told Fortune. Several of the company’s assets could also be reached directly at public URLs.
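
This pattern is common to many headless CMS setups: asset records sit behind an API endpoint that answers unauthenticated requests, and anything not explicitly flagged private is returned. The sketch below illustrates that failure mode in general terms only; the host, endpoint path, query parameters, and field names are hypothetical and are not drawn from Anthropic’s actual configuration.

```python
# Minimal sketch of the failure mode described above: a CMS whose
# asset-listing endpoint is readable without authentication.
# Everything below (host, path, parameters, field names) is hypothetical.
import requests

CMS_BASE = "https://cms.example.com"        # hypothetical CMS host
LIST_ENDPOINT = f"{CMS_BASE}/api/assets"    # hypothetical listing endpoint


def list_assets(page_size: int = 100):
    """Page through every asset the CMS returns to an anonymous caller."""
    offset = 0
    while True:
        resp = requests.get(
            LIST_ENDPOINT,
            params={"limit": page_size, "offset": offset},
            timeout=10,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break
        for asset in batch:
            # Unless an asset is explicitly marked private, it comes back
            # here whether or not it was ever published to the public site.
            yield asset
        offset += page_size


if __name__ == "__main__":
    for asset in list_assets():
        print(asset.get("filename"), asset.get("url"))
```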

“An issue with one of our external CMS tools led to draft content being accessible,” an Anthropic spokesperson told Fortune. The spokesperson attributed the issue to “human error in the CMS configuration.”

There have been several high-profile cases lately of technology companies experiencing technical faults and snafus due to problems with AI-generated code or with AI agents. But Anthropic, which makes the popular Claude AI models and has boasted of automating much of its own internal software development using Claude-based AI coding agents, said AI was not at fault in this case.

The issue with its CMS was “unrelated to Claude, Cowork, or any Anthropic AI tools,” the Anthropic spokesperson said.

The company also sought to downplay the significance of some of the material that had been left unsecured. “These materials were early drafts of content considered for publication and did not involve our core infrastructure, AI systems, customer data, or security architecture,” the spokesperson said.

While many of the documents appear to be discarded or unused assets for past blog posts, like images, banners, and logos, some of the data appeared to detail sensitive information. 

The documents include details of upcoming product announcements, including information about an unreleased AI model that Anthropic said in the documents is the most capable model it has yet trained.

After being contacted by Fortune, the company acknowledged that it is developing, and testing with early-access customers, a new model that it said represented a “step change” in AI capabilities, with significantly better performance in “reasoning, coding, and cybersecurity” than prior Anthropic models.

The publicly accessible data also included information about an upcoming, invite-only retreat for the CEOs of large European companies being held in the U.K. that Anthropic CEO Dario Amodei is scheduled to attend. An Anthropic spokesperson said the retreat was “part of an ongoing series of events we’ve hosted over the past year” and that the company was “developing a general-purpose model with meaningful advances in reasoning, coding, and cybersecurity.”

Among the documents were also images that appear to be for internal use, including one image with a title that describes an employee’s “parental leave.” 

It’s not the first time a tech company has inadvertently exposed internal or pre-release assets by leaving them publicly accessible before official announcements.

Apple has twice leaked information through its own website—once in 2018, when upcoming iPhone names appeared in a publicly accessible sitemap file hours before launch, and again in late 2025, when a developer discovered that Apple had shipped its redesigned App Store with debugging files left active, making the site’s entire internal code readable to anyone with a browser.

Gaming companies like Epic Games and Nintendo have also seen pre-release images, in-game assets, and other media leak via content delivery network systems (CDNs) or staging servers, similar to the data lake Anthropic used in this case. Even larger firms such as Google have accidentally exposed internal documentation at public URLs, and data associated with Tesla vehicles has been exposed through misconfigured third‑party servers.

However, the problem is likely exacerbated by AI coding tools now readily available on the market—including Anthropic’s own Claude Code.  

These tools can automate crawling, pattern detection, and correlation of publicly accessible assets, making it far easier to discover this kind of content and lowering the barrier to entry for doing so. AI tools like Claude Code or Codex can also generate scripts or queries that scan entire datasets, rapidly identifying patterns or file naming conventions that a human might miss.
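
As a rough illustration of what that looks like in practice, the short script below groups a list of asset filenames by their naming convention—the kind of throwaway analysis an AI coding assistant can produce in seconds. The filenames are invented examples; none of them come from the exposed data.

```python
# Illustrative only: group asset filenames by naming convention so that
# recurring patterns (draft pages, dated banners, etc.) stand out.
# The filenames below are invented examples, not real leaked data.
import re
from collections import Counter

filenames = [
    "blog-hero-2024-03.png",
    "blog-hero-2024-07.png",
    "event-invite-draft-v2.pdf",
    "model-announcement-draft.pdf",
]


def naming_pattern(name: str) -> str:
    """Collapse runs of digits so similarly named files group together."""
    return re.sub(r"\d+", "<n>", name.lower())


counts = Counter(naming_pattern(n) for n in filenames)
for pattern, count in counts.most_common():
    print(f"{count:3d}  {pattern}")
```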

This story was originally featured on Fortune.com
