
Amazon’s OpenAI gambit signals a new phase in the cloud wars — one where exclusivity no longer applies

Amazon Web Services on Tuesday launched one of the most consequential enterprise AI plays in the company's 20-year history, simultaneously bringing OpenAI's most powerful models to its Bedrock platform, unveiling a new agentic developer framework, releasing a desktop AI productivity tool called Amazon Quick, and expanding its Amazon Connect service from a single contact-center product into a family of four agentic AI solutions targeting supply chains, hiring, healthcare, and customer experience.

The announcements, made at a live event in San Francisco titled "What's Next with AWS," landed just 24 hours after OpenAI and Microsoft publicly restructured their exclusive cloud partnership — a move that, for the first time, freed OpenAI to distribute all of its products across rival cloud providers. AWS CEO Matt Garman called it "a huge partnership" and said customers have been asking for OpenAI models inside AWS "from the very early days."

The timing was no accident. Amazon CEO Andy Jassy had flagged the Microsoft-OpenAI restructuring as "very interesting" in a post on X the day prior, promising more details on Tuesday. What followed was a sweeping set of launches that together represent AWS's bid to become the definitive infrastructure layer for the agentic AI era — one where intelligent software agents don't just answer questions but take autonomous action inside enterprise workflows.

OpenAI's most capable models arrive on Amazon Bedrock for the first time, reshaping the cloud AI marketplace

The centerpiece announcement: OpenAI's latest models are now available through Amazon Bedrock in limited preview, with general availability expected within weeks. AWS confirmed that GPT-5.4 is available immediately in limited preview, with GPT-5.5 arriving shortly thereafter.

In an exclusive interview with VentureBeat at the event, Anthony Liguori, Vice President and Distinguished Engineer at AWS, described the significance of the moment. "We announced a partnership about eight weeks ago centered around this idea of the stateful runtime environment, the SRE APIs," Liguori said. "However, today we announced the availability of all of OpenAI's frontier models in Amazon Bedrock, available via both [the SRE APIs and] the stateless APIs — these are the APIs that are commonly used, like chat completions and responses."

Liguori characterized the stateless API availability as particularly critical because it removes migration friction. "Customers can take their existing workloads today and just start using AWS right off the bat," he said. "They don't have to write any new software, develop any new things. I think that's one of the most exciting announcements that came out today."

The integration means AWS customers can now evaluate and deploy OpenAI models alongside offerings from Anthropic, Meta, Mistral, Cohere, and Amazon's own models — all through Bedrock's unified security, governance, and cost controls. For enterprise procurement teams, this collapses what had been a fragmented multi-vendor landscape into a single pane of glass.
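The stateless format Liguori references is the chat-completions wire shape most existing OpenAI workloads already emit, which is why migration friction is low. A minimal sketch of that request body, with an illustrative model identifier (not a documented Bedrock value):

```python
import json

def chat_completion_body(model: str, prompt: str) -> str:
    """Build a stateless, OpenAI-style chat-completions request body.

    This is the widely used wire format the stateless APIs accept; the
    model ID passed in by callers is illustrative, not a documented
    Bedrock identifier.
    """
    return json.dumps({
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": 256,
    })

# Hypothetical model ID for illustration only.
payload = chat_completion_body("openai.gpt-5.4", "Summarize the launch.")
```

Because the shape is unchanged, a workload producing this payload today could in principle be repointed at a compatible endpoint without rewriting application code, which is the migration story AWS is emphasizing.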

How a $50 billion Amazon investment and a messy Microsoft breakup cleared the way for Tuesday's deal

The path to Tuesday's announcement was anything but smooth. As TechCrunch reported, OpenAI's earlier $50 billion deal with Amazon, announced in February, had created a legal tangle with Microsoft. Under the original Microsoft-OpenAI agreement, Microsoft retained exclusive rights to OpenAI products accessed through APIs, which appeared to conflict directly with OpenAI's promise to give AWS exclusive hosting rights for its new Frontier agent-building tool.

Microsoft had publicly pushed back at the time, stating that "Azure remains the exclusive cloud provider of stateless OpenAI APIs." The Financial Times reported that Microsoft even contemplated legal action. Monday's restructured deal — which replaced Microsoft's open-ended exclusivity with a nonexclusive license running through 2032 — swept those legal obstacles aside.

For AWS, the resolution means its multi-billion-dollar investment in OpenAI can now fully bear fruit. As CNBC reported, OpenAI's revenue chief Denise Dresser had told employees in a memo that the Microsoft relationship "has also limited our ability to meet enterprises where they are — for many that's Bedrock." At the San Francisco event, Dresser framed the moment as a turning point. "They're no longer in the mindset of experimentation and pilots," she said of enterprise customers. "They really want to go full enterprise wide, and they understand that to do that, they need to have powerful models. But even more importantly, they want those models in a trusted environment."

OpenAI CEO Sam Altman, who was unable to attend in person due to his ongoing court case against Elon Musk across the Bay Bridge in Oakland, sent a recorded video message. "We are co-developing an agent platform from the ground up, deeply integrated with AWS services and powered by OpenAI's most advanced models and tools," Altman said, "so that customers can build and run powerful agents in their own environment without worrying about the underlying plumbing."

Inside Bedrock managed agents, the reinforcement learning-trained 'harness' that AWS says will define the agentic era

Beyond raw model access, AWS launched Amazon Bedrock Managed Agents powered by OpenAI — a system that combines OpenAI's frontier models with its proprietary "harness," the agentic execution framework that powers products like Codex. This is where Liguori's technical analysis was most revealing.

He explained that the harness concept represents a shift in how models are trained and deployed for agentic work. "When you think about an agentic platform, there's really two components," Liguori told VentureBeat. "One is the harness — the actual logic that will execute tool calls for the model, determine when to compact the context, all of those sorts of things — and then the model itself."

Critically, Liguori argued, the best agentic performance comes when models are trained specifically against their harness through reinforcement learning — not merely prompted to use tools at inference time. "You can give a model a whole lot of instructions and a set of tools, and it will be able to use it most of the time," he said. "But when you really train the model on a specific set of tools, a specific style of operations, it's just like drilling plays over and over again — the model builds muscle memory for using that harness."

The football analogy is instructive. Where general-purpose models are like versatile athletes who can adapt to any playbook, harness-trained models are like championship teams that have run the same formations thousands of times until execution becomes instinctive. For enterprises deploying agents in high-stakes production environments — managing financial transactions, orchestrating supply chains, or processing sensitive healthcare data — that reliability gap matters enormously.

Bedrock Managed Agents consists of three components: a runtime layer for configuring skills, memory policies, and tool access; an environment layer where the agent lives (deployable on Fargate or other AWS compute); and an inference API for interacting with the agent. The system integrates deeply with AWS's identity and access management, VPC networking, and CloudTrail auditing — meaning every action an agent takes is logged and governed by existing enterprise security policies.
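The harness concept Liguori describes can be sketched as a simple loop: the model proposes an action, the harness executes any tool call and feeds the result back into context, and the loop ends when the model produces a final answer. The shapes below are illustrative stand-ins, not AWS or OpenAI APIs:

```python
# Minimal sketch of a harness: the logic that executes tool calls for a
# model and manages its context. Stand-in interfaces, assumed for
# illustration only.
from typing import Any, Callable

def run_agent(model: Callable[[list], dict],
              tools: dict[str, Callable[..., Any]],
              max_steps: int = 8) -> str:
    context: list = []                       # accumulated tool results
    for _ in range(max_steps):
        step = model(context)                # model proposes the next action
        if step["type"] == "final":          # model is done: return its answer
            return step["content"]
        result = tools[step["tool"]](**step["args"])  # harness runs the tool
        context.append({"tool": step["tool"], "result": result})
    raise RuntimeError("agent exceeded its step budget")

# Stub model standing in for a trained model: call a tool once, then
# answer from the tool's result.
def stub_model(context: list) -> dict:
    if not context:
        return {"type": "tool", "tool": "add", "args": {"a": 2, "b": 3}}
    return {"type": "final", "content": f"sum is {context[-1]['result']}"}

answer = run_agent(stub_model, {"add": lambda a, b: a + b})
```

Liguori's point is that a model reinforcement-trained against one specific harness of this kind, rather than merely prompted with tool descriptions at inference time, executes the loop far more reliably.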

AWS makes its boldest security claim yet: zero human access to inference machines running OpenAI's models

Liguori made what may be his most striking claim when discussing why enterprises should trust AWS over on-premises alternatives or smaller cloud providers. "With Bedrock, the system that we're using to host the GPT-5.4 models, that whole environment is zero operator access," he told VentureBeat. "There's no human that could ever log into one of those machines, so your inference data is never able to be accessed by a human."

He pointed to AWS's custom silicon — Graviton processors and Nitro security chips — as the foundation for this claim. "When you look at one of our servers, either compute servers or the servers we're using for Gen AI, the only thing that you can buy off the shelf is the memory modules. Everything else is either custom boards or even custom silicon."

This argument is designed to counter a growing narrative from what the industry calls "neo-clouds" — smaller providers that offer on-premises model hosting with tighter physical security controls. Liguori flipped that argument on its head: "You're actually way more secure in the cloud because we have built a platform with such strong physical securities... If you were to try to stand up your own inference system today, you'd probably be running open source software on just Linux."

It's a bold claim, and one that enterprise CISOs will undoubtedly scrutinize. But it underscores AWS's conviction that the agentic era — where AI agents access source code, PII data, and critical business systems — demands infrastructure security guarantees that go far beyond what most organizations can build independently.

Codex's 4 million weekly users could soon multiply as OpenAI's coding agent arrives on AWS

OpenAI's Codex coding agent also arrived on Bedrock in limited preview. Dresser shared that Codex has been growing at a blistering pace, expanding "from 3 million weekly active users to 4 million in two weeks." The tool has evolved beyond simple code generation into a full agentic software development lifecycle platform.

For Liguori, who described himself as "10 to 20 times more productive" as an engineer thanks to tools like Codex, bringing this capability into AWS represents the bridge between individual developer productivity and enterprise-scale deployment. "Most developers today are using these OpenAI models on their laptops," he said. "We haven't seen that happen yet in the rest of the industry, and with Bedrock Managed Agents, we think we have a way for enterprises to deploy agents in a means that meets their compliance requirements."

The gap Liguori is describing — between the solo developer experience and enterprise-wide adoption — is arguably the central challenge of the current AI moment. Individual engineers can achieve extraordinary productivity gains with agentic coding tools. But scaling that to thousands of developers across a Fortune 500 company, with proper governance, security, and auditability, requires platform-level infrastructure. That's the market AWS is targeting.

Liguori saw the near-term potential in even more immediate terms. He described leading a team of about 20 engineers who share a common codebase of skills and MCP tools. "That has been an amazingly powerful thing, because we're all able to build on top of each other as we learn how to use these models," he said. "Where I've run into a hurdle is there's a lot of stuff I'd like to share with our finance team... and I can't really ask them to clone a Git repo and build it from a Git repo." Bedrock Managed Agents, he argued, will let teams create hosted agents that non-technical colleagues can access — taking agentic development from a developer-only practice to an enterprise-wide capability within the next six months.

Amazon Quick Desktop aims to be the agentic AI assistant that finally works for non-developers

While the OpenAI partnership dominated headlines, AWS also launched Amazon Quick Desktop — a new desktop application designed to bring agentic AI to knowledge workers who aren't developers. Liguori framed the product as addressing a critical gap. "A lot of these agentic tools have primarily targeted developers," he said. "Quick Desktop is a really great tool if you are a knowledge worker that is not a developer... I think it's been underserved for the non-developer knowledge workers."

Quick Desktop integrates with a user's local files, calendar, email, Slack, and enterprise applications — building what AWS calls a "Knowledge Graph" that maps relationships between people, projects, decisions, and actions. The system connects natively with Google Workspace, Microsoft 365, Zoom, and Salesforce. Unlike other AI productivity tools, Quick doesn't wait for prompts. It proactively surfaces what matters — unanswered emails, deals needing updates, documents awaiting review — and can take action like scheduling meetings, drafting emails, or updating Jira tickets.
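A knowledge graph of the kind described can be modeled as typed relationships among people, projects, documents, and actions. The sketch below illustrates the concept only; it makes no claim about AWS's actual implementation:

```python
# Illustrative data structure for a workplace knowledge graph: directed,
# labeled edges between entities. Entity and relation names are invented
# for the example.
from collections import defaultdict

class KnowledgeGraph:
    def __init__(self) -> None:
        self._edges = defaultdict(list)          # node -> [(relation, node)]

    def relate(self, src: str, relation: str, dst: str) -> None:
        self._edges[src].append((relation, dst))

    def neighbors(self, node: str, relation=None) -> list:
        """Nodes reachable from `node`, optionally filtered by relation."""
        return [dst for rel, dst in self._edges[node]
                if relation is None or rel == relation]

kg = KnowledgeGraph()
kg.relate("alice", "owns", "q3-launch-plan")       # person -> document
kg.relate("q3-launch-plan", "awaiting", "review")  # document -> action
kg.relate("alice", "attends", "launch-sync")       # person -> meeting
```

Proactive surfacing then becomes graph traversal: a query such as "documents Alice owns that are awaiting review" walks two typed edges rather than searching raw text.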

Garman, who said he had been using the desktop app for several weeks, called it "by far the most effective tool" among AI productivity products he has tested. "If you think about what we've done with Quick — combine all of your sources of data inside of the enterprise — but then we also saw the power of having access to a local desktop and being able to operate with your local files and your local email and your local Slack... but people were worried about security, appropriately so," Garman said. "What we're doing here is combining a bunch of those things together with Quick to give you the best of all of those worlds."

The product is available in preview today, with no AWS account required — users can sign up with just an email address. Customers including BMW, 3M, Mondelēz, Southwest Airlines, and the NFL are already using it, with some reporting production time reductions of nearly 80% and customer issue processing cut by more than 50%.

Amazon Connect becomes a family of four as AWS bets that 'agentic teammates' will transform supply chains, hiring, and healthcare

Perhaps the most ambitious long-term bet announced Tuesday was the expansion of Amazon Connect from a single contact-center product — one that reached over $1 billion in revenue last year and processes 20 million interactions daily — into a family of four agentic AI solutions.

The new lineup includes Amazon Connect Decisions, an agentic supply chain planning tool built on more than 25 specialized supply chain tools and 30 years of Amazon operational science, including one of Amazon's SCOT (Supply Chain Optimization Technologies) foundation models. Amazon Connect Talent is a high-volume hiring platform inspired by Amazon's experience hiring 250,000 seasonal employees during peak periods, using AI agents to conduct voice interviews around the clock and present recruiters with anonymized, skills-based scoring. Amazon Connect Customer AI is the renamed and enhanced version of the original contact-center service. And Amazon Connect Health covers the patient journey from appointment scheduling through clinical encounters, including ambient documentation, billing code suggestions, and post-visit summaries drawn from Amazon's experience with One Medical and Amazon Pharmacy.

Colleen Aubrey, who leads applied AI solutions at AWS and previously co-founded Amazon's advertising business, introduced a new design philosophy underlying all four products: "humorphism." Where skeuomorphism translated physical objects into digital metaphors — desks to desktops, files to folders — humorphism translates human interaction dynamics into AI agent behavior. "If we're building products that at the heart of which is an agentic teammate, then how should those teammates interact with you?" Aubrey asked. The philosophy manifests in specific design choices: Connect Decisions agents ask planners why they made manual adjustments and apply those insights across similar products. Connect Talent agents adapt follow-up questions based on candidate responses. Connect Health agents trace every clinical insight back to source data so physicians can verify AI-generated documentation.

What AWS's four-layer strategy reveals about where the real value in enterprise AI will be captured

Taken together, Tuesday's announcements reveal a coherent strategy operating across four distinct layers: custom infrastructure (Graviton, Trainium, zero-operator-access security), model access (Bedrock as a model marketplace with unified APIs), an agentic platform (Bedrock Managed Agents and AgentCore for building and governing agents), and purpose-built applications (Quick for individual productivity, Connect for vertical business operations).

This layered approach addresses a fundamental tension in the enterprise AI market. Companies want choice at the model layer but integration at the platform layer and specificity at the application layer. By offering all three through a single security and governance framework, AWS is betting it can capture value across the entire stack — a strategy that reshapes competitive dynamics for Microsoft, Google Cloud, and the growing constellation of smaller AI infrastructure providers.

Garman pushed back on the "SaaSpocalypse" narrative that agentic AI will destroy incumbent enterprise software companies. "The incumbent providers today have such a huge advantage," he said. "They have deep domain expertise... a large customer set with all of their data." He pointed to Salesforce's recent headless API offering as an example of incumbents adapting smartly. But he also drew an explicit parallel to the early days of cloud computing, when customers would simply replicate their on-premises data centers in the cloud rather than reimagine what was possible. "You see that today with how people are thinking about AI and agents," Garman said. "They're like, 'I have this business process, I'm gonna have agents do the exact same thing that humans do.' It kind of works... but it doesn't give you that transformational change."

He pointed to Amazon's own Prime Video team as proof of what that change looks like in practice. The team used agentic tools to rebuild a partner payment system that was projected to take two years — completing it in roughly two quarters with a handful of people, while simultaneously improving the system for customers, for Amazon, and for the partners who get paid through it.

The enterprise AI arms race enters a new phase as model access becomes table stakes and the platform war begins

For enterprises evaluating their AI strategies, Tuesday's announcements simplify one decision — OpenAI models are now available where most of them already run production workloads — while complicating another. With model access increasingly commoditized across cloud providers, the real differentiator becomes the platform layer: where agents are built, governed, deployed, and trusted to take consequential actions. That's the battleground AWS is staking out, and it's the same ground Microsoft, Google, Salesforce, and a growing number of startups intend to contest.

Liguori sees the transformation accelerating fast. "I think what we're going to see in the next six months is a lot of this agentic stuff going from developer only to being able to be consumed by a larger number of folks within an enterprise," he told VentureBeat. Liguori, who led the eight-week technical push to bring OpenAI's models to Bedrock, credits tools like these for the tenfold-or-more jump in his own engineering productivity over the past year. But when asked what excites him most about what comes next, he didn't talk about models or infrastructure. He talked about what happens when that same multiplier reaches the finance team, the product managers, the supply chain planners — the millions of knowledge workers who have been watching the agentic revolution from the sidelines.

"We had nothing eight weeks ago," he said, "and now we're here." If the next eight weeks move as fast, the sidelines may not exist for much longer.
