The investment management firm and the tech giant joined forces in 2024 to fund data centers behind the AI boom and are now closer to their $30 billion goal, the report said. The partnership also includes Nvidia, xAI and the United Arab Emirates-affiliated investment group MGX.
The effort “continues to attract significant capital,” BlackRock CEO Larry Fink told analysts during the call, per the report.
The partnership is aiming to pull in $30 billion of private equity capital and then mobilize up to $100 billion in investment potential, including debt financing for infrastructure projects, PYMNTS reported in September 2024.
“We are committed to ensuring AI helps advance innovation and drives growth across every sector of the economy,” Microsoft Chairman and CEO Satya Nadella said at the time. “The Global AI Infrastructure Investment Partnership will help us deliver on this vision, as we bring together financial and industry leaders to build the infrastructure of the future and power it in a sustainable way.”
The group made a $40 billion deal in October to acquire Aligned Data Centers from Macquarie Asset Management, which called it the largest data center acquisition in history.
Meanwhile, new research suggests that AI may no longer need massive data centers to scale.
A study from Switzerland-based tech university EPFL found that while frontier model training is still computationally intensive, many operational AI systems can be deployed without requiring centralized hyperscale facilities.
Instead, these systems can distribute workloads across existing machines, regional servers or edge environments, cutting back on the reliance on large, centralized clusters.
“The research highlights a growing mismatch between AI infrastructure and real-world enterprise use cases,” PYMNTS reported Friday (Jan. 9). “These systems often rely on smaller models, repeated inference and localized data rather than continuous access to massive, centralized models.”
Nvidia found that small language models can carry out 70% to 80% of enterprise tasks, leaving the most complex reasoning to large-scale systems, a structure that is becoming the most cost-effective way to operationalize AI.