Meta’s AI Future Is Personal, Starting With You
On Wednesday (April 8), Meta released Muse Spark, the first artificial intelligence (AI) model out of Meta Superintelligence Labs.
Built over the past nine months under Chief AI Officer Alexandr Wang, the new model shows performance competitive with systems from OpenAI, Google and Anthropic, according to CNBC.
The model now powers Meta’s digital assistant in the standalone Meta AI app and desktop website, with a rollout to Facebook, Instagram, WhatsApp, Messenger and Ray-Ban Meta AI glasses planned for the coming weeks.
Meta spent $14.3 billion for a 49% stake in Scale AI and brought in its founder, Wang, as the company’s first chief AI officer after Llama 4, its previous model, fell short of OpenAI’s ChatGPT and Anthropic’s Claude.
Multimodal by Design
According to Meta, Muse Spark is a natively multimodal reasoning model that supports tool use, visual chain of thought and multi-agent orchestration.
Most frontier models started as text engines and later bolted on vision. Meta built this model to reason across image, video and text from the ground up, allowing it to act as a virtual assistant and help with people’s daily activities. For example, the model can analyze a video of a user doing push-ups and offer feedback on their form. Muse Spark can also study a photo of a fridge’s contents and suggest dishes for dinner.
Meta worked with over 1,000 physicians to curate training data for more factual and comprehensive health responses, the company said. The model can read a food photo to determine its nutritional content and map the muscles a workout targets.
For harder tasks, a Contemplating mode runs multiple agents in parallel. In that mode, the model scored 58% on Humanity’s Last Exam, putting it alongside Google’s Gemini Deep Think and OpenAI’s GPT Pro on complex reasoning.
Meta also flagged gaps in long-horizon agentic tasks and coding. For dinner and pushups, those gaps do not matter. For enterprise software decisions, they do.
The Data Moat No One Else Has
The biggest advantage Meta brings is not the model. It is distribution and data. Logging into the Meta AI app connects a user’s Facebook and Instagram accounts automatically.
As TechCrunch reports, Meta does not explicitly say that personal information from those accounts feeds into the AI, but the company trains on public user data and has positioned Muse Spark as a personal superintelligence product. For anyone who joined Facebook in 2010, that is 15 years of behavior, preferences, and social signals the model draws on.
No other AI company holds that position. OpenAI knows what users have asked previously. Google knows what they search. Meta knows what they buy, who they follow and what they scroll past. That data also strengthens Meta’s core business: more context leads to better targeting, and better targeting drives higher-value advertising.
Meta is also embedding commerce directly into the experience. Muse Spark can recommend products, track prices and surface alternatives within social feeds, turning the AI into a shopping assistant across its platforms, according to Axios. Mizuho Securities said usage growth via Shopping mode could drive significant monetization through ad targeting and search.
A Closed Model, a New Revenue Line
With Muse Spark, Meta is moving away from its open-source strategy. Its previous model, Llama, reached millions of developers but generated little direct revenue. This time, Meta is keeping control. Muse Spark is a closed model, and the company has said it will not release its design or code publicly. But Meta said it plans to offer third-party developers access to the underlying technology through an application programming interface.
The company said AI capital expenditures in 2026 will run between $115 billion and $135 billion, nearly twice last year’s figure. That level of spending warrants a return the open-source playbook never produced.
The post Meta’s AI Future Is Personal, Starting With You appeared first on PYMNTS.com.