Elon Musk’s Grok is going to be used on the battlefield by the US military
Grok, which only weeks ago was accused of undressing people without their consent, will soon be the US Pentagon’s newest employee.
Elon Musk’s artificial intelligence company xAI has signed a deal with the military to use its AI model for classified activities.
Grok will assist defence officials in battlefield analysis, developing weapons and combing through intelligence, according to Axios.
All in all, department officials can use the chatbot for ‘all lawful use’.
The Pentagon and xAI have been approached for comment.
Jurgita Lapienytė, chief editor at Cybernews, told Metro that she worries what such deals mean for the future of AI, which most people know as that thing that writes their emails and makes funky images.
‘Currently, AI is not only untrustworthy but also very dangerous when unsupervised,’ she said.
‘In military operations, it can also be used to dehumanize operations by offering gamified experiences for officers and soldiers and shifting personal responsibility.’
AI-powered warfare is already here – Claude, a bot by Anthropic, was reportedly used by the Pentagon to conduct Saturday’s Iran strikes.
US military command used the tools to pick targets and carry out battlefield simulations, multiple reports said.
Yet only hours earlier, Trump had ordered agencies to cut ties with the ‘radical Left’ company.
For weeks, the company had been butting heads with the Pentagon over assurances that its tech would not be used for shady security purposes.
Defence Secretary Pete Hegseth called the company a ‘supply-chain risk to national security’, meaning no military contractor can work with Anthropic.
Anthropic questioned why the administration would use a term ‘historically reserved for US adversaries’ and said it would challenge the call in court.
Lapienytė said: ‘Yes, the government shouldn’t allow any company to dictate the terms for defence operations. But should AI companies be punished for having safety rules?
‘If the biggest market players are forced onto their knees, smaller companies will stop having safety rules, too. Will being “safe” become bad for business?
‘When the world’s most powerful military starts using AI without being transparent about exactly how, one can begin to wonder just how much US operations overseas are influenced by the algorithm.’
ChatGPT also signs Pentagon deal
OpenAI said on Friday it, too, had struck a deal with the Pentagon for its ChatGPT tool to be used in classified systems.
ChatGPT will not be used for domestic surveillance or building autonomous weapons, OpenAI chief executive Sam Altman said.
In a post on X, Musk’s social media platform, Altman said the ‘DoW’ (or ‘Department of War’, Donald Trump’s preferred name for the defence department) has a ‘deep respect for safety’.
He added that OpenAI will build toughened safety guardrails ‘to ensure our models behave as they should, which the DoW also wanted’.
‘We remain committed to serve all of humanity as best we can. The world is a complicated, messy, and sometimes dangerous place,’ he added.
Altman’s announcement, however, sparked a small exodus of ChatGPT users – including Katy Perry.
The singer shared on X that she had subscribed to Claude, writing: ‘Done.’
Scores of users on Reddit said they were ditching ChatGPT over the Pentagon deal, with one saying: ‘You’re now training a war machine.’
Altman responded to concerns earlier today, sharing on X an internal company post detailing the ‘DoW’ agreement.
The memo states officials ‘understand’ that the model cannot be used for ‘deliberate tracking, surveillance, or monitoring of US persons or nationals’ and won’t be used by intelligence agencies.
It added: ‘It’s critical to protect the civil liberties of Americans, and there was so much focus on this, that we wanted to make this point especially clear, including around commercially acquired information.
‘Just like everything we do with iterative deployment, we will continue to learn and refine as we go.’