Governments around the world are considering bans on Grok’s app over AI sexual image scandal
As concerns grow over Grok’s ability to generate sexually explicit content without the subject’s consent, a number of countries are blocking access to Elon Musk’s artificial intelligence chatbot. At the center of the controversy is a feature called Grok Imagine, which lets users create AI-generated images and videos. That tool also includes a “spicy mode” for generating adult content.
Both Indonesia and Malaysia ordered that restrictions be put in place over the weekend. Malaysian officials blocked access to Grok on Sunday, citing “repeated misuse … to generate obscene, sexually explicit, indecent, grossly offensive, and non-consensual manipulated images.” Officials also cited “repeated failures by X Corp.” to prevent such content.
Indonesia had blocked the chatbot the previous day for similar reasons. In a statement accompanying Grok’s suspension, Meutya Hafid, Indonesia’s Minister of Communication and Digital, said, “The government views the practice of non-consensual sexual deepfakes as a serious violation of human rights, dignity, and the security of citizens in the digital space.”
The responses could be just the beginning of Grok’s problems, though. Several other countries, including the U.K., India, and France, are thinking of following suit.
The U.K. has launched an investigation into the chatbot’s explicit content, which could result in it being blocked in that country as well. “Reports of Grok being used to create and share illegal, non-consensual, intimate images and child sexual abuse material on X have been deeply concerning,” Ofcom, the country’s communications regulator, said in a statement.
Musk, in a social media post following word of the Ofcom investigation, wrote that the U.K. government “just want[s] to suppress free speech.”
Fast Company attempted to contact xAI for comment about the actions in Indonesia and Malaysia as well as similar possible blocks in other countries. An automatic reply from the company read, “Legacy Media Lies.”
Beyond the U.K., officials in the European Union, Brazil, and India have called for probes into Grok’s deepfakes, which could ultimately result in bans as well. (The U.S. government, which has contracts with xAI, has been fairly silent on the matter so far.)
In a press conference last week, European Commission spokesperson Thomas Regnier said the commission was “very seriously looking into this matter,” adding, “This is not ‘spicy.’ This is illegal. This is appalling. This is disgusting. This is how we see it, and this has no place in Europe.”
Musk and X are still feeling the effects of a $130 million fine the EU slapped on the company last month for violating the Digital Services Act, specifically over deceptive paid verification and a lack of transparency in the company’s advertising repository.
Beyond sexualized images of adults, a report from the nonprofit group AI Forensics that analyzed 20,000 Grok-generated images created between Dec. 25 and Jan. 1 found that 2% depicted a person who appeared to be 18 or younger. These included 30 images of young or very young women or girls in bikinis or transparent clothes.
The analysis also found Nazi and ISIS propaganda material generated by Grok.
While the company has not addressed the countries blocking access to its services, it did comment on the use of its tool to create sexual content featuring minors.
“We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary,” X Safety wrote in a post. “Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.”
The company has also announced it will limit image generation and editing features to paying subscribers. That, however, likely won’t be enough to satisfy government officials who want to block access to Grok as long as such images can still be generated.