X’s Grok app has been reinstated in Indonesia after it was recently banned for generating sexualized images of people without their knowledge or consent.
In early January, in response to the Grok nudification trend on X, the Indonesian Ministry of Communications threatened to ban both X and the separate Grok app if concerns about "degrading images of women and children" were not addressed.
A few days later, the ministry followed through on that threat by completely banning the Grok app and restricting access to X. But now, following assurances from X, the ministry has announced that it is lifting its ban, allowing X to continue operating its platforms in the country.
As reported by The New York Times:
“The Indonesian Ministry of Communications and Digital Affairs said in a statement on Sunday that it had received a letter from X Corp “with concrete steps to improve services and prevent abuse.” The ban will be lifted “conditionally” and Grok could be blocked again if “further violations are discovered,” Alexander Sabar, director general of the ministry of digital space monitoring, said in the statement.”
That means X is now back in action in all Southeast Asian countries where it is available, with both Malaysia and the Philippines also recently lifting their bans on the app in response to the nudification controversy.
So, all good then. Grok's capacity has been restricted to ensure that no more non-consensual nudes are produced, and everything is back to normal. Right?
Well yes and no.
Yes, in the sense that X has implemented restrictions to prevent people from generating offensive images via Grok, at least to some extent. But the question remains as to why it took government bans for X to act.
Musk initially claimed that several other AI tools also enabled the generation of deepfake nudes, yet no one went after them, suggesting that the real motivation was to shut down X because of its “free speech” aligned approach.
That’s not right, and even if it were, it wouldn’t justify X enabling the same behavior.
That belies Musk’s well-publicized opposition to CSAM content, an element he made central to his reform of Twitter when he took over the app. Musk repeatedly claimed that previous Twitter management had not done enough to combat CSAM content, and he said that he would make this his “#1 priority” during his time as chief.
And Musk’s new management team has provided some data suggesting that it had improved the platform’s efforts on this front. But more recent reports indicate that CSAM content is now more prevalent on X than ever, while the company has also terminated its contract with Thorn, a non-profit organization that provides technology to detect and address child sexual abuse content (Thorn says X simply stopped paying its bills).
And then there are Grok’s deepfakes, with users generating thousands of sexualized images in the app every day, including, again, images of children.
And at least for a while, Elon defended this functionality, and sought to deflect criticism of it as an available option.
Why? I don’t know. It doesn’t make sense, and there’s no reason why anyone would need this as a feature. But driven by his ambition to make Grok the most widely used generative AI option on the market, Musk initially refused to make a change, even though he could have.
It’s also worth noting that Musk recently bragged that Grok is now generating more images and video than all other AI tools combined. That's a claim he cannot make with any certainty, as he has no access to data on the output of other engines. But I also wonder why that might be. Could it be because of the thousands of fake nudes that X users have been creating?
It’s confusing to me how anyone could see this as consistent with Elon’s previous statements about a non-permissive approach to CSAM content, or believe that Elon actually treats this as a major concern.
It seems that progress remains his guiding star, if necessary at the expense of everything else, while his constant reframing of every issue as a political flashpoint makes it increasingly difficult to side with him in the name of measured development.


