Apple Faces Pressure to Remove X App as xAI Says Grok Will Halt Undressing Feature



**Apple Faces Renewed Demands to Remove X and Grok from the App Store**

A coalition of 28 organizations focused on digital rights, child safety, and women’s rights has appealed to Apple to take swift action against X and Grok, the apps built by xAI. The coalition’s open letter raises concerns about the generation of nonconsensual intimate images (NCII) and child sexual abuse material (CSAM), which violates both the law and Apple’s App Review Guidelines.

The call to action follows widespread reports of X and Grok users prompting the AI to edit images of women and underage girls to show them with less clothing. Although xAI initially attributed these incidents to “failures in safeguards,” Grok continued to fulfill such requests, leading to X being restricted in some countries and facing scrutiny in others.

The coalition’s letter stresses that by distributing these apps, Apple is not only facilitating the production of harmful content but also profiting from it. It calls for Grok and X to be removed from the App Store to prevent further exploitation and criminal activity.

**xAI’s Statement: Grok Will Halt Image Editing, Kind Of**

In response to the mounting pressure, xAI announced that Grok will block the editing of images that depict real people in revealing clothing such as bikinis. The restriction applies to all users, including paying subscribers. In addition, image generation and editing through Grok will now be limited to verified Premium subscribers, a change xAI says will make users more accountable for any violations.

Even so, reports suggest some users are already trying to work around the new restrictions, while non-subscribers are now seeing notices that image generation is reserved for paying users.

**9to5Mac’s Perspective**

As xAI rolls out these new filters and rules, it’s unclear whether they will meaningfully address the underlying problems. History shows that people determined to bypass safety measures can be remarkably resourceful, especially when targeting women online. The number of carve-outs in xAI’s announcement raises doubts that the problem will be adequately resolved, particularly for the victims involved.

Critics have also expressed disappointment at Apple’s lack of a public response, arguing that the company’s inaction undermines its commitment to keeping the App Store a safe environment. The continued availability of nonconsensual imagery through X’s iOS app has raised questions about Apple’s credibility and its responsiveness to user safety concerns.
