AI tool faces growing global scrutiny over the spread of nonconsensual sexual images of women and minors on X

Elon Musk said on Wednesday he was not aware of any “naked underage images” generated by xAI’s Grok, as scrutiny of the AI tool intensifies worldwide.

“I not aware of any naked underage images generated by Grok. Literally zero,” Musk said in an X post. His comment comes as xAI and X face growing global scrutiny, including calls by lawmakers and advocacy groups for Apple and Google to drop Grok from their app stores, an investigation by UK regulators, and bans or legal action in countries such as Malaysia and Indonesia.

Last week, X curtailed Grok’s ability to generate or edit images publicly for many users. However, industry experts and watchdogs have said that Grok was still able to produce sexually explicit images, and that restrictions, such as paywalling certain features, may not fully block access to deeper AI image tools.

  • Master167@lemmy.world · 3 points · 3 hours ago

    I don’t buy it and here’s why.

    Is he getting updates about his company from yes-men? Absolutely. I believe all of the executives at his companies inflate his ego because he’s the “Elon Musk”. But if you read any news that mentions Twitter, it’s been about this topic ever since they launched this feature of his misinformation machine.

    • TigerAce@lemmy.dbzer0.com · 2 points · 1 hour ago

      I don’t buy it and here’s why.

      He’s the biggest liar out there; he only doesn’t lie when he’s promoting nazi shit.

  • Tollana1234567@lemmy.today · 2 points · 6 hours ago

    He’s totally aware, if he’s been practicing “kung fu” lessons with Miss Ghislaine Maxwell on Epstein’s properties before.

    • hardcoreufo@lemmy.world · 8 points · 10 hours ago

      I know people who fell for that hook, line, and sinker, and somehow have blanked it from their memory.

  • defunct_punk@lemmy.world · 49 points · 21 hours ago

    Ah yes, the ol’ “I did not have sexual relations” excuse.

    “I not [sic] aware of any naked underage images generated by Grok. Literally zero,”

    The accusation is about the generation of scantily clad, underclothed images of real minors, not full-on nudity. That’s still CSAM in most laws’ eyes, something he conveniently skirts.

    • a_non_monotonic_function@lemmy.world · 4 points · 16 hours ago

      It’s more than “I did not have sexual relations.” It’s “I don’t know” or “I don’t remember.” The get-out-of-jail-free card for politicians and rich people alike.

      Look at how often Mike Johnson just says he doesn’t know what’s going on.

  • Deestan@lemmy.world · 26 points · 21 hours ago

    Grok was still able to produce sexually explicit images, and that restrictions, such as paywalling certain features, may not fully block access […]

    Charging money for revenge porn didn’t work as a “restriction”? Demanding people pay to access CSAM didn’t “fully block” it?

    Whoever at Reuters allowed that sentence to form is just cooked.

  • nutsack@lemmy.dbzer0.com · 5 points · edited · 15 hours ago

    The value proposition of Grok has always been that it will do things the other models try to block people from doing.

  • dipcart@lemmy.world · 18 points · 20 hours ago

    When it became news that you could make AI CSAM, I’m pretty sure his response was… to post an AI picture of himself in a bikini?

    I would love to meet the single person he has convinced with this.

  • magnetosphere@fedia.io · 16 points · 20 hours ago

    “Obviously, Grok does not spontaneously generate images, it does so only according to user requests,” Musk said.

    “Guns don’t kill. People do.”

  • ianhclark510@lemmy.blahaj.zone · 17 points · 21 hours ago

    So your giant, expensive, water-poisoning machine that you own and built is generating content you’re not aware of? And this is a commercial product you’re charging for and integrating into the Pentagon?