Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

  • gencha@lemm.ee · 25 days ago

    We need a very low barrier to entry for generating gay porn from a single image of a man before this problem will be taken seriously.