• ShaunaTheDead@kbin.social
    1 year ago

    I would be okay with these AI-generated images being made available to people who had self-identified as pedophiles to their local government body, agreed to be placed on a special list, and entered psychiatric treatment. Although it’s absolutely disgusting, at least no children are harmed in making AI-generated images. And if it brings these sick people forward to seek treatment and to be identified and monitored to prevent real-life abuse, it could actually save real children from being exploited, which is obviously a noble goal.

    I just don’t envy the poor bastard who has to set up the training data for the AI to generate all that art…

  • MagicShel@programming.dev
    1 year ago

    The question is whether AI will satiate those people or drive them to further extremes. I’m hopeful it might reduce demand for content made “the traditional way,” since it carries lower risk (a less-horrific result doesn’t seem to be what motivates such people).

    I say hopeful because this cat can’t be put back in the bag. The technical solutions they suggest are desperate grasping at straws and, self-evidently, won’t work. People trading in CSAM are already taking extreme legal and social risks, so it’s hard to imagine any greater deterrent that could be applied to AI-generated versions. I think all we can do is hope this makes things better, not worse. Because if it goes the other way, the future is looking grim.

  • dog@yiffit.net
    1 year ago

    Wow, that’s terrifying and gross 🙃 It would have been free for people not to do this.

  • lilweeb@beehaw.org
    1 year ago

    Wish I hadn’t deleted my Reddit account, because I predicted this. I also predicted that AI would be used to generate “proof” that a person is a child abuser. And there’s absolutely nothing we can do about any of it. What a nightmare.