• sudo22@lemmy.world · 9 months ago

    My guess would be he wasn’t self-hosting the AI model, so the requests were going through a website.

      • sudo22@lemmy.world · 9 months ago

        ChatGPT can be tricked into giving IED instructions if you ask the right way. So it could be a similar situation.

      • BreakDecks@lemmy.ml · 9 months ago

        Why should it have that? Stable Diffusion websites know that most of their users are interested in NSFW content. I think the idea is to turn GPUs into cash flow, not to make sure that it is all wholesome.

        I suppose they could get some kind of sex+children detector going for every generated image, but you’re going to have to train that model on something, so now it’s a chicken-and-egg problem.
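
        For what it’s worth, the general shape of a post-generation filter like the one described above is straightforward: run every output image through an off-the-shelf NSFW classifier and refuse to serve anything that gets flagged. The sketch below is an assumption-laden illustration, not anything the hosting sites are confirmed to do; the model name, labels, and threshold are illustrative.

        ```python
        # Minimal sketch of a post-generation NSFW gate for a hosted image service.
        # The classifier choice (Falconsai/nsfw_image_detection) and the 0.8
        # threshold are assumptions for illustration only.
        from PIL import Image
        from transformers import pipeline

        # Off-the-shelf NSFW image classifier from the Hugging Face hub.
        nsfw_check = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

        def is_safe(image: Image.Image, threshold: float = 0.8) -> bool:
            """Return False if the classifier flags the image as NSFW above the threshold."""
            for result in nsfw_check(image):
                if result["label"] == "nsfw" and result["score"] >= threshold:
                    return False
            return True

        # Hypothetical usage: only serve generated images that pass the gate.
        # generated = diffusion_pipeline(prompt).images[0]
        # if is_safe(generated):
        #     serve(generated)
        ```

        The chicken-and-egg point still stands: a generic NSFW classifier is easy to bolt on, but anything more specific needs training data that nobody can legally assemble.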