A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
As someone who’s spent a couple of weeks down a Stable Diffusion rabbit hole, I can attest that these models don’t need to be trained on CP to generate CP content. Using some very popular checkpoints, I inadvertently created images I found questionable enough to delete immediately, and I wasn’t even prompting for young girls. With the right prompts, I can easily see some of the more popular checkpoints pumping out CP.
I think if it becomes widespread, like you want it to be, models that generate CSAM will be trained on such material, yes.