Europol has supported authorities from 19 countries in a large-scale hit against child sexual exploitation that has led to 25 arrests worldwide. The suspects were part of a criminal group whose members were engaged in the distribution of images of minors fully generated by artificial intelligence (AI).
That said, there’s a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability, because there’s a good chance they don’t understand how these models work, but the creators of a model should absolutely know where they’re getting their source data from.
Prove that the models use illegal material and go after the model creators for that, because that’s an actual crime. Don’t go after people using the models, which provide an alternative to abusive material.
I think they’re all unethical, and any service offering them should be shut down, yes.
I never said prosecute the users.
I said you can’t make it ethically, because someone, at some point, is using or creating original art, and the odds of human exploitation somewhere in the chain are just too high.
the odds of human exploitation somewhere in the chain are just too high
We don’t punish people based on odds. At least in the US, the standard is that they’re guilty “beyond a reasonable doubt.” As in, there’s virtually no possibility that they didn’t commit the crime. If there’s a 90% chance someone is guilty, but a 10% chance they’re completely innocent, most would agree that there’s reasonable doubt, so they shouldn’t be convicted.
If you can’t prove that they made it unethically, and there are methods to make it ethically, then you have reasonable doubt. All the defense needs to do is demonstrate one such method of producing it ethically, and that creates reasonable doubt.
Services should only be shut down if they’re doing something illegal. Prove that the images are generated using CSAM as source material, then shut down any service that refuses to remove them or that can be proven, beyond a reasonable doubt, to have knowingly committed a crime. That’s how the law works: you only punish people you can prove “beyond a reasonable doubt” were committing a crime.
Let’s say you manually edit a bunch of legal pictures and feed those into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children) and label some young-looking adults as children for the rest.
I don’t know, I’m not an expert. But just because I don’t know of something doesn’t mean it doesn’t exist, it means I need to consult experts.
How can it be made ethically? That’s my point: it can’t.
Some human has to sit and make many, many, many models of genitals to produce an artificial one. And that, IMO, is not ethically possible.
Then prove it. That’s how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.
My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It’s not CSAM if it’s generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn’t use CSAM in its training data.
You can’t prove a negative. That’s not how proving things works.
You also assume legal images exist, but that depends on what’s actually legal globally. What if someone wants a 5-year-old? How would there be legal photos of that?
You can show how existing solutions work and demonstrate that the solution in question works like those other solutions. That takes a lot more work than “see, it looks like a child, therefore it’s CSAM,” but it’s necessary to protect innocent people.
You assume it can; prove that it can.
That’s guilty until proven innocent. There’s a reason courts operate on the assumption of innocence and force the prosecution to prove guilt. I am not interested in reversing that.
You’d better believe that when the cops come knocking, the burden of proving you acted ethically is wholly on you.
All existing solutions are based on real-life images. There’s no ethical way to acquire thousands upon thousands of images of naked children to produce anything resembling the real thing. That’s how existing solutions work.
When the cops come knocking, your best bet is to comply under duress (be clear that it’s under duress). Fighting the police will just add more charges, the right place to fight is in the courts. If your country’s justice system is corrupt, then I guess you might as well fight the police, but in most developed countries, the courts are much more reasonable than the police.
So again, how can it be done ethically?
The burden of proof is on showing that it was done unethically, not that it was done ethically. Force the prosecution to actually do their job, don’t just assume someone is guilty because the thing they made looks illegal.
That’s just not true.