- cross-posted to:
- linux_lugcast@lemux.minnix.dev
A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.
Are you certain about this? If so, where are you getting that info? It’s not in the article.
Generative image models are frequently used for their “infill” capabilities, which is how nudifying apps work.
If he was nudifying pictures of real kids, the nudity may be simulated, but the children are absolutely real and should be considered victims of child pornography.