The rise of "deepfakes", or fake pornographic videos that use machine learning AI to superimpose the face of a popular celebrity onto a porn performer to create the illusion of said celebrity having sex on video, has lead to a plethora of concerns - not least the privacy and moral dilemmas around what appears to be non-consensual pornography of celebrities existing on the internet. However, the latest of these concerns has arisen around the datasets, or facesets, or images that go into creating these deepfakes – and concerns that they may be leading into the accidental creation of nonconsensual child porn.

To understand the concern, let's look at exactly how deepfakes work. The machine learning algorithms behind them analyze a faceset – thousands of images and video stills of the target, gathered into a single folder – and work through the source footage frame by frame to find the image that best matches the original performer's expression.

The concern arises when a single image, or a set of images, in that faceset shows the target celebrity as a minor.

Take the example of Emma Watson, seen below: in amongst thousands of images of her, a few show her as a minor, and those images will then go into creating the pornographic deepfake.

Why is this a concern? Because the result is then technically child pornography, and both creating and consuming it carries heavy sentences more or less anywhere on the planet, even if only a single frame draws on such an image.

The people making deepfakes and trading these facesets are worried about this, and understandably so. Facesets of younger celebrities often come with disclaimers that they may contain photos of the subject as a minor. Some traders are going further still and deleting whole facesets, such as one of Elle Fanning, until they can be sure they contain no images of her as a minor. The quote below is from a user on a deepfake web forum.

"I deleted all posts with Elle Fanning because it's impossible to prove that she was 18 years old in the old faceset. It's better to be safe than sorry."

However, a longstanding area of contention remains that of intent: many of these datasets are not curated by humans at all, but by facial recognition software such as Microsoft's open-source FaceTracker and FaceDetector, which scan through videos and capture stills of a target's face. One Emma Watson faceset, for example, includes a close-up of actor Matthew Lewis as Neville Longbottom – evidence that these facesets often contain images that shouldn't be there, and that deliberate intent to produce child pornography under the deepfake banner is scarce.
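To illustrate just how little curation is involved, here is a minimal sketch of that kind of automated capture, written in Python with OpenCV's stock Haar-cascade face detector rather than the Microsoft tools named above (the video path, sampling rate, and output folder are hypothetical). The point to notice is that the detector saves every face it finds, with no concept of whose face it is or how old they are:

```python
import cv2
import os

VIDEO_PATH = "interview_clip.mp4"  # hypothetical input video
OUTPUT_DIR = "faceset"             # folder the stills are gathered into
FRAME_STEP = 10                    # only sample every 10th frame

os.makedirs(OUTPUT_DIR, exist_ok=True)

# Load OpenCV's bundled frontal-face detector.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture(VIDEO_PATH)
frame_index = 0
saved = 0

while True:
    ok, frame = capture.read()
    if not ok:
        break
    if frame_index % FRAME_STEP == 0:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # The detector flags *any* face in the frame; it has no notion of
        # identity or age, which is exactly how stray co-stars and archive
        # footage of the subject as a child end up in a faceset.
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
            cv2.imwrite(f"{OUTPUT_DIR}/face_{saved:05d}.jpg",
                        frame[y:y + h, x:x + w])
            saved += 1
    frame_index += 1

capture.release()
```

Run against long footage, a loop like this will happily scrape interviewers, co-stars, and childhood clips alongside the intended target – which is how a single stray frame can end up in a faceset that nobody ever looked through by hand.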

While this remains a legal gray area – and sentences, at least in the US, may be determined by exactly how many frames depict the individual in question as a minor – it is an area of great concern for deepfake producers, and will likely remain contentious as AI-based pornography grows, develops, and improves over time.

What do you guys think? Does the rise of AI-based child pornography worry you? Let us know in the comments, or post over on our forums with your thoughts.

via Motherboard
Image source: The Guardian