Wednesday, January 24, 2018

Seeing isn't believing: the rise of fake porn

The following may be disturbing to readers, but I feel it is important to write for several reasons. The first is to stay a step ahead of cyberbullies who could use this technology to humiliate others. The second is to give readers - especially parents and teens - information to consider when deciding what to share publicly, privately, or at all.

In late 2016, software maker Adobe showcased an audio-editing tool that could, given a speech sample, create a natural-sounding recording of that person's voice saying new words. This capability could come in very handy for editing podcasts or narrations, allowing a producer or sound engineer to edit the spoken text instead of re-recording.

Last summer, a University of Washington research project demonstrated the next logical step. The researchers took a video recording of a public speech, replaced the audio portion with a recording of the speaker saying something else entirely, and manipulated the video so the speaker's face and mouth movements matched the new audio.

Faking someone's spoken words is one thing. But technology publication Motherboard wrote today of a new and disturbing practice that has gained steam over the last six weeks or so: so-called "face-swap" porn, an artificial intelligence-aided merging of celebrity faces onto the bodies of porn actors, creating convincing videos that appear to be of that celebrity.

In the article (warning: NSFW and unsettling content), Motherboard writes of individuals taking benign video from celebrities' public Instagram stories and transferring the faces onto nude Snapchats posted by others. Using freely available software and step-by-step instructions, even a novice computer user can accomplish the technique.

My fear is that it won't stop with celebrities. The thought of someone taking video from my daughter's Instagram, and creating a believable fake video with which to humiliate her, shakes me to the core, as it should any parent.

So why write this?

The first reason is to counter would-be cyberbullies. My hope is that a fake video - even an extremely convincing fake - might be less traumatic if it is widely known that such fakes are no longer fantasy.

The second reason is to give you food for thought when it comes to privacy decisions. What you (or your child) post publicly may be seen by - or downloaded and abused by - anyone.

There is no one-size-fits-all solution when it comes to privacy and safety, but I'll share how I have approached this with my kids. When my children first began using social media, our household rule was that a social media account could be either public or personal, but never both.

If the child wanted to share publicly, it had to be under a pseudonym and never include pictures of themselves, their family members, pets, or home. If the child wanted to identify themselves, the account had to be private and shared only with friends they (and we) knew in real life.

As they and their situational awareness have grown, we have given them more discretion, but you can bet this development is the subject of discussion in our home.

Do you have something to add? A question you'd like answered? Think I'm out of my mind? Join the conversation below, reach out by email at david (at) securityforrealpeople.com, or hit me up on Twitter at @dnlongen