
Popular face-aging app an outing Russian spy bot?

Photo credit: pxhere - stock photo

Technology is a useful thing, and as a recent aging app demonstrates, it's being used for much more than work. Pop culture has embraced the visage-aging software called FaceApp, in which you take a selfie and, through the magic of its developers, see what you'll look like 50 years from now. You can then share your realistic, sun-damaged face with the masses on social media and comment on how much you look like your mom or dad.

But the app is made by a company in St. Petersburg, Russia, a fact that has many concerned about the safety of your identity once you sign off on the terms and conditions.

Towleroad points out that users who agree to the terms actually grant the company “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable, sub-licensable license to use, reproduce … create derivative works from … and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.”

The minute it takes to download the app and sign off on permissions might mean you are also being added to a face recognition database.

Although this sounds like some next-level spy technology, Jason Hill, lead cybersecurity researcher at CyberInt Technologies, told USA Today that “[There] is no immediate evidence to suggest that FaceApp is performing any nefarious task.” Still, he adds, information could be collected. “For example, collating photos associated with a user could, where present, allow image metadata, such as the location that a picture was taken, to be mapped and correlated with access logs, gathered when the user accesses the service, that will associate details of their IP address, ISP and the device (including browser, operating system and hardware).”
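
Hill's point is less exotic than it sounds: location, timestamp and device details often travel inside an ordinary photo's EXIF metadata. The sketch below, written in Python with the Pillow imaging library (not something FaceApp is confirmed to use), shows how easily that metadata can be read from an uploaded image; the file name is a placeholder.

    # A minimal sketch: reading EXIF metadata (including GPS tags) from a photo.
    # Requires the Pillow library (pip install Pillow); "selfie.jpg" is a placeholder.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def read_exif(path):
        """Return a photo's EXIF tags as a dict keyed by human-readable names."""
        raw = Image.open(path)._getexif() or {}
        exif = {}
        for tag_id, value in raw.items():
            name = TAGS.get(tag_id, tag_id)
            if name == "GPSInfo":
                # GPS data is nested under its own numeric sub-tag IDs
                value = {GPSTAGS.get(t, t): v for t, v in value.items()}
            exif[name] = value
        return exif

    if __name__ == "__main__":
        data = read_exif("selfie.jpg")
        # Timestamp, device model and GPS coordinates travel with the file
        # unless they are stripped before upload.
        print(data.get("DateTimeOriginal"), data.get("Model"), data.get("GPSInfo"))

Many phones and photo tools offer an option to strip this metadata before sharing, which removes one of the correlations Hill describes.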

Although Hill's assessment sounds a bit ominous, other experts in the field downplay the concerns.

“A security researcher who goes by the pseudonym Elliot Alderson (real name Baptiste Robert) downloaded the app and checked where it was sending users’ faces," reports Forbes. "The French cyber expert found FaceApp only took submitted photos – those that you want the software to transform – back up to a company server. And where’s that server based? America, not Russia. … And, as noted by Alderson, the app also uses third party code, and so will reach out to their servers, but again these are based in the U.S. and Australia. Of course, given the developer company is based in St. Petersburg, the faces will be viewed and processed in Russia. It’s unclear how much access FaceApp employees have to those images…”

But what does that mean for LGBT people?

LGBTQ Nation says the app and its capabilities might be used to out people. The outlet cites an example in which Stanford University researchers used social media photos to determine whether users were gay or straight; their guesses were fairly accurate.

It goes on to quote tech journalist Stilgherrian from his interview with Australia’s ABC News: “This is a pretty standard boilerplate privacy policy, which effectively offers you no protection at all,” he said.

Adds David Vaile of the Australian Privacy Foundation: “They ask for way more rights than they need to offer the service to you, [they] can remove the data from any effective legal protection regime, share it with almost anyone, and retain it indefinitely."

The popularity of the app may be short-lived, as saturation has already set in and many people just wish users would stop sharing their deepfake photos.