Controversial photo-editing app FaceApp is back, with a new "challenge" showing what people will look like as they age.
The app, which launched in 2017, uses artificial intelligence to augment users' facial expressions and make them look older or younger, or change their gender.
It has gone viral again in the last few days, with thousands of people posting "aged" selfies of themselves on social media, using the hashtag #FaceAppChallenge.
The challenge has proved so popular that even celebrities such as Gordon Ramsay, Sam Smith and Drake are getting involved.
However, some online privacy experts have raised concerns that the app, which is developed by Russian company Wireless Lab, could be accessing and storing users' images without their permission.
The company's terms and conditions state that, by using the app, users agree to allow FaceApp to use, modify, adapt and publish any images that they upload.
"You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content," the Ts&Cs state.
FaceApp is also allowed to use your name, username or any likeness you provide, in any media format it likes and without compensation – meaning you will not be paid, and will have no ability to take the content down or complain about its use.
FaceApp uploads your photo to the cloud for processing, rather than carrying out on-device processing as many apps do, and it retains the image long after you've deleted the app.
"All of this should raise alarms whenever a free service is acting on sensitive information like images – the revenue to pay for the service is coming from somewhere and it’s likely the sale of data related to what the service provides," said Tim Mackey, Principal Security Strategist at the Synopsys CyRC (Cybersecurity Research Center).
Moreover, according to TechCrunch, the app is able to access Photos on Apple's iOS platform even if a user has set their photo permissions to "never".
"Given how many screenshots people take of sensitive information like banking and whatnot, photo access is a bigger security risk than ever these days," said Matthew Panzarino, Editor-In-Chief at TechCrunch.
"With a scraper and optical character recognition tech you could automatically turn up a huge amount of info way beyond 'photos of people'."
"The privacy implications of apps like FaceApp are extremely concerning," said Javvad Malik, security awareness advocate at KnowBe4.
"The app itself uses AI to digitally age users' photos, which is fun from a novelty perspective, but the same type of AI is used to produce deepfake imagery, which can be used for nefarious purposes such as public embarrassment or blackmail.
"The fact that FaceApp retains access to iOS photos without permission should be a red flag for all app store maintainers."
FaceApp has had its fair share of controversies over the years.
Back in April 2017, Wireless Lab was forced to apologise after several users complained that its "hot" filter was lightening their skin tone.
FaceApp's CEO Yaroslav Goncharov claimed that it was an "unfortunate side-effect of the underlying neural network caused by the training set bias, not intended behaviour."
Then in August of the same year, the app sparked outrage with the introduction of a new filter that allowed people to change their ethnicity in their selfies.
Responding to the backlash, Wireless Lab removed the filters and withdrew the update from the App Store.