Behind the FaceApp magic

New questions arise about privacy breaches, data misuse

Business Standard Editorial Comment
4 min read Last Updated : Jul 27 2019 | 10:06 PM IST
The latest app to take social media by storm raises new questions about privacy breaches, data misuse, and the commercial monetisation of social media. FaceApp has a tiny team of 12 people based in St Petersburg. It uses artificial intelligence (AI) to morph portraits. Users upload digital portraits to a cloud, where these are processed to create likenesses of themselves as they will look (or looked) a decade or two later (or earlier). FaceApp also “gender-switches” portraits, and facial expressions, on request.

It is an intriguing business. The AI must not only apply facial-recognition techniques to “recognise” faces; it must also model how ageing, “youth-ing”, and gender-switching alter facial appearance. It must distinguish between happy, sad, and pensive faces. This places it at the cutting edge of face-recognition technology.

In behavioural terms, it has hit a sweet spot. FaceApp is one of the most popular downloads, with at least 100 million users. It appears that a lot of people want to know what they will look like a few decades down the line. Many also wish to wallow in nostalgia and look at morphed pictures of themselves as children, and quite a few are curious about gender-switched appearances.

This may seem like a harmless, if narcissistic, pastime. However, before FaceApp can apply its magic, it needs portraits to be uploaded by users. Those portraits are associated with other data, like mobile numbers, device models, user-names, locations, birth years, and so on.

The app has a “freemium” subscription model. The company claims that 99 per cent of its revenue comes from premium subscriptions, even though only about 1 per cent of users pay for the service. Every paid subscriber also yields more sensitive data, such as credit card details.

The terms of use are roughly modelled on Instagram, which means that users grant permission for their uploaded shots (including altered versions) to be retained and used by the service provider, permanently, for a broad range of commercial purposes. This permission is royalty-free, and granted by default at the time of installation.

Moreover, there is a boilerplate indemnity protecting the company from being sued for any loss or injury suffered via the app (such as damaged reputation or embarrassment). The terms of service also mean that users agree, by default, to cover all legal fees for third-party claims against FaceApp arising from their use of the app. It is possible to opt out of these legally restrictive clauses, but that involves reading some very fine print and proactively contacting the company within 30 days of installation.

So, FaceApp is sitting on a treasure trove of data given by users. It can easily monetise this. Face-recognition platforms need big data-sets to train algorithms. Every national security agency worth its salt, as well as countless local police forces, is trying to set up efficient facial recognition systems. So the company could simply offer its 100 million-plus portraits to face recognition developers.

Beyond this, of course, there is a real fear that the database could be hacked and misused. New morphing technologies such as DeepFake recombine existing images to create fake images and malicious videos that are not easily distinguishable from the authentic. The scope for misuse of such a database is horrifyingly large.

FaceApp’s Russian founder, Yaroslav Goncharov, says that it retains pictures uploaded to the cloud for a maximum of 48 hours of processing. Goncharov also says that FaceApp is reviewing its terms of service to soften the legal implications and offer more user privacy.

Given the success of this app, other face-morphing services will surely be on offer soon. Users would be well advised to read the fine print in the terms of service with care. This is especially so in India, which has no specific privacy law. If you are being offered something for free on social media, chances are that payment is being extracted in hidden ways.


 
