Using Azure Face API to Analyze Your FaceApp Photo (DEBUNKING FaceApp)

OVERVIEW
Happy new year! 🎈
2019 is coming to an end, which means we are getting older, aren’t we?
The question is, how old do we really look? Because age is just a number, right?
I wanted to put the FaceApp to the test because it’s one of the few apps that went viral this year!
In this blog post, I will use the Azure Cognitive Services Face API to see how old A.I. thinks we are right now, and how old it thinks our FaceApp future selves are, shall we?
I am also going to prove that FaceApp doesn’t really show you what you will look like when you get older, and that it should only be used for fun!
Let’s get started!
Requirements
Demo
STEP 1
To start things off, I will download FaceApp and use it on a picture of Kevin, aka Macaulay Culkin, aka the kid from the movie Home Alone 2: Lost in New York, just because it’s one of the most-watched movies during the Christmas and New Year season!
You can choose from a variety of options, but in this demo I will be using the Age/Old filter, since it’s the main reason FaceApp went viral. To apply it, open the app, choose a picture, tap “Age”, then tap “Old”.
Let’s get into the most interesting part!
STEP 2
Visit portal.azure.com and sign-in with your Azure account.
STEP 3
Click on “Create Resource” then click “Face” from the “AI + Machine Learning” section.

STEP 4
Create a “Resource Group” by clicking “create new” then fill in the details and click “Create”.

STEP 5
Wait till it says “Your deployment is complete” then click “Go to resource”.

STEP 6
Go ahead and copy the “Key” and “Endpoint” from the “Quickstart” section, as you are going to need them in the next step.

STEP 7
Just like always, I’ve created a simple tool and posted it on GitHub, so you can use the API with a custom user interface instead of going with cURL or Postman!
You first need to create a folder (call it anything), then create index.html inside it, style.css inside a css subfolder, and script.js inside a js subfolder.
The four main ids to include in the HTML, so that we can use them in our JavaScript, are “input”, “analyseButton”, “output” and “sourceImage”.
The “input” accepts the image URL, “analyseButton” takes the “input” value and makes the API call, and finally “output” and “sourceImage” display the results that come back from the API!
// Run the analysis when the button is clicked
document.getElementById("analyseButton").addEventListener("click", analyze);

function analyze() {
    // Read the image URL typed into the input field
    var sourceURL = document.getElementById("input").value;

    let reqBody = {
        url: sourceURL
    };

    // The subscription key authenticates the request against your Face resource
    let myHeader = new Headers({
        'Content-Type': 'application/json',
        'Ocp-Apim-Subscription-Key': 'ACCESS_KEY'
    });

    let initObject = {
        method: 'POST',
        body: JSON.stringify(reqBody),
        headers: myHeader
    };

    // Ask the detect endpoint to return age, gender and emotion attributes
    let request = new Request('ACCESS_URL/face/v1.0/detect?returnFaceId=true&returnFaceLandmarks=false&returnFaceAttributes=age,gender,emotion', initObject);

    fetch(request).then(function(response) {
        if (response.ok) {
            return response.json();
        } else {
            return Promise.reject(new Error(response.statusText));
        }
    }).then(function(response) {
        // Show the analyzed image and print the attributes of the first detected face
        let imgDiv = document.getElementById('sourceImage');
        imgDiv.style.visibility = "visible";
        document.getElementById("output").innerHTML =
            "gender: " + response[0].faceAttributes.gender + "<br>" +
            "age: " + response[0].faceAttributes.age + "<br>" +
            "happiness: " + response[0].faceAttributes.emotion.happiness + "<br>" +
            "sadness: " + response[0].faceAttributes.emotion.sadness + "<br>" +
            "anger: " + response[0].faceAttributes.emotion.anger;
        imgDiv.src = sourceURL;
        imgDiv.classList.add('animateImage');
    }).catch(function(err) {
        alert(err);
        document.getElementById("output").innerHTML = "";
    });
}
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Azure Face API Demo</title>
  <link rel="stylesheet" href="css/style.css">
</head>
<body>
  <header>
    <h1>Azure Face API Demo</h1>
  </header>
  <section class="text">
    <h3>Insert Picture URL</h3>
    <input id="input" />
  </section>
  <div class="btn-section">
    <button id="analyseButton"> Analyse </button>
  </div>
  <div class="results">
    <div class="left-side">
      <p id="output"> </p>
    </div>
    <img id="sourceImage"/>
  </div>
  <script src="js/script.js"></script>
</body>
</html>
* {
  margin: 0;
  padding: 0;
  box-sizing: border-box;
  font-family: arial, sans-serif;
}
body {
  background-color: #fafafa;
  overflow-x: hidden;
}
header {
  display: flex;
  flex-wrap: wrap;
  padding: 12px;
  text-align: center;
  background: #1A2980; /* fallback for old browsers */
  background: -webkit-linear-gradient(to right, #26D0CE, #1A2980); /* Chrome 10-25, Safari 5.1-6 */
  background: linear-gradient(to right, #26D0CE, #1A2980); /* W3C, IE 10+/Edge, Firefox 16+, Chrome 26+, Opera 12+, Safari 7+ */
  box-shadow: 0px 4px 30px -10px rgba(0,0,0,0.5);
  align-items: center;
  justify-content: center;
}
header h1 {
  font-weight: 200;
  color: #f2f2f2;
  font-size: 2rem;
}
#input {
  padding: 18px 6px 3px 6px;
  border: none;
  outline: none;
  background-color: transparent;
  border-bottom: 1px solid rgba(34,198,227,1);
}
.text {
  font-weight: 400;
  text-align: center;
}
.text h3 {
  color: #444;
  margin: 18px 6px;
}
#analyseButton {
  cursor: pointer;
  border: none;
  outline: none;
  color: #f2f2f2;
  background: #1a2980;
  padding: 12px 28px;
  margin: 12px auto;
  box-shadow: 0px 4px 30px -10px rgba(0,0,0,0.5);
}
#analyseButton:active {
  transform: translateY(1px);
  box-shadow: 0px 4px 5px -10px rgba(0,0,0,0);
}
#sourceImage {
  visibility: hidden;
  align-self: flex-end;
  object-fit: cover; /* keeps the image's aspect ratio inside the fixed box */
  width: 300px;
  height: 300px;
  box-shadow: 0px 2px 30px -15px rgba(0,0,0,0.5);
}
.btn-section {
  width: 100px;
  margin: 12px auto;
}
.results {
  background-color: #fdfdfd;
  display: flex;
  align-items: center;
  justify-content: space-between;
  width: 100%;
  margin: 12px auto;
  box-shadow: 0px 5px 50px -20px rgba(0,0,0,0.5);
}
.results h3 {
  font-size: 2em;
  margin-bottom: 12px;
}
.left-side {
  text-align: center;
  margin: auto;
}
/* Added via classList when the analysis succeeds */
.animateImage {
  animation: imageAnim 1s forwards;
}
/* Animations */
@keyframes imageAnim {
  0% {
    opacity: 0;
  }
  100% {
    opacity: 1;
  }
}
@media screen and (min-width: 901px) {
  .results {
    width: 50%;
  }
}
@media screen and (min-width: 601px) and (max-width: 900px) {
  .results {
    width: 75%;
  }
}
@media screen and (max-width: 600px) {
  .results {
    width: 100%;
  }
}
You need to replace “ACCESS_URL” and “ACCESS_KEY” in the JavaScript code with the Endpoint and Key you copied from the Azure portal in STEP 6!
In my case, for example, the “ACCESS_URL” is going to be
https://newyearfacetest.cognitiveservices.azure.com/
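If you are unsure how the final request URL comes together from the endpoint and the query parameters, here is a small sketch. The helper function `buildDetectUrl` is just an illustration (it is not part of the tool's code), and `newyearfacetest` is simply this post's example resource name; substitute your own endpoint.

```javascript
// Build the full Face API detect URL from a resource endpoint.
// Illustrative helper only; the demo tool hardcodes the URL instead.
function buildDetectUrl(endpoint) {
  // Trim any trailing slash so we don't end up with ".com//face/..."
  const base = endpoint.replace(/\/+$/, "");
  const params = new URLSearchParams({
    returnFaceId: "true",
    returnFaceLandmarks: "false",
    returnFaceAttributes: "age,gender,emotion"
  });
  return base + "/face/v1.0/detect?" + params.toString();
}

console.log(buildDetectUrl("https://newyearfacetest.cognitiveservices.azure.com/"));
```

Note that `URLSearchParams` percent-encodes the commas in the attribute list, which the service accepts just the same.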
STEP 8
I will start by using the tool to detect my own age!
I am currently 22 years old, and going to be 23 next August, so let’s see how old the Face API thinks I look.

It says I am 24, which is quite close; in fact, I think it sees me a year older because I was wearing a winter hat!
Now that we’ve seen how accurate the Azure Face API is, let’s put FaceApp to the test using the two pictures of Kevin from STEP 1!
The plot summary of Home Alone on IMDB states that Kevin is eight years old!
“An eight-year-old troublemaker must protect his house from a pair of burglars when he is accidentally left home alone by his family during Christmas vacation.”

The Azure Face API has proven its accuracy again!
It says that Kevin is 7 years old, while the main plot summary of the movie says he’s 8, which is very close!
Now, let’s try using the FaceApped photo of Kevin, aka the “FaceApp Old Kevin” photo!

The Azure Face API says that FaceApp Old Kevin is a 65-year-old female, which might be right when it comes to age, but is obviously wrong when it comes to gender. But why?
The answer is simple!
FaceApp is not that accurate; in fact, it is far from accurate!
It turned an 8-year-old male into a 65-year-old female!
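To make the gender mix-up concrete, here is a sketch of how the tool reads the detect response. The payload below is mocked with made-up values shaped like the responses above; the key point is that detect returns an array with one element per face found, which is why the code indexes `response[0]`. The `summarize` helper is hypothetical, not part of the tool.

```javascript
// A mock payload shaped like the Face API detect response used above.
// Values are made up for illustration.
const mockResponse = [
  {
    faceId: "00000000-0000-0000-0000-000000000000",
    faceAttributes: {
      age: 65,
      gender: "female",
      emotion: { happiness: 0.001, sadness: 0.0, anger: 0.0 }
    }
  }
];

// Hypothetical helper: pull out the headline result for the first face.
function summarize(response) {
  if (response.length === 0) return "no face detected";
  const a = response[0].faceAttributes;
  return a.gender + ", age " + a.age;
}

console.log(summarize(mockResponse)); // prints "female, age 65"
```

Guarding the empty-array case matters in practice: if the API finds no face in the image, `response[0]` would be `undefined` and the tool would throw.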
“The FaceApp should be used for fun only and people shouldn’t rely on the after photo as to what they will look like when they get older.”
Debra Jaliman, MD, a New York City-based dermatologist and author of Skin Rules
Aging depends on a variety of factors that can’t be measured or predicted by an app, including your genetics, how much time you spend in the sun, how you protect your skin from UV rays, whether you smoke, and how you sleep and eat.
Resources
- Free Azure Subscription
- Face API
- Face API Documentation
- face-api-demo on GitHub
- FaceApp
- Home Alone
- Home Alone 2: Lost in New York
- FaceApp Is Cool and All, But Can It Really Predict How You’ll Age?
Summary
I hope you enjoyed this blog post as we played around with A.I. and debunked FaceApp, while having fun doing so with a bit of Christmas and New Year spirit! 🎅