Facial recognition and biometric data processing: we are in debt

July 8, 2020

I wish more of us saw the risks to which we are exposed every day by an obsolete and ineffective law. We advocate that urgency be given to the bill amending Law 19,628 so that it expressly regulates the processing of biometric data.

Macarena Gatica

Senior Associate

Alessandri

Once again I am surprised, this time more than a year later, to see on social networks and in WhatsApp groups pictures of friends, family, and strangers using FaceApp. This time it is no longer to look younger or older, but to look female or male.

A year ago I wrote about this same topic, and I thought that today few would return to the same application, considering what has happened with facial recognition and the greater sensitivity we should have about the processing of our data in the wake of the pandemic. I was wrong. I will not revisit the risks involved in using this application; instead, I will take the opportunity to discuss biometric data and, specifically, facial recognition.

Last June, events occurred that we never thought we would see. IBM announced that it would no longer offer facial recognition solutions and would not continue related research and development, basing its decision on racial bias and threats to people's privacy. Days later, Amazon and Microsoft followed, announcing that they would bar police forces from using their facial recognition applications. The controversy this technology has generated in the United States, given the racial problems involving the police, is well known.

Technologically speaking, facial recognition was considered by many to be a breakthrough. Some, however, immediately saw its risks: Microsoft researcher Luke Stark called facial recognition "the plutonium of AI."

The risks inherent in facial recognition can be summarized as racial bias and discrimination, on the one hand, and false positives, on the other. We all understand discrimination and the legal protections against it, but what about false positives? The accuracy of facial recognition varies widely, and false positives are more common than false negatives. This can create a security problem, for example, when the technology is used to verify permission to access an application, a phone, or a protected environment, or a problem for a university that monitors class or exam attendance. On the other hand, we have retailers and the police wanting to identify consumers and criminals, respectively, as soon as they enter a commercial establishment: the former to offer them products and services, the latter to prevent them from committing crimes.

What are we doing in Chile with facial recognition? A commercial establishment and a university have publicly and proudly announced that they use facial recognition for different purposes. From a legal standpoint, facial recognition is the biometric processing of certain physiological characteristics of a person's face. Law 19,628 on Privacy Protection says nothing about biometric data; nevertheless, such data could be considered sensitive data under the definition contained in that law, in which case the data subject's express consent would be required to perform facial recognition.

This law lacks enforcement mechanisms, and so in our country this is not treated as an issue: there is no concern about processing or using such data without consent. One more reason for Chile to have adequate data protection soon.