I have always been against the idea that the end justifies the means. So I was very happy when the European Parliament stood up last week to put a stop to the constant advance of facial recognition. With almost 55% of votes in favor, the MEPs who represent us have asked the European Commission for a “prohibition of any processing of biometric data for police purposes that leads to mass surveillance in publicly accessible spaces”.
Although the resolution is not binding, it should influence the advancement of artificial intelligence (AI) regulation in Europe, whose first draft was presented earlier this year. At the time, that text was already criticized for its lack of specificity and for allowing broad exceptions for facial recognition that could give rise to legal loopholes.
Of course, MEPs are not stupid. The resolution admits that the technology could offer “great opportunities to combat crime such as the financing of terrorism and child abuse and exploitation”. But it also warns that its use poses “significant risks to people’s fundamental rights”, and that any general application of AI for mass surveillance would be disproportionate.
It is clear that, for them, the end does not justify the means. As good as the promises of facial recognition and automated biometric data analysis are, their negative impact could be so severe that we should not allow them, even if they work perfectly.
“Technology is very promising if it is developed and used ethically and reliably, but it carries considerable risks to fundamental rights, democracy and the rule of law. If it is faulty, it is faulty, regardless of who uses it and for what purposes. Good intentions do not justify the means,” warned the leader of the initiative, Petar Vitanov, during the session.
He also recalled some scandals that have shown that “AI systems can lead to racially biased and discriminatory results”. For example, the world is awaiting the resolution of a case in the US that could prohibit the Detroit police from continuing to use such a system, after its malfunction led to the wrongful arrest of an innocent citizen.
In Spain, the transposition of the future regulations would require, among other things, dismantling the AI-based identification system already in use on the borders of Ceuta and Melilla. The idea of faster, more effective identification of terrorists sounds great, but what if the technology went wrong and the wrongly accused person were you?
If you are a middle-aged white man, you can rest easy, as facial recognition’s biggest biases typically affect women, people of color, and minorities such as transgender people. But whether we are white, black or blue, we all have the same right to the presumption of innocence and to privacy.
And there is another problem: the more biometric data is collected about us, the easier it becomes to monitor us constantly if the technology is not curbed. The president of Brazil, Jair Bolsonaro, was harshly criticized when, in 2019, he forced federal agencies to share all their data on Brazilian citizens, including genomic sequences, to unify it in one large master database.
In the private sphere, things look even worse, since there are virtually no limits on how companies may use this technology on their own premises, as long as they do not violate data protection laws. In fact, this was precisely why the courts forced Mercadona to shut down its facial recognition system: beyond the system’s potential biases and opacity, the decisive argument was privacy.
Although Europe still lacks robust regulation of artificial intelligence at both the public and private level, it is clear that MEPs are listening to the community, which greeted last week’s resolution with applause. No matter how safe and effective these systems are sold to us as being, the end does not justify the means, and even less so when it involves wrongful imprisonment and unwarranted surveillance.