A former student of Legal Sciences who graduated from the Sant'Anna School and the University of Pisa, with both of which he continues to collaborate, is the only European scholar to have received the 'Future of Privacy Award' bestowed by the 'Future of Privacy Forum'. This American nonprofit organization brings together industry, academics, consumer lawyers, companies and institutions to explore the challenges posed by innovative technology, develop privacy protection, define ethical standards and lead the way to viable commercial practices. Gianclaudio Malgieri, the former student, is now affiliated with the LiderLab at the Dirpolis Institute of the Sant'Anna School, and he has been invited to receive the award on Thursday 6 February at the United States Senate for his paper 'Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations', which he co-authored with Margot Kaminski, a researcher at the University of Colorado.
The paper by Gianclaudio Malgieri and Margot Kaminski was one of five selected from around the world for the 'Future of Privacy Award' (the full name of the award is 'Privacy Papers for Policymakers Award'), and the authors will receive it on Thursday 6 February in Washington. On the same day, the study will also be presented to the Federal Trade Commission (the federal agency in the USA that deals with consumer protection and competition). In the selected paper, the two scholars propose an innovative method for assessing the impact of algorithms in light of the GDPR, the General Data Protection Regulation, opening new possibilities and scenarios in the protection of privacy by combining, for the first time, two tools for protection against the risks of profiling algorithms.
Gianclaudio Malgieri is 27 years old and a former student of Giovanni Comandé, a professor of Comparative Private Law at the Sant'Anna School of Advanced Studies: he graduated from Sant'Anna in 2017, after graduating in Law from the University of Pisa in 2016 with a thesis on the regulation of profiling algorithms. He teaches in the DPO (Data Protection Officer) courses organized by the LiderLab of the Dirpolis Institute at the Sant'Anna School, and collaborates with the chair of Computer Law at the University of Pisa. Gianclaudio Malgieri is now completing a PhD at the VUB in Brussels (Vrije Universiteit Brussel), in the LSTS (Law, Science, Technologies and Society) research group.
"In the study for which I will be given the award," explains Gianclaudio Malgieri, "my co-author and I propose using the data protection impact assessment (DPIA) to explain and justify profiling algorithms, thereby making them more transparent. This is also an advantage for companies, as it increases the correctness and accountability of algorithms through a systematic approach." For example, in the insurance sector, citizens disclose personal information such as their gender and date of birth. On the basis of this information, an algorithm produces an evaluation, predicting and quantifying the risk. "These citizens have the right to understand how this process takes place, to ask for explanations and to be assured that these assessments involve no discriminatory criteria," explains Gianclaudio Malgieri.
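The process described above, in which an algorithm turns personal data into a risk score that the data subject may then ask to have explained, can be illustrated with a toy sketch. This is not the authors' method or any real insurer's model; every factor, weight and function name here is hypothetical, chosen only to show what a per-decision explanation might look like:

```python
# Hypothetical, deliberately simple risk scorer that returns, alongside
# its prediction, the contribution of each input factor. Such a
# per-factor account is one possible form of the "explanation" a data
# subject could request. All weights and factor names are invented.

def score_applicant(applicant: dict) -> tuple[float, dict]:
    """Return a risk score and a per-factor breakdown of how it was built."""
    weights = {"age": -0.02, "claims_last_5y": 0.3}  # hypothetical linear weights
    base = 1.0  # hypothetical baseline risk

    # Record each factor's contribution so the decision can be accounted for.
    contributions = {f: w * applicant[f] for f, w in weights.items()}
    score = base + sum(contributions.values())
    return score, contributions

score, explanation = score_applicant({"age": 40, "claims_last_5y": 2})
# 'explanation' shows how much each factor moved the score up or down,
# which is also where discriminatory criteria would become visible.
```

A transparent breakdown like this is precisely what an opaque profiling model lacks, and auditing such contributions is one way an impact assessment could check for discriminatory criteria.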
In addition to discrimination, another risk that this tool should mitigate is commercial manipulation. There have been cases in the United States of women who were victims of violence receiving advertising for self-defense tools: here the risk of manipulation was high, because the targeting exploited trauma and emotions. Gianclaudio Malgieri and Margot Kaminski combine the right to an explanation of algorithms with the impact assessment, which since 2018 has been a mandatory tool for all data processing that poses a high risk to the fundamental rights and freedoms of the individual. "In practice this means all processing that may involve risks of discrimination, manipulation, loss of control over one's data, economic or psychological damage, and so on," continues Malgieri. "Among these, it is important to consider the large-scale processing of sensitive data, the processing of data from video-surveillance cameras, and automated decisions based on personality assessments."