Data Protection Authority fines university €650,000 for its use of biometric data in remote exams
06/02/2026
- ANECA requires all universities under its purview to comply with the provisions of the AEPD resolution from 2025 onwards.
- In October, ANECA sent a circular to university rectors reminding them of their obligation to comply with the European Artificial Intelligence Regulation.
The Spanish Data Protection Agency (AEPD) has published resolution PS-00067-2024 [0], imposing two substantial fines (€300,000 and €350,000) on a private university in the Valencian Community for the improper use of automated systems based on biometric data for the remote proctoring of official exams. In a press release [1], the AEPD explained the content of the resolution, which states that biometric data must be treated as “special category data” in line with the criteria of the European Data Protection Board. It also established that “consent cannot be considered valid as no real and effective alternative was given to students, since the software used was the only method allowed for taking online exams.”
ANECA, for its part, aims to contribute to improving the quality of the higher education system through the evaluation, certification, and accreditation of courses and institutions. Accordingly, since the Data Protection Agency established its criteria, ANECA has required the universities under its purview to comply with data protection standards when it verifies and accredits their official degrees and centers. All degrees and university centers evaluated by ANECA since then must therefore comply fully with the provisions in this area.
In addition, in October ANECA issued a circular addressed to the rectors of universities that have the Agency as their reference quality agency, reminding them of the full applicability of Regulation (EU) 2024/1689 of the European Parliament and of the Council, which establishes harmonized rules on artificial intelligence. This Regulation classifies a catalog of applications in the field of education and vocational training as “high-risk systems,” including “AI systems intended to be used for monitoring and detecting prohibited behavior by students during exams.” Under Article 27, before deploying a high-risk AI system, the deployer must carry out an assessment of the impact that the use of such a system may have on fundamental rights. In the case of Spain, the university must demonstrate that it has carried out, and registered with the Spanish AI Supervisory Agency (AESIA), a fundamental rights impact assessment (EIDF). Accordingly, ANECA requires proof of registration of this document for the verification of official degrees and centers that make use of tools based on biometric information.
References: