Facial recognition: School ID checks lead to GDPR fine
A watchdog has penalised a local authority for trialling facial recognition on high-school students in Sweden to keep track of attendance.
The Swedish Data Protection Authority (DPA) fined the Skelleftea municipality 200,000 Swedish kronor (£16,800, $20,700) for flouting a privacy law.
The trial involved tracking 22 students over three weeks and detecting when each pupil entered a classroom.
It is the first fine that Sweden has issued under GDPR.
The General Data Protection Regulation, which came into force last year, classes facial images and other biometric information as a special category of data, with added restrictions on its use.
The DPA indicated that the fine would have been bigger had the trial been longer.
According to technology magazine ComputerSweden, the Swedish authorities decided to investigate after reading media reports of the trial at Anderstorp High School.
The local authority told Swedish state broadcaster SVT Nyheter in February that teachers had been spending 17,000 hours a year reporting attendance, and the authority had decided to see whether facial-recognition technology could speed up the process.
'Fairly safe'
The trial, which took place in autumn 2018, had been so successful that the local authority was considering extending it.
Jorgen Malm, who oversees Anderstorp High School and Naturbruk High School for the municipality, told SVT that the technology was "fairly safe".
According to the DPA's ruling, although the school had secured parents' consent to monitor the students, the regulator found that consent was not a legally adequate basis for collecting such sensitive personal data.
The regulator noted that although some parts of the school could be deemed to be "public", students had a certain expectation of privacy when they entered a classroom.
It said attendance could have been recorded in less intrusive ways that did not involve camera surveillance.
As a result, the DPA found that Skelleftea's local authority had unlawfully processed sensitive biometric data and had failed to carry out an adequate impact assessment, which would have included consulting the regulator and gaining prior approval before starting the trial.