The murky world of biometric engines


Thursday, 01 September, 2022

Individuals are unknowingly at risk of becoming victims of deep fake scams or online stalking through seemingly innocuous activities, according to biometric security firm Daltrey. Participating in public events, or merely being a spectator, leaves individuals open to their image being uploaded into a biometric engine, a technology that is vastly more complex and powerful than a simple photo library.

Daltrey's CEO and co-founder Blair Crawford uses the popular annual City2Surf road running event, held in Sydney each August, as an example. According to Crawford, when the roughly 60,000 participants register and attend the event, their photos are taken and uploaded into the German-owned Sportograf facial recognition system.

“Each of the faces in the photos is subjected to a facial recognition system that maps their faces. This is the start of the issue, as people may not be aware that their images are being placed into such a system that is accessible by so many other people with so little protection. In addition, spectators’ images may unknowingly be captured in the background, uploaded and searchable, without the opportunity for them to consent,” Crawford said.

People can then find their own face, or someone who strongly resembles them, by uploading a selfie or another image of the person they are looking for. Once a match for that person is found, the uploaded selfie is erased (as per Sportograf’s privacy policy), but the images in the biometric registration system are retained.
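What Crawford describes is, in general terms, an embedding-based face search: each detected face is reduced to a numeric template, and a query selfie is matched by distance against the stored templates. The sketch below illustrates that general technique only; it is not Sportograf's actual pipeline, and it uses the open-source face_recognition library and hypothetical file paths purely for illustration.

```python
# Minimal sketch of embedding-based face search, for illustration only.
# This is NOT Sportograf's pipeline; paths and thresholds are assumptions.
from pathlib import Path
import face_recognition

# 1. "Registration": map every face found in the event photos to a
#    128-dimensional encoding and remember which photo it came from.
gallery = []  # list of (photo_path, encoding) pairs
for photo_path in Path("event_photos").glob("*.jpg"):
    image = face_recognition.load_image_file(photo_path)
    for encoding in face_recognition.face_encodings(image):
        gallery.append((photo_path, encoding))

# 2. "Search": encode a query selfie and compare it against the gallery.
selfie = face_recognition.load_image_file("selfie.jpg")
selfie_encodings = face_recognition.face_encodings(selfie)
if selfie_encodings:
    query = selfie_encodings[0]
    distances = face_recognition.face_distance(
        [enc for _, enc in gallery], query
    )
    # Smaller distance means more similar; 0.6 is the library's usual cutoff.
    matches = [
        (path, dist)
        for (path, _), dist in zip(gallery, distances)
        if dist < 0.6
    ]
    for path, dist in sorted(matches, key=lambda m: m[1]):
        print(f"{path} (distance {dist:.2f})")

# The query selfie can be deleted after the search, but the gallery of
# encodings built in step 1 persists -- the retention issue Crawford raises.
```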

“It’s possible for a stalker to track someone, for instance a participant or a minor who is captured in the background as a spectator, by accessing the images as they are not secured behind any sort of authentication. The images could be used to create a deep fake of the person, to confirm they were in the location of the event, and furthermore they are accessible anywhere in the world,” Crawford said.

He highlighted that spectators, unlike participants, have neither registered for the event nor agreed to City2Surf's terms and conditions.

“Participants, who have registered and agreed to the terms and conditions, are unlikely to have read the details and fully understood the extent to which they have consented. This raises the key question of how biometric technology is outpacing the community’s understanding of its application, as we have seen recently with the Bunnings example,” he said.

Crawford argues that the responsible use of biometric technology is an imperative.

“Vendors of technology that can impact the security and privacy of people need to think through all potential consequences. Biometric programs must be built on a foundation of consent, where people must opt in based on a clear understanding of the scope and the value to the person opting in.

“In terms of a national framework, there are a lot of standards that already exist to guide the use and applications of biometric technology such as ISO/IEC 24745:2022, which defines the principles of confidentiality, integrity, and privacy protection of biometric information to make the use of biometrics safer. The focus should be on the adoption of these standards to safeguard the integrity of the users’ security and privacy,” he said.

Image credit: iStock.com/metamorworks
