Biometrics are physical characteristics that can be used to digitally identify a person. Examples of such characteristics include fingerprints, faces, retinas, voices, and DNA, all of which can be used for biometric authentication.
How does biometric authentication work?
Healthcare providers and law enforcement have long been interested in biometrics, but as the technology has become sufficiently advanced (and affordable), biometrics is now more commonly used as a convenient way to control access to digital systems like mobile phones, computers, and even buildings.
Businesses are now realizing the potential of biometrics as an additional way to identify people, and many providers of authentication software and identity management software now incorporate biometrics into their products.
Biometric authentication works like a password or code. By using a fingerprint on a special scanner, for example, a user can unlock their device or account instead of typing in a password. This can be quicker and more convenient for the user as there’s no need to remember different passwords for different accounts. Biometrics can also be used as an additional layer of security – also known as multi-factor authentication – on top of traditional methods like passwords and access codes.
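Unlike a password, a biometric check is never an exact comparison: two scans of the same finger are always slightly different, so the system compares a fresh scan against the enrolled template and accepts it only if the two are similar enough. The sketch below illustrates that idea with toy feature vectors and an invented threshold; the function names and numbers are hypothetical, not any vendor's actual matching algorithm.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify(stored_template, live_scan, threshold=0.95):
    """Accept the user only if the live scan is close enough to the
    enrolled template. The comparison is approximate, never exact."""
    return cosine_similarity(stored_template, live_scan) >= threshold

# Toy 4-dimensional "fingerprint" vectors, invented for illustration.
enrolled = [0.9, 0.1, 0.4, 0.7]
fresh_scan = [0.88, 0.12, 0.41, 0.69]   # same finger, slight sensor noise
impostor = [0.1, 0.9, 0.7, 0.2]         # a different finger

print(verify(enrolled, fresh_scan))  # True: within the threshold
print(verify(enrolled, impostor))    # False: rejected
```

The threshold is the key design choice: set it too loose and impostors get in (false accepts); set it too strict and legitimate users get locked out (false rejects).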
41% of respondents have voluntarily given biometric data to private companies
According to a GetApp survey of more than 1,000 Canadians – scroll down to see the full methodology – 41% of respondents have voluntarily given biometric data to a private company that is not related to healthcare. The most common biometric data provided were fingerprints – commonly used to unlock mobile phones – shared by 84% of that group. 64% also said they have given a face scan, and 42% a voice scan.
Overall, fingerprints are the piece of biometric data that respondents would most voluntarily provide to a non-healthcare-related private company. Of the 1,007 survey respondents, 45% would be comfortable sharing a fingerprint, 32% a face scan, and 24% a voice scan.
However, general resistance to providing such data remains relatively high: 36% say they wouldn’t be comfortable sharing any of the given biometric data types, which also includes DNA, as well as iris, hand, and vein scans.
One-third of respondents don’t trust tech companies with their biometric data
Handing biometric data to any other party implies a level of trust that the party will handle that data responsibly. Those surveyed did not have high levels of trust in any third party, although the differences in trust levels between different types of data handlers are illustrative.
50% trust technology companies “somewhat” to properly use and safeguard their biometric data, while 33% don’t trust them at all. 28% don’t trust their employer with biometric data (48% trust them somewhat) and 34% don’t trust government agencies (with 46% trusting them somewhat).
Apple and Microsoft are the most trusted of all “big tech” companies when it comes to biometric data
Consumers will be familiar with the use of biometric authentication from using their fingerprints to unlock a phone. This data usually resides on the device, but users must trust big tech companies – namely Apple for iPhones and Google for Android devices – to keep that data safe.
26% of respondents to the survey say they “highly trust” Apple to properly use and safeguard biometric data. This level of trust is greater than that placed in government agencies, which only 21% of respondents said they trust highly.
Other large technology companies had slightly lower levels of trust amongst respondents:
Canadians overwhelmingly favour transparency and regulation in biometrics
As the use of biometric authentication has become more widespread, questions arise as to what constitutes ethical use. In October 2020, the Office of the Privacy Commissioner of Canada found that mall operator Cadillac Fairview had collected and used facial recognition technology on 5 million people without proper consent, prompting calls for tougher legislation.
Respondents to the GetApp survey overwhelmingly support transparency around the collection and use of biometric data:
- 82% think private companies should not be able to share biometric data with other companies without the subject’s express consent.
- 65% say private companies’ use of biometric data should be regulated by law.
- 78% say consumers should have the right to opt out of facial recognition technology used by private companies.
- 83% say they should have the right to know if a private company is in possession of their biometric data.
- 82% believe they should have the right to request the deletion of their biometric data stored by private companies.
According to Ann Cavoukian of the Global Privacy and Security by Design Centre, Cadillac Fairview’s actions would have resulted in “millions of dollars in fines if it had happened in the United States.”
Under Canadian law, the company was not fined for its actions. But Canadian legislators are now seeking to update privacy laws with new protections for consumers and punishments for misuse of personal data, which may include biometric data.
Misuse of biometric data is a concern for nearly 3 in 4 people surveyed
As highlighted above, the use of biometric data opens many conversations about ethics. These include how and when biometric data is collected, obtaining subjects’ consent, how it is stored, with whom it is shared, and what rights subjects have over their data during these processes.
Respondents show broad concerns regarding the use of their biometric data: 72% are concerned about misuse, 71% about ID theft, 70% about data breaches, and 69% about reduced privacy.
More education on algorithmic bias is needed
Much of the collection of biometric data happens on a large scale, with companies sometimes collecting millions of data points. This means that the use of artificial intelligence algorithms to find interesting patterns within the data sets can be attractive. In the example of Cadillac Fairview, the company analyzed faces to understand the ages and genders of the millions of people visiting its malls rather than to identify individuals.
However, algorithms are not perfect and can exhibit bias when processing data to generate outcomes that may be unfair to some groups of users. These biases have been reported in the media in recent years:
- Tests on three commercial facial recognition systems showed a sharp drop in effectiveness for dark-skinned women compared with white men.
- A study by the US National Institute of Standards and Technology found higher rates of false positives when one-to-one matching images of Asian and African American faces relative to images of Caucasians.
- Twitter apologized in September 2020 after users found that its image-cropping algorithm favoured light faces over dark ones.
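The disparities these reports describe are usually measured as per-group error rates: run the same matcher against pairs of faces from each demographic group and compare how often it wrongly accepts a non-matching pair. The sketch below shows that calculation with entirely invented toy numbers, purely to illustrate the metric; it is not data from any of the studies above.

```python
def false_positive_rate(attempts):
    """attempts: list of (is_genuine_pair, system_said_match) tuples.
    FPR = fraction of impostor (non-matching) pairs wrongly accepted."""
    impostor_accepts = [accepted for genuine, accepted in attempts if not genuine]
    if not impostor_accepts:
        return 0.0
    return sum(impostor_accepts) / len(impostor_accepts)

# Invented outcomes for two hypothetical demographic groups: each entry
# records (was this a genuine pair?, did the matcher accept it?).
results_by_group = {
    "group_a": [(False, True)] * 2 + [(False, False)] * 98,   # 2 bad accepts in 100
    "group_b": [(False, True)] * 10 + [(False, False)] * 90,  # 10 bad accepts in 100
}

for group, attempts in results_by_group.items():
    print(group, false_positive_rate(attempts))
```

A matcher whose false-positive rate differs this much between groups is biased in exactly the sense the NIST study measured, even if its overall accuracy looks high when the groups are averaged together.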
However, not everyone realizes that these biases exist: 43% of the people surveyed are not aware of algorithmic bias and its potential problems. As algorithms become more common in our daily lives, there is much work to do to improve public understanding of how they work, as well as their advantages and shortcomings.
Data for the GetApp Biometric Technology and Password Management Canada Survey 2021 was collected in January 2021. The sample comes from an online survey of 1,007 respondents who live in Canada. Respondents fell into the age groups 18 to 25, 26 to 34, 35 to 49, 50 to 64, and 65 years and above.