
Biometrics and Privacy

Alex Stoianov, Ph.D.
Senior Policy Specialist
Office of the Information and Privacy Commissioner of Ontario

Presentation at the University of Ontario Institute of Technology, November 13, 2014

Presentation Outline
1. Privacy 101: Fair Information Practices
2. Privacy and Biometrics
3. Privacy by Design approach to biometrics
4. Untraceable Biometrics: Biometric Encryption, Cancellable Biometrics
5. Match-on-Card (or Match-on-Device)
6. Case Study: BE in OLG Self-Exclusion Program
7. Case Study: iPhone 5S, Galaxy S5 with a fingerprint sensor

Biometrics
Automated systems that use measurable physical or physiological characteristics or behavioural traits to identify or verify an individual.
Examples: fingerprints, face, iris, finger/palm veins, voice, hand/finger geometry, retina, dynamic signature, keystroke dynamics, gait, palmprints, ECG, DNA (in the future)

[Image slides: examples of biometric modalities (Jain et al., 2008): retina; keystroke dynamics (Jain et al., 2004); finger veins (Hitachi, 2006)]

Biometric Applications
Forensic applications (Criminal
investigation; Crime prevention; Remains
identification; Parenthood determination)
Governmental applications (National ID;
Driver's license; Border crossing; Terrorist
watchlists; Welfare; Casinos; Access
control)
Military applications (Identifying insurgents
in Iraq and Afghanistan war zones using
portable fingerprint, face and iris scan)
Enterprise applications (Access control;
Time/attendance control; Biometric logon)

Biometric Applications (cont'd)

Consumer applications: Smartphones (iPhone 5s, Galaxy S5); Laptops; USB flash drives; Biometric locks; Car ignition
Financial applications: ATM; Payment systems; Account access
Health applications: Access to a health registry; Patient tracking and monitoring; Drug dispensing
Education: Lunch and library entitlement; Prevention of cheating on exams

IPC: Who We Are

Independent agency of government; we oversee three laws: FIPPA, MFIPPA, and PHIPA
Longstanding interest and involvement in privacy, technology, and law/compliance issues
IPC approach: constructive engagement; technologies are both a threat to and an opportunity for privacy; seek pragmatic Privacy by Design, win-win scenarios
IPC website: www.ipc.on.ca

Privacy 101:
Fair Information Practices

Information Privacy Defined

Information Privacy: Data Protection
Freedom of choice; personal control; informational self-determination
Control over the collection, use, and disclosure of any recorded information about an identifiable individual
Privacy principles embodied in Fair Information Practices

Fair Information Practices

Accountability

Identifying Purposes

Consent

Limiting Collection
Limiting Use, Disclosure,
Retention

Accuracy
Safeguards
Openness
Individual Access
Challenging Compliance

CSA Model Code for the Protection of Personal Information (Privacy Code), CAN/CSA-Q830 (1996)
www.csa.ca/standards/privacy/code/

What Privacy is Not

Privacy ≠ Security

Prevailing Model:
Privacy vs. Security: A Zero-Sum Game

13

The famous quote, often attributed to Benjamin Franklin, comes to mind at this point:

"Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety."

Privacy AND Security!


A Positive-Sum Model

15

Privacy-Enhancing Technologies (PETs)

PETs empower individuals to manage their own identities and personally identifiable information (PII).
PETs express fair information principles by:
1. actively engaging the individual in managing and controlling their PII (e.g., consent, accuracy, access, challenging compliance, etc.)
2. minimizing the collection, use, sharing, and retention of PII by others (e.g., limiting purposes, collection, use, retention, etc.)
3. enhancing data security (e.g., safeguards)

Privacy by Design
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default
3. Privacy Embedded into Design
4. Full Functionality: Positive-Sum, not Zero-Sum
5. End-to-End Lifecycle Protection
6. Visibility and Transparency
7. Respect for User Privacy

IPC Biometrics Work

Biometrics Program, Toronto (1994)
Ontario Works Act (1997)
Discussion and guidance papers (1999)
Presentations, speeches, etc. (2000-)
Statement to House of Commons Standing Committee on Citizenship & Immigration (2003)
Resolution of International Data Protection Commissioners (2005)
Member of EAB EABAC (2005-)
Biometric Encryption (BE) paper (2007)
Bell / PerSay / Philips BE for voice recognition (2007-2008)

IPC Biometrics Work (cont'd)

OLG Self-Exclusion BE project (2007-2012)
Bill 85 submission (2008)
Fingerprint Biometrics: Address Privacy Before Deployment (2008 paper)
BE chapter (2009); BE articles for Encyclopaedia of Biometrics (2009) and Encyclopaedia of Cryptography and Security (2010)
Joint paper with Max Snijder, CEO of EAB, on Untraceable Biometrics (2009)
Technical papers on BE: TIC-STH (2009), SPIE (2010)
OLG BE FR Report (2010, 2013)
Review of Policy Research paper on BE (2012)
IPC / OLG / Morpho Report on Match-on-Card for FR with BE (2014)
BE for one-to-many paper (2014)

Privacy and Biometrics

20

Privacy and Biometrics: Concerns

Creation of large centralized databases
Far-reaching consequences of errors in large-scale networked systems
Interoperability invites unintended additional secondary uses
Security risks
Biometric characteristics are unique, irrevocable, and variable!

Privacy and Biometrics: Risks

Function creep
Linkage of the databases
Expanded surveillance, discrimination
Negative impacts of errors, false matches, etc.
Diminished oversight
Absence of individual knowledge or consent
Loss of personal control
Misuse of data (data breach, ID fraud, theft)
Loss of user confidence, acceptance, trust, and use

Consider biometrics in context:

Consumer applications: full user control; low privacy threat, but may not be very secure (a biometric USB flash drive can be lost or stolen)
Fingerprinting children at schools: privacy-invasive (no necessity test; no parental consent and control; no organizational transparency; misleading information) and not secure
Access to nuclear facilities: high security; established necessity
Biometric information can be stolen from a low-security application and used to commit identity theft in a high-security application.

Biometrics & Security: The Risks

Spoofing
Replay attacks
Substitution attack
Tampering
Masquerade attack
Trojan horse attacks
Overriding Yes/No response
Insufficient accuracy

Biometric system integrators, vendors, and consultants may claim:

the stored biometric information is "just a meaningless number," and therefore is not personally identifiable information (PII);
biometric templates stored in a database cannot be linked to other databases because a sophisticated proprietary algorithm is used;
a biometric image cannot be reconstructed from the stored biometric template.

Such claims must be analysed on a case-by-case basis (they are false in most cases).

Fingerprint minutiae

[Image: fingerprint with minutiae marked (Adler, 2004)]

Fingerprint minutiae (cont'd)

Basic minutiae template: store minutiae coordinates, angles, and types (33 minutiae in this example).

Fingerprint minutiae (cont'd)

Minutiae information is a biological characteristic of an individual's finger
Highly sensitive personal information
Cannot be changed, cancelled, or revoked
To obtain the minutiae information from a stored template, it is not necessary to be familiar with a sophisticated proprietary algorithm; only knowledge of the storage format is needed.

Minutiae standards
ANSI INCITS 378-2004, Information Technology - Finger Minutiae Format for Data Interchange, 2004.
ISO/IEC 19794-2:2005, Information Technology - Biometric Data Interchange Formats - Part 2: Finger Minutiae Data, 2005.
Basic minutiae information (coordinates, angles, and types) is always stored; extended data are allowed.
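The point above, that reading a stored template requires only knowledge of the storage format, can be illustrated with a toy serializer for basic minutiae records. The 6-byte big-endian layout below is an illustrative assumption, not the actual INCITS 378 / ISO/IEC 19794-2 binary encoding:

```python
import struct
from dataclasses import dataclass
from enum import IntEnum

# Illustrative record for "basic" minutiae data: coordinates, angle, type.
# The packing format here is assumed for demonstration only.

class MinutiaType(IntEnum):
    OTHER = 0
    RIDGE_ENDING = 1
    BIFURCATION = 2

@dataclass
class Minutia:
    x: int       # pixel coordinate
    y: int
    angle: int   # 0-255, i.e. units of 360/256 degrees
    mtype: MinutiaType

def pack(minutiae):
    """Serialize to bytes: 2 bytes x, 2 bytes y, 1 byte angle, 1 byte type."""
    return b"".join(struct.pack(">HHBB", m.x, m.y, m.angle, m.mtype)
                    for m in minutiae)

def unpack(blob):
    """Anyone who knows the layout can recover every minutia."""
    return [Minutia(x, y, a, MinutiaType(t))
            for x, y, a, t in struct.iter_unpack(">HHBB", blob)]

m = [Minutia(120, 210, 64, MinutiaType.BIFURCATION),
     Minutia(88, 305, 200, MinutiaType.RIDGE_ENDING)]
assert unpack(pack(m)) == m
```

No proprietary matcher is involved: the round trip recovers the full template from the stored bytes.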

The basic minutiae information can uniquely identify an individual!

For a fingerprint containing 36 minutiae, the probability that two fingerprints will falsely match on all 36 minutiae is 5.5x10^-59, and on 12 out of 36 minutiae is 6.1x10^-8 (Pankanti et al., 2002).

Minutiae Interoperability Exchange Test, MINEX04, NIST: the standard minutiae templates generated by one vendor can be matched to the templates generated by another vendor using an algorithm from a third vendor.

Fingerprint reconstruction from a minutiae template (Cappelli et al., 2007)

[Images: minutiae template; original image; reconstructed image]

The reconstructed image is called a "masquerade" image: it is not an exact copy of the original, but it will fool the system in more than 90% of cases
A masquerade image can be injected into the system in digital form (bypassing the fingerprint sensor)
It is possible to create a fake fingerprint and physically submit it to the sensor
The level of interoperability for the minutiae template is increased: the masquerade image can be submitted to any other fingerprint system, even a non-minutiae one

Face reconstruction from a commercial off-the-shelf facial recognition system

Adapted from P. Mohanty, S. Sarkar, and R. Kasturi, "Privacy and security issues related to match scores," in IEEE Workshop on Privacy Research In Vision, CVPRW, 2006.

Privacy by Design approach to biometrics

Untraceable Biometrics: Biometric Encryption; Cancellable Biometrics
Match-on-Card
Matching in the encrypted domain (e.g., via homomorphic encryption)
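The "matching in the encrypted domain" idea can be sketched with a toy additively homomorphic (Paillier) scheme: the server stores only ciphertexts of the template bits, and an encrypted Hamming distance is computed against a fresh plaintext sample. The tiny primes and the protocol shape are assumptions for illustration, not a production design:

```python
import secrets
from math import gcd

# Toy Paillier cryptosystem (g = n + 1 variant). Tiny primes: NOT secure.
p, q = 2003, 2011
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)

def enc(m):
    while True:
        r = secrets.randbelow(n - 1) + 1
        if gcd(r, n) == 1:
            break
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def xor_cipher(c_a, b):
    """Enc(a XOR b) from Enc(a) and a plaintext bit b, via a+b-2ab."""
    return c_a if b == 0 else (enc(1) * pow(c_a, -1, n2)) % n2

template = [secrets.randbelow(2) for _ in range(32)]
stored = [enc(bit) for bit in template]        # server keeps only ciphertexts

fresh = template[:]
fresh[3] ^= 1
fresh[17] ^= 1                                 # two bit errors in the fresh sample

acc = enc(0)
for c_a, b in zip(stored, fresh):
    acc = (acc * xor_cipher(c_a, b)) % n2      # homomorphic addition of XORs
assert dec(acc) == 2                           # encrypted Hamming distance
```

Only the holder of the decryption key ever sees the distance; the party doing the matching sees neither the stored template nor the result in the clear.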

Untraceable Biometrics
Class of emerging technologies that
seek to irreversibly transform the
biometric data provided by the user

35

Untraceable Biometrics
no storage of biometric image or conventional
biometric template;
the original biometric image/template cannot be
recreated from the stored information, i.e. it is
untraceable;
a large number of untraceable templates for the same
biometric can be created for different applications;
the untraceable templates from different applications
cannot be linked;
the untraceable template can be changed or
cancelled.
36

Untraceable Biometrics:
Biometric Encryption (BE)
Cancellable Biometrics (CB)

37

Biometric Encryption (BE)


BE technologies securely bind a digital key to a
biometric, or extract a key from the biometric;
neither the key nor the biometric can be retrieved
from the stored BE template, also called helper
data;
the key is re-created only if a correct biometric
sample is presented on verification;
BE template can be changed or cancelled;
the output of BE verification is either a key (correct
or incorrect) or a failure message
38

Biometric Encryption (BE)

There is always biometric-dependent helper data stored in the system.
In essence, the key is encrypted with the biometric.
This encryption/decryption process is "fuzzy" because of the natural variability of biometric samples.

Synonyms: Biometric Template Protection; Biometric Cryptosystem; Fuzzy Extractor; Secure Sketch; Helper Data Systems; Biometric Locking; Biometric Key Generation; etc.

Principal inventor of BE (1994): Dr. George Tomko, founder of Mytec Technologies Inc., Toronto

Use Biometric as the Encryption Key

Enrollment: a randomly generated key (0101100101) and the biometric template (100110100010...010) derived from the biometric image are input to the BE binding algorithm; the output, the biometrically-encrypted key (110011001011...110), is stored.

Decrypt with Same Biometric

Verification: a fresh biometric image yields a fresh biometric template (101100101010...000); the BE retrieval algorithm combines it with the stored biometrically-encrypted key (110011001011...110) and retrieves the key (0101100101).

Fuzzy Commitment Scheme for Iris (Hao, Anderson, Daugman 2005): Enroll

Iris template, 2048 bits: 100110100010...010
140-bit key: 0101100101...
Map the key to a 2048-bit Error Correcting Code (ECC) codeword: 010101101001...100
XOR the template with the codeword: 110011001011...110
Store the result as the biometrically-encrypted key

Fuzzy Commitment Scheme for Iris (Hao, Anderson, Daugman 2005): Verify

Fresh iris template, 2048 bits: 101100101010...000
Retrieve the biometrically-encrypted key: 110011001011...110
XOR the two: 011111100001...110
If the number of errors is within the ECC bound, the ECC will decode the correct 140-bit key: 0101100101...
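The enroll/verify flow above can be sketched end to end. This toy version shrinks the sizes (8-bit key, 128-bit template) and substitutes a simple repetition code for the Hadamard/Reed-Solomon codes of the actual Hao-Anderson-Daugman system:

```python
import secrets

# Toy fuzzy commitment (Juels & Wattenberg, 1999), the construction behind
# the iris scheme above. Repetition code: each key bit repeated R times;
# majority vote on decode corrects up to R//2 - 1 flips per block.

R = 16                 # repetition factor
KEY_BITS = 8
N = KEY_BITS * R       # template/codeword length in bits

def encode(key):
    """ECC encode: repeat each key bit R times."""
    return [b for b in key for _ in range(R)]

def decode(codeword):
    """ECC decode: majority vote within each R-bit block."""
    return [int(2 * sum(codeword[i*R:(i+1)*R]) > R) for i in range(KEY_BITS)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def enroll(template, key):
    """Helper data = template XOR codeword; neither the key nor the
    template is stored in the clear."""
    return xor(template, encode(key))

def verify(helper, fresh_template):
    """XOR the helper data with a fresh sample; if the sample is close
    enough, the ECC absorbs the bit errors and the key is recovered."""
    return decode(xor(helper, fresh_template))

key = [secrets.randbelow(2) for _ in range(KEY_BITS)]
template = [secrets.randbelow(2) for _ in range(N)]
helper = enroll(template, key)

fresh = template[:]                # fresh sample with ~10% bit errors
for i in range(0, N, 10):
    fresh[i] ^= 1
assert verify(helper, fresh) == key
```

A sample that differs too much (an impostor) overwhelms the ECC bound and yields a wrong key, which is exactly the failure mode the scheme relies on.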

BE technological challenges
Make the number of bit errors for a legitimate user as low as possible (accommodate natural variation of biometrics);
Make the False Acceptance Rate (FAR) as low as possible;
Design a powerful, efficient, and secure Error Correcting Code;
Make the biometrically-encrypted key (also called helper data) resilient against attacks;
Develop BE applications.

Advantages of
Biometric Encryption
BE technologies can enhance both privacy and security:
1. NO retention of biometric image or template
2. Multiple / cancellable / revocable identifiers
3. Improved security of personal data and
communications
4. Greater public confidence, acceptance, and use; greater
compliance with privacy & data protection laws
5. Suitable for large-scale applications
46

Advantages of Biometric Encryption (cont'd)

6. Improved authentication security:
longer, more complex identifiers
no need for user memorization
less susceptible to security attacks:
No substitution attack: nobody knows the key and so cannot create a fake template;
No tampering: biometric features aren't stored;
No Trojan horse attacks: no score is used;
No overriding of the Yes/No response;
More resilient to a masquerade attack.

Core BE technologies

Mytec1 (Tomko et al, 1994, 1995);


Mytec2 (Soutar et al, 1997);
ECC check bits (Davida et al, 1998);
Biometrically hardened passwords (Monrose et al, 1999);
Fuzzy Commitment (Juels and Wattenberg, 1999);
Fuzzy Vault (Juels and Sudan, 2002);
Quantization using correction vector (Lyseggen et al, Duffy
and Jones, 2001; Linnartz and Tuyls, 2003; Buhan et al,
2007);
Several Fuzzy Extractor schemes (Dodis et al, 2004);
BioHashing with key binding (Teoh et al, 2004);
ECC syndrome with graph-based LDPC coding (Martinian et
al, 2005)
48

Possible Applications
and Uses of Biometric Encryption

Biometric ticketing for events;


Biometric boarding cards for air travel;
Identification, credit and loyalty card systems;
Anonymous (untraceable) labeling of sensitive
records (medical, financial);
Consumer biometric payment systems;
Access control to personal computing devices;
Personal encryption products;
Local or remote authentication to access files held by government and various other organizations;
One-to-many systems
49

Cancellable Biometrics
CB technologies apply a secret transform to the
biometric;
the transform can be invertible or not;
both the transformed template and the secret
transform are stored;
on verification, the same transform is applied to a
fresh biometric sample, and two transformed
templates are matched;
the output of CB verification is a Yes/No response.
50
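A minimal sketch of the CB flow described above, assuming a seed-keyed bit permutation as the secret transform (real CB schemes use a variety of, often non-invertible, transforms); issuing a new seed "cancels" the old template:

```python
import hashlib
import hmac
import secrets

# Illustrative cancellable-biometrics transform: permute template bits
# with a secret, seed-derived permutation. The seed is the secret
# transform that must be stored and kept confidential.

def permutation(seed: bytes, n: int):
    """Deterministic permutation of range(n) derived from the seed."""
    keys = [hmac.new(seed, i.to_bytes(4, "big"), hashlib.sha256).digest()
            for i in range(n)]
    return sorted(range(n), key=lambda i: keys[i])

def transform(template, seed):
    p = permutation(seed, len(template))
    return [template[i] for i in p]

def match(stored, fresh, seed, threshold=0.15):
    """Verification: apply the same secret transform to the fresh sample,
    then compare transformed templates (normalized Hamming distance);
    the output is a Yes/No response."""
    t = transform(fresh, seed)
    dist = sum(a != b for a, b in zip(stored, t)) / len(stored)
    return dist <= threshold

seed = secrets.token_bytes(16)
template = [secrets.randbelow(2) for _ in range(256)]
stored = transform(template, seed)             # only the transformed template is stored

fresh = template[:]
for i in range(0, 256, 16):                    # ~6% bit errors from natural variation
    fresh[i] ^= 1
assert match(stored, fresh, seed)
```

Templates issued under different seeds are effectively unlinkable here, and revocation is just re-enrollment with a fresh seed; the 0.15 threshold is an arbitrary illustrative choice.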

Cancellable Biometrics

51

Differences between BE and CB

1. CB: the distorting transform must be kept secret.
   BE: the cryptographic key is not kept at all.
2. CB: the fresh distorted template is compared against the stored distorted template.
   BE: the undistorted biometric is compared against the stored biometrically-encrypted key.
3. CB: binary Yes/No response on verification.
   BE: the verification output is either a key or a failure message.

Differences between BE and CB (cont'd)

4. CB: closer to a conventional biometric system.
   BE: can be integrated with a conventional cryptosystem.
5. CB: can be attacked by overriding the Yes/No response, and by a substitution attack.
   BE: is immune against those attacks.
(BE can also have an optional transform)

Match-on-Card (or Match-on-Device)

Fingerprint authentication within the Secure Element (SE) (Morpho)

Match-on-Card with BE

55

Case Study: BE in OLG Self-Exclusion Program

Self-excluded individuals: more than 15,000 in Ontario and growing
How to reliably detect those who attempt to enter a gaming site? (Manual comparison does not work)
Privacy of all casino patrons must be protected
Solution: facial recognition in a watch-list scenario with Biometric Encryption
Novel "Made in Ontario" Privacy-by-Design application: collaboration of OLG, IPC, UofT, and iView Systems

Facial Recognition with Biometric Encryption

Biometric Encryption (BE): securely bind a person's identifier (a pointer to personal information) with facial biometrics
The pointer is retrieved only if a correct (i.e., self-excluded) person is present
The link between facial templates and personal information is controlled by BE
Final comparison is done manually
Privacy of both the general public and self-excluded people is protected

BE in a watch-list scenario for the OLG Self-Exclusion Program

58

The outcome
Live field test at Woodbine facilities: Correct Identification Rate (CIR) is 91% without BE and 90% with BE, a negligible accuracy impact
BE reduces the False Acceptance Rate (FAR) by up to 50%, a huge accuracy improvement!
Accuracy exceeds the state of the art for facial recognition
Triple win: privacy, security, and accuracy (unexpected) all improved!
Deployed in most Ontario gaming sites by 2012

Case Study: iPhone 5S, Galaxy S5 with a fingerprint sensor

[Images: iPhone 5S; Galaxy S5]

iPhone 5S features
the fingerprint is used for log-on and for payments in iTunes and the Apple store;
equipped with an AuthenTec TouchID capacitive RF sensor;
the sensor is located under the Home button and covered with a thin sapphire disk;
there is a metal ring around the Home button, which is an RF antenna/electrode for the AuthenTec sensor;
there is 360-degree readability;
it is claimed that there is liveness detection because RF waves penetrate the sub-epidermal layer, and even that a dead finger wouldn't work - FALSE;

iPhone 5S features (cont'd)

the use of a fingerprint is not mandatory for the iPhone; a password can be used instead;
it is claimed that a fingerprint image is not stored; instead, a template is generated;
the template is stored in a Secure Enclave and is encrypted;
up to 5 templates can be stored; they can represent different fingers of the owner or the fingerprints of other users;
the verification is 1:1 or one-to-few (where "few" equals 5), but not 1:many;
(supposedly) no biometric information is sent to the server;
(supposedly) the template is not minutiae-based and, therefore, is not compatible with the FBI's AFIS.

IPC Publications
Ann Cavoukian and Alex Stoianov, Biometric Encryption: A Positive-Sum Technology that Achieves Strong Authentication, Security AND Privacy (March 2007), at www.ipc.on.ca/images/Resources/up-1bio_encryp.pdf
Ann Cavoukian, Alex Stoianov, and Fred Carter, Biometric Encryption: Technology for Strong Authentication, Security AND Privacy. In IFIP, Policies and Research in Identity Management; Eds. E. de Leeuw, Fischer-Hübner, S., Tseng, J., Borking, J.; (Boston: Springer), v. 261, pp. 57-77, 2008.
A. Cavoukian and A. Stoianov, Biometric Encryption: The New Breed of Untraceable Biometrics. Chapter in Boulgouris, N. V., Plataniotis, K. N., Micheli-Tzanakou, E., eds.: Biometrics: Fundamentals, Theory, and Systems. Wiley, London (2009).
Ann Cavoukian and Alex Stoianov, Biometric Encryption. In Encyclopedia of Biometrics. Springer, 2009. http://www.ipc.on.ca/images/Resources/bio-encryptchp.pdf
Ann Cavoukian and Max Snijder, The Relevance of Untraceable Biometrics and Biometric Encryption: A Discussion of Biometrics for Authentication Purposes. http://www.ipc.on.ca/images/Resources/untraceable-be.pdf
Ann Cavoukian, Fingerprint Biometrics: Address Privacy Before Deployment (2008). http://www.ipc.on.ca/images/Resources/fingerprint-biosys-priv.pdf

IPC Publications (cont'd)

A. Stoianov, Cryptographically secure biometrics. Proc. SPIE, Vol. 7667, pp. 76670C-1 - 76670C-12 (2010).
A. Cavoukian, M. Chibba, and A. Stoianov, Advances in Biometric Encryption: Taking Privacy by Design from Academic Research to Deployment. Review of Policy Research, V. 29, Issue 1, pp. 37-61 (2012).
A. Cavoukian, T. Marinelli, A. Stoianov, K. Martin, K. N. Plataniotis, M. Chibba, L. DeSouza, S. Frederiksen, Biometric Encryption: Creating a Privacy-Preserving Watch-List Facial Recognition System. In: Security and Privacy in Biometrics, P. Campisi (ed.), Ch. 9, pp. 215-238. Springer-Verlag London, 2013.
A. Cavoukian, M. Chibba, A. Stoianov, T. Marinelli, K. Peltsch, H. Chabanne, O. Beiler, J. Bringer, and V. Despiegel, Facial Recognition with Biometric Encryption in Match-on-Card Architecture for Gaming and Other Computer Applications (Feasibility Study). IPC (June 30, 2014). http://www.ipc.on.ca/images/Resources/pbd-facial-recognition-biometric.pdf
A. Cavoukian and A. Stoianov, Privacy by Design Solutions for Biometric One-to-Many Identification Systems. IPC Technical Report (June 20, 2014). http://www.privacybydesign.ca/content/uploads/2014/06/pbd-solutionsbiometric.pdf

How to Contact Us
Information and Privacy Commissioner of Ontario
2 Bloor Street East, Suite 1400
Toronto, Ontario, Canada
M4W 1A8
Phone: (416) 326-3333 / 1-800-387-0073
Web: www.ipc.on.ca
E-mail: info@ipc.on.ca

Additional Slides

66

Ontario Works Act (1997)


1. The biometric must be stored in encrypted form
both on any card and in any database;
2. The encrypted biometric cannot be used as a
unique identifier;
3. The original biometric must be destroyed upon
encryption;
4. The stored encrypted biometric can only be
transmitted in encrypted form;
5. No program information is to be retained or
associated with the encrypted biometric
information;
6. Biometric information shall be collected openly
and directly from an individual;
67

Ontario Works Act (cont'd)

7. There can be no ability at the technical level to reconstruct or recreate the biometric from its encrypted form;
8. There must be no ability to compare biometric information between databases;
9. There can be no access to the biometric database without a court order or specific warrant.

This legislation was recognized nationally as the privacy standard for the use of biometrics in the public sector.

Vulnerabilities of a biometric system

69
