Biometrics and Privacy

In biometrics, a human being is identified by measuring a set of parameters of the body. Biometric data are said to identify a person based on "who he is", rather than on "what he has" (such as a smartcard) or "what he knows" (such as a password). 

An unresolved issue, however, is that a citizen loses privacy because he must reveal his identifying biometric data to his bank, to the government, to his employer, to the car rental company, to the owner of a discotheque or nightclub, etc. Each of them obtains the same measured data, and unless special precautions are taken there is no guarantee that none of these parties will ever misuse the biometric data to impersonate the citizen and to breach his privacy.

Dishonest verifiers

Following the cryptographic tradition of naming our role players, we say that prover Peggy allows verifier Victor to measure her object, called "Prop."

It is important to study security breaches due to a dishonest Peggy, but also those resulting from an unreliable Victor.

Biometric language Often the distinction is made between identification and verification.

Identification estimates which object is presented by searching for a match in a database of reference data for many objects. Victor does not know a priori whether he sees Prop1 (belonging to Peggy) or Prop2 (belonging to Petra).

Verification, on the other hand, attempts to establish whether the presented object truly is the object Prop that a known prover Peggy claims it to be. Peggy provides not only Prop but also a message in which she claims to be Peggy and which can be linked to Prop, in some direct or implicit way. During verification, Victor is assumed to have some a priori knowledge about Prop in the form of certified reference data, but at the start of the protocol he is not yet sure whether Prop or a fake replacement is present.

Security language In security, we distinguish (possibly against common practice in biometric literature) between verification and authentication. In a typical verification situation, the reference data itself allows a malicious Victor to artificially construct measurement data that will pass the verification test, even if Prop itself has never been available. In authentication, the reference data gives insufficient information to allow Victor to (effectively) construct valid measurement data.

While such protection is not yet mature for biometric authentication, it is common practice with secure computer passwords. When a computer verifies a password, it does not compare the password p typed by the user with a stored reference copy. Instead, the password is processed by a cryptographic one-way function F and the outcome is compared against a locally stored reference string F(p). This prevents an attack from the inside in which the unencrypted or decryptable passwords of the users are stolen.
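As a minimal sketch of this mechanism, the snippet below stores only F(p) and compares hashes at verification time. SHA-256 merely stands in for the one-way function F, and the example password is illustrative; real systems would additionally use a salt and a deliberately slow hash.

```python
# Minimal sketch: the system stores only F(p) for a one-way function F,
# never the plaintext password p. SHA-256 stands in for F here (assumption);
# production systems would also add a salt and a slow, dedicated password hash.
import hashlib

def F(password: str) -> str:
    """Cryptographic one-way function applied to the password."""
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

# Enrollment: only the hashed reference string F(p) is stored.
stored_reference = F("correct horse battery staple")

def verify(typed_password: str) -> bool:
    """Compare F(typed password) against the stored reference F(p)."""
    return F(typed_password) == stored_reference

print(verify("correct horse battery staple"))  # True
print(verify("wrong password"))                # False
```

An insider who copies the stored reference obtains only F(p) and, because F is one-way, cannot efficiently recover p from it.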

Cryptographic operations on noisy data

The main difference between biometric data and passwords is that noise or other aberrations unavoidably occur during measurement. Noisy measurement data must be quantized into discrete values before they can be processed by any cryptographic function. Due to external noise, the outcome of the quantization may differ from experiment to experiment. In particular, if Peggy's physiological parameter takes on a value close to a quantization threshold, a minor amount of noise can change the outcome. Minor changes at the input of a cryptographic function are amplified, and the outcome bears no resemblance to the expected outcome. This effect, known as 'confusion' and 'diffusion', makes it less trivial to use biometric data as input to a cryptographic function. In particular, the comparison of measured data with reference data cannot be executed in the encrypted domain.
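The toy sketch below illustrates this sensitivity; the threshold, parameter values, and noise level are illustrative assumptions, not taken from any of the papers listed further down. A parameter near the quantization threshold may yield a different bit in repeated measurements, and a single flipped bit changes the hash completely.

```python
# Toy illustration of why noisy data and cryptographic functions combine poorly.
# Threshold, parameter values, and noise level are arbitrary assumptions.
import hashlib
import random

def quantize(values, threshold=0.5):
    """Quantize each analog measurement into one bit."""
    return "".join("1" if v >= threshold else "0" for v in values)

def measure(parameters, noise=0.03):
    """Simulate a noisy measurement of the physiological parameters."""
    return [p + random.uniform(-noise, noise) for p in parameters]

true_parameters = [0.82, 0.49, 0.10, 0.67]   # 0.49 lies close to the threshold

bits_enrolled = quantize(true_parameters)
bits_measured = quantize(measure(true_parameters))

# If noise flips the bit of the near-threshold value, the two hashes
# bear no resemblance to each other (confusion and diffusion).
print(bits_enrolled, hashlib.sha256(bits_enrolled.encode()).hexdigest()[:16])
print(bits_measured, hashlib.sha256(bits_measured.encode()).hexdigest()[:16])
```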

Secure storage of templates

Storage of reference data (user templates) and protection of their privacy are well recognized as key concerns in biometric authentication. Preferably, the stored derivative of the measurements should not allow an attacker to construct fake measurement data. It was previously known that enrollment data can be encrypted. However, a security weakness appears when the data needs to be decrypted during authentication.

Before authentication can take place, Prop must have gone through an enrollment phase. During this phase, Peggy and Prop visit a Certification Authority and Prop's parameters are measured. These measurements are processed and stored for later use. In an on-line application, such reference data can be stored in a central (possibly even publicly accessible) database, or the data can be certified with a digital signature of the Certification Authority and given to Peggy. In the latter case, it is Peggy's responsibility to securely give this certified reference data to Victor.
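A rough sketch of such a certified enrollment record is given below. It is only an illustration: the record layout and function names are invented, and an HMAC stands in for the Certification Authority's digital signature, a simplification that would require Victor to hold the CA's secret key; a real deployment would use an asymmetric signature instead.

```python
# Hedged sketch of enrollment: the Certification Authority turns Prop's
# processed measurements into reference data and certifies it, so that Peggy
# can later hand the certified record to Victor. The HMAC used here is only
# a stand-in for a proper digital signature (illustrative simplification).
import hashlib
import hmac
import json

CA_SECRET_KEY = b"certification-authority-key"   # assumption: CA key material

def enroll(identity: str, reference_bits: str) -> dict:
    """Bundle processed reference data with the CA's certificate."""
    record = {"identity": identity, "reference": reference_bits}
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["certificate"] = hmac.new(CA_SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return record

def certificate_is_valid(record: dict) -> bool:
    """Victor's check that the reference data was certified by the CA."""
    payload = json.dumps(
        {"identity": record["identity"], "reference": record["reference"]},
        sort_keys=True,
    ).encode("utf-8")
    expected = hmac.new(CA_SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["certificate"])

peggy_record = enroll("Peggy", "1010")
print(certificate_is_valid(peggy_record))  # True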

 

Attacks by stealing biometric identity We distinguish between two attacks:
  1. Misuse of templates: a dishonest Victor attempts to calculate the parameters of Prop, or to establish the key, without having access to the object. This corresponds to a system operator who attempts to retrieve user passwords from the reference database of strings F(p).
  2. Misuse of measurement data: after having had an opportunity to measure Prop, a dishonest Victor misuses his measurement data. This corresponds to grabbing all keystrokes, including the plaintext passwords typed by a user.

 

 

Papers


 

Jean-Paul Linnartz and Pim Tuyls, "New Shielding Functions to Enhance Privacy and Prevent Misuse of Biometric Templates", 4th International Conference on Audio and Video Based Biometric Person Authentication, Guildford, United Kingdom, 9-11 June 2003.

 

Evgeny Verbitskiy, Pim Tuyls, Dee Denteneer, and Jean-Paul Linnartz, "Reliable Biometric Authentication with Privacy Protection", 24th Benelux W.I.C. Symposium on Information Theory, Werkgemeenschap voor Informatie en Communicatietheorie (www.w-i-c.org), Veldhoven, May 22-23, 2003, pp. 125-131.

 

Frans Willems, Ton Kalker, Jasper Goseling, and Jean-Paul Linnartz, "On the capacity of a biometrical identification system", ISIT 2003, Yokohama, Japan, June 29 - July 4, 2003, paper 590.

 

 

J.P.M.G. Linnartz, “Security with Noisy Data”, keynote speech at IEEE Benelux Information Theory and Signal Processing Chapter symposium on "Security with Noisy Data", Eindhoven, 21 Jan. 2011.

J.P.M.G. Linnartz, “Security with Noisy Data”, Invited talk for IEEE Benelux Meet-the-Fellows Seminar, Leuven, 16 Feb 2011.

 

J.A. de Groot and J.P.M.G. Linnartz, "Improved Privacy Protection in Authentication by Fingerprints", joint WIC/IEEE SP Symposium on Information Theory and Signal Processing in the Benelux, Brussels, Belgium, May 10-11, 2011.


 


J.A. de Groot and J.P.M.G. Linnartz, "Zero Leakage Quantization Scheme for Biometric Verification", 36th International Conference on Acoustics, Speech and Signal Processing, Prague, May 22-27, 2011.


Lingni Ma, J.A. de Groot and J.P.M.G. Linnartz, "Biometric Security Based on ECG", ICT.OPEN 2011 "The interface for Dutch ICT-Research", Veldhoven, Nov. 14-15, 2011.

 

 
