
Best Practices for Privacy and Data Protection for the Processing of Biometric Data

Chapter in: Security and Privacy in Biometrics

Abstract

Self-regulatory initiatives by data controllers can contribute to better enforcement of data protection rules. This is especially important for the use of biometric data in identity management systems, because of the risks of such data being used as unique identifiers and for identification. This chapter explains the Best Practices developed in the Turbine project, which recommend inter alia the creation of multiple trusted, revocable, protected biometric identities that are irreversible and unlinkable.


Notes

  1. European Commission, Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions. A comprehensive approach on personal data protection in the European Union, 4.11.2010, COM(2010) 609 final, p. 12, available at http://ec.europa.eu/justice/news/consulting_public/0006/com_2010_609_en.pdf (‘Commission, Communication. Personal Data Protection, 2010’). All links in this chapter were last visited in February 2011.

  2. ISO/IEC 2382-37:2012 Information technology—Vocabulary—Part 37: Biometrics, 13.12.2012, 28 p.

  3. On the importance of such a common vocabulary, see also E. Kindt, ‘Biometric applications and the data protection legislation. The legal review and the proportionality test’ in Datenschutz und Datensicherheit 2007, p. 167.

  4. The contribution in this chapter represents, however, only the author’s view and is not binding on the Turbine partners, the European Commission or the EDPS.

  5. Article 27(2) Directive 95/46/EC.

  6. Article 27(3) Directive 95/46/EC.

  7. Exceptions exist, e.g., France.

  8. For example, on the European level, in 2009 only two organizations representing a sector had successfully drawn up codes which were validated by the Article 29 Data Protection Working Party: the International Air Transport Association (IATA) and the Federation of European Direct and Interactive Marketing (FEDMA). See N. Robinson, H. Graux, M. Botterman, L. Valeri, Review of the European Data Protection Directive, Cambridge, Rand, 2009, pp. 9–10 and p. 37, available at http://www.ico.gov.uk/upload/documents/library/data_protection/detailed_specialist_guides/review_of_eu_dp_directive.pdf (‘Rand, 2009’).

  9. Rand, 2009, p. 37.

  10. Commission, Communication. Personal Data Protection, 2010, pp. 12–13.

  11. Best practices initiatives relating to biometric data have also been taken in domains other than data protection, for example for testing methodologies or for particular (large-scale) applications. These initiatives are, however, not discussed in this chapter.

  12. International Biometric Group, Best Practices for Privacy-Sympathetic Biometric Deployment, available at http://www.bioprivacy.org/best_practices_main.htm.

  13. A. Albrecht, BioVision. Privacy Best Practices in Deployment of Biometric Systems, BioVision, 28 August 2003, 49 p. (‘BioVision, Best Practices, 2003’); see also A. Albrecht, ‘Privacy best practices’, in Biometric Technology Today, November/December 2003, pp. 8–9.

  14. BioVision, Best Practices, 2003, pp. 5 and 13.

  15. We discuss the meaning of this principle further below in Sect. 14.4.2.

  16. Office of the Privacy Commissioner, Approval of the Biometrics Institute Privacy Code, Australia, 19 July 2006, 24 p., also available at www.Biometricsinstitute.org.

  17. See Preamble of the Biometrics Institute Privacy Code, second consideration.

  18. The review started with the establishment of a Privacy Committee and surveys of its members. The results were presented later in 2008 and 2009.

  19. The inadequacies found included the separation of government and non-government privacy principles, the exemption of small businesses, the media and others from the Act, the variation across jurisdictions, the exemption of employee records from the Act, and the fact that Privacy Impact Assessments and audits were not mandatory under the Act while they were under the Code.

  20. See Revocation of the Biometrics Institute Privacy Code, available at http://www.comlaw.gov.au/Details/F2012L00869.

  21. TrUsted Revocable Biometric IdeNtitiEs project (TURBINE) (IST-2007-216339) (2008–2011) (7th Framework Programme), with homepage at www.turbine-project.eu (‘Turbine’). About the technology developed in Turbine, see also the contribution of J. Bringer and H. Chabanne in Chap. 11 in this book.

  22. For the full text of the Best Practices, see Turbine, D.1.4.3 Practical Guidelines for the privacy friendly processing of biometric data for identity verification, available at http://www.turbine-project.eu/index.php.

  23. Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data [CETS No. 108] (T-PD), Progress report on the application of the principles of Convention 108 to the collection and processing of biometric data, Strasbourg, Council of Europe, CM(2005)43, March 2005, 22 p.

  24. In some cases, the identity provider and the service provider may be one and the same entity.

  25. See and compare also, e.g., with the U.S. federal privacy requirement in the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and its regulations to implement technical policies and procedures that allow only authorized persons to access electronic protected health information.

  26. Verification is therefore also referred to as a ‘one-to-one comparison’ (1:1 comparison).

  27. This is also referred to as a ‘one-to-many comparison’ (1:n comparison). A minimal sketch contrasting the two types of comparison follows these notes.

  28. For example, if a sector were entitled by law and under specific conditions to keep ‘black lists’ (e.g., the insurance sector). ‘Black lists’ excluding individuals from access rights or practical services require in several countries explicit legal provisions authorizing the use of such lists, in particular because such lists may imply some form of discrimination.

  29. For example, because the central storage would be more convenient for the user and the biometric characteristic does not allow the use of the identification functionality (e.g., hand geometry). Compare, e.g., with the Unique Authorization No AU-007 of 27 April 2006 by the French DPA, the CNIL, for biometric systems based on hand geometry verification for access control, management of time and attendance and of the canteen in the workplace in France.

  30. Presently, data subjects have information, access and correction rights, and the right to object under specific conditions. They also have the right to freely refuse consent.

  31. For example, the French DPA, the CNIL, has warned since 2000 against the central storage of biometric data, especially fingerprints, and has developed a position that biometric identifiers shall in principle not be stored centrally but locally. See also, for a similar position, the DPAs of Greece and Belgium. See also the report R. Hes, T. Hooghiemstra and J. Borking, At Face Value. On Biometrical Identification and Privacy, Achtergrondstudies en Verkenningen 15, The Hague, Registratiekamer, September 1999, p. 52, issued by the Dutch DPA. Compare, however, with CNIL, Communication de la CNIL relative à la mise en œuvre de dispositifs de reconnaissance par empreinte digitale avec stockage dans une base de données, 28 December 2007, 12 p.

  32. Article 29 Data Protection Working Party and the Working Party on Police and Justice, The Future of Privacy. Joint contribution to the Consultation of the European Commission on the legal framework for the fundamental right to protection of privacy, WP 168, 1 December 2009, p. 14.

  33. PRIME, PRIME White Paper, v3.0, 2008, 19 p., available at https://www.prime-project.eu/prime_products/whitepaper/PRIME-Whitepaper-V3.pdf.

  34. Additional aspects of the multiple identities are set forth in Best Practice No 4 (revocability) and Best Practice No 7 (irreversible and unlinkable across contexts) discussed below. About the architecture using pseudo identities in the project, see J. Breebaart, C. Busch, J. Grave and E. Kindt, ‘A reference architecture for biometric template protection based on pseudo identities’, in A. Brömme (ed.), Proceedings of the Special Interest Group on Biometrics and Electronic Signatures, Bonn, Gesellschaft für Informatik, 2008, pp. 25–37. A grossly simplified sketch of such unlinkable pseudo identities follows these notes.

  35. See Sect. 3a German Federal Data Protection Act.

  36. Article 29 Working Party, Working Document on on-line authentication services, WP 68, 29 January 2003, p. 15, available at http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2003/wp68_en.pdf.

  37. See, e.g., for the importance of this aspect for the issuance of biometric passports, EDPS, Opinion of 26 March 2008 on the proposal for a Regulation of the European Parliament and of the Council amending Council Regulation No. 2252/2004, O.J. C 200, 6.08.2008.

  38. See and compare, e.g., Biometrics Institute Privacy Code, 2006, Sect. F.11.4.

  39. See on this risk, e.g., M. Meints & M. Hansen, ‘Additional and in some cases health related information in biometrics’, in E. Kindt and L. Müller (eds.), D.3.10. Biometrics in identity management, Frankfurt, FIDIS, 2007, pp. 83–86, available at http://www.fidis.net/resources/deliverables/hightechid/#c2057.

  40. See and compare also with earlier recommendations of the Committee of experts on data protection (CJ-DP), The introduction and use of personal identification numbers: the data protection issues, Council of Europe, 1991, pp. 15–17, available at http://www.coe.int/t/dghl/standardsetting/dataprotection/Reports/Pins_1991_en.pdf.

  41. See J. Breebaart, C. Busch, J. Grave and E. Kindt, ‘A reference architecture for biometric template protection based on pseudo identities’, in A. Brömme (ed.), Proceedings of the Special Interest Group on Biometrics and Electronic Signatures, Bonn, Gesellschaft für Informatik, 2008, pp. 25–37.

  42. See J. Breebaart, B. Yang, I. Buhan-Dulman, Ch. Busch, ‘Biometric Template Protection. The need for open standards’ in Datenschutz und Datensicherheit 2009, pp. 299–304.

  43. The data protection legislation of only a few countries contains specific provisions relating to the linking of information, e.g., Slovenia.

  44. See, e.g., Article 29 Working Party, Working Document on on-line authentication services, WP 68, 29 January 2003, p. 12.

  45. Article 29 Working Party, Working Document on Biometrics, WP 80, 1 August 2003, p. 10.

  46. In particular, in Ontario, Canada, the Social Assistance Reform Act of 1997 (later revoked) and the Ontario Works Act of 1997 (Article 75).

  47. See also, about the use of ‘best available techniques’ as one of the recommendations for privacy and data protection in the Union, ENISA Ad Hoc Working Group on Privacy & Technology, Technology-Induced challenges in Privacy & Data Protection in Europe, M. Langheinrich and M. Roussopoulos (eds.), October 2008, pp. 9 and 35–36, available at http://www.enisa.europa.eu/doc/pdf/deliverables/enisa_privacy_wg_report.pdf.

  48. This should not be confused with what some refer to as ‘anonymous biometric data’. The latter is in our view strictly speaking a contradictio in terminis, since all biometric data refer and relate to an individual, whether directly identifiable or not.

  49. For example, the German Federal Data Protection Act explicitly states as a general principle that ‘data processing systems are to be designed and selected in accordance with the aim of collecting, processing or using no personal data or as little personal data as possible (…)’ (Sect. 3a).

  50. For example, the Belgian DPA. See CBPL, Advice No 17/2008 of 9 April 2008 upon own initiative relating to the processing of biometric data for the authentication of persons, No 77 (‘CBPL, Advice No 17/2008’). See and compare also with the Biometrics Institute Privacy Code which promotes anonymity (Article 8).

  51. The current information obligation includes inter alia the obligation to inform about the identity of the controller, the purposes, the recipients of the information and the access and correction right of the data subject as specified in the applicable national data protection legislations.

  52. See also CBPL, Advice No 17/2008, No 79. The need for transparency and agreement on failure-to-enrol (FTE) and false rejection rates (FRR) has also been recognized repeatedly in public sector applications, such as for the use of biometric passports.

  53. Increasing the number of attempts may already address various failures in a simple way. However, this will affect the security provided by the system. Moreover, it may not always solve the issue, and additional fall-back procedures will remain required.

  54. See and compare, e.g., with the new Article 1a introduced by Regulation (EC) No 444/2009 of the European Parliament and of the Council of 28 May 2009 amending Council Regulation (EC) No 2252/2004 on biometric passports and travel documents.

  55. See also Art. 17(1) of Directive 95/46/EC.

  56. See, e.g., ISO 19092:2008 for a concise overview of infrastructure requirements.

  57. See also Art. 17(1) §2 of Directive 95/46/EC.

  58. See, e.g., for Belgium, CBPL, Reference measures for the security of every processing of personal data, 4 p., available at http://www.privacycommission.be/nl/static/pdf/referenciemaatregelen-vs-01.pdf.

  59. See, for example, the Independent Centre for Privacy Protection Schleswig-Holstein (ICPP/ULD), Germany, which leads the EuroPriSe consortium. See also the CNIL, which joined the French governmental institute AFNOR, with the goal to be heard in domains such as biometrics (CNIL, 30 ans au service des libertés. 29e rapport d’activité, p. 52); the ENISA Ad Hoc Working Group on Privacy & Technology also reiterated the benefits of certification in its report on ‘Technology-Induced challenges in Privacy & Data Protection in Europe’.

  60. Commission, Communication. Personal Data Protection, 2010, pp. 12–13.

  61. EDPS, Opinion of 1.02.2011 on a research project funded by the European Union under the 7th Framework Programme (FP7) for Research and Technology Development (Turbine (TrUsted Revocable Biometric IdeNtitiEs)), available at http://www.edps.europa.eu/EDPSWEB/edps/cache/off/Consultation/OpinionsC/OC2011 (‘EDPS, Turbine Opinion, 2011’).

  62. See EDPS, Turbine Opinion, 2011, §34–37.

  63. EDPS, Turbine Opinion, 2011, §67 and §69.

  64. See EDPS, Turbine Opinion, 2011, §16.

  65. International Biometric Group, BioPrivacy Technology Risk Rating, available at www.bioprivacy.org.

  66. PETs have been considered for some time as necessary in preserving the privacy of individuals in networks (see, e.g., EU Commission, Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), COM(2007) 228 final, 10 p.), and as having an important role in biometric systems (see also R. Hes, T.F.M. Hooghiemstra and J.J. Borking, At Face Value. On Biometrical Identification and Privacy, The Hague, Registratiekamer, September 1999, 74 p.). We elaborate on this further below.

  67. See, for an overview of the ‘state of the art’ in template protection, several presentations given by experts internal and external to the Turbine project from research institutions and companies during the Turbine final public workshop on 17–18 January 2011, available at http://www.turbine-project.eu/workshop_presentations.php; on these new techniques, see also C. Busch and H. Reimer, ‘Biometrie in neuem Licht?’, Datenschutz und Datensicherheit, 2009, p. 271 and the several contributions on template protection in this issue, and J. Grijpink, ‘Trend report on biometrics: Some new insights, experiences and developments’, Computer Law & Security Report 2008, pp. 261–264.

  68. EDPS, Turbine Opinion, 2011, §20 and §25.

  69. In the Turbine project, an FRR of less than 1 % was aimed at, with an FAR of 0.1 %. According to several project reports, this goal has been met. For the specific results, see C. Busch, D. Gafurov, B. Yang and P. Bours, Turbine performance evaluation. From Benchmarks to Airport Deployment, 17 Jan. 2011, available at http://www.turbine-project.eu/workshop_presentations.php and the various publications of Turbine partners about the technology listed on the Turbine website. A toy example of how these rates are computed follows these notes.

  70. See, in detail about this standard, which has in the meantime been adopted, J. Breebaart, B. Yang, I. Buhan-Dulman, Ch. Busch, ‘Biometric Template Protection. The need for open standards’ in Datenschutz und Datensicherheit 2009, pp. 299–304. It is clear, however, that additional work is required to determine benchmarks and metrics for these technologies, so that solutions incorporating these aspects of template protection can be tested and compared.

  71. See, e.g., CBP, Uitgangspunten voor Privacy by Design, available at the webpages of the theme ‘privacy by design’ of the Dutch DPA, at http://www.cbpweb.nl/themadossiers/th_pbd_uitgangspunten.shtml.

  72. European Commission, Communication on ‘A Digital Agenda for Europe’, COM(2010) 245, p. 17, footnote 21.

  73. See CBP, Privacy by Design zelf toepassen, available at the webpages of the theme ‘privacy by design’ of the Dutch DPA, at http://www.cbpweb.nl/themadossiers/th_pbd_praktijk.shtml.

  74. See Information and Privacy Commissioner of Ontario, Privacy by Design. The 7 Foundational Principles, available at http://www.privacybydesign.ca/background.htm.

  75. Ibid. See, e.g., for a detailed description of the implementation of the concept, in combination with the use of protected templates for face recognition in a real-life environment (for limiting ‘self-excluded’ problem gamblers’ access to gaming venues) together with a watch list, A. Cavoukian and T. Marinelli, Privacy-Protective Facial Recognition: Biometric Encryption. Proof of Concept, Information and Privacy Commissioner Ontario, Ontario Lottery and Gaming Corporation, November 2010, 16 p., available at www.ipc.on.ca. See also Chap. 9 in this book.

  76. European Commission, Communication from the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), COM(2007) 228 final, p. 3. The research in PETs often relates to identity management systems.

  77. See J.-M. Dinant, ‘Chap. 5. The Concepts of Identity and Identifiability: Legal and Technical Deadlocks for Protecting Human Beings in the Information Society’, in S. Gutwirth, Y. Poullet, P. De Hert, C. de Terwangne, S. Nouwt (eds.), Reinventing Data Protection?, Springer, 2009, (111), pp. 118–119.

  78. See also E. Kindt, ‘The use of privacy enhancing technologies for biometric systems analyzed from a legal perspective’, in M. Bezzi et al. (eds.), Privacy and Identity, IFIP International Federation for Information Processing AICT 320, 2010, pp. 134–145.

  79. The Commission stated it in its Communication on PETs of 2007 as follows: ‘The use of PETs can help to design information and communication systems and services in a way that minimizes the collection and use of personal data and facilitate compliance with data protection rules’ (emphasis added).

  80. See and compare also, e.g., with recital 46 and Article 14(3) of Directive 2002/58/EC.

  81. See and compare, e.g., with the use of biometric data as an identifier in legislation setting up large-scale systems at Union level, such as Eurodac.

  82. Article 8(7) Directive 95/46/EC.
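
Illustrative sketches

To make the distinction drawn in notes 26 and 27 concrete, the following minimal Python sketch contrasts a 1:1 verification with a 1:n identification. It is an illustration only: the similarity function, the 0.8 threshold and the toy template store are assumptions of this sketch, not part of any system discussed in this chapter.

    # Illustration only: hypothetical similarity measure and threshold.
    def match_score(sample, template):
        # Stand-in comparator returning a similarity in [0, 1]; real systems
        # use algorithm-specific comparison of biometric feature vectors.
        return 1.0 - abs(sample - template)

    THRESHOLD = 0.8  # assumed decision threshold

    def verify(sample, claimed_id, templates):
        # 1:1 comparison (note 26): the sample is compared only against the
        # template enrolled under the identity the subject claims.
        return match_score(sample, templates[claimed_id]) >= THRESHOLD

    def identify(sample, templates):
        # 1:n comparison (note 27): the sample is compared against every
        # enrolled template to establish whose it is.
        return [pid for pid, tpl in templates.items()
                if match_score(sample, tpl) >= THRESHOLD]

    templates = {"alice": 0.90, "bob": 0.30}  # toy enrolment database
    print(verify(0.88, "alice", templates))   # True: the claim checks out
    print(identify(0.88, templates))          # ['alice']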
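
The multiple revocable, irreversible and unlinkable protected identities recommended by the Best Practices (see notes 34, 41 and 42) can be loosely illustrated with a keyed one-way transform. The sketch below is a gross simplification under stated assumptions: it presumes an exactly reproducible feature encoding, whereas real biometric template protection must tolerate measurement noise (e.g., via helper data), and all names in it are hypothetical; it is not the Turbine scheme.

    # Grossly simplified illustration, not the Turbine scheme: an HMAC only
    # works for exactly reproducible inputs, while real template protection
    # must tolerate noisy biometric samples.
    import hashlib
    import hmac
    import os

    def pseudo_identity(reference_feature: bytes, app_secret: bytes) -> str:
        # Keyed one-way transform: irreversible, and different application
        # secrets yield unlinkable identities for the same person.
        return hmac.new(app_secret, reference_feature, hashlib.sha256).hexdigest()

    feature = b"stable-feature-encoding"        # hypothetical exact encoding
    bank, gym = os.urandom(16), os.urandom(16)  # per-application secrets

    # Unlinkable across contexts: the same person gets distinct identities.
    assert pseudo_identity(feature, bank) != pseudo_identity(feature, gym)
    # Revocable: replacing an application's secret and re-enrolling yields a
    # fresh identity while the underlying biometric remains unchanged.
    assert pseudo_identity(feature, os.urandom(16)) != pseudo_identity(feature, bank)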
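
Note 69 cites the Turbine performance target of an FRR below 1 % at an FAR of 0.1 %. As a reminder of what these rates measure, the toy computation below estimates them from labelled comparison scores; the score values are invented for illustration and are not project results.

    # How FAR and FRR are estimated from comparison scores (toy data only).
    def far_frr(genuine_scores, impostor_scores, threshold):
        # FRR: fraction of genuine (same-person) comparisons wrongly rejected.
        frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
        # FAR: fraction of impostor (different-person) comparisons wrongly accepted.
        far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
        return far, frr

    genuine = [0.91, 0.85, 0.78, 0.95]   # invented same-person scores
    impostor = [0.20, 0.35, 0.81, 0.10]  # invented different-person scores
    far, frr = far_frr(genuine, impostor, threshold=0.8)
    print(f"FAR={far:.2%}, FRR={frr:.2%}")  # FAR=25.00%, FRR=25.00%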

Author information

Correspondence to Els Kindt.


Copyright information

© 2013 Springer-Verlag London

About this chapter

Cite this chapter

Kindt, E. (2013). Best Practices for Privacy and Data Protection for the Processing of Biometric Data. In: Campisi, P. (ed.) Security and Privacy in Biometrics. Springer, London. https://doi.org/10.1007/978-1-4471-5230-9_14

  • DOI: https://doi.org/10.1007/978-1-4471-5230-9_14

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-5229-3

  • Online ISBN: 978-1-4471-5230-9

  • eBook Packages: Computer Science, Computer Science (R0)
