7
Development of New Technologies
The Working Party has commented that “[e]ach new phase of technological development presents a challenge for the protection of personal data and the right to privacy.”1 That statement was made in 1999, when it referred to the use of the Internet as a new phase of technological development. EU data protection law, as embodied by the Directive, is meant to be technologically neutral (as will also be the case with the proposed GDPR, if and when it is adopted). That is, its provisions apply regardless of the technology used to process personal data. This approach has the advantage of allowing current legislation to be applied to technologies developed after its adoption. Many new technologies have emerged since the Directive’s adoption in 1995; however, determining how the Directive applies to those technologies in practice can still be a struggle. The Working Party has consistently, albeit not always clearly, issued guidance contributing to the understanding of how existing law applies to such technologies.
Both biometrics and GPS technology have been discussed previously in the
employment relationship context.2 However, we will now turn to new technologies
generally and present some of the practical concerns and issues at a high level.
I. BIOMETRICS AND FACIAL RECOGNITION
The Directive does not specically refer to biometrics. However, the French Data
Protection Agency—the CNIL—has provided information on biometrics in the con-
text of the Directive at least since 1999 and has translated into English part of its annual
report for 2001 on the issue.3 The CNIL favors the use of biometric technologies that
are “not based on storing the templates in a database,” such as using a microchip card
that is kept by the data subject or devices such as a mobile phone or computer for
which he or she has exclusive use. It also warns that a biometrics element that “leaves
traces” such as DNA or ngerprints may have an incidence on liberties and privacy,
1. Working Party, Working Document: Processing of Personal Data on the Internet (WP 16) (Feb. 23, 1999),
at 2, http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/les/
1999/wp16_en.pdf.
2. See Chapter 4 Section II.E.
3. CNIL, 22nd Annual Report for 2001, Extracts of Chapter 3 “Current Debates”: A Century of Biometrics
(July 2002), http://www.cnil.fr/leadmin/documents/en/AR-22-biometrics_VA.pdf.
woo51396_07_c07_173-206.indd 173 12/7/15 8:50 AM
174
Navigating E U Privacy and Data Protec tion Laws
but that an element “leaving no trace,” such as the outline of a hand, the retina, and
voice recognition technologies “should be preferred,” absent a “particular imperative
requirement of security.”4
A. Working Party Guidance
Moreover, the Working Party issued a working document on biometrics as early as 2003.5 The Working Party highlighted the “special nature” of biometric data in that it relates to “the behavioural and physiological characteristics of an individual and may allow his or her unique identification.”6 The Working Party set out the uses for biometrics at that time: automated authentication/verification and identification, especially for entry control, whether the space is a physical one or a virtual one. It also enumerated some of the then-current forms of biometrics—such as DNA and fingerprint testing—and warned both about potential reuse of data by third parties for their own purposes if society increases the use of biometric databases for “routine applications,” and about the public becoming desensitized to the data protection risks and to the impact on the data subject’s life due to the widening use of biometrics.7 Biometric identifiers are not always used in isolation; they are often combined for security purposes or may be used with other identifying methods, such as passwords, PIN codes, smart cards, and so on.

The Working Party considered that, in most cases, “measures of biometric identification or their digital translation in a template form . . . are personal data”; however, if they are “stored in a way that no reasonable means can be used by the controller or by any other person to identify the data subject,” they should not be considered personal data.8 Whether or not the Directive applies appears to hinge on this analysis. However, as discussed later in the section on big data, when looking at anonymization, the Working Party generally considers biometric data to be personal data whenever it is possible to identify the individual to whom the data relates. It is worth noting that the Directive’s household use exemption may apply when dealing with applications put purely to domestic use by a natural person (for example, a personal smartphone or computer with a fingerprint reader).
In 2012, the Working Party provided new guidance on biometric technologies.9 There, it highlighted the potential dangers of such processing, due to the close tie to an individual’s personal physical characteristics.10 As a result, in order for such processing to occur legally, it must be legitimate under the Directive, and such data must be properly secured.11

5. Working Party, Working Document on Biometrics (WP 80) (Aug. 1, 2003), http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2003/wp80_en.pdf.
6. Id. at 2.
7. Id.
8. Id. at 5.
9. Working Party, Opinion 3/2012 on the Developments in Biometric Technologies (WP 193) (Apr. 27, 2012), http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2012/wp193_en.pdf.
10. See id. at 3.
11. Id. at 10–14.
B. Complying with the Directive
When dealing with biometric processing, most analyses turn on the proportionality of the collection and processing: if there is a less intrusive way to achieve the purposes, the test fails.12 The Working Party also gives an example of inappropriate reuse of biometric data: if such data are collected for access control, they should not be further used for workplace surveillance or to determine the data subject’s emotional state, for example. In discussing the relevant privacy principles in WP 193, the Working Party highlighted the principles of purpose limitation, proportionality, necessity, and data minimization, stating that these should be kept in mind when the purposes of an application are defined.13
1. Proportionality and Purpose Limitation
In analyzing the application of the principle of proportionality, one will look to (i) whether the biometric system is necessary to fill a need, and not merely convenient or cost efficient; (ii) the likely effectiveness of the biometric system in meeting that need; (iii) whether the benefit of the use of such technology is relatively important compared to the loss of privacy; and (iv) whether there is a less “privacy intrusive” method of fulfilling the need.14 In this analysis, the purpose of the processing will play a major role. Thus, the use of facial recognition or DNA data may be very effective in the identification of serious crime suspects, but it may not be justified in larger-scale use (such as widespread facial recognition without the knowledge of data subjects), because of the potential negative effects on privacy.15
2. Legal Grounds for Processing
a. Performance of a Contract
If necessity for the performance of a contract, the basis provided in Article 7(b), is the legal ground for processing, in general it will apply only when “pure biometric services” are provided (such as submitting hair samples for DNA testing); it “cannot be used to legitimate a secondary service that consists in enrolling a person into a biometric system.”16
b. Controller’s Legitimate Interest
A data controller’s legitimate interest under Article 7(f) of the Directive may be used as the legal ground for processing only when, as may be the case with security for high-risk areas (e.g., access to a laboratory where research on dangerous viruses is
12. Id. at 6.
13. Id. at 7.
14. Id. at 8.
15. Id. at 9.
16. Id. at 12.
