Abstract
Artificial intelligence is increasingly able to autonomously detect suspicious activities (‘smart’ law enforcement). In certain domains, technology already fulfills the task of detecting suspicious activities better than human police officers ever could. In such areas, i.e. if and where smart law enforcement technologies actually work well enough, legislators and law enforcement agencies should consider their use. Unfortunately, the German Constitutional Court, the European Court of Justice, and the US Supreme Court are all struggling to develop convincing and clear-cut guidelines to direct these legislative and administrative considerations. This article attempts to offer such guidance: First, lawmakers need to implement regulatory provisions in order to maintain human accountability if AI-based law enforcement technologies are to be used. Secondly, AI law enforcement should be used, if and where possible, to overcome discriminatory traits in human policing that have plagued some jurisdictions for decades. Finally, given that smart law enforcement promises an ever more effective and even ubiquitous enforcement of the law—a ‘perfect’ rule of law, in that sense—it invites us as democratic societies to decide if, where, and when we might wish to preserve the freedom to disobey the rule(s) of law.
Notes
- 1.
In doing so, this Chapter also complements the more specific analyses provided by Buchholtz, Schemmel, paras 32–46, and Braun Binder, paras 16 et seq., on legal tech, financial market regulation, and tax law enforcement, respectively.
- 2.
For the general definition of artificial intelligence informing this Book cf. Rademacher and Wischmeyer, paras 5–6.
- 3.
Or some other incident requiring state interference, e.g. an attempted suicide or the identification of a person for whom an arrest warrant has been issued.
- 4.
That excludes more ‘traditional’ big data technologies such as breathalyzers, field testing kits, or DNA analyses. This definition is similar to the one proposed by Rich (2016), pp. 891–892, who correctly notes that even though ‘[t]hese traditional technologies can be exceptionally helpful to police in establishing the “historical facts” of what happened’, they cannot analyze on their own ‘groups of disparate facts together and [draw] conclusions about the probability of an individual’s non-compliance’. It is the specific feature of smart law enforcement technologies—called ‘automated suspicion algorithms’ by Rich—that they attempt to apply patterns that are woven with such density that a ‘match’ qualifies, per se, as indicative of suspicious activity (cf. para 29). Furthermore, the definition also excludes so-called ‘impossibility structures’, i.e. technology which not only detects suspicious activity, but at a more sophisticated level aims at making illegal conduct physically impossible (cf. Rich 2013, pp. 802–804; see also, with a different terminology, Cheng 2006, p. 664: ‘Type II structural controls’, Mulligan 2008, p. 3: ‘perfect prevention’, and Rosenthal 2011, p. 579: ‘digital preemption’).
- 5.
Cf., for a definition, in this Book Rademacher and Wischmeyer, and the thorough explanation offered by Rich (2016), pp. 880–886, esp. 883.
- 6.
- 7.
- 8.
Ferguson (2017), pp. 88–90.
- 9.
- 10.
- 11.
- 12.
Bundespolizeipräsidium (2018), p. 35.
- 13.
The EU-funded iBorderCtrl project (www.iborderctrl.eu) is testing software intended to detect whether persons are lying at border controls; see, for a first assessment, Algorithm Watch (2019), pp. 36–37.
- 14.
E.g. Bouachir et al. (2018): video surveillance for real-time detection of suicide attempts.
- 15.
- 16.
Ferguson (2017), p. 86.
- 17.
In the final version of what is now Directive (EU) 2019/790 the explicit reference to content recognition technologies has been removed (cf. Article 17 of the Directive). Whether that effectively avoids a de facto obligation on information service providers to apply such filter technologies remains to be seen. For an early discussion of technological means of ‘automatic enforcement’ see Reidenberg (1998), pp. 559–560. See also Krönke, para 44, who expects a ‘de facto obligation’ of service providers to apply recognition software.
- 18.
- 19.
- 20.
Ferguson (2017), p. 118.
- 21.
- 22.
E.g. the private Polaris Project, which analyzes telephone calls for help in cases of human trafficking to reveal places, routes, and even cash flows worthy of police attention.
- 23.
- 24.
To some extent that includes misconduct within police forces as well, cf. Ferguson (2017), pp. 143–162.
- 25.
- 26.
- 27.
For the constitutional constraints the German police have to respect see para 18. A second reason for the German reluctance might be that big data driven proactive ‘rasterizing’ proved spectacularly inefficient when German police applied it in the aftermath of 9/11, cf. German Constitutional Court 1 BvR 518/02 ‘Rasterfahndung’ (4 April 2006), BVerfGE 115, pp. 327–331.
- 28.
- 29.
- 30.
- 31.
Brühl (2018). The legal basis for this data mining is presumably Section 25a(1), (2) Hessisches Gesetz über die öffentliche Sicherheit und Ordnung (HSOG), in force since 4 July 2018.
- 32.
Section 10 Geldwäschegesetz; cf. for a case study on anti-money laundering technology Demetis (2018); see also Schemmel, para 12.
- 33.
Directive (EU) 2016/681 of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime. For an evaluation against the backdrop of EU and German constitutional law cf. Rademacher (2017), pp. 410–415.
- 34.
For a status quo report on German PNR analyses cf. Bundestag (2018).
- 35.
See, for a rare example, Ferguson (2017), p. 88.
- 36.
Schlossberg (2015).
- 37.
For an early instance of aural surveillance cf. Zetter (2012).
- 38.
- 39.
- 40.
Joh (2019), p. 179.
- 41.
Henderson (2016), p. 935: ‘[W]hen it comes to criminal investigation, time travel seems increasingly possible.’
- 42.
If police tried ‘to see into the past’ (Ferguson 2017, p. 98) before the rise of big data policing, they usually had to rely on human eye witnesses—who are notoriously unreliable.
- 43.
See Barret (2016), reporting on technology that could connect up to 30 million CCTV cameras in the US.
- 44.
Cf. German Constitutional Court 1 BvR 370/07 ‘Onlinedurchsuchung’ (27 February 2008), BVerfGE 120, pp. 344–346, concerning intelligence agents performing website searches.
- 45.
Article 13 Grundgesetz (German Basic Law).
- 46.
- 47.
Constitutional Court ‘Onlinedurchsuchung’ (see note 44), p. 345.
- 48.
E.g. German Constitutional Court 1 BvR 2074/05 ‘KFZ-Kennzeichenkontrollen’ (11 March 2008), BVerfGE 120, p. 431; cf. Rademacher (2017), pp. 403–405, 406–407.
- 49.
See, e.g., German Constitutional Court ‘KFZ-Kennzeichenkontrollen’ (see note 48), p. 430 with p. 399: Although an ALPR scan was said not to count as an interference with A’s right to informational self-determination if A’s license plate number did not produce a match in the database and the image was therefore deleted immediately, A’s potential privacy fear (a ‘chilling effect’ due to a feeling of constant observation) still should render the interference with B’s right to informational self-determination—whose license plate number had produced a ‘hit’—unconstitutional. Just after the Federal Administrative Court confirmed that irritating jurisprudence, the Constitutional Court reversed it, now holding that any form of video surveillance amounts to an interference with the right to informational self-determination, see Constitutional Court 1 BvR 142/15 ‘KFZ-Kennzeichenkontrollen 2’ (18 December 2018), para 45. Cf. Marsch (2012), pp. 605–616.
- 50.
German Constitutional Court 1 BvR 209/83 ‘Volkszählung’ (15 December 1983), BVerfGE 65, p. 42: ‘[…] right of individuals to decide in principle themselves when and within what limits personal matters are disclosed’. Author’s translation.
- 51.
Cf. Constitutional Court ‘Volkszählung’ (see note 50), p. 45: ‘in that respect, “unimportant” data no longer exist in the context of automated data processing’. Author’s translation.
- 52.
The Sphärentheorie offered, in principle, heightened protection from surveillance for information that could be categorized as intimate or private, and did not protect, again in principle, information that was considered social or public.
- 53.
Cf. Ferguson (2014), pp. 1313–1316.
- 54.
- 55.
Cf. Marsch, paras 15–16. Ditto Poscher (2017), p. 132.
- 56.
Constitutional Court ‘KFZ-Kennzeichenkontrollen’ (see note 48), and ‘KFZ-Kennzeichenkontrollen 2’ (see note 49).
- 57.
Constitutional Court ‘Rasterfahndung’ (see note 27).
- 58.
Cf. Staben (2016), pp. 160, 162 for a broad overview, listing most, if not all, the criteria the Constitutional Court so far has weighed against modern surveillance technologies.
- 59.
See, esp., the decision on dragnet investigations, Constitutional Court ‘Rasterfahndung’ (see note 27), pp. 354, 356–357.
- 60.
Constitutional Court ‘KFZ-Kennzeichenkontrollen’ (see note 48), pp. 401, 407.
- 61.
Constitutional Court ‘Volkszählung’ (see note 50), p. 43; Constitutional Court ‘KFZ-Kennzeichenkontrollen’ (see note 48), pp. 402, 430; Constitutional Court ‘KFZ-Kennzeichenkontrollen 2’ (see note 49), paras 51, 98.
- 62.
The Constitutional Court’s recourse to this line of argument (‘risks of abuse’, see e.g. Constitutional Court ‘KFZ-Kennzeichenkontrollen’ (see note 48), p. 402) has been harshly criticized by German legal scholarship for being empirically unfounded, cf. Trute (2009), pp. 100–101.
- 63.
See also Hermstrüwer, paras 10, 19.
- 64.
Whether an imminent risk of harm to an individual’s health, life, or liberty is required, or whether police experience regarding a specific place (crime hot spots) or activity suffices (see for examples of the latter Constitutional Court ‘KFZ-Kennzeichenkontrollen 2’ (see note 49), para 94, and Rademacher 2017, pp. 401–410), depends on the ‘invasiveness’ of the respective means of surveillance. Obviously, that again requires a proportionality test, the outcome of which is hard to predict. In specific areas, such as tax law (see Braun Binder) and financial regulation (see Schemmel, paras 10–12 and para 32 for US law), state databases are ‘rasterized’ routinely, thus implementing a limited form of generalized suspicion (Generalverdacht) or so-called ‘anlasslose Kontrolle’ (cf. Constitutional Court ‘KFZ-Kennzeichenkontrollen 2’ (see note 49), para 94).
- 65.
Constitutional Court ‘KFZ-Kennzeichenkontrollen’ (see note 48), p. 378, recital 4.
- 66.
Rademacher (2017), pp. 401–403.
- 67.
- 68.
Most recently confirmed in CJEU Case C-207/16 ‘Ministerio Fiscal’ (2 October 2018) para 51. The respective jurisprudence is mainly based on Article 8 CFR [Right to data protection]. For a detailed analysis of that provision and the shift in the CJEU’s jurisprudence towards an understanding similar to the German right to informational self-determination cf. Marsch, paras 29–32.
- 69.
See, above all, Regulation (EU) 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation); Directive (EU) 2016/680 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purpose of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data; and Directive 2002/58/EC of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (ePrivacy Directive). Cf. Dimitrova (2018).
- 70.
Cf. CJEU Case C-203/15 ‘Tele2 Sverige’ (21 December 2016) para 111: ‘[N]ational legislation [requiring private companies to store communications data] must be based on objective evidence which makes it possible to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences […].’ Confirmed by CJEU Opinion No. 1/15 ‘Passenger Name Record’ (26 July 2017), para 191. But cf. the referrals submitted under Article 267 TFEU by the Investigatory Powers Tribunal London (C-623/17), the French Conseil d’État (C-512/18), and the Belgian Constitutional Court (C-520/18), rather critically scrutinizing the CJEU’s data protection-friendly approach. See also ECtHR Case-No. 35252/08 ‘Big Brother Watch v. United Kingdom’ (13 September 2018), para 112.
- 71.
Take, for instance, ‘intelligent’ video surveillance that is supposed to alert to pickpocketing at train stations. Certainly, it will need to indiscriminately record at least some minutes of what happens on the platform to distinguish suspicious behavior from people just strolling around waiting for their trains.
- 72.
Additionally, the data will be needed to train new algorithms and evaluate algorithms which are already applied, cf. CJEU ‘Passenger Name Record’ (see note 70), para 198.
- 73.
‘[N]ot […] limited to what is strictly necessary’, CJEU ‘Passenger Name Record’ (see note 70), para 206.
- 74.
CJEU ‘Passenger Name Record’ (see note 70), paras 204–209.
- 75.
Meaning that the ‘models and criteria’ applied by Canada must be ‘specific and reliable, making it possible […] to arrive at results targeting individuals who might be under a “reasonable suspicion” of participation in terrorist offences or serious transnational crime’, cf. CJEU ‘Passenger Name Record’ (see note 70), para 172.
- 76.
See CJEU ‘Passenger Name Record’ (see note 70): ‘The transfer of that data to Canada is to take place regardless of whether there is any objective evidence permitting the inference that the passengers are liable to present a risk to public security in Canada.’ [para 186] ‘[T]hat processing is intended to identify the risk to public security that persons, who are not, at that stage, known to the competent services, may potentially present, and who may, on account of that risk, be subject to further examination. In that respect, the automated processing of that data, before the arrival of the passengers in Canada, facilitates and expedites security checks, in particular at borders.’ [para 187] According to para 191 et seq., that suffices to ‘establish a connection between the personal data to be retained and the objective pursued’. Cf. Rademacher (2017), pp. 412–413.
- 77.
That is especially true as the CJEU’s reasoning in its ‘Passenger Name Record’ decision (see note 70) is premised on public international law concerning air traffic and respective border controls, cf. para 188. However, public international law is only one line of argument in favor of the agreement the Court has accepted, cf. para 187.
- 78.
Due to the interference with the fundamental right to informational self-determination, in any case a specific legal basis is required, see Article 52(1) CFR.
- 79.
See also CJEU ‘Ministerio Fiscal’ (see note 68), paras 54, 56–57.
- 80.
- 81.
Cf. Ferguson (2014), p. 1333: ‘In a government truly of limited powers, police would not have the surveillance powers to invade privacy or security unless there was a law specifically allowing it. Such is not the current reality under the Fourth Amendment.’ But cf. Ferguson (2017), p. 116, suggesting that social media monitoring and respective storage of data could interfere with the First Amendment as well.
- 82.
Katz v. United States, 389 U.S. 347 (1967), p. 361 (Harlan, J., concurring).
- 83.
‘Public’ is defined quite broadly, encompassing any communication that is directed to third parties, cf. Smith v. Maryland, 442 U.S. 735 (1979); for an in-depth analysis of the Court’s case law see Wittmann (2014), pp. 146–328; arguably, Carpenter v. United States has destabilized the third party doctrine, too (cf. note 99).
- 84.
Joh (2016), p. 18, notes that ‘[s]urprisingly, there is little discussion of these decisions that the police make about individuals before any search, detention, or arrest takes place. Rather, current unresolved issues of police technology have focused on whether a particular use is a Fourth Amendment search requiring a warrant and probable cause.’
- 85.
- 86.
Cf. Kyllo v. United States, 533 U.S. 27 (2001), pp. 34–35.
- 87.
That does not mean that legislators could not step in and implement more restrictive requirements for surveillance that fall short of constituting an interference under the Fourth Amendment. According to Ferguson (2017), p. 101, however, such legislative restrictions or clarifications are, to date, missing. The new California Consumer Privacy Act (CCPA), which has received great attention in the EU, too, limits its scope of application to data processing by private entities (cf. Cal. Civ. Code § 1798.140(c)).
- 88.
Cf. Ferguson (2014), p. 1305, identifying seven ‘values’ discussed as underlying the Fourth Amendment case law.
- 89.
Ferguson (2014), p. 1307.
- 90.
United States v. Jones, 565 U.S. 400 (2012), p. 404.
- 91.
Cf. Ferguson (2014), p. 1308.
- 92.
United States v. Jones (see note 90), p. 418 (Alito, J., concurring). But see also Justice Sotomayor’s opinion, ibid, at p. 414, consenting to the majority that ‘the Government’s physical intrusion on Jones’ Jeep supplies a […] basis for decision’ in any case.
- 93.
A similarly historical approach applies to surveillance technology that is able to ‘explore details of the home that would previously have been unknowable without physical intrusion’ (Kyllo v. United States (see note 86), p. 40)—such surveillance does constitute a search within the meaning of the Fourth Amendment; see also Florida v. Jardines, 133 S.Ct. 1409 (2013), p. 1419 (Kagan, J., concurring).
- 94.
United States v. Jones (see note 90), p. 430 (Alito, J., concurring).
- 95.
Ibid.
- 96.
Ibid, pp. 429–430 (Alito, J., concurring).
- 97.
Ibid, pp. 416–417 (Sotomayor, J., concurring). See also paras 18 and 42. Cf. Staben (2016), pp. 67–68: the argument of chilling effects does appear in the Supreme Court’s jurisprudence, but usually regarding the First Amendment.
- 98.
Thus, at least under traditional interpretation of the Fourth Amendment, constituting public disclosure; see also notes 83 and 99.
- 99.
The crime in question in Carpenter was robbery. Interestingly, Justice Alito dissented, arguing that the majority’s decision would be ‘revolutionary’ inasmuch as it ignored established case law according to which the Fourth Amendment would not apply to ‘an order merely requiring a [third] party to look through its own records and produce specific documents’ (Carpenter v. United States, 585 U.S. ____ (2018), p. 12 (Roberts, C. J., for the majority)).
- 100.
Ibid, p. 18 (Roberts, C. J., for the majority).
- 101.
Obviously, the degree of precision and accuracy that is required will vary depending on the intrusiveness of the surveillance measure itself, the availability of less intrusive means, and the severity of the crime or threat in question.
- 102.
It is a mistake to conclude that the application of machine learned patterns could not result in individualized predictions (cf. Ferguson 2017, p. 127: ‘generalized suspicion’). As soon as data concerning a specific individual is used as input data for the prediction, the result is individualized by definition. The question that really matters is whether it is individualized enough, i.e. whether the pattern in question relies on more than just one or two predictors such as place of birth, education, etc. (cf. Hermstrüwer, para 10, and also paras 30–34 on the converse risk of excessive individualization (‘overfitting’)). One should bear in mind that all police suspicion, be it detected by human or technological means, starts with and relies on some form of pattern recognition, i.e. the application of previously learned information to new situations. For details cf. Rademacher (2017), pp. 373–377, 381–383, and Harcourt and Meares (2011), p. 813: ‘In reality, most individuals arouse suspicion because of the group-based-type behavior that they exhibit or the fact that they belong to readily identifiable groups—sex and age are two examples—rather than because of unique individual traits. Typically, individuals come to police attention because they are young, or are male, or are running away from the police, or have a bulge in their pocket’.
- 103.
See, for further details on the methods of evaluating predictive algorithms and on the difference between precision and accuracy, Degeling and Berendt (2017), esp. 3.2.
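To make the difference tangible, consider a minimal numerical sketch (all figures are invented for illustration; they are not taken from Degeling and Berendt 2017). For rare events, a predictive system can be highly accurate and at the same time produce mostly false alerts:

```python
# Minimal sketch of the difference between accuracy and precision,
# using hypothetical numbers for a rare-event prediction task.
# (Illustrative only; not drawn from any study cited in this chapter.)

true_positives = 50        # flagged and actually suspicious
false_positives = 950      # flagged but innocent
true_negatives = 98_950    # not flagged, not suspicious
false_negatives = 50       # suspicious but missed

total = true_positives + false_positives + true_negatives + false_negatives

# Accuracy: share of all decisions that were correct.
accuracy = (true_positives + true_negatives) / total

# Precision: share of alerts ('hits') that were actually correct.
precision = true_positives / (true_positives + false_positives)

print(f"accuracy:  {accuracy:.3f}")   # 0.990 -- looks excellent
print(f"precision: {precision:.3f}")  # 0.050 -- 19 of 20 alerts are false
```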
- 104.
All forms of suspicion are probabilistic in nature, be it human or technological. By definition, reliance on ‘suspicion’ accepts that any actions based thereon are made in a state of possible incompleteness of information (cf. Rich 2016, p. 898; Rademacher 2017, p. 383) and should—consequently—be open to ex post rectification.
- 105.
Interestingly, American scholars suggest comparing smart law enforcement to drug dogs rather than to humans, e.g. Rich (2016), pp. 913–921. On the law of drug dogs see esp. Rodriguez v. United States, 575 U.S. __ (2015), pp. 5–6 (Ginsburg, J., for the majority), finding that police may perform investigations (like dog sniffs) unrelated to a roadside detention (which itself requires ‘probable cause’ under the Fourth Amendment), but only if that investigation does not prolong the stop. In Florida v. Jardines (see note 93), the Supreme Court held, in a 5 to 4 decision, that a dog sniff does amount to a search within the meaning of the Fourth Amendment when it is performed on property surrounding the home of a person (so-called curtilage, in that specific case: a front porch), if that property had been entered with the intention of performing that investigation. On the other hand, Justice Scalia reaffirmed that ‘law enforcement officers need not “shield their eyes” when passing by the home “on public thoroughfares”’ (at p. 1423).
- 106.
- 107.
Cf. Rich (2016), pp. 913–921, premised on the comparability of smart law enforcement (‘automated suspicion algorithms’) with drug dogs.
- 108.
Cf. Hermstrüwer, paras 52–55, who correctly notes that acceptability of false positives or false negatives depends on whether AI is applied for information gathering, or for preventive or punitive purposes.
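A minimal sketch, again with invented scores and labels, of the trade-off mentioned here: moving the alert threshold converts false negatives into false positives and vice versa, which is why the acceptable threshold depends on the purpose for which the AI is applied:

```python
# Sketch of the false-positive/false-negative trade-off: the same risk
# scores, read against a laxer or stricter alert threshold.
# All values are invented for illustration.

scores = [0.15, 0.30, 0.45, 0.55, 0.70, 0.90]
actually_suspicious = [False, False, True, False, True, True]

def errors(threshold):
    fp = sum(s >= threshold and not y for s, y in zip(scores, actually_suspicious))
    fn = sum(s < threshold and y for s, y in zip(scores, actually_suspicious))
    return fp, fn

for threshold in (0.4, 0.8):
    fp, fn = errors(threshold)
    print(f"threshold {threshold}: {fp} false positives, {fn} false negatives")
# threshold 0.4: 1 false positive, 0 false negatives  (more alerts: information gathering)
# threshold 0.8: 0 false positives, 2 false negatives (fewer alerts: coercive measures)
```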
- 109.
Cf. Kroll et al. (2017), pp. 695–705, for detailed ‘recommendations’ to lawmakers, policymakers, and computer scientists to ‘foster’ interdisciplinary collaboration.
- 110.
Ibid, pp. 657–658.
- 111.
See, e.g., Bieker et al. (2018), p. 610, referring to Article 13 of the EU’s General Data Protection Regulation (see para 69).
- 112.
Ditto Hermstrüwer, paras 3, 45–47, and Kroll et al. (2017), p. 657: a ‘naïve solution to the problem’; Ferguson (2017), pp. 137, 138: ‘The issue […] is not the transparency of the algorithm […] but the transparency of how the program is explained to the public and, of course, what is done with the information’. See also Wischmeyer, passim, and esp. paras 24 et seq. and 30.
- 113.
Cf. Joh (2014), pp. 50–55.
- 114.
Cf. Rich (2016), p. 919.
- 115.
- 116.
On the need to (re)establish human agency cf. Wischmeyer, paras 24 et seq.
- 117.
See also Kroll et al. (2017), pp. 657–660, Ferguson (2017), pp. 137–138, and, for an up-to-date overview of the accountability discussion, Andrews (2019). This is not to say that specific public officials should not have the right to scrutinize source codes, training and test data etc. if circumstances, especially procedures of judicial review, require that kind of additional transparency. See for details Wischmeyer, esp. para 47.
- 118.
For techniques to preserve privacy in the course of human re-evaluation of video data cf. Birnstill et al. (2015).
- 119.
Ditto Rich (2016), p. 920, who, however, appears to be skeptical as to the practicality of such systems of disclosure: ‘theoretically solvable’. See also Wischmeyer, esp. para 27. The respective techniques are called explainable AI (short form: XAI), cf. Waltl and Vogl (2018) and Samek et al. (2017).
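What such an ‘explanation’ can amount to may be illustrated by a deliberately simple sketch: for a linear risk score, the contribution of each input to an individual alert can be listed. The weights and inputs below are invented; actual XAI techniques (cf. Samek et al. 2017) address far more complex models:

```python
# A deliberately simple form of 'explanation': for a linear risk score,
# each feature's contribution to an individual alert is made visible.
# Feature names, weights, and inputs are invented for illustration.

weights = {"prior_incidents": 0.6, "time_of_day": 0.1, "dwell_time": 0.3}
observation = {"prior_incidents": 2.0, "time_of_day": 1.0, "dwell_time": 3.0}

# Per-feature contribution to the overall score.
contributions = {name: weights[name] * observation[name] for name in weights}
score = sum(contributions.values())

print(f"risk score: {score:.2f}")
for name, value in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"  {name}: {value:+.2f}")  # which inputs drove the alert, and by how much
```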
- 120.
- 121.
It is important to note that in this case it is irrelevant that the software itself is limited to detecting correlations and is not able to ‘understand’ causal links. To Ferguson (2017), p. 119, the difference between correlation and causation is one of the ‘fundamental questions’ behind big data policing. I disagree: The lack of understanding, which is inherent in machine learning, would only constitute a case against the use of smart law enforcement technologies if we were to require the software to be held accountable, i.e. require it to explain itself and be subject, eventually, to disciplinary or electoral sanctions. Instead, what we need is to establish a regulatory framework that preserves human accountability. Therefore, the software itself does not need to ‘understand’ the correlations it searches for. See also, from a private law perspective, Eidenmüller (2017), p. 13: ‘Treating robots like humans would dehumanize humans, and therefore we should refrain from adopting this policy.’
- 122.
Cf. Rich (2016), pp. 911–924 for a detailed analysis of the law on drug dogs (in the US) and its suitability for being applied, by way of analogy, to ‘automated suspicion algorithms’; see also note 105.
- 123.
Ferguson (2017), p. 133.
- 124.
Cf. Tischbirek, paras 5 et seq.
- 125.
See Buchholtz, para 30; Hacker (2018), pp. 1143–1144; but see also Brantingham et al. (2018), p. 1: ‘We find that there were no significant differences […] by racial-ethnic group between the control and treatment conditions.’ For a detailed account of comparable software being tested in the criminal justice system of the UK, cf. Scantamburlo et al. (2019), esp. pp. 58 et seq.
- 126.
See Hermstrüwer, paras 3–4; see also Bennet Capers (2017), pp. 1242, 1271: ‘I am a black man. […] I am interested in technology that will lay bare not only the truth of how we police now but also how those of us who are black or brown live now.’
- 127.
The example is taken from Bennet Capers (2017), p. 1242, who uses this equation to describe his perception of the status quo of human law enforcement in the US.
- 128.
Ditto Hacker (2018), pp. 1146–1150.
- 129.
Cf. Kroll et al. (2017), p. 685; see Tischbirek, para 13. For an up-to-date catalogue of sensitive predictors under EU law see Article 21 CFR.
- 130.
- 131.
- 132.
Kroll et al. (2017), p. 674 (procedural fairness), pp. 690–692 (nondiscrimination). See also Hermstrüwer, paras 41–43.
- 133.
The approach has therefore been labelled ‘fairness through awareness’, cf. Dwork et al. (2011). See also Tischbirek, paras 31 et seq.: ‘towards a paradigm of knowledge creation’.
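The core idea of Dwork et al. can be stated as a testable condition: similar individuals should receive similar scores, i.e. the score gap between two individuals should not exceed their distance under a task-specific similarity metric. A minimal sketch, with invented individuals, scores, and metric:

```python
# Sketch of 'fairness through awareness' (Dwork et al. 2011) as a
# Lipschitz-style check: |score(x) - score(y)| <= d(x, y) for a
# task-specific similarity metric d. All values are invented.

from itertools import combinations

individuals = {"A": [0.2, 0.9], "B": [0.25, 0.85], "C": [0.9, 0.1]}
scores      = {"A": 0.30,        "B": 0.70,         "C": 0.20}

def d(x, y):  # similarity metric over (non-sensitive) task features
    return sum(abs(a - b) for a, b in zip(x, y))

for p, q in combinations(individuals, 2):
    gap = abs(scores[p] - scores[q])
    limit = d(individuals[p], individuals[q])
    flag = "OK" if gap <= limit else "VIOLATION"
    print(f"{p} vs {q}: score gap {gap:.2f} vs distance {limit:.2f} -> {flag}")
# A and B are near-identical on the metric but scored very differently:
# the condition flags exactly that pair.
```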
- 134.
Ferguson (2017), p. 137.
- 135.
See, e.g., Bennet Capers (2017), pp. 1268–1283, 1285, with a strong plea for the replacement of (biased) human policing by (hopefully) less biased policing by technology.
- 136.
- 137.
See, e.g., Cheng (2006), pp. 659 et seq.
- 138.
- 139.
See, for a plea for preventive regulation of financial markets, Schemmel, para 46.
- 140.
Cf. Bennet Capers (2017), pp. 1282–1283, 1285–1291.
- 141.
Orwell (1949).
- 142.
Cf. Oganesian and Heermann (2018).
- 143.
For a more sophisticated attempt to explain ‘The Dangers of Surveillance’ cf. Richards (2013), esp. pp. 1950–1958, 1962–1964. See Timan et al. (2018), pp. 744–748, for an interdisciplinary attempt to reconcile the insights of ‘surveillance studies’ with legal reasoning, quite rightly asking for ‘more legal scholarship’ that ‘views surveillance as generally good and bad at the same time, or as good or bad depending on the situation’.
- 144.
- 145.
Joh (2019), p. 178: ‘[A]s cities become “smarter”, they increasingly embed policing itself into the urban infrastructure.’
- 146.
See also Timan et al. (2018), p. 738: ‘increasing blend of governmental and corporate surveillance infrastructures’ and ‘an increase of citizen-instigated forms of surveillance can be witnessed’.
- 147.
Cf. Rich (2013), p. 810. This sentiment might actually deserve some form of legal, perhaps even constitutional recognition. The reason is that we live in what I would call ‘imperfect democracies’, that is, in societies that try very hard to balance out majority rule on the one hand and the individual’s rights to self-determination and political participation on the other hand—by providing a plethora of fundamental and political rights in order to protect minorities—but which fail, and will continue to fail in the future, to fully provide such balance for many practical reasons. So as long as even democratic laws cannot claim to be perfectly legitimate with regard to each and every paragraph, there is good reason to argue that such laws on their part may not claim perfect compliance.
- 148.
Petroski (2018).
- 149.
Cheng (2006), pp. 682–688, with a critical account of legislation respecting that wish.
- 150.
See also Hartzog et al. (2015), esp. pp. 1778–1792, advocating for automated law enforcement to be consciously ‘inefficient’ to prevent ‘perfect enforcement’.
- 151.
Mulligan (2008), p. 3.
- 152.
See also Timan et al. (2018), p. 747, citing J Cohen: ‘importance of room for play’.
- 153.
United States v. Jones (see note 90), p. 429; see also note 55 for the German discussion.
- 154.
References
Algorithm Watch, Bertelsmann Stiftung (2019) Automating Society. Taking stock of automated decision-making in the EU. www.bertelsmann-stiftung.de/de/publikationen/publikation/did/automating-society. Accessed 21 Feb 2019
Andrews L (2019) Algorithms, regulation, and governance readiness. In: Yeung K, Lodge M (eds) Algorithmic regulation. Oxford University Press, Oxford, pp 203–223
Associated Press (2018) Israel claims 200 attacks predicted, prevented with data tech. CBS News. www.cbsnews.com/news/israel-data-algorithms-predict-terrorism-palestinians-privacy-civil-liberties. Accessed 29 Nov 2018
Bachmeier L (2018) Countering terrorism: suspects without suspicion and (pre-)suspects under surveillance. In: Sieber U, Mitsilegas V, Mylonopoulos C, Billis E, Knust N (eds) Alternative systems of crime control. Duncker & Humblot, Berlin, pp 171–191
Barret B (2016) New surveillance system may let cops use all of the cameras. Wired. www.wired.com/2016/05/new-surveillance-system-let-cops-use-cameras. Accessed 16 Nov 2018
Becker M (2019) Von der Freiheit, rechtswidrig handeln zu können. Zeitschrift für Urheber- und Medienrecht 64:636–648
Bieker F, Bremert B, Hansen M (2018) Verantwortlichkeit und Einsatz von Algorithmen bei öffentlichen Stellen. Datenschutz und Datensicherheit:608–612
Bier W, Spiecker gen Döhmann I (2012) Intelligente Videoüberwachungstechnik: Schreckensszenario oder Gewinn für Datenschutz. Computer und Recht:610–618
Big Brother Watch (2018) Face off. The lawless growth of facial recognition in UK policing. https://www.bigbrotherwatch.org.uk/wp-content/uploads/2018/05/Face-Off-final-digital-1.pdf. Accessed 29 Nov 2018
Birnstill P, Ren D, Beyerer J (2015) A user study on anonymization techniques for smart video surveillance. IEEE. https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7301805. Accessed 29 Nov 2018
Böckenförde T (2008) Auf dem Weg zur elektronischen Privatsphäre. JuristenZeitung 63:925–939
Bouachir W, Gouiaa R, Li B, Noumeir R (2018) Intelligent video surveillance for real-time detection of suicide attempts. Pattern Recogn Lett 110:1–7
Brantingham PJ, Valasik M, Mohler GO (2018) Does predictive policing lead to biased arrests? Results from a randomized controlled trial. Stat Public Policy 5:1–6
Brühl J (2018) Wo die Polizei alles sieht. Süddeutsche Zeitung. www.sueddeutsche.de/digital/palantir-in-deutschland-wo-die-polizei-alles-sieht-1.4173809. Accessed 26 Nov 2018
Bundesministerium des Innern (2016) ‘Polizei 2020’. www.bmi.bund.de/DE/themen/sicherheit/nationale-und-internationale-zusammenarbeit/polizei-2020/polizei-2020-node.html. Accessed 7 Dec 2018
Bundesministerium des Innern (2018) Pressemitteilung. Projekt zur Gesichtserkennung erfolgreich. www.bmi.bund.de/SharedDocs/pressemitteilungen/DE/2018/10/gesichtserkennung-suedkreuz.html. Accessed 29 Nov 2018
Bundespolizeipräsidium (2018) Teilprojekt 1 ‘Biometrische Gesichtserkennung’. Abschlussbericht. www.bundespolizei.de/Web/DE/04Aktuelles/01Meldungen/2018/10/181011_abschlussbericht_gesichtserkennung_down.pdf;jsessionid=2A4205E1606AC617C8006E65DEDD7D22.2_cid324?__blob=publicationFile&v=1. Accessed 29 Nov 2018
Bundestag (2018) Antwort der Bundesregierung auf die Kleine Anfrage ‘Umsetzung der EU-Richtlinie zur Vorratsdatenspeicherung von Fluggastdaten’. BT-Drucksache 19/4755
Burkert H (2012) Balancing informational power by information power, or rereading Montesquieu in the internet age. In: Brousseau E, Marzouki M, Méadel C (eds) Governance, regulations and powers on the internet. Cambridge University Press, Cambridge, pp 93–111
Candamo J, Shreve M, Goldgof D, Sapper D, Kasturi R (2010) Understanding transit scenes: a survey on human behavior recognition algorithms. IEEE Trans Intell Transp Syst 11:206–224
Capers IB (2017) Race, policing, and technology. N C Law Rev 95:1241–1292
Caplan J, Kennedy L (2016) Risk terrain modeling. University of California Press, Oakland
Chaos Computer Club (2018) Biometrische Videoüberwachung: Der Südkreuz-Versuch war kein Erfolg. www.ccc.de/de/updates/2018/debakel-am-suedkreuz. Accessed 26 Nov 2018
Cheng E (2006) Structural laws and the puzzle of regulating behavior. Northwest Univ School Law 100:655–718
Davenport T (2016) How Big Data is helping the NYPD solve crime faster. Fortune. fortune.com/2016/07/17/big-data-nypd-situational-awareness. Accessed 29 Nov 2018
Degeling M, Berendt B (2017) What is wrong about Robocops as consultants? A technology-centric critique of predictive policing. AI & Soc 33:347–356
Demetis D (2018) Fighting money laundering with technology: a case study of Bank X in the UK. Decis Support Syst 105:96–107
Dimitrova D (2018) Data protection within police and judicial cooperation. In: Hofmann HCH, Rowe GC, Türk AH (eds) Specialized administrative law of the European Union. Oxford University Press, Oxford, pp 204–233
Dwork C, Hardt M, Pitassi T, Reingold O, Zemel R (2011) Fairness through awareness. arXiv.org/pdf/1104.3913.pdf. Accessed 29 Nov 2017
Eidenmüller H (2017) The rise of robots and the law of humans. Oxford Legal Studies Paper. Ssrn.com/abstract=2941001. Accessed 29 Nov 2018
Eubanks V (2018) A child abuse prediction model fails poor families. Wired. www.wired.com/story/excerpt-from-automating-inequality. Accessed 29 Nov 2018
Ferguson AG (2014) Fourth amendment security in public. William Mary Law Rev 55:1283–1364
Ferguson AG (2015) Big data and predictive reasonable suspicion. Univ Pa Law Rev 163:327–410
Ferguson AG (2017) The rise of big data policing. New York University Press, New York
Hacker P (2018) Teaching fairness to artificial intelligence: existing and novel strategies against algorithmic discrimination under EU law. Common Mark Law Rev 55:1143–1186
Harcourt B, Meares T (2011) Randomization and the fourth amendment. U Chi L Rev 78:809–877
Hartzog W, Conti G, Nelson J, Shay LA (2015) Inefficiently automated law enforcement. Mich State Law Rev:1763–1796
Heinemann M (2015) Grundrechtlicher Schutz informationstechnischer Systeme. Schriften zum Öffentlichen Recht, vol 1304. Duncker & Humblot, Berlin
Henderson SE (2016) Fourth amendment time machines (and what they might say about police body cameras). J Constit Law 18:933–973
Hildebrandt M (2016) Law as information in the era of data-driven agency. Mod Law Rev 79:1–30
Hoffmann-Riem W (2008) Der grundrechtliche Schutz der Vertraulichkeit und Integrität eigengenutzter informationstechnischer Systeme. JuristenZeitung 63:1009–1022
Hoffmann-Riem W (2017) Verhaltenssteuerung durch Algorithmen – Eine Herausforderung für das Recht. Archiv des öffentlichen Rechts 142:1–42
Joh E (2014) Policing by numbers: big data and the fourth amendment. Wash Law Rev 89:35–68
Joh E (2016) The new surveillance discretion: automated suspicion, big data, and policing. Harv Law Policy Rev 10:15–42
Joh E (2019) Policing the smart city. International Journal of Law in Context 15:177–182
Kroll JA, Huey J, Barocas S, Felten EW, Reidenberg JR, Robinson DG, Yu H (2017) Accountable algorithms. Univ Pa Law Rev 165:633–705
Lismont J, Cardinaels E, Bruynseels L, De Goote S, Baesens B, Lemahieu W, Vanthienen J (2018) Predicting tax avoidance by means of social network analytics. Decis Support Syst 108:13–24
Marks A, Bowling B, Keenan C (2017) Automatic justice? In: Brownsword R, Scotford E, Yeung K (eds) The Oxford handbook of law, regulation, and technology. Oxford University Press, Oxford, pp 705–730
Marsch N (2012) Die objektive Funktion der Verfassungsbeschwerde in der Rechtsprechung des Bundesverfassungsgerichts. Archiv des öffentlichen Rechts 137:592–624
Marsch N (2018) Das europäische Datenschutzgrundrecht. Mohr Siebeck, Tübingen
May J (2018) Drug enforcement agency turns to A.I. to help sniff out doping athletes. Digital trends. https://www.digitaltrends.com/outdoors/wada-artificial-intelligence-doping-athletes. Accessed 23 Oct 2019
McKay S (2015) Covert policing. Oxford University Press, Oxford
Mulligan CM (2008) Perfect enforcement of law: when to limit and when to use technology. Richmond J Law Technol 14:1–49
Murphy E (2007) The new forensics: criminal justice, false certainty, and the second generation of scientific evidence. Calif Law Rev 95:721–797
Oermann M, Staben J (2013) Mittelbare Grundrechtseingriffe durch Abschreckung? DER STAAT 52:630–661
Oganesian C, Heermann Th (2018) China: Der durchleuchtete Mensch – Das chinesische Social-Credit-System. ZD-Aktuell:06124
Oldiges M (1987) Einheit der Verwaltung als Rechtsproblem. Neue Zeitschrift für Verwaltungsrecht 1987:737–744
Orwell G (1949) Nineteen eighty-four. Secker & Warburg, London
Pelzer R (2018) Policing terrorism using data from social media. Eur J Secur Res 3:163–179
Petroski W (2018) Iowa Senate OKs ban on traffic enforcement cameras as foes predict more traffic deaths. Des Moines Register. https://eu.desmoinesregister.com/story/news/politics/2018/02/27/traffic-enforcementcameras-banned-under-bill-passed-iowa-senate/357336002. Accessed 23 Oct 2019
Poscher R (2017) The right to data protection. In: Miller R (ed) Privacy and power. Cambridge University Press, Cambridge, pp 129–141
Rademacher T (2017) Predictive Policing im deutschen Polizeirecht. Archiv des öffentlichen Rechts 142:366–416
Rademacher T (2019) Wenn neue Technologien altes Recht durchsetzen: Dürfen wir es unmöglich machen, rechtswidrig zu handeln? JuristenZeitung 74:702–710
Reidenberg J (1998) Lex Informatica: the formulation of information policy rules through technology. Tex Law Rev 76:553–593
Rich M (2013) Should we make crime impossible? Harv J Law Public Policy 36:795–848
Rich M (2016) Machine learning, automated suspicion algorithms, and the fourth amendment. Univ Pa Law Rev 164:871–929
Richards NM (2013) The dangers of surveillance. Harv Law Rev 126:1934–1965
Rosenthal D (2011) Assessing digital preemption (and the future of law enforcement?). New Crim Law Rev 14:576–610
RWI Essen (2018) “Erfolgreiche” Gesichtserkennung mit hunderttausend Fehlalarmen. http://www.rwi-essen.de/unstatistik/84. Accessed 29 Nov 2018
Samek W, Wiegand T, Müller K-R (2017) Explainable artificial intelligence: understanding, visualizing and interpreting deep learning models. arXiv:1708.08296v1. Accessed 25 Oct 2019
Saracco R (2017) An artificial intelligence ‘nose’ to sniff diseases: EIT Digital. https://www.eitdigital.eu/newsroom/blog/article/an-artificial-intelligence-nose-to-sniff-diseases. Accessed 29 Nov 2018
Saunders J, Hunt P, Hollywood JS (2016) Predictions put into practice: a quasi-experimental evaluation of Chicago’s predictive policing pilot. J Exp Criminol 12:347–371
Scantamburlo T, Charlesworth A, Cristianini N (2019) Machine decisions and human consequences. In: Yeung K, Lodge M (eds) Algorithmic regulation. Oxford University Press, Oxford, pp 49–81
Schlossberg T (2015) New York police begin using ShotSpotter system to detect gunshots. The New York Times. https://www.nytimes.com/2015/03/17/nyregion/shotspotter-detection-system-pinpoints-gunshot-locations-and-sends-data-to-the-police.html. Accessed 29 Nov 2018
Seidensticker K, Bode F, Stoffel F (2018) Predictive policing in Germany. http://nbn-resolving.de/urn:nbn:de:bsz:352-2-14sbvox1ik0z06. Accessed 29 Nov 2018
Singelnstein T (2018) Predictive Policing: Algorithmenbasierte Straftatenprognosen zur vorausschauenden Kriminalintervention. Neue Zeitschrift für Strafrecht:1–9
Solove DJ (2007) ‘I’ve got nothing to hide’ and other misunderstandings of privacy. San Diego Law Rev 44:745–772
Spice B (2015) Carnegie Mellon developing online tool to detect and identify sex traffickers. www.cmu.edu/news/stories/archives/2015/january/detecting-sex-traffickers.html. Accessed 24 Oct 2019
Staben J (2016) Der Abschreckungseffekt auf die Grundrechtsausübung. Internet und Gesellschaft, vol 6. Mohr Siebeck, Tübingen
Thomas S, Gupta S, Subramanian V (2017) Smart surveillance based on video summarization. IEEE. https://ieeexplore.ieee.org/document/8070003. Accessed 29 Nov 2018
Throckmorton CS, Mayew WJ, Venkatachalam M, Collins LM (2015) Financial fraud detection using vocal, linguistic and financial cues. Decis Support Syst 74:78–87
Timan T, Galic M, Koops B-J (2018) Surveillance theory and its implications for law. In: Brownsword R, Scotford E, Yeung K (eds) The Oxford handbook of law, regulation, and technology. Oxford University Press, Oxford, pp 731–753
Trute HH (2009) Grenzen des präventionsorientierten Polizeirechts in der Rechtsprechung des Bundesverfassungsgerichts. Die Verwaltung 42:85–104
Tyler TR (1990) Why people obey the law. Yale University Press, New Haven and London
Waltl B, Vogl R (2018) Increasing transparency in algorithmic decision-making with explainable AI. Datenschutz und Datensicherheit:613–617
Wendt K (2018) Zunehmender Einsatz intelligenter Videoüberwachung. ZD-Aktuell:06122
Wittmann P (2014) Der Schutz der Privatsphäre vor staatlichen Überwachungsmaßnahmen durch die US-amerikanische Bundesverfassung. Nomos, Baden-Baden
Wysk P (2018) Tausche Freiheit gegen Sicherheit? Die polizeiliche Videoüberwachung im Visier des Datenschutzrechts. Verwaltungsarchiv 109:141–162
Zetter K (2012) Public buses across country quietly adding microphones to record passenger conversations. Wired. www.wired.com/2012/12/public-bus-audio-surveillance. Accessed 26 Nov 2018
Cite this chapter
Rademacher, T. (2020). Artificial Intelligence and Law Enforcement. In: Wischmeyer, T., Rademacher, T. (eds) Regulating Artificial Intelligence. Springer, Cham. https://doi.org/10.1007/978-3-030-32361-5_10