Of New Technologies and Old Laws: Do We Need a Right to Violate the Law?

  • Original Article
  • Published in: European Journal for Security Research

Abstract

New technologies that can force compliance with the law are becoming increasingly widespread. At the same time, the established constitutional arguments against such structures are losing force under the new conditions of powerful artificial intelligence. This paper opens a new line of argument under the heading of a “right to violate the law,” which will make it possible to restrict the use of modern law-enforcing technologies in an effective and criteria-based manner in the future as well.

Notes

  1. See Marks et al. (2017), pp. 705, 707–710, 714–715; and esp. Ferguson (2017).

  2. See Rich (2013), esp. pp. 802–804 for a definition of terms, and ibid., note 2, for a reference to Professor Edward K. Cheng as originator of the term “impossibility structures”. For other attempts at a definition, see Cheng (2006), esp. p. 664: “type II structural controls”; Mulligan (2008), p. 3: “perfect prevention”; Rosenthal (2011), p. 579: “digital preemption”; and Hartzog et al. (2015), p. 1777: “total enforcement”.

  3. Cf., with further references to impact and effectiveness research, Hoffmann-Riem (2017), p. 34. Similarly idem (2016), p. 143: “willingness to comply by the [law] addressees” as a “precondition [of the] norm being effective” (my translation).

  4. Hoffmann-Riem (2016), pp. 142–145.

  5. See esp. Lübbe-Wolff (1996), p. 1 and passim. See also the contributions in Mayntz (1980, 1983). On the potential unconstitutionality under German law due to structural implementation deficits of statutes, see Funke (2007), pp. 168 et seq.

  6. Cf. Kreßner (2019), pp. 159 et seq., 284 et seq.

  7. For a comprehensive review, see Ferguson (2017), Rademacher (2020), paras 3–11, with further references.

  8. See also Ferguson (2017), pp. 86 et seq.

  9. Comprehensive information on the current state of affairs is provided by Rademacher (2020), paras 6 et seq., 10.

  10. Compare as well Rich (2013), p. 803.

  11. Compare the definition of Rich (2013), p. 803.

  12. See Article 13 (1) of the proposal for a directive “on copyright in the Digital Single Market” (COM (2016) 593 final): “Information society service providers […] shall […] take measures to ensure the functioning of agreements concluded with rightholders for the use of their works […]. Those measures, such as the use of effective content recognition technologies, shall be appropriate and proportionate.” Emphasis added. The respective Article 17 of the final version (Directive 2019/790) omits that second sentence; instead, Article 17 (8) now states that “[t]he application of this Article shall not lead to any general monitoring obligation.” It remains to be seen if that effectively prevents upload filters.
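
To make the mechanism tangible, a minimal and purely illustrative sketch of such a content recognition filter follows (my illustration, not drawn from the Directive or the cited literature; real upload filters rely on robust perceptual fingerprints, whereas the exact hashes assumed here merely keep the sketch self-contained and runnable):

```python
import hashlib

# Hypothetical fingerprint set of works notified by rightholders. Real
# content recognition uses robust perceptual/audio fingerprints; exact
# SHA-256 hashes stand in for them here for brevity.
PROTECTED_FINGERPRINTS = {
    hashlib.sha256(b"<bytes of a protected work>").hexdigest(),
}

def allow_upload(data: bytes) -> bool:
    """Return False if the upload matches a notified protected work."""
    return hashlib.sha256(data).hexdigest() not in PROTECTED_FINGERPRINTS

# The filter acts *before* publication (an impossibility structure in the
# sense of note 2), rather than sanctioning an infringement afterwards.
print(allow_upload(b"original user content"))        # True: goes online
print(allow_upload(b"<bytes of a protected work>"))  # False: blocked
```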

  13. For a detailed classification under private law doctrine, see Paulus and Matzke (2018), pp. 431 et seq.; see also Fries and Paal (2019).

  14. Eschenbruch and Gerstberger (2018), pp. 3, 5. Another often cited example of smart contracts is the self-executing lease, see Schrey and Thalhofer (2017), p. 1431; Hofmann (2019), pp. 125, 128–129.
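
How such a self-executing lease could work is sketched below (my illustration only; class and method names are hypothetical and not taken from the cited literature). The point of the sketch is that the code itself withholds performance, so breach becomes factually impossible rather than merely sanctionable:

```python
from datetime import date
from typing import Optional

class SmartLease:
    """Illustrative self-executing lease: access to the leased object
    (e.g., via a smart lock) is granted only while rent has been
    received. All names and the payment ledger are hypothetical."""

    def __init__(self) -> None:
        self.paid_through: Optional[date] = None  # last month paid for

    def record_payment(self, paid_through: date) -> None:
        self.paid_through = paid_through

    def unlock_door(self, today: date) -> bool:
        # The code enforces the lease term directly; it also ignores any
        # defences the tenant might have, which is the over-enforcement
        # concern discussed by Hofmann (2019).
        if self.paid_through is None:
            return False
        return (today.year, today.month) <= (self.paid_through.year,
                                             self.paid_through.month)

lease = SmartLease()
lease.record_payment(paid_through=date(2019, 6, 1))
print(lease.unlock_door(date(2019, 6, 15)))  # True: rent received
print(lease.unlock_door(date(2019, 8, 1)))   # False: door stays locked
```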

  15. Pergande (2019).

  16. Compare North (1990), p. 54: “[T]he inability of societies to develop effective, low-cost enforcement of contracts is the most important source of both historical stagnation and contemporary underdevelopment in the Third World.”

  17. For the status quo, and also on the “danger of establishing an infrastructure of censorship”, see Kastl (2016), p. 671.

  18. Kühl (2019).

  19. See Article 6 (2) COM (2018) 640 final, 12 September 2018.

  20. See § 46 (1) Geldwäschegesetz (GwG).

  21. See also Rich (2013), pp. 795 et seq.

  22. Quoted and translated from Maak (2019).

  23. Maak (2019).

  24. For an example of GPS data already being used for law enforcement purposes, see the interview with Andreas Winkelmann (Berlin Public Prosecutor’s Office) in Kensche (2019).

  25. See, e.g., Bundestag (2017), p. 21: “The highly or fully automated driving function used must also be able to comply with road traffic regulations during its operation” (my translation).

  26. Under § 1a (2), No. 3, German Road Traffic Act (StVG), there may at present be no automotive impossibility structures: the “technical equipment” for the “highly or fully automated driving function […] must be manually overrideable or deactivatable at any time by the driver”.
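
A minimal control sketch of these two statutory requirements follows (my illustration, assuming hypothetical function and parameter names that are not taken from the StVG or the cited sources): the automated function targets rule-compliant behavior, but any manual input by the driver prevails, which is precisely why the car is not (yet) an impossibility structure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DriverInput:
    override: bool = False    # driver steers or brakes manually
    deactivate: bool = False  # driver switches the automation off

def automated_target_speed(planned_kmh: float, speed_limit_kmh: float,
                           driver: DriverInput) -> Optional[float]:
    """Return the automation's target speed, or None if the driver has
    taken over. Compliance with traffic rules reflects § 1a (2) No. 2
    StVG; the unconditional manual override reflects § 1a (2) No. 3."""
    if driver.deactivate or driver.override:
        return None  # manual control always wins over the automation
    return min(planned_kmh, speed_limit_kmh)  # rule-compliant target

print(automated_target_speed(140.0, 120.0, DriverInput()))               # 120.0
print(automated_target_speed(140.0, 120.0, DriverInput(override=True)))  # None
```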

  27. Cf. German Constitutional Court 1 BvR 142/15 “Kfz-Kennzeichenkontrollen II” (18 December 2018), para 94, and—for a more detailed analysis—below, at notes 78 et seq.

  28. See, e.g., Lisken (1998), pp. 22, 24 on the one hand (“imposition of a state of siege,” my translation) and, responding thereto, Schwabe (1998), p. 710 (“Albanians and Kurds [arriving on] giants’ ships,” my translation). For a positive example of calm discussion and reflection see Wysk (2018), pp. 141 et seq. On how paradoxical the supposed dichotomy of freedom and security is on closer inspection, see Heckmann (2005), pp. 183, 184–185, 199–201.

  29. For example, di Fabio (2008), p. 422: “The liberal constitutional state does not want […] peace at any price, but […] peace for free people” (my translation).

  30. Compare for this systematization also Spindler (2017), p. 2307: “chilling effects” and “overblocking.” From a private law perspective see Kuhlmann (2019), pp. 122 et seq.

  31. See esp. German Constitutional Court 1 BvR 518/02 “Rasterfahndung” (4 April 2006), BVerfGE 115, pp. 349–351; German Constitutional Court 1 BvR 256/08 “Vorratsdatenspeicherung” (2 March 2010), BVerfGE 125, p. 320; and German Constitutional Court 1 BvR 2074/05 “Kfz-Kennzeichenkontrollen I” (11 March 2008), BVerfGE 120, p. 402. For the in part sharp criticism of this line of argument, see esp. Dreier (2013), para 87; critical also Trute (2009), pp. 99 et seq. But see recently German Constitutional Court 1 BvR 142/15 “Kfz-Kennzeichenkontrollen II” (18 December 2018), para 51: for the first time, the “chilling effect” that modern surveillance technologies might trigger is no longer mentioned explicitly; instead, the Court invokes the claim of a liberal community that its members need not constantly account for their own law-abidingness. That is not the same thing. Nevertheless, a deliberate departure from the “chilling effect” argument was probably not intended, as the Court sees the new decision in continuity with its previous line of case law (“cf.”-reference in para 51 to the case law just cited in this footnote).

  32. Compare Staben (2016), pp. 102 et seq.

  33. On the rather weak empirical basis, see Staben (2016), p. 159.

  34. Especially clear in this respect: German Constitutional Court 1 BvR 2074/05 “Kfz-Kennzeichenkontrollen I” (11 March 2008), BVerfGE 120, p. 402: “Natural behavior is particularly jeopardized if the range of investigative measures contributes to a risk of abuse and a sense of being monitored” (my translation). See also, on why the closely related privacy claim (the “this is no one’s business!” argument) also loses weight in view of the ever more specific functioning of monitoring technologies, Rademacher (2017), pp. 396 et seq.

  35. In essence, the argument about chilling effects seems to be twofold: in part it concerns the prevention of abuse through the misconduct of individual civil servants, but in its extreme form it is also about preventing totalitarianism. In any case, it is an attempt to make a diffuse mistrust, inevitably abstract in a functioning democratic constitutional state, workable not only procedurally but also at the level of substantive (constitutional) law. See also Rademacher (2017), pp. 400 et seq. For advanced technical instruments to reduce this kind of mistrust, see Feiten et al. (2016), pp. 314 et seq.

  36. Similarly Cheng (2006), p. 716.

  37. The growing concerns of recent years about algorithmic–factual (rather than normative) control of the individual usually focus on the aspect of “unconscious control through technology design” (emphasis added); thus, emphatically, Hoffmann-Riem (2017), passim and esp. pp. 24 and 35, with further ref. also to the influential writings of Mireille Hildebrandt. However, impossibility structures can certainly have a conscious controlling effect, as shown by the examples above, such as upload filters or autonomous driving.

  38. Cf. Rahwan and Cebrian (2018) with the request to establish “Machine Behavior Studies.” It should be noted that complying with that request would not necessarily mean accepting the legal personality of machines (see Kersten (2015), pp. 6 et seq.): the (social) scientific observation of technology and the recognition of its role as an agent do not imply an equation of technology and (human-like) personality.

  39. See in this context, early on, the petition of Hoffmann-Riem (2001), para 95, arguing that the constitutional ban on censorship should also apply vis-à-vis individuals, not only vis-à-vis the state. It should be noted, however, that other than through AI-based recognition software, Facebook and Co. could not enforce their “community standards” and would then run the risk, at least in part, of becoming communicative “garbage dumps” and thereby squandering social acceptance.

  40. See, as an example of a newly created obligation to suppress online content, Sect. 474.34 [Removing, or ceasing to host, abhorrent violent material] of the Australian Criminal Code Act 1995 as amended by the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill 2019.

  41. See note 35.

  42. The term “Streubreite,” as it is used in German constitutional law and translated as “scatter range” here (compare also Rich (2013), p. 812, who speaks of “overinclusive” and “underinclusive” impossibility structures), is not yet fully contoured. Here, it is used in the sense of German Constitutional Court 1 BvR 518/02 “Rasterfahndung” (4 April 2006), BVerfGE 115, p. 354: “Fundamental rights limitations [are] characterized by a high scatter range [if] they include numerous persons in the scope of the measure who have no relation to a specific misconduct and have not caused the intervention by their behavior […].” Compare also ibid., p. 357, (β) for the application of the definition.

  43. For example, § 34 (2) of the German Criminal Code (StGB): “However, this [= the justification of prima facie illegal conduct] applies only if the act is an appropriate means of averting the risk.”

  44. Hoffmann-Riem (2017), pp. 29 et seq., 34 et seq. with further ref.; see also Buchholtz (2020), paras 15–18.

  45. Cf., with numerous ref. also to the case law of the European Court of Justice, Spindler (2017), p. 2307. From the case law of the German Constitutional Court, see above all German Constitutional Court 1 BvR 518/02 “Rasterfahndung” (4 April 2006), BVerfGE 115, pp. 354 and 357. See also Rademacher (2017), pp. 394–396.

  46. See the overview of corresponding “expert assessments” provided by Müller and Bostrom (2016) and for an updated version Grace et al. (2018).

  47. Johnson (2014), p. 113.

  48. See only § 1a (2), No. 2 German Road Traffic Act (StVG): “Motor vehicles with a highly or fully automated driving function […] are those that have technical equipment […] that is capable during highly or fully automated vehicle control of complying with traffic regulations aimed at the vehicle’s guidance.” Emphasis added.

  49. See note 19.

  50. § 28 (2) No. 4 Administrative Procedure Act (VwVfG).

  51. Eberl (2018), pp. 11–13. For definition, see the European Commission’s High Level Expert Group on Artificial Intelligence (2019), p. 6, available at: ec.europa.eu/digital-single-market/en/news/definition-artificial-intelligence-main-capabilities-and-scientific-disciplines.

  52. The objections raised by Kotsoglou (2014) against the automation of applying the law (nowadays usually referred to as “legal tech”) were based on the strong contextual dependence of legal language and seemed plausible at the time, as logical (i.e., rule-based automated) systems were unable to contextualize. Today these objections lose their force in light of neural networks, i.e., systems that no longer follow strict logic and are thus no longer categorically incompatible with the partly illogical nature of legal language. Obviously, other problems arise, like the infamous “black box” effect, i.e., a lack of transparency of such systems (cf. on AI and transparency esp. Wischmeyer (2020a)).

  53. See only Eberl (2018), pp. 12 et seq.

  54. See also Wischmeyer (2020b), para 61, correctly observing that many of the negative side effects that are produced by new surveillance technologies are difficult to grasp by strictly legal (rather than political or philosophical) argumentation.

  55. For the US discussion about a “freedom to commit crime,” see Rich (2013), pp. 809–812 with further references.

  56. Luhmann (1972), pp. 304 et seq.

  57. See Hoffmann-Riem (2017), p. 34.

  58. Instead, action may be taken in the “spot check” mode (above, Sect. 2.1), meaning, for example, that interim injunctions may prevent publication; that is then not censorship in the sense of Article 5 (1), sentence 3, GG.

  59. However, on the very restrictive interpretation of what constitutes censorship, see Grabenwarter (2013), Art. 5 Abs. 1, Abs. 2, paras 115 et seq. Critical of that restrictive interpretation, esp. Hoffmann-Riem (2001), paras 91–93.

  60. Cf. Hoffmann-Riem (2001), Art. 5 Abs. 1, 2 para 89.

  61. See above, at notes 36 et seq. and 39 et seq., respectively.

  62. See, e.g., Waechter (2016), p. 99.

  63. Maak (2019).

  64. Correctly noted, recently, by Bull (2019), pp. 57, 61, 94.

  65. In the form of an expectation that certain offenses will be prosecuted at only low intensity: a “sporting chance to get away with crime,” Rich (2013), p. 810.

  66. See, e.g., Wischmeyer (2020b), para 61; Hofmann (2019), pp. 137–139; idem (2018), pp. 22 et seq., with a focus on copyright law; Kuhlmann (2019), pp. 123 et seq.; Möllers (2018), p. 478; Boehme-Neßler (2017), pp. 3036 et seq. But see also Cheng (2006), p. 655, esp. p. 671, according to whom “[w]e would need to reduce our focus on individual rights and acknowledge the importance of more community-oriented, social welfare goals. After all, the freedom to break the law […] comes with other costs to freedom […].”

  67. Kube (2019), p. 316 (my translation).

  68. See only the references in note 2.

  69. Hartzog et al. (2015), p. 1794.

  70. United States v. Jones, 565 U.S. 400 (2012), p. 418 (Alito, J., concurring).

  71. Compare United States v. Jones, 565 U.S. 400 (2012), p. 430 (Alito, J., concurring).

  72. While Justice Alito could not prevail with this reasoning in Jones, it was recently adopted expressly by the majority in Carpenter, which subjected the retrieval of mobile phone connection data by law enforcement, likely contrary to the previous legal doctrine, to the requirements of the Fourth Amendment (“probable cause”). See Carpenter v. United States, 585 U.S. ___ (2018), p. 12 (Roberts, C.J., for the majority).

  73. For example, German Constitutional Court 1 BvR 2074/05 “Kfz-Kennzeichenkontrollen I” (11 March 2008), BVerfGE 120, pp. 401 and 407, and esp. German Constitutional Court 1 BvR 518/02 “Rasterfahndung” (4 April 2006), BVerfGE 115, pp. 349–351: “From a constitutional point of view, the new quality of police investigation measures leads to an increased level of intervention” (my translation).

  74. On the principle of “full effectiveness” (effet utile) in EU law, see Schmidt-Aßmann (2006), pp. 59–62.

  75. A concrete proposal, namely “to keep humans in the loop,” can be found in Hartzog et al. (2015), p. 1786. The proposal recalls Article 22 GDPR and, given the arbitrariness the authors themselves demonstrate, can hardly be effective. The same goes for the concept offered by Rich (2013), pp. 805–828, which is ultimately limited to a case-by-case assessment of chilling and scattering effects on the one hand and the state’s obligation to protect against crime and the interests of potential victims on the other.

  76. Dilkes (2013). Emphasis added. An attempt by Republican legislators to altogether ban the use of “traffic surveillance systems” failed in the Iowa Senate in March 2018, see Pfannenstiel (2018).

  77. See just above, at note 70.

  78. With regard to, for example, passenger name record analysis Arzt (2017), p. 1027. See also Ruthig (2019), Vorbemerkung, paras 3–5; § 4, para 1.

  79. German Constitutional Court 1 BvR 142/15 “Kfz-Kennzeichenkontrollen II” (18 December 2018), para 94 (my translation).

  80. Ibid.

  81. Detailed analysis in Rademacher (2017), pp. 403–405.

  82. German Constitutional Court 1 BvR 142/15 “Kfz-Kennzeichenkontrollen II” (18 December 2018), para 100, as part of an “overall assessment.”

  83. See above, at note 22.

  84. Cf. Marsch (2018), pp. 108–121, and Poscher (2017), pp. 131 et seq.

  85. On the other hand, consideration is already being given to incorporating data protection regulations in impossibility structures by means of “tagging” electronic data to secure legal compliance, see, e.g., Executive Office of the President (2014), p. 56, with examples on pp. 28, 29–30, 67. For discussion under the GDPR Hartung (2018), Art. 23 DSGVO, para 16.
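
How such “tagging” might be implemented can be sketched as follows (a hypothetical illustration of the idea only, not a scheme taken from the cited report): each record carries a machine-readable tag of the purposes for which processing is permitted, and every access is checked against that tag, so a non-compliant use is technically refused rather than merely prohibited:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TaggedRecord:
    """A record carrying its own machine-readable policy tag, i.e., the
    processing purposes permitted for it (hypothetical sketch)."""
    payload: str
    permitted_purposes: frozenset

class PolicyViolation(Exception):
    pass

def read(record: TaggedRecord, purpose: str) -> str:
    # The purpose limitation travels with the data: the infrastructure
    # itself enforces it, instead of sanctioning violations afterwards.
    if purpose not in record.permitted_purposes:
        raise PolicyViolation(f"purpose {purpose!r} not permitted")
    return record.payload

rec = TaggedRecord("customer address", frozenset({"billing", "shipping"}))
print(read(rec, "shipping"))  # allowed
# read(rec, "marketing")      # would raise PolicyViolation
```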

References

  • Arzt C (2017) Das neue Gesetz zur Fluggastdatenspeicherung—Einladung zur anlasslosen Rasterfahndung durch das BKA. Die Öffentliche Verwaltung 2017(24):1023–1030

  • Boehme-Neßler V (2017) Die Macht der Algorithmen und die Ohnmacht des Rechts. Neue Juristische Wochenschrift 2017(42):3031–3037

  • Buchholtz G (2020) Artificial intelligence and legal tech. In: Wischmeyer T, Rademacher T (eds) Regulating artificial intelligence. Springer, Heidelberg

  • Bull HP (2019) Über die rechtliche Einbindung der Technik. Der Staat 58(1):57–100

  • Bundestag (2017) Gesetzesentwurf der Bundesregierung “Änderung des Straßenverkehrsgesetzes”. BT-Drucksache 18/11300

  • Cheng EK (2006) Structural laws and the puzzle of regulating behavior. Northwest Univ Law Rev 100(2):655–717

  • Di Fabio U (2008) Sicherheit in Freiheit. Neue Juristische Wochenschrift 2008(7):421–425

  • Dilkes EM (2013) Ordinance amending title 9, entitled “motor vehicles and traffic,” of the city code by adopting an ordinance similar in substance to the proposed initiative on traffic enforcement cameras and drones, automatic license plate recognition systems and other kinds of traffic surveillance systems, and by repealing ordinance no. 12-4466 that enabled automatic traffic enforcement. https://de.scribd.com/doc/145746424/Iowa-anti-drone-bill. Accessed 8 July 2019

  • Dreier H (2013) Art. 2 I. In: idem (ed) Grundgesetz Kommentar: GG, Vol I: Präambel, Art. 1–19, 3rd edn. Mohr Siebeck, Tübingen

  • Eberl U (2018) Was ist Künstliche Intelligenz—was kann sie leisten? Aus Politik und Zeitgesch 2018(6–8):8–14

  • Eschenbruch K, Gerstberger R (2018) Smart Contracts: Planungs-, Bau- und Immobilienverträge als Programm? Neue Zeitschrift für Baurecht und Vergaberecht 2018(1):3–8

  • European Commission’s High-Level Expert Group on Artificial Intelligence (2019) A definition of AI. https://ec.europa.eu/digital-single-market/en/news/definition-artificial-intelligence-main-capabilities-and-scientific-disciplines. Accessed 8 July 2019

  • Executive Office of the President (2014) Big data: seizing opportunities, preserving values. https://obamawhitehouse.archives.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf. Accessed 9 July 2019

  • Feiten L, Sester S, Zimmermann C, Volkmann S, Wehle L, Becker B (2016) Revocable anonymisation in video surveillance: a “digital cloak of invisibility”. In: Kreps D, Fletcher G, Griffiths M (eds) Technology and intimacy: choice or coercion. Springer, Heidelberg, pp 314–327

  • Ferguson AG (2017) The rise of big data policing. New York University Press, New York

  • Fries M, Paal BP (2019) Smart contracts. Mohr Siebeck, Tübingen

  • Funke A (2007) Gleichbehandlungsgrundsatz und Verwaltungsverfahren—Die Rechtsprechung des BVerfG zu strukturell bedingten Vollzugsdefiziten. Archiv des öffentlichen Rechts 132(2):168–214

  • Grabenwarter C (2013) Art. 5 Abs. 1, Art. 5 Abs. 2. In: Maunz T, Dürig G (eds) Grundgesetz Kommentar, installment 68 January 2013. C.H. Beck, München

  • Grace K, Salvatier J, Dafoe A, Zhang B, Evans O (2018) When will AI exceed human performance? Evidence from AI experts. arXiv:1705.08807v3. https://arxiv.org/pdf/1705.08807.pdf. Accessed 4 July 2019

  • Hartung J (2018) Art. 25 DSGVO. In: Kühling J, Buchner B (eds) DS-GVO, BDSG, 2nd edn. C.H. Beck, München

  • Hartzog W, Conti G, Nelson J, Shay LA (2015) Inefficiently automated law enforcement. Michigan State Law Review 2015(5):1763–1796

  • Heckmann D (2005) Das Paradoxon von individueller Freiheit und öffentlicher Sicherheit. Elemente einer Theorie komplementärer Risikoverteilung in Raum und Zeit. In: Alexy R (ed) Juristische Grundlagenforschung. Franz Steiner Verlag, Stuttgart, pp 183–201

  • Hofmann F (2019) Smart contracts und Overenforcement. In: Fries M, Paal B (eds) Smart contracts. Mohr Siebeck, Tübingen, pp 125–140

  • Hoffmann-Riem W (2001) Art. 5 Abs. 1, 2. In: Denninger E, Hoffmann-Riem W, Schneider HP, Stein E (eds) Alternativkommentar Grundgesetz, vol 1. Luchterhand, Neuwied

  • Hoffmann-Riem W (2016) Innovation und Recht. Mohr Siebeck, Tübingen

  • Hoffmann-Riem W (2017) Verhaltenssteuerung durch Algorithmen—Eine Herausforderung für das Recht. Archiv des öffentlichen Rechts 142(1):1–42

  • Johnson CL (2014) Context and machine learning. In: Brézillon P, Gonzalez AJ (eds) Context in computing. Springer, Heidelberg, pp 113–126

  • Kastl G (2016) Filter—Fluch oder Segen: Möglichkeiten und Grenzen von Filtertechnologien zur Verhinderung von Rechtsverletzungen. Gewerblicher Rechtsschutz und Urheberrecht 2016(7):671–678

  • Kensche C (2019) Männlich, Anfang 20, Migrationshintergrund. Die Welt. https://www.welt.de/politik/deutschland/article189538329/Raser-in-Berlin-Maennlich-Anfang-20-Migrationshintergrund.html. Accessed 3 July 2019

  • Kersten J (2015) Menschen und Maschinen: Rechtliche Konturen instrumenteller, symbiotischer und autonomer Konstellationen. JuristenZeitung 70(1):1–8

  • Kotsoglou KN (2014) Subsumtionsautomat 2.0: Über die (Un-)Möglichkeit einer Algorithmisierung der Rechtserzeugung. JuristenZeitung 69(9):451–457

  • Kreßner M (2019) Gesteuerte Gesundheit. Nomos, Baden-Baden

  • Kube H (2019) E-Government: ein Paradigmenwechsel in Verwaltung und Verwaltungsrecht? Veröffentlichungen der Vereinigung der Deutschen Staatsrechtslehrer 78:289–326

  • Kühl E (2019) Vertrauen Sie dieser Website nicht. Die ZEIT. https://www.zeit.de/digital/internet/2019-01/microsoft-edge-browser-newsguard-fake-news-internet. Accessed 3 July 2019

  • Kuhlmann N (2019) Smart enforcement bei smart contracts. In: Fries M, Paal B (eds) Smart contracts. Mohr Siebeck, Tübingen, pp 117–124

  • Lisken H (1998) „Verdachts- und ereignisunabhängige Personenkontrollen zur Bekämpfung der grenzüberschreitenden Kriminalität“? Neue Zeitschrift für Verwaltungsrecht 1998(1):22–26

  • Lübbe-Wolff G (1996) Modernisierung des Umweltordnungsrechts. Economia-Verlag, Bonn

  • Luhmann N (1972) Funktion und Folgen formaler Organisation, 2nd edn. Duncker & Humblot, Berlin

  • Maak N (2019) Sie sind alle auf 180. Frankfurter Allgemeine Sonntagszeitung 2019(10):38

  • Marks A, Bowling B, Keenan C (2017) Automatic justice? Technology, crime, and social control. In: Brownsword R, Scotford E, Yeung K (eds) The Oxford handbook of law, regulation, and technology. Oxford University Press, New York, pp 705–730

  • Marsch N (2018) Das europäische Datenschutzgrundrecht. Mohr Siebeck, Tübingen

  • Mayntz R (1980, 1983) Implementation politischer Programme, Vol. I and II. Verlagsgruppe Athenaeum Hain Scriptor Hanstein, Königstein/Ts

  • Möllers C (2018) Die Möglichkeit der Normen. Suhrkamp, Berlin

  • Müller V, Bostrom N (2016) Future progress in artificial intelligence: a survey of expert opinion. In: Müller V (ed) Fundamental issues of artificial intelligence. Springer, Heidelberg, pp 555–572

  • Mulligan CM (2008) Perfect enforcement of law: when to limit and when to use technology. Richmond J Law Technol 14(4):1–49

  • North DC (1990) Institutions, institutional change and economic performance. Cambridge University Press, Cambridge

  • Paulus D, Matzke R (2018) Smart Contracts und das BGB—Viel Lärm um nichts? Zeitschrift für die gesamte Privatrechtswissenschaft 2018(4):431–465

  • Pergande F (2019) Arme Schwarzfahrer? Frankfurter Allgemeine Zeitung 2018(45):10

  • Pfannenstiel B (2018) Iowa House votes to regulate, not ban, traffic cameras as heated debate continues. Des Moines Register. https://eu.desmoinesregister.com/story/news/politics/2018/03/14/iowa-house-votes-regulate-not-ban-traffic-cameras-heated-debate-continues/423930002. Accessed 8 July 2019

  • Poscher R (2017) The right to data protection. In: Miller RA (ed) Privacy and power. Cambridge University Press, Cambridge, pp 129–142

  • Rademacher T (2017) Predictive Policing im deutschen Polizeirecht. Archiv des öffentlichen Rechts 142(3):366–416

  • Rademacher T (2020) Artificial intelligence and law enforcement. In: Wischmeyer T, Rademacher T (eds) Regulating artificial intelligence. Springer, Heidelberg

  • Rahwan I, Cebrian M (2018) Machine Behavior Needs to Be an Academic Discipline: Why should studying AI behavior be restricted to those who make AI? Nautilus 058. http://nautil.us/issue/58/self/machine-behavior-needs-to-be-an-academic-discipline. Accessed 3 July 2019

  • Rich ML (2013) Should we make crime impossible? Harv J Law Public Policy 36(2):795–848

  • Rosenthal D (2011) Assessing digital preemption (and the future of law enforcement?). New Crim Law Rev 14(4):576–610

  • Ruthig J (2019) Fluggastdatengesetz—FlugDaG. In: Schenke WR, Graulich K, Ruthig J (eds) Sicherheitsrecht des Bundes, 2nd edn. C.H. Beck, München

  • Schmidt-Aßmann E (2006) Das allgemeine Verwaltungsrecht als Ordnungsidee, 2nd edn. Springer, Berlin

  • Schrey J, Thalhofer T (2017) Rechtliche Aspekte der Blockchain. Neue Juristische Wochenschrift 2017(20):1431–1436

  • Schwabe J (1998) Kontrolle ist schlecht, Vertrauen allein der Menschenwürde gemäß? Neue Zeitschrift für Verwaltungsrecht 1998(7):709–711

  • Spindler G (2017) Das neue Telemediengesetz—WLAN-Störerhaftung endlich adé? Neue Juristische Wochenschrift 2017(32):2305–2309

  • Staben J (2016) Der Abschreckungseffekt auf die Grundrechtsausübung. Mohr Siebeck, Tübingen

  • Trute HH (2009) Grenzen des präventionsorientierten Polizeirechts in der Rechtsprechung des Bundesverfassungsgerichts. Die Verwaltung 2009(1):85–104

  • Waechter K (2016) Sicherheit und Freiheit in der Rechtsphilosophie. Mohr Siebeck, Tübingen

  • Wischmeyer T (2020a) AI and transparency: opening the black box. In: Wischmeyer T, Rademacher T (eds) Regulating artificial intelligence. Springer, Heidelberg

  • Wischmeyer T (2020b) Regierungs- und Verwaltungshandeln durch KI. In: Ebers M et al (eds) Rechtshandbuch KI & Robotik. C.H. Beck, München

  • Wysk P (2018) Tausche Freiheit gegen Sicherheit? Die polizeiliche Videoüberwachung im Visier des Datenschutzes. Verwaltungsarchiv 2018(2):141–162

Author information

Correspondence to Timo Rademacher.

Additional information

This paper is a translation of the German original that was published by Mohr Siebeck in 74 JuristenZeitung [JZ] (2019), pp. 702–710. Therefore, the argumentation is mainly focused on German law and the respective discourse, but it also applies, mutatis mutandis, to the case law of the European Court of Justice, in particular with regard to the right to informational self-determination (see, on consolidation of this right under EU law, Rademacher (2020), paras 19–22). The author cordially thanks Prof. Dr. Jens-Peter Schneider for his many valuable comments on the original paper.

About this article

Cite this article

Rademacher, T. Of New Technologies and Old Laws: Do We Need a Right to Violate the Law?. Eur J Secur Res 5, 39–58 (2020). https://doi.org/10.1007/s41125-019-00064-7
