Algorithm control as a regulatory task
The complex data processes essential to running the "Internet of Things", to control processes in infrastructure regulation and to digital production rest on programme code that structures and steers algorithms.
Algorithms are not neutral. They invariably embody the ethical premises and control objectives of their programmers, which generally remain hidden from those affected by them. The more tasks society outsources, wholly or partially, to (wholly or partially) automated systems, and the more sensitive (in terms of personal information) the spheres in which self-learning systems become involved and feed information into data processing, the more urgent the questions of the (constitutional) legal limits of their use, of suitable control methods and of state organisational structures to keep the technical possibilities within the bounds of what is beneficial and acceptable for the common good. Delegating ever further-reaching decisions to ever more complexly configured, non-transparent and autonomously acting systems threatens a serious loss of control if risk prevention is insufficient. Proper Big Data regulation therefore requires effective algorithm control.
Effective algorithm control requires a basic understanding of the technical interrelationships as well as consideration of legal, ethical and economic perspectives.
Delegating statutory decisions to digital assistance systems involves Big Data-specific risks of discrimination, which put the constitutional principle of equality as well as the social-state right to participation to the legal test. Quality standards for programming must therefore take the software and its self-learning mechanisms into account.
The model of transparent administration (open government) as well as the potential benefit of more efficient, fully automated administrative processes are at odds with the opacity of self-learning Big Data analytical algorithms: the less comprehensible the analysis results that the analytical tools and decision-making assistants used by the administration produce for its decision-making process, and the less comprehensible the recommendations based on those results, the greater the constitutional concerns about their use with regard to the duty to state reasons and the clarity of accountability.
The phenomenon of algorithm-based decision-making also touches on the ban on automatically generated individual decisions, previously laid down in § 6a BDSG [German Federal Data Protection Act] and now, as its European Union-law "replacement", in Art. 22 GDPR. The GDPR erects hurdles to the admissibility of automatically generated decisions, including so-called profiling. Such decisions are lawful only if they are necessary in a contractual relationship, are carried out on the basis of a relevant legal provision, or the affected parties have given their consent. The (regulated) forms of self-regulation defined in Art. 40 ff. GDPR and the forms of co-regulation have become a model in this respect. The data protection impact assessment is also assigned an important role as an instrument of regulation.
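The three admissibility exceptions described above can be illustrated as a simple decision check. This is a minimal sketch for illustration only, not legal advice or an established API; all class and function names are assumptions made here.

```python
# Illustrative sketch (not legal advice): the exceptions to the ban on solely
# automated individual decisions under Art. 22(2) GDPR, modelled as a check.
# All names below are hypothetical, chosen for this example.

from dataclasses import dataclass


@dataclass
class AutomatedDecision:
    necessary_for_contract: bool  # necessary for entering into or performing a contract
    authorised_by_law: bool       # authorised by Union or Member State law
    explicit_consent: bool        # based on the data subject's explicit consent


def is_admissible(decision: AutomatedDecision) -> bool:
    """A solely automated decision (including profiling) is admissible
    only if at least one of the three exceptions applies."""
    return (decision.necessary_for_contract
            or decision.authorised_by_law
            or decision.explicit_consent)


# Example: a decision with none of the exceptions falls under the ban.
print(is_admissible(AutomatedDecision(False, False, False)))  # False
print(is_admissible(AutomatedDecision(False, False, True)))   # True
```

The point of the sketch is that the exceptions are alternatives: any one of them suffices, and absent all three the automated decision is inadmissible.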
With a view to the consumer protection aspects of algorithm control in the Internet of Things, the project is financed in part by third-party funds. With its proposal for algorithm control as a consumer-policy protection mechanism, the programme area was successful in the tender process for the subsidy programme "Promoting Innovation in Consumer Protection in the Law and the Economy - Consumer-Related Research on the 'Internet of Things'" of the German Federal Ministry of Justice and Consumer Protection (BMJV). The research project analysed regulatory approaches for controlling the switching points in the Internet of Things and drafted practical solution proposals for the effective implementation of algorithm control. More information about the third-party financed project "Algorithm Control in the Internet of Things" can be found here.
Martini, Mario/Nink, David, Wenn Maschinen entscheiden... - vollautomatisierte Verwaltungsverfahren und der Persönlichkeitsschutz (When Machines Decide... - Fully Automated Administrative Procedures and the Protection of Personality Rights), NVwZ 2017, p. 681 f. (short version); NVwZ-Extra 10/2017, p. 1 ff. (long version).
Note: The text on this home page is copyrighted. It is taken verbatim from, or based on, Martini, "Digitalisierung als Herausforderung und Chance für Staat und Verwaltung" (Digitalisation as a Challenge and Opportunity for State and Administration), FÖV Discussion Paper No. 85, 2016, in particular p. 42 ff.