The project is devoted to regulatory approaches in the context of digitalisation that are based on the findings of behavioural economics (so-called law-and-behavioural-economics approach), using the examples of privacy-friendly data protection default settings and “nudging as an instrument of government”.
Classically, state regulation for the protection of citizens rests on the idea of guiding behaviour through commands and prohibitions on the one hand, and through incentives such as taxes and subsidies on the other. Both regulatory models assume that the law-abiding and (purely) rationally acting person orients their actions, as a homo oeconomicus, to legal requirements in order to avoid disadvantages and obtain advantages. Data protection law essentially adheres to this concept as well, since it is based on a prohibition subject to reservations of permission (Art. 6, 9 GDPR).
However, modern behavioural studies increasingly call the homo oeconomicus model into question and yield specific findings on how people actually behave: with bounded rationality. The best-known example is that people stick to preset defaults even when there is no rational economic reason to do so (the so-called “default effect”). On this basis, regulatory models can be developed that no longer appeal to people as rational beings, but instead attempt to exploit, or at least mitigate, their irrationality. Such nudging or de-biasing is the subject of intensive discussion in government and academia. In political philosophy it has become known as “libertarian paternalism”.
Digitalisation in particular offers countless possibilities for making use of nudging, to the benefit as well as at the expense of individuals. On the Internet, for example, it became customary over time for controllers to obtain the consent required for data processing via a preset option (so-called opt-out). The GDPR is intended to counteract the possibility that service providers exploit such behavioural-economic effects to the disadvantage of data subjects: it prohibits consent defaults in certain situations (Art. 25(2) GDPR). As clear as this requirement appears, it amounts to little in practice: major Internet companies have attempted from the start to evade the prohibition, for example through a tendentious presentation of the decision situation or by suggesting, or actually imposing, a compulsion to decide on users. The exact legal limits will remain disputed for a long time.
The state, too, increasingly attempts to steer citizens toward desired behaviour not only expressly and normatively, through commands and prohibitions, but also through cleverly designed choice architectures that nudge them in a particular direction where this promises to achieve the regulatory goal more economically. Particularly for the transition from classic in-person administration to e-government, this appears to be a feasible method of moving citizens in this direction and getting them actually to use the new electronic services.
Prof. Dr. Mario Martini