
The Corona crisis and the barriers of Data Protection Law

Data Protection Law calls for balanced solutions in the Corona crisis too. In that regard, the design of the Corona app is exemplary. By Professor Moritz Hennemann

The Corona pandemic is affecting us all – in previously unknown ways. From time to time, in the new series 'Passau University Perspectives', researchers at the University of Passau will be taking a look at current developments from the perspective of their discipline.

Data protection does not always have it easy. In that sense, the General Data Protection Regulation (GDPR) has done data protection a 'favour', albeit a somewhat double-edged one. On the one hand, the regulation, which has been applicable since 2018, has succeeded in drawing people's attention more strongly to data protection. On the other hand, its provisions, often rather abstract, have not escaped criticism. It has been extolled as a 'gold standard', for example by the former Green MEP Jan Philipp Albrecht, but it has also been condemned as the 'greatest catastrophe of the 21st century' by the IT law expert Thomas Hoeren. Yet the GDPR is extremely successful, at least at the meta level. It can certainly hold its own in a 'race of legal systems', because the European Union's model has imitators all over the world.

Undisputed, moreover, is that Data Protection Law follows a fundamental approach that is, at its origin, worthy of approval and offers the individual powerful protection. The 'data subject' is the crux of this regime of standards, which gives the regulation an unmistakably German trademark. Germany may, with a pinch of salt at least, be considered a pioneer of Data Protection Law. The German discourse is also coloured by negative historical experiences with data processing on the part of the state. The 'total registration' (Aly / Roth) at the time of National Socialism (1933–1945) is an example that both deters and warns. This sensitivity to the processing of data by the state, very pronounced and rightly so, became particularly evident in Germany in connection with the census of 1983, for example.

Awareness of the fact that personal data can potentially be abused is thus part of the DNA of (German) Data Protection Law.

Professor Moritz Hennemann, University of Passau

Such potential abuse needs to be nipped in the bud from several points of view: first, out of awareness of the accumulation of power by the state; and also from a legal perspective, because data protection law takes effect at the very start of any processing of personal data, however minor. It is also absolutely necessary, particularly in the spirit of a state based on the principles of liberty and of an open civil society, for clear action to be taken to counter any 'screening' of citizens by the state. The omniscient state is a dystopia. The 'right to informational self-determination' expresses the concept that, first and foremost, the processing of personal data requires a 'positive decision' by the person concerned. Other kinds of processing require further justification of a supra-individual kind. The discourse on data protection law during the course of the Corona crisis has made this clear yet again.


Professor Moritz Hennemann

researches developments in data law

Which regulatory models for digital interaction should we be following in the 21st century?


Professor Moritz Hennemann has held the Chair of European and International Information and Data Law and headed the Research Centre for Law and Digitalisation (FREDI) of the Faculty of Law at the University of Passau since 2020. His research revolves around the global development of data and data protection law as well as the legal and regulatory framework of the digital economy.

Are data protection and health absolute values?

Some of those involved in the public debate at the beginning of the Corona crisis had a blinkered view of things. They proclaimed the prevention of major health risks as a quasi-absolute guiding principle for action. That is perfectly understandable in view of the uncertain forecasts of the spread of the COVID-19 virus. However, earnest and important though the protection of individual and public health is, it is still a value quite unsuitable for absolutisation in a society based on the principles of liberty. No value protected by basic rights, apart from inviolable human dignity, is absolute in the modern German constitutional state. So health protection per se does not outrank other basic rights. And that certainly applies in equal measure to data protection. On the contrary, colliding basic rights need to be weighed against one another and levelled out in a conciliatory manner. A detailed analysis of the collisions of interest is necessary for this. These parameters are the plumb line for the assessment of data protection in the Corona crisis. In this sense, even – and especially – in such an unusual crisis, that process of weighing up can end either to the advantage or to the disadvantage of health protection, depending on the case at hand.

Between tracing and data donation

One aspect that has attracted a lot of attention is the extent to which the design of the various apps intended to help combat the crisis conforms with Data Protection Law. Having said that, clear distinctions need to be made between the various applications. The voluntary Corona data donation app from the Robert Koch Institute, for example, aims to gain broad insights into the spread of the virus and the symptoms associated with it. Yet a whole range of other applications are also imaginable, for example for the general monitoring of citizens' mobility, the enforcement of quarantine or the targeted messaging of individual users to inform them about contacts with other users. The main public focus is on these tracing apps. Such apps now exist worldwide, but they have different designs: the various approaches save different data for different periods of time and use different sources. It may be presumed that the majority of countries are investing in a Bluetooth-based solution. In this kind of solution, as a standard feature, user IDs are mutually exchanged between devices within Bluetooth range – with varying degrees of (de)centralisation. In terms of technological functionality, that raises the question of whether and to what extent the respective user IDs are brought together on a central server. With a decentralised solution, after one person reports an infection, contact matching takes place only on the smartphones of the other users. Germany introduced a decentralised open-source solution in mid-June.
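To illustrate what this decentralised design means in practice, here is a minimal, purely illustrative Python sketch of Bluetooth-style contact matching on the users' own devices. The names used here (Phone, new_rolling_id, check_exposure) are hypothetical; real protocols, such as the one underlying the German app, derive the rolling IDs cryptographically and add many further safeguards that are omitted in this simplification.

```python
import secrets


class Phone:
    """Greatly simplified model of one participant's smartphone."""

    def __init__(self) -> None:
        self.own_ids: list[bytes] = []          # rotating pseudonymous IDs this phone broadcasts
        self.observed_ids: set[bytes] = set()   # IDs received from nearby phones via Bluetooth

    def new_rolling_id(self) -> bytes:
        """Generate and remember a fresh, short-lived pseudonymous ID."""
        rid = secrets.token_bytes(16)
        self.own_ids.append(rid)
        return rid

    def observe(self, rid: bytes) -> None:
        """Record an ID broadcast by another phone within Bluetooth range."""
        self.observed_ids.add(rid)

    def check_exposure(self, published_positive_ids: set[bytes]) -> bool:
        """Decentralised matching: compare locally stored contact IDs against
        the IDs voluntarily published by users who have reported an infection."""
        return not self.observed_ids.isdisjoint(published_positive_ids)


# Usage: two phones meet; later, user A reports an infection and publishes only
# their own rolling IDs – no location data and no contact graph leave the device.
phone_a, phone_b = Phone(), Phone()
phone_b.observe(phone_a.new_rolling_id())    # encounter within Bluetooth range

published = set(phone_a.own_ids)             # the server merely relays A's own IDs
print(phone_b.check_exposure(published))     # True – matching happens on B's device
```

The point of the decentralised approach is visible in the last lines of the sketch: the server only relays the IDs of users who have chosen to report an infection, while the actual matching – and with it the knowledge about who met whom – remains on each individual smartphone.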

The German solution is thus designed to be as data-protection-friendly as possible. It is a very long way from the continuous, or even centralised, gathering of location data, the compilation of detailed movement profiles of individuals and other 'big brother' functionalities of a police state, and rightly so. This applies all the more because the German app is based on voluntary participation. In international comparison – and this does not come as much of a surprise – that is not something that can be taken for granted. Some solutions are, at least from a German perspective, scarcely imaginable at all in terms of Data Protection Law. For example, there are reports to the effect that in Poland, persons in quarantine are not only under obligation to use an appropriate app, but also to transmit a selfie several times a day, with location data, to prove that they are at home. In South Korea, quite apart from an app, CCTV cameras, GPS data and credit card transactions are evaluated, and users are given information about the actual whereabouts of infected individuals. In China, apparently, elaborate movement patterns are also compiled and health data gathered systematically.

Has Data Protection Law proved its worth in the Corona crisis?

Under Data Protection Law, any lawful processing of personal data must rest on a defined legal basis – a so-called form of permission. The classical version of this is consent within the meaning of data protection law. Such consent must be given voluntarily, after receipt of proper information, and with reference to a particular purpose of data processing; for the processing of health data, it must be given 'expressly'. The German solution, too, is based on consent. For one thing, this makes sense with a view to the acceptance of such an app by citizens in Germany. Consent, moreover, is probably also the most watertight variant in legal terms: on the one hand, Data Protection Law presumably applies in spite of the decentralised processing, whilst on the other it is doubtful whether there is any other legal basis for the app at the present time. In that sense, it should be noted that data protection law places legal limits on the way we combat the crisis. This is not a case of 'anything goes', be it with regard to consent or in light of the other options the member states have to deviate in favour of health protection – but it is not a case of 'nothing goes' either.

Back to the roots?

Against this backdrop, the conformity of the German Corona app with Data Protection Law must also be assessed by applying general criteria. These include, above all, the 'voluntary basis' of the consent involved. That voluntary basis does not merely mean the absence of coercion, but also the absence of 'gentler' forms of pressure. Many kinds of pressure that go beyond obligations imposed by the state are imaginable. The case would be particularly sensitive if the use of the app were coupled indirectly to amenities. There have, for example, been calls for it to be made a prerequisite for visits to restaurants, theatres and other facilities – i.e. to important parts of public life. There are also fears that pressure could be exerted in other constellations, for example an employer instructing an employee to use the app for the protection of the other employees, or 'requiring' him or her to do so. These fears certainly cannot be dismissed out of hand without a closer look. So, understandably, there are calls for some kind of regulation from the legislators that at least clarifies the position, in order to 'safeguard' 'genuine' voluntariness. Such a 'safeguard' would emphasise the root of data protection, which has at times been lost from view. In the sense of an autonomous, private decision taken by the individual concerned, this can only be a welcome thing.

Even if the circumstances that have occasioned all this are dire, it must be said that the public at large has been involved in a complex discourse on data protection in recent months. This kind of attentiveness across society as a whole is something we should definitely wish for data protection in the future, too – regardless of whether one sees data protection as a system of constraint or as a functional 'bastion' – in Germany at least – against overly data-intensive apps or 'big brother' functionalities.

Do you have any questions or ideas on this article? Are you a scientist at the University of Passau who would like to tell us how you see things from the point of view of your discipline? Write to us at: perspektiven@uni-passau.de
