The decision in Orange România SA v Autoritatea Naţională de Supraveghere a Prelucrării Datelor cu Caracter Personal doesn’t particularly break new ground, but rather reinforces what we already know about consent to data processing in the EU, namely:
- Consent must be freely given, specific, informed and unambiguous; and
- Silence, inactivity or pre-ticked boxes do not meet this standard.
Researchers are currently searching for a vaccine to protect against the virus and/or medication for the effective treatment of coronavirus patients. That means a lot of scientific research, which in turn often leads to questions about the protection of personal data. To what extent can health data be used for these purposes, and how does one ensure that – even in times of crisis – the requirements of the General Data Protection Regulation (“GDPR”) are met?
During the COVID-19 pandemic, data privacy – and, in particular, employee data privacy – has been at the forefront of employers’ minds. In the last six months, employers across the globe have been required to give careful thought to a whole host of potential issues, from contact tracing apps to temperature and other health checks in the workplace, as well as processing an increasing volume of their staff’s health data. Whilst not COVID-19 related, a recent decision from the Hamburg Commissioner for Data Protection and Freedom of Information in Germany (the “Commissioner”) is an important reminder of the very significant financial and reputational sanctions an employer may face if it does not appropriately collect, retain and protect employee personal data in line with the GDPR.
H&M was hit earlier this month with the second-largest fine under the GDPR to date after a series of data protection failings relating to its employment practices at its Nuremberg service centre were investigated by the State Data Protection Commissioner in Hamburg.
In November 2018, a data security vulnerability in the systems of Vastaamo Oy (“Vastaamo”), a major provider of psychotherapy services in Finland, led to the names, personal identity numbers, and patient records of at least 40,000 patients being stolen by an unknown hacker.
Uber drivers from across Europe have brought a claim against Uber in the Netherlands (where Uber’s data is based), alleging that Uber has breached the GDPR.
Article 22 GDPR states that, with limited exceptions, data subjects have the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them. If such an automated decision is made, the data subject has the right to an explanation of how the decision was made and to contest the decision.
Cymone Gosnell
In 2016, the European Union (“EU”) created heightened data privacy rights for its citizens by enacting the General Data Protection Regulation (“GDPR”). The most drastic change from the previous regulation, enacted in 1995, lies in the expanded territorial scope. The change now subjects companies to fines for violations of the regulation, even if those companies are not domiciled in the EU. Data privacy has always been considered a fundamental human right in the EU; within the United States, however, there is no fundamental right to privacy. Rather, the country’s privacy laws are based on a complicated sectoral structure that often leaves the country’s citizens confused as to what rights they actually have. This paper will review the fundamental differences between EU and United States privacy law, the changes implemented by the GDPR (including the expanded territorial scope), the compliance plans of some major players within the United States, and what the future looks like for American businesses that hold or process the data of EU citizens under the GDPR.
Computer Law & Security Review, Volume 36, April 2020.
Vagelis Papakonstantinou | Paul de Hert
In this article, we provide an overview of the literature on chilling effects and corporate profiling, while also connecting the two topics. We start by explaining how profiling, in an increasingly data-rich environment, creates substantial power asymmetries between users and platforms (and corporations more broadly). Inferences and the increasingly automated nature of decision-making, both based on user data, are essential aspects of profiling. We then connect chilling effects theory and the relevant empirical findings to corporate profiling. We first stress the relationship and similarities between profiling and surveillance. Second, we describe chilling effects as a result of state and peer surveillance specifically. We then show the interrelatedness of corporate and state profiling, and finally spotlight the customization of behaviour and behavioural manipulation as particularly significant issues in this discourse. This is complemented with an exploration of the legal foundations of profiling through an analysis of European and US data protection law. We find that while Europe has a clear regulatory framework in place for profiling, the US primarily relies on a patchwork of sector-specific or state laws. Further, there is an attempt to regulate differential impacts of profiling via anti-discrimination statutes, yet few policies focus on combating generalized harms of profiling, such as chilling effects. Finally, we devise four concise propositions to guide future research on the connection between corporate profiling and chilling effects.
Journal of the Korea Institute of Information Security & Cryptology, Volume 29, Issue 6, Pages 1477-1489, 2019.
In this study, we analyzed the privacy policies of 50 Android applications that are on the top chart in EU member states to present methods for enhancing transparency based on the GDPR (General Data Protection Regulation). Based on the guidelines on transparency stipulated by WP29, this study extracted factors of transparency in order to ensure the transparency of personal data processing and carried out verification procedures for each factor. The results revealed that the privacy policies provided in the Google Play Store and in the applications themselves need to match, that the privacy policies need to be written in clear and plain language for readers to understand easily, and that it is necessary to provide information quickly and improve the descriptions of the information which the data controller discloses. The research findings of this study could be used as preliminary data for proactive responses to the EU’s GDPR by substantially complying with the GDPR’s transparency requirements.
Part of the Philosophical Studies Series book series (PSSP, volume 137)
This open access book presents an ethical approach to utilizing personal medical data. It features essays that combine academic argument with practical application of ethical principles. The contributors are experts in ethics and law. They address the challenges in the re-use of medical data of the deceased on a voluntary basis. This pioneering study looks at the many factors involved when individuals and organizations wish to share information for research, policy-making, and humanitarian purposes. |
This book was published under a CC-BY 4.0 license. |