Carrefour fined €3.8 million for GDPR non-compliance

Carrefour, the French supermarket multinational with operations in more than 30 countries, has been fined €3 million (nearly R$20 million) for multiple violations of the General Data Protection Regulation (GDPR). The information comes from Infosecurity Magazine.

According to the outlet, the fine was imposed by France's National Commission on Informatics and Liberty (CNIL), one of Europe's leading GDPR enforcement authorities. In addition to the global supermarket chain, Carrefour Banque, known in Brazil as Carrefour Soluções Financeiras, was also fined €800,000 (more than R$5 million) by the regulator.

H&M hit with €35.3m fine for GDPR employee breach

How did H&M’s internal data collection processes land it with the second largest fine in GDPR enforcement history?

The key takeaway

Despite the catastrophic financial impact of COVID-19, the Hamburg State Commissioner for Data Protection and Freedom of Information (HmbBfDI) showed no signs of leniency, issuing H&M the second largest fine ever handed to a single company for a breach of the GDPR.

The background

The HmbBfDI announced on 1 October 2020 that it had fined the German subsidiary of fashion retailer H&M €35.3 million for the unlawful monitoring of employees in its centrally operated service centre in Nuremberg. On the same day, H&M announced it was to close 250 of its stores globally.

Thinking outside the (pre-ticked consent) box

The decision in Orange România SA v Autoritatea Naţională de Supraveghere a Prelucrării Datelor cu Caracter Personal doesn’t particularly break new ground, but rather reinforces what we already know about consent to data processing in the EU, namely:

  • That it must be freely given, specific, informed and unambiguous; and
  • That silence, inactivity or pre-ticked boxes do not meet this standard.
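As a purely illustrative sketch (not drawn from the ruling itself), the two criteria above could be expressed as a validity check on a consent record; the field names and accepted actions here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Hypothetical record of a user's consent to data processing."""
    purpose: str      # specific: one named processing purpose
    info_shown: bool  # informed: the notice was actually displayed
    user_action: str  # how the consent was expressed

def consent_is_valid(c: ConsentRecord) -> bool:
    """Only an affirmative act by the user counts: silence, inactivity,
    and pre-ticked boxes left untouched are rejected."""
    affirmative = c.user_action in {"checked_box", "clicked_agree"}
    return affirmative and c.info_shown and bool(c.purpose)

# A pre-ticked box the user never touched is not valid consent:
print(consent_is_valid(ConsentRecord("marketing emails", True, "pre_ticked_default")))  # False
print(consent_is_valid(ConsentRecord("marketing emails", True, "checked_box")))         # True
```

The point of the sketch is that validity hinges on what the user actively did, not on a default state of the interface.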

GDPR offers possibilities for scientific research in the context of COVID-19 (but must be observed)

Researchers are currently hunting for the right vaccine to protect against the virus and/or medication for the effective treatment of coronavirus patients. That means a lot of scientific research, which in turn often raises questions about the protection of personal data. To what extent can health data be used for these purposes, and how does one ensure that, even in times of crisis, the requirements of the General Data Protection Regulation (“GDPR”) are met?

€35 Million Fine Issued Under GDPR For Employee Monitoring And IT Security Failings In Germany

During the COVID-19 pandemic, data privacy – and, in particular, employee data privacy – has been at the forefront of employers’ minds. In the last six months, employers across the globe have been required to give careful thought to a whole host of potential issues, from contact tracing apps to temperature and other health checks in the workplace, as well as processing an increasing volume of their staff’s health data. Whilst not COVID-19 related, a recent decision from the Hamburg Commissioner for Data Protection and Freedom of Information in Germany (the “Commissioner”) is an important reminder of the very significant financial and reputational sanctions an employer may face if it does not appropriately collect, retain and protect employee personal data in line with the GDPR.

A Cautionary Tale of Data Breaches and the GDPR after Hacker Steals Extremely Sensitive Data of 40,000 Psychotherapy Patients

In November 2018, a data security vulnerability in the systems of Vastaamo Oy (“Vastaamo”), a major provider of psychotherapy services in Finland, led to the names, personal identity numbers, and patient records of at least 40,000 patients being stolen by an unknown hacker.

Automated decision-making: Uber drivers claim they are deactivated automatically, without the ability to object, in breach of the GDPR

Uber drivers from across Europe have brought a claim against Uber in the Netherlands (where Uber’s European data operations are based), claiming that Uber has breached the GDPR.

Article 22 GDPR states that, with limited exceptions, data subjects have the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them. If such an automated decision is made, the data subject has the right to an explanation of how the decision was made and to contest the decision.
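To make the safeguard concrete, here is a minimal, hypothetical sketch (not Uber’s actual system) of how a platform might hold a solely automated adverse decision for human review rather than letting it take effect immediately:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    """Hypothetical decision emitted by an automated system."""
    subject_id: str
    outcome: str       # e.g. "deactivate"
    automated: bool    # made without meaningful human involvement?
    explanation: str   # the reasoning the subject is entitled to see

def apply_decision(d: Decision, human_review_queue: list) -> str:
    """Article 22-style safeguard (illustrative): a solely automated
    decision with significant effects is routed to a human reviewer,
    together with its explanation, instead of being applied directly."""
    if d.automated and d.outcome == "deactivate":
        human_review_queue.append(d)
        return f"held for human review: {d.explanation}"
    return f"applied: {d.outcome}"

queue: list = []
d = Decision("driver-42", "deactivate", automated=True,
             explanation="fraud score exceeded threshold")
print(apply_decision(d, queue))
# held for human review: fraud score exceeded threshold
```

The design choice illustrated is that the explanation travels with the decision, so both the reviewer and the affected person can see the basis for it.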

The General Data Protection Regulation: American Compliance Overview and the Future of the American Business

Cymone Gosnell

In 2016, the European Union (“EU”) created heightened data privacy rights for its citizens by enacting the General Data Protection Regulation (“GDPR”). The most drastic change from the previous regime, the 1995 Data Protection Directive, lies in the expanded territorial scope: companies can now be fined for violations of the regulation even if they are not domiciled in the EU. Data privacy has long been considered a fundamental right in the EU; within the United States, by contrast, there is no fundamental right to privacy. Rather, the country’s privacy laws are based on a complicated sectoral structure that often leaves its citizens confused as to what rights they actually have. This paper will review the fundamental differences between EU and United States privacy law, the changes implemented by the GDPR (including the expanded territorial scope), the compliance plans of some major players within the United States, and what the future looks like for American businesses that hold or process the data of EU citizens under the GDPR.

Big data analytics in electronic communications: A reality in need of granular regulation (even if this includes an interim period of no regulation at all)

Computer Law & Security Review, Volume 36, April 2020.

Vagelis Papakonstantinou | Paul de Hert

In this article, we provide an overview of the literature on chilling effects and corporate profiling, while also connecting the two topics. We start by explaining how profiling, in an increasingly data-rich environment, creates substantial power asymmetries between users and platforms (and corporations more broadly). Inferences and the increasingly automated nature of decision-making, both based on user data, are essential aspects of profiling. We then connect chilling effects theory and the relevant empirical findings to corporate profiling. We first stress the relationship and similarities between profiling and surveillance. Second, we describe chilling effects as a result of state and peer surveillance, specifically. We then show the interrelatedness of corporate and state profiling, and finally spotlight the customization of behaviour and behavioural manipulation as particularly significant issues in this discourse. This is complemented with an exploration of the legal foundations of profiling through an analysis of European and US data protection law. We find that while Europe has a clear regulatory framework in place for profiling, the US primarily relies on a patchwork of sector-specific or state laws. Further, there is an attempt to regulate differential impacts of profiling via anti-discrimination statutes, yet few policies focus on combating generalized harms of profiling, such as chilling effects. Finally, we devise four concise propositions to guide future research on the connection between corporate profiling and chilling effects.