[Book] Proteção de Dados: contextos, narrativas e elementos fundantes

The field of personal data protection is in ferment. First, because of the inescapable reality that data has become the key asset for shaping public policy and business models. Second, because the enactment of the Lei Geral de Proteção de Dados (LGPD) introduces a new and important piece into the Brazilian legal order. With this in mind, I felt motivated to select and, as far as possible, revisit my main reflections from years of academic and professional engagement in the field.

The enactment of a general personal data protection law does not mark the end of the debates or the solution of the problems that the information society poses to us. It is necessary to build a culture around the subject and, in the legal field, to forge a sophisticated doctrine capable of balancing the protection of fundamental freedoms against the economic interests at stake whenever personal data is processed. This book is, therefore, a gesture of dialogue with whoever comes to be its reader, so that we can reflect together on the complexity of the subject. It is also, finally, an acknowledgment of the exchanges with the co-authors who kindly authorized me to republish these texts.

The book Proteção de Dados: contextos, narrativas e elementos fundantes is a collection of 20 articles written by Bruno Bioni and various specialists in the field.

LGPD: The main disputes already brought before the courts

With the exception of its penalty provisions, the LGPD came into force on 18 September 2020 and, even before completing its first year in force, has already given rise to important disputes before the courts.

The purpose of this article is to analyze some of these disputes in order to help companies understand the risks involved in the processing of personal data and to indicate possible measures to mitigate them.

It should be made clear that none of the decisions below can be regarded as final or as a precedent of the respective court, given that case law on the subject, like the data protection culture in Brazil, is still in the early stages of development.

Utilities Governed Like Empires

By Cory Doctorow | August 3, 2021 | Source: EFF

Believe the Hype

After decades of hype, it’s only natural for your eyes to skate over corporate mission statements without stopping to take note of them, but when it comes to ending your relationship with a tech giant, its stated goals take on a sinister cast.

Whether it’s “bringing the world closer together” (Facebook), “organizing the world’s information” (Google), being a market “where customers can find and discover anything they might want to buy online” (Amazon), or making “personal computing accessible to each and every individual” (Apple), the founding missions of tech giants reveal a desire to become indispensable to our digital lives.

They’ve succeeded. We’ve entrusted these companies with our sensitive data, from family photos to finances to correspondence. We’ve let them take over our communities, from medical and bereavement support groups to little league and service organization forums. We’ve bought trillions of dollars’ worth of media from them, locked in proprietary formats that can’t be played back without their ongoing cooperation.

These services often work great…but when they fail, they fail very, very badly. Tech giants can run servers to support hundreds of millions or billions of users – but they either can’t or won’t create equally user-centric procedures for suspending or terminating those users.

But as bad as tech giants’ content removal and account termination policies are, they’re paragons of sense and transparency when compared to their appeals processes. Many who try to appeal a tech company’s judgment quickly find themselves mired in a Kafkaesque maze of automated emails (to which you often can’t reply), requests for documents that either don’t exist or have already been furnished on multiple occasions, and high-handed, terse “final judgments” with no explanations or appeal.

The tech giants argue that they are entitled to run their businesses largely as they see fit: if you don’t like the house rules, just take your business elsewhere. These house rules are pretty arbitrary: platforms’ public-facing moderation policies are vaguely worded and subject to arbitrary interpretation, and their account termination policies are even more opaque. 

Kafka Was An Optimist

All of that would be bad enough, but when it is combined with the tech companies’ desire to dominate your digital life and become indispensable to your daily existence, it gets much worse.

Losing your cloud account can cost you decades of your family photos. Losing access to your media account can cost you access to thousands of dollars’ worth of music, movies, audiobooks and ebooks. Losing your IoT account can render your whole home uninhabitable, freezing the door locks while bricking your thermostat, burglar alarm and security cameras. 

But really, it’s worse than that: you will incur multiple losses if you get kicked off just one service. Losing your account with Amazon, Google or Apple can cost you access to your home automation and security, your mobile devices, your purchased ebooks/audiobooks/movies/music, and your photos. Losing your Apple or Google account can cost you decades’ worth of personal correspondence – from the last email sent by a long-dead friend to that file-attachment from your bookkeeper that you need for your tax audit. These services are designed to act as your backup – your offsite cloud, your central repository – and few people understand or know how to make a local copy of all the data that is so seamlessly whisked from their devices onto big companies’ servers.

In other words, the tech companies set out to make us dependent on them for every aspect of our online lives, and they succeeded – but when it comes to kicking you off their platforms, they still act like you’re just a bar patron at last call, not someone whose life would be shattered if they cut you off.

YouTubers Warned Us

This has been brewing for a long time. YouTubers and other creative laborers have long suffered under a system where the accounts on which they rely to make their livings could be demonetized, suspended or deleted without warning or appeal. But today, we’re all one bad moderation call away from having our lives turned upside-down.

The tech giants’ conquest of our digital lives is just getting started. Tech companies want to manage our health, dispense our medication, take us to the polls on election day, televise our political debates and teach our kids. Each of these product offerings comes with grandiose pretensions to total dominance – it’s not enough for Amazon Pharmacy to be popular, it will be the most popular, leveraging Amazon’s existing business to cut off your corner druggist’s market oxygen (Uber’s IPO included a plan to replace all the world’s public transit and taxi vehicles with rideshares). 

If the tech companies deliver on their promises to their shareholders, then being locked out of your account might mean being locked out of whole swathes of essential services, from buying medicine to getting to work.

Well, How Did We Get Here?

How did the vibrant electronic frontier become a monoculture of “five websites, each consisting of screenshots of text from the other four?” 

It wasn’t an accident. Tech, copyright, contract and competition policy helped engineer this outcome, as did VCs and entrepreneurs who decided that online businesses were only worth backing if they could grow to world-dominating scale.

Take laws like Section 1201 of the Digital Millennium Copyright Act, a broadly worded prohibition on tampering with or removing DRM, even for lawful purposes. When Congress passed the DMCA in 1998, they were warned that protecting DRM – even when no copyright infringement took place – would leave technology users at the mercy of corporations. You may have bought your textbooks or the music you practice piano to, but if it’s got DRM and the company that sold it to you cuts you off, the DMCA does not let you remove that DRM (say goodbye to your media). 

Companies immediately capitalized upon this dangerously broad law: they sold you media that would only play back on the devices they authorized. That locked you into their platform and kept you from defecting to a rival, because you couldn’t take your media with you. 

But even as DRM formats proliferated, the companies that relied on them continued to act like kicking you off their platforms was like the corner store telling you to buy your magazines somewhere else – not like a vast corporate empire of corner stores sending goons to your house to take back every newspaper, magazine and paperback you ever bought there, with no appeal.

It’s easy to see how the DMCA and DRM give big companies far-reaching control over your purchases, but other laws have had a similar effect. The Computer Fraud and Abuse Act (CFAA), another broadly worded mess of a law, is so badly drafted that tech companies were able to claim for decades that simply violating their terms of service could be a crime – a chilling claim that was only put to rest by the Supreme Court this summer.

From the start, tech lawyers and the companies they worked for set things up so that most of the time, our digital activities are bound by contractual arrangements, not ownership. These are usually mass contracts, with one-sided terms of service. They’re end user license agreements that ensure that the company has a simple process for termination without any actual due process, much less strong remedies if you lose your data or the use of your devices.  

CFAA, DMCA, and other rules allowing easy termination and limiting how users and competitors could reconfigure existing technology created a world where doing things that displeased a company’s shareholders could literally be turned into a crime – a kind of “felony contempt of business-model.” 

These kinds of shady business practices wouldn’t have been quite so bad if there were a wide variety of small firms that allowed us to shop around for a better deal. 

Unfortunately, the modern tech industry was born at the same moment as American antitrust law was being dismantled – literally. The Apple ][+ appeared on shelves the same year Ronald Reagan hit the campaign trail. After winning office, Reagan inaugurated a 40-year, bipartisan project to neuter antitrust law: allowing incumbents to buy and crush small companies before they could grow to be threats, letting giant companies merge with their direct competitors, and looking the other way while companies established “vertical monopolies” that controlled their whole supply chains.

Without any brakes, the runaway merger train went barrelling along, picking up speed. Today’s tech giants buy companies more often than you buy groceries, and it has turned the tech industry into a “kill-zone” where innovative ideas go to die.

How is it that you can wake up one day and discover you’ve lost your Amazon account, and get no explanation? How is it that this can cost you the server you run your small business on, a decade of family photos, the use of your ebook reader and mobile phone, and access to your entire library of ebooks, movies and audiobooks?

Simple. 

Amazon is in so many parts of your life because it was allowed to merge with small competitors, create vertical monopolies, wrap its media with DRM – and never take on any obligations to be fair or decent to customers it suspected of some unspecified wrongdoing. 

Not just Amazon, either – every tech giant has an arc that looks like Amazon’s, from the concerted effort to make you dependent on its products, to the indifferent, opaque system of corporate “justice” governing account termination and content removal.

Fix the Tech Companies

Companies should be better. Moderation decisions should be transparent, rules-based, and follow basic due process principles. All of this – and more – has been articulated in detail by an international group of experts from industry, the academy, and human rights activism, in an extraordinary document called The Santa Clara Principles. Tech companies should follow these rules when moderating content, because even if they are free to set their own house rules, the public has the right to tell them when those rules suck and to suggest better ones.

If a company does kick you off its platform – or if you decide to leave – they shouldn’t be allowed to hang onto your data (or just delete it). It’s your data, not theirs. The concept of a “fiduciary” – someone with a duty to “act in good faith” towards you – is well-established. If you fire your lawyer (or if they fire you as a client), they have to give you your files. Ditto your doctor or your mental health professional. 

Many legal scholars have proposed creating “information fiduciary” rules that create similar duties for firms that hold your data. This would impose a “duty of loyalty” (to act in the best interests of their customers, without regard to the interests of the business), and a “duty of care” (to act in the manner expected by a reasonable customer under the circumstances). 

Not only would this go a long way to resolving the privacy abuses that plague our online interactions – it would also guarantee you the right to take your data with you when you left a service, whether that departure was your idea or not. 

Information fiduciary isn’t the only way to get companies to be responsible. Direct consumer protection laws — such as requiring companies to make your content readily available to you in the event of termination — could too (there are other approaches as well).  How these rules would apply would depend on the content they host as well as the size of the business you’re dealing with – small companies would struggle to meet the standards we’d expect of giant companies. But every online service should have some duties to you – if the company that just kicked you off its servers and took your wedding photos hostage is a two-person operation, you still want your pictures back!

Fix the Internet

Improving corporate behavior is always a laudable goal, but the real problem with giant companies that are entwined in your life in ways you can’t avoid isn’t that those companies wield their incredible power unwisely. It’s that they have that power in the first place.

To give power to internet users, we have to take it away from giant internet companies. The FTC – under new leadership – has pledged that it will end decades of waving through anticompetitive mergers. That’s just for openers, though. Competition scholars and activists have made the case for the harder task of breaking up the giants, literally cutting them down to size.

But there’s more. Congress is considering the ACCESS Act, landmark legislation that would force the largest companies to interoperate with privacy-respecting new rivals, who’d be banned from exploiting user data. If the ACCESS Act passes, it will dramatically lower the high switching costs that keep us locked into big platforms even though we don’t like the way they operate. It also protects folks who want to develop tools to make it easier for you to take your data when you leave, whether voluntarily or because your account is terminated.

That’s how we’ll turn the internet back into an ecosystem of companies, co-ops and nonprofits of every size that can take receipt of your data, and offer you an online base of operations from which you can communicate with friends, communities and customers regardless of whether they’re on the indieweb or inside a Big Tech silo.

That still won’t be enough, though. The fact that terms of service, DRM, and other technologies and laws can prevent third parties from supplying software for your phone, playing back the media you’ve bought, and running the games you own still gives big companies too much leverage over your digital life.

That’s why we need to restore the right to interoperate, in all its guises: competitive compatibility (the right to plug new products and services into existing ones, with or without permission from their manufacturers), bypassing DRM (we’re suing to make this happen!), the right to repair (a fight we’re winning!) and an end to abusive terms of service (the Supreme Court got this one right).

Digital Rights are Human Rights

When we joined this fight, 30 long years ago, very few people got it. Our critics jeered at the very idea of “digital rights” – as if the nerdfights over Star Trek forums could somehow be compared to history’s great struggles for self-determination and justice! Even a decade ago, the idea of digital rights was greeted with jeers and skepticism.

But we didn’t get into this to fight for “digital rights” – we’re here to defend human rights. The merger of the “real world” and the “virtual world” could be argued over in the 1990s, but not today, not after a lockdown where the internet became the nervous system for the planet, a single wire we depended on for free speech, a free press, freedom of assembly, romance, family, parenting, faith, education, employment, civics and politics.

Today, everything we do involves the internet. Tomorrow, everything will require it. We can’t afford to let our digital citizenship be reduced to a heavy-handed mess of unreadable terms of service and broken appeals processes.

We have the right to a better digital future – a future where the ambitions of would-be monopolists and their shareholders take a back-seat to fairness, equity, and your right to self-determination.

The LGPD and the debate over in re ipsa moral damages

In September 2020, Law 13,709/2018, the Lei Geral de Proteção de Dados (LGPD), came into force, having among its foundations (Article 2) respect for privacy and the inviolability of intimacy, honor and image. In Article 52, the LGPD sets out the administrative sanctions, which fall within the exclusive competence of the Autoridade Nacional de Proteção de Dados (ANPD) and take effect as of the coming 1 August.

Even before that date, however, Brazilian courts have been called upon to settle disputes whose subject matter is the violation of personal data, accompanied by claims for moral damages. What can be observed at present is an attempt to build a body of case law, which for now is divided between the concept of in re ipsa moral damages and the requirement of actual proof of harm.

[Book] Políticas Digitais no Brasil: Acesso à internet, Proteção de Dados e Regulação

This volume is the result of a partnership between the Law School of Fundação Getulio Vargas in Rio de Janeiro (FGV DIREITO RIO) and the International Telecommunication Union (ITU). The reflections included in this book were developed by civil servants who took part in the Digital Policy Course, an initiative born of the partnership between FGV DIREITO RIO and the ITU, dedicated to training the public servants of a modern administration prepared to face and take advantage of digitalization. The book recognizes the crucial importance of the people on whom the smooth functioning of a modern administration depends, with a view to fostering a sustainable digital environment. In this sense, the work aims to provide valuable elements for understanding technological and regulatory challenges, such as expanding Internet access, protecting personal data and promoting cybersecurity, and to offer the tools needed to address them. This book was published thanks to the generous sponsorship of the International Telecommunication Union (ITU).

FGV Digital Repository

Cross-Jurisdiction Privacy Project (CJPP) Compendium

Titled “Privacy Laws & Digital Advertising: Multi-jurisdictional Overview and Implications”, the Cross-Jurisdiction Privacy Project (CJPP) compendium examines the privacy laws of eleven jurisdictions and how they apply to the digital advertising ecosystem.

Developed by IAB US, the material was produced with contributions from professionals in several countries, including Brazil. Accordingly, the document also covers the Lei Geral de Proteção de Dados (LGPD). That specific chapter was translated by IAB Brasil in order to make it more accessible to the Brazilian market, and it is available for reading here.

The LGPD Penal draft bill and the rules on the international transfer of personal data

The report was produced with the support of the British Embassy in Brazil.

A report by ITS Rio addresses the LGPD Penal draft bill and its relationship to security operations and the international transfer of data:

The LGPD Penal draft bill was conceived to provide legal certainty for these activities by addressing the processing of data for public security and criminal prosecution purposes, a matter outside the scope of the Lei Geral de Proteção de Dados (LGPD). The challenge for the proposal is to implement a robust and effective system for public security while at the same time guaranteeing data protection.

ITS Rio

New rules on protection of transfers of personal data outside European Union

Recently, there have been a number of important developments that affect how organisations facilitate the transfer of personal data out of the European Union in accordance with the EU General Data Protection Regulation (GDPR).

In brief, the developments are as follows:

  • A new set of official template clauses has been published by the European Commission to help organisations ensure that personal data transferred out of the European Union is protected – organisations that are considering implementing these clauses should be aware of some key dates.
  • The European Data Protection Board has released final form recommendations to help organisations assess the risks involved in transferring personal data outside the European Union and identify the appropriate supplementary measures to be implemented where needed.

Organisations that are subject to the GDPR and transfer personal data outside the European Union, as well as organisations that receive personal data from within the European Union, are highly likely to be affected by these developments.

Insights into the Future of Data Protection Enforcement: Regulatory Strategies of European Data Protection Authorities for 2021-2022

[Future of Privacy Forum]

We have compiled and analyzed these novel strategic documents, describing where different DPA strategies have touchpoints and noteworthy particularities. The report contains links to and translated summaries of 15 DPAs’ strategic documents from DPAs in France (FR), Portugal (PT), Belgium (BE), Norway (NO), Sweden (SE), Ireland (IE), Bulgaria (BG), Denmark (DK), Finland (FI), Latvia (LV), Lithuania (LT), Luxembourg (LU) and Germany (Bavaria). The analysis also includes documents published by the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS). These documents complement or replace the ones that were included in our 2020 report.

GDPR 3 years on – The greatest hits (and misses)

[Lexology]

More than three years have passed since the GDPR began to apply, and a lot has happened in the world of data protection during that time – fines, class actions, court challenges and more. We give our “playlist” of the greatest hits (and misses). Our previous article marking 12 months of GDPR had a cinematic theme. Now, we’re giving the three-year anniversary of GDPR a musical twist.

1. All the Single Ladies (credit to Beyoncé Knowles, Terius Nash, Thaddis Harrell, and Christopher Stewart)

Brexit: “If you liked it, you should[n’t] have put a [referendum] on it…”

The United Kingdom left the European Union and now has its own data protection regime in the form of the Data Protection Act 2018 and the UK GDPR. For now, this is largely based on the EU GDPR but we expect further divergence in future as the UK seeks to establish itself as a favourable place for overseas companies to do business.