Utilities Governed Like Empires

By Cory Doctorow | August 3, 2021 | Source: EFF

Believe the Hype

After decades of hype, it’s only natural for your eyes to skate over corporate mission statements without stopping to take note of them. But when it comes to ending your relationship with these companies, tech giants’ stated goals take on a sinister cast.

Whether it’s “bringing the world closer together” (Facebook), “organizing the world’s information” (Google), being a market “where customers can find and discover anything they might want to buy online” (Amazon), or making “personal computing accessible to each and every individual” (Apple), the founding missions of tech giants reveal a desire to become indispensable to our digital lives.

They’ve succeeded. We’ve entrusted these companies with our sensitive data, from family photos to finances to correspondence. We’ve let them take over our communities, from medical and bereavement support groups to little league and service organization forums. We’ve bought trillions of dollars’ worth of media from them, locked in proprietary formats that can’t be played back without their ongoing cooperation.

These services often work great – but when they fail, they fail very, very badly. Tech giants can run servers that support hundreds of millions or billions of users – but they either can’t or won’t create equally user-centric procedures for suspending or terminating those users.

But as bad as tech giants’ content removal and account termination policies are, they’re paragons of sense and transparency when compared to their appeals processes. Many who try to appeal a tech company’s judgment quickly find themselves mired in a Kafkaesque maze of automated emails (to which you often can’t reply), requests for documents that either don’t exist or have already been furnished on multiple occasions, and high-handed, terse “final judgments” with no explanations or appeal.

The tech giants argue that they are entitled to run their businesses largely as they see fit: if you don’t like the house rules, just take your business elsewhere. But those house rules are anything but clear: platforms’ public-facing moderation policies are vaguely worded and subject to arbitrary interpretation, and their account termination policies are even more opaque.

Kafka Was An Optimist

All of that would be bad enough, but when it is combined with the tech companies’ desire to dominate your digital life and become indispensable to your daily existence, it gets much worse.

Losing your cloud account can cost you decades of your family photos. Losing access to your media account can cost you access to thousands of dollars’ worth of music, movies, audiobooks and ebooks. Losing your IoT account can render your whole home uninhabitable, freezing the door locks while bricking your thermostat, burglar alarm and security cameras. 

But really, it’s worse than that: you will incur multiple losses if you get kicked off just one service. Losing your account with Amazon, Google or Apple can cost you access to your home automation and security, your mobile devices, your purchased ebooks/audiobooks/movies/music, and your photos. Losing your Apple or Google account can cost you decades’ worth of personal correspondence – from the last email sent by a long-dead friend to that file-attachment from your bookkeeper that you need for your tax audit. These services are designed to act as your backup – your offsite cloud, your central repository – and few people understand or know how to make a local copy of all the data that is so seamlessly whisked from their devices onto big companies’ servers.

In other words, the tech companies set out to make us dependent on them for every aspect of our online lives, and they succeeded – but when it comes to kicking you off their platforms, they still act like you’re just a bar patron at last call, not someone whose life would be shattered if they cut you off.

YouTubers Warned Us

This has been brewing for a long time. YouTubers and other creative laborers have long suffered under a system where the accounts on which they rely to make their livings could be demonetized, suspended or deleted without warning or appeal. But today, we’re all one bad moderation call away from having our lives turned upside-down.

The tech giants’ conquest of our digital lives is just getting started. Tech companies want to manage our health, dispense our medication, take us to the polls on election day, televise our political debates and teach our kids. Each of these product offerings comes with grandiose pretensions to total dominance – it’s not enough for Amazon Pharmacy to be popular, it will be the most popular, leveraging Amazon’s existing business to cut off your corner druggist’s market oxygen (Uber’s IPO included a plan to replace all the world’s public transit and taxi vehicles with rideshares). 

If the tech companies deliver on their promises to their shareholders, then being locked out of your account might mean being locked out of whole swathes of essential services, from buying medicine to getting to work.

Well, How Did We Get Here?

How did the vibrant electronic frontier become a monoculture of “five websites, each consisting of screenshots of text from the other four?” 

It wasn’t an accident. Tech, copyright, contract and competition policy helped engineer this outcome, as did VCs and entrepreneurs who decided that online businesses were only worth backing if they could grow to world-dominating scale.

Take laws like Section 1201 of the Digital Millennium Copyright Act, a broadly worded prohibition on tampering with or removing DRM, even for lawful purposes. When Congress passed the DMCA in 1998, they were warned that protecting DRM – even when no copyright infringement took place – would leave technology users at the mercy of corporations. You may have bought your textbooks or the music you practice piano to, but if it’s got DRM and the company that sold it to you cuts you off, the DMCA does not let you remove that DRM (say goodbye to your media). 

Companies immediately capitalized upon this dangerously broad law: they sold you media that would only play back on the devices they authorized. That locked you into their platform and kept you from defecting to a rival, because you couldn’t take your media with you. 

But even as DRM formats proliferated, the companies that relied on them continued to act like kicking you off their platforms was like the corner store telling you to buy your magazines somewhere else – not like a vast corporate empire of corner stores sending goons to your house to take back every newspaper, magazine and paperback you ever bought there, with no appeal.

It’s easy to see how the DMCA and DRM give big companies far-reaching control over your purchases, but other laws have had a similar effect. The Computer Fraud and Abuse Act (CFAA), another broadly worded mess of a law, is so badly drafted that tech companies were able to claim for decades that simply violating their terms of service could be a crime – a chilling claim that was only put to rest by the Supreme Court this summer.

From the start, tech lawyers and the companies they worked for set things up so that most of the time, our digital activities are bound by contractual arrangements, not ownership. These are usually mass contracts, with one-sided terms of service. They’re end user license agreements that ensure that the company has a simple process for termination without any actual due process, much less strong remedies if you lose your data or the use of your devices.  

CFAA, DMCA, and other rules allowing easy termination and limiting how users and competitors could reconfigure existing technology created a world where doing things that displeased a company’s shareholders could literally be turned into a crime – a kind of “felony contempt of business-model.” 

These kinds of shady business practices wouldn’t have been quite so bad if there were a wide variety of small firms that allowed us to shop around for a better deal. 

Unfortunately, the modern tech industry was born at the same moment as American antitrust law was being dismantled – literally. The Apple ][+ appeared on shelves the same year Ronald Reagan hit the campaign trail. After winning office, Reagan inaugurated a 40-year, bipartisan project to neuter antitrust law: allowing incumbents to buy and crush small companies before they could grow to be threats; letting giant companies merge with their direct competitors; and looking the other way while companies established “vertical monopolies” that controlled their whole supply chains.

Without any brakes, the runaway merger train went barrelling along, picking up speed. Today’s tech giants buy companies more often than you buy groceries, and that acquisition spree has turned the tech industry into a “kill-zone” where innovative ideas go to die.

How is it that you can wake up one day and discover you’ve lost your Amazon account, and get no explanation? How is it that this can cost you the server you run your small business on, a decade of family photos, the use of your ebook reader and mobile phone, and access to your entire library of ebooks, movies and audiobooks?


Amazon is in so many parts of your life because it was allowed to merge with small competitors, create vertical monopolies, wrap its media with DRM – and never take on any obligations to be fair or decent to customers it suspected of some unspecified wrongdoing. 

Not just Amazon, either – every tech giant has an arc that looks like Amazon’s, from the concerted effort to make you dependent on its products, to the indifferent, opaque system of corporate “justice” governing account termination and content removal.

Fix the Tech Companies

Companies should be better. Moderation decisions should be transparent, rules-based, and follow basic due process principles. All of this – and more – has been articulated in detail by an international group of experts from industry, the academy, and human rights activism, in an extraordinary document called The Santa Clara Principles. Tech companies should follow these rules when moderating content, because even if they are free to set their own house rules, the public has the right to tell them when those rules suck and to suggest better ones.

If a company does kick you off its platform – or if you decide to leave – they shouldn’t be allowed to hang onto your data (or just delete it). It’s your data, not theirs. The concept of a “fiduciary” – someone with a duty to “act in good faith” towards you – is well-established. If you fire your lawyer (or if they fire you as a client), they have to give you your files. Ditto your doctor or your mental health professional. 

Many legal scholars have proposed creating “information fiduciary” rules that create similar duties for firms that hold your data. This would impose a “duty of loyalty” (to act in the best interests of their customers, without regard to the interests of the business), and a “duty of care” (to act in the manner expected by a reasonable customer under the circumstances). 

Not only would this go a long way to resolving the privacy abuses that plague our online interactions – it would also guarantee you the right to take your data with you when you left a service, whether that departure was your idea or not. 

Information fiduciary rules aren’t the only way to get companies to be responsible. Direct consumer protection laws – such as requiring companies to make your content readily available to you in the event of termination – could work too (there are other approaches as well). How these rules would apply would depend on the content a service hosts as well as the size of the business you’re dealing with – small companies would struggle to meet the standards we’d expect of giant companies. But every online service should have some duties to you – if the company that just kicked you off its servers and took your wedding photos hostage is a two-person operation, you still want your pictures back!

Fix the Internet

Improving corporate behavior is always a laudable goal, but the real problem with giant companies that are entwined in your life in ways you can’t avoid isn’t that those companies wield their incredible power unwisely. It’s that they have that power in the first place.

To give power to internet users, we have to take it away from giant internet companies. The FTC – under new leadership – has pledged that it will end decades of waving through anticompetitive mergers. That’s just for openers, though. Competition scholars and activists have made the case for the harder task of breaking up the giants, literally cutting them down to size.

But there’s more. Congress is considering the ACCESS Act, landmark legislation that would force the largest companies to interoperate with privacy-respecting new rivals, who’d be banned from exploiting user data. If the ACCESS Act passes, it will dramatically lower the high switching costs that keep us locked into big platforms even though we don’t like the way they operate. It also protects folks who want to develop tools to make it easier for you to take your data when you leave, whether voluntarily or because your account is terminated.

That’s how we’ll turn the internet back into an ecosystem of companies, co-ops and nonprofits of every size that can take receipt of your data, and offer you an online base of operations from which you can communicate with friends, communities and customers regardless of whether they’re on the indieweb or inside a Big Tech silo.

That still won’t be enough, though. The fact that terms of service, DRM, and other technologies and laws can prevent third parties from supplying software for your phone, playing back the media you’ve bought, and running the games you own still gives big companies too much leverage over your digital life.

That’s why we need to restore the right to interoperate, in all its guises: competitive compatibility (the right to plug new products and services into existing ones, with or without permission from their manufacturers), bypassing DRM (we’re suing to make this happen!), the right to repair (a fight we’re winning!) and an end to abusive terms of service (the Supreme Court got this one right).

Digital Rights are Human Rights

When we joined this fight 30 long years ago, very few people got it. Our critics jeered at the very idea of “digital rights” – as if the nerdfights over Star Trek forums could somehow be compared to history’s great struggles for self-determination and justice! Even a decade ago, the idea of digital rights was still met with skepticism.

But we didn’t get into this to fight for “digital rights” – we’re here to defend human rights. The merger of the “real world” and the “virtual world” could be argued over in the 1990s, but not today, not after a lockdown where the internet became the nervous system for the planet, a single wire we depended on for free speech, a free press, freedom of assembly, romance, family, parenting, faith, education, employment, civics and politics.

Today, everything we do involves the internet. Tomorrow, everything will require it. We can’t afford to let our digital citizenship be reduced to a heavy-handed mess of unreadable terms of service and broken appeals processes.

We have the right to a better digital future – a future where the ambitions of would-be monopolists and their shareholders take a back-seat to fairness, equity, and your right to self-determination.

New rules on protection of transfers of personal data outside European Union

Recently, there have been a number of important developments that affect how organisations facilitate the transfer of personal data out of the European Union in accordance with the EU General Data Protection Regulation (GDPR).

In brief, the developments are as follows:

  • A new set of official template clauses has been published by the European Commission to help organisations ensure that personal data transferred out of the European Union is protected – organisations that are considering implementing these clauses should be aware of some key dates.
  • The European Data Protection Board has released final form recommendations to help organisations assess the risks involved in transferring personal data outside the European Union and identify the appropriate supplementary measures to be implemented where needed.

Organisations subject to the GDPR that transfer personal data outside of the European Union, as well as organisations that receive personal data from within the European Union, are highly likely to be affected by these developments.

TRAPPED IN THE SOCIAL SAFETY NET: Privacy, gender and data justice in the Programa Bolsa Família

Recommended article by MARIANA G. VALENTE, NATÁLIA NERIS and NATHALIE FRAGOSO.

Information from SciELO

The article analyses the Programa Bolsa Família (PBF) as an “information value chain” and, drawing on material gathered from interviews and complaints, examines aspects of data justice and the impact of the program’s datafication on its beneficiaries, particularly with regard to privacy and gender. The analysis considers the procedural, rights-based and distributive dimensions along the data chain that informs and feeds the PBF.

Receiving illegally collected data creates liability for damages to data subjects

At a time when people have moved nearly all of their activities online, a videoconferencing app has become the social platform of the coronavirus era. With schools closed and millions of people working from home, Zoom became enormously popular. It is a highly functional app, allowing users to create private rooms that dozens of people can join at the same time. This feature helps ensure that only invited participants enter an online meeting, blocking users who were not invited. The app is easy to use and quick to install.

Covid-19 spreads too fast for traditional contact tracing. New digital tools could help

Every strategy for releasing Covid-19’s vise-grip on daily life starts with identifying cases and tracing their contacts — the laborious task of public health workers tracking down people who have crossed paths with a newly diagnosed patient, so they can be quarantined well before they show symptoms. That typically takes three days per new case, an insurmountable hurdle in the U.S., with its low numbers of public health workers and tens of thousands of new cases every day. Existing digital tools, however, using cellphone location data and an app for self-reporting positive test results, could make the impossible possible, the authors of a new analysis argue.

[article] COVID-19, Cyber Surveillance Normalisation and Human Rights Law

Ushering in a world of social distancing and self-isolation, the global spread of COVID-19 has intensified societal reliance on the Internet, whether for keeping in touch with family and friends, enabling work and education to be conducted remotely from home, or simply searching for and sharing information in an effort to keep track and make sense of the crisis.

Applying the LGPD to social networks

With little more than seven months until Law No. 13.709/18, the General Data Protection Law (LGPD), comes into force, many questions still occupy privacy professionals, with no shortage of doubts and provocations to keep us up at night. In the absence of an active National Authority, some gaps must be filled by analysing international experience, which can provide a direction to follow during the LGPD adaptation phase. Such care, however, goes beyond corporate compliance programs.

[article] How to create a Personal Data Protection Impact Report compliant with the LGPD

Among the many requirements of the LGPD, the Personal Data Protection Impact Report (RIPDP) stands out as the instrument used by the controller in cases where the processing of personal data may create risks to the civil liberties and fundamental rights of data subjects, serving as a tool to identify measures, safeguards and risk-mitigation mechanisms.

Read the article…

[article] Civil liability regimes in the Consumer Protection Code (CDC) and the General Data Protection Law (LGPD)

Summary: Introduction; 1. Social and legislative context of the CDC and the LGPD; 2. The CDC’s civil liability regime for product defects; 2.1. The supplier; 2.2. Product and service defects and exclusions of liability under the CDC; 3. The LGPD’s civil liability regime for violations of personal data protection legislation; 3.1. Brief overview of the LGPD; 3.2. Responsible parties; 3.3. Exclusions of liability and the innovation regarding development risk; 4. The dialogue-of-sources theory applied to the CDC and the LGPD; Conclusion; Bibliographic references.

Read the article…

[artigo] Privacy and data protection in India and Germany: A comparative analysis

This research report offers a comparative analysis of privacy and data protection in Germany and India. It compares the two regimes on four counts. First, it examines how the right to privacy and/or its allied rights have developed in the two countries historically. In this, it explores the political factors contributing to the understanding and acceptability of the principles of privacy in the decades after the Second World War. Second, it delves into the instruments and forms of state surveillance employed by both the countries and analyses how the presence of parliamentary and judicial oversight on intelligence agencies impacts individual privacy. In the third section, it compares how biometric identity systems have been deployed in the two countries, the safeguards designed around the same, and the legal challenges they have thrown up. Lastly, it evaluates data subject rights as defined under the General Data Protection Regulation (GDPR) together with the Bundesdatenschutzgesetz-Neu (BDSG-Neu) and how they compare with those as defined under the Draft Personal Data Protection Bill, 2018 in the Indian context.