The protocol — which they’re calling Decentralized Privacy-Preserving Proximity Tracing (DP-PPT) — has been designed by around 25 academics from at least seven research institutions across Europe, including the Swiss Federal Institute of Technology, ETH Zurich, and KU Leuven in Belgium. They’ve published a white paper detailing their approach.
Cymone Gosnell
In 2016, the European Union (“EU”) created heightened data privacy rights for its citizens by enacting the General Data Protection Regulation (“GDPR”). The most drastic change from the previous regulation, enacted in 1995, lies in the expanded territorial scope: companies can now be fined for violations even if they are not domiciled in the EU. Data privacy has always been considered a fundamental human right in the EU; within the United States, by contrast, there is no fundamental right to privacy. Rather, the country’s privacy laws rest on a complicated sectoral structure that often leaves its citizens confused about what rights they actually have. This paper will review the fundamental differences between EU and United States privacy law, the changes implemented by the GDPR (including the expanded territorial scope), the compliance plans of some major players within the United States, and what the future looks like under the GDPR for American businesses that hold or process the data of EU citizens.
Conceptual research on robots and privacy has increased, but we lack empirical evidence about the prevalence, antecedents, and outcomes of different privacy concerns about social robots. To fill this gap, we present a survey testing a variety of antecedents drawn from trust, technology adoption, and robotics scholarship. Respondents are most concerned about data protection on the manufacturer side, followed by social privacy concerns and physical concerns. Using structural equation modeling, we find a privacy paradox, in which the perceived benefits of social robots override privacy concerns.
In this paper, we propose Mapping Distortion Protection (MDP) and its augmentation-based extension (AugMDP) to protect data privacy by modifying the original dataset. In MDP, the label of the modified image does not match the ground-truth mapping, yet DNNs can still learn the ground-truth relation even when the provided mapping is distorted. As such, this method protects privacy even if the dataset is leaked.
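The abstract does not specify how MDP constructs its distorted mapping. As a loose, hypothetical illustration of the core idea — releasing labels that no longer match the ground truth while the data owner retains an invertible relation — one might relabel a dataset through a fixed permutation of the label set (function names and the permutation choice are assumptions, not the paper’s actual method):

```python
# Illustrative sketch only: MDP's real construction is not described in this
# excerpt. Here the "distorted mapping" is a fixed cyclic permutation of the
# label set, applied before release and invertible by the data owner.

def distort_labels(labels, num_classes, shift=1):
    """Map each ground-truth label y to (y + shift) % num_classes."""
    return [(y + shift) % num_classes for y in labels]

def recover_labels(distorted, num_classes, shift=1):
    """Invert the permutation, recovering the ground-truth labels."""
    return [(y - shift) % num_classes for y in distorted]

ground_truth = [0, 1, 2, 2, 0]
released = distort_labels(ground_truth, num_classes=3)

# The released labels differ from the ground truth, but the mapping
# remains recoverable by whoever knows the permutation.
assert released != ground_truth
assert recover_labels(released, num_classes=3) == ground_truth
```

A leaked copy of such a dataset would carry systematically wrong labels; the paper’s stronger claim — that DNNs can nonetheless learn the ground-truth relation from the distorted mapping — depends on details of MDP not reproduced here.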
In this paper, we seek to identify what types of sensor data can be collected on a smartphone and which of those types can pose a threat to user privacy, by examining the hardware capabilities of modern smartphone devices and how smartphone data is used in the literature. We then summarize some implications this information could have for the GDPR.
This month, a bipartisan group of legislators in Washington state presented new legislation that could soon become the most comprehensive privacy law in the country. The centerpiece of this legislation, the Washington Privacy Act as substituted, goes further than the landmark bill California recently enacted and builds on the law Europeans have enjoyed for the past year and a half.
As Microsoft President Brad Smith shared in his blog post about our priorities for the state of Washington’s current legislative session, we believe it is important to enact strong data privacy protections to demonstrate our state’s leadership on what we believe will be one of the defining issues of our generation. People will only trust technology if they know their data is private and under their control, and new laws like these will help provide that assurance. We’re encouraged that privacy legislation in Washington has been welcomed by privacy advocates such as Consumer Reports and the Future of Privacy Forum.