All the times listed below are in Pacific Daylight Time (PDT).
Thursday, October 15
8:00 am–8:10 am
Opening Remarks
Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Apple
8:10 am–9:25 am
Data Governance
Session Chair: Lorrie Cranor, Carnegie Mellon University
Beyond Access: Using ABAC Frameworks to Implement Privacy and Security Policies
Amanda Walker, Nuna, Inc.
Over the last several decades, access control systems have evolved steadily from single bits (“write protect”) through identity- and role-based approaches to complex, abstract frameworks such as Attribute-Based Access Control (ABAC). In practice, however, they are most often used to answer the traditional question of read and write access. In this talk, I will explore how frameworks like ABAC can be used to implement more abstract controls and policies, such as purpose constraints and other data handling policies that need to depend on attributes of the data, the code, and the surrounding context.
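(As a hedged illustration of this idea, not Walker's implementation: an ABAC-style check can gate access on the purpose of a request and on attributes of the data and context, rather than on identity alone. The policy and attribute names below are invented for illustration.)

```python
# Minimal ABAC-style purpose check; the policy and attribute names are
# invented for illustration.
POLICY = [
    {"purpose": "billing",   "max_sensitivity": 2, "environments": {"prod"}},
    {"purpose": "analytics", "max_sensitivity": 1, "environments": {"prod", "staging"}},
]

def is_allowed(subject: dict, resource: dict, context: dict) -> bool:
    """Grant access only if some rule matches the request's purpose,
    the data's sensitivity attribute, and the surrounding context."""
    return any(
        subject["purpose"] == rule["purpose"]
        and resource["sensitivity"] <= rule["max_sensitivity"]
        and context["environment"] in rule["environments"]
        for rule in POLICY
    )

# An analytics job may not read sensitivity-2 data, even in prod.
print(is_allowed({"purpose": "analytics"},
                 {"sensitivity": 2},
                 {"environment": "prod"}))  # False
```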
Amanda Walker, Nuna, Inc.
Privacy Architecture for Data-Driven Innovation
Derek Care, Legal Director, Privacy at Uber; Nishant Bhajaria, Privacy Architecture and Strategy at Uber
Building privacy governance for a decentralized and innovative workplace is challenging, but doing it well can earn customer trust and competitive differentiation.
In this session, an engineer and an attorney will explain how to put such governance into practice during data collection and data sharing, providing an end-to-end privacy program for your most valued stakeholders: your users.
There are several challenges in implementing this: measuring risk for the data you collect, tailoring your security tools for privacy, sharing data with privacy in mind, and quantifying improvements and investments.
Success requires a critical combination of engineering techniques and management know-how. The speakers will share their experiences and lessons on engaging engineers, privacy/security specialists, and executive leadership in service of user data privacy.
Derek Care, Legal Director, Privacy at Uber
Nishant Bhajaria, Privacy Architecture and Strategy at Uber
Responsible Design through Experimentation: Learnings from LinkedIn
Guillaume Saint-Jacques, LinkedIn Corporation
As technology advances, there is increasing concern about individuals being left behind. Businesses are striving to adopt responsible design practices and avoid unintended consequences of their products. We propose a novel approach to fairness and inclusiveness based on experimentation. We use experimentation to assess not only the intrinsic properties of products and algorithms but also their impact on people. We do this by introducing an inequality approach to A/B testing. We show how to perform causal inference over this inequality measure. We provide real examples from LinkedIn, as well as an open-source, highly scalable implementation of the computation of the Atkinson index and its variance in Spark/Scala. We also share more than a year's worth of lessons, gathered by scaling our method and analyzing thousands of experiments, on which areas and which kinds of product innovation seem to foster fairness through inclusiveness.
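(For readers unfamiliar with the metric: below is a minimal numpy sketch of the Atkinson index; the talk's own open-source implementation is in Spark/Scala and also computes the index's variance. The sample data is invented for illustration.)

```python
import numpy as np

def atkinson_index(x, epsilon=1.0):
    """Atkinson inequality index over positive outcomes x.

    epsilon is the inequality-aversion parameter: 0 ignores inequality,
    larger values weight the worse-off more heavily.
    """
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    if epsilon == 1.0:
        # Limiting case: 1 - geometric mean / arithmetic mean
        return 1.0 - np.exp(np.mean(np.log(x))) / mean
    ede = np.mean(x ** (1.0 - epsilon)) ** (1.0 / (1.0 - epsilon))
    return 1.0 - ede / mean

# A/B comparison: both arms have the same mean (5), but the treatment
# distributes the metric far less equally.
control   = [4.0, 5.0, 6.0, 5.0]
treatment = [1.0, 2.0, 9.0, 8.0]
print(atkinson_index(control), atkinson_index(treatment))  # ~0.01 vs ~0.31
```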
Guillaume Saint-Jacques, LinkedIn Corporation
9:25 am–9:45 am
Break
9:45 am–11:00 am
Privacy-Preserving Data Analysis
Session Chair: Lea Kissner, Apple
Building and Deploying a Privacy Preserving Data Analysis Platform
Frederick Jansen, Boston University
This talk focuses on our experience building and deploying various iterations of a web-based secure multi-party computation (MPC) platform. Our experience demonstrates that secure computation can add value to social-good problems that would otherwise be constrained by legal, ethical, or privacy restrictions, and that it is feasible to deploy MPC solutions today.
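(For background, here is a minimal sketch of additive secret sharing, a building block behind many MPC aggregation deployments; it illustrates the principle only and is not the platform's code.)

```python
import secrets

P = 2**61 - 1  # public prime modulus; all arithmetic is done mod P

def share(value, n_parties):
    """Split value into n_parties additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three parties compute their total payroll without revealing any salary.
salaries = [55_000, 72_000, 63_000]
all_shares = [share(s, 3) for s in salaries]
# Party i only ever sees (and publishes the sum of) the i-th shares.
partials = [sum(row[i] for row in all_shares) % P for i in range(3)]
print(sum(partials) % P)  # 190000; no single party learned an input
```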
Frederick Jansen, Boston University
A Differentially Private Data Analytics API at Scale
Ryan Rogers, LinkedIn
We present a privacy system that leverages differential privacy to protect LinkedIn members' data while also providing audience engagement insights that enable marketing analytics applications. We detail the differentially private algorithms and other privacy safeguards used to provide results compatible with existing real-time data analytics platforms, specifically the open-source Pinot system. Our privacy system provides user-level privacy guarantees. As part of our privacy system, we include a budget management service that enforces a strict differential privacy budget on the results returned to the analyst. This budget management service brings the latest research in differential privacy into a product that maintains utility given a fixed differential privacy budget.
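(A hypothetical sketch of the budget-accounting idea described above: each analyst query spends epsilon against a fixed budget and is refused once the budget would be exceeded. Class and method names are invented; this is not LinkedIn's API.)

```python
import numpy as np

class BudgetExceeded(Exception):
    pass

class DPBudgetManager:
    """Tracks cumulative epsilon spent per analyst (illustrative only)."""

    def __init__(self, total_epsilon):
        self.total_epsilon = total_epsilon
        self.spent = {}

    def noisy_count(self, analyst, true_count, epsilon):
        """Answer a count query via the Laplace mechanism (sensitivity 1),
        refusing once the analyst's budget would be exceeded."""
        if self.spent.get(analyst, 0.0) + epsilon > self.total_epsilon:
            raise BudgetExceeded(f"{analyst} has exhausted the budget")
        self.spent[analyst] = self.spent.get(analyst, 0.0) + epsilon
        return true_count + np.random.laplace(scale=1.0 / epsilon)

mgr = DPBudgetManager(total_epsilon=1.0)
print(mgr.noisy_count("alice", true_count=42, epsilon=0.25))
```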
Ryan Rogers, LinkedIn
Improving Usability of Differential Privacy at Scale
Milinda Perera and Miguel Guevara, Google LLC
We present a framework that improves the usability of Differential Privacy (DP) by allowing practitioners to quantify and visualize its privacy vs. utility trade-offs.
While DP has long been seen as a robust anonymization technique, there is a significant disconnect between theory, implementation, and usability. One of the biggest problems practitioners face when using DP is forming mental models of the benefits DP provides to end users and of how DP affects data utility. Many users are not accustomed to thinking in terms of epsilons, deltas, and sensitivity bounds, and they shouldn't have to! Our system helps users think in terms of utility loss and anonymity gains.
Our talk has three parts. First, we provide a very quick primer on DP. Second, we explain why and how we built this framework. Third, we demo the system on a real dataset in real time!
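(As a hedged illustration of the trade-off such a tool visualizes, not the authors' framework: sweeping epsilon and estimating the relative error of a Laplace-noised count shows how utility degrades as privacy tightens.)

```python
import numpy as np

def expected_relative_error(true_count, epsilon, trials=100_000):
    """Monte Carlo estimate of the relative error a Laplace mechanism
    (sensitivity 1) adds to a count of the given size."""
    noise = np.random.laplace(scale=1.0 / epsilon, size=trials)
    return np.mean(np.abs(noise)) / true_count

# Smaller epsilon = stronger privacy = more noise = more utility loss.
for eps in (0.1, 0.5, 1.0, 2.0):
    err = expected_relative_error(true_count=1_000, epsilon=eps)
    print(f"epsilon={eps:4.1f}  expected relative error ~ {err:.2%}")
```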
Milinda Perera, Google LLC
Miguel Guevara, Google LLC
11:00 am–11:40 am
Networking Break
Join a Birds-of-a-Feather Session via Zoom to discuss topics of interest with other attendees. Topics and Zoom info will be posted on the conference Slack.
11:40 am–12:55 pm
Design
Session Chair: Manya Sleeper, Google
How to (In)Effectively Convey Privacy Choices with Icons and Link Text
Lorrie Faith Cranor, Carnegie Mellon University; Florian Schaub, University of Michigan
Communicating clearly about privacy through icons and/or just a few words can be a difficult challenge. We have been involved in a number of research projects that evaluated proposed and implemented privacy choice icons and short text, including, most recently, studies related to the CCPA opt-out button. In this talk we will discuss our research into privacy icons and text, explaining both our methods and our findings. We will demonstrate how to conduct these sorts of studies quickly and at low cost, and discuss why they are so important. We will also share lessons learned about what works and what doesn’t when you want to communicate about privacy through icons and short text.
Lorrie Faith Cranor, Carnegie Mellon University
Florian Schaub, University of Michigan
Beyond the Individual: Exploring Data Protection by Design in Connected Communal Spaces
Martin J. Kraemer, University of Oxford
There's a gap between the personal focus of data protection legislation and practices, and the communal implications of internet-connected technology. Through our research, we've started to explore how existing design tools and methods can help understand and address communal implications. In this talk, I will highlight opportunities for design in such spaces by discussing two case studies from participatory design workshops. These case studies show potential for our method to inform data protection design that is appropriate to social groups and their dynamics, complementing individual perspectives on data protection.
Martin J. Kraemer, University of Oxford
Throwing Out the Checklist
Dan Crowley, Quizlet
Effective privacy design is a crucial component of technology products and has been a global focus area for regulators and companies alike for decades. Despite this, many classic "privacy-by-design" processes ultimately fail. Why is that? This session focuses on the gaps in most PbD processes, notably how classic PbD systems exhibit inflexibility, are under-resourced, and conflict with predominant engineering cultures. It also discusses what companies, especially smaller ones, and their privacy champions can do to achieve better privacy outcomes. By shifting from programs designed around process-oriented mandates to programs that emphasize building a “privacy-by-ethos” culture, companies can set themselves (and their users) up for success when it comes to privacy best practices.
Dan Crowley, Quizlet
12:55 pm–1:15 pm
Break
1:15 pm–2:30 pm
Product Privacy
Session Chair: Nwokedi Idika, Google
Product Privacy Journey: Towards a Product Centric Privacy Engineering Framework
Igor Trindade Oliveira, Work & Co
The Product Journey and the Consumer Journey are essential parts of the product development process. When thinking about privacy from the perspective of the user, there is an underlying part of the consumer journey that must be mapped: the product's privacy journey. This talk discusses the challenges of embedding privacy into the product creation process, how privacy product principles help consumers, and how those principles can be translated into product requirements.
Igor Trindade Oliveira, Work & Co
Wikipedia and the Lean Data Diet
Nuria Ruiz, Principal Engineer, Wikimedia Foundation
Privacy is one of the lesser-known charms of Wikipedia. Wikipedia’s stance on privacy allows users to access and modify a wiki anonymously, without fear of giving away personal information, editorship, or browsing history. In this talk we will go into the challenges that this strong privacy stance poses for the Wikimedia Foundation, including how it affects data collection, and some creative workarounds that allow WMF to calculate metrics in a privacy-conscious way.
Nuria Ruiz, Wikimedia Foundation
Privacy Professional Boss Mode
Melanie Ensign, Discernible Inc.
Despite the growth in new regulations around the world, reliance on legal mandates often has diminishing returns for privacy professionals seeking influence beyond their immediate team. This presentation will introduce new and experienced professionals from all privacy disciplines to pragmatic techniques for thinking creatively about their roles, building influence for privacy throughout cross-functional organizations, and reducing dependence on regulatory hammers for securing privacy outcomes.
Friday, October 16
8:00 am–8:10 am
Quick Kickoff
Program Co-Chairs: Lorrie Cranor, Carnegie Mellon University, and Lea Kissner, Apple
8:10 am–9:25 am
Privacy-Preserving Technologies
Session Chair: Lorrie Cranor, Carnegie Mellon University
Privacy in Deployment
Patricia Thaine, Private AI, University of Toronto; Pieter Luitjens, Private AI; Parinaz Sobhani, Georgian Partners
This talk is a guide to using privacy technology in deployment. First, we will give a brief overview of the current state of privacy technology for (a) Differential Privacy and Anonymization, and (b) Secure Multiparty Computation, Homomorphic Encryption, and Secure Enclaves. We will then go over the current obstacles to deploying privacy-preserving software: identifying privacy risks and managing them, and understanding the capabilities and limitations of privacy tool sets and the backgrounds required to use them. The obstacles differ depending on whether one is retrofitting an existing codebase to integrate privacy post hoc or choosing the tech stack for a new codebase that integrates Privacy by Design. With those two scenarios in mind, we will discuss strategies for choosing privacy tools, for deciding whether to compute on the edge, on-premise, or in the cloud, and for thinking about the right risk management frameworks.
Patricia Thaine, Private AI, University of Toronto
Pieter Luitjens, Private AI
Parinaz Sobhani, Georgian Partners
Design of a Privacy Infrastructure for the Internet of Things
Norman Sadeh, Carnegie Mellon University
We have recently launched a Privacy Infrastructure for the Internet of Things. The infrastructure revolves around a growing collection of registries where owners of IoT resources (e.g., IoT devices and services) and volunteer contributors can publicize the presence of IoT resources and their data practices, including any privacy settings made available by these resources. An IoT Privacy Assistant app, available in the iOS App Store and on Google Play, enables people to discover IoT resources around them, find out about the data they collect, and interact with any privacy settings these resources make available. In this presentation, we will discuss some of the design challenges associated with the development of such a platform and how we approached them as we developed a first instance of this technology. Within a month of launch, our user community has grown to over 15,000 users, with our infrastructure hosting over 100,000 IoT resource descriptions.
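(A hypothetical sketch of what a registry entry for one IoT resource might contain, based on the description above; every field name is invented for illustration and is not the project's actual schema.)

```python
# Hypothetical registry entry for one IoT resource; the field names are
# invented for illustration, not the project's actual schema.
iot_resource = {
    "name": "Lobby Camera 3",
    "type": "camera",
    "location": {"lat": 40.4433, "lon": -79.9436, "radius_m": 10},
    "data_collected": ["video"],
    "purpose": "building security",
    "retention_days": 30,
    "privacy_settings": {
        "opt_out_available": True,
        "opt_out_url": "https://example.org/opt-out",  # placeholder URL
    },
}
print(iot_resource["privacy_settings"]["opt_out_available"])
```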
Norman Sadeh, Carnegie Mellon University
A Backdoor by Any Other Name, and How to Stop It
Max Hunter, EFF
Recent attacks on encryption have diverged. On the one hand, we’ve seen Attorney General William Barr call for “extraordinary access” to encrypted communications, using arguments that have barely changed since the 1990s. But we’ve also seen suggestions from a different set of actors for more purportedly “reasonable” interventions, particularly the use of client-side scanning to stop the transmission of contraband files, most often child sexual abuse material (CSAM).
On their face, proposals for client-side scanning seem to give us the best of all worlds: they preserve encryption while also combating the spread of illegal and morally objectionable content.
But unfortunately it’s not that simple. While it may technically maintain some properties of end-to-end encryption, client-side scanning would render the user privacy and security guarantees of encryption hollow. This talk will explain why that is, and what we can do to keep encryption encrypted.
Maximillian Hunter, EFF
9:25 am–9:45 am
Break
9:45 am–10:55 am
Incidents
Session Chair: Florian Schaub, University of Michigan
Building an Effective Feedback Loop for Your Privacy Program through Privacy Incident Response
Sri Pravallika Maddipati, Google LLC
Privacy Incident Response (PIR) can be challenging for most organizations given the growing number of regulations and notification obligations. While most of the focus is on timely response to incidents, breach notification, and quick fixes to minimize damage, the communication of lessons learned to the appropriate product and privacy teams is often ignored. This talk focuses on maturing the incident response program to analyze key privacy incident trends and metrics, which can highlight the successes, progress, and challenges of the privacy program.
Sri Pravallika Maddipati, Google LLC
When Things Go Wrong
Lea Kissner, Apple
Despite all we do to prevent them, mistakes happen. We’re fallible humans working with exceedingly complicated systems in a world of users with a dizzying array of different needs. Unsurprisingly but sadly, our systems sometimes end up with vulnerabilities, and those vulnerabilities can turn into incidents, hurting people affected by our systems. In this talk we go through the stages of incident handling: finding the cut, stopping the bleeding, and cleaning up the blood. After the incident is over, our work is not done: we need to find the root cause and ensure that neither this particular incident nor related ones happen again. We will go through real-world examples of things going wrong and how to make them go right.
Lea Kissner, Apple
Taking Responsibility for Someone Else's Code: Studying the Privacy Behaviors of Mobile Apps at Scale
Serge Egelman, University of California, Berkeley, and International Computer Science Institute
Modern software development has embraced the concept of "code reuse," which is the practice of relying on third-party code to avoid "reinventing the wheel" (and rightly so). While this practice saves developers time and effort, it also creates liabilities: the resulting app may behave in ways that the app developer does not anticipate. This can cause very serious issues for privacy compliance: while an app developer did not write all of the code in their app, they are nonetheless responsible for it. In this talk, I will present research that my group has conducted to automatically examine the privacy behaviors of mobile apps vis-à-vis their compliance with privacy regulations. Using analysis tools that we developed and commercialized (as AppCensus, Inc.), we have performed dynamic analysis on hundreds of thousands of the most popular Android apps to examine what data they access, with whom they share it, and how these practices comport with various privacy regulations, app privacy policies, and platform policies. We find that while potential violations abound, many of the issues appear to be due to the (mis)use of third-party SDKs. I will provide an account of the most common types of violations that we observe and how app developers can better identify these issues prior to releasing their apps.
Serge Egelman, University of California, Berkeley, and International Computer Science Institute
10:55 am–11:40 am
Networking Break
Join a Birds-of-a-Feather Session via Zoom to discuss topics of interest with other attendees. Topics and Zoom info will be posted on the conference Slack.
11:40 am–12:55 pm
Frameworks and Risk
Session Chair: Lea Kissner, Apple
Assessing Privacy Risk with the IPA Triad
Mark Funk, Obscure Group
This talk introduces the IPA Triad, a generalized privacy framework consisting of three properties: Identity, Presence, and Activity. Like the CIA triad in information security, these properties provide a useful approach for modeling privacy risks in an applied problem space.
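(As a hedged sketch, not Funk's framework code: one might tag each data flow with the IPA properties it touches when enumerating risks. The data-flow example is invented for illustration.)

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class IPA(Enum):
    """The three IPA triad properties named in the abstract."""
    IDENTITY = auto()  # who a person is
    PRESENCE = auto()  # whether and where a person is
    ACTIVITY = auto()  # what a person does

@dataclass
class DataFlow:
    name: str
    touches: set = field(default_factory=set)

# Hypothetical example: a location ping reveals presence, and joined
# with an account ID it also reveals identity.
flow = DataFlow("location ping", {IPA.PRESENCE, IPA.IDENTITY})
for prop in sorted(flow.touches, key=lambda p: p.name):
    print(f"{flow.name}: assess risk to {prop.name.lower()}")
```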
Mark Funk, Obscure Group
Engineering Ethics into the NIST Privacy Framework
R. Jason Cronk, Enterprivacy
Version 1.0 of the NIST Privacy Framework invites organizations to use organizational privacy values and business objectives to design a target profile. But how does an organization determine its "privacy values"? This talk will examine five different models of privacy and how organizations can develop a set of custom privacy values from those models.
R. Jason Cronk, Enterprivacy
When Engineers and Lawyers Talk: Right-Sizing Your Data Protection Risk Profile
Rafae Bhatti, Mode
The path to navigating data protection risks is often filled with uncertainty. Overestimating the risks stifles growth, and underestimating them can derail the business. To measure data protection risks and right-size a company's risk profile, we need to view those risks through both a technical and a legal lens. Engineers and lawyers need to talk.
This talk will provide practical examples of how right-sizing the risk profile helps simplify compliance. It will cover scenarios of data retention, use, and sharing, as well as breach notification. We will review key architectural decisions and engineering trade-offs that often shape an organization’s compliance processes. These decisions and trade-offs often center on purpose of use, a concept that engineering teams have not traditionally paid attention to. Viewing system requirements through a data protection lens therefore helps clarify legal obligations and simplify compliance.
Rafae Bhatti, Mode
12:55 pm–1:00 pm
Closing Remarks
1:00 pm–1:45 pm
Virtual Ice Cream Social
Grab your own ice cream or other tasty treat and join us for a virtual social event with PEPR attendees.