Privacy-Preserving Attribution: Testing for a New Era of Privacy in Digital Advertising

The internet has become a massive web of surveillance, with advertisers and advertising platforms collecting detailed information about people’s online activity. At Mozilla, we believe this information belongs only to the individual and that its unfettered collection is an unacceptable violation of privacy. We have deployed, and continue to deploy, advanced anti-tracking technology in Firefox, but we believe the ecosystem will keep developing novel techniques to track users as long as there is a strong economic incentive to do so.

We are also deeply concerned by efforts in some jurisdictions to restrict anti-tracking features in browsers. In a world where regulators have to balance competing interests, it is dangerous to place advertising and privacy in a zero-sum conflict.

To address these technical and regulatory threats to user privacy while advancing Mozilla’s mission, we are developing a new technology called Privacy-Preserving Attribution (PPA). The technology aims to demonstrate a way for advertisers to measure overall ad effectiveness without gathering information about specific individuals.

The Technology Behind PPA

Rather than collecting intimate information to determine when individual users have interacted with an ad, PPA relies on novel cryptographic techniques designed to protect user privacy while enabling aggregated attribution. This allows advertisers to obtain aggregate statistics to assess whether their ads are working. It does not enable any kind of ad targeting. At its core, PPA uses a Multi-Party Computation (MPC) system called the Distributed Aggregation Protocol (DAP), deployed in partnership with the Divvi Up project at the Internet Security Research Group (ISRG), the organisation behind Let’s Encrypt.

Here’s how it works:

Instead of exposing individual browsing activity to determine who sees an ad, PPA uses mathematics to keep consumer information private. When a user interacts with an ad or advertiser, a record of that interaction is split into two indecipherable pieces on their device, each of which is encrypted and sent to one of two independently operated services. Similar pieces from many users are then combined by these services to produce an aggregate number. This number represents how many people carried out an action (such as signing up for a newsletter) after seeing the ad, without revealing any information about the activity of any individual to either service or to the advertiser. The precise steps are as follows (a simplified code sketch of the flow appears after the list):

  • Data Encryption: When a user interacts with an ad or advertiser, an event is logged in the browser in the form of a value. That value is split into partial, indecipherable pieces, each of which is encrypted. Each piece is addressed to a different entity (one to Divvi Up at ISRG and one to Mozilla) so that no single entity is ever in possession of both pieces.
  • Masking: As an additional protection, the pieces are submitted to Divvi Up and Mozilla through an Oblivious HTTP relay operated by a third organisation (Fastly). This ensures that Divvi Up and Mozilla never learn the IP address of the device that submitted the piece they receive. The traffic is opaque to Fastly and intermixed with other kinds of requests, so Fastly cannot learn any information either.
  • Aggregation: Divvi Up and Mozilla each combine all the indecipherable pieces they receive to produce a (still-indecipherable) aggregate value. This means that the data from many users is combined without any party learning the contents or source of any individual data point.
  • Randomisation: Before the aggregates are revealed, random noise is added to each half to provide differential privacy guarantees, which mathematically ensure that individual activity cannot be inferred from trends in the aggregate data.
  • Recombination: Divvi Up and Mozilla then send their aggregated, still-indecipherable values to the advertiser, who combines them to recover the statistic of interest. This is an aggregate statistic across all users and does not reveal any information about any individual.
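
To make the splitting, aggregation, and noise steps concrete, below is a minimal Python sketch of the underlying idea. It is only an illustration, not the actual protocol: the real system runs DAP, encrypts each share to its helper, and submits it through an Oblivious HTTP relay, and the modulus, noise scale, and example values here are hypothetical choices made for readability.

    import math
    import random

    MODULUS = 2**32   # shares are uniform modulo this value, so one share alone reveals nothing
    DP_SCALE = 2.0    # hypothetical Laplace noise scale for the differential privacy guarantee

    def split(value: int) -> tuple[int, int]:
        """Split one conversion value into two indecipherable shares."""
        share_a = random.randrange(MODULUS)        # uniformly random piece for one helper
        share_b = (value - share_a) % MODULUS      # complementary piece for the other helper
        return share_a, share_b

    def laplace(scale: float) -> float:
        """Sample Laplace(0, scale) noise via inverse transform sampling."""
        u = random.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

    # On each device: log whether a conversion happened (1) or not (0) and split it.
    events = [1, 0, 1, 1, 0, 0, 1, 0]              # hypothetical per-user conversion values
    shares_a, shares_b = zip(*(split(v) for v in events))
    # In PPA, each share would be encrypted to its helper (Divvi Up or Mozilla) and
    # submitted through an Oblivious HTTP relay so neither helper learns the sender's IP.

    # At each helper: sum only its own shares and add noise before anything is revealed.
    agg_a = (sum(shares_a) + round(laplace(DP_SCALE))) % MODULUS
    agg_b = (sum(shares_b) + round(laplace(DP_SCALE))) % MODULUS

    # Recombination: only the sum of the two noisy aggregates is meaningful.
    total = (agg_a + agg_b) % MODULUS
    if total > MODULUS // 2:                       # map wrap-around back to a small negative
        total -= MODULUS
    print(total)                                   # close to sum(events); no individual value was exposed

Neither aggregate on its own is meaningful: agg_a and agg_b each look like random numbers, and only their sum, already blurred with noise, yields the conversion count.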

By using these advanced cryptographic methods, PPA ensures that user data remains private and secure throughout the advertising measurement process. At no point does any single entity have access to a specific user’s individual browsing activity, making this a radical improvement over the current paradigm.

Rules of the Road

One of the critical considerations in developing PPA was alignment with privacy legislation, such as the General Data Protection Regulation (GDPR). Here are a few ways that we believe PPA meets the stringent requirements in these laws:

  1. Anonymization: The combination of IP protection, aggregation, and differential privacy used by PPA breaks the link between an attribution event and a specific individual. We believe this meets the high standards of the GDPR for anonymization.
  2. Data Minimization: The information reported by the browser follows strict data minimization practices. The only information included in reports is a single, bounded histogram (see the sketch after this list).
  3. Undetectable Opt-Out: When PPA is inactive, it accepts attribution reports from sites and then silently discards them. This means that sites are unable to detect whether an individual has either enabled or disabled PPA. This measure prevents discrimination or fingerprinting by sites on the basis of the feature’s availability.
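
To illustrate how the last two points can hold simultaneously, here is a hypothetical browser-side sketch; the function names, histogram size, and the submit_shares placeholder are ours for illustration and do not reflect Firefox’s actual API. The only payload is a single, bounded histogram, and a disabled client takes the same code path but records nothing, so a site observes identical behaviour whether PPA is enabled or not.

    HISTOGRAM_SIZE = 8          # hypothetical fixed bound on the size of a report

    def submit_shares(histogram: list[int]) -> None:
        """Placeholder for the split/encrypt/relay pipeline sketched earlier."""
        pass

    def record_conversion(bucket_index: int, ppa_enabled: bool) -> None:
        """Called when a site registers a conversion; behaves identically either way."""
        if not ppa_enabled:
            return                                 # silently discard: the site cannot tell
        histogram = [0] * HISTOGRAM_SIZE           # a single, bounded histogram per report
        histogram[bucket_index % HISTOGRAM_SIZE] += 1
        submit_shares(histogram)

    record_conversion(bucket_index=3, ppa_enabled=True)    # contributes to bucket 3
    record_conversion(bucket_index=3, ppa_enabled=False)   # indistinguishable to the calling site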

Prototype Rollout and User Testing

The current implementation of PPA in Firefox is a prototype, designed to validate the concept and inform ongoing standards work at the World Wide Web Consortium (W3C). This limited rollout is necessary to test the system under real-world conditions and gather valuable feedback.

The prototype is enabled with an Origin Trial, which prevents the API from being exposed in any form to any website unless it is specifically allowed by Mozilla. For the initial test, the only allowed sites are operated by Mozilla; specifically, the trial covers ads for Mozilla VPN displayed on the Mozilla Developer Network (MDN). We chose this approach to gather enough participation to evaluate the system’s performance and privacy protections while keeping the test under tightly controlled conditions.

Next Steps and Future Plans

During the prototype test, if a user in a relevant market visits MDN in Firefox and comes across an ad for Mozilla VPN that is part of this trial, all of the technical steps described above will occur in the background, allowing us to test the technology. Throughout this process, individual browsing activity never leaves the device and is never uniquely identifiable. As always, users can turn off this functionality in their Firefox settings.

As we move forward, our immediate focus is on refining and improving PPA based on the feedback from this initial prototype. Here’s what to expect in the coming months:

  1. Expansion of Testing: Depending on initial results, we may expand the number of sites involved in the testing phase, carefully monitoring the results to ensure the system operates as intended. Because standards development is ongoing, the prototype uses a non-standardized API and will never be exposed in its current form to the web at large.
  2. Transparency and Communication: We are committed to being transparent about how PPA works and how user data is protected. We will continue to provide updates and engage with the community to address any concerns.
  3. Collaboration and Standards Development: Mozilla will continue to work with other companies and public standards bodies to develop and standardise privacy-preserving technologies. Our goal is to create a robust, industry-wide solution that benefits all users.

Our vision is to develop, validate, and deploy privacy-preserving technologies like PPA with the goal of ultimately eliminating the need for invasive tracking practices. By proving their viability, we aim to create a more secure and private online environment for everyone. No single organisation can solve these challenges alone. We invite feedback along the way, and we hope that our efforts inspire more organisations to innovate in similar ways. Thank you for your support as we embark on this journey. Together, we can build a better, more private internet.


