Definitions

Privacy-enhancing technologies (PETs) have been defined as:

Privacy-Enhancing Technologies is a system of ICT measures protecting informational privacy by eliminating or minimizing personal data thereby preventing unnecessary or unwanted processing of personal data, without the loss of the functionality of the information system.[1]

The European Commission in its Communication to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs) describes a PET as

a coherent system of ICT measures that protects privacy by eliminating or reducing personal data or by preventing unnecessary and/or undesired processing of personal data, all without losing the functionality of the information system.[2]

The UK Information Commissioner's Office defines PETs as "any technology that exists to protect or enhance an individual’s privacy, including facilitating individuals’ access to their rights under the Data Protection Act 1998."

Overview

Privacy enhancing technologies (PET) is a general term for a set of computer tools, applications and mechanisms which — when integrated in online services or applications, or when used in conjunction with such services or applications — allow online users to protect the privacy of their personally identifiable information (PII) provided to and handled by such services or applications.

A PET is something that:

  1. reduces or eliminates the risk of contravening privacy principles and legislation.
  2. minimizes the amount of data held about individuals.
  3. empowers individuals to retain control of information about themselves at all times.

Goals of PETs

PETs aim at allowing users to take one or more of the following actions related to their personal data sent to, and used by, online service providers, merchants or other users:

Existing PETs

Examples of existing privacy enhancing technologies are:

Future PETs

Examples of privacy enhancing technologies that are being researched or developed are:[5]

  • Wallets of multiple virtual identities; ideally unlinkable. Such wallets allow the efficient and easy creation, management and usage of virtual identities.
  • Anonymous credentials: asserted properties/attributes or rights of the holder of the credential that do not reveal the holder's real identity and that reveal only as much information as the holder of the credential is willing to disclose. The assertion can be issued by the user herself, by the provider of the online service or by a third party (another service provider, a government agency, etc.). For example:
    • Online car rental. The car rental agency does not really need to know the true identity of the customer. It only needs to make sure that the customer is over 23 (as an example), that the customer has a driver's license, that the customer has health insurance for accidents (as an example), and that the customer is paying. Thus, there is no real need to know the customer's real name, address or any other personal information. Anonymous credentials allow both parties to be comfortable: they allow the customer to reveal only as much data as the car rental agency needs to provide its service (data minimization), and they allow the car rental agency to verify its requirements and get its money. When ordering a car online, the user, instead of providing the classical name, address and credit card number, provides the following credentials, all issued to pseudonyms, i.e. not to the real name of the customer:
      • An assertion of minimal age, issued by the state, proving that the holder is older than 23 (i.e. the actual age is not provided)
      • A driver's license, i.e. an assertion, issued by the motor vehicle control agency, that the holder is entitled to drive cars
      • A proof of insurance, issued by the health insurance provider
      • Digital cash
With this data, the car rental agency is in possession of everything it needs to rent the car. It can thus, as an example, provide the customer with the unlocking code to open the closet where the car key is kept.
Similar scenarios are buying wine at an Internet wine store or renting a movie at an online movie rental store.
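The selective-disclosure idea behind the car rental scenario can be sketched in code. This is an illustrative model only: real anonymous credential schemes rely on cryptographic signatures and zero-knowledge proofs, whereas here a credential is simply an attribute assertion bound to a pseudonym, and the issuer names, pseudonym, and claim labels are all invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Credential:
    issuer: str        # e.g. "state", "motor_vehicle_agency", "insurer"
    pseudonym: str     # the holder's pseudonym; the real name never appears
    claim: str         # the asserted predicate, e.g. "age_over_23"

def verify(credentials, required_claims, trusted_issuers):
    """Check that every required claim is asserted by a trusted issuer.
    The verifier learns only the predicate results, not the identity."""
    held = {(c.issuer, c.claim) for c in credentials}
    return all(
        any((issuer, claim) in held for issuer in trusted_issuers[claim])
        for claim in required_claims
    )

# The car-rental check from the scenario above, with hypothetical names:
wallet = [
    Credential("state", "nym-7f3a", "age_over_23"),
    Credential("motor_vehicle_agency", "nym-7f3a", "licensed_driver"),
    Credential("insurer", "nym-7f3a", "accident_insured"),
]
required = ["age_over_23", "licensed_driver", "accident_insured"]
trusted = {
    "age_over_23": ["state"],
    "licensed_driver": ["motor_vehicle_agency"],
    "accident_insured": ["insurer"],
}

print(verify(wallet, required, trusted))   # True: the rental can proceed
```

Note that the agency never sees the customer's actual age or name; it only learns that each required predicate was vouched for by an issuer it trusts.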
As an example, it can be negotiated that personal data must not be transferred to third parties, or that the data is to be deleted three months after the end of the contract. While this negotiation takes place, the online service provider communicates its requirements regarding the minimum amount of data it needs to provide the desired service. Additional personal data may be requested as well, but it will be clearly labelled as optional.

After the transfer of personal data has taken place, the agreed-upon data handling conditions are technically enforced by the infrastructure of the service provider, which is capable of managing and processing data handling obligations. Moreover, this enforcement can be remotely audited by the user, for example by verifying chains of certification based on trusted computing modules or by verifying privacy seals/labels issued by third-party auditing organizations (e.g., data protection agencies). Thus, instead of having to rely on the mere promises of service providers not to abuse personal data, users can be more confident that the service provider will adhere to the negotiated data handling conditions.
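The negotiation step described above can be sketched as follows. All field names, the policy format, and the acceptance rules are hypothetical, invented purely to illustrate the flow of required versus optional data and a retention limit; real systems would use a standardized policy language and cryptographic enforcement.

```python
# Hypothetical data-handling negotiation: required/optional fields and the
# policy keys are illustrative assumptions, not taken from any standard.

REQUIRED = {"delivery_address"}          # minimum data the provider needs
OPTIONAL = {"email_for_newsletter"}      # clearly labelled optional extras

provider_policy = {
    "third_party_transfer": False,       # data will not go to third parties
    "retention_days": 90,                # deleted 3 months after contract end
}

def negotiate(user_offer, user_policy):
    """Accept the transfer only if the required fields are covered and the
    provider's policy is at least as strict as the user's demands."""
    if not REQUIRED <= set(user_offer):
        return None                      # cannot provide the service
    if provider_policy["third_party_transfer"] and \
            not user_policy.get("allow_third_party", False):
        return None
    if provider_policy["retention_days"] > user_policy.get("max_retention_days", 0):
        return None
    # The agreed policy travels with the data so the provider's
    # infrastructure can enforce it and auditors can verify it later.
    return {
        "data": {k: v for k, v in user_offer.items() if k in REQUIRED | OPTIONAL},
        "policy": provider_policy,
    }

agreement = negotiate({"delivery_address": "nym-box-12"},
                      {"max_retention_days": 90})
print(agreement is not None)   # True: the conditions are mutually acceptable
```

If the user demands a shorter retention period than the provider offers, or withholds a required field, `negotiate` returns `None` and no personal data is transferred at all.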

References

  1. G.W. van Blarkom, J.J. Borking & J.G.E. Olk, Handbook of Privacy and Privacy-Enhancing Technologies: The Case of Intelligent Software Agents (2003).
  2. COM(2007) 228 final.
  3. The EU PRIME research project's Vision on privacy enhanced identity management.
  4. Key Facts on Privacy Negotiations.
  5. The EU PRIME research project's White Paper (Version 2).

This page uses Creative Commons Licensed content from Wikipedia (view authors).