22. Does P3P solve all privacy
concerns on the Web?
No. P3P focuses on the disclosure of web site privacy practices. Its
goal is to help users find out what the privacy practices of a particular site are, and
how these practices compare with the user's preferences. P3P does not solve all privacy
issues, but it can be part of a larger, more comprehensive set of technical and legal
solutions.
23. Does P3P eliminate the need for other technologies like
encryption or anonymizers?
Definitely not. In fact, P3P focuses on doing one thing and doing it
well: communicating to users, in a simple and understandable way, web sites' stated
privacy policies. Consequently, we defer to others tasks such as encrypting or anonymizing
electronic communications. P3P is meant to be complementary to such technologies. For
instance, users should be able to express preferences such as, "I will only give my
credit card information to sites over a secured communication channel such as SSL or
SET." We don't attempt to reinvent anonymizers, encryption or payment protocols, but
we expect P3P will be able to work with all of these things.
24. Does P3P prevent secondary use of data?
P3P is designed to inform users about any secondary use of their data so they can make
informed choices about whether or not to provide data that might be used for these
purposes. Of course, P3P does not by itself prevent sites from making secondary uses of
data that they do not disclose.
(from "P3P and Privacy on the Web FAQ"; W3C; June 22, 2001; http://www.w3.org/P3P/p3pfaq.html)
HOW P3P 1.0 WILL HELP PROTECT PRIVACY
P3P can help standardize privacy notices
On a P3P 1.0 enabled Web, all privacy policies will have the same
basic machine-readable fields that express a company's privacy practices. While this
does not itself offer privacy protection, if implemented it could greatly advance transparency
and be used to support efforts to improve privacy protection. As stated above, it does not
address the full range of privacy considerations. But it is designed to facilitate the
exchange of information about privacy policies in a fashion that maps onto the Internet.
P3P does not preclude the use of other technical or legal means of protecting privacy. In
fact, the working group has sought input from both builders of privacy enhancing tools and
those responsible for implementing and enforcing privacy laws. P3P is just one stone in
the foundation. It needs to be used in concert with effective legislation, strategic
policy and other privacy enhancing tools. For example:
1. Countries with data protection and privacy laws and others
seeking to police compliance with privacy standards could find the automated ability to
assess a business's privacy statement useful in their broader oversight and compliance
programs. -- Searching and gathering privacy policies could be simplified through P3P. P3P
would allow the policies to be collected and analyzed in a standard machine-readable
format. Governments and organizations would be able to simply search through P3P
statements to find companies whose notices do not meet privacy standards in various
areas. In the current version of P3P, companies could even point to the regulatory bodies that
oversee them to help route privacy complaints.
2. Users could more easily read privacy statements before
entering Web sites. -- Today, it is often difficult to find privacy notices. Once found,
they are frequently written in complicated legalese. P3P implementations could allow users
to assess privacy statements prior to visiting a site, and allow users to screen and
search for sites that offer certain privacy protections.
3. Cutting through the legalese -- A company's P3P statement
cannot use difficult-to-understand or unclear language. The standardization and
simplification of privacy assertions into statements that can be automated will allow
users to have a clear sense of who does what with their information.
4. Enterprising companies or individuals could develop more
accurate means of rating and blocking sites that do not meet certain privacy standards or
allow individuals to set these standards for themselves. Several companies already rate
and block Web sites that do not meet certain privacy standards. Today, creating the tools
and knowledge that support these products is difficult and time consuming. By providing an
open standard, P3P 1.0 could enhance the transparency, accuracy and detail of existing
products, and could encourage an influx of new privacy enhancing products and services.
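As a sketch of what "the same basic machine-readable fields" would make possible, the snippet below parses a simplified policy fragment and screens it for practices a regulator or rating tool might object to. The element names follow the P3P 1.0 vocabulary, but this specific policy and the screening rules are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A simplified, invented policy fragment; element names follow the
# P3P 1.0 vocabulary, but this specific policy is hypothetical.
POLICY = """
<POLICY>
  <STATEMENT>
    <PURPOSE><current/><telemarketing/></PURPOSE>
    <RECIPIENT><ours/><other-recipient/></RECIPIENT>
    <RETENTION><indefinitely/></RETENTION>
    <DATA-GROUP>
      <DATA ref="#user.home-info.postal"/>
    </DATA-GROUP>
  </STATEMENT>
</POLICY>
"""

# Practices a screening tool might treat as substandard; which ones
# actually fail a given privacy standard is a policy choice, not P3P's.
FLAGGED_PURPOSES = {"telemarketing", "individual-analysis"}
FLAGGED_RECIPIENTS = {"other-recipient", "public"}

def screen(policy_xml):
    """Return the flagged practices declared in a policy."""
    root = ET.fromstring(policy_xml)
    hits = []
    for stmt in root.iter("STATEMENT"):
        purposes = {child.tag for child in stmt.find("PURPOSE")}
        recipients = {child.tag for child in stmt.find("RECIPIENT")}
        hits += sorted(purposes & FLAGGED_PURPOSES)
        hits += sorted(recipients & FLAGGED_RECIPIENTS)
    return hits

print(screen(POLICY))  # ['telemarketing', 'other-recipient']
```

The point is only that, once every site exposes the same fields, collecting and analyzing policies reduces to mechanical parsing rather than reading prose.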
P3P can support the growth of more privacy choices, including
anonymity and pseudonymity
Anonymity is an important protection for privacy on the Internet. The ability to use the
Internet with a pseudonym is also critical. These options must be supported and promoted.
However, with anonymity or pseudonymity a person would be hard pressed to be involved in
the full diversity of interactions occurring on the Internet. For privacy to be part of
the Internet infrastructure, we must deploy tools that assist individuals in controlling
personal information when they choose to, or need to, disclose it. P3P 1.0 can be used
with anonymity tools to allow users to have more control over their personal information.
A user should be able to be anonymous in one context and identifiable in another. The
ability to have Web sites' privacy notices parsed and interpreted by a privacy tool can
assist individuals' decision-making regarding when and to whom to disclose personal
information. Today, reading policies is a time-consuming, cumbersome and sometimes
impossible task. P3P 1.0 would help change that.
WHAT P3P WILL NOT DO:
P3P cannot protect the privacy of users in jurisdictions with
insufficient data privacy laws
The W3C is a specification-setting organization; it does not have the
ability to create public policy, nor can it demand that its specifications be followed in
the marketplace. While different members of the W3C may have different reasons for
engaging in the process, nothing in the P3P Specification or the P3P Guiding Principles
presumes that P3P is designed to replace public policy or a public policy process.
Accordingly, P3P is designed to allow for statements about data practices, which are in
turn directed by law, regulatory procedures, self-regulation or other forces.
We believe that better data privacy laws and further self-regulatory
efforts are necessary to protect consumer privacy internationally. As privacy advocates,
we believe that -- armed with more information -- individuals will seek out companies that
afford better privacy protection. Recent consumer pressure on companies that aggressively
collect personal information shows that the public will act to protect
their privacy if given simple, practical tools and advice to aid them. It also shows that
companies can be made to moderate or reverse their policies and practices, if only
temporarily. P3P can and should be used in concert with public policy to help protect
privacy.
P3P cannot ensure that companies follow privacy policies
If a company says it is going to do one thing and does
something else, no technological process can stop it. Deception must be stopped through
public policy processes, legislation and the courts. Even in the United States, a country
with limited consumer privacy protections, the Federal Trade Commission has brought cases
against companies that do not follow posted privacy policies.
P3P would make privacy policies transparent. It does not ensure that
the policies are followed. No technological process can ensure that companies comply with
law or statements they choose to make. But, P3P will lead to greater openness, more
informed Web users and therefore greater accountability.
(from "P3P and Privacy: An Update for the Privacy Community"; Center for Democracy &
Technology; October 21, 2001; http://www.cdt.org/privacy/pet/p3pprivacy.shtml)
Here are some of the principal flaws in the concept of P3P.
The concept presumes that privacy is a preference that some technologically advanced minority
might be granted an opportunity to avoid having violated on occasions where those people
have taken a specific action designated by the companies who wish to exploit personal
information. Rather, privacy is a fundamental human right that should be universally
respected.
The concept presumes consumers have an extremely diverse range of "privacy
preferences" that should be catered to with a correspondingly wide range of options,
like flavors of soft drinks. Rather, the core of consumers' desires for privacy is simple
and easily stated, but unpalatable for marketers: consumers don't want their personal
information sold, shared, or reused for secondary purposes. The fact that some are willing
to grant specific consent for certain uses doesn't mean that they wish to make an open
offer of their privacy. A bewildering range of options tends to distract consumers and
policymakers from the sad fact that what should be standard equipment is hard to find or
entirely absent.
The concept's premise promotes the view that personal information is a secondary currency or
commodity to be bartered rather than a necessary detail for performing some part of the
transaction, such as delivering the ordered goods by mail. Rather than the fake-privacy
doctrine of "notice and choice," which in practice means burdening the consumer
with understanding complex details and attempting to opt-out of some of them, real privacy
consists of limiting the use of information to what is needed, always with the explicit
consent and understanding of the consumer.
There is a presumption that access should be focused on a company's policy instead of access by
individual consumers to information held by the company about them. Rather, a consumer
should be able to assume that the company's policy is to treat her data fairly; what she
then needs is to be given access to all her specific data so that she can check that it is
being correctly handled in practice. She should be able to check that her understanding of
what information the company should have about her corresponds with what is actually held,
and amend it if not. Granted, P3P does offer a way for a site to say whether it grants
access, but stops there. Standards such as the now-moribund Open Profiling Standard can be
quickly recognized as marketing mechanisms rather than privacy standards by the fact that
the flow of personal information is unidirectional: from the consumer to the company.
The political environment surrounding the development of P3P promotes the erroneous belief
that Internet privacy is something terribly complex and remote from "offline"
privacy, and that technology will eventually solve the problem if given time, making legal
rights and enforcement mechanisms unnecessary. Rather, the core privacy issues are
identical online and offline; online consumers are more aware of the risks, so companies
have been forced to give it more attention. Further, no amount of technology can ever make
up for the lack of enforceable privacy rights held by the American citizen.
The most implausible premise is the view that a high level of privacy will eventually be
achieved if software makers and ecommerce sites agree on a standard that (after an even
longer time, as software is upgraded) might be adopted by a sufficiently large percentage
of consumers, thus expressing through the market and technology an economic demand for
privacy. Believing this process will succeed in protecting privacy is as naive as hoping
that environmental protection would be well served by having Exxon and GM draw up
standards for emission control, and by the auto industry providing consumers the
opportunity to vote on these standards by checking boxes on postcards made available to
them at gas stations and automobile showrooms. Rather, technologists should take as their
point of departure the strong privacy rights that are being mandated by an increasing
number of legislatures, and develop technology that will efficiently and effectively serve
people exercising those rights.
There is an unspoken assumption that as soon as a highly technical language is provided for
codifying privacy policies, then marketers will offer good policies in this language.
Rather, a simple argument will prove that P3P will never provide the majority with any
real privacy protection or even useful guidance. Under the banner of "policy-neutral
language," P3P is simply deferring the difficult decision of what the minimum
acceptable standard should be. As a thought-experiment, suppose that some time in the 21st
century, the P3P language is finalized and the software ready. A decision will have to be
made on the defaults, designating the minimum expectation that surfers should have before
the browser raises alerts on visiting a substandard site. (For P3P to have any widespread
effect, it would have to be pre-installed in both major browsers, and there would have to
be some such default below which an alarm is raised.) This entails a large number of
questions to which no consensus answer is ever likely to be found. Should the consumer be
alerted if a site's policy:
- states it might sell names if the consumer doesn't separately opt out?
- does not provide access to the data held by the company about the person?
and a hundred other
questions like these. It will take at least until the 22nd century for marketers to agree
to defaults that are anywhere near the levels that consumers or privacy experts would
want. And who would be making the decision on whether this technology and its defaults
goes into browsers? Microsoft and AOL/Netscape control more than 90% of the market. Do
these companies have a history of choosing privacy-friendly defaults, such as those for
cookies? No. Do these companies have a history of placing the privacy of consumers above
the commercial interest of themselves and their marketing partners? No. Would they install
defaults that alarm prospective purchasers unless stated privacy standards are higher than
what they currently offer? Very unlikely. If you disagree, why not issue a public challenge
to AOL, Microsoft and other sponsors of P3P and the Direct Marketing Association to
propose default settings that they would consider acceptable. If you receive no
satisfactory response, take this as an admission that your project has been used as a pawn
in a cynical campaign against privacy. If you receive a sensible response, present it to
consumers and consumer advocates and ask whether they consider them acceptable. This
exercise is unlikely to succeed in gaining a consensus, and you might as well find out
whether it can before going to the mighty effort of finalizing the specification and
deploying it.
(from "Technical Standards and Privacy: An Open Letter to P3P Developers"; Jason Catlett
of JunkBusters; September 13, 1999)
Prognosis for Adoption
After more than three years in development, P3P still faces a
number of serious challenges that will likely preclude its widespread adoption.
There is no user base and no user demand. Companies have been
reluctant to adopt the complicated protocol structure, and governments have shown little
indication that they will address public concerns about privacy protection.
Experience with cookies sheds light on another possible P3P user
agent-side problem. Those consumers who have taken the time to configure their browsers
to notify them when receiving cookies, or to reject cookies, have found that web surfing becomes nearly
impossible.
The same situation will likely apply to P3P user agents. Concerned
users will configure their P3P user agents to reflect high privacy protections. However,
when these users attempt to access the majority of commercial web sites, endless pop-up
windows warning them that a site wishes to go beyond their specified privacy preferences
will result. Users who have configured their agents to block sites that do not meet their
preferences may well find that there are few web sites left to surf. Consumers will likely
respond to this frustrating situation by begrudgingly reverting to low P3P privacy-protective
configurations, thus maintaining industry's present privacy-invasive status quo.
The incredible complexity of P3P, combined with the way that popular
browsers are likely to implement the protocol, could also undermine well-established
privacy standards, particularly where legislation is in place. P3P may actually strengthen
the monopoly position over personal information that U.S. data marketers now enjoy.
(from "Pretty Poor Privacy: An Assessment of P3P and Internet Privacy"; EPIC and
JunkBusters; June 2000)
The Gathering Clouds
Privacy advocates adopted varying interpretations of P3P. Several activists, myself
included, participated in the W3C Working Group, in the belief that the initiative was
capable of delivering real technological protections for web-users. Several others were
more sceptical, and preferred to stay outside the Working Group.
In my critique of early 1998, I identified four aspects of P3P that I was concerned about:
- the coverage of privacy needs;
- the coverage of legal and cultural diversity;
- the drivers for implementation; and
- the mechanisms for ensuring compliance.
At the international privacy conference in Montreal in September 1997, EPIC's Marc
Rotenberg presented a classification scheme for technologies:
- Technologies of Surveillance (equivalent to my 'PITs');
- Technologies for Contracting (including P3P, which he saw as being neutral rather than a
positive contribution to privacy);
- Technologies for Labelling and Notice (such as 'trust labels'); and
- Privacy-Enhancing Technologies (PETs).
To address some of his concerns about the limited contribution that he saw P3P as
making, I suggested that some refinements were needed, including:
- the extension of the data schema to identify the nature of and authority for
enforceability of the policy (e.g. a company code, a contract, an industry code of
conduct, and/or legislation);
- the association of privacy preferences with a role, not with a person, and the
enablement of multiple sets of roles. This would have laid the foundation for
- the inclusion in the software agent schema of references to services providing anonymity
New York-based Australian Jason Catlett, of Junkbusters
Inc., expressed more serious concerns in an open letter to P3P's designers in
September 1999. He depicted P3P as being part of the direct marketing lobby's manoeuvres
to convert privacy from the fundamental human right that it is, to nothing more than a
consumer preference. It diverted attention away from what is really needed
(privacy-protective law complete with enforcement and redress), towards the U.S. corporate
view of privacy as merely notice of practices and consumer choice. Rather than a Platform
for Privacy Preferences, he saw it as a Pretext for Privacy Procrastination.
P3P as a Pseudo-PET
I've had little to do with P3P during the 18 months since Jason's open letter. I
re-visited P3P recently, and was very disappointed with what I found.
The descriptions of the now all-but-finalised specification make clear that the
protocol specifies only the statement of a web-site's use and disclosure policy. Worse, it
is actually depicted as though it were a push-mechanism, rather than a communication
initiated by a request from a browser. The accompanying diagrams even go so far as to imply
that the browser submits personal data to the server irrespective of what the web-site's
policy statement is.
Critically, the specification contains no minimum requirements for web-browsers. This
had to be omitted in order to avoid constraining competition among browser-providers. P3P
therefore fails to create any momentum towards the inclusion of the necessary
privacy-sensitive features in the tools that users have at their disposal.
The original promise of P3P has been neutered. The judgements of Marc Rotenberg in 1997
and Jason Catlett in 1998, as updated in EPIC (2000), are fully vindicated. P3P is a
mere fraction of what it was meant to be, and of what the situation demands.
The key proponents of the P3P protocol have laboured long and hard in an endeavour to
deliver a PET, but the interests of W3C's members have resulted in it being watered down
to a mere pseudo-protection.
(from "P3P Re-visited"; Roger Clarke (Dr.
Clarke was initially a supporter of P3P but switched his position); March 20, 2001)
2: P3P - Hype or hope?
Why read a privacy policy that is barely comprehensible even to lawyers when your computer can do it for you
automatically? P3P is intended to make it easier for Internet-surfers to negotiate over
their personal information by reducing the transaction costs of the negotiation - the
implementation of P3P for cookie-winnowing is a fine example. Internet Explorer 5.5 has
just three settings for cookies - block all cookies, get an approve-or-reject prompt for
each cookie, or block none. The first option is too restricting, barring cookies even from
fairly trustworthy sites; the second option incurs the transaction costs of reading and
responding to each prompt; the third option exposes the user to untrustworthy cookies. With the new controls available in
IE 6.0, users will be able to express their privacy preferences using P3P (the Platform
for Privacy Preferences), and let the computer incur the transaction costs of figuring out
whether any given cookie respects those preferences. P3P would thus seem the ideal
solution for putting power over personal information back in the hands of consumers.
Reality, alas, is somewhat more nuanced.
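The transaction-cost point can be made concrete. Below is a toy model of the evaluation a P3P-aware user agent might perform on each cookie; the purpose names echo the P3P 1.0 vocabulary, but the preference sets and decision rules are invented for illustration and do not describe IE 6.0's actual behavior:

```python
# Hypothetical user preferences: purposes the user rejects outright,
# and purposes the user wants to review case by case.
USER_BLOCKS = {"telemarketing", "contact"}
USER_PROMPTS = {"individual-analysis"}

def decide(declared_purposes):
    """Decide accept/prompt/block for one cookie's declared P3P purposes."""
    if declared_purposes & USER_BLOCKS:
        return "block"
    if declared_purposes & USER_PROMPTS:
        return "prompt"
    return "accept"

# The browser, not the user, now pays the cost of checking each policy.
print(decide({"current"}))                   # accept
print(decide({"current", "telemarketing"}))  # block
```

Set intersection against the user's stated preferences replaces the per-cookie prompt, which is exactly the transaction cost P3P is meant to eliminate.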
P3P categorizes the types of information that are handed over by the user (e.g. physical contact
info, demographic info, health info), the purposes for which it is collected (e.g. technical
support, telemarketing, R&D), the recipients of the information (e.g. delivery
services, public fora, "<other-recipient>"), and the duration of the
information's retention.
These categories can be misused. For example, one purpose category is
"<current/>" - the completion and support of the current activity.
Naturally, if the webpage states in some obscure corner that the current activity
includes handing the information to the highest bidder, then that is part of the activity.
See Robert Thibadeau, A Critique of P3P: Privacy on the Web (August 23,
2000) <http://dollar.ecom.cmu.edu/p3pcritique/>. A similar concern arises with the
retention policy category of "<stated-purpose/>". These categories
effectively require the user to hunt down the prose explanation of the "current
activity" or the "stated purpose" for which the data is being collected,
re-imposing the transaction costs that P3P was meant to eliminate in the first place. Even
the seemingly more-specific categories are not always as innocuous as they appear. The
recipient called "delivery services" ("<delivery>") is
explicitly described as delivery services which use the information for their own purposes
(delivery services using the information only for carrying out the delivery fall under
"<ours>" - "Ourselves and/or our [sic] entities acting as our
agents"). Explicitly described in the W3C technical specification, that is - which
few users are likely to peruse. In a similar vein, the purpose called "Research and
development" ("<develop/>") includes using the information to improve
the product's marketing campaign.
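The circularity described above can be sketched: a user agent that encounters categories like "<current/>" or "<stated-purpose/>" cannot resolve them mechanically, because their meaning lives in the site's prose. The checker below is hypothetical (not part of any P3P implementation); it merely flags policies whose evaluation still requires human reading:

```python
# Category values whose meaning is defined only by the site's
# human-readable policy text, per the critique above.
PROSE_DEPENDENT = {"current", "stated-purpose", "other-purpose"}

def still_needs_reading(purposes, retention):
    """True if evaluating the policy still requires reading the prose."""
    return bool((purposes | retention) & PROSE_DEPENDENT)

print(still_needs_reading({"current"}, {"stated-purpose"}))  # True
print(still_needs_reading({"develop"}, {"no-retention"}))    # False
```

For such policies the automated comparison bottoms out in exactly the legalese-reading P3P was supposed to spare the user.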
Another potential problem with P3P is its support of automatic data-transmission (such a module
was actually incorporated into a prior draft of the protocol; it was dropped, but could be
re-added by services using P3P as their base protocol). In other words, the user would
type in her personal information once, and the computer would hand over that information
whenever requested to do so by another computer presenting an acceptable P3P policy. The
user may not realize that her consent to transmit, say, age information authorizes the
transmission not only of an age-range (e.g. 18-24), but of her actual date of birth, a
highly personal datum. A request for a zip+4 code (sometimes as specific as a single
building) can be presented as asking for "geographical location". Having to type
out the data would alert the user to the kind of information being collected; its
invisible transmission will not. See Karen Coyle, P3P: Pretty Poor Privacy?
(June, 1999) <http://www.kcoyle.net/p3p.html>.
The most serious problem with P3P is probably its lack of an enforcement mechanism. Privacy
policies can point to a trusted third-party (e.g. TRUSTe) and claim to be audited by it,
but the W3C has yet to come up with a mechanism for non-repudiability of the agreement
between the user and the data-collector, a mechanism that would turn the policy into part
of a legally binding contract.
A problem that is endemic to all privacy protection schemes endangers P3P-based schemes as
well: consumer inertia. Microsoft's P3P-based cookie winnower has a default setting
that even Microsoft's Michael Wallent, product unit manager for IE, admits to be lax.
Yes, consumers can adjust the settings, but how many will, for either this or any of the
P3P-based privacy services?
(from "P3P - An Imperfect Tool for Privacy"; Yair Galil, New York Bureau
of The Internet Law Journal;
July 14, 2001; http://www.tilj.com/content/ecomheadline07140102.htm)