
remove list of privacy labor suggestions around ancillary data #221

Closed · wants to merge 1 commit from remove-privacy-labor-suggestions

Conversation

@pes10k (Collaborator) commented Feb 23, 2023

Remove list of suggestions around privacy labor and ancillary data.

I also think it'd be fine to solve #220 by being more specific about how browsers could prevent privacy labor around ancillary data collection without collecting ancillary data from users who don't want, or don't anticipate, that data collection.



@jyasskin force-pushed the remove-privacy-labor-suggestions branch from f93a66b to e2aa4bd on March 8, 2023 17:24
- when deciding what [=ancillary data=] to expose. To that end, user agents may
- employ user research, solicitation of general preferences, and heuristics about
- sensitivity of data or trust in a particular context. To facilitate site
+ when deciding what [=ancillary data=] to expose. To facilitate site
@samuelweiler (Collaborator) commented on the diff above, May 3, 2023:

Off topic, but since we're here.... how about simpler language? "To help sites understand user preferences..."

@pes10k (Collaborator, Author) replied:

I think this is a good change too

@pes10k (Collaborator, Author) commented May 3, 2023

My concern with the "user studies" language (and similar) is that it doesn't seem bound or limited in any way, which makes it difficult to make this text actionable in spec reviews.

On one extreme, we don't want a rule that says "any privacy-affecting changes to the web platform are acceptable as long as the proposers did a user study." On the other, we don't want to require reviewers to be experts in understanding, designing, and assessing human-subjects research (which is its own specialized field, requiring years of study and expertise).

If folks think it's important to include "user studies" text here, what guides, limits, and bounds can be included to address these concerns?

@jyasskin (Collaborator) commented:
I've asked the Web Perf WG for thoughts in https://lists.w3.org/Archives/Public/public-web-perf/2023May/0007.html.

- when deciding what [=ancillary data=] to expose. To that end, user agents may
- employ user research, solicitation of general preferences, and heuristics about
- sensitivity of data or trust in a particular context. To facilitate site
+ when deciding what [=ancillary data=] to expose. To facilitate site
+ understanding of user preferences, user agents can provide browser-configurable
A collaborator commented on the diff above:

Should "browser-configurable" be "user-configurable" here?

@pes10k (Collaborator, Author) replied:

"Browser-configurable" here is the same thing as "user-configurable": it means the user can configure the browser to change the policy.
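
As a minimal sketch of that distinction (hypothetical names throughout; nothing here is a real browser API), the idea is that the policy lives in settings the user controls inside the user agent, and sites only observe the resulting behavior:

```ts
// Hypothetical sketch of a user-configured, browser-enforced policy for
// exposing ancillary data. None of these names correspond to real APIs.

type Decision = "expose" | "withhold";

interface AncillaryDataPolicy {
  // Chosen by the user in browser settings; pages cannot modify it.
  defaultDecision: Decision;
  // Optional per-origin overrides, also set by the user.
  perOriginOverrides: Map<string, Decision>;
}

// The user agent consults the user's configuration before exposing
// ancillary data (e.g. detailed timing) to a given origin.
function shouldExposeAncillaryData(
  policy: AncillaryDataPolicy,
  origin: string
): boolean {
  const decision =
    policy.perOriginOverrides.get(origin) ?? policy.defaultDecision;
  return decision === "expose";
}

// Example: a user who withholds ancillary data by default but trusts one site.
const policy: AncillaryDataPolicy = {
  defaultDecision: "withhold",
  perOriginOverrides: new Map([["https://example.org", "expose"]]),
};

console.log(shouldExposeAncillaryData(policy, "https://example.org")); // true
console.log(shouldExposeAncillaryData(policy, "https://example.com")); // false
```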

@torgo (Member) commented Nov 22, 2023

closed as it's overtaken by #361

@torgo closed this Nov 22, 2023

Linked issue: Remove or revise examples of signals browsers should use to prevent "privacy labor" in ancillary data
5 participants