labour -> labor per W3C manual of style US spelling #429

Merged 2 commits on Nov 14, 2024
index.html: 16 changes (8 additions, 8 deletions)
@@ -725,7 +725,7 @@

When deployed thoughtfully, these mechanisms can improve [=people=]'s [=autonomy=]. Often,
however, they are used as a way to avoid putting in the difficult work of deciding which
-types of [=processing=] are [=appropriate=] and which are not, offloading [=privacy labour=]
+types of [=processing=] are [=appropriate=] and which are not, offloading [=privacy labor=]
to the people using a system.

[=People=] should be able to [=consent=] to data sharing that would
@@ -767,21 +767,21 @@
for the scale of processing implied by web systems), and experience with existing systems
shows that they make it hard for [=people=] to exercise their rights.

-### Privacy Labour {#privacy-labour}
+### Privacy Labor {#privacy-labor}

-<dfn data-lt="privacy labor|labour|labor">Privacy labour</dfn> is the practice of having a [=person=] do
+<dfn data-lt="labor">Privacy labor</dfn> is the practice of having a [=person=] do
the work of ensuring [=data processing=] of which they are the subject or recipient is
[=appropriate=], instead of putting the responsibility on the [=actors=] who are doing the processing.
Data systems that are based on asking [=people=] for their [=consent=] tend to increase
-[=privacy labour=].
+[=privacy labor=].

-More generally, implementations of [=privacy=] often offload [=labour=] to [=people=]. This is
+More generally, implementations of [=privacy=] often offload [=labor=] to [=people=]. This is
notably true of the regimes descended from the <dfn data-lt="FIPs">Fair Information Practices</dfn>
([=FIPs=]), a loose set of principles initially elaborated in the 1970s in support of individual
[=autonomy=] in the face of growing concerns with databases. The [=FIPs=] generally assume that
there is sufficiently little [=data processing=] taking place that any [=person=] will be able to
carry out sufficient diligence to be [=autonomous=] in their decision-making. Since they offload
-the [=privacy labour=] to people and assume perfect, unlimited [=autonomy=], the [=FIPs=] do not
+the [=privacy labor=] to people and assume perfect, unlimited [=autonomy=], the [=FIPs=] do not
forbid specific types of [=data processing=] but only place them under different procedural
requirements. This approach is no longer [=appropriate=].

@@ -1703,7 +1703,7 @@
These forms might be governmental bodies at various administrative levels, standards
organisations, worker bargaining units, or civil society fora.
Even though collective decision-making can be better than offloading
-[=privacy labour=] to [=individuals=], it is not a panacea.
+[=privacy labor=] to [=individuals=], it is not a panacea.
Decision-making bodies need to be designed carefully,
for example using the <a data-cite="IAD#">Institutional Analysis and Development framework</a>.

@@ -1935,7 +1935,7 @@
</div>

Attempts to obtain consent to [=processing=] that is not in accordance with the person's
-true preferences result in imposing unwanted [=privacy labour=] on the person, and may
+true preferences result in imposing unwanted [=privacy labor=] on the person, and may
result in people erroneously giving consent that they regret later.

An [=actor=] should not prompt a [=person=] for consent if the
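
Background on the markup being edited: this spec is written with ReSpec, where a <dfn> element marks a definition and its data-lt attribute lists alternative link text. Inline references written as [=term=] resolve against either the definition's own text or one of its data-lt entries, which is why the references throughout the document had to be renamed together with the definition. A minimal sketch of the mechanism, using illustrative filler prose rather than text from this spec:

    <!-- ReSpec resolves [=...=] references against a dfn's text and its data-lt entries. -->
    <section>
      <h3>Privacy Labor</h3>
      <p><dfn data-lt="labor">Privacy labor</dfn> is work offloaded to a person.</p>
      <!-- Both references below link to the definition above: -->
      <p>Consent-based systems tend to increase [=privacy labor=].</p>
      <p>Implementations often offload [=labor=] to people.</p>
    </section>

If a reference no longer matches any definition (for example, a leftover [=privacy labour=] after this change), ReSpec flags it as an unresolved reference when the document is built, so the definition and all eight of its references needed to change in the same pull request.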