
Transparency’s Double-Edged Sword in Census Privacy

DATE POSTED: May 22, 2025
Table of Links

Abstract and 1. Introduction

2. Related Work

3. Theoretical Lenses

3.1. Handoff Model

3.2. Boundary objects

4. Applying the Theoretical Lenses and 4.1 Handoff Triggers: New tech, new threats, new hype

4.2. Handoff Components: Shifting experts, techniques, and data

4.3. Handoff Modes: Abstraction and constrained expertise

4.4 Handoff Function: Interrogating the how and 4.5. Transparency artifacts at the boundaries: Spaghetti at the wall

5. Uncovering the Stakes of the Handoff

5.1. Confidentiality is the tip of the iceberg

5.2. Data Utility

5.3. Formalism

5.4. Transparency

5.5. Participation

6. Beyond the Census: Lessons for Transparency and Participation and 6.1 Lesson 1: The handoff lens is a critical tool for surfacing values

6.2 Lesson 2: Beware objects without experts

6.3 Lesson 3: Transparency and participation should center values and policy

7. Conclusion

8. Research Ethics and Social Impact

8.1. Ethical concerns

8.2. Positionality

8.3. Adverse impact statement

Acknowledgments and References

5.4 Transparency

The Bureau initially emphasized confidentiality, not transparency, as the benefit of the new DAS [5]. However, the handoff lens reveals that the shift to DP also changed the DAS’s capacity to support transparency. Because DP enabled technical details of the new DAS to be made public without compromising confidentiality, transparency emerged as a principal value of the political process surrounding the DAS. This shift opens up new possibilities for transparent relationships between Bureau researchers and the public.

A closer examination of this handoff reveals that the notion of “transparency” is in fact standing in for, and masking, many different values. In particular, stakeholders held many different ideas about the goal of transparency efforts, making it difficult for the Bureau to achieve each simultaneously. The Bureau engaged in many types of transparency, going beyond simplistic information disclosures and attempting to engage multiple audiences. Despite this, some stakeholders maintained that the Bureau was not being sufficiently transparent [12, 19, 97]. Understanding the Bureau’s various information releases as efforts to create boundary objects where competing notions of transparency were negotiated, we can unpack these transparency efforts and understand the many values and conflicts subsumed under the umbrella of transparency.

5.4.1 Transparency for Data Utility. Transparent privacy mechanisms can enable well-informed data users to make valid statistical inferences using privatized data by properly accounting for the uncertainty introduced by the privacy mechanism [52, 134]. Because of this, transparency in the DAS can make Census data more useful for statistical applications [52]. If the purpose of transparency is to enhance data utility through appropriate uncertainty quantification, transparency can be narrowly defined. In this case, only technical details are relevant objects of transparency, while disclosures about why a particular decision was made or who made a decision are outside of the scope of transparency for data utility.
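
As a sketch of what "transparency for data utility" enables: if the release mechanism and its noise scale are public, a data user can invert the known noise distribution to quantify uncertainty around a noisy count. The function below assumes a simple Laplace release; the actual DAS adds discrete Gaussian noise and post-processing, so this is illustrative only, not a method for real census data.

```python
import math

def dp_confidence_interval(noisy_count: float, epsilon: float,
                           sensitivity: float = 1.0,
                           confidence: float = 0.95) -> tuple[float, float]:
    """Interval containing the true count with the given probability,
    assuming the release added Laplace(scale = sensitivity/epsilon) noise.

    For Laplace noise, P(|noise| > t) = exp(-t / scale), so the
    half-width solving P(|noise| <= t) = confidence is
    t = scale * ln(1 / (1 - confidence)).
    """
    scale = sensitivity / epsilon
    half_width = scale * math.log(1.0 / (1.0 - confidence))
    return (noisy_count - half_width, noisy_count + half_width)

lo, hi = dp_confidence_interval(noisy_count=1042.0, epsilon=0.5)
# Without transparency about epsilon and the mechanism, this
# uncertainty quantification would be impossible for a data user.
```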


5.4.2 Transparency for Trust. Increased trust is often cited as a primary benefit of transparency efforts [e.g., 64, 103]. We can see that the Bureau’s decisions about what to make transparent – and what to keep hidden – were shaped by the importance of trust in census products. For example, the decision not to release the noisy measurement files (at odds with the pursuit of data utility through transparency outlined in the previous subsection) was intended to preserve trust in census counts by hiding implausible counts produced by the original DP processing. However, by keeping the noisy measurement files hidden, the Bureau prevented external DP experts from fully evaluating its implementation [19], ultimately undermining the trust the Bureau had hoped to preserve.

Outside of the noisy measurement files, the Bureau made many elements of the DAS visible during the transition to DP. Despite the Bureau’s increased transparency, however, a number of key stakeholders expressed distrust in census data products during the DAS handoff. The National Congress of American Indians expressed concern that the 2020 census data would be “inaccurate and unusable” [137, p. 3]; similarly, organizers working to increase participation in the census questioned “why they should bother putting in all this effort if the end data are going to be so noisy” [93].

Schnackenberg and Tomlinson suggest that perceptions of trustworthiness are enhanced through disclosure, clarity, and accuracy [106]. Because the Bureau had limited ability to disclose all relevant information – including details of previous disclosure avoidance systems, details of the reconstruction attack, and ground-truth data – stakeholders could not evaluate the Census Bureau’s choices through the information disclosures and demonstration data. In the absence of this additional information, the complex technical details of the system, along with bugs in the demonstration data products caused by the post-processing system, damaged rather than enhanced trust [18].

Importantly, the shift from secrecy to transparency about the perturbations of census data drew attention to data alterations and their implications that had gone unnoticed, or at least unexamined, by many stakeholders. Thus, transparency undermined trust not only in the Bureau’s implementation of DP, but also in the value of insights gained from previous census products [19]. Freeman argues that when trust between stakeholders and agencies is low, negotiations over policy implementation take on an adversarial character under which transparency can become dangerous [49]. By frontstaging the hidden work involved in the disclosure avoidance system, the Bureau revealed that decisions involved in its design were not merely sparing stakeholders mundane technical details but were in fact obscuring important policy choices. While the introduction of DP allowed the Bureau to make behind-the-scenes decision-making processes visible, this visibility exposed the slippage between the backstage and the frontstage of agency discretion – to the detriment of trust.

5.4.3 Transparency for Accountability. Stakeholders’ ability to interrogate the data and report on its limitations helped the Bureau identify which aspects of the DAS were limiting the utility of the data for different purposes. Allowing stakeholders to engage with the data during a Census workshop revealed that the post-processing stage of the DAS was introducing “unacceptable and problematic data biases and distortions” [10] and required structural changes. This insight demonstrates the value of the Census Bureau’s transparency efforts in producing a more accountable DAS.

Yet accountability was often limited by a lack of transparency about which DP implementation decisions were feasible for the Census Bureau and which limiting factors were effectively immutable. For instance, only the Bureau had access to the details of the previous SDL methods, and the DP framework does not readily allow for comparisons to non-DP methods, making it challenging to assess critiques that did not accept the DP formalization of privacy as a starting point [58, 70]. Without knowledge of what policy levers were available to them, stakeholders were constrained in their ability to change the DAS.

Accountability was further hampered by difficulties bridging different expert groups. The Bureau needed to communicate in expert language with the relevant theoretical computer science community to convey expertise and facilitate feedback. Yet the technical jargon necessary to elicit solid feedback from that expert community yielded communications that alienated other expert stakeholders. A July 2022 letter from the National Congress of American Indians specifically requested that the Bureau avoid jargon and technical terms in its communications with tribal leadership, citing that prior tribal consultations were “at far too high literacy levels for a lay audience and were therefore not meaningful consultation sessions” [137]. While the Bureau recognized the importance of translating across varied stakeholder groups [10, 121], the challenge of doing so proved difficult to overcome and presented a persistent obstacle to accountability.

5.5 Participation

The Bureau’s process for engaging stakeholders around the 2020 Census included a number of innovations to support both democratic and technocratic elements of agency policy-making [90]. As outlined in §5.4, DP newly allowed transparency in the DAS, which in turn enabled a wider range of actors to be made aware of and participate in policy decisions embedded within the DAS.[6]

Increased technocratic participation was clear: during this shift, the Bureau brought in a range of experts and opened itself up to external expert review. These experts not only considered the technical details of DP and the DAS but also provided input on and review of the Bureau’s communications around the system.

More democratic participation was less clear. Such participation was mediated by the Bureau’s choices about who constituted a relevant public and how to communicate with them. While the Census Scientific Advisory Committee’s DP working group applauded the Bureau for its efforts to include multiple perspectives, the committee also noted that it was difficult to assess which perspectives were not included and that many relevant stakeholders might not have the awareness, time, or energy to engage in policy decisions around the Census’s implementation of DP [20].


:::info Authors:

(1) AMINA A. ABDU, University of Michigan, USA;

(2) LAUREN M. CHAMBERS, University of California, Berkeley, USA;

(3) DEIRDRE K. MULLIGAN, University of California, Berkeley, USA;

(4) ABIGAIL Z. JACOBS, University of Michigan, USA.

:::

:::info This paper is available on arxiv under CC BY-NC-SA 4.0 DEED license.

:::

[6] Further complicating the issue of transparency for data utility, many Census advocates argued that the uncertainty caused by DP noise injection was minor compared to other sources of uncertainty in the Census’s data collection and processing unrelated to confidentiality [117]. However, with some partial exceptions, these sources of uncertainty were not made transparent, undermining the transparency efforts and foreclosing comparison with DP-induced uncertainty. This highlights the importance of considering transparency efforts and sociotechnical systems within their larger context.