Posted on 2024-02-15, 20:42 · Authored by Chelsea L. Horne
<p dir="ltr">Users have long had to navigate dodgy defaults, sticky settings, and deceptive data practices online. Despite regulatory actions, penalties, and new policies, these problems remain and privacy harms persist. This, then, is the challenge: how to navigate an online ecosystem where privacy settings can be paradoxically empowering, yet unfair? The central thesis of this dissertation is that privacy settings are both a site and a mechanism of power, and therefore play a critical part in platform governance and regulation. At present, privacy settings are both the problem and the solution to many concerns about privacy, deceptive practices, and manipulative design. That is to say, privacy settings hold immense power to shape users’ privacy experience online, especially via default settings; at the same time, by addressing the potential and sometimes unintended harms of “bad” privacy settings and choices, we can resolve several major privacy concerns. Settings have tangible impact: their innate, visible obviousness renders them a powerful piece of the privacy puzzle. To study privacy settings, this dissertation takes a two-pronged approach. First, I conduct a comparative analysis of the privacy settings of major social media platforms. Second, I present a historical study of the evolution of Meta’s privacy settings in relation to privacy inflection points.</p><p dir="ltr">The primary goal and contribution of this dissertation is a framework that helps policymakers and social media platforms better understand privacy online, both in theory and in practice, so that they may develop better policies and designs. At the heart of this framework, and a major contribution of the dissertation, is my typology of privacy settings, which deconstructs privacy as a discourse and reimagines current applications, definitions, and understandings of privacy through the lens of privacy settings.
I have decoupled the different subcomponents of privacy discourse by identifying, naming, and operationalizing seven distinct components present in privacy settings to create an analytical toolset that offers a greater degree of granularity and specificity to platforms, regulators, and consumers.</p>
Related Materials
URN: Is identical to http://dissertations.umi.com/american:12128
Publisher
ProQuest
Language
English
Committee chair
Laura DeNardis
Committee member(s)
Aram Sinnreich; Nanette Levinson
Degree discipline
Communication
Degree grantor
American University. School of Communication
Degree level
Doctoral
Degree name
Ph.D. in Communication, American University, December 2023