
Dataveillance imaginaries and their role in chilling effects online

Published: 01 November 2023

Highlights

Self-inhibition due to dataveillance is defined as a chilling effect of dataveillance.
Dataveillance imaginaries affect internet users’ self-inhibition of digital communication.
Dataveillance imaginaries include actors, workings, data types, and consequences.
Chilled behaviors include opinion voicing, service use, and information seeking.
A sense of dataveillance can also lead to no change or enhancing privacy.

Abstract

The automatic tracing and analysis of personal data on the internet is a common occurrence. So far, the extent of internet users' sense of such dataveillance and their reactions to it remain obscure. This article explores 1) internet users' dataveillance imaginaries and 2) the role these imaginaries play in self-inhibited digital communication behaviors relative to other behavioral responses to a sense of dataveillance. To address these questions, we apply thematic analysis to semi-structured interviews. Our findings show that internet users' dataveillance imaginaries affect their self-inhibition: not trusting actors, being aware of advanced workings of dataveillance, being critical of data collection and monetization, valuing privacy highly, and evaluating the consequences of dataveillance negatively all lead to self-inhibition. Such self-inhibition arising from a sense of dataveillance, i.e., the chilling effects of dataveillance, manifests in not using certain services, not searching for information, and not voicing one's opinion, which is problematic in a democracy. Further behavioral responses to a sense of dataveillance include not changing one's use of services and using privacy-enhancing techniques. By shedding light on the role internet users' dataveillance imaginaries play in their self-inhibition of legitimate digital communication behavior, this article makes a novel contribution to the empirical investigation of the chilling effects of dataveillance.


Published In

International Journal of Human-Computer Studies, Volume 179, Issue C, November 2023, 235 pages

Publisher

Academic Press, Inc., United States


Author Tags

1. Dataveillance
2. Digital communication
3. Self-inhibition
4. Chilling effects
5. Behavioral responses
6. Imaginaries
