Rethinking People Analytics With Inverse Transparency by Design

Published: 04 October 2023
    Abstract

    Employees work in increasingly digital environments that enable advanced analytics. Yet, they lack oversight over the systems that process their data, which means that potential analysis errors or hidden biases are hard to uncover. Recent data protection legislation tries to tackle these issues, but it is inadequate: it fails to prevent data misusage while at the same time stifling sensible use cases for data.
    We think the conflict between data protection and increasingly data-driven systems should be solved differently. When access to an employee's data is granted, all usages should be made transparent to them, following the concept of inverse transparency. This allows individuals to benefit from sensible data usage while addressing the potentially harmful consequences of data misusage. To accomplish this, we propose a new design approach for workforce analytics software that we refer to as inverse transparency by design.
    To understand the developer and user perspectives on the proposal, we conduct two exploratory studies with students. First, we let small teams of developers implement analytics tools with inverse transparency by design to uncover how they judge the approach and how it materializes in their developed tools. We find that architectural changes are made without inhibiting core functionality. The developers consider our approach valuable and technically feasible. Second, we conduct a user study over three months to let participants experience the provided inverse transparency and reflect on their experience. The study models a software development workplace where most work processes are already digital. Participants perceive the transparency as beneficial and feel empowered by it. They unanimously agree that it would be an improvement for the workplace. We conclude that inverse transparency by design is a promising approach to realize accepted and responsible people analytics.
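    To make the core idea concrete, the following minimal Python sketch (our illustration, not code from the paper or its toolchain; all names such as UsageLog and record_usage are hypothetical) shows the mechanism the abstract describes: every usage of an employee's data is appended to a log that the affected employee can query.

    ```python
    # Hypothetical sketch of inverse transparency: every usage of an
    # employee's data is logged and made visible to that employee.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import List


    @dataclass(frozen=True)
    class UsageRecord:
        """One logged usage of a data owner's data."""
        data_owner: str   # whose data was used
        data_user: str    # who used it
        tool: str         # which analytics tool ran
        purpose: str      # stated purpose of the analysis
        timestamp: datetime


    class UsageLog:
        """Append-only log that data owners can query for usages of their own data."""

        def __init__(self) -> None:
            self._records: List[UsageRecord] = []

        def record_usage(self, data_owner: str, data_user: str, tool: str, purpose: str) -> None:
            self._records.append(
                UsageRecord(data_owner, data_user, tool, purpose, datetime.now(timezone.utc))
            )

        def usages_of(self, data_owner: str) -> List[UsageRecord]:
            # Inverse transparency: the data owner sees every recorded usage of their data.
            return [r for r in self._records if r.data_owner == data_owner]


    # Example: an analytics tool logs its access before computing a metric,
    # and the affected employee can later review all such accesses.
    log = UsageLog()
    log.record_usage(data_owner="alice", data_user="manager_bob",
                     tool="commit-statistics", purpose="team workload report")
    for usage in log.usages_of("alice"):
        print(f"{usage.timestamp:%Y-%m-%d}: {usage.data_user} used {usage.tool} ({usage.purpose})")
    ```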


    Cited By

    • (2023) The Inverse Transparency Toolchain: A Fully Integrated and Quickly Deployable Data Usage Logging Infrastructure. Software Impacts, Vol. 17, Article 100554. https://doi.org/10.1016/j.simpa.2023.100554. Online publication date: September 2023.

    Published In

    Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue CSCW2
    CSCW
    October 2023
    4055 pages
    EISSN: 2573-0142
    DOI: 10.1145/3626953
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 04 October 2023
    Published in PACMHCI Volume 7, Issue CSCW2

    Author Tags

    1. HR analytics
    2. data sovereignty
    3. privacy by design
    4. qualitative study

    Qualifiers

    • Research-article

    Article Metrics

    • Downloads (Last 12 months): 128
    • Downloads (Last 6 weeks): 11
    Reflects downloads up to 11 Aug 2024
