DOI: 10.1145/3675741.3675750
CSET Conference Proceedings
Research article
Open access

NERDS: A Non-invasive Environment for Remote Developer Studies

Published: 13 August 2024
  Abstract

    Given the difficulties of secure development, studying software developers remains pivotal. However, conducting these studies is a persistent pain point for the security community, as recruiting and retaining participants can be incredibly difficult. In this work, we aim to make security studies with software developers easier to conduct by building NERDS: a Non-invasive Environment for Remote Developer Studies. NERDS lets researchers run studies remotely while still providing a controlled environment. We describe our experiences building and deploying NERDS in two distinct secure software development studies, and our lessons learned can provide valuable insight to other researchers who want to use NERDS. We provide NERDS as an open-source system.
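
    The abstract does not describe how NERDS is implemented, so the following is only an illustrative sketch of the general approach it gestures at: giving each remote participant an isolated, pre-configured development environment that the researchers control and can instrument. The Docker image name, port scheme, and helper function are hypothetical and are not taken from NERDS itself.

        # Hypothetical sketch (not the NERDS implementation): provision one
        # isolated, instrumented development container per study participant,
        # so remote participants work in a controlled environment while the
        # researchers retain access to task artifacts and logs.
        import secrets

        import docker  # Docker SDK for Python


        def provision_participant_env(participant_id: str,
                                      host_port: int,
                                      image: str = "study-task-env:latest") -> dict:
            """Start a throwaway container for one participant and return its
            connection details. Image name and port mapping are illustrative."""
            client = docker.from_env()
            token = secrets.token_urlsafe(16)  # per-participant access token
            container = client.containers.run(
                image,
                detach=True,
                name=f"study-{participant_id}",
                environment={"ACCESS_TOKEN": token},   # gate the in-browser workspace
                ports={"8080/tcp": host_port},         # expose the workspace to the participant
                mem_limit="2g",                        # keep environments comparable across participants
                labels={"participant": participant_id},
            )
            return {
                "container_id": container.id,
                "url": f"http://<study-host>:{host_port}",
                "token": token,
            }

    A real deployment would additionally mount per-task starter code and collect each participant's workspace when the session ends; those details depend on the study design.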

    Published In

    CSET '24: Proceedings of the 17th Cyber Security Experimentation and Test Workshop
    August 2024
    115 pages
    ISBN: 9798400709579
    DOI: 10.1145/3675741
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 13 August 2024


    Author Tags

    1. Secure software development
    2. Study methodology
    3. Usable security and privacy

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    CSET 2024
