
Can causality accelerate experimentation in software systems?

Published: 11 June 2024

Abstract

Software designed using dataflow architecture naturally produces a graphical model of the data transformation process through the system. Interpreting this as a causal graph, we can leverage techniques from causal inference to estimate downstream effects of changes in code components, which can be interpreted as interventions within the causal graph. This allows for less costly software experimentation and can add another layer of protection against undesirable production updates.
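The idea above can be illustrated with a minimal sketch (a hypothetical toy example, not the authors' implementation): a dataflow graph is treated as a structural causal model in which each component's output is a function of its parents' outputs, and a code change is modeled as Pearl's do-operator, which fixes a component's output and propagates the change downstream.

```python
from graphlib import TopologicalSorter

def evaluate(parents, equations, do=None):
    """Evaluate a dataflow graph as a structural causal model.
    `parents` maps node -> list of parent nodes; `do` maps
    node -> forced value (the do-operator: the node's own
    equation is ignored and incoming edges are severed)."""
    do = do or {}
    values = {}
    # TopologicalSorter yields parents before children.
    for node in TopologicalSorter(parents).static_order():
        if node in do:
            values[node] = do[node]  # intervened component
        else:
            values[node] = equations[node](values)
    return values

# Hypothetical pipeline: raw -> clean -> features -> score
parents = {"raw": [], "clean": ["raw"], "features": ["clean"], "score": ["features"]}
equations = {
    "raw": lambda v: 10.0,
    "clean": lambda v: v["raw"] * 0.9,
    "features": lambda v: v["clean"] + 1.0,
    "score": lambda v: v["features"] * 2.0,
}

baseline = evaluate(parents, equations)
# Estimated downstream effect of changing the "clean" component's output
intervened = evaluate(parents, equations, do={"clean": 12.0})
effect = intervened["score"] - baseline["score"]
print(effect)  # 6.0
```

Under this framing, the effect of a proposed change on a downstream metric can be estimated from the graph structure before deploying it, which is what makes experimentation cheaper and adds a guard against undesirable production updates.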


Published In

CAIN '24: Proceedings of the IEEE/ACM 3rd International Conference on AI Engineering - Software Engineering for AI
April 2024
307 pages
ISBN:9798400705915
DOI:10.1145/3644815

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. dataflow architecture
  2. causal inference
  3. experimentation

Qualifiers

  • Poster
