Research article | Open access

Interference, Bias, and Variance in Two-Sided Marketplace Experimentation: Guidance for Platforms

Published: 25 April 2022
    Abstract

    Two-sided marketplace platforms often run experiments (or A/B tests) to test the effect of an intervention before launching it platform-wide. A typical approach is to randomize users into a treatment group, which receives the intervention, and a control group, which does not. The platform then compares the performance in the two groups to estimate the effect if the intervention were launched to everyone. We focus on two common experiment types, where the platform randomizes users either on the supply side or on the demand side. For these experiments, it is known that the resulting estimates of the treatment effect are typically biased: individuals in the market compete with each other, which creates interference and leads to a biased estimate. Here, we observe that economic interactions (competition between demand and supply) lead to a statistical phenomenon (biased estimates).
    We develop a simple, tractable market model to study bias and variance in these experiments with interference. We focus on two choices available to the platform: (1) Which side of the platform should it randomize on (supply or demand)? (2) What proportion of individuals should be allocated to treatment? We find that both choices affect the bias and variance of the resulting estimators, but in different ways. The bias-optimal choice of experiment type depends on the relative amounts of supply and demand in the market, and we discuss how a platform can use market data to select the experiment type. Importantly, we find that in many circumstances choosing the bias-optimal experiment type has little effect on variance, and in some cases coincides with the variance-optimal type. On the other hand, we find that the choice of treatment proportion can induce a bias-variance tradeoff, where the bias-minimizing proportion increases variance. We discuss how a platform can navigate this tradeoff and best choose the proportion, using a combination of modeling and contextual knowledge about the market, the risk of the intervention, and reasonable effect sizes of the intervention.
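    The interference mechanism the abstract describes can be illustrated with a toy simulation. This is not the paper's market model; the market dynamics, parameters, and function names below are all illustrative assumptions. The idea: when supply is scarce, treated customers in a demand-side experiment book listings that control customers would otherwise have booked, so the naive difference-in-means estimate can be far larger than the true global treatment effect.

    ```python
    import random

    def run_market(treat_flags, n_listings, p_book=0.5, lift=1.5, seed=0):
        """Toy two-sided market (illustrative, not the paper's model):
        customers arrive in random order and try to book one of a limited
        pool of identical listings. Treatment multiplies a customer's
        booking propensity by `lift`. Returns one 0/1 outcome per customer."""
        rng = random.Random(seed)
        order = list(range(len(treat_flags)))
        rng.shuffle(order)
        remaining = n_listings
        outcome = [0] * len(treat_flags)
        for i in order:
            if remaining == 0:  # supply exhausted: later arrivals book nothing
                break
            p = min(1.0, p_book * (lift if treat_flags[i] else 1.0))
            if rng.random() < p:
                outcome[i] = 1
                remaining -= 1
        return outcome

    M, N = 1000, 300   # demand-heavy market: 1000 customers, 300 listings
    reps = 200

    def avg_bookings(flags):
        return sum(sum(run_market(flags, N, seed=s)) for s in range(reps)) / reps

    # True global treatment effect: treat everyone vs. treat no one.
    # With scarce supply, both counterfactuals sell out, so the GTE is ~0.
    gte = (avg_bookings([1] * M) - avg_bookings([0] * M)) / M

    # Demand-side A/B test: half the customers treated, naive comparison.
    flags = [i % 2 for i in range(M)]
    t_rate = c_rate = 0.0
    for s in range(reps):
        out = run_market(flags, N, seed=s)
        t_rate += sum(o for o, f in zip(out, flags) if f) / (M // 2)
        c_rate += sum(o for o, f in zip(out, flags) if not f) / (M // 2)
    naive = (t_rate - c_rate) / reps

    print(f"global treatment effect per customer: {gte:.4f}")
    print(f"naive A/B estimate:                   {naive:.4f}")
    ```

    In this demand-heavy regime the naive estimate is strongly positive even though the true effect is near zero: treated customers simply cannibalize bookings from the control group, which is exactly the interference-induced bias the paper studies.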



      Published In

      WWW '22: Proceedings of the ACM Web Conference 2022
      April 2022, 3764 pages
      ISBN: 9781450390965
      DOI: 10.1145/3485447
      This work is licensed under a Creative Commons Attribution 4.0 International License.

      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Experimental design
      2. interference
      3. two-sided marketplaces

      Qualifiers

      • Research-article
      • Research
      • Refereed limited


      Conference

      WWW '22: The ACM Web Conference 2022
      April 25-29, 2022
      Virtual Event, Lyon, France

      Acceptance Rates

      Overall Acceptance Rate: 1,899 of 8,196 submissions, 23%

