Predicting Opioid Use Outcomes in Minoritized Communities

DOI: 10.1145/3584371.3613023

Published: 04 October 2023

Abstract

    Within the healthcare space, machine learning algorithms can exacerbate racial, ethnic, gender, and other disparities. Many machine learning algorithms are trained on data from majority populations, thereby generating less accurate or reliable results for minoritized groups [3]. For example, a widely used algorithm falsely concludes that, at a given risk score, Black individuals are healthier than equally sick White individuals [6]. Such large-scale algorithms can thus perpetuate biases. There has been limited work exploring potential biases in algorithms deployed within minoritized communities. In particular, minimal research has detailed how biases may manifest in algorithms developed by insurance companies to predict opioid use outcomes, or opioid overdoses among people who use opioids in urban areas. An algorithm trained on data from White individuals may provide incorrect estimates for Hispanic/Latino individuals, potentially resulting in adverse health outcomes.
    Since predicting opioid use outcomes is important to improving health in populations often neglected by larger health systems [4], our goal is to examine how well machine learning algorithms determine opioid use outcomes within minoritized communities. As a case study, we used data from a sample of 539 young adults who engaged in nonmedical use of prescription opioids and/or heroin [5]. The prevalence and incidence of opioid use have increased rapidly in the US over the past two decades, with concomitant increases in opioid dependence, accidental overdose, and death. We addressed these issues through the following contributions: 1) using machine learning techniques, we predicted opioid use outcomes for participants in our dataset; 2) we assessed whether an algorithm trained on a majority sub-sample (e.g., Non-Hispanic/Latino, male) could accurately predict opioid use outcomes for a minoritized sub-sample (e.g., Hispanic/Latino, female). Our analysis was designed to replicate plausible real-world scenarios and to provide insight into how to improve broad health outcomes via predictive modeling. For example, if an insurance company primarily caters to Non-Hispanic/Latino individuals, models trained on its data may not predict life insurance costs accurately for Hispanic/Latino individuals seeking treatment, and our analysis can shed light on such scenarios.
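    As a minimal sketch of the second contribution (not the authors' actual pipeline), the Python example below trains a classifier on a majority sub-sample and evaluates it separately on a majority holdout and on a minoritized sub-sample. The dataset path, the column names (`hispanic_latino`, `peer_opioid_use`, `injection_drug_use`, etc.), and the choice of logistic regression are all illustrative assumptions.

```python
# Sketch: train on a majority sub-sample, test on a minoritized sub-sample.
# File path, column names, and model choice are illustrative assumptions,
# not the study's actual variables or method.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("opioid_outcomes.csv")  # hypothetical dataset

features = ["age", "peer_opioid_use", "prior_treatment"]  # assumed predictors
target = "injection_drug_use"                             # assumed binary outcome

# Split by a demographic attribute rather than at random, mimicking an
# insurer whose training data over-represents one group.
majority = df[df["hispanic_latino"] == 0]
minoritized = df[df["hispanic_latino"] == 1]

maj_train, maj_test = train_test_split(majority, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(maj_train[features], maj_train[target])

# Compare precision on the group the model saw during training
# against the group it never saw.
for name, group in [("majority holdout", maj_test), ("minoritized", minoritized)]:
    preds = model.predict(group[features])
    print(name, precision_score(group[target], preds, zero_division=0))
```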
    Results indicated that models were able to predict recent injection drug use and participation in drug treatment. The presence of peers who also engaged in opioid use appeared to play a role in predicting both drug treatment and injection drug use. However, the available data lacked comprehensive information on other facets of opioid use, such as harm reduction. We observed a decrease in precision when we trained our models only on data from a majority sub-sample and tested them on a minoritized sub-sample. Overall, machine learning approaches are only as precise and useful as the data they are trained on: to make valid and accurate predictions, they must be trained on data from people who are similar, in terms of key sociodemographic characteristics, to the populations about whom predictions will be made. Key to mitigating biases in models that predict health outcomes within minoritized communities is the inclusion of stakeholders at every stage of the machine learning operations (MLOps) pipeline. For example, methadone patients should be involved in the development of models that predict methadone dropout risk [1, 2]. Similarly, a committee of ethnic minority individuals can be involved in auditing algorithms used to detect cardiovascular risk. Insurance companies and other stakeholders who use machine learning to predict opioid use outcomes need to be aware that models can exacerbate biases, and should seek to improve their predictive modeling capabilities. Insurance companies whose datasets contain primarily White individuals should augment those datasets with individuals from minoritized backgrounds; such practices can help providers make accurate predictions if their client demographics shift or if nonwhite individuals seek treatment. There increasingly exist independent corporations that audit large-scale machine learning models, and such corporations need to ensure that minoritized communities are adequately represented on their audit committees.
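    One concrete form such an audit could take (a hedged sketch under assumed column names, not a method from the paper) is to report precision separately for each intersectional sub-group, so that disparities like the precision drop described above surface before deployment:

```python
# Sketch of a subgroup-disaggregated precision audit. Column names such as
# "hispanic_latino" and "female" are hypothetical placeholders.
import pandas as pd
from sklearn.metrics import precision_score

def audit_precision(df, y_true, y_pred, group_cols):
    """Return per-subgroup precision for predictions already stored in df."""
    rows = []
    for keys, group in df.groupby(group_cols):
        keys = keys if isinstance(keys, tuple) else (keys,)
        rows.append({
            **dict(zip(group_cols, keys)),
            "n": len(group),
            "precision": precision_score(group[y_true], group[y_pred],
                                         zero_division=0),
        })
    return pd.DataFrame(rows)

# Example usage, assuming predictions were stored in a "predicted" column:
# report = audit_precision(df, "injection_drug_use", "predicted",
#                          ["hispanic_latino", "female"])
# print(report.sort_values("precision"))
```

    Sorting the report by precision makes the worst-served sub-groups immediately visible, which is the kind of signal an audit committee with minoritized representation could act on.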

    References

    [1]
    Chen, K., Babaeianjelodar, M., Shi, Y., Janmohamed, K., Sarkar, R., Weber, I., Davidson, T., De Choudhury, M., Yadav, S., Khudabukhsh, A., et al. Partisan US news media representations of Syrian refugees. arXiv preprint arXiv:2206.09024 (2022).
    [2]
    Chen, K., Feng, A., Aanegola, R., Saha, K., Wong, A., Schwitzky, Z., Lee, R. K.-W., O'Hanlon, R., De Choudhury, M., Altice, F. L., et al. Categorizing memes about the Ukraine conflict. In Computational Data and Social Networks: 11th International Conference, CSoNet 2022, Virtual Event, December 5--7, 2022, Proceedings (2023), Springer, pp. 27--38.
    [3]
    Kostick-Quenet, K. M., Cohen, I. G., Gerke, S., Lo, B., Antaki, J., Movahedi, F., Njah, H., Schoen, L., Estep, J. E., and Blumenthal-Barby, J. Mitigating racial bias in machine learning. Journal of Law, Medicine & Ethics 50, 1 (2022), 92--100.
    [4]
    Kumar, N., Janmohamed, K., Nyhan, K., Martins, S. S., Cerda, M., Hasin, D., Scott, J., Frimpong, A. S., Pates, R., Ghandour, L. A., et al. Substance use in relation to COVID-19: a scoping review. Addictive Behaviors 127 (2022), 107213.
    [5]
    Mateu-Gelabert, P., Guarino, H., Jessell, L., and Teper, A. Injection and sexual HIV/HCV risk behaviors associated with nonmedical use of prescription opioids among young adults in New York City. Journal of Substance Abuse Treatment 48, 1 (2015), 13--20.
    [6]
    Obermeyer, Z., Powers, B., Vogeli, C., and Mullainathan, S. Dissecting racial bias in an algorithm used to manage the health of populations. Science 366, 6464 (2019), 447--453.

      Published In

      BCB '23: Proceedings of the 14th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics
      September 2023
      626 pages
      ISBN: 9798400701269
      DOI: 10.1145/3584371
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 04 October 2023

      Author Tags

      1. opioid use
      2. bias
      3. marginalization

      Qualifiers

      • Abstract

      Funding Sources

      • National Institute on Drug Abuse

      Conference

      BCB '23

      Acceptance Rates

      Overall Acceptance Rate 254 of 885 submissions, 29%
