
DREAM Challenges

From Wikipedia, the free encyclopedia
DREAM Challenges
Type: non-profit organization
Purpose: crowd-sourced competitions
Website: http://www.dreamchallenges.org

DREAM Challenges (Dialogue for Reverse Engineering Assessment and Methods) is a non-profit initiative for advancing biomedical and systems biology research through crowd-sourced competitions.[1][2] Started in 2006, the initiative collaborates with Sage Bionetworks, which hosts the competitions on its Synapse platform. More than 60 DREAM Challenges have been conducted over a span of more than 15 years.[3]

Overview


DREAM Challenges were founded in 2006 by Gustavo Stolovitzky of IBM Research[4] and Andrea Califano of Columbia University. The current chair of the DREAM organization is Paul Boutros of the University of California. The organization also includes emeritus chairs Justin Guinney and Gustavo Stolovitzky, as well as multiple DREAM directors.[5]

Individual challenges tackle a specific biomedical research question, typically narrowed to a particular disease. A prominent focus has been oncology, with multiple past challenges addressing breast cancer, acute myeloid leukemia, and prostate cancer.[3] The data involved in a challenge reflects its disease context: cancer challenges typically involve data such as mutations in the human genome, gene expression and gene networks in transcriptomics, and large-scale proteomics. Newer challenges have shifted towards single-cell sequencing technologies and emerging gut-microbiome research questions, reflecting trends in the wider research community.[6]

The motivating premise of DREAM Challenges is that crowd-sourcing a research question to a large community through competition yields better models and insights than analysis conducted by a single entity.[7] Past competitions have been published in scientific venues such as the flagship journals of the Nature Portfolio and PLOS publishing groups.[8] Results of DREAM Challenges are announced via web platforms, and the top-performing participants are invited to present their results at the annual RECOMB/ISCB Conference on Regulatory and Systems Genomics (RSG) with DREAM[9] organized by the ISCB.

While DREAM Challenges have emphasized open science and open data, "model to data" approaches have been adopted to mitigate issues arising from highly sensitive data, such as genomics from patient cohorts.[10] In such challenges, participants submit their models packaged in containers such as Docker or Singularity, and the organizers run these containers on the confidential data. This preserves the confidentiality of the original data, in contrast to the more traditional open-data model, where participants submit predictions computed directly from the provided open data.
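The model-to-data workflow can be illustrated with a minimal, hypothetical container entrypoint: the organizers mount the confidential data into the container and collect the predictions it writes out. The directory names, file names, and CSV format below are illustrative assumptions, not the actual Synapse or DREAM conventions.

```python
# Hypothetical entrypoint script for a model-to-data submission.
# Organizers run the container with the confidential data mounted
# (e.g. read-only at /input) and collect predictions from /output.
# Paths, file names, and the CSV schema are illustrative only.
import csv
import os


def predict(record):
    # Placeholder model: a real submission would load trained
    # weights baked into the container image and score the record.
    return 0.5


def main(input_dir="/input", output_dir="/output"):
    os.makedirs(output_dir, exist_ok=True)
    in_path = os.path.join(input_dir, "features.csv")
    out_path = os.path.join(output_dir, "predictions.csv")
    with open(in_path, newline="") as f_in, \
         open(out_path, "w", newline="") as f_out:
        reader = csv.DictReader(f_in)
        writer = csv.writer(f_out)
        writer.writerow(["sample_id", "prediction"])
        for row in reader:
            # The participant's code never ships the data back out;
            # only the prediction file leaves the container.
            writer.writerow([row["sample_id"], predict(row)])


if __name__ == "__main__":
    main()
```

Because only the container image is submitted and only the prediction file is returned, participants never see the held-out confidential data.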

Challenge organization


A DREAM Challenge is organized by a core DREAM/Sage Bionetworks group together with an extended group of scientific experts, who may have contributed to the conception of the challenge or provided key data.[11] New DREAM Challenges may also be proposed by the wider research community.[12] Pharmaceutical companies and other private entities may also be involved, for example by providing data.

Challenge structure


Timelines for key stages (such as introduction webinars, model submission deadlines, and the final deadline for participation) are provided in advance. After the winners are announced, the organizers collaborate with the top-performing participants on post hoc analyses for a publication describing the competition's key findings.[7]

Challenges may be split into sub-challenges, each addressing a different subtopic within the research question. For example, a challenge on predicting cancer treatment efficacy may include separate sub-challenges for progression-free survival, overall survival, best overall response according to RECIST, and exact time to event (progression or death).[2]

Participation


During DREAM Challenges, participants typically build models on provided data and submit predictions or models that the organizers validate on held-out data. To avoid leaking the validation data, mid-challenge leaderboards typically score submissions on a sub-sampled or scrambled dataset, helping participants gauge their performance.[7]

DREAM Challenges are free for participants. During the open phase, anybody can register via Synapse to participate either individually or as part of a team. A person may register only once and may not use aliases.

Some circumstances disqualify an individual from participating, for example:[13]

  • The person has privileged access to the data of the particular challenge, giving them an unfair advantage.
  • The person has been caught cheating in, or is under suspicion of cheating in or abusing, previous DREAM Challenges.
  • The person is a minor (under age 18 or the age of majority in their jurisdiction of residence); this may be waived with parental consent.


References

  1. ^ Meyer, Pablo; Saez-Rodriguez, Julio (2021-06-16). "Advances in systems biology modeling: 10 years of crowdsourcing DREAM challenges". Cell Systems. 12 (6): 636–653. doi:10.1016/j.cels.2021.05.015. ISSN 2405-4712. PMID 34139170. S2CID 235472517.
  2. ^ a b Vincent, Benjamin G.; Szustakowski, Joseph D.; Doshi, Parul; Mason, Michael; Guinney, Justin; Carbone, David P. (2021-11-01). "Pursuing Better Biomarkers for Immunotherapy Response in Cancer Through a Crowdsourced Data Challenge". JCO Precision Oncology. 5 (5): 51–54. doi:10.1200/PO.20.00371. PMC 9848594. PMID 34994587. S2CID 234209297.
  3. ^ a b "Closed Challenges". DREAM Challenges. Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  4. ^ "DREAM Challenges (IBM Research)". Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  5. ^ "DREAM Directors & Support Team". DREAM Challenges. Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  6. ^ "Open Challenges". DREAM Challenges. Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  7. ^ a b c Saez-Rodriguez, Julio; Costello, James C.; Friend, Stephen H.; Kellen, Michael R.; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo (2016-07-15). "Crowdsourcing biomedical research: leveraging communities as innovation engines". Nature Reviews. Genetics. 17 (8): 470–486. doi:10.1038/nrg.2016.69. ISSN 1471-0056. PMC 5918684. PMID 27418159.
  8. ^ "Publications". DREAM Challenges. Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  9. ^ "HOME - RSGDREAM 2022". www.iscb.org. Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  10. ^ Ellrott, Kyle; Buchanan, Alex; Creason, Allison; Mason, Michael; Schaffter, Thomas; Hoff, Bruce; Eddy, James; Chilton, John M.; Yu, Thomas; Stuart, Joshua M.; Saez-Rodriguez, Julio; Stolovitzky, Gustavo; Boutros, Paul C.; Guinney, Justin (2019-09-10). "Reproducible biomedical benchmarking in the cloud: lessons from crowd-sourced data challenges". Genome Biology. 20 (1): 195. doi:10.1186/s13059-019-1794-0. ISSN 1474-760X. PMC 6737594. PMID 31506093.
  11. ^ Boutros, Paul (2020-11-20). "Can crowd-sourcing help advance open science?". FEBS Network. Archived from the original on 2023-01-07. Retrieved 2022-11-13.
  12. ^ Azencott, Chloé-Agathe; Aittokallio, Tero; Roy, Sushmita; DREAM Idea Challenge Consortium; Norman, Thea; Friend, Stephen; Stolovitzky, Gustavo; Goldenberg, Anna (2017-09-29). "The inconvenience of data of convenience: computational research beyond post-mortem analyses". Nature Methods. 14 (10): 937–938. doi:10.1038/nmeth.4457. ISSN 1548-7105. PMID 28960198. S2CID 34550460.
  13. ^ "DREAM OFFICIAL CHALLENGE RULES - Effective May 11, 2022 (on Synapse)". www.synapse.org. Archived from the original on 2023-01-06. Retrieved 2022-11-13.