DOI: 10.1145/3170358.3170408

Research article

Capitalisation of analysis processes: enabling reproducibility, openness and adaptability thanks to narration

Published: 07 March 2018

Abstract

Analysis processes of learning traces, used to gain important pedagogical insights, are not yet easily shared and reused: they face what is commonly called a reproducibility crisis. From our observations, we identify two important factors that may cause this crisis: technical constraints imposed by the need for executable implementations, and context dependencies. Moreover, the meaning of reproducibility itself is ambiguous and a source of misunderstanding. In this paper, we present an ontological framework dedicated to taking full advantage of already implemented educational analyses. This framework shifts the current paradigm of analysis processes by representing them from a narrative point of view instead of a technical one, enabling a formal description of analysis processes with high-level concepts. We show how this description is performed and how it can help analysts. The goal is to empower both expert and non-expert analysis stakeholders to take part in the elaboration of analysis processes and their reuse in different contexts, by improving both human and machine understanding of these analyses. This possibility is known as the capitalisation of analysis processes of learning traces.
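To make the abstract's idea concrete, the sketch below shows what a narrative-level (rather than technical) description of an analysis process might look like. All names here (`AnalysisStep`, the `SelectLearners` and `ComputeIndicator` concepts, the field names) are illustrative assumptions for this sketch, not the vocabulary of the paper's actual ontology.

```python
# Hypothetical sketch: each step of an analysis process is described by
# what it does conceptually, independently of any tool implementing it.
# Concept and field names are invented for illustration.
from dataclasses import dataclass, field


@dataclass
class AnalysisStep:
    """One step of an analysis process, at the narrative level."""
    concept: str                              # high-level operation name
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    rationale: str = ""                       # the narrative: why this step exists


# A toy process: keep only traces of course completers, then compute
# an engagement indicator on them.
process = [
    AnalysisStep("SelectLearners",
                 inputs=["raw_traces"], outputs=["completer_traces"],
                 rationale="Dropouts would bias the indicator."),
    AnalysisStep("ComputeIndicator",
                 inputs=["completer_traces"], outputs=["engagement_scores"],
                 rationale="Engagement is the pedagogical construct of interest."),
]

# Because steps are described conceptually, another analyst (or a machine)
# can re-map each concept onto a concrete tool in a new context.
for step in process:
    print(f"{step.concept}: {step.inputs} -> {step.outputs}")
```

The point of such a representation, as the abstract argues, is that the rationale and the conceptual chain survive even when the original runnable code does not.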


Published In

LAK '18: Proceedings of the 8th International Conference on Learning Analytics and Knowledge
March 2018
489 pages
ISBN:9781450364003
DOI:10.1145/3170358
Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. adaptability
  2. analysis processes of learning traces
  3. capitalization
  4. context
  5. learning analytics
  6. ontology
  7. openness
  8. reproducibility
  9. reuse

Qualifiers

  • Research-article

Funding Sources

  • Agence Nationale de la Recherche

Conference

LAK '18: International Conference on Learning Analytics and Knowledge
March 7-9, 2018
Sydney, New South Wales, Australia

Acceptance Rates

LAK '18 paper acceptance rate: 35 of 115 submissions, 30%.
Overall acceptance rate: 236 of 782 submissions, 30%.


Cited By

  • (2022) A Comparative Analysis of Approaches to Design and Capitalize Data Indicators. Open and Inclusive Educational Practice in the Digital World. DOI: 10.1007/978-3-031-18512-0_9, 135-151. Online publication date: 14-Dec-2022.
  • (2022) Objective Tests in Automated Grading of Computer Science Courses: An Overview. Handbook on Intelligent Techniques in the Educational Process. DOI: 10.1007/978-3-031-04662-9_12, 239-268. Online publication date: 16-Jun-2022.
  • (2020) Learning management system and course influences on student actions and learning experiences. Educational Technology Research and Development 68:6. DOI: 10.1007/s11423-020-09821-1, 3263-3297. Online publication date: 11-Sep-2020.
  • (2019) Topic Hierarchies for Knowledge Capitalization using Hierarchical Dirichlet Processes in Big Data Context. Advanced Intelligent Systems for Sustainable Development (AI2SD'2018). DOI: 10.1007/978-3-030-11928-7_54, 592-608. Online publication date: 7-Mar-2019.