Where is the evidence?: a call to action for learning analytics

Published: 13 March 2017

Abstract

Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the 'file drawer effect', and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than its quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project's Evidence Hub, an effort to relate research evidence in learning analytics to four propositions: that learning analytics support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and of the evidence found, very little was negative (7%, N=123), suggesting that learning analytics is not immune to the pressures seen in other fields. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and suggest ways in which various stakeholders can achieve this.
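The Evidence Hub described in the abstract links each piece of evidence to one of four propositions and records whether it supports or contradicts that proposition. The sketch below is a minimal illustration of that tally structure in Python; the class, field, and function names are hypothetical and do not reflect the LACE Evidence Hub's actual data model.

```python
from collections import Counter
from dataclasses import dataclass

# The four LACE propositions named in the abstract.
PROPOSITIONS = (
    "supports learning",
    "supports teaching",
    "deployed widely",
    "used ethically",
)

@dataclass(frozen=True)
class EvidenceItem:
    proposition: str  # one of PROPOSITIONS
    polarity: str     # "positive", "negative", or "mixed"

def summarise(items):
    """Tally evidence by proposition and compute the share that is negative."""
    by_proposition = Counter(item.proposition for item in items)
    negative = sum(1 for item in items if item.polarity == "negative")
    share_negative = negative / len(items) if items else 0.0
    return by_proposition, share_negative

if __name__ == "__main__":
    items = [
        EvidenceItem("supports learning", "positive"),
        EvidenceItem("supports teaching", "positive"),
        EvidenceItem("supports learning", "negative"),
    ]
    counts, share = summarise(items)
    print(counts)                          # Counter({'supports learning': 2, ...})
    print(f"negative share: {share:.0%}")  # negative share: 33%
```

On this model, the abstract's headline figure is simply share_negative computed over all N=123 catalogued items, which came to roughly 7%.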




      Published In

LAK '17: Proceedings of the Seventh International Learning Analytics & Knowledge Conference
March 2017, 631 pages
ISBN: 9781450348706
DOI: 10.1145/3027385

      Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. access
      2. ethics
      3. evidence
      4. evidence hub
      5. generalisability
      6. learning analytics cycle
      7. reliability
      8. validity

      Qualifiers

      • Research-article

      Funding Sources

      • European Commission

      Conference

LAK '17: 7th International Learning Analytics and Knowledge Conference
March 13-17, 2017
Vancouver, British Columbia, Canada

      Acceptance Rates

LAK '17 Paper Acceptance Rate: 36 of 114 submissions, 32%
Overall Acceptance Rate: 236 of 782 submissions, 30%

