DOI: 10.1145/3408877.3432382
Research Article | Open Access

Reduced Learning Time with Maintained Learning Outcomes

Published: 05 March 2021

Abstract

Many online learning initiatives have failed to reach beyond the environments in which they were first developed. One exception is the Open Learning Initiative (OLI) at Carnegie Mellon University (CMU). In an attempt to validate the question-based learning methodology implemented in OLI, we developed online material for an introductory course in object-oriented programming and tested it in two course offerings with a total of 70 students. Because the course had been given in the same format for several years, we also had comparable assessment data from two classes prior to our intervention, allowing us to verify that the new methodology introduced no obvious harm. Findings show a 25% reduction in teaching and learning time. No statistically significant differences were found in the results of the assessment quizzes or in the confidence surveys completed by the students. The two teachers (the same who taught the classes before the intervention) took different paths in preparing to teach with this new methodology: one increased preparation, while the other reduced it, but both were convinced that online question-based learning was superior to the previous lecture- and textbook-based approach, both for the students and for themselves in terms of overall satisfaction. We also gathered time logs from the development in order to estimate the return on investment.


Cited By

  • (2024) Pure Question-Based Learning. Education Sciences 14(8), 882. DOI: 10.3390/educsci14080882. Online publication date: 13 August 2024.
  • (2022) A Web-Based Program About Sustainable Development Goals Focusing on Digital Learning, Digital Health Literacy, and Nutrition for Professional Development in Ethiopia and Rwanda: Development of a Pedagogical Method. JMIR Formative Research 6(12), e36585. DOI: 10.2196/36585. Online publication date: 5 December 2022.
  • (2022) Effective Reskilling of Foreign-Born People at Universities - The Software Development Academy. IEEE Access 10, 24556-24565. DOI: 10.1109/ACCESS.2022.3152194. Online publication date: 2022.
  • (2021) Sustainable Approaches for Accelerated Learning. Sustainability 13(21), 11994. DOI: 10.3390/su132111994. Online publication date: 29 October 2021.


Published In

SIGCSE '21: Proceedings of the 52nd ACM Technical Symposium on Computer Science Education
March 2021
1454 pages
ISBN:9781450380621
DOI:10.1145/3408877
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.


Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. evaluation
  2. introductory programming
  3. question-based learning


Conference

SIGCSE '21

Acceptance Rates

Overall Acceptance Rate 1,595 of 4,542 submissions, 35%


Article Metrics

  • Downloads (Last 12 months): 180
  • Downloads (Last 6 weeks): 33

Reflects downloads up to 08 Feb 2025

