A Quantitative Assessment Method for Simulation-based E-learnings

Natalia Andriano, Marcela Garay Moyano, Carlos Bertoni, Diego Rubio


Laboratorio de Investigación en Ingeniería y Calidad de Software
http://www.institucional.frc.utn.edu.ar/sistemas/gidicalso/
Departamento de Ing. en Sistemas de Información
Universidad Tecnológica Nacional
Maestro M. López esq. Cruz Roja Argentina
(X5016ZAA) Ciudad Universitaria, Córdoba, Argentina
{nandriano; mgaray; bertonicarlosalberto; drubio}@sistemas.frc.utn.edu.ar

Abstract
For several years the software industry has focused on improving the quality of its products
by implementing different frameworks, models and standards such as CMMI and ISO. Training
team members has proven to be a must within these quality frameworks. Given the wide variety
of technologies and the new methodologies for developing software, it is evident that faster,
more effective and more customized alternatives for training people are needed. One such
alternative is training people through simulation-based e-learning technologies. Given the vast
e-learning offering on the market, educational software must be evaluated to verify the quality
of the training being produced or acquired. This paper presents a method that provides a
quantitative assessment of training quality. The proposed method approaches the assessment of
educational software through the quantitative evaluation of predefined attributes. A pilot
experience is presented in this paper along with the method's description and explanation.

1. Introduction
For several years the software industry has focused on improving the quality of its products
by implementing different frameworks [8, 9, 35], models and standards such as CMMI [1] and ISO
[2]. Training team members has proven to be a must within these quality frameworks [4, 5].
For decades traditional trainings seemed to work fine [6, 7], but given the wide variety of
technologies [10, 11, 12] and the new methodologies for developing software [3], it is evident
that faster and more effective alternatives to traditional training are needed. One such
alternative is training people through simulation-based e-learning technologies [13].
Simulation-based e-learning is an emerging approach towards providing electronically supported
teaching and learning forms such as self-training, mentoring and discussions [14] within
controlled environments where people can practice and learn without consequences [15].
Some organizations have the resources to produce their own e-learnings tailored to their
specific needs, while others must pay for outsourced trainings. In either case, the educational
software must be evaluated to verify the quality of the training being produced or acquired, in
order to ensure the return on investment.
This paper presents an assessment method that provides a quantitative result of the training
quality. It can be used to evaluate any kind of educational software implemented as e-learning
through interactive simulations. The work is structured as follows: section 2 “Background”
describes the context in which this work has been developed; section 3
“Exploration and Analysis” defines the steps followed to obtain general knowledge on e-learning
and simulators and shows the categories and attributes that compose the method; section 4
“Experimentation” defines how to run the assessment method; section 5 “Demonstration and
Implementation” presents the results from a pilot implementation; and section 6 “Discussion and
Conclusions” explains what the results mean and states the final conclusions about the proposed
assessment method.

2. Background
According to C. Quinn [39], there are seven steps to better e-learning: meaningful skills,
keeping things lean and light, emotional engagement, connected concepts, elaborated examples,
pragmatic practices and refined reflection. Dr. Ruth Clark [41] defines six principles for
effective e-learning: the multimedia principle (adding graphics to words can improve learning);
the contiguity principle (placing text near graphics improves learning); the modality principle
(explaining graphics with audio improves learning); the redundancy principle (explaining
graphics with audio and redundant text can hurt learning); the coherence principle (using
gratuitous visuals, text, and sounds can hurt learning); and the personalization principle
(using a conversational tone and pedagogical agents increases learning). JISC [42] presents a
guide that establishes the most pedagogically sound and accessible ways of embedding e-learning
into everyday practice. Bill Brandon [40] defines five guidelines for e-learning success, four
related to the selection method and one related to the learning method.
Ahdell and Andresen [43] identified e-learning's benefits (anytime-anywhere availability, cost
savings, just-in-time education and updated information, fast development, personalized
learning, and feedback that provides continuous improvement) and problems (boring, text-heavy
content; effects that are hard to measure; underuse), and proposed a model for learning
effectiveness that contains the following factors: willingness to learn, expectations, content,
learning design, engagement, collaboration and mentoring. They also identified the factors that
influence user engagement in games and simulations: interactivity, flexibility, competition,
reality, drama effects and usability.
As this overview shows, multiple authors [17, 20, 21, 22, 23, 43] describe many attributes and
characteristics to consider when planning a successful e-learning. Assessing those
characteristics is key, not only to understand the quality of existing e-learnings but also to
better design and develop new ones.
The assessment method presented in this paper is intended to assist in this endeavor. It is
the result of the first phase of the “Sistema generador de e-learnings de procesos
de desarrollo de software mediante simulaciones interactivas” (Software development process
e-learnings generator through interactive simulations) [16] research project at the Universidad
Tecnológica Nacional – Facultad Regional Córdoba [36], Argentina. The objective of the
project is to develop software that automatically generates simulation-based e-learnings
[37]. The project was divided into three main phases. The first one aimed at defining an
assessment method that, based on predefined attributes, enables a quantitative evaluation
of the quality and effectiveness of e-learning implemented through interactive
simulations – which is presented in this work. The second one aimed at developing the
software that automatically generates e-learnings. Finally, the third one aimed at
evaluating the resulting e-learning generated during the second phase through the
quantitative assessment method defined in the first phase.

3. Exploration and Analysis
The principal objective of this step is the creation, validation and prioritization of a list
of key attributes for both e-learnings and simulators, based on general knowledge acquired
about what e-learnings and simulators are and what main components they must possess. To this
end, common concepts have been adopted, as follows:
E-learning: comprises all forms of electronically supported learning and teaching, which are
procedural in character and aim to effect the construction of knowledge with reference to the
individual experience, practice and knowledge of the learner. Acronyms like CBT (Computer-Based
Training), IBT (Internet-Based Training) or WBT (Web-Based Training) have been used as synonyms
for e-learning [14]. The e-learning components are [17]: 1) Content; 2) the Learning Management
System (LMS), whose essential function is to organize and manage learning and competencies; and
3) the Learning Content Management System (LCMS), which addresses the team process of creating
and maintaining content.
On the other hand, the Oxford English Dictionary [18] describes a simulator as: "A program
enabling a computer to execute programs written for a different operating system." Ruth
Thomas [15] specifies that simulations must have two key characteristics:
• A real-world computer model exists that contains all the information on how the system
really behaves.
• Experimentation can take place; for example, changes to the input change the output.
A concept very closely related to simulation-based e-learning is game-based learning. Upside
Learning [45] defines a simulation game as “simulations that contain multiple game-like
elements but retain some environmental fidelity. The environment, objects and rules simulate a
performance environment. This is ideal for Problem solving – using procedures, applying
principles, analytical skills”. In this work we use the term simulation-based e-learning to
refer to e-learning through interactive simulations and serious games (educational games, video
games, game-based learning, instructional games, sim games, gamesims), as they possess similar
characteristics [44, 45, 46].
Categories are groups of attributes that share common characteristics; they were defined
considering the e-learning components (Content, LMS and LCMS) and the simulators' key
characteristics (real-world entities are modeled and experimentation can take place).
Categories were then subdivided into more manageable subcategories, such as measurable
objectives, logical structure and feedback, and the subcategories were in turn divided into
quantitative attributes.
Attributes are question-like statements that facilitate the evaluation of the software. In
total, seventeen (17) categories and one hundred sixty-five (165) attributes were defined.
These question-like statements were defined, validated and prioritized based on the
characteristics of e-learnings, game-based e-learnings and simulations [17, 20, 21, 22, 23, 43,
44]. A logarithmic scale [38] was selected to specify the attributes' priorities: 1 is the
lowest, 3 is medium and 9 is the highest. The prioritization gives each attribute a relative
weight depending on its importance [17, 20, 21, 22, 23, 43, 44] and is used to calculate the
final evaluation result at the end of the assessment.
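For illustration, the resulting category/subcategory/attribute structure can be sketched as a
simple data model. The following Python snippet is a minimal sketch; the names, questions and
weights shown are hypothetical examples, not a reproduction of the actual attribute list:

```python
# Illustrative sketch of the structure described above: categories are
# subdivided into subcategories, which hold weighted, question-like attributes.
from dataclasses import dataclass

@dataclass
class Attribute:
    question: str  # question-like statement to evaluate
    weight: int    # logarithmic priority: 1 (lowest), 3 (medium), 9 (highest)

# Catalog keyed by (category, subcategory); entries are examples only.
catalog = {
    ("Content", "Customization"): [
        Attribute("Does the software provide an equivalent text "
                  "for each non-text element?", 9),
        Attribute("Are the acronyms specified in a specific section?", 1),
        Attribute("Can the students download the courseware material?", 3),
    ],
}
```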
Table 1 and Table 2 show the description of the defined categories and subcategories for
both e-learnings and simulators.

Table 1: E-learning categories’ and subcategories’ description

Category: Content
- Customization: refers to the delivery formats, PDFs, consistent look and feel, and technical
  writing. It evaluates: text, tables, multimedia.
- Usability: refers to the elements the end user has to better manipulate the software. It
  evaluates: links, style sheets, layout, navigational controls and search mechanisms.
- Measurable objectives: refers to the way the training objectives and the target audience are
  defined.
- Logical structure: refers to the way the training is structured, that is, whether it is
  clearly divided into modules, the time stipulated for each of the sub-modules, and the
  activities that need to be done in each module.
- Training strategies: refers to the strategies used in the training: mentoring, forums, work
  groups, collaborative learning, case studies, discussions, conferences, self-training.
- Feedback: refers to how quick and effective the feedback given to the student is, and
  evaluates the training's capacity to obtain the student's feedback.
- Progress evaluation: refers to Kirkpatrick's levels of evaluation [20]. In this case we only
  assess the first two levels, as the following ones are meant to be evaluated within the
  student's work environment.
- Motivation: refers to the way the training keeps the student motivated to finish it. It
  evaluates both synchronous and asynchronous communication.
- Increasing difficulty: refers to the ability of the training to increase the difficulty as
  the student advances in the knowledge acquired.

Category: LMS
- Refers to the way the training manages learning and competencies. It evaluates: standards,
  module configuration, hardware, registration procedures, reports, student performance.

Category: LCMS
- Refers to the way the training is presented, that is: formats (e.g. HTML, Java, Flash, PDF)
  and types of multimedia (e.g. embedded multimedia, animations).

4. Experimentation
The purpose of this step is to define how the method calculates the final evaluation result
based on the prioritized categories' attributes. After the weights were assigned, predefined
numerical values for assessing each attribute were defined, as follows:
• -1 = the criterion does not apply to the software being evaluated.
• 0 = the software does not comply with the attribute at all.
• 1 = the software complies with the attribute but does not fully satisfy the criterion.
• 2 = the software complies with the attribute and fully satisfies the criterion.
• 3 = the software complies with the attribute and exceeds the criterion.

The evaluator is asked to assign a predefined numerical value to each attribute; the tool then
automatically calculates the total value for each attribute by multiplying the relative weight
by the assigned value:
Total value per attribute = Relative weight * assigned value
Once all predefined numerical values have been assigned, the total values are summed:
TOTAL = ∑ Total value per attribute
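A minimal sketch of this calculation, assuming attributes are represented as (relative weight,
assigned value) pairs; the representation is illustrative, not the actual tool's:

```python
# Scoring step: weight * assigned value per attribute, summed over the
# evaluation. Attributes marked -1 (does not apply) are excluded.

def total_value(weight: int, assigned: int) -> int:
    """Total value per attribute = relative weight * assigned value."""
    return weight * assigned

def evaluation_total(scores: list[tuple[int, int]]) -> int:
    """TOTAL = sum of total values, skipping attributes marked -1."""
    return sum(total_value(w, v) for w, v in scores if v != -1)

# Hypothetical evaluation of three attributes (weight, assigned value):
print(evaluation_total([(9, 2), (3, 1), (1, -1)]))  # -> 21
```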
Table 3 shows a screenshot of the defined attributes grouped by subcategory and the relative
weight assigned to each of the attributes.

Table 2: Simulators’ categories’ and subcategories’ description

Category: Experimentation
- Complexity: refers to the elements the training has in order to make the learning more
  suitable to the specific needs of a student. An example of such configuration is an interface
  where all essential parameters can be modified according to the previous knowledge and
  experience the student possesses.
- Feedback: same as the e-learning Feedback subcategory; the difference is that the feedback
  obtained is focused on the simulator's point of view.

Category: Real world
- Content and knowledge: refers to the information the simulator gives to the student in order
  to help him make the right decision.
- Realism: refers to the real-world elements to be represented, for example: employees, plans,
  templates, customers.

Category: Misc
- Refers to issues related to trainings, installability, documentation, and the simulator's
  response speed.

5. Demonstration and Implementation


In order to validate the effectiveness of the assessment method, it was decided to execute it
on available educational software as a pilot test – in this case SimSE [26]. SimSE was selected
because it was awarded the 2009 Premier Award for Excellence in Engineering Education
Courseware [31]. SimSE allows students to practice a "virtual" software engineering process
(or sub-process) in a fully graphical, interactive, and fun setting in which direct, graphical
feedback enables them to learn the complex cause and effect relationships underlying the
processes of software engineering [25, 26, 27, 28, 29, 30]. SimSE presents several development
models as alternatives; the cascade [32] and incremental [33] models were selected for the
purpose of this experimentation. Also, to better evaluate the method in terms of
reproducibility and repeatability, five investigators ran the assessment separately and their
evaluations were then analyzed. On average, the investigators took 1.5 hours to perform the
evaluation. As a result, a lack of instructions on how to run the evaluation was observed; in
order to improve the method, a detailed instruction list was developed and added to it.
The method presents the evaluation results in two different ways. The first one is a detailed
numerical list that distinguishes the total available criteria, the criteria that apply to the
software being evaluated, and the scores obtained in each category. Table 4 is a screenshot of
the detailed numerical information.

Table 3: Method’s attributes

Customization (delivery format, PDFs, appearance, technical writing). Attributes and weights:
- Does the software provide an equivalent text for each non-text element? This includes:
  images, graphical text representations (including symbols), map regions, animations, audio,
  video. (Weight: 9)
- Is all the color information also provided as non-color information? (Weight: 9)
- Are natural language changes clearly specified (Example: match case)? (Weight: 9)
- Are the acronyms specified in a specific section? (Weight: 1)
- Can the students download the courseware material? (Weight: 3)
- Are the table headers and rows clearly identified? (Weight: 9)
- Is a table summary provided? (Weight: 9)
- Is there an auditory description of how important the information in a multimedia
  presentation is? (Weight: 3)
- For multimedia presentations based on time (Example: movie, animation), are the equivalent
  alternatives synchronized with the presentation? (Weight: 9)
- Is the option to turn the audio on and off provided? (Weight: 1)
- Can the audio be configured? (Weight: 3)
- Can the video be configured? (Weight: 3)
- Can the audio and the video be repeated if the student wants? (Weight: 3)

As shown in the table below, the score obtained in the evaluation is compared to the maximum
score that can be obtained, taking into account two different levels of evaluation. Note that
these columns show the maximum possible score given the criteria defined in the method:
1. “Maximum possible score (value 3)”, 4th column: when the software exceeds the criteria
(predefined numerical value 3, as described in section 4, Experimentation).
2. “Maximum possible score (value 2)”, 6th column: when the software fully satisfies the
criteria (predefined numerical value 2, as described in section 4, Experimentation).
The score obtained is also compared to the maximum possible score taking into account only
those attributes selected by the evaluator, that is, all attributes that were not assigned the
-1 predefined numerical value during the evaluation (as described in section 4,
Experimentation).
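These reference levels can be sketched with the same illustrative (weight, assigned value)
representation used above; the function names are assumptions, not the tool's actual API:

```python
# Reference levels against which the obtained score is compared.

def max_possible(weights: list[int], level: int) -> int:
    """Maximum possible score if every attribute were rated `level`
    (2 = fully satisfies, 3 = exceeds the criteria)."""
    return sum(w * level for w in weights)

def max_possible_applicable(scores: list[tuple[int, int]], level: int) -> int:
    """Same, but counting only attributes the evaluator did not mark -1."""
    return sum(w * level for w, v in scores if v != -1)

scores = [(9, 2), (3, 1), (1, -1)]
print(max_possible([w for w, _ in scores], 3))   # -> 39 (all attributes)
print(max_possible_applicable(scores, 2))        # -> 24 (applicable only)
```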
The second way of visualizing the evaluation information is a quick view through a graphical
diagram. This graphical representation allows the evaluator to rapidly compare the software
being evaluated against the two defined levels. Figure 1 shows the comparison between the score
obtained in the evaluation and the maximum possible scores for both value 2 (satisfies the
criteria) and value 3 (exceeds the criteria), taking into account only those attributes that
applied to the evaluation.
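As an illustration, a Figure-1-style comparison can be drawn from the overall totals reported
in Table 4 (applicable attributes only); the layout is a sketch, not the method's actual
diagram:

```python
# Sketch of the score-vs-maximum comparison chart, using the TOTALS row
# of Table 4 (only attributes that apply to the evaluation).
import matplotlib.pyplot as plt

labels = ["Score obtained",
          "Max possible (value 2)",
          "Max possible (value 3)"]
values = [473, 747, 1074]

plt.bar(labels, values)
plt.ylabel("Weighted score")
plt.title("Evaluation score vs. maximum possible scores")
plt.tight_layout()
plt.show()
```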

Table 4: Detailed numerical information example

Categories             #Criteria  #Criteria  Maximum    Maximum        Maximum    Maximum        Score
                                  (apply)    possible   possible       possible   possible
                                             score      score (apply)  score      score (apply)
                                             (value 3)  (value 3)      (value 2)  (value 2)
Customization          11         10         195        159            130        119            64
Usability              20         18         216        180            144        138            117
Measurable objectives  3          3          57         57             38         38             28
Logical structure      2          2          36         36             24         24             12
Training strategies    6          6          54         54             36         36             18
Feedback               4          4          72         72             48         48             30
TOTALS                 99         81         1425       1074           950        747            473

The method also provides a second representation of the results obtained, which highlights the
strengths and weaknesses of the software by comparing each one of the identified categories.
Figure 2 shows, for each category, the comparison between the scores obtained in the evaluation
and the maximum possible scores for both value 2 (satisfies the criteria) and value 3 (exceeds
the criteria), taking into account only those attributes that applied to the evaluation.

Figure 1: Score comparison


As mentioned before, the method allows the evaluator to customize the evaluation. That is, the
evaluator can assign the -1 value either to a particular attribute or to a whole category. This
value excludes the selected attribute or category from the final calculation and from the
evaluation, giving the evaluator full flexibility to assess only one particular aspect of the
software.

Figure 2: Evaluated criteria by category

6. Discussion and Conclusions


The proposed method presents an approach towards assessing educational software through the
evaluation of predefined attributes. The definition of the attributes was based on the key
characteristics and components of e-learnings and simulators. In order to prioritize the
attributes, they were weighted on a logarithmic scale. The evaluator performs the evaluation by
first selecting which attributes and categories will be considered (method customization) and
then assigning a predefined numerical value to each attribute; the method then automatically
calculates the final score.
Based on the attributes considered for the evaluation, the method provides quantitative
information on the quality of the software. The results are shown in two different ways: 1)
numerically by showing the comparison between the score obtained and the maximum possible
scores for both values 2 and 3; and 2) graphically by providing feedback on the strengths and
weaknesses by attributes’ category.
In order to decide whether educational software is effective or not, its final score must be
greater than or equal to the maximum possible score for value 2. If the evaluator is looking
for software that exceeds the defined criteria, the software's final score must be equal to the
maximum possible score for value 3.
Since the method provides a quantitative evaluation, comparing different educational software
becomes easier and faster: all that needs to be done is to compare the numerical scores
obtained by each and see which one is higher. This comparison is only meaningful if the same
criteria (list of attributes/categories) are selected for both evaluations.
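For illustration, applying this rule to the pilot totals reported in Table 4 (applicable
attributes only: 473 obtained against a value-2 maximum of 747):

```python
# Effectiveness decision rule applied to the Table 4 totals.

def effective(score: int, max_value2: int) -> bool:
    """Effective if the final score reaches the value-2 maximum."""
    return score >= max_value2

print(effective(473, 747))  # -> False: below the value-2 threshold
```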

7. Acknowledgments
The present work was developed as part of a research project at the Software Engineering
and Quality research laboratory (LIDICALSO) [34] at the Universidad Tecnológica Nacional
Facultad Regional Córdoba. We would like to thank: Álvaro Ruiz de Mendarozqueta, Claudio
Gonzalez, Paula Izaurralde and José D’Agostino for their contribution to this work.

8. References
[1] CMMI Product Team. “CMMI for Development, version 1.2”. Pittsburgh, Pennsylvania, USA: Software
Engineering Institute (SEI), August 2006. CMU/SEI-2006-TR-008.
[2] International Organization for Standardization. “ISO 9001:2000 Quality management systems --
Requirements”. s.l.: ISO copyright office, 2002.
[3] SCRUM Alliance. http://www.scrumalliance.org/

[4] J.L.Pfleeger; Ingeniería del Software: Teoría y Práctica; Buenos Aires: Prentice Hall, 2002.
[5] Rolland, C., Souveyet, C. and Moreno, M., 1995. An Approach for Defining Ways-of- Working,
Information Systems, 20(4), 337-359
[6] Laycock, Martyn. Collaborating to compete: achieving effective knowledge sharing in organizations. The
Learning Organization: Emerald Group Publishing Limited, 2005. DOI:10.1108/09696470510626739.
[7] Miller, Jerry. The Internet's Impact On Business Relationships. Information Week. [Online] Sears,
Roebuck and Co, September 17, 2001.
http://www.informationweek.com/news/management/showArticle.jhtml?articleID=6506627.
[8] Rico, David F. ROI of Software Process Improvement (Foreword by Roger S. Pressman). s.l. : J. Ross
Publishing, Inc., January 2004. ISBN: 1-932159-24-X.
[9] Thomas McGibbon; Daniel Ferens; Robert L. Vienneau. A Business Case for Software Process
Improvement (2007 Update). s.l. : Measuring Return on Investment from Software Engineering and
Management, 2007. DACS Report Number 347616.
[10] Nien-Lin Hsueh, Wen-Hsiang Shen, Zhi-Wei Yang, Don-Lin Yang: Applying UML and software
simulation for process definition, verification, and validation. 897-911
[11] O.J. Dahl, E. W. Dijkstra, C. A. R. Hoare; Structured Programming; Academic Press; England; 1972.
[12] Ambler, Scott W. Agile Modeling (AM) Home Page Effective Practices for Modeling and Documentation
.Ambysoft Copyright 2001-2009 http://www.agilemodeling.com/
[13] Goldschneider, Bob. “e-learning Best Practices”. http://www.syberworks.com/articles/bestpractices.htm
[14] Kontis. “What is e-learning”. s.r.o. Web Page:
http://onlinelearning.kontis.net/uvod_coje.asp?menu=elearning&submenu=coje
[15] Thomas, Ruth. “What Are Simulations? – The JeLSIM Perspective”. JeLSIM. Web Page:
http://www.jelsim.org/resources/whataresimulations.pdf
[16] Rubio Diego, Izaurralde Paula, Andriano Natalia, Silclir Mauricio. “Un entorno de aprendizaje activo de
ingeniería de software basado en la integración Universidad-Industria” (An active software engineering
learning environment based on University-Industry integration). Universidad Tecnológica Nacional
– Facultad Regional Córdoba. 2010
[17] Berman, Pamela. “E-learning concepts and techniques - instructional strategies for e-learnings.” Institute
for Interactive Technologies, Bloomsburg University of Pennsylvania, USA. Web Page:
http://iit.bloomu.edu/Spring2006_eBook_files/chapter5.htm
[18] Oxford English Dictionary - http://oxforddictionaries.com/definition/simulator
[19] Advanced Distributed Learning. The Power of global collaboration. “SCORM. Advanced distributed
learning”. Web Page: http://www.adlnet.gov/Technologies/scorm/default.aspx
[20] Berman, Pamela. “E-learning concepts and techniques - e-learning evaluation”. Institute for Interactive
Technologies, Bloomsburg University of Pennsylvania, USA. Web Page:
http://iit.bloomu.edu/Spring2006_eBook_files/chapter9.htm
[21] W3C. “Checklist of Checkpoints for Web Content Accessibility Guidelines 1.0.”. Web Page:
http://www.w3.org/TR/WCAG10/full-checklist.html
[22] Ehlers, Ulf-D.. “Quality in e-Learning from a Learner's Perspective.” University of Duisburg-Essen
Campus Essen; Universitaetsstr. 9; 45141 Essen; Germany. Web Page:
http://www.eurodl.org/materials/contrib/2004/Online_Master_COPs.html
[23] SPI – Sociedade Portuguesa de Inovação. “Best Practices in e-Learning Study. INNOELEARNING –
Fostering Innovative Self-Learning for Work in the EU Through Dissemination of Innovative Structures
and Applications Identified in the USA and Europe” Project IST – 2001 – 32633. Web Page:
http://www.spi.pt/innoelearning/results/best_practices_in_e-learning_study.pdf
[24] Aviation industry CBT Committee AICC. “AICC Guidelines and recommendations version 1.5 -
TRAINING DEVELOPMENT CHECKLIST.” Web Page: http://www.aicc.org/docs/AGRs/agr012v15.pdf
[25] Oh Navarro, Emily; van der Hoek, André. "Design and Evaluation of an Education Software Process
Simulation Environment and Associated Model," In Proceedings of the Eighteenth Conference on Software
Engineering Education and Training. Ottawa, Canada: IEEE, 2005. Web Page:
http://www.ics.uci.edu/~emilyo/papers/CSEET2005-2.pdf
[26] Oh Navarro, Emily. “An Educational, Game-Based Software Engineering Simulation Environment”.
University of California, Irvine. Copyright ©2009. Web Page: http://www.ics.uci.edu/~emilyo/SimSE/
[27] Oh Navarro, Emily; van der Hoek, André. “Comprehensive Evaluation of an Educational Software
Engineering Simulation Environment.” Donald Bren School of Information and Computer Sciences,
University of California, Irvine. Web Page: http://www.ics.uci.edu/~emilyo/papers/CSEET2007.pdf
[28] Oh Navarro, Emily. “SimSE: A Software Engineering Simulation Environment for Software
Process Education”. Dissertation, University of California, Irvine. Web Page:
http://www.ics.uci.edu/~emilyo/papers/Dissertation.pdf
[29] Oh Navarro, Emily; van der Hoek, André. “SimSE: An Interactive Simulation Game for
Software Engineering Education”. School of Information and Computer Science,
University of California, Irvine. Web Page: http://www.ics.uci.edu/~emilyo/papers/CATE2004.pdf
[30] Oh Navarro, Emily; van der Hoek, André. “Software Process Modeling for an Educational
Software Engineering Simulation Game”. Department of Informatics Donald Bren School of
Information and Computer Sciences University of California, Irvine. Web Page:
http://www.ics.uci.edu/~emilyo/papers/SPIP2004.pdf
[31] Engineering Pathway. 2009 Premier Courseware Award Winner SimSE.
http://www.k-grayengineeringeducation.com/blog/index.php/2009/10/22/classroom-presenter-is-the-2009-premier-courseware-award-winner/
[32] “SIMSE cascade model”. Web Page:
http://www.ics.uci.edu/~emilyo/SimSE/downloads/WaterfallModel-v-11.zip
[33] “SIMSE incremental model”. Web Page:
http://www.ics.uci.edu/~emilyo/SimSE/downloads/IncrementalModel-v-3.zip
[34] LIDICALSO. Laboratorio de Investigación en Ingeniería y Calidad de Software. Web Page:
http://www.institucional.frc.utn.edu.ar/sistemas/gidicalso/
[35] Jennifer Gremba and Chuck Myers. “The IDEAL(SM) Model: A Practical Guide for Process Improvement”.
Pittsburgh, Pennsylvania, USA: Software Engineering Institute (SEI), Bridge, issue three, 1997
[36] Universidad Tecnológica Nacional-Facultad Regional Córdoba. [Online] http://www.frc.utn.edu.ar/
[37] Roger C. Schank. Designing World-Class E-Learning: How IBM, GE, Harvard Business School and
Columbia University Are Succeeding at e-Learning. s.l.: McGraw-Hill, 2002. ISBN: 0-07-137772-7.
[38] Teixeira, Joe. “The difference between the Linear and the Logarithmic Scales” -
http://www.morevisibility.com/analyticsblog/the-difference-between-the-linear-and-the-logarithmic-
scales.html
[39] Clark N. Quinn, Director, Quinnovation. “Seven Steps to Better E-learning”
http://www.elearnmag.org/subpage.cfm?section=best_practices&article=35-1
[40] Brandon, Bill. “Leading Through Design: Five Guidelines for e-Learning Success”
http://www.learningsolutionsmag.com/articles/277/leading-through-design-five-guidelines-for-e-learning-
success
[41] Dr. Ruth Clark. “Six Principles of Effective e-Learning: What Works and Why”. September 10, 2002.
http://www.elearningguild.com/pdf/2/091002DES-H.pdf
[42] JISC. “Effective Practice with e-Learning A good practice guide in designing for learning”.
http://www.jisc.ac.uk/media/documents/publications/effectivepracticeelearning.pdf
[43] Rolf Ahdell, Guttorm Andresen. “Games and simulations in workplace eLearning: How to align
eLearning content with learner needs”. Norwegian University of Science and Technology,
Department of Industrial Economics and Technology Management.
http://www.twitchspeed.com/site/download/thesis_final.pdf
[44] Sara de Freitas - JISC. “Learning in Immersive Worlds - A review of game-based learning”.
http://www.jisc.ac.uk/media/documents/programmes/elearninginnovation/gamingreport_v3.pdf
[45] “Game-based Learning - Learning that is fun and engaging” - Copyright © 2009 - 2010 Upside Learning
Solutions Pvt. Ltd - http://www.upsidelearning.com/game-based-learning.asp
[46] “Human Capital Management e-learning - Simulation-Based E-Learning from Percepsys / Second
Generation 3D Simulation” http://www.percepsys.com/images/Percepsys_SIMSTUDIO.pdf
