Dokumen GGP Aosl 9223 Final v3
This GGP outlines sets of characteristics that describe the minimum levels of
acceptable practice, divided into several areas.
Accordingly, the GGP covers different levels of standards leading to the award of
individual qualifications prescribed in the MQF 2nd Edition (2018), ranging from the
level of certificate (Level 3, MQF) to the level of Doctoral Degree (Level 8, MQF).
This GGP was developed by the MQA in collaboration with the Ministry of Higher
Education. It represents the significant contribution of the panel members from both
public and private HEPs, and in consultation with various HEPs, relevant government
and statutory agencies, industries, alumni, and students through stakeholder
workshops and online feedback. The GGP reflects national and international best
practices to ensure that programme development by HEPs in Malaysia is on par with
that in other countries.
This GGP encourages diversity and allows programme providers to be innovative in
creating their niches. HEPs should ensure that they produce graduates that meet the
current and future needs of the industry, and at the same time, fulfil their obligations to
society. Among others, this document includes statements on types of assessment in
various contexts, which are intended to provide clarity and are not meant to be
adopted verbatim.
The MQA would like to express its appreciation to all the panel members and various
stakeholders for their valuable input, as well as to all MQA officers who contributed to
the development of this GGP for Assessment of Student Learning. It is hoped that this
GGP is beneficial to different stakeholders for the development of the competencies
required of our students, for both job and higher education prospects.
TABLE OF CONTENTS
List of Tables
List of Figures
Glossary
Abbreviations
Part 5 Communicating Assessment and Outcomes
5.1 Continuous Quality Improvements (CQI)
5.2 Review of Assessment and Development
5.3 Conclusion
References
Appendices
Appendix 1: Example of SOLO Taxonomy Aligned Assessment Plans
Appendix 2: Example of the Forms of Online Assessment (Multiple Choice Questions and Fill in the Blanks)
Appendix 3: Example of the Forms of Online Assessment (True/False and Essay Questions)
Appendix 4: Online Engagement, Platform and Assessment (Asynchronous)
Appendix 5: Online Engagement, Platform and Assessment for Continuous and Authentic Assessment
Appendix 6: Data Encryption During Transmission of Examination
Appendix 7: Conditions and Requirements for Online Assessment
Appendix 8: Example of Approximation in Assessment Tasks
Appendix 9: Example of Technical Time Allowance for Submission in Online Exams
Appendix 10: Example for Supervisor’s Key Areas of Assessment
Appendix 11: Example for Postgraduate Level of Assessment
Appendix 12: Course Assessment Plan, Instruction and Rubric
Appendix 13: Consideration for Exam on Demand
Appendix 14: Example of Statements
LIST OF TABLES
LIST OF FIGURES
Figure 1: The Seven Areas of the COPPA 2nd edition (MQA, 2017)
Figure 2: Assessment of Learning & Teaching Activities
Figure 3: Assessment of Students’ Learning and the Structure of the Guidelines
Figure 4: The OBE's Principles
Figure 5: Constructive Alignment
Figure 6: The Purpose of TOS
Figure 7: Elaboration of Revised Bloom's Taxonomy for the Cognitive Domain
Figure 8: Programme for the International Assessment of Adult Competencies (PIAAC) in Numeracy Competency
Figure 9: Formative and Summative Assessment
Figure 10: Aligning Learning Outcomes, Learning and Teaching Activities and Assessment
Figure 11: Review of Assessment Methodologies & Best Practices
Figure 12: Preventive Methods to Address Integrity Issues
Figure 13: Self-Instructional Materials (SIM) Features
Figure 14: Future-Ready Framework (2020)
Figure 15: Experiential Learning and Competency-Based Education Landscape
Figure 16: The Research Under Examinations Characteristics
Figure 17: Methods of Assessments
Figure 18: Three (3) Principles of Assessment that can be Related to APEL
Figure 19: The Assessor’s Characteristics
Figure 20: Overview of the CQI Process
GLOSSARY
1. Alternative Assessment: An assessment other than conventional paper-and-pencil
tests or examinations. Alternative assessment has elements of being holistic,
authentic, collaborative, and related to the real world, and has the potential to
provide meaningful and enduring ways of learning.
Note: Refer to this link for relevant publications related to JPT, MoHE initiatives:
https://jpt.mohe.gov.my/portal/index.php/ms/penerbitan?start=10. Related to the
GGP: AoSL are the Ebook on Alternative Assessment in Higher Education and the
Ebook on NOBLe.
4. Assessment Data: Assessment data can be obtained from directly examining student work
to assess the achievement of learning outcomes or can be based on
data from which one can make inferences about learning.
5. Assess Forward: This concept is used in the document to indicate the opposite of the
design-backwards concept when designing a curriculum.
6. Assessment Method: Assessment methods define the nature of the assessor's actions and
include examining, interviewing, and testing in a structured or self-
paced mode.
The assessment methods are simply the ways and strategies to collect
data. It can be classified into four categories:
provide highly accurate (reliable) data related to learning outcomes,
attainment, and achievement.
9. Assessment Task (AT): An assessment task is a specific work (performance or product) given
by educators to students, allowing them to show how much and how
well they have mastered the learning outcomes. The task is given using
an appropriate and aligned assessment instrument.
10. Classroom Assessment: Classroom assessment is a form of continuous evidence collection that
is usually done during face-to-face learning activities.
11. Competency: Competency is an underlying characteristic of a person/performer
regarding his/her knowledge, skills, and abilities that enables him/her to
successfully and meaningfully complete a given task or role.
13. Continuous Assessment: Data collection processes are carried out continuously
throughout a course, module, or programme to gather evidence of learning in order to
improve learning, modify teaching, and adjust the curriculum design.
It also includes data gathering that is used to assess how well courses
offered by the programme support attainment of the Programme's
learning outcomes.
14. Continuous Quality Improvement: Continuous Quality Improvement (CQI) in
assessment establishes monitoring metrics to evaluate improvement efforts and
outcomes for students’ performance routinely and on an ongoing basis.
For Continual Quality Improvement, please refer to GGP: PDD and COPPA (Area 7).
15. Course Learning Outcomes (CLOs): The CLOs are the intended or desired
learning gains in terms of
i. Declarative knowledge (factual, conceptual, procedural),
ii. Functional knowledge (knowledge transfer),
iii. Metacognitive knowledge,
iv. Cognitive skills,
v. Practical skills,
vi. Habits of mind,
vii. Performance, and
viii. Ways to respond to events and people as a result of the
learning experiences in the course/module.
16. Coursework Assessment: The conventional continual content-based data
collection process and analysis, such as testing, writing, presenting, or performing,
used to evaluate students' performance and how well they have learned the content;
this can also be used as part of the learning outcomes attainment.
18. Design Backward: An approach to curriculum design that begins with the goals in mind.
The goals begin with crafting the programme aim (purpose and
justification to offer the programme and the adopted philosophy) that
support the attainment of the country’s and university's mission.
Once this aim is agreed upon, programme designers then craft the
programme educational objectives (PEO) that will be used to support
the attainment of the programme aim.
This is followed by deciding on the programme learning outcomes
(PLOs), the performance criteria, the performance and outcome
indicators, and the target intended for each PLO.
19. Direct Evidence: Evidence is collected and analysed to demonstrate that actual learning
has taken place.
20. Evaluation: Evaluation is the process of using the evidence collected through
assessment to make a value judgement on students’ performance and
programme performance relative to the benchmark standards specified
by the learning outcomes’ performance criteria and performance target.
21. Formative Assessment: Learning activities carried out to determine the level of
achievement based on the learning outcomes.
In this sense, formative assessment informs both educators and
students about student understanding at a point when timely
adjustments can be made. These adjustments help to ensure students
achieve the targeted learning outcomes within a set time frame.
22. Functional Graduates: Graduates who are competent and are able to persistently, responsibly,
and ethically transfer their knowledge, understanding, skills, and
abilities to identify and solve ill-defined, complex, and difficult problems
in their personal, social and professional journeys.
23. Grading Criteria: This concept is used when making judgements on the quality of the
performance of assessment tasks or learning outcomes. Grades are
usually based on either indirect grading of learning outcomes (analytical
judgement of the assessment task aligned with the learning outcomes)
or direct grading of the learning outcomes (holistic judgement).
Performance quality for each grade is clearly described in the criteria.
24. Graduate Attributes: Graduate attributes are the learning traits and characteristics that are
relevant and appropriate to the graduate's personal, social, and
professional role in life.
25. High-Stake Assessment: A high-stake assessment is a conventional method of
assessment that has significant consequences or impact on individuals based on
their performance. It is usually conducted in a controlled environment such as an
examination hall, and typically uses written examinations that have undergone
rigorous development, validation, and security measures to ensure fairness and
accuracy.
26. Holistic Judgement: Judgement that defines the performance quality and standard by
combining all the performances solicited by assessment tasks.
27. Indirect Evidence: Indirect evidence is evidence or data collected for the purpose of
seeking students’ perceptions of their learning and their learning
experiences.
28. Learning Taxonomies: A classification system dealing with varying degrees of cognitive
complexity, skill complexity, and the complexity of the value system
adopted when acting or responding to people, events, and
environments.
29. Lesson Learning Outcomes (LLOs): These are the outcomes to be achieved upon
completion of a lesson. The lesson outcomes are systematic formative measures for
developing students’ attainment of the CLOs.
33. Outcomes-Based Assessment (OBA): An integrated, valid, reliable, fair,
continuous (rather than continual and judgmental testing), and aligned approach to
collecting evidence of students' learning for the purpose of improvement, focusing
more on formative assessment and providing timely feedback.
34. Outcomes-Based Education (OBE): An approach to education that begins with a
clear focus on high-quality, culminating demonstrations of significant learning in
context, and organises everything in an educational system around what is essential
for all students to be able to do successfully at the end of their learning
experiences.
This means starting with a clear picture of what is important for students
to be able to do, then organising the curriculum, instruction, and
assessment to make sure this learning ultimately happens to all
students.
36. Peer Assessment: Peer assessment involves students being responsible for
making assessment decisions and judgements on other students’ work.
As with any responsibility, the skill of peer assessment should be developed
incrementally (step-by-step) by the educator.
39. Performance Target: Specifies the threshold score and the threshold frequency
that indicate the effectiveness of a programme.
For example, an indicator that the programme is effective in achieving programme
learning outcomes related to acquiring and applying knowledge and understanding is
a target that 60% of the students score 70 or more in a programme exit examination.
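As a purely illustrative sketch of how such a target might be checked (the function name, score list, and data are invented; only the 70-mark threshold and 60% frequency come from the example above):

```python
# Hypothetical illustration of a performance-target check of the form
# "at least 60% of students score 70 or more". The score list is invented.

def target_met(scores, threshold_score=70, threshold_frequency=0.60):
    """True if the proportion of students at or above the threshold score
    reaches the threshold frequency."""
    if not scores:
        return False
    proportion = sum(s >= threshold_score for s in scores) / len(scores)
    return proportion >= threshold_frequency

exit_exam_scores = [82, 71, 64, 90, 55, 77, 70, 68, 73, 88]  # invented data
print(target_met(exit_exam_scores))  # 7 of 10 students scored 70+, so True
```

The two thresholds are deliberately separate parameters, since a programme may tune either the benchmark score or the required proportion of students.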
41. Rubrics: A scoring/grading tool that contains a list of criteria and benchmark
standards and is used to score or grade assessment tasks or learning
outcomes.
Students benefit most from the use of logs, diaries, and digital recording devices to
record their thoughts on the quality of their work so that they can improve
themselves.
43. Soft Skills: The generic skills or attributes that employers value and that students
require in their professional and societal engagement.
Examples include the ability to communicate, manage information, manage time,
manage resources, engage harmoniously with others, provide leadership, and be
responsible and active team members.
45. Summative Assessment: Summative assessments are used to evaluate student
learning, skill acquisition, and academic achievement at the conclusion of a defined
instructional period, typically at the end of a project, unit, course, semester, or
programme.
The results are usually defining; for instance, they can determine
whether a student passes the course, gets a promotion, or secures
admission.
46. Test: A test is a sample of items or constructs that measure performance in
a specific domain.
ABBREVIATIONS
AT Assessment Task
APEL Accreditation of Prior Experiential Learning
APEL.A Accreditation of Prior Experiential Learning for Access
APEL.C Accreditation of Prior Experiential Learning for Credit Award
APEL.M Accreditation of Prior Experiential Learning for Micro-
credentials
APEL.Q Accreditation of Prior Experiential Learning for Award of
Academic Qualifications
CA Constructive Alignment
CLO Course Learning Outcomes
COPIA Code of Practice for Institutional Audit
COPPA Code of Practice for Programme Accreditation
COPPA:ODL Code of Practice for Programme Accreditation: Open and
Distance Learning
COPTPA Code of Practice for TVET Programme Accreditation
CPD Continuing Professional Development
CQI Continual Quality Improvement
ELT Effective Learning Time
EXCEL Experiential Learning and Competency-Based Education
Landscape
GGP: PDD Guidelines for Good Practices: Curriculum Design and Delivery
HEIPs High Impact Educational Practices
LMS Learning Management System
LLO Lesson Learning Outcomes
LOC Learning Outcomes Cluster
LOD Learning Outcomes Domain
MMS Malaysian Micro-credential Statement
MOHE Ministry of Higher Education
MOOC Massive Open Online Courses
MQF Malaysian Qualifications Framework
NOBLe National Outcomes Based Learning
OBA Outcomes-Based Assessment
OBE Outcomes Based Education
OSCE Objective Structured Clinical Examination
ODL Open and Distance Learning
PBL Problem-Based Learning
PEO Programme Educational Objective
PIAAC Programme for the International Assessment of Adult Competencies
PLO Programme Learning Outcomes
SIM Self-instructional Material
SLT Student Learning Time
SOLO Structure of Observed Learning Outcomes
TnL Teaching and Learning
TOS Table of Specifications
TVET Technical and Vocational Education and Training
SDG 4 UNESCO’s Sustainable Development Goal 4
WBL Work-based Learning
INTRODUCTION
Vision
Ensuring that HEPs are well informed with a basic understanding of assessing
students’ learning, which is key to confirming that learning has taken place. It is
envisaged that a clear understanding of the basic principles of assessing students will
help HEPs provide quality education in line with the aspirations of the nation.
Mission
PART 1
THE OVERVIEW OF ASSESSMENT IN HIGHER EDUCATION
When numbers are assigned to measure learning during an assessment, the scores
can be used in evaluation to form a judgement on whether the learning that has taken
place is satisfactory against the standard set.
In facing the challenges of the volatile, uncertain, complex, and ambiguous world while
adapting to the constant changes of the industrial revolution, assessment in education
must also be revisited.
Figure 1: The Seven Areas of the COPPA 2nd edition (MQA, 2017)
COPPA is concerned with the practices of HEPs in curriculum design and delivery,
while COPIA is primarily concerned with institutional processes applied in curriculum
development and delivery.
The second edition of the COPPA, which also relates and is mapped to standards for
diverse programmes and themes, namely open and distance learning (COPPA-ODL)
as well as technical and vocational education and training (COPTPA), addresses
issues regarding the management of assessment in these diverse contexts.
In reviews for both programme accreditation and institutional audit, assessors are
concerned with the procedures and practices adopted by institutions in the areas
covered by the Codes, and with whether these match the provisions of the Codes.
HEPs are discouraged from simply copying the guidelines and
samples/examples given in the document or appendices. Instead, HEPs must
strive to understand and develop their curriculum design, delivery processes,
and assessment to best fit the needs and requirements of the HEP and its students.
The GGP: AoSL is premised on the principle that assessment goes hand in hand with
students' learning.
Furthermore, research (see, for example, Biggs, 2003) suggests that assessment
drives student learning and directly influences students' approaches to studying.
For example, if assessment tasks for a particular programme and course require
students to reproduce or regurgitate information, students will study only to reproduce
information.
Figure 2 shows the role of assessment in learning and teaching activities in the
attainment of outcomes. Since assessment is an integral part of the learning and
teaching process, the assessment methods or outcome indicators employed must be
constructively aligned with the Programme Learning Outcomes (PLOs) and Course
Learning Outcomes (CLOs).
Ensuring this alignment will encourage students to take learning approaches that will
result in the achievement of the CLOs and hence assist in the attainment of the PLOs.
This document covers the following areas, which are divided into parts:
Part 4, which is an added component to the GGP: AoSL, ascertains various possible
contexts for assessment.
This is followed by Part 5 as the concluding part, which relates to how information
about students’ assessment and their learning can be communicated to them and other
various stakeholders to ensure continuous quality improvement can be made for
further development.
The glossary provides not only the basic terms used in the document but also a brief
explanation of these concepts.
Figure 3 shows the relationship between the assessment of students’ learning and the
attainment of CLOs and PLOs as the means to support the attainment of the
Programme Educational Objectives (PEO).
It indicates the need to align assessment methods with the attainment of the learning
outcomes (LO) and the need for a systematic student assessment process within the
institution in diverse contexts and communication for improvement.
The discussion provided in the five parts of this document addresses Area 2 of the
Codes of Practice, Assessment of Student Learning, as illustrated in Figure 3.
Figure 3: Assessment of Students’ Learning and the Structure of the Guidelines
PART 2
ASSESSMENT OF STUDENTS’ LEARNING
OBA is a systematic assessment approach to find out how well students attain the
intended CLOs and PLOs. The assessment of students’ learning involves collecting
evidence of outcome attainment both at the course and programme level. Evidence
gathered through OBA is used to judge how well the criteria specified by the LOs are
attained. The attainment of the CLOs is used to infer the attainment of the specific PLO
in the programme. Hence, students should be informed of their PLO attainment in each
semester to allow them to work on areas that need improvement.
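The inference of PLO attainment from CLO attainment described above can be sketched as a weighted aggregation. This is a minimal illustration only: the CLO-to-PLO mapping, weightages, and attainment figures are invented, and HEPs define their own aggregation rules.

```python
# Hypothetical sketch of inferring PLO attainment from CLO attainment.
# Each CLO's attainment contributes to the PLO it maps to, weighted by the
# CLO's assessment weightage. All names and numbers are invented.

clo_attainment = {"CLO1": 78.0, "CLO2": 65.0, "CLO3": 84.0}    # % attainment
clo_to_plo = {"CLO1": "PLO2", "CLO2": "PLO7", "CLO3": "PLO2"}  # invented map
clo_weight = {"CLO1": 50, "CLO2": 30, "CLO3": 20}              # weightage (%)

def plo_attainment(attain, mapping, weights):
    """Weighted average of the CLO attainments mapped to each PLO."""
    totals, weight_sums = {}, {}
    for clo, plo in mapping.items():
        totals[plo] = totals.get(plo, 0.0) + attain[clo] * weights[clo]
        weight_sums[plo] = weight_sums.get(plo, 0) + weights[clo]
    return {plo: totals[plo] / weight_sums[plo] for plo in totals}

print(plo_attainment(clo_attainment, clo_to_plo, clo_weight))
```

Reporting a figure like this to students each semester is one way to show them which PLOs still need work.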
Assessment is the process of finding evidence that the LOs, which are the minimum
performance or competence level, have been achieved when students have
successfully completed a certain course or graduated from a particular programme
offered by the HEP.
Among others, assessment is carried out to:
i. promote learning;
ii. measure performance by awarding grades that indicate whether and how
well a particular student has attained the stated LOs;
iii. determine whether a particular student is sufficiently well prepared in a
subject area to proceed to the next level of instruction;
iv. provide feedback to students that indicates levels of attainment and
diagnoses misunderstandings and learning difficulties; and
v. provide feedback to teaching staff to identify and diagnose ineffective
teaching methods/techniques.
It comprises the course assessment plan, its course execution, the required
report documentation, analysis of students’ performance, and intervention as
required.
2.1.3 This will also support CQI reporting for each course, thus embracing
the OBE implementation. Please refer to the MQA Guidelines to Good
Practices: Monitoring, Reviewing, and Continually Improving
Institutional Quality.
2.1.4 The implementation of OBE requires active learning and student-
centred learning approaches.
2.1.5 Active learning refers to a broad range of teaching strategies that
engage students as active participants in their learning during class
time with their instructors.
2.1.6 Strategies involve some students working together during class and
may involve individual work and/or reflection.
2.1.7 Outside of class time, students continue to be actively engaged in
pursuing knowledge and skills.
2.1.8 To ensure the attainment of CLOs, proper assessment planning must
be done and meet the principles of constructive alignment.
Figure 5: Constructive Alignment
2.2.11 Programme learning outcomes (PLO) state what students know and
are able to do upon completion of the programme and are derived from
MQF 2.0 (2018) LO domains, while outcome indicators are
assessment tools used to collect evidence of students’ performance
and attainment after pursuing a study programme.
Note: Information on assessment methods and SLT based on the course learning
outcomes can also be found in the HEP's course information document, Table
4 (refer to the template provided by MQA), in which HEPs provide a summary
of the course information. However, Table 1, Table 2 and the exemplar given
in Appendix 12 of this GGP serve merely as explanations of the various
components involved in designing a course assessment plan and determining
the weightage for assessment.
Table 1: Description of Course Assessment Plan
For each CLO, the plan records: (i) CLO statement, (ii) taxonomy domain, (iii) PLO,
(iv) topics/content, (v) contact hours, (vi) teaching and learning method,
(vii) assessment methods, and (viii) weightage.

Description:
i. The CLO statement, attributed in part towards the PLO; it contains the objective of
the course's learning outcomes with the appropriate level of verbs, which informs the
needed resources, manpower, reading materials, contact hours for teaching and
learning, and the manner of assessment.
ii. The cognitive/affective/psychomotor level.
iii. The PLO, identified based on the MQF 2.0 (2018), towards whose achievement
the CLO's objective contributes.
iv. The topic and content as reflected in the needed Body of Knowledge.
v. The contact hours needed to complete the CLO.
vi. The manner of knowledge/skill transfer (lectures/tutorials/workshops/lab work,
etc.) and the mode of delivery: conventional/online/hybrid.
vii. The assessment appropriate to measure the CLO, ascertaining evidence of
learning of the topic/content.
viii. Commonly measured as a percentage to ascertain the values needed to
complete the CLOs (please refer to the glossary).
Table 2: Example of ToS Aligned Course Assessment Plan
CLO 1: Discuss the concept, principles, issues and challenges in visual art
assessment and evaluation.
- Taxonomy domain: C5; PLO: 2; Topics/Content: list of topics related to the CLO;
Contact hours: 21
- Teaching and learning method: Interactive Lecture / Cooperative and Collaborative
Learning
- Assessment tasks: Case Study (20%), Final Examination (30%)
- Specific tasks related to MOHE/MQF 2.0 (2018): Case Study (20%): marks for the
case study deliberate and analyse the issues, trends and challenges related to visual
assessment and evaluation from pre-school to higher education. Final Examination
(30%): factual testing.

CLO 2: Evaluate strategy and approach in measuring learning outcomes through
assessment and evaluation.
- Taxonomy domain: C5; PLO: 7; Topics/Content: list of topics related to the CLO;
Contact hours: 12.6
- Teaching and learning method: Case Analysis
- Assessment tasks: Case Analysis (10%), Final Examination (20%)
- Specific tasks related to MOHE/MQF 2.0 (2018): Case Analysis (10%): marks for
the case analysis deliberate students' critical evaluation in assessing learning
outcomes through samples of quantitative and qualitative data according to
appropriate processes and techniques.

CLO 3: Integrate digital technologies and appropriate software for diagnostic,
formative and summative assessment, and evaluation.
- Taxonomy domain: A4, P1; PLO: 6; Topics/Content: list of topics related to the
CLO; Contact hours: 8.4
- Teaching and learning method: Independent Learning
- Assessment tasks: Portfolio (20%)
- Specific tasks related to MOHE/MQF 2.0 (2018): the use of an e-Portfolio as
evidence, focusing on the sub-attributes of new ideas, curation, articulation, and
tools.
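A course assessment plan such as the one in Table 2 can be recorded as simple data and sanity-checked, for example that the task weightages across the course total 100%. This is a hypothetical sketch; the record structure and field names are assumptions, not an MQA template.

```python
# Hypothetical sketch: the Table 2 plan as plain records, with a check
# that assessment-task weightages across the course total 100%.

plan = [
    {"clo": 1, "taxonomy": "C5", "plo": 2,
     "tasks": {"case study": 20, "final examination": 30}},
    {"clo": 2, "taxonomy": "C5", "plo": 7,
     "tasks": {"case analysis": 10, "final examination": 20}},
    {"clo": 3, "taxonomy": "A4, P1", "plo": 6,
     "tasks": {"e-portfolio": 20}},
]

total = sum(w for row in plan for w in row["tasks"].values())
assert total == 100, f"weightages sum to {total}%, expected 100%"

final_exam_share = sum(row["tasks"].get("final examination", 0) for row in plan)
print(f"total weightage: {total}%; final examination share: {final_exam_share}%")
```

A check like this makes it easy to see, for instance, how much of the grade rides on the final examination versus continuous assessment.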
2.5 TABLE OF SPECIFICATIONS
2.5.1 The Table of Specifications (ToS) is a tool/document used to ensure that
an examination or test measures the content it is intended to cover.
2.5.2 In planning the ToS for a test/examination, the CLOs to be assessed
through the test/examination and the topics covered are identified, as
indicated in the course's assessment plan.
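A ToS is, in essence, a grid of topics against cognitive levels showing how many items (or marks) each cell carries. The following is a minimal sketch; the topics, levels, and item counts are invented.

```python
# Hypothetical sketch: a Table of Specifications as a grid of item counts
# per topic and cognitive level, with row/column totals to check coverage.

tos = {
    "Topic 1": {"C2": 4, "C4": 2, "C5": 0},
    "Topic 2": {"C2": 2, "C4": 3, "C5": 1},
    "Topic 3": {"C2": 1, "C4": 2, "C5": 3},
}

items_per_topic = {topic: sum(cells.values()) for topic, cells in tos.items()}

items_per_level = {}
for cells in tos.values():
    for level, count in cells.items():
        items_per_level[level] = items_per_level.get(level, 0) + count

print(items_per_topic)  # items allocated to each topic
print(items_per_level)  # items allocated to each cognitive level
```

The row totals show whether each topic gets a share of items proportional to its teaching time, and the column totals show the balance across cognitive levels.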
2.6 ASSESSING LEARNING OUTCOMES
2.6.1 In OBE, a learning outcome contains a verb that signifies the domain
and level of the outcome, whether it is cognitive, affective, or
psychomotor.
2.6.2 The level of the verb is ascertained according to the taxonomy that is
used in the design of the course.
2.6.3 For example, in Table 2 (refer to 2.4), Bloom’s Taxonomy is commonly
used for the cognitive domain, Simpson’s for the psychomotor domain,
and Krathwohl’s for the affective domain. (Refer to Appendix 1)
2.6.4 HEPs may decide to use any of the taxonomies for cognitive, affective,
and psychomotor that carry consistency in the delivery. (refer to
Appendix 1 for an example of SOLO taxonomy aligned assessment
plan).
2.6.5 While written examination questions can be used as an assessment
method, they are not the only approach.
2.6.6 This is especially true for complex skills like those in the MQF 2.0 (2018)
Learning Outcomes clusters.
2.6.7 At the same time, to assess using alternative methods, the skills to be
assessed should be defined according to how they are used and applied
in the course.
2.6.8 To help course owners define the skills to be assessed, the scholarly
approach of referring to the research literature and the relevant
principles or theories are recommended.
2.6.9 The psychomotor domain (Simpson, 1972) includes physical
movement, coordination, and use of the motor-skill areas.
2.6.10 Development of these skills requires practice and is measured in terms
of speed, precision, distance, procedures, or techniques in execution.
2.6.11 Assessment for each of the MQF 2.0 (2018) Learning Outcomes
clusters:
a. Practical skills
b. Interpersonal skills
c. Communication skills
d. Digital skills
e. Numeracy skills
f. Leadership, autonomy, and responsibility
g. Personal and entrepreneurial skills
h. Ethics and professionalism.
2.7 KNOWLEDGE AND UNDERSTANDING
2.7.1 Knowledge and understanding refers to a systematic understanding of facts, ideas, information, principles, concepts,
theories, technical knowledge, regulations, numeracy, practical skills, tools to use, processes, and systems.
2.7.2 Knowledge and understanding comprise the knowledge dimension and the cognitive dimension, as shown in Table 3.
Figure 7: Elaboration of Revised Bloom's Taxonomy for Cognitive Domain
2.8.6 One way of assessing cognitive skills is by implementing problem-based
learning (PBL), which has been shown to develop self-directed learning
and problem-solving skills. The first two assessment tasks are carried out
during the “meet the problem” phase of PBL as implemented in a typical class,
as shown in Table 5.
Table 5: Example for Assessing Problem Solving and Learning Process for PBL
2.8.7 Peer teaching is assessed using individual notes submitted before the
discussion on new concepts needed to solve the problem. In small-group
PBL, peer teaching is normally assessed through tutor observation.
2.9 FUNCTIONAL SKILL
2.9.1 Assessment of Practical Skills
Table 6: Example of Psychomotor Domain in Machining Course
Level: Adaptation (the ability to modify skills to meet new or special requirements)
CLO: Modify the machining procedures to suit the design and material specification
of the given task
Teaching and learning method: Discussion, Machining Project
Assessment: Adopt the technical concepts and skills of machining procedures in
actual projects
2.9.2 Interpersonal Skills
2.9.2.1 Interpersonal skills are defined in the MQF 2.0 (2018) manual
as a range of social skills such as interactive communications,
relationships and collaborative skills in managing relationships
in teams and within the organisation, networking with people of
different cultures, and social skills/etiquette.
2.9.2.2 An expected outcome such as teamwork skills could be
classified under interpersonal skills.
2.9.2.3 To assess teamworking skills, multiple methods can be used,
such as in-class peer-rating observation, recorded videos of
team discussions, logbooks, minutes of meetings, and
learning portfolios.
2.9.2.4 The learning outcomes and levels of attainment may also use
the affective domain taxonomy.
Table 7: Example of Assessment for Team Oral Presentation: Individual and Team

Individual criteria (scale 1 to 3):

A - Stature & Appearance
1: Lacks confidence; lousy posture; shabbily attired; no eye contact.
2: Somewhat confident; proper posture and attire; maintains eye contact.
3: Confident; good eye contact and posture; smartly attired.

B - Presentation & Voice
1: Not precise; mumbles and swallows words; too slow or too fast; poor intonation
and low voice volume.
2: Straightforward, but the voice sometimes trails off; sufficient speech rate and
intonation.
3: Apparent and fluent; reasonable speech rate and volume.

C - Delivery
1: Mispronounces most words; often stumbles; reads from notes/slides; does not
show interest in the topic.
2: Mispronounces certain words; somewhat fluent; depends on notes/slides; shows
interest in sharing the topic.
3: Very fluent; rarely mispronounces words; captures the audience; refers to notes
sparingly; passionate about the topic.

D - Q&A
1: Not able to answer; does not understand the topic.
2: Able to answer but fumbles slightly, demonstrating understanding of the topic.
3: Answers confidently, demonstrating clear and critical knowledge of the topic.

Group criterion (scale 1 to 3):

E - Content
1: Unsuitable and disjointed content; does not show understanding of the material
presented.
2: Suitable content; shows understanding but lacks integration of different sources.
3: Well-developed content with proper elaboration and examples; shows good
understanding and integration from various sources.
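Scoring against a rubric such as Table 7 amounts to recording the band (1 to 3) awarded per criterion and summing per section. The following is a minimal sketch; the awarded bands are invented.

```python
# Hypothetical sketch: totalling one presenter's rubric bands from Table 7.
# Criteria A-D are scored individually; E is a group criterion. The awarded
# bands below are invented for illustration.

INDIVIDUAL = ["A - Stature & Appearance", "B - Presentation & Voice",
              "C - Delivery", "D - Q&A"]
GROUP = ["E - Content"]

awarded = {"A - Stature & Appearance": 3, "B - Presentation & Voice": 2,
           "C - Delivery": 3, "D - Q&A": 2, "E - Content": 3}  # bands 1-3

individual_score = sum(awarded[c] for c in INDIVIDUAL)  # out of 12
group_score = sum(awarded[c] for c in GROUP)            # out of 3
print(individual_score, group_score)  # prints: 10 3
```

Keeping the individual and group totals separate preserves the distinction the rubric makes between a presenter's own performance and the team's shared content.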
2.9.4 Digital Skills
2.9.4.1 Digital skills are essential for current and future graduates to
remain relevant in the current and future phases of industrial
transformation.
2.9.4.2 Digital skills encompass knowledge and skills related to using
information/digital technologies and literacy to support learning
and professional life.
2.9.4.3 The skills include sourcing and storing information, processing
data, digital design, using applications for problem solving and
communication, and ethics in applying digital skills.
2.9.4.4 The programme standards and the MQF 2.0 (2018) require digital
skills to be measured across disciplines and at all qualification
levels. Professionals from different fields should therefore be
competent to describe the digital attributes appropriate to their
diverse curricula or academic programmes.
2.9.4.5 The criteria cover the learners' adaptability, capability, clarity
of the skills and knowledge conveyed, coherence, relevance and
comparability, recognition, and transferability with respect to
these programmes and qualification frameworks. Together they
provide an overarching framework that integrates into all forms
of learning.
2.9.4.6 Depending on the subject-matter area, digital skills can be
assessed across all three domains: cognitive, affective, and
psychomotor.
Digital skills - incorporated components

Understand:
- identifying the needed digital tools
- digital hardware and technology literacy

Create:
- digital communication
- digital collaboration (industry/affiliated/practise)

Use:
- information literacy
- computer and technology literacy

Note: Table adopted from Aris et al., Digital Skills Framework in Higher Education. Proceedings
2022, 82, 61. https://doi.org/10.3390/proceedings2022082061
2.9.5 Numeracy Skills
Figure 8: Programme for the International Assessment of Adult Competencies (PIAAC)
Numeracy Competency
2.9.6 Leadership, Autonomy and Responsibility
2.9.6.1 MQF 2.0 (2018) defines this cluster of skills as an individual’s ability
to build relationships and work with teams made up of peers or in
managerial capacities with varying degrees of autonomy to make
decisions or set goals at organisational/unit/team levels; to take
responsibility and provide accountability; to be confident,
knowledgeable, articulate, honest, professional, concerned,
resilient, a risk taker, and possess other intrapersonal skills,
including working in and leading teams.
2.9.6.2 The management role is ambivalent and includes negotiation
processes based on traditional scientific ideals and managerial
logic.
2.9.6.3 It also concerns the relational aspects of how a professional habitus
is formed and negotiated in relation to management ideals and
practices, as well as the value of knowledge in itself and collegial
decision-making.
2.9.6.4 It concerns the conditions in the field, the academic subjects, and
the status relations between teaching and research, as well as to
what extent the students and their processes in the field affect the
prerequisites for exercising professional judgement.
2.9.6.5 Possible areas for assessing leadership are shown in Table 9.
Table 9: Areas of Assessment for Leadership
2.10.1.1 Personal skills are life skills that learners are expected to use
daily.
2.10.1.2 They are generally portrayed through enthusiasm for
independent learning, intellectual discourse and self-
development, confidence and self-control, social skills and
proper etiquette, and commitment to professionalism in the
workplace.
2.10.1.3 They also include the capability to plan for career development
or further education.
2.10.1.4 Aspects of individual characteristics such as honesty,
punctuality, time management, and keeping to and maintaining
deadlines that are important in a work environment are also
essential personal skills.
2.10.2 Entrepreneurial Skills
i. Opportunity seeking
ii. Persistence
iii. Commitment to work
iv. Demand for quality and efficiency
v. Risk-taking
vi. Goal setting
vii. Information seeking
viii. Systematic planning and monitoring
2.11.3 These include integrity, professional conduct (professionalism), and
standards of conduct such as upholding regulations, laws, and codes of
good practices or professional conduct.
2.11.4 A sensitive approach to other cultures adds value to this learning domain.
2.11.5 Assessment of ethics can fall under any of the domains, but most likely
the cognitive and affective domains.
2.11.6 Ideally, if the outcome is to develop students' professional and ethical
beliefs that will guide their conduct, then the outcome level should be at
the high affective taxonomy level, at the valuing or organisation level.
2.11.7 A constructively aligned teaching and learning environment will enable
the assessment of ethics, even at the higher levels of the taxonomy.
2.11.8 Common assessment approaches for ethics include case studies, role
play, service learning, learning journals, etc.
An example in the construction of an assessment rubric: Shuman, Olds and
Besterfield-Sacre (2003) used the following five constructs in developing a rubric to
assess engineering students' responses to cases with an ethical dilemma:

i. Recognition of the dilemma - identifying the ethical issues or problems,
especially concerning the ethics code.
ii. Information (argumentation) - gathering relevant information and justifying its
importance to understand the situation.
iii. Analysis (complexity and depth) - analysis of the information, taking into
account different aspects and opposing viewpoints, and other factors such as
the risks and consequences.
iv. Perspective (fairness) - taking the different perspectives of the parties involved
(e.g., workers, residents, industry, government, etc.) and looking at a global
view to get an overall perspective.
v. Resolution (argumentation) - a final resolution which should consider the
greater good and risk to the public, with solid justification.

Note: Shuman, L., Olds, B. and Besterfield-Sacre, M. (2003), 33rd ASEE/FIE Frontiers in Education Conference
Proceedings, Boulder, Colorado, USA.
PART 3
ASSESSMENT MANAGEMENT
The management of student assessment is key to quality assurance. HEPs should ensure
the robustness and security of processes and procedures related to assessment
management.
On that note, every HEP should focus on combating academic misconduct. There may
also be differences across institutions in how assessment management is structured.
Table 11: Assessment Integration and Process at the Institutional Level
3.2.6 The interpretation is then used to decide where the learners are in their
learning and to indicate the next step to promote learning (Assessment
Reform Group, 2002).
3.2.7 The increased use of coursework and continuous assessment offers the
opportunity for academic staff to provide constructive feedback to help
learners improve their future learning.
3.2.8 Formative assessment is an assessment for learning.
3.2.9 Assessment as learning requires students to play an active role in
becoming independent in their learning and assessment (Earl, 2003).
3.2.10 In order to incorporate assessment as learning into the learning process,
academic staff should help students develop self-evaluation and
metacognitive skills, and should design instruction and assessments
that monitor student learning.
3.2.11 On the other hand, summative assessments measure what students
have learned at the end of a learning unit.
3.2.12 Summative assessment refers to the assessment of students’ learning,
which involves grading and verification and is used for institutional
accountability and quality assurance.
3.2.13 The results are then communicated to the stakeholders.
3.2.14 Summative assessment is one of the methods used in the assessment
of students’ learning.
3.3 TYPES OF ASSESSMENT
3.3.1 Various methods of assessment
3.3.2 Coursework and Continuous Assessment
3.3.3.9 Self-assessment is a valuable way of encouraging participants
to evaluate and reflect on their learning.
3.3.3.10 Peer assessment is especially useful in determining the
attainment of leadership, teamwork, and communication skills.
(refer to Table 12 for information on management of student
assessment and processes involved).
Note: Appendix 2 to Appendix 7 relate to online assessment.
Table 12: Management of Student Assessment and Process

Area: Mechanism of marking and grading student assessment
Applies to continuous assessment (conventional or alternative) and final assessment
(conventional and online final examination):
a) Establish an understanding of academic integrity and honesty.
b) Areas of concern for academic integrity and honesty may include plagiarism,
cheating, fabrication, deception, false information, or any related misconduct.
c) HEPs shall get advice from the university's legal advisor.
d) HEPs may establish a student academic integrity pledge.
e) HEPs shall clearly state the process of moderation.
f) Establish a moderation committee.
g) A proper moderation process at programme and course level must be carried out
in cases with more than one assessor (inter-rater reliability towards consistency
and fairness).
h) Marking and grading are guided by an answer key, answer scheme, or rubric.
i) Measures to curb bias when marking are in place.

Area: Mechanisms to ensure the security of assessment documents and records
Continuous assessment:
a) System to ensure academic integrity and honesty; ensure students submit their
own work and do not plagiarise.
b) HEPs should have a clear disciplinary process for students who commit academic
misconduct.
c) Keep the evidence of assessment for a certain period.
Final examination (conventional and online):
a) Highly secure systematic process and mechanism in developing, managing and
administering the final assessment.
b) Clear invigilator job description and roles.
c) Ensure the strong room is highly secured and meets the minimum specification for
safety; to a certain extent, a strong room using online or cloud services should
have a high-security mechanism.
d) The final online examination should be administered with a secure browser, remote
proctoring, data encryption, and IP authorisation and authentication, all of which
are mechanisms to curb or avoid academic misconduct.
e) HEPs should have a clear disciplinary process for students who commit academic
misconduct.
f) HEPs are recommended to provide plagiarism detection in assessing students'
academic misconduct.
g) Assessment evidence must be kept, stored, maintained, and disposed of based on
the stipulated period.

Area: Assessment and communication with students
Continuous assessment:
a) Assessment tasks are scheduled across the semester; communicate with the
students on continuous assessment scheduling.
b) Results are returned to students promptly before the submission of the next
assessment task.
c) Students can act on assessment feedback before submission of the next task.
d) A system is in place for the collection of assignments, marking of assignments,
and feedback to students.
Final examination (conventional and online):
a) Assessment tasks are scheduled within the final assessment week.
b) Systematic collection of assessment evidence, marking and grading.
c) Marks and grades are released upon Senate approval.
d) Students receive notification of final grades through an integrated system, with
emphasis on integrity.
e) Processes for students to appeal against the results of assessment must be in
place and integrated into the system.

Area: Periodically review the management of student assessment
A system for periodic review of assessment, programme and course, which may
include input from external stakeholders as review panels, e.g., students' evaluation
of teaching and student/staff liaison committees. More specifically, reviewers consider
how well the system adheres to each assessment principle. To ensure that timely and
effective reviews are conducted, a continuing group must be responsible for monitoring
the review process. Students, other educators and experts also provide feedback
about classroom and university practices. Reviews of the overall assessment system
and the whole academic programme require broad participation from all stakeholders,
including educators, students, and assessment and curriculum specialists. The most
important criterion for assessment review is that assessment does not harm student
learning and promotes active and engaged learning.
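The moderation guidance in Table 12 refers to inter-rater reliability when more than one assessor marks the same work. The following is an illustrative sketch only (not prescribed by this GGP) of two common agreement checks between a pair of markers; all names and grades are hypothetical:

```python
from collections import Counter

def percent_agreement(grades_a, grades_b):
    """Proportion of scripts to which two markers assigned the same grade."""
    assert len(grades_a) == len(grades_b)
    matches = sum(1 for a, b in zip(grades_a, grades_b) if a == b)
    return matches / len(grades_a)

def cohens_kappa(grades_a, grades_b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(grades_a)
    p_o = percent_agreement(grades_a, grades_b)
    count_a = Counter(grades_a)
    count_b = Counter(grades_b)
    # Expected chance agreement from each marker's grade distribution.
    p_e = sum(count_a[g] * count_b[g] for g in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical grades from two markers for the same ten scripts.
marker_1 = ["A", "B", "B", "C", "A", "B", "C", "A", "B", "C"]
marker_2 = ["A", "B", "C", "C", "A", "B", "B", "A", "B", "C"]

print(round(percent_agreement(marker_1, marker_2), 2))  # 0.8
print(round(cohens_kappa(marker_1, marker_2), 2))  # 0.7
```

Low agreement would prompt the moderation committee to revisit the answer scheme or rubric before marks are confirmed.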
3.3.4 Alternative Assessment
3.3.4.2.1 To ensure the attainment of PLOs and to better prepare
students for the workplace by immersing them in a real work
environment, thus relating theories to practice in situ.
3.3.4.2.2 HEPs are encouraged to collaborate with industry when
planning, executing, and assessing students during their
workplace experience.
3.4.1.4 On the other hand, indirect assessments refer to the "analysis
of reported perceptions about student mastery of learning
outcome" (Allen, 2004).
3.4.1.5 It may be in the form of employer surveys, exit interviews of
graduates, and self-reports by students or others, such as the
supervisor, during industrial attachment.
Figure 10: Aligning Learning Outcomes, Learning and Teaching Activities and
Assessment
3.4.2.8 To ensure good practice in assessing course LOs, various
considerations need to be taken into account.
3.4.5 Diversity
3.4.6 Weightage
Table 13: Example of Task and Grading Instrument
3.4.7 Coverage
3.4.7.2 Table 14 shows the example of LOs for every lesson (Lesson
LOs - LLOs) being mapped to course LOs to ensure that
each lesson LO contributes to the achievement of one or
more of the course LOs (CLOs).
3.4.7.3 Consequently, the content to be taught is determined based
on the lesson LOs to be achieved.
3.4.7.4 Lesson LOs may differ from assessment outcomes because
it is impossible to assess all content taught due to constraints
such as time.
3.4.7.5 The assessment may only cover a sample of the content
taught, but the staff must ensure that the assessed content
represents the course content.
3.4.8 Criteria
3.4.8.4 Assessment criteria are the standards against which
learners' performance is measured. The marks awarded for
the attainment of each criterion need to be made clear.
3.4.8.5 It can be communicated through various forms of rubric.
3.4.9 Attainment
3.5.1 Validity and Reliability of Assessment
meanwhile, is concerned with whether the assessment can
represent a group of students or a body of opinion.
ii. Develop marking schemes/rubrics as a guide to
ensure standardisation in marking. Vague scoring
criteria threaten reliability.
iii. Ensure a fair distribution of marks for each
question/task.
iv. Provide clear guides for observing and recording
evidence.
v. Ensure that the test venue is conducive and that the
tests are administered lawfully.
vi. In cases of multiple examiners, conduct moderation in
marking. The appointed moderators determine the
appropriateness of the standards and markings.
vii. In order to maintain the validity and reliability of
assessments, students undertaking a particular
course at all sites must get the same opportunities in
terms of contents, coverage, resources, and expertise
from academic staff.
viii. Tests and examinations should be given, submitted,
and administered at the same time and under the
same conditions.
iv. Another person with expertise in the area assessed
should validate the examination and test questions.
Figure 12: Preventive Methods to Address Integrity Issues
1. ChatGPT and artificial intelligence in higher education: quick start guide. Published in 2023 by the United
Nations Educational, Scientific and Cultural Organization, https://etico.iiep.unesco.org/en/chatgpt-and-artificial-
intelligence-higher-education-quick-start-guide
2. AI and education: guidance for policy-makers. Published in 2021 by the United Nations Educational, Scientific
and Cultural Organization. https://unesdoc.unesco.org/ark:/48223/pf0000376709. Accessed: March 2023.
PART 4
ASSESSMENT IN DIVERSE CONTEXTS
4.1 OVERVIEW
4.1.1 The assessment for flexible education should still uphold the principles
to ensure integrity and credibility in the process of evaluating learners.
4.1.2 Equally there must be a transparent system that is valid, reliable,
efficient, and equitable and that is able to evaluate the ability of
learners to demonstrate learning outcomes for a set duration of time.
4.1.3 Even though the assessment can be considered as the final stage of
constructive alignment, the assessment process should be reflected
directly in the teaching processes.
4.1.4 Assessments that use rubrics require the assessor to clarify the
areas to be assessed through a briefing at the outset, so that
students know what they are being assessed on.
4.1.5 UNESCO’s Sustainable Development Goal 4 (SDG 4) is incorporated
into the Twelfth Malaysia Plan 2021 – 2025, which aims to ensure
inclusive and equitable quality education and promote lifelong
learning opportunities for all.
4.1.6 This supports learners not only in access but also in a smooth
transition to the labour market.
4.1.7 The assessment should be designed not only for assessing whether
the student attains the outcomes but also as a form of formal feedback
to the student on their learning performances as well as a form of
feedback to lecturers/instructors on how effective their teaching
approaches are.
4.1.8 The recent pandemic has accelerated the transition of HEPs
towards more flexible educational pathways, similar to the
frameworks arising from the Open and Distance Learning (ODL)
system.
4.1.9 One of the main advantages of the ODL system is that it can be
combined with Mixed Reality (MR) technology, which has emerged
as a promising tool in the field of education, offering immersive and
interactive learning experiences for students.
4.1.10 In order to obtain ODL licenses for a particular programme, institutions
will have to create self-instructional materials (SIM).
4.2.4 In other cases, HEPs may use interview sessions for the same
courses where a differently-abled student is unable to answer the
assessment paper properly in written form.
4.2.5 Some HEPs may also provide computerised system assessments for
differently-abled students who are unable to write/spell words properly,
with recommendations from certified medical professionals.
4.2.6 Lecturers/instructors should also be aware that different
generations of learners may show major differences in learning
capabilities; thus, the assessment may be designed differently.
4.3 CROSS-CULTURAL
4.3.1 Globalisation has encouraged the mobilisation of people, migration,
urbanisation, and increasing social and cultural diversity, reshaping
countries, and communities.
4.3.2 Assessors should be aware of the differences between societies and
cultures. For example, when assessing communication skills, the
abilities of students from urban and rural areas may differ; hence,
the rubrics for assessing communication skills should consider the
advantages and disadvantages of the learners' previous education
systems.
4.3.3 Students from different countries may express themselves differently
when conversing; hence, the assessment should be fair to all students.
Figure 14: Future-Ready Framework (2020)
4.5 COURSEWORK MODE
4.5.1 Teaching and Learning (T&L) involves a combination of assignments
(coursework) with practicum or the production of an assessed project
paper to award students' grades.
4.5.2 Assignments/coursework can be in the form of writing, presentations,
or demonstrations. Examples of dominant assessment activities in this
coursework mode are coursework assessments, quizzes, tests, and
examinations.
4.5.3 Alternative assessment can also be used. (refer to 3.3.4)
4.6.5 Please refer to the following suggested Table 15 on the appropriate
evaluation criteria:
Thesis Evaluation

Abstract:
- Rationale for the study and problem statement
- Hypothesis(es) and objective(s)
- Methodology employed
- Findings and conclusions
4.7 EXEGESIS AND CREATIVE OUTPUT
4.7.1 The Exegesis is a theorised and analytical discourse that presents
fresh and authoritative insights into the field by analysing and situating
the creative component and thereby setting the stage for the ideas and
models that guide the creation of the postgraduate works (by creative
project and the Exegesis written component).
4.7.2 The creative output written component shows that the candidate
understands the relationship of the investigation to the wider context
of the knowledge to which it belongs.
4.8.5 Methods of assessment might include any of the following, selected as
appropriate to the discipline or field of study and the programme's
aims, mode of delivery, and typical entrants (refer to Appendix 11):
4.9.5 HEPs should involve, and if necessary train, the industry mentors to
ensure learning takes place, as well as to validate assessments and
grading instruments for outcome attainment.
4.9.6 In this regard, a systematic buddy system should be established by
HEPs to ensure the validity and reliability of assessment during the
learning process, within the industrial model.
4.10 COURSES OFFERED
4.10.1 Semester-based
4.10.2 Module-based
4.10.3.2 HEPs may implement a CPD/portfolio-based system for
students at their own pace to prepare all the records
needed to collect hours and badges to claim credits for
particular courses.
4.10.3.3 Students may be evaluated by various approaches,
including interviews or challenge sessions, to prove that
they have attained the intended learning outcomes.
4.10.4 Micro-credentials
4.10.4.7 Information on the type of assessments (examinations, tests,
projects, etc.), grading (marks, grade points, alphabetical
grades, etc.), and quality assurance should be stated in the
Malaysian Micro-credential Statement (MMS).
4.10.4.8 For more information on micro-credentials, refer to Guidelines
to Good Practices: Micro-Credentials by MQA.
4.10.6 Community-based
4.10.6.3 Attainment can also be evaluated through a portfolio,
case study report, project report, etc., depending on the
intended outcomes.
For example, the student may claim credit for the co-
curriculum course by submitting a report after attending a
community-based programme during the semester break.
4.10.7 Mobility-based
4.10.9 APEL (A), (C), (Q) and (M)
4.10.9.3 There are three principles of assessment that can be related to APEL:
Figure 18: Three (3) Principles of Assessment that can be Related to APEL
4.10.9.4 The assessor appointed would be a subject matter expert/specialist
who can evaluate the evidence submitted based on the assessment
criteria.
4.10.9.5 In addition, he/she should demonstrate the following:
Table 18: APEL Assessment Instruments
PART 5
COMMUNICATING ASSESSMENT AND OUTCOMES
5.1.14 Performance criteria or targets can be set to determine the level of
achievement.
5.1.15 Such evidence can only be captured by taking a systematic approach
to assessment.
5.1.16 Hence, at the programme level, a programme’s impact is assessed by
finding evidence of the attainment of PLOs.
5.1.17 This allows HEPs to plan for continuous quality improvement (CQI)
based on the attainment levels of the CLOs and PLOs as shown in
Figure 9.
5.1.18 With systematic monitoring that began at the course levels, students
should be able to demonstrate attainment of all PLOs by the end of
the academic programme, acquiring the full skills set to perform as
functional graduates.
5.1.19 Assessment of students’ learning provides evidence of their level of
attainment.
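As an illustrative sketch only (not prescribed by this GGP), the attainment of a CLO against a performance target can be computed from student scores as the percentage of students meeting or exceeding the target; the 50% target, function name, and all figures below are hypothetical:

```python
def clo_attainment(scores, max_mark, target_pct=50.0):
    """Percentage of students whose score meets or exceeds the CLO target."""
    attained = sum(1 for s in scores if (s / max_mark) * 100 >= target_pct)
    return 100.0 * attained / len(scores)

# Hypothetical marks out of 20 for one CLO in a class of eight students.
clo1_scores = [18, 12, 9, 15, 7, 14, 10, 16]
print(clo_attainment(clo1_scores, max_mark=20))  # 75.0
```

PLO attainment can then be aggregated from the CLOs mapped to each PLO, and the results compared against the performance criteria or targets set by the HEP to drive CQI.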
5.2.3 The attainment is analytic data that allows the programme or course
developer to reflect on the quality of curriculum, instruction, and
assessment such as:
CONCLUSION
Assessment that is constructively aligned to the intended learning outcomes has a
formative function, providing ‘feed-forward’ for future learning that can be acted upon
for continuous improvement.
The expanded use of technology, distance learning, and flexible education came about
under conditions of necessity and, running parallel to conventional ways of
assessment, underscores the significance of alternative assessment.
This method for assessing the academic achievement of a learner includes activities
requiring the application of acquired knowledge and skills to real-world situations, and
it is often seen as an alternative to standardised testing.
Whilst ensuring fairness and quality of assessment when managing assessment for
various contexts, there should be an opportunity and a safe context for diverse
students to expose problems with their studies and for HEPs to gather the appropriate
methods for improvement through the appropriate medium of learning facilities.
Engagement with related stakeholders (industry, government and non-government
bodies, alumni, etc.) strengthens the updating of the required bodies of knowledge,
current industry practices, and professional practices.
This creates a balanced, structured approach while still allowing flexibility as needed
to improve the programme’s approach and relevance to the needs of the industry.
These periodic reviews draw on feedback, surveys, correspondence, and meetings,
which will eventually affect the courses’ learning outcomes as newer texts and
literature, current practices, case studies, and an updated technological presence are
incorporated, eventually shifting the fulcrum of the programme learning outcomes.
With clearly defined data and the tasks arising from these sources, HEPs will need to
execute the stages of reviewing or revamping their courses and programmes to
remain relevant.
The focus at this phase would be to generate knowledge that would inform
development in the HEP in making informed decisions at the planning and policy levels.
The mutual benefit is to produce competent graduates equipped with necessary skills
to deal with community challenges.
REFERENCES
Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA:
Anker Publishing.
Aris et al. (2022). Digital Skills Framework in Higher Education. Proceedings, 82, 61.
https://doi.org/10.3390/proceedings2022082061
Biggs, J. (1999). Teaching for quality learning at university. Buckingham, UK: SRHE
and Open University Press.
Biggs, J. (2003). Aligning teaching and Assessment to curriculum objectives. York, UK:
LTSN Generic Centre.
Felder, R. M. & Brent, R. (2003). Designing and Teaching Courses to Satisfy the ABET
Engineering Criteria. Journal of Engineering Education, 92 (1), 7 - 25.
RELATED LINKS
FURTHER READINGS
Darling-Hammond, L., Herman, J., Pellegrino, J., et al. (2013). Criteria for high-quality
assessment. Stanford, CA:
McMorran, C., Ragupathi, K., and Luo, S. (2015). Assessment and learning without
grades? Motivations and concerns with implementing gradeless learning in higher
education. Assessment & Evaluation in Higher Education: 1-17.
http://dx.doi.org/10.1080/02602938.2015.1114584
APPENDIX 1
2 - Unistructural (single point)
Verbs: state, recognise, recall, quote, note, name.
Quantitative tasks: correct answer to a simple algorithmic problem requiring
substitution of data into a formula; correct solution of one part of a more complex
problem.
Essay/report: poor essay structure. One issue is identified and this becomes the sole
focus; no framework for organising discussion. Dogmatic presentation of a single
solution to the set task. This idea may be restated in different ways. Little support from
the literature.

3 - Multistructural (multiple unrelated points)
Verbs: explain, define, list, solve, describe, interpret.
Quantitative tasks: correct solution to a multiple-part problem requiring the substitution
of data from one part to the next; poorly structured project report or practical report on
open tasks.
Essay/report: essay poorly structured. A range of material has been selected and most
of the material selected is appropriate. Weak introduction and conclusion. Little
attempt to provide a clear logical structure. Focus on a large number of facts with little
attempt at conceptual explanations. Very little linking of material between sections in
the essay or report.

4 - Relational (logically related answer)
Verbs: apply, outline, distinguish, analyse, classify, contrast, summarise, categorise.
Quantitative tasks: elegant solution to a complex problem requiring the identification
of variables to be evaluated or hypotheses to be tested; well-structured project or
practical report on an open task.
Essay/report: essay well-structured with a clear introduction and conclusion. A well-
developed framework exists. Appropriate material. Content has logical flow, with ideas
clearly expressed. Clearly identifiable structure to the argument, with discussion of
differing views.
Note: Biggs and Collis’ (1982) Structure of Observed Learning Outcome (SOLO) taxonomy is
another considered assessment for cognitive learning.
It is especially beneficial when setting cognitive tasks or assessment items and designing
rubrics (performance standards) for grading the task.
When using this taxonomy for writing learning outcomes and grading, it informs learners and
faculty staff on the criteria and the standards of answers required to show evidence of
attainment at the various competency levels or levels of cognitive performance.
The QR reference shows a representation of the SOLO taxonomy, which has five levels,
starting from no knowledge (pre-structural), through surface learning (uni-structural and multi-
structural), to deep learning (relational and extended abstract).
The psychomotor domain (Simpson, 1972) includes physical movement, coordination, and use
of the motor-skill areas. The development of these skills requires practice and is measured in
terms of speed, precision, distance, procedures, or techniques in execution. This domain
includes seven major categories that are listed from the simplest behaviour to the most
complex namely perception, set, guided response, mechanism, complex overt response,
adaptation and origination.
The MQA and MOHE LO domains belonging to the psychomotor taxonomy include practical
skills and entrepreneurship.
The Affective Domain addresses interests, attitudes, opinions, appreciations, values, and
emotional sets. This domain includes the manner in which we deal with things emotionally,
such as feeling, value, appreciation, enthusiasm, motivation, and attitude. The five categories
in affective domain include receiving, responding, valuing, organisation and characterization
by value. The MQA and MOHE LO domains belonging to the affective taxonomy include
communication, teamwork and social responsibilities, ethics, morality, professionalism, lifelong
learning, management, and leadership. Other taxonomy examples could also be
considered by HEPs and appropriated into their programme design.
APPENDIX 2
APPENDIX 3
Table 21: Example of The Forms of Online Assessment (True/False and Essay
Questions)
APPENDIX 4
(Assessment of prior knowledge)

Critical Reflection & Meta-Cognition (Asynchronous)
- Engagement: electronic portfolios; online journals, logs, diaries, blogs, wikis;
embedded reflective activities; peer and self-assessment.
- Platform: e-portfolio; wikis; blogs; the academic's preferred peer assessment
platforms.
- Assessment: requires the examiners to apply the appropriate rubrics to the
submitted works.
APPENDIX 5
Table 23: Online Engagement, Platform and Assessment for Continuous and
Authentic Assessment

Continuous Assessment (Group) (Asynchronous)
- Engagement: critical reviews; online presentations; group online projects; role play;
online debates.
- Platform: screencast (Ink2Go); blog platforms; video-based platforms (Vimeo,
YouTube, Instagram); Loom; Google Docs.

Authentic Assessment (Asynchronous)
- Engagement: scenario-based learning; laboratory/field trip reports; simulations;
case studies/role play; online oral presentations and/or debates.
- Platform: Google Docs; Google Forms; Plickers; Poll Everywhere; Mentimeter;
Nearpod; Goformative.com; Flipgrid; Kahoot.
- Assessment: the assessments are activity-based continuous assessments in
various forms of academic-student engagement; elements of academic dishonesty
are low, requiring minimal invigilation.
Invigilated - Mid-semester Assigned online AI Proctoring may be
Online Exam exam video platform (e.g.: required (Utilising AI to
Assessment - Final exams MS Teams, Zoom, assist facial recognition
Google, Meet, etc) as well)
Synchronous
- Honorlock
- Protorio
- Talview
- RPNow
- Examus AI Proctoring
- Questionamark
- Mercer Mettl Online
Examination and
Proctoring Solutions
- Think Exam
Recommended Cloud-
based, secure on-line
biometric scan for
students’ identity
verification.
The standard
requirement for lights,
camera and microphone
settings in viewing the
spacing of candidates to
the viewed and monitor.
Note: This list is not exhaustive. HEPs are advised to explore other AI proctoring tools that
may provide more comprehensive and secure data encryption.
APPENDIX 6: Data Encryption During Transmission of Examination
APPENDIX 7
APPENDIX 8
APPENDIX 9
Note:
• For students sitting for online examination outside of HEP vicinities and with
moderate internet coverage.
• Technical time allowance refers to the acceptable period for submission
(depending on the length of the examination) after the online examination
ends.
APPENDIX 10
APPENDIX 11
Capstone Project
Proposal/Design Defence
Viva Voce
APPENDIX 12
Please note that this is merely an example to help understand what is discussed in terms of
constructive alignment in Part 2 and managing assessment in Part 3 (Weightage, Table of
Specification and Rubric). HEPs are required to apply the knowledge appropriately in their
own contexts. This exemplar is for two of the three course learning outcomes in the course.
To prepare your course assessment plan and to calculate the weightage, you need to refer
to the information in your syllabus.
B. State the total teaching time allocated for the course, or the Total Student Learning
Time.
Note: for the distance learning context, the Total Student Learning Time can be used.
D. Determine the teaching time allocated by CLO (D1) or the SLT by CLO (D2), based on
the emphasis placed on each CLO and the method of assessment.
The following is an example.
Brief Synopsis
This is an undergraduate course designed for an education programme specialising in IT.
Exposure to various aspects of assessing in a digital context includes theories and principles
of visual arts and digital technologies that can be used when assessing and evaluating
learning. Fundamental concepts of assessment and evaluation using quantitative and
qualitative data are introduced before opportunities are provided to integrate digital
technologies and relevant software for various types of assessment.
Description of Assessment:
This course consists of 50% coursework (covering CLO1, CLO2 and CLO3) and 50% final
examination (covering CLO1 and CLO2 only). There will be two assignments for the
coursework:
A) Credit Hours: 3

CLO1: Discuss the concept, principles, issues and challenges in visual art assessment and evaluation. (C2)
Assessment: Case Study (20%); Final Examination (30%); total 50%
Teaching time by CLO: 21 hrs; SLT by CLO: 56 hrs
Weightage: 1) 21/42 * 100 = 50%; 2) 56/120 * 100 = 46% (±40 - 50%)

CLO2: Evaluate quantitative and qualitative data in measuring learning outcome through assessment and evaluation. (C5)
Assessment: Case Analysis (10%); Final Examination (20%); total 30%
Teaching time by CLO: 12.6 hrs; SLT by CLO: 40 hrs
Weightage: 1) 12.6/42 * 100 = 30%; 2) 40/120 * 100 = 33% (±30 - 35%)

CLO3: Integrate digital technologies and appropriate software for diagnostic, formative and summative assessment and evaluation.
Assessment: Group Project (e-Portfolio) (20%); total 20%
Teaching time by CLO: 8.4 hrs; SLT by CLO: 24 hrs
Weightage: 1) 8.4/42 * 100 = 20%; 2) 24/120 * 100 = 20%
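The weightage calculations in the exemplar follow one simple rule: CLO hours divided by total hours, multiplied by 100. The following Python snippet is an illustrative sketch only, using the figures from this exemplar (42 hours total teaching time and 120 hours total SLT):

```python
# Illustrative sketch of the weightage calculation in the exemplar.
TOTAL_TEACHING_HOURS = 42.0   # total teaching time for the course
TOTAL_SLT_HOURS = 120.0       # total Student Learning Time (SLT)

# Teaching time by CLO (D1) and SLT by CLO (D2), as in the exemplar
clo_hours = {
    "CLO1": (21.0, 56.0),
    "CLO2": (12.6, 40.0),
    "CLO3": (8.4, 24.0),
}

weightage = {}
for clo, (teaching, slt) in clo_hours.items():
    weightage[clo] = {
        # e.g. CLO1: 21/42 * 100 = 50%
        "by_teaching_time": teaching / TOTAL_TEACHING_HOURS * 100,
        # e.g. CLO1: 56/120 * 100 = 46.7%
        "by_slt": slt / TOTAL_SLT_HOURS * 100,
    }

for clo, w in weightage.items():
    print(f'{clo}: {w["by_teaching_time"]:.0f}% (teaching time), {w["by_slt"]:.1f}% (SLT)')
```

Either basis (teaching time or SLT) can be used, as noted for the distance learning context; the two should agree within rounding, as they do here.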
Course Assessment Plan Instruction and Rubric
Once the weightage is determined, it can be used to plan the number of items for the final
examination based on the related CLO and topics. However, since only CLO1 and CLO2 in
this exemplar are assessed in the final examination, the weight of the final examination
questions must also be determined in order to fix the number of items by CLO. The total
weightage for CLO1 and CLO2 is 80%. To determine the weightage of the final examination
questions, the weightage of the final examination alone must first be found.
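The normalisation described above can be sketched as a short calculation. The following Python snippet is illustrative only; the 30% and 20% final examination weights and the 60-item MCQ count are taken from this exemplar:

```python
# Illustrative sketch: normalising CLO weightages to the final examination.
# In the exemplar, the final examination carries 30% (CLO1) and 20% (CLO2)
# of the overall course marks, i.e. 50% in total.
final_exam_weights = {"CLO1": 30, "CLO2": 20}   # % of overall course marks
exam_total = sum(final_exam_weights.values())    # 50% of the course

# Share of the exam paper itself: CLO1 -> 60%, CLO2 -> 40%
exam_share = {clo: w * 100 / exam_total for clo, w in final_exam_weights.items()}

# With 60 MCQ items (as in the Case 2 note), items per CLO follow directly:
items = {clo: round(share * 60 / 100) for clo, share in exam_share.items()}
# CLO1 -> 36 items, CLO2 -> 24 items
```

The same normalisation applies whatever the question format; only the item counts change with the total number of items.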
A Table of Specifications for the number of items based on the related CLO and topics for
the final examination can also be determined with the weightage.
Below is an example of two cases when planning the types of questions for the final
examination. Case 1 is used when the questions are structured or essay type, and Case 2
is used when the questions are multiple-choice.
Total: 10 questions; duration: 3 hours
To Determine The Table of Specification based on Weightage for CLO1 & CLO2
(Final Examination). (Note: The Highest Cognitive Level Targeted Is C5)
Topic 1: weightage 10% (exact 11.5%); SLT 11 hrs; CLO1 (C2); Q A1: 10 (5 m); Q A2: 10 (5 m)
Topic 2: weightage 20% (exact 15.6%); SLT 15 hrs; CLO1 (C2); Q A3: 20 (10 m); Q A4: 20 (10 m)
Topic 3: weightage 30% (exact 31.3%); SLT 30 hrs; CLO1 (C2); Q B1: 30 (10 m); Q B2: 30 (20 m)
Topic 4: weightage 20% (exact 20.8%); SLT 20 hrs; CLO2 (C5); Q B3: 20 (10 m); Q B4a: 20 (10 m)
Topic 5: weightage 20% (exact 20.8%); SLT 20 hrs; CLO2 (C5); Q B4b: 20 (5 m); Q B5: 20 (15 m)
Note: Assuming there will be 60 items in the MCQ, based on the weightage, 60% of the 60 items
(36 items) come from CLO1 (C2) and 40% of the 60 items (24 items) come from CLO2 (C5). The
highest cognitive level is C5.
[Item distribution row from the Table of Specifications: 14, 22, 8, 7, 9]
Coursework Assignment for CLO1 and CLO2
Instruction:
Given a case of one course and its course assessment plan, you are to discuss
the relevance of the design and evaluate its effectiveness in measuring
learning.
Example of Rubric for CLO1 and CLO2:
APPENDIX 13
Consideration for Exam on Demand
a) The Exam-On-Demand System
For HEPs considering an "exam on demand", careful planning, technological
infrastructure, and appropriate safeguards are required to ensure fairness,
security, and effective assessment. The following steps can be taken to conduct
exams on demand:
b) Determine Feasibility:
Assess whether an "exam on demand" approach is suitable for your course,
subject, and educational institution. Consider factors such as the subject's
nature, assessment requirements, available technology, and the willingness of
instructors and students to adapt to this approach.
c) Choose a Platform:
Select or develop a suitable online platform or learning management system
(LMS) that can handle exam-on-demand scheduling, submission, grading, and
feedback. The platform should also ensure the security and integrity of the
assessment process.
d) Design Assessments:
Create exam questions that assess the intended learning outcomes effectively.
Ensure a mix of question types, such as multiple-choice, short answer, and
essay questions, to cater to various types of knowledge and skills.
f) Prepare Technology:
Ensure that both instructors and students have access to the necessary
technology and resources to participate in the "exam on demand" system.
Provide technical support and training as needed.
g) Create an Exam Repository:
Develop a repository of exam questions that can be randomly selected for each
student to ensure fairness and prevent academic dishonesty.
h) Implement Scheduling:
Set up a scheduling system where students can choose a suitable time slot to
take the exam. This can be done through the LMS or an online scheduling tool.
i) Provide Flexibility:
Allow students a window of time during which they can start and complete the
exam. This accommodates different time zones and personal schedules.
l) Maintain Integrity:
Implement measures to ensure exam security and prevent cheating. This might
include randomising question orders, using online proctoring tools, and setting
time limits for individual questions.
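Steps (g) and (l) can be combined in a simple routine: draw each student's questions at random from the shared repository, seeded with the student's ID so that the paper is reproducible for later review while still varying between students. The sketch below is illustrative only; the repository contents, the ID format, and the function name are hypothetical:

```python
import random

def draw_paper(repository, student_id, n_questions):
    """Draw a reproducible random question set and order for one student."""
    rng = random.Random(student_id)              # per-student deterministic seed
    paper = rng.sample(repository, n_questions)  # select without repetition
    rng.shuffle(paper)                           # randomise the question order
    return paper

# Hypothetical 20-item question repository
question_bank = [f"Q{i}" for i in range(1, 21)]
paper = draw_paper(question_bank, "A123456", n_questions=5)
```

Seeding with the student ID means an invigilator or appeals panel can regenerate exactly the paper a given student received, without storing every generated paper.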
m) Continuous Improvement:
Gather feedback from instructors and students about their experiences with the
"exam on demand" system. Use this feedback to refine the process, address
challenges, and enhance the overall effectiveness of the approach.
n) Monitor and Evaluate:
Continuously monitor the effectiveness of the "exam-on-demand" approach in
terms of student performance, engagement, and satisfaction. Make adjustments
as needed to improve the process.
APPENDIX 14
Example of Statements
Definition
PEO: Broad statements that describe the career and professional accomplishments of graduates within five (5) years upon graduation.
PLO: The abilities (cognitive, psychomotor, and affective) that a graduate should be able to demonstrate at the time of graduation.
CLO: Specific statements of what the learners are expected to achieve at the end of the courses.

Example of Statement (Cognitive Domain)
PEO: IT Instructors who apply fundamental knowledge and practical skills in providing services to the IT industries locally and globally.
PLO: At the end of the programme, students should be able to:
- Apply mathematics and science concepts, principles, theories and laws essential to IT;
- Perform algorithm, programming and diagnostic procedures essential to IT.
CLO: At the end of the course, students can:
- Explain differentiation and integration concepts, principles and algorithms.
- Perform second-order differentiation and triple integration techniques to determine slopes, signs of the slopes, and the area and volume of mathematical functions.

Example of Statement (Psychomotor Domain)
PEO: IT professionals who can provide technology solutions and services to meet the evolving needs of the industry.
PLO: At the end of the programme, graduates are able to diagnose and troubleshoot technical issues related to computing software, and design and implement databases.
CLO: At the end of the course, students are able to:
- Implement database security measures including user authentication, access control and data encryption.
- Design and develop data schemas using industry-standard database management tools.
LIST OF PANEL MEMBERS