Guidelines to Good Practices: Assessment of Student Learning


First Edition: 2014
Second Edition: November 2023

Malaysian Qualifications Agency


No. 3539, Jalan Teknokrat 7
Cyber 5, 63000, Cyberjaya
Selangor Darul Ehsan

Tel: +603-8688 1900


Fax: +603-8688 1911
Email: skp@mqa.gov.my
Website: www.mqa.gov.my

© Malaysian Qualifications Agency 2023


All the Agency’s publications are available on our website: www.mqa.gov.my
FOREWORD
The Malaysian Qualifications Agency (MQA) has published numerous quality
assurance documents, such as the Malaysian Qualifications Framework (MQF), Code
of Practice for Programme Accreditation (COPPA), Code of Practice for Institutional
Audit (COPIA), Code of Practice for TVET Programme Accreditation (COPTPA), Code
of Practice for Programme Accreditation: Open and Distance Learning (COPPA: ODL),
Standards, Programme Standards (PSs), and Guidelines to Good Practices (GGP), to
ensure that the programmes offered by Higher Education Providers (HEPs) in
Malaysia meet international practices. It is imperative that these documents be read
together with this GGP for the assessment of student learning.

This GGP outlines sets of characteristics that describe the minimum levels of
acceptable practice, divided into the following areas:

a. Part 1: The Overview of Assessment in Higher Education;

b. Part 2: Assessment of Students’ Learning;

c. Part 3: Assessment Management;

d. Part 4: Assessment in Diverse Contexts; and

e. Part 5: Communicating Assessment and Outcomes.

Accordingly, the GGP covers different levels of standards leading to the award of
individual qualifications prescribed in the MQF 2nd Edition (2018), ranging from the
level of certificate (Level 3, MQF) to the level of Doctoral Degree (Level 8, MQF).

This GGP was developed by the MQA in collaboration with the Ministry of Higher
Education. It reflects the significant contributions of panel members from both
public and private HEPs, shaped in consultation with various HEPs, relevant
government and statutory agencies, industries, alumni, and students through
stakeholder workshops and online feedback. The GGP reflects national and
international best practices to ensure that programme development by HEPs in
Malaysia is on par with that in other countries.

This GGP encourages diversity and allows programme providers to be innovative in
creating their niches. HEPs should ensure that they produce graduates who meet the
current and future needs of industry while fulfilling their obligations to
society. Among other things, this document includes statements on types of assessment in
various contexts; these are intended to provide clarity and are not meant to be adopted
verbatim.

The MQA would like to express its appreciation to all the panel members and various
stakeholders for their valuable input, as well as to all MQA officers who contributed to
the development of this GGP for Assessment of Student Learning. It is hoped that this
GGP will benefit different stakeholders in developing the competencies
required of our students, for both employment and further education prospects.

Dato’ Prof. Dr. Mohammad Shatar bin Sabran (DIMP, DPMP)


Chief Executive Officer
Malaysian Qualifications Agency
November 2023

TABLE OF CONTENTS
List of Tables
List of Figures
Glossary
Abbreviations

Part 1 The Overview of Assessment in Higher Education

Part 2 Assessment of Students’ Learning
2.1 Notion of Active Learning and Constructive Alignment
2.2 Constructive Alignment
2.3 Notion of Student Learning Time and Assessment
2.4 Assessment Plans
2.5 Table of Specifications
2.6 Assessing Learning Outcomes
2.7 Knowledge and Understanding
2.8 Cognitive Skill
2.9 Functional Skill
2.10 Personal and Entrepreneurial Skills
2.11 Ethics and Professionalism

Part 3 Assessment Management
3.1 Management of Student Assessment and Its Process
3.2 Conducting Both Formative and Summative Assessment
3.3 Types of Assessment
3.4 Assessment Methods
3.5 Review of Assessment Methodologies and Currency with Developments in Best Practices
3.6 The Presence of Artificial Intelligence Tools

Part 4 Assessment in Diverse Contexts
4.1 Assessment in Diverse Contexts
4.2 Student Differences
4.3 Cross-Cultural
4.4 Programme Conducted
4.5 Coursework Mode
4.6 Research Mode
4.7 Exegesis and Creative Output
4.8 Mixed Mode
4.9 Industrial Mode (WBL, Apprenticeship, 2u2i)
4.10 Courses Offered

Part 5 Communicating Assessment and Outcomes
5.1 Continuous Quality Improvement (CQI)
5.2 Review of Assessment and Development
5.3 Conclusion

References

Appendices
Appendix 1: Example of SOLO Taxonomy Aligned Assessment Plans
Appendix 2: Example of the Forms of Online Assessment (Multiple Choice Questions and Fill in the Blanks)
Appendix 3: Example of the Forms of Online Assessment (True/False and Essay Questions)
Appendix 4: Online Engagement, Platform and Assessment (Asynchronous)
Appendix 5: Online Engagement, Platform and Assessment for Continuous and Authentic Assessment
Appendix 6: Data Encryption During Transmission of Examination
Appendix 7: Conditions and Requirements for Online Assessment
Appendix 8: Example of Approximation in Assessment Tasks
Appendix 9: Example of Technical Time Allowance for Submission in Online Exams
Appendix 10: Example for Supervisor’s Key Areas of Assessment
Appendix 11: Example for Postgraduate Level of Assessment
Appendix 12: Course Assessment Plan, Instruction and Rubric
Appendix 13: Consideration for Exam on Demand
Appendix 14: Example of Statements
LIST OF TABLES

Table 1: Description of Course Assessment Plan
Table 2: Example of ToS Aligned Course Assessment Plans
Table 3: Revised Bloom's Taxonomy for Cognitive Domain
Table 4: Examples of Assessments Based on Bloom’s Taxonomy
Table 5: Example for Assessing Problem Solving and Learning Process for PBL
Table 6: Example of Psychomotor Domain in Machining Course
Table 7: Example of Assessment for Team Oral Presentation: Individual and Team
Table 8: Digital Skills Framework in Higher Education
Table 9: Areas of Assessment for Leadership
Table 10: Example Construction for the Assessment Rubrics
Table 11: Assessment Integration and Process at the Institutional Level
Table 12: Management of Student Assessment and Process
Table 13: Example of Task and Grading Instrument
Table 14: Mapping of Lesson Learning Outcomes to Course Learning Outcomes
Table 15: Example Criteria for Thesis Evaluation
Table 16: Example Criteria for Practical Evaluation
Table 17: APEL Assessment Mechanism
Table 18: APEL Assessment Instruments
Table 19: Example of SOLO Taxonomy Aligned Assessment Plans
Table 20: Example of the Forms of Online Assessment (Multiple Choice Questions and Fill in the Blanks)
Table 21: Example of the Forms of Online Assessment (True/False and Essay Questions)
Table 22: Online Engagement, Platform and Assessment (Asynchronous)
Table 23: Online Engagement, Platform, and Assessment for Continuous and Authentic Assessment
LIST OF FIGURES

Figure 1: The Seven Areas of the COPPA 2nd Edition (MQA, 2017)
Figure 2: Assessment of Learning & Teaching Activities
Figure 3: Assessment of Students’ Learning and the Structure of the Guidelines
Figure 4: The OBE's Principles
Figure 5: Constructive Alignment
Figure 6: The Purpose of TOS
Figure 7: Elaboration of Revised Bloom's Taxonomy for Cognitive Domain
Figure 8: International Assessment of Adult Competencies (PIAAC) in Numeracy Competency
Figure 9: Formative and Summative Assessment
Figure 10: Aligning Learning Outcomes, Learning and Teaching Activities and Assessment
Figure 11: Review of Assessment Methodologies & Best Practices
Figure 12: Preventive Methods to Address Integrity Issues
Figure 13: Self-Instructional Materials (SIM) Features
Figure 14: Future-Ready Framework (2020)
Figure 15: Experiential Learning and Competency-Based Education Landscape
Figure 16: The Research Under Examination Characteristics
Figure 17: Methods of Assessments
Figure 18: Three (3) Principles of Assessment that can be Related to APEL
Figure 19: The Assessor’s Characteristics
Figure 20: Overview of the CQI Process
GLOSSARY


1. Alternative Assessment: An assessment other than paper-and-pencil tests or examinations. Alternative assessment has elements of being holistic, authentic, collaborative, and related to the world, and has the potential to provide meaningful and enduring ways of learning.

Note: Refer to this link for relevant publications related to JPT, MoHE initiatives: https://jpt.mohe.gov.my/portal/index.php/ms/penerbitan?start=10. Related to the GGP: AoSL are the eBook on Alternative Assessment in Higher Education and the eBook on NOBLe.

2. Analytic Judgement/Grading: Judgement is based on specific assessment tasks. This can be part of the judgement made in measuring or evaluating the performance quality of students and programmes.

3. Assessment: Assessment is the systematic process of documenting and using empirical data on knowledge, skills, attitudes, and beliefs to refine programmes and improve student learning.

Assessment can focus on the individual learner, the learning community (class, workshop, or other organised groups of learners), a course, an academic programme, the institution, or the educational system.

It is a systematic and cyclical way to improve the quality of students' performance and development by continuously collecting, analysing, and discussing direct and indirect data and evidence of students' learning from multiple and diverse sources.

Its purpose is to gain a deep understanding of what students really know and can do, and to provide feedback to improve students' learning, educators' teaching (feed-forward), curriculum planning, and the overall programme's effectiveness.

The data collected in assessments is used by students, educators, curriculum planners, and administrators to promote student learning and is not meant for making judgements.

4. Assessment Data: Assessment data can be obtained from directly examining student work to assess the achievement of learning outcomes, or can be based on data from which one can make inferences about learning.

5. Assess Forward: This concept is used in the document to indicate the opposite of the design-backwards concept when designing a curriculum.

It refers to the process of collecting data starting at the classroom level and proceeding to the course level.

Eventually, the data becomes part of the evidence required in determining students' learning, leading to improved/modified instructional approaches and improved effectiveness of a programme and the institution.

6. Assessment Method: Assessment methods define the nature of the assessor's actions and include examining, interviewing, and testing in a structured or self-paced mode.

The 'examine' method is the process of reviewing, inspecting, observing, studying, or analysing one or more assessment objects, i.e., specifications, mechanisms, or activities.

Assessment methods are simply the ways and strategies used to collect data. They can be classified into four categories:

i. Selected Response and Short Answer;
ii. Constructed or Extended Written Response;
iii. Performance Assessment; and
iv. Personal Communication.

This can be done in a formal or informal engagement.

7. Assessment Instruments/Tools of Measurement: The measuring device used by learners to qualitatively and quantitatively provide direct and indirect evidence of learning, and by educators, curriculum designers, and administrators to collect direct and indirect evidence of students' learning gains and overall learning experiences.

This device must be constructively aligned with the learning outcomes (valid). By using appropriate assessment criteria, the device can provide highly accurate (reliable) data related to learning outcomes, attainment, and achievement.

Different measuring devices (belonging to any of the assessment methods) will be required to collect data on different and varied learning outcomes.

Alternative assessment tools for measurement can include a checklist, rubric, interview or observation protocol, and anecdotal records as instruments to measure learning.

8. Assessment Item: The questions or statements constructed in an assessment instrument that allow students to directly or indirectly demonstrate how much and how well they know, understand, and can transfer what they know to various authentic contexts.

9. Assessment Task (AT): An assessment task is a specific piece of work (performance or product) given by educators to students, allowing them to show how much and how well they have mastered the learning outcomes. The task is given using an appropriate and aligned assessment instrument.

It must be integral to the attainment of learning outcomes, provide explicit instruction and information about what students are required to do, inform the learner about the amount of time appropriate to complete the task, and provide clear and explicit scoring/assessment criteria and benchmark standards.

Results from this task can be used to improve students' learning, measure their performance, make judgements about achievement, and assess programmes' effectiveness.

10. Classroom Assessment: Classroom assessment is a form of continuous evidence collection that is usually done during face-to-face learning activities.

Classroom assessment aims to diagnose existing learning barriers and identify students' progress in attaining the learning outcomes.

This evidence is used by educators to address the existing barriers and promote student learning by changing or adjusting the classroom instructional strategies and delivery system.

11. Competency: Competency is an underlying characteristic of a person/performer regarding his/her knowledge, skills, and abilities that enables him/her to successfully and meaningfully complete a given task or role.

12. Constructive Alignment (CA): Constructive alignment is an approach to curriculum design in which the teaching and learning activities are designed to maximise (enhance) learning by requiring students to engage with and activate the verbs specified in the learning outcomes, and to activate the same verbs in the assessment tasks.

The term 'construct' refers to students constructing and structuring their prior knowledge and understanding by giving meaning to what is to be learned.

'Alignment' refers to a learning environment set up by the educators that allows students to meaningfully engage with the action verb of the learning outcomes, and to engage the same action verb again in the assessment task to solicit how well the outcomes are learned.

13. Continuous Assessment: Data collection processes carried out continuously throughout a course, module, or programme to gather evidence of learning in order to improve learning, modify teaching, and adjust the curriculum design.

It also includes data gathering used to assess how well the courses offered by the programme support attainment of the programme's learning outcomes.

Examples of continuous assessments: alternative assessments, lab- or workshop-based tasks or assignments, midterm examinations, tests, and quizzes.

14. Continuous Quality Improvement: Continuous Quality Improvement (CQI) in assessment establishes the monitoring metrics to evaluate improvement efforts and outcomes for students' performance, routinely and on an ongoing basis.

It is meant to improve efficiencies in processes, communication, the quality of delivery of the required assessments, and the body of knowledge implemented in the courses.

For Continuous Quality Improvement, please refer to the GGP: PDD and COPPA (Area 7).

15. Course Learning Outcomes (CLOs): The CLOs are the intended or desired learning gains in terms of:

i. Declarative knowledge (factual, conceptual, procedural);
ii. Functional knowledge (knowledge transfer);
iii. Metacognitive knowledge;
iv. Cognitive skills;
v. Practical skills;
vi. Habits of mind;
vii. Performance; and
viii. Ways of responding to events and people as a result of the learning experiences in the course/module.

A CLO contains a measurable action verb, the substance/content to be learned, and the targeted competency level.

16. Coursework Assessment: The conventional, continual, content-based data collection process and analysis, such as testing, writing, presenting, or performing, used to evaluate students' performance and how well they have learned the content; it can also be used as part of learning outcomes attainment.

The score/grade contributes towards the final grade.

17. Criterion-referenced Assessment (CRA): A method of assessment in which a person's grade or score is compared to a defined learning standard and performance level, independent of other students.

18. Design Backward: An approach to curriculum design that begins with the goals in mind.

The goals begin with crafting the programme aim (the purpose of and justification for offering the programme and the adopted philosophy) that supports the attainment of the country's and the university's mission.

Once this aim is agreed upon, programme designers then craft the programme educational objectives (PEOs) that will be used to support the attainment of the programme aim.

This is followed by deciding on the programme learning outcomes (PLOs), the performance criteria, the performance and outcome indicators, and the target intended for each PLO.

Students' development and assessment towards being competent in each PLO are then nurtured by planning an appropriate combination of courses to be taken each semester, appropriate course learning outcomes (CLOs), and relevant course content for each course.

19. Direct Evidence: Evidence collected and analysed to demonstrate that actual learning has taken place.

It informs students, educators, and other stakeholders of the depth, breadth, and performance quality (what, how much, and how well) that students have gained in terms of the relevant knowledge, understanding, skills, habits of mind, and ways of responding to people and situations. Such evidence of students' learning may also include their performances and grades.

20. Evaluation: Evaluation is the process of using the evidence collected through assessment to make a value judgement on students' performance and programme performance relative to the benchmark standards specified by the learning outcomes' performance criteria and performance target.

For example, assigning a score/grade to an assessment task for a course and deciding on the student's next course of action or the programme's course of action is considered evaluating the student or the programme.

21. Formative Assessment: Learning activities carried out to find out the level of achievement based on the learning outcomes.

It focuses on providing feedback (with or without giving scores/marks) to students for improvement, unlike the scores/marks given in summative assessment, which determine students' overall performance.

Formative assessment is a form of low-stakes assessment FOR and AS learning and is part of the instructional process. It is about continuously collecting data as learning is in progress.

In this sense, formative assessment informs both educators and students about student understanding at a point when timely adjustments can be made. These adjustments help to ensure students achieve the targeted learning outcomes within a set time frame.

Formative assessment, formative evaluation, formative feedback, or assessment for learning, including diagnostic testing, is a range of formal and informal assessment procedures conducted during the learning process in order to modify teaching and learning activities to improve student attainment.

The goal of a formative assessment is to monitor student learning and provide ongoing feedback that can help students identify their strengths and weaknesses and target areas that need work.

It also helps faculty recognise where students are struggling and address problems immediately.

It typically involves qualitative feedback (rather than scores) for both students and educators that focuses on the details of content and performance.

22. Functional Graduates: Graduates who are competent and able to persistently, responsibly, and ethically transfer their knowledge, understanding, skills, and abilities to identify and solve ill-defined, complex, and difficult problems in their personal, social, and professional journeys.

23. Grading Criteria: This concept is used when making judgements on the quality of the performance of assessment tasks or learning outcomes. Grades are usually based on either indirect grading of learning outcomes (analytical judgement of the assessment tasks aligned with the learning outcomes) or direct grading of the learning outcomes (holistic judgement). The performance quality for each grade is clearly described in the criteria.

24. Graduate Attributes: Graduate attributes are the learning traits and characteristics that are relevant and appropriate to the graduate's personal, social, and professional role in life.

These attributes are clearly indicated in the Malaysian Qualifications Framework 2.0.

25. High-Stakes Assessment: High-stakes assessment is a conventional method of assessment that has significant consequences or impact on individuals based on their performance. It is usually conducted in a controlled setting, such as an examination hall, and most likely uses a written examination that has undergone rigorous development, validation, and security measures to ensure fairness and accuracy.

It enables students to demonstrate competencies, strengths, and the synthesis of course outcomes across the full course.

26. Holistic Judgement: Judgement that defines the performance quality and standard by combining all the performances solicited by assessment tasks.

27. Indirect Evidence: Indirect evidence is evidence or data collected for the purpose of seeking students' perceptions of their learning and their learning experiences.

Examples include: programme entrance and exit surveys, student interviews (e.g., graduating seniors), and alumni surveys.

28. Learning Taxonomies: A classification system dealing with varying degrees of cognitive complexity, skill complexity, and the complexity of the value system adopted when acting or responding to people, events, and environments.

29. Lesson Learning Outcomes (LLOs): These are the outcomes to be achieved upon completion of a lesson. The lesson outcomes are systematic formative measures for developing students' attainment of the CLOs.

30. Measurement: Measurement is used to provide a score as a product of measuring and quantifying tangible and intangible attributes and/or learning outcomes.

The measuring tools used are those described in the assessment instruments section.

31. Norm-referenced Assessment (NRA): Norm-referenced assessment is an assessment approach where students' grades are determined by comparing their performance to other students' performance based on the normal bell-shaped curve.

32. Outcome/Performance Indicators: See Performance Criteria/Indicators.

33. Outcomes-Based Assessment (OBA): An integrated, valid, reliable, fair, continuous (rather than continual and judgemental testing), and aligned approach to collecting evidence of students' learning for the purpose of improvement, focusing more on formative assessment and providing timely feedback.

It considers student diversity and employs multiple and diverse assessment methods. It is criterion-referenced, where the learning outcomes and the benchmark standards become the assessment criteria when judgement is made at the end of a course or programme.

34. Outcomes-Based Education (OBE): An approach to education that begins with clearly focusing on high-quality, culminating demonstrations of significant learning in context, and organising everything in an educational system around what is essential for all students to be able to do successfully at the end of their learning experiences.

This means starting with a clear picture of what is important for students to be able to do, then organising the curriculum, instruction, and assessment to make sure this learning ultimately happens for all students.

35. Outcomes-Based Grading (OBG): An approach to making judgements on students' performance quality in a course based on how well they have attained the learning outcomes.

Course grades are assigned based on either indirect and analytical grading of assessment tasks aligned to the learning outcomes, or direct and holistic grading of the learning outcomes.

36. Peer Assessment: Peer assessment involves students being responsible for making assessment decisions and judgements on other students' work.

It is an important part of formative assessment that should take place especially in group work, where it becomes a way in which the group assesses itself.

This form of assessment assists students' reflections and helps the group members understand that the decisions they make regarding the quality of their work are their own, and that they should take responsibility for improving their work.

As with any responsibility, the skill of peer assessment should be developed incrementally (step-by-step) by the educator.

37. Performance Assessment: An assessment method that uses student activities or products, as opposed to tests or surveys, to evaluate students' knowledge, skills, and development.

Instruments include essays, oral presentations, exhibitions, performances, and demonstrations.

Examples include:

i. reflective journals (daily/weekly);
ii. capstone works and experiences;
iii. demonstrations of student work (e.g., acting in a theatrical production, playing an instrument, observing a student teaching a lesson); and
iv. products of student work (e.g., Art students produce paintings/drawings, Journalism students write newspaper articles, Geography students create maps, and Computer Science students write computer programs).

38. Performance Criteria/Indicators: Performance criteria or performance indicators are specific, measurable statements identifying the specific knowledge, skills, attitudes, and/or behaviours students must demonstrate as indicators of achieving the outcomes.

Performance criteria are statements that define learning outcomes and enable faculty to measure student competency.

Each performance criterion must also specifically describe an acceptable level of measurable performance. For performance criteria that are not directly assessable, indirect indicators of performance can be identified.

39. Performance Target: Specifies the threshold score and the threshold frequency that indicate the effectiveness of a programme.

For example, an indicator that the programme is effective in achieving programme learning outcomes related to acquiring and applying knowledge and understanding could be a target that 60% of the students score 70 or more in a programme exit examination.

Another example could be to target 80% of the students indicating a score of 4 or higher on the Likert scale in a programme exit survey, as an indicator of effectiveness for programme learning outcomes related to lifelong learning.
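As an illustration only (not part of the MQA text), the exit-examination target above can be expressed as a simple check. The function name and the cohort scores below are hypothetical:

```python
# A minimal sketch of checking a performance target such as
# "at least 60% of students score 70 or more in the exit examination".

def target_met(scores, threshold_score=70, threshold_frequency=0.60):
    """True if the share of students at or above the threshold score
    reaches the threshold frequency."""
    attained = sum(1 for s in scores if s >= threshold_score)
    return attained / len(scores) >= threshold_frequency

exit_scores = [82, 74, 66, 90, 71, 58, 77, 69, 85, 73]  # hypothetical cohort
print(target_met(exit_scores))  # True: 7 of 10 students (70%) scored 70+
```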

40. Portfolio: An accumulation of evidence about individual proficiencies, especially in relation to the performance criteria for each of the programme learning outcomes.

Examples include, but are not limited to, samples of student work such as artworks, multimedia projects, journals, exams, papers, presentations, videos of speeches, and performances.

41. Rubrics: A scoring/grading tool that contains a list of criteria and benchmark standards and is used to score or grade assessment tasks or learning outcomes.

Descriptors of the performance quality, from the highest quality to unacceptable quality, for each criterion or learning outcome guide both the students in identifying their shortcomings and the educators in reliably scoring and grading the performance/product.

42. Self-assessment: Self-assessment is a learning experience in which students come to understand the assessment criteria, enabling them to take responsibility for making judgements about their own learning. This gives learners the opportunity to reflect on what they do.

Students benefit most from the use of logs, diaries, and digital recording devices to record their thoughts on the quality of their work so that they can improve themselves.

43. Soft Skills: The generic skills or attributes that employers value and that students require in their professional and societal engagement.

Examples include the ability to communicate, manage information, manage time, manage resources, engage harmoniously with others, provide leadership, and become responsible and active team members.

44. Student-Centred Learning (SCL): Learning environments and approaches that focus on students.

This means knowing about learners' learning preferences, intelligences, existing knowledge, interests, listening and writing skills, family and cultural background, and other relevant information that can become a barrier to learning or that can enhance their learning and the learning of others in the learning community.

Instructional approaches employed in developing their potential must be balanced and diverse to cater to the diversity of learners.

45. Summative Assessment: Summative assessments are used to evaluate student learning, skill acquisition, and academic achievement at the conclusion of a defined instructional period, typically at the end of a project, unit, course, semester, or programme.

It determines periodically, at particular points in time, what students know and do not know relative to the content standards.

Assignments are presented to students at specified periods throughout the curriculum delivery process.

It aims to measure and evaluate students' overall achievement against the goals of the learning outcomes, with grades/scores given to evaluate performance. Summative assessment methods are also linked to accountability purposes and are high-stakes, meaning they carry a high point value.

The results are usually defining; for instance, they can determine whether a student passes the course, gets a promotion, or secures admission.

The goal of summative assessment, usually found in high-stakes examinations, is to measure the level of success, performance quality, or proficiency, i.e., how well students have achieved the learning outcomes at the end of an instructional unit or a course/module/programme, by comparing them against a standard or benchmark.

46. Test: A test is a sample of items or constructs that measures performance in a specific domain.

47. Type of Assessment: Types of assessment typically include:

i. Formative;
ii. Summative;
iii. Diagnostic;
iv. Interim/benchmark;
v. Norm-referenced; and
vi. Criterion-referenced.

48. Weightage: A quantitative way of assigning the significance or weight of a learning outcome in a list of learning outcomes for a course/module.

Generally, the weightage is determined by dividing the number of teaching hours spent covering each course learning outcome by the total teaching hours, then expressing the result as a percentage.

This weightage indicates how essential the learning outcome is and how it affects the way the final grade for a course/module is determined in view of the course learning outcomes.
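Read as a formula (a restatement of the description above; the symbols are introduced here purely for illustration), the weightage of course learning outcome i in a course with n CLOs is:

```latex
w_i = \frac{h_i}{\sum_{j=1}^{n} h_j} \times 100\%
```

where h_i is the number of teaching hours spent covering CLO i. For instance, with the contact hours used in Table 2 (21, 12.6, and 8.4 hours, totalling 42), the weightages work out to 50%, 30%, and 20% respectively.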

ABBREVIATIONS

AT Assessment Task
APEL Accreditation of Prior Experiential Learning
APEL.A Accreditation of Prior Experiential Learning for Access
APEL.C Accreditation of Prior Experiential Learning for Credit Award
APEL.M Accreditation of Prior Experiential Learning for Micro-credentials
APEL.Q Accreditation of Prior Experiential Learning for Award of Academic Qualifications
CA Constructive Alignment
CLO Course Learning Outcomes
COPIA Code of Practice for Institutional Audit
COPPA Code of Practice for Programme Accreditation
COPPA:ODL Code of Practice for Programme Accreditation: Open and Distance Learning
COPTPA Code of Practice for TVET Programme Accreditation
CPD Continuing Professional Development
CQI Continuous Quality Improvement
ELT Effective Learning Time
EXCEL Experiential Learning and Competency-Based Education Landscape
GGP: PDD Guidelines to Good Practices: Curriculum Design and Delivery
HEIPs High Impact Educational Practices
LMS Learning Management System
LLO Lesson Learning Outcomes
LOC Learning Outcomes Cluster
LOD Learning Outcomes Domain
MMS Malaysian Micro-credential Statement
MOHE Ministry of Higher Education
MOOC Massive Open Online Courses
MQF Malaysian Qualifications Framework
NOBLe National Outcomes Based Learning
OBA Outcomes-Based Assessment
OBE Outcomes-Based Education
OSCE Objective Structured Clinical Examination
ODL Open and Distance Learning
PBL Problem-Based Learning
PEO Programme Educational Objective
PIAAC Programme for the International Assessment of Adult Competencies
PLO Programme Learning Outcomes
SIM Self-instructional Material
SLT Student Learning Time
SOLO Structure of Observed Learning Outcomes
TnL Teaching and Learning
TOS Table of Specifications
TVET Technical and Vocational Education and Training
SDG 4 UNESCO’s Sustainable Development Goal 4
WBL Work-based Learning

INTRODUCTION

A guideline for the establishment of good practices is essential, especially in ensuring the provision of quality higher education. This guideline to good practices for the assessment of students' learning provides a fundamental understanding of assessing students according to basic principles. This includes ensuring fairness, relevance, and grounded ethical values that uphold integrity, validity, and reliability. The guideline has been developed with the following vision and mission to benefit higher education providers (HEPs) in implementing assessment.

Vision

Ensuring that HEPs are well informed, with a basic understanding of assessing students' learning, which is key to confirming that learning has taken place. It is envisaged that a clear understanding of the basic principles of assessing students will help HEPs provide quality education in light of the aspirations of the nation.

Mission

Providing a comprehensive guideline that gives room for future development of assessment in higher education.

PART 1
THE OVERVIEW OF ASSESSMENT IN HIGHER EDUCATION

Assessment, whether done informally or formally, is a human activity that involves interaction, in which descriptive feedback is given for learners' improvement.

When numbers are assigned to measure learning during an assessment, the scores can be used in an evaluation to form a judgement on whether the learning that has taken place is satisfactory based on the standard set.

Various considerations govern the principles of assessment, including equity, fairness, validity, reliability, and feasibility, as well as freedom from bias, with clarity in the measurement and the assessment tool used to measure the stipulated outcomes.

The fundamentals of assessment in outcome-based education require an approach that stresses the notion of beginning with the end in mind, since assessment drives learning.

In facing the challenges of a volatile, uncertain, complex, and ambiguous world while adapting to the constant changes of the industrial revolution, assessment in education must also be revisited.

Assessment must be expanded to include not only the conventional method of assessment, which usually comes in the form of a pencil/pen-and-paper test (also known as high-stakes assessment), but also various alternative ways of assessing in a meaningful, exciting, and authentic way in real-world contexts.

Therefore, this revised version of the Guidelines to Good Practices: Assessment of Students’ Learning (GGP: AoSL) is a document developed to assist Higher Education Providers (HEPs) in meeting the standards of Area 2 of the Code of Practice for Programme Accreditation (COPPA), the Code of Practice for Programme Accreditation: Open and Distance Learning (COPPA:ODL), the Code of Practice for TVET Programme Accreditation (COPTPA), and the Code of Practice for Institutional Audit (COPIA).

Figure 1: The Seven Areas of the COPPA 2nd edition (MQA, 2017)

COPPA is concerned with the practices of HEPs in curriculum design and delivery, while COPIA is primarily concerned with the institutional processes applied in curriculum development and delivery.

The second edition of the COPPA, which also relates and is mapped to standards for diverse programmes and themes, namely open and distance learning (COPPA:ODL) as well as technical and vocational education and training (COPTPA), addresses the need to manage and conduct assessment in these diverse contexts.

In reviews for both programme accreditation and institutional audit, the assessors' concern is with the procedures and practices adopted by institutions in the areas covered by the Codes, and whether these match the provisions of the Codes.

In addressing the needs of the ever-changing context of the learning environment, programmes, and diverse learners, various ways of assessment relevant to different areas need to be considered, such as Area 1, which concerns programme design and delivery, and Area 7, which relates to continuous quality improvement for the course(s) relevant to the programme.

HEPs are discouraged from simply copying the guidelines and samples/examples given in the document or appendices. Instead, HEPs must strive to understand and develop their own curriculum design, delivery processes, and assessment to best fit their needs and requirements and those of their students.

The GGP: AoSL is premised on the fact that assessment goes hand in hand with students' learning.

Furthermore, research (see, for example, Biggs, 2003) suggests that assessment drives student learning and directly influences students' approaches to studying.

For example, if assessment tasks for a particular programme and course require students to reproduce or regurgitate information, students will study only to reproduce information.

Figure 2: Assessment of Learning & Teaching Activities

Figure 2 shows the role of assessment in learning and teaching activities for the attainment of outcomes. Since assessment is an integral part of the learning and teaching process, the assessment methods or outcome indicators employed must be constructively aligned with the Programme Learning Outcomes (PLOs) and Course Learning Outcomes (CLOs).

Ensuring this alignment will encourage students to adopt learning approaches that result in the achievement of the CLOs and hence assist in the attainment of the PLOs.

This document covers the following areas, which are divided into parts:

i. The Overview of Assessment in Higher Education (Part 1);
ii. Assessment of Students’ Learning (Part 2);
iii. Assessment Management (Part 3);
iv. Assessment in Diverse Contexts (Part 4); and
v. Communicating Assessment and Outcomes (Part 5).

Part 1 introduces the notion of assessment and the document itself.

Parts 2 and 3 provide a fundamental understanding of the alignment of learning outcomes in assessments, as well as of managing the assessments that can be carried out.

Part 4, an added component of the GGP: AoSL, sets out various possible contexts for assessment.

This is followed by Part 5, the concluding part, which relates to how information about students' assessment and their learning can be communicated to them and to various other stakeholders so that continuous quality improvement can be made for further development.

The glossary provides not only the basic terms used in the document but also a brief explanation of these concepts.

Figure 3 shows the relationship between the assessment of students' learning and the attainment of CLOs and PLOs as the means to support the attainment of the Programme Educational Objectives (PEOs).

It indicates the need to align assessment methods with the attainment of the learning outcomes (LOs), and the need for a systematic student assessment process within the institution, in diverse contexts, with communication for improvement.

The discussion provided in the five parts of this document addresses Area 2 (Assessment of Student Learning) of the Codes of Practice, as illustrated in Figure 3.

Figure 3: Assessment of Students’ Learning and the Structure of the Guidelines

PART 2
ASSESSMENT OF STUDENTS’ LEARNING

An Outcomes-Based Assessment is criterion-referenced, where the LOs are the criteria to be assessed. In other words, the process of grading a course is based on the evaluation of student learning against a set of predetermined criteria. This contrasts with norm-referenced assessment, where students’ achievements are compared with those of others. Thus, OBE requires academic staff to focus on the achievement of CLOs based on the predetermined criteria.
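To make the contrast concrete, the following sketch compares the two grading approaches. The cut-offs, cohort scores, and top-20% rule are hypothetical illustrations, not values prescribed by this GGP:

```python
# Criterion-referenced: the grade depends only on the student's own score
# measured against fixed, predetermined criteria.
def criterion_referenced_grade(score):
    cutoffs = [("A", 80), ("B", 65), ("C", 50)]  # hypothetical cut-offs
    for grade, minimum in cutoffs:
        if score >= minimum:
            return grade
    return "F"

# Norm-referenced: the grade depends on standing relative to the cohort,
# e.g., only the top 20% of the bell curve receive an "A".
def norm_referenced_grade(score, cohort_scores, top_fraction=0.2):
    rank = sorted(cohort_scores, reverse=True).index(score)
    return "A" if rank < len(cohort_scores) * top_fraction else "below A"

cohort = [55, 62, 68, 71, 74, 77, 80, 83, 88, 93]
print(criterion_referenced_grade(74))      # "B", regardless of the cohort
print(norm_referenced_grade(74, cohort))   # "below A": rank depends on others
```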

Outcomes-Based Assessment (OBA) involves choosing assessment tasks or instruments that are constructively aligned with the attainment of the LOs. It also means choosing assessment methods and tasks that will support learners in their learning progress and that will validate their achievement of the LOs at the end of the learning.

OBA is a systematic assessment approach to find out how well students attain the
intended CLOs and PLOs. The assessment of students’ learning involves collecting
evidence of outcome attainment both at the course and programme level. Evidence
gathered through OBA is used to judge how well the criteria specified by the LOs are
attained. The attainment of the CLOs is used to infer the attainment of the specific PLO
in the programme. Hence, students should be informed of their PLO attainment in each
semester to allow them to work on areas that need improvement.

Assessment is the process of finding evidence that the LOs, which are the minimum
performance or competence level, have been achieved when students have
successfully completed a certain course or graduated from a particular programme
offered by the HEP.

Assessment, in general, serves to:

i. promote learning;
ii. measure performance by awarding grades that indicate whether and how
well a particular student has attained the stated LOs;
iii. determine whether a particular student is sufficiently well prepared in a
subject area to proceed to the next level of instruction;

iv. provide feedback to students that indicates levels of attainment and
diagnoses misunderstandings and learning difficulties; and
v. provide feedback to teaching staff to identify and diagnose ineffective
teaching methods/techniques.

2.1 NOTION OF ACTIVE LEARNING AND CONSTRUCTIVE ALIGNMENT

Figure 4: The OBE’s Principles

The process shown in Figure 4 comprises the course assessment plan, its execution, the required report documentation, analysis of students’ performance, and intervention as required.

2.1.1 In completing this process, students' performance reports must be shared with them to ensure that they receive feedback on their PLO attainment, monitor their progress, and take the necessary actions to attain all PLOs upon graduation.
2.1.2 The process of monitoring, evaluating, and analysing is an iterative
process as part of closing the loop.

2.1.3 This will also support CQI reporting for each course, thus embracing
the OBE implementation. Please refer to the MQA Guidelines to Good
Practices: Monitoring, Reviewing, and Continually Improving
Institutional Quality.
2.1.4 The implementation of OBE requires active learning and student-
centred learning approaches.
2.1.5 Active learning refers to a broad range of teaching strategies that
engage students as active participants in their learning during class
time with their instructors.
2.1.6 Strategies may involve students working together during class, as well as individual work and/or reflection.
2.1.7 Outside of class time, students continue to be actively engaged in
pursuing knowledge and skills.
2.1.8 To ensure the attainment of CLOs, proper assessment planning must
be done and meet the principles of constructive alignment.

2.2 CONSTRUCTIVE ALIGNMENT


2.2.1 Constructive alignment is a principle used to devise teaching and
learning activities and assessment tasks that directly address the
learning outcomes intended in a way not typically achieved in
conventional lectures, tutorial classes, or assessments.
2.2.2 The term "construct" refers to students constructing and structuring
their understanding and personally creating meaning about what is to
be learned.
2.2.3 Alignment refers to a learning environment set up by the course owner
to allow students to engage with the action verb of the learning
outcomes meaningfully and to employ the same action verb again in
the assessment task to assess how well the outcomes are learned.
2.2.4 The key is that the components in the delivery of knowledge itself,
especially the teaching methods and their suitable assessment tasks,
are aligned with the learning activities and the intended outcomes.

Figure 5: Constructive Alignment

2.2.5 Courses generally follow the progression of complexity in abilities


(cognitive, psychomotor, and affective) from lower to higher orders.
2.2.6 Assessing students' learning in a course may require varied
assessment methods.
2.2.7 The principle of constructive alignment is also applied in assessing
other learning outcomes, including PEOs and PLOs.
2.2.8 Appendix 14 shows examples of PEOs, PLOs, CLOs, and some constructively aligned assessment methods and indicators. The assessment methods in Appendix 14 are only samples of methods that may be employed to assess students' learning.
2.2.9 The assessment methods chosen must be aligned with the process of
finding evidence of the LOs’ attainment and must be consistent with
the student learning time required to complete the task.
2.2.10 They must also consider practical issues in scoring and providing
feedback to promote learning.

2.2.11 Programme learning outcomes (PLO) state what students know and
are able to do upon completion of the programme and are derived from
MQF 2.0 (2018) LO domains, while outcome indicators are
assessment tools used to collect evidence of students’ performance
and attainment after pursuing a study programme.

2.3 NOTION OF STUDENT LEARNING TIME AND ASSESSMENT


2.3.1 In determining student learning time (SLT) for a course, careful
considerations are made about the apportionment of the SLT required
to achieve each CLO.
2.3.2 The teaching and learning activities include both guided and non-
guided learning. All activities are targeted towards the achievement of
the CLOs. Thus, SLT should be the basis for formulating the weightage
for assessments to measure the attainment of each CLO.
2.3.3 The weightage of assessment tasks must be proportionate to the
emphasis on the CLOs, the learning activities/tasks and the
importance of the contents to the CLO attainment.
2.3.4 It is important to note that the weightage must adhere to the stated
assessment weighting for the course as approved by an academic
committee.
2.3.5 In the context of OBE, which is designed around a set of predetermined outcomes, it is crucial to link teaching-learning activities to CLOs to allow inferences on students' level of achievement for each CLO and PLO.
2.3.6 The assessment methods are to be aligned with the outcomes and the instructional delivery, linking the learning activities and assessment tasks to the CLOs.
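As a worked illustration of 2.3.2-2.3.4 (a sketch only; the hours mirror the contact hours in Table 2, and the 100% total is assumed to be the committee-approved course weighting):

```python
# Apportion assessment weightage to CLOs in proportion to the learning time
# allocated to each CLO, then check the total against the approved weighting.

slt_hours = {"CLO1": 21.0, "CLO2": 12.6, "CLO3": 8.4}  # hours per CLO
approved_total = 100                                    # approved course weighting (%)

total_hours = sum(slt_hours.values())
weightage = {clo: round(approved_total * h / total_hours)
             for clo, h in slt_hours.items()}

assert sum(weightage.values()) == approved_total, "adjust rounding"
print(weightage)  # {'CLO1': 50, 'CLO2': 30, 'CLO3': 20}
```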

Note: Information on assessment methods and SLT based on the course learning outcomes can also be found in the course information that HEPs provide (refer to the Table 4 template provided by MQA), in which HEPs summarise the course information. However, Table 1, Table 2, and the exemplar given in Appendix 12 of this GGP serve merely as explanations of the various components involved in designing a course assessment plan and determining the weightage for assessment.

2.4 ASSESSMENT PLANS


2.4.1 The assessment provides feedback on the degree to which CLOs are
achieved.
2.4.2 The LOs for every lesson are mapped to CLOs to contribute to the
achievement of one or more of the CLOs.
2.4.3 The topics/contents to be taught are determined based on the lesson
LOs to be achieved.
2.4.4 The course instructor must ensure the content assessed represents
the course’s content.
2.4.5 HEPs may adopt different methods of determining the manner in which an assessment task or examination meets the CLOs.
2.4.6 The template and requirements for the assessment plan may vary
across HEPs, but it should be able to illustrate the overall assessments
planned to meet the CLOs.

Table 1: Description of Course Assessment Plan

Columns: (i) CLO statement; (ii) Taxonomy domain; (iii) PLO; (iv) Topics/Content; (v) Contact hours; (vi) Teaching and learning method; (vii) Assessment methods; (viii) Weightage.

Description:
i. The CLO statement that is partly attributed towards the PLO; it contains the objective of the course's learning outcomes with the appropriate level of verbs, which informs the needed resources, manpower, reading materials, contact hours for teaching and learning, and the manner of assessment.
ii. The cognitive/affective/psychomotor level.
iii. The PLO, identified based on the MQF 2.0 (2018), towards whose achievement the CLO's objective contributes.
iv. The topic and content as reflected in the needed Body of Knowledge.
v. The contact hours needed to complete the CLO.
vi. The manner of knowledge/skill transfer (lectures/tutorials/workshops/lab work, etc.) and the mode of delivery: conventional/online/hybrid.
vii. The assessment appropriate to measure the CLO, to ascertain the evidence of learning of the topic/content.
viii. Commonly measured as a percentage to ascertain the needed values to complete the CLOs (please refer to the glossary).
Table 2: Example of ToS Aligned Course Assessment Plan

CLO 1: Discuss the concept, principles, issues, and challenges in visual art assessment and evaluation.
- Taxonomy domain: C5; PLO: 2; Topics/Content: list of topics related to the CLO; Contact hours: 21
- Teaching and learning method: Interactive lecture / cooperative and collaborative learning
- Assessment tasks: Case study (20%); Final examination (30%)
- Specific tasks related to MOHE/MQF 2.0 (2018): Case study (20%): marks are for deliberating on and analysing the issues, trends, and challenges related to visual assessment and evaluation from pre-school up to higher education. Final examination (30%): factual testing.

CLO 2: Evaluate strategies and approaches in measuring learning outcomes through assessment and evaluation.
- Taxonomy domain: C5; PLO: 7; Topics/Content: list of topics related to the CLO; Contact hours: 12.6
- Teaching and learning method: Case analysis
- Assessment tasks: Case analysis (10%); Final examination (20%)
- Specific tasks related to MOHE/MQF 2.0 (2018): Case analysis (10%): marks are for students' critical evaluation in assessing learning outcomes through samples of quantitative and qualitative data, according to appropriate processes and techniques.

CLO 3: Integrate digital technologies and appropriate software for diagnostic, formative, and summative assessment and evaluation.
- Taxonomy domain: A4, P1; PLO: 6; Topics/Content: list of topics related to the CLO; Contact hours: 8.4
- Teaching and learning method: Independent learning
- Assessment task: Portfolio (20%)
- Specific task related to MOHE/MQF 2.0 (2018): The use of an e-Portfolio as evidence, focusing on the sub-attributes of new ideas, curation, articulation, and tools.
2.5 TABLE OF SPECIFICATIONS
2.5.1 A Table of Specifications (ToS) is a tool/document used to ensure that an examination or test measures the content it is intended to cover.

Figure 6: The Purpose of TOS

2.5.2 In planning a ToS for a test/examination, the CLOs to be assessed through the test/examination and the topics to be covered are identified, as indicated in the course’s assessment plan.
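As a minimal sketch of what a ToS captures (all topics, levels, and item counts below are hypothetical), examination items can be distributed across topics in proportion to contact hours:

```python
# Build a simple Table of Specifications: distribute 40 examination items
# across topics in proportion to contact hours, noting the cognitive level
# at which each topic is assessed.

topics = {                                   # topic: (contact hours, level)
    "Concepts of assessment": (6, "Understand"),
    "Constructive alignment": (9, "Apply"),
    "Designing rubrics": (15, "Create"),
}
total_items = 40
total_hours = sum(hours for hours, _ in topics.values())

print(f"{'Topic':<26}{'Level':<12}{'Items'}")
for topic, (hours, level) in topics.items():
    items = round(total_items * hours / total_hours)
    print(f"{topic:<26}{level:<12}{items}")
```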

2.6 ASSESSING LEARNING OUTCOMES
2.6.1 In OBE, a learning outcome contains a verb that signifies the domain
and level of the outcome, whether it is cognitive, affective, or
psychomotor.
2.6.2 The level of the verb is ascertained according to the taxonomy that is
used in the design of the course.
2.6.3 For example, in Table 2 (refer to 2.4), Bloom’s Taxonomy is commonly
used for the cognitive domain, Simpson’s for the psychomotor domain,
and Krathwohl’s for the affective domain. (Refer to Appendix 1)
2.6.4 HEPs may decide to use any of the taxonomies for the cognitive, affective, and psychomotor domains, provided they are applied consistently in delivery (refer to Appendix 1 for an example of a SOLO taxonomy-aligned assessment plan).
2.6.5 While written examination questions can be used as an assessment method, they are not the only approach.
2.6.6 This is especially true for complex skills like those in the MQF 2.0 (2018)
Learning Outcomes clusters.
2.6.7 At the same time, to assess using alternative methods, the skills to be
assessed should be defined according to how they are used and applied
in the course.
2.6.8 To help course owners define the skills to be assessed, the scholarly approach of referring to the research literature and the relevant principles or theories is recommended.
2.6.9 The psychomotor domain (Simpson, 1972) includes physical
movement, coordination, and use of the motor-skill areas.
2.6.10 Development of these skills requires practice and is measured in terms
of speed, precision, distance, procedures, or techniques in execution.
2.6.11 Assessment for each of the MQF 2.0 (2018) Learning Outcomes
clusters:

i. Knowledge and understanding


ii. Cognitive skills
iii. Functional work skills with a focus on:

a. Practical skills
b. Interpersonal skills

c. Communication skills
d. Digital skills
e. Numeracy skills
f. Leadership, autonomy, and responsibility
iv. Personal and entrepreneurial skills
v. Ethics and professionalism.

2.6.12 For more examples, refer to Quick Reference 5 Clusters of Learning


Outcomes MQF 2.0 (2018), Rubrik PNGK Bersepadu ICG PA (MOHE,
2016) and Programme Standards.

2.7 KNOWLEDGE AND UNDERSTANDING
2.7.1 Knowledge and understanding refers to a systematic understanding of facts, ideas, information, principles, concepts,
theories, technical knowledge, regulations, numeracy, practical skills, tools to use, processes, and systems.
2.7.2 Knowledge and understanding comprise the knowledge dimension and the cognitive dimension, as shown in Table 3.

Table 3: Revised Bloom's Taxonomy for Cognitive Domain

Knowledge Dimension x Cognitive Dimension (Remember, Understand, Apply, Analyse, Evaluate, Create):

Factual: Remember facts; Understand facts; Apply facts.
Conceptual: Remember concepts; Understand concepts; Apply concepts.
Procedural: Remember procedures; Understand procedures; Apply procedures.
(For the Analyse, Evaluate, and Create levels, the factual, conceptual, and procedural rows combine: Analyse using facts, concepts, and procedures; Evaluate using facts, concepts, and procedures; Create using facts, concepts, and procedures.)
Metacognitive: Remember metacognitive strategies; Understand metacognitive strategies; Apply metacognitive strategies; Analyse metacognitive strategies; Evaluate metacognitive strategies; Create metacognitive strategies.
2.7.3 Although the original intent of the taxonomy was to serve as a guide for
designing learning activities, it can also be used in designing assessments.
2.7.4 As seen in Table 3, there are two dimensions to the taxonomy.
2.7.5 The Cognitive Dimension signifies increasing cognitive process
complexity—from lower-order to higher-order thinking skills.
2.7.6 Referring to Table 3, there are six levels of cognitive dimension: remember,
understand, apply, analyse, evaluate, and create. Table 4 provides
examples of assessments for each level.
2.7.7 The Knowledge Dimension categorises knowledge into four dimensions
that learners may be expected to learn, ranging from the concrete to the
abstract.
2.7.8 These types of knowledge that can be tested are factual, conceptual,
procedural, and metacognitive.

Table 4: Examples of Assessments Based on Bloom’s Taxonomy

Remember: Written examination questions to recall facts or describe an event.

Understand: Conduct a discussion, oral presentation, or written examination questions to explain a concept or a phenomenon.

Apply: Conduct an experiment based on the instructions, and calculate using a theory to find specific values.

Analyse: Questions or activities that require comparison and contrast, justify decisions made in a case study or project, write a critique of an essay/movie, etc.

Evaluate: Assess the performance or suitability of a method or tool, etc., to select the best or most appropriate for given contexts in a project.

Create: Design based on various aspects; write essays that synthesise various elements of knowledge and contexts.

40
Figure 7: Elaboration of Revised Bloom's Taxonomy for Cognitive Domain

2.8 COGNITIVE SKILL


2.8.1 In MQF 2.0 (2018), cognitive skills are thinking skills that give a person the
ability to utilise knowledge and skills in various intellectual capacities, such
as problem-solving, creative and critical thinking, etc.
2.8.2 The higher-order thinking skills in the cognitive domain are also cognitive
skills.
2.8.3 Cognitive skills may be assessed using methods other than written
examinations. For example, complex problem solving is better assessed
through projects where students work on solving complex real-world
problems.
2.8.4 An assessment can be made of the problem-solving process and not just
of the final product or output.
2.8.5 Major process elements can be defined and assessed as part of problem-
solving, such as problem identification, project management, depth of
understanding, and creativity of the solution.

2.8.6 One way of assessing cognitive skills is by implementing problem-based
learning (PBL), which has been shown to develop self-directed learning
and problem-solving skills. The first two assessment tasks in Table 5 are
conducted during the "meet the problem" phase of PBL implemented in a
typical class.

Table 5: Example for Assessing Problem Solving and Learning Process for PBL

Assessment Task: Problem Restatement
- Performance Level 1: Copy/rewrite sentences from the problem. Summarise the problem without demonstrating a clear understanding of it.
- Performance Level 2: Provide general statements of the problem.
- Performance Level 3: Demonstrate a comprehensive understanding of the problem. Reproduce the gist of the problem in his/her own words.

Assessment Task: Knowledge and information gap identification (what we know, what we need to know, learning issues)
- Performance Level 1: Unable to identify prior knowledge and missing data or information needed to solve the problem. Identify very minimal/irrelevant learning issues.
- Performance Level 2: Mixed-up learning issues to be learned (missing data or information, but concepts already known).
- Performance Level 3: Able to identify all prior knowledge needed to solve the problem. Able to identify knowledge gaps through all the necessary learning issues relevant to the problem.

Assessment Task: Peer teaching notes of learning issues
- Performance Level 1: Copy and paste from books or other resources. List irrelevant questions or issues to be verified.
- Performance Level 2: Summary of learning issues mostly at surface understanding, lacking examples. Questions raised to be verified lack direction.
- Performance Level 3: Deep understanding is evident in the summary of understood concepts, complete with proper examples. Organised and clear questions and issues to be verified.

Assessment Task: Action plan using Gantt charts; other suitable tasks
- Example of additional assessment tasks: performance level descriptions can be filled in according to the given learning outcomes and expected standard.

2.8.7 Peer teaching is assessed using individual notes submitted before the
discussion on new concepts needed to solve the problem. In small-group
PBL, peer teaching is normally assessed through tutor observation.

2.9 FUNCTIONAL SKILL
2.9.1 Assessment of Practical Skills

2.9.1.1 Practical skills are workplace skills that can be hands-on or
organisational, such as conducting laboratory experiments, handling
equipment or machinery, using devices or software, or performing
sports, music, drama, singing, or dancing.
2.9.1.2 The development of these skills requires practice and its
application, which is measured in terms of procedures or
techniques in their execution.
2.9.1.3 Practical skills are assessed based on task coordination,
accuracy, and consistency.
2.9.1.4 The assessment tasks involve several operations that are planned in
sequence, starting from observation and progressing through the mastery
of a skill up to the highest level, which is invention.
2.9.1.5 On the other hand, practical skills related to entrepreneurship
include the ability to determine opportunities, conduct market research,
organise and adapt projects, and manage risks (Kozlinska et al., 2020).
Table 6 provides examples for addressing CLOs, TLA, and Assessment Tasks
based on Simpson's Psychomotor Domain.
2.9.1.6 Possible assessment tasks are performance observation, product
observation, or simulation.
Table 6: Example of Psychomotor Domain in Machining Course

Perception (demonstrates an awareness or knowledge of the behaviours needed to carry out the skill)
- Course Learning Objectives: Observe machining techniques for material removal.
- Teaching and Learning Activities: Demonstration, discussion, question and answer.
- Assessment Tasks: Recognise the significant components of the lathe machine and tools; identify steps in preparing for the machining process.

Set (readiness to perform the task)
- Course Learning Objectives: Organise the steps in conducting a machining operation.
- Teaching and Learning Activities: Demonstration, discussion, practical task.
- Assessment Tasks: Set up the machining tool; start the machine and record the operational observation; test and adjust the speed settings to control the shape of the surface.

Guided Response (the early stage of learning a complex skill; the first attempt at a physical skill, including imitation and trial and error; the learner can complete the steps involved in the skill as directed)
- Course Learning Objectives: Demonstrate correctly the machining procedures.
- Teaching and Learning Activities: Demonstration, video presentation, video simulation, discussion, practical task, coaching, feedback.
- Assessment Tasks: Practise the machining process with correct procedures and meet the specifications; adapt the correct usage of machining tools; examine the completed workpieces for defects.

Mechanism (the ability to convert the learned responses into habitual actions so the movements can be performed with a medium level of proficiency, assurance, and confidence)
- Course Learning Objectives: Perform safely and appropriately the machining operations to remove material from a workpiece.
- Teaching and Learning Activities: Practical task, discussion, video presentation.
- Assessment Tasks: Demonstrate the correct procedure of setting and operating the machine and tools for the machining process; adapt the work procedures to meet the given specifications; measure completed workpieces to verify conformance to the specifications.

Complex Overt Response (the ability to skilfully perform complex movements correctly; complex movements are performed quickly, accurately, and with minimal wasted effort)
- Course Learning Objectives: Conduct accurately the machining process to remove material from the workpiece according to the specifications.
- Teaching and Learning Activities: Discussion, practical task.
- Assessment Tasks: Organise the machining procedures and tools to shape a given object from the workpiece.

Adaptation (the ability to modify movements to meet new or special requirements)
- Course Learning Objectives: Modify the machining procedures to suit the design and material specification of the given task.
- Teaching and Learning Activities: Discussion, machining project.
- Assessment Tasks: Adopt the technical concepts and skills of machining procedures in actual projects.

2.9.2 Interpersonal Skills

2.9.2.1 Interpersonal skills are defined in the MQF 2.0 (2018) manual
as a range of social skills such as interactive communications,
relationships and collaborative skills in managing relationships
in teams and within the organisation, networking with people of
different cultures, and social skills/etiquette.
2.9.2.2 Expected outcomes such as teamwork skills could be classified under
interpersonal skills.
2.9.2.3 To assess teamwork skills, multiple methods can be used, such as
in-class peer-rating observation, recorded video of team discussions,
logbooks, minutes of meetings, and learning portfolios.
2.9.2.4 The learning outcomes and levels of attainment may also use
the affective domain taxonomy.

2.9.3 Communication Skills

2.9.3.1 The MQF 2.0 (2018) manual defines communication skills as the ability
to convey information, reports, or ideas professionally and logically in
oral and written forms using suitable language.
2.9.3.2 The communication process must be practical and in
appropriate forms, in various mediums, and to various
audiences in various settings.
2.9.3.3 The ability to communicate in more than one language is also
encouraged.
2.9.3.4 Table 7 shows an example of a team oral presentation, divided
into individual and team segments of the oral presentation
assessment.
2.9.3.5 Possible methods of assessing oral communication are
presentation, debate, discussion, and forum. Reports and term
papers may be used to assess written communication.

Table 7: Example of Assessment for Team Oral Presentation: Individual and Team

Individual

A - Stature & Appearance
- Scale 1: Lacks confidence; poor posture; shabbily attired; no eye contact.
- Scale 2: Somewhat confident; proper posture and attire; maintains eye contact.
- Scale 3: Confident; good eye contact and posture; smartly attired.

B - Presentation & Voice
- Scale 1: Not precise; mumbles and swallows words; too slow or too fast; poor intonation and low voice volume.
- Scale 2: Straightforward, but the voice sometimes trails off; sufficient speech rate and intonation.
- Scale 3: Clear and fluent; reasonable speech rate and volume.

C - Delivery
- Scale 1: Mispronounces most words; often stumbles; reads from notes/slides; does not show interest in the topic.
- Scale 2: Mispronounces certain words; somewhat fluent; depends on notes/slides; shows interest in sharing the topic.
- Scale 3: Very fluent; rarely mispronounces words; captures the audience; refers to notes sparingly; passionate about the topic.

D - Q&A
- Scale 1: Not able to answer; does not understand the topic.
- Scale 2: Able to answer but fumbles slightly, demonstrating understanding of the topic.
- Scale 3: Answers confidently, demonstrating clear and critical knowledge of the topic.

Group

E - Content
- Scale 1: Unsuitable and disjointed content; does not show understanding of the material presented.
- Scale 2: Suitable content; shows understanding but lacks integration of different sources.
- Scale 3: Well-developed content with proper elaboration and examples; shows good understanding and integration of various sources.

F - Slides
- Scale 1: Do not support the presentation; too wordy or distracting.
- Scale 2: Support the presentation; clear but dull or distracting in some places.
- Scale 3: Enhance and clarify the presentation; pleasing design that fits the purpose of the presentation.
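To show how rubric scales such as those in Table 7 can be converted into a final mark, the following Python sketch applies hypothetical criterion weights; the WEIGHTS values and the presentation_mark function are illustrative assumptions, not weightings prescribed by this GGP.

```python
# Hypothetical weights for the Table 7 criteria (A-F); they sum to 1.0.
# Actual weightings are for the course owner and vetting committee to set.
WEIGHTS = {"A": 0.15, "B": 0.15, "C": 0.20, "D": 0.20, "E": 0.20, "F": 0.10}

def presentation_mark(scores: dict[str, int], max_scale: int = 3) -> float:
    """Convert 1-3 rubric scale scores into a weighted percentage mark."""
    return round(100 * sum(WEIGHTS[c] * s / max_scale for c, s in scores.items()), 1)

# Example: level 3 on Delivery (C) and Q&A (D), level 2 on the other criteria.
print(presentation_mark({"A": 2, "B": 2, "C": 3, "D": 3, "E": 2, "F": 2}))  # 80.0
```

Publishing such a conversion together with the rubric makes the mark apportionment transparent to students.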

2.9.4 Digital Skills

2.9.4.1 Digital skills are essential for current and future graduates to
remain relevant in the current and future phases of industrial
transformation.
2.9.4.2 Digital skills encompass knowledge and skills related to using
information/digital technologies and literacy to support learning
and professional life.
2.9.4.3 The skills include sourcing and storing information, processing
data, digital design, using applications for problem solving and
communication, and ethics in applying digital skills.
2.9.4.4 According to all programme standards and the MQF 2.0 (2018),
digital skills must be measured across disciplines and at all
qualification levels. Professionals from different fields are
therefore well placed to describe the digital attributes appropriate
for their diverse curricula or academic programmes.
2.9.4.5 The criteria are divided into the areas of the learners'
adaptability, capability, clarity of the relayed skills and
knowledge, coherence, relevance, comparability, recognition,
and transferability towards these programmes and qualification
frameworks, providing an overarching framework that integrates
all forms of learning.
2.9.4.6 Depending on the subject-matter area, digital skills can be
assessed across all three domains: cognitive, affective, and
psychomotor.

Table 8: Digital Skills Framework in Higher Education

Incorporated components, by skill area:
Understand | identifying the needed digital tools; digital hardware and technology literacy
Create | digital communication; digital collaboration (industry/affiliated/practise)
Use | information literacy; computer and technology literacy

Note: Adapted from Aris et al., Digital Skills Framework in Higher Education. Proceedings
2022, 82, 61. https://doi.org/10.3390/proceedings2022082061

2.9.5 Numeracy Skills

2.9.5.1 Numeracy skills are the ability to understand and apply mathematics
for everyday use, at home, during learning, or at the workplace.
2.9.5.2 It is the confidence to use mathematics in familiar situations, such
as calculating a budget, managing personal finance, and managing time for
events or a travel journey.
2.9.5.3 It is the capacity to identify and understand the role that
mathematics plays in the world, to make well-founded
judgements, and to use and engage with mathematics in ways
that meet the needs of that individual’s life as a constructive,
concerned, and reflective citizen.
2.9.5.4 Depending on the type of programme, numeracy can be
complex, going up to the extent of solving a complex problem.
2.9.5.5 Numeracy skills are considered mathematical literacy for
learners’ capacity to formulate, employ, and interpret in various
problem-solving contexts that describe, explain, and predict
phenomena.
2.9.5.6 Students can be directly assessed using written examinations,
projects, and other forms of assignments that require them to
learn and utilise their numeracy skills. Some examples of
numeracy skills can be seen in Figure 8.

Figure 8: Programme for the International Assessment of Adult Competencies
(PIAAC) Numeracy Competency

2.9.6 Leadership, autonomy, and responsibility

2.9.6.1 MQF 2.0 (2018) defines this cluster of skills as an individual’s ability
to build relationships and work with teams made up of peers or in
managerial capacities with varying degrees of autonomy to make
decisions or set goals at organisational/unit/team levels; to take
responsibility and provide accountability; to be confident,
knowledgeable, articulate, honest, professional, concerned,
resilient, a risk taker, and possess other intrapersonal skills,
including working in and leading teams.
2.9.6.2 The management role is ambivalent and includes negotiation
processes based on traditional scientific ideals and managerial
logic.
2.9.6.3 It also involves the relational aspects of how a professional
habitus is formed and negotiated in relation to management ideals and
practices, together with the value of knowledge in itself and collegial
decision-making.
2.9.6.4 It concerns the conditions in the field, the academic subjects, and
the status relations between teaching and research, as well as the extent
to which the students and their processes in the field affect the
prerequisites for exercising professional judgement.
2.9.6.5 Possible areas for assessing leadership are shown in Table 9.

Table 9: Areas of Assessment for Leadership

Vision | The student demonstrates the ability to craft a vision for their group or organisation, and the ability to develop a strategic plan to achieve it.
Actions | The student thoroughly demonstrates the ability to identify goals toward the development of shared knowledge among faculty and peers to accomplish a variety of tasks or objectives.
Service | The student thoroughly demonstrates the ability to seek input from diverse viewpoints, critically evaluate them, and clearly narrate the rationale for the actions taken.
Protocol | The student thoroughly demonstrates the ability to recognise the value of following proper protocols.

Note: Entrepreneurship Assessment in Higher Education: A Research Review for Engineering Education
Researchers. Journal of Engineering Education, April 2018, Vol. 107, No. 2, pp. 00–00.
DOI 10.1002/jee.20197. http://wileyonlinelibrary.com/journal/jee

2.10 PERSONAL AND ENTREPRENEURIAL SKILLS


2.10.1 Personal Skills

2.10.1.1 Personal skills are life skills that learners are expected to use
daily.
2.10.1.2 They are generally portrayed through enthusiasm for
independent learning, intellectual discourse and self-
development, confidence and self-control, social skills and
proper etiquette, and commitment to professionalism in the
workplace.
2.10.1.3 It also includes the capability to plan for career development or
further education.
2.10.1.4 Aspects of individual characteristics such as honesty,
punctuality, time management, and keeping to and maintaining
deadlines that are important in a work environment are also
essential personal skills.

2.10.2 Entrepreneurial Skills

2.10.2.1 Entrepreneurial skills require relevant knowledge, skills, and
expertise in critical areas of an enterprise.
2.10.2.2 Important personal qualities will include creativity, grit, and
drive.
2.10.2.3 The learning outcomes describe the incremental development
of these skills.
2.10.2.4 The drive to be an entrepreneur is set as a personal skill but
also requires the prerequisites of relevant knowledge and
cognitive and functional skills.
2.10.2.5 To assess personal or entrepreneurial skills, aspects of the
skills required must be defined before identifying the
assessment method. Examples of dimensions to measure
personal and entrepreneurial skills are:

i. Opportunity seeking
ii. Persistence
iii. Commitment to work
iv. Demand for quality and efficiency
v. Risk-taking
vi. Goal setting
vii. Information seeking
viii. Systematic planning and monitoring

2.10.2.6 Although entrepreneurial skills attained through knowledge and
cognitive skills are more readily visible than personal skills, both can
be authentically assessed through projects with real-world problems.

2.11 ETHICS AND PROFESSIONALISM


2.11.1 Ethics and values are essential in personal, organisational,
societal/community and global settings as they guide personal actions
and interactions at work and within the community.
2.11.2 Respect for ethical, social, and cultural differences and issues is essential
in exercising professional skills and responsibilities.

2.11.3 These include integrity, professional conduct (professionalism), and
standards of conduct such as upholding regulations, laws, and codes of
good practices or professional conduct.
2.11.4 A sensitive approach to other cultures adds value to this learning domain.
2.11.5 Assessment of ethics can fall under any of the domains, but most
likely falls under the cognitive and affective domains.
2.11.6 Ideally, if the outcome is to develop students' professional and ethical
beliefs that will guide their conduct, then the outcome level should be at
the high affective taxonomy level, at the valuing or organisation level.
2.11.7 A constructively aligned teaching and learning environment will enable
the assessment of ethics, even at the higher levels of the taxonomy.
2.11.8 Common assessment approaches for ethics include case studies, role
play, service learning, learning journals, etc.

Table 10: Example Construction for the Assessment Rubrics

An example of the construction of an assessment rubric by Shuman, Olds and
Besterfield-Sacre (2003) used the following five constructs in developing a
rubric to assess engineering students' responses to cases with an ethical
dilemma:

i. Recognition of the dilemma - identifying the ethical issues or problems, especially concerning the ethics code.
ii. Information (argumentation) - gathering relevant information and justifying its importance to understand the situation.
iii. Analysis (complexity and depth) - analysis of the information, taking into account different aspects, opposing viewpoints, and other factors such as the risks and consequences.
iv. Perspective (fairness) - taking the different perspectives of the parties involved (e.g., workers, residents, industry, government, etc.) and looking at a global view to get an overall perspective.
v. Resolution (argumentation) - a final resolution which should consider the greater good and risk to the public, with solid justification.

Note: Shuman, L., Olds, B. and Besterfield-Sacre, M. (2003), 33rd ASEE/FIE Frontiers in Education Conference
Proceedings, Boulder, Colorado, USA.

PART 3
ASSESSMENT MANAGEMENT

The management of student assessment is key to quality assurance. HEPs should ensure
the robustness and security of processes and procedures related to assessment
management.

Systematic management is important in safeguarding the assessment's validity,
reliability, and integrity.

This chapter addresses the management of conventional assessment and of
alternative assessment as a way forward.

On that note, every HEP should focus on combating academic misconduct,
although the relevant structures may differ across institutions.

The sub-topics are:

i. Management of Student Assessment and its Process
ii. Conducting both Formative and Summative Assessment
iii. Types of Assessment
iv. Assessment Methods
v. Review of Assessment Methodologies and Currency with Development in
Best Practices

3.1 MANAGEMENT OF STUDENT ASSESSMENT AND ITS PROCESS


3.1.1 HEPs have significant responsibilities with regard to student assessment.
3.1.2 HEPs need to develop and implement their own assessment processes and
procedures through the HEP's administrative processes, as shown in
Table 11.
3.1.3 Table 11 provides an overview of the structure, function, and integration
of the assessment processes and procedures at the institutional level.

Table 11: Assessment Integration and Process at the Institutional Level

Academic Board/Senate | Approves assessment policies/procedures that establish and maintain academic standards through the principles of assessment and procedures.
HEP's Academic Committee | Develops and reviews assessment policies/procedures (may include external stakeholders such as industry representatives and alumni) and submits the review to the academic board/senate for approval.
Faculty/School/Department | Oversees the implementation of assessment policies/procedures in academic processes and provides feedback to the HEP's academic committee in the continual review of policies and processes.
Academic Staff | Implements assessment and provides formative and summative feedback to students and the faculty/school/department.
Review of Programme/Courses | Input from stakeholders, which includes students, alumni, industry representatives, external assessors, and academic staff.

3.2 CONDUCTING BOTH FORMATIVE AND SUMMATIVE ASSESSMENT


3.2.1 Ongoing formative assessments are conducted throughout a course,
embedded and linked directly to the current learning and teaching
activities.
3.2.2 Through observations and interactions in the classroom, the
assessment helps the academic staff gain feedback on students'
progress.
3.2.3 In-class tasks can be given to assist students in monitoring and
improving their learning.
3.2.4 Providing feedback to students about their learning is crucial to
understanding the use of assessment for learning.
3.2.5 Assessment for learning involves seeking and interpreting evidence for
use by the learners and academic staff.

3.2.6 The interpretation is then used to decide where the learners are in their
learning and to indicate the next step to promote learning (Assessment
Reform Group, 2002).
3.2.7 The increased use of coursework and continuous assessment offers the
opportunity for academic staff to provide constructive feedback to help
learners improve their future learning.
3.2.8 Formative assessment is an assessment for learning.
3.2.9 Assessment as learning requires students to play an active role in
becoming independent in their learning and assessment (Earl, 2003).
3.2.10 In order to incorporate assessment as learning into the learning process,
academic staff should help students develop skills to conduct self-
evaluation, metacognition, and design instructions and assessments to
monitor student learning.
3.2.11 On the other hand, summative assessments measure what students
have learned at the end of a learning unit.
3.2.12 Summative assessment refers to the assessment of students’ learning,
which involves grading and verification and is used for institutional
accountability and quality assurance.
3.2.13 The results are then communicated to the stakeholders.
3.2.14 Summative assessment is one of the methods used in the assessment
of students’ learning.

Figure 9: Formative and Summative Assessment

3.3 TYPES OF ASSESSMENT
3.3.1 Various methods of assessment

3.3.1.1 Multiple assessment methods should be adopted to measure the
attainment of LOs, including the diverse attributes to be measured.
3.3.1.2 The selection of assessment tasks is based on common
practices in one's respective field and on the experience of
academic staff.
3.3.1.3 The choice of instruments must be determined based on the
performance criteria in terms of the qualities and abilities sought
in the learner, which are explicitly stated in the LO statements.
For example, in requiring students to portray creativity and
innovation, the assessor/academic staff may require a studio
project, the development of a product, and performance or case
studies that can appropriately measure the abilities of the
students in producing an output, such as through
experimentation, expression, and exploration.
3.3.1.4 Various methods can be used to assess the cognitive domain
and critical thinking skills, including critiques, reviews, reports,
or tests.
3.3.1.5 Case studies and group projects can determine students'
abilities to apply theory to practice, apart from determining their
communication, managerial, critical thinking and problem-
solving skills among others.
3.3.1.6 Case studies and group projects may also be used to measure
the affective domain regarding values, attitudes,
professionalism, teamwork, communication, lifelong learning,
and ethics.
3.3.1.7 In assessing performance or demonstration techniques, an
assessor/academic staff can adopt any of the following
methods or may choose to combine these methods:
demonstrations, role play, posters, laboratory reports, illustrated
manuals, or simulations.

3.3.2 Coursework and Continuous Assessment

3.3.2.1 Coursework comprises the most common data collection processes,
conducted continuously throughout a course or module, and may take
various forms.
3.3.2.2 Although the following list is not exhaustive, the measurement
of learning gains through coursework can be made through:

Presentations, essays, critiques, reviews, projects, case studies,
portfolios, simulations, development of products, capstone projects,
reflective journals, exhibitions, performances (e.g., music, theatre),
clinical work, posters, debates, lab reports, and manuals.

3.3.3 Examination and Tests

3.3.3.1 Examinations and tests reflect the cumulative attainment of LOs.
3.3.3.2 Among others, examinations and tests assess the student's abilities,
such as the ability to articulate, argue, analyse, justify, communicate
ideas, and assess critically.
3.3.3.3 These abilities can be demonstrated through essays and
structured, open-ended questions.
3.3.3.4 Although objective questions can measure higher-order
thinking, they do not promote some other abilities that
subjective questions can tap.
3.3.3.5 Some tests that can be adopted in classroom assessment are
written, oral, practical, and standardised tests.
3.3.3.6 Standardised tests can be purchased to measure
communication or critical thinking skills, among others.
3.3.3.7 Apart from the graded tasks, ungraded tasks such as short
quizzes and minute papers may provide formative feedback for
students to gauge their achievement of LOs and to allow the
academic staff to improve or modify their teaching.
3.3.3.8 Procedures that involve elements of self and peer assessment
can also be implemented.

3.3.3.9 Self-assessment is a valuable way of encouraging participants
to evaluate and reflect on their learning.
3.3.3.10 Peer assessment is especially useful in determining the
attainment of leadership, teamwork, and communication skills.
(refer to Table 12 for information on management of student
assessment and processes involved).
Note: Appendix 2 to Appendix 7 relate to online assessment.

Table 12: Management of Student Assessment and Process

Area: Assessment Task
- Continuous assessment (conventional or alternative): Examples: assignments, proposal defence, written/oral assessment, individual assignments, group assignments, quiz/test, demonstrations, observation notes, anecdotal records, presentation, laboratory reports.
- Continuous assessment (alternative): Examples: reflective module assessment, self-reflective report, final project, graduate exhibition, expert-based assessment, reviews and critiques, graduate seminar, portfolio/logbook.
- Final examination (conventional and online): Demonstrate a comprehensive understanding of the problem within a specified time; reproduce the gist of the problem in his/her own words.

Area: Level of autonomy in the management of student assessment
- Continuous assessment: A system to ensure the academic quality, validity, reliability, fairness, and consistency of the assessment.
- Final examination: HEPs must have a system to ensure the security and standard/academic quality of the assessment.

Area: Preparation of Assessment Task
- Continuous assessment:
• Ensure the constructive alignment has been reviewed and approved.
• Establish the use of a Table of Specifications (TOS) when implementing a test/quiz/mid-term examination.
• Use alternative assessment not limited to the cognitive domain but also covering the affective and psychomotor domains and other skills (where applicable).
• Ensure the design and instruction of the assessment process.
• Consider assessment planning against the student learning time (SLT).
• Have a subject matter expert (internal or external) review questions/items/assignments or any assessment form before administering it.
• Communicate the criteria, expectations, and rubrics to students.
- Final examination:
• Establish the use of a Table of Specifications (TOS).
• Prepare questions following the SLT and appropriate weightage.
• Convene a committee review to vet the set of examination questions.

Area: Assessment administration process
- Continuous assessment: Only approved questions/items/assignments or any other assessment form will be administered to the students. Schedule continuous assessment properly within the semester and communicate the schedule to the students.
- Final examination (conventional): Only an approved set of final examinations will be administered to the students, in a conducive and secured location, during the stipulated time.
- Final examination (online, synchronous): Synchronous assessment involves lecturers and students being online at the same time. Lecturers need to plan and ensure that the synchronous assessment can be performed within a prescribed period; it can be implemented if the students have good internet access. A synchronous online examination can be carried out in three ways: (a) with manual online invigilation; (b) with online proctoring; (c) with randomisation of questions (see the sketch after this table).
- Final examination (online, asynchronous): Asynchronous assessment involves lecturers and students at different times and locations. This allows lecturers to plan and design an assessment that can be implemented within a predetermined period. Ensure that the questions and the time allotted are adequate for students to complete the task. Different sets of questions with the same cognitive level can be used.

Area: Mechanism of marking and grading student assessment (continuous and final)
a) Establish an understanding of academic integrity and honesty.
b) The areas of concern for academic integrity and honesty may include plagiarism, cheating, fabrication, deception, false information, or any related misconduct.
c) HEPs shall get advice from the university's legal advisor.
d) HEPs may establish a student academic integrity pledge.
e) HEPs shall clearly state the process of moderation.
f) Establish a moderation committee.
g) A proper moderation process at programme and course level must be carried out in cases with more than one assessor (inter-rater reliability towards consistency and fairness).
h) Marking and grading are guided by an answer key, answer scheme, or rubric.
i) Measures to curb bias when marking are in place.

Area: Mechanisms to ensure the security of assessment documents and records
- Continuous assessment: A system to ensure academic integrity and honesty; ensure students submit their own work and do not plagiarise. HEPs should have a clear disciplinary act for students who commit academic misconduct. Keep the evidence of assessment for a stipulated period.
- Final examination:
a) A highly secure, systematic process and mechanism for developing, managing, and administering the final assessment.
b) A clear invigilator job description and roles.
c) Ensure the strong room is highly secured and meets the minimum specification for safety; to a certain extent, a strong room using online or cloud services should have a high-security mechanism.
d) The final online examination should be administered with a secure browser, remote proctoring, data encryption, and IP authorisation and authentication; all of these are mechanisms to curb or avoid academic misconduct.
e) HEPs should have a clear disciplinary act for students who commit academic misconduct.
f) HEPs are recommended to provide plagiarism detection in assessing students' academic misconduct.
g) Assessment evidence must be kept, stored, maintained, and disposed of based on the stipulated period.

Area: Assessment and communication with students
- Continuous assessment:
a) Assessment tasks are scheduled across the semester.
b) Results are returned to students promptly before the submission of the next assessment task.
c) Students can act on assessment feedback before submission of the next task.
d) A system is in place for the collection of assignments, the marking of assignments, and feedback to students.
- Final examination:
a) Assessment tasks are scheduled within the final assessment week.
b) Assessment evidence is collected, marked, and graded systematically.
c) Marks and grades are released upon Senate approval.
d) Students receive notification of final grades through an integrated system, with an emphasis on integrity.
e) Processes for students to appeal against the results of assessment must be in place and integrated into the system.

Area: Periodically review the management of student assessment
- A system for the periodic review of assessment, programmes, and courses, which may include input from external stakeholders as review panels, e.g., students' evaluation of teaching and student/staff liaison committees.
- Reviews involve an inquiry process focused on two questions: Does the system provide useful information for making decisions and taking necessary action? Are the actions taken educationally beneficial?
- More specifically, reviewers consider how well the system adheres to each assessment principle. To ensure that timely and effective reviews are conducted, a continuing group must be responsible for monitoring the review process. Students, other educators, and experts also provide feedback about classroom and university practices.
- Reviews of the overall assessment system and the whole academic programme require broad participation from all stakeholders, including educators, students, and assessment and curriculum specialists. The most important criterion for assessment review is that assessment does not harm student learning and promotes active and engaged learning.
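As flagged in Table 12, randomisation of questions and the use of different sets of questions at the same cognitive level can help curb misconduct in online examinations. The following Python sketch illustrates one possible approach; the question bank, blueprint, and build_paper function are hypothetical stand-ins for an HEP's own examination system, not a prescribed MQA mechanism.

```python
import random

# Hypothetical question bank grouped by cognitive level, and a TOS-style
# blueprint stating how many questions to draw from each level.
QUESTION_BANK = {
    "apply":    ["A1", "A2", "A3", "A4"],
    "analyse":  ["N1", "N2", "N3"],
    "evaluate": ["E1", "E2", "E3"],
}
BLUEPRINT = {"apply": 2, "analyse": 1, "evaluate": 1}

def build_paper(student_id: str, seed: str = "sem1-2023") -> list[str]:
    """Draw a per-student question set that follows the same blueprint.

    Seeding the generator with the student ID keeps each paper reproducible
    for moderation and audit, while still varying between students.
    """
    rng = random.Random(f"{seed}-{student_id}")
    paper: list[str] = []
    for level, count in BLUEPRINT.items():
        paper.extend(rng.sample(QUESTION_BANK[level], count))
    rng.shuffle(paper)  # randomise the question order as well
    return paper

print(build_paper("student-001"))
print(build_paper("student-002"))  # a different but cognitively equivalent set
```

Because every paper is drawn from the same blueprint, each student answers a different but cognitively equivalent set, which preserves fairness while reducing answer sharing.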
3.3.4 Alternative Assessment

3.3.4.1 Performance-based Assessment

3.3.4.1.1 Generally, performance-based assessment assesses students' ability
to apply the skills and knowledge gained from a unit or units of study.
3.3.4.1.2 Typically, the task requires students to use higher-order
thinking skills or highly complex activities to create a product or
complete a process.
3.3.4.1.3 Performance-based assessment encourages the application
of real-life situations or problems.

3.3.4.2 Workplace-based Assessment

3.3.4.2.1 Workplace-based assessment ensures the attainment of PLOs and
better prepares students for the workplace by immersing them in a real
work environment, thus relating theories to practice in situ.
3.3.4.2.2 HEPs are encouraged to collaborate with industry when
planning, executing, and assessing students during their
workplace experience.

3.3.4.3 Interdisciplinary-based Assessment

3.3.4.3.1 Assessment tasks should provide opportunities for learning
experiences through assessment by allowing the integration of the
learning components from two or more courses.
3.3.4.3.2 Integrated assessment is an interdisciplinary approach
combining various skillsets, diverse disciplines, and
knowledge to better understand a complex situation or
environment.
3.3.4.3.3 The collaborative approach is a proposed approach for
learning delivery and evaluation.
3.3.4.3.4 Learners must experience a pedagogy beyond the "standard" passive
lecture: a significant problem for which their own discipline and its way
of knowing are necessary but not sufficient contributors to the solution.
3.3.4.3.5 Assessment is valued through observing the learners acquire
relevant and important facts outside their own major disciplines and
insert them into new contexts from multiple perspectives.
3.3.4.3.6 The assessment practice involves acquiring those facts and
manipulating them in those contexts under the instructor's supervision.
3.3.4.3.7 Going through this process enables learners to gain insights
from various disciplines, synthesise information required for
the assessment, before ultimately offering a more complete
understanding of an issue.

3.3.4.4 Multidisciplinary-based Assessment

3.3.4.4.1 Multidisciplinary assessment involves two or more
disciplines/bodies of knowledge, combining various skill sets and
exposing the students to a wider chain of environments.
3.3.4.4.2 The students should be able to organise and correlate the
disciplines being integrated. The students are evaluated
through collaborative tasks, the extension of knowledge, and
the connections or greatest degree of integration.
3.3.4.4.3 Assessing the quality of multidisciplinary work is complex,
concerning exchanging methods, translating categories, and
testing outcomes against multiple quality standards.

3.4 ASSESSMENT METHODS


3.4.1 An overview

3.4.1.1 Assessment may require direct examination or observation of
students' displayed knowledge or skills, which can be assessed based on
measurable LOs.
3.4.1.2 Attainment of outcomes in the cognitive and psychomotor
domains can be directly assessed, while those of the affective
domain, soft skills, and values may be more difficult to assess,
resulting in a more subjective assessment.
3.4.1.3 Direct assessments involve examining actual samples of the
student's work, including exams, quizzes, reports, portfolios,
and presentations.

3.4.1.4 On the other hand, indirect assessments refer to the "analysis
of reported perceptions about student mastery of learning
outcome" (Allen, 2004).
3.4.1.5 It may be in the form of employer surveys, exit interviews of
graduates, and self-reports by students or others, such as the
supervisor, during industrial attachment.

Figure 10: Aligning Learning Outcomes, Learning and Teaching Activities and
Assessment

Note: Adapted from Biggs (1999) p. 2


3.4.2 Planning Assessment Tasks

3.4.2.1 Attention has to be given to the planning of assessment tasks for
students. In the rest of this section, the discussion on assessment tasks
focuses on the course LOs.
3.4.2.2 This must be conducted throughout the course, and
academic staff must understand the assessment methods.
3.4.2.3 It is of utmost importance that assessment methods are
aligned to outcomes and instructional delivery.
3.4.2.4 Constructive alignment, a term coined by John Biggs (1999),
posits that the curriculum is designed so that the learning
activities and assessment tasks are aligned with the LOs that
are intended in the course, resulting in a consistent system.
3.4.2.5 For instance, to achieve the LOs of a certain course, the case
study or problem-based learning approach may be regarded
as the most suitable.
3.4.2.6 Thus, the chosen teaching approach and activities would
demand specific methods of measuring those outcomes.
3.4.2.7 To cater to the diversity in outcomes, the assessment
methods must be aligned with the teaching approaches.

3.4.2.8 In conducting good practice in assessing course LOs,
various considerations need to be taken into account.

3.4.3 Communicating the Assessment Plan to Students

3.4.3.1 The assessment plan should be communicated to students in writing at
the beginning of the semester.
3.4.3.2 Academic staff should provide the course description, which
includes a summary of the course topics and requirements,
the general format of the course, instructional materials and
assessment methods, mark apportionment, grading criteria,
and a schedule for the assessments.
3.4.3.3 Clear grading criteria, such as rubrics and performance
standards for assessing student work, should be made
available to students in hardcopy or electronic form.
3.4.3.4 Academic staff should provide ongoing student performance
feedback as the class progresses.
3.4.3.5 They may provide feedback to the class after completing and
grading continuous assessment tasks.
3.4.3.6 This could include a summary of the student's overall
performance and strategies for improvement.

3.4.4 Planning of Assessment

3.4.4.1 The planning of assessment tasks for a given course must consider the
course's level and credit value.
3.4.4.2 The academic staff must gauge whether the number and
complexity of the assignments to be given are
commensurate with the credit load of the course.
3.4.4.3 The expected time needed to complete a given assessment
task must be based on the MQA's Guidelines for Good
Practices: Programme Development and Delivery (GGP:
PDD) for determining Student Learning Time (SLT).
3.4.4.4 The preparation time needed by students for every hour of a test is
also provided in the SLT guideline.

3.4.5 Diversity

3.4.5.1 Assessment tasks should provide opportunities for students to display
their knowledge, talents, competencies and/or skills.
3.4.5.2 Based on the LOs, each task has to be planned to determine
the achievement of the outcomes.
3.4.5.3 The following table suggests tasks and grading instruments
used to measure various attributes.
3.4.5.4 The diversity to be managed in assessment may include class size,
students' academic and cultural backgrounds, levels of learning
development, cross-cultural communication, short time spans, and more.
3.4.5.5 Constructive alignment provides a powerful framework and
allows adjustment and modification to increase student
engagement and learning.
3.4.5.6 HEPs should be able to manage all the diversity faced by the
academic programme with appropriate intervention.

3.4.6 Weightage

3.4.6.1 The weighting of assessment tasks must be proportionate to the
emphasis on the learning activities/tasks and their importance to the CLO
attainment.
3.4.6.2 Assessment comprises ungraded and graded continuous assessment tasks
(assignments, quizzes, tests, and midterms) and may include final graded
assessments.
3.4.6.3 Generally, the weightage is determined by the number of hours spent
covering the course learning outcome over the total teaching hours of the
course, as illustrated below.
3.4.6.4 The weightage must adhere to the stated assessment weighting for the
course as approved by an academic committee (refer to Appendix 12 on
Course Assessment Plan, Instruction and Rubric).
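As a worked illustration of 3.4.6.3 (the teaching hours used here are hypothetical), the weightage of a given CLO can be expressed as:

\[
w_i = \frac{h_i}{\sum_j h_j} \times 100\%, \qquad \text{e.g.,} \quad w_1 = \frac{12}{42} \times 100\% \approx 28.6\%,
\]

where \(h_i\) is the number of teaching hours spent covering CLO\(_i\) and \(\sum_j h_j\) is the total teaching hours of the course (42 in this example).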

Table 13: Example of Task and Grading Instrument

Cognitive Skills
- Attributes to be assessed: Complex problem solving, creativity, critical thinking, design thinking.
- Examples of outcomes to be measured: Ability to diagnose, analyse, synthesise, and propose solutions.
- Examples of assessment tasks: Tests, assignments, projects, studio works, final examination.
- Suggested grading instrument: Answer schemes, answer keys, rubrics.

Interpersonal Skills
- Attributes to be assessed: Collaboration, interaction, maturity, respect, sensitivity, empathy, social responsibility, emotion management.
- Examples of outcomes to be measured: Practise good social interaction and respect for other stakeholders' opinions through seminars or discourse.
- Examples of assessment tasks: Case study, case analysis, reflective writing, portfolio.
- Suggested grading instrument: Rubrics, checklists, direct observation.

Communication Skills
- Attributes to be assessed: Clear, confident, effective, adaptive, coherent, systematic.
- Examples of outcomes to be measured: Ability to present coherent and clear ideas through case studies, and to demonstrate systematic writing in a research proposal.
- Examples of assessment tasks: Written and oral examination.
- Suggested grading instrument: Rubrics.

Leadership, Autonomy and Responsibility
- Attributes to be assessed: Coaching, responsiveness, effectiveness, respect, autonomy, adaptability, engagement.
- Examples of outcomes to be measured: Ability to demonstrate responsible leadership through a group work project.
- Examples of assessment tasks: Case study, case analysis, reflective project, seminar.
- Suggested grading instrument: Checklists, direct observation.

Entrepreneurship Skills
- Attributes to be assessed: Mindset, skills, planning and organising, vision, networking, risk evaluation, negotiation.
- Examples of outcomes to be measured: Ability to propose workable business plans.
- Examples of assessment tasks: Project, pitching, portfolio.
- Suggested grading instrument: Checklists, direct observation.

3.4.7 Coverage

3.4.7.1 The assessment provides feedback on the degree to which course LOs
are achieved.

Table 14: Mapping of Lesson Learning Outcomes to Course Learning Outcomes

Course LOs (CLOs) | Lesson LOs contributing to the course LOs
CLO 1 | Lesson LOs 1, 4, 6
CLO 2 | Lesson LOs 2, 7, 8
CLO 3 | Lesson LOs 3, 5
CLO 4 | Lesson LOs 1, 4

3.4.7.2 Table 14 shows the example of LOs for every lesson (Lesson
LOs - LLOs) being mapped to course LOs to ensure that
each lesson LO contributes to the achievement of one or
more of the course LOs (CLOs).
3.4.7.3 Consequently, the content to be taught is determined based
on the lesson LOs to be achieved.
3.4.7.4 Lesson LOs may differ from assessment outcomes because
it is impossible to assess all content taught due to constraints
such as time.
3.4.7.5 The assessment may only cover a sample of the content
taught, but the staff must ensure that the assessed content
represents the course content.
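The mapping and sampling logic described in 3.4.7.2 and 3.4.7.5 can also be checked mechanically. The short Python sketch below mirrors Table 14; the ASSESSED_LLOS set and the uncovered_clos function are hypothetical illustrations, not part of any prescribed system.

```python
# Mapping mirroring Table 14: each CLO and the lesson LOs that feed it.
CLO_MAP = {
    "CLO1": {"LLO1", "LLO4", "LLO6"},
    "CLO2": {"LLO2", "LLO7", "LLO8"},
    "CLO3": {"LLO3", "LLO5"},
    "CLO4": {"LLO1", "LLO4"},
}

# Lesson LOs actually sampled by the planned assessment tasks (hypothetical).
ASSESSED_LLOS = {"LLO1", "LLO2", "LLO3", "LLO7"}

def uncovered_clos(clo_map: dict[str, set], assessed: set) -> list[str]:
    """Return CLOs whose lesson LOs are never assessed, i.e., coverage gaps."""
    return [clo for clo, llos in clo_map.items() if not (llos & assessed)]

print(uncovered_clos(CLO_MAP, ASSESSED_LLOS))  # [] means every CLO is sampled
```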

3.4.8 Criteria

3.4.8.1 Assessment criteria must be established for assessing tasks and
should be made known to students in writing, given together with the
tasks.
3.4.8.2 It guides the academic staff in objectively assessing the tasks
and helps learners meet the expectations of the tasks.
3.4.8.3 This practice also encourages students to self-assess, thus
improving the quality of their work.

3.4.8.4 Assessment criteria are the standards against which
learners' performance is measured. The marks awarded for
the attainment of each criterion need to be made clear.
3.4.8.5 It can be communicated through various forms of rubric.

3.4.9 Attainment

3.4.9.1 The overarching goal of the learning outcome is to measure the
attainment of each of the tasks.
3.4.9.2 This attainment is to provide HEPs with data on student
performance and identify an area for improvement.
3.4.9.3 The area of improvement is not only focused on the students
but also important for the course and programme academics
offered.
3.4.9.4 The attainment is analytic data that allows the programme or
course developer to reflect on the quality of curriculum,
instruction, and assessment.
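As a minimal sketch of how the attainment data described in 3.4.9 might be aggregated, the Python below averages percentage scores per CLO; the task records and the simple averaging rule are illustrative assumptions, since HEPs define their own attainment formulas.

```python
from collections import defaultdict

# Hypothetical task records: each assessed task is tagged with the CLO it
# measures, the student's score, and the maximum score for the task.
TASKS = [
    {"clo": "CLO1", "score": 18, "max": 20},
    {"clo": "CLO1", "score": 7,  "max": 10},
    {"clo": "CLO2", "score": 22, "max": 30},
]

def clo_attainment(tasks: list) -> dict:
    """Average percentage attainment per CLO across its mapped tasks."""
    totals = defaultdict(lambda: [0.0, 0])
    for t in tasks:
        totals[t["clo"]][0] += t["score"] / t["max"]
        totals[t["clo"]][1] += 1
    return {clo: round(100 * s / n, 1) for clo, (s, n) in totals.items()}

print(clo_attainment(TASKS))  # {'CLO1': 80.0, 'CLO2': 73.3}
```

Such per-CLO figures provide the analytic data mentioned in 3.4.9.4 for reflecting on the quality of curriculum, instruction, and assessment.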

3.5 REVIEW OF ASSESSMENT METHODOLOGIES AND BEST PRACTICES


Sources determining currency and the best practices for assessment include:

Figure 11: Review of Assessment Methodologies & Best Practices

3.5.1 Validity and Reliability of Assessment

3.5.1.1 Validity and reliability are two important assessment principles,
apart from flexibility and fairness.
3.5.1.2 An assessment must be both valid and reliable to ensure that it can
provide sufficient evidence of students' competence.
3.5.1.3 To ensure adherence to assessment principles, the HEP
policy on assessment must be in place.

3.5.2 Validity of Assessment

3.5.2.1 Validity refers to the ability of the assessment to measure what it
is supposed to measure.
3.5.2.2 Among the three types of validity: construct, content and
criterion validity, content validity may be the most important
to ascertain in developing assessment tasks, especially for
examinations and tests.
3.5.2.3 Content validity is based on the extent to which a
measurement reflects the specific intended domain of
content (Carmines & Zeller, 1991). In other words, content
validity shows the extent to which the measurement matches
the learning outcomes.
3.5.2.4 Since the coverage of test items may just be a sample of the
contents covered in a course, the extent to which the
selected test items reflect the entire contents indicates the
content validity.
3.5.2.5 The assessment vetting committee determines the content
validity of assessment tasks.
3.5.2.6 The vetting committee should also judge the fairness of the
distribution of marks and time for each assessment task.
3.5.2.7 The validity issue in assessment touches on two areas: relevancy and
representativeness.
3.5.2.8 'Relevancy' is the extent to which the assessment is appropriate to
the student's ability. 'Representativeness', meanwhile, concerns whether
the assessment can represent a group of students or a body of opinion.

3.5.3 Reliability of Assessment

3.5.3.1 Reliability refers to the degree of consistency and accuracy of the
assessment outcomes.
3.5.3.2 It reflects the extent to which the assessment will provide similar
outcomes for candidates with equal competence at different times or
places, regardless of the assessor conducting it (Department of Education
and Training, 2008, pg. 10). Thus, reliability includes consistency in
assessment and grading.
3.5.3.3 It reflects the extent to which the marking by an examiner is
accurate, consistent, reliable, fair, and acceptable.
3.5.3.4 This could be easily established through conformity to the
answer and marking schemes or rubrics.
3.5.3.5 Academic staff are also recommended to provide sufficient
and timely feedback on assessment tasks to allow students
to improve their performance and progress.
3.5.3.6 Complete and accurate information on assessments must be
provided to students.
3.5.3.7 Openness in the assessment must be practised as it requires
sharing arrangements, the requirements of the assessment
process, and the marking criteria with students in the early
part of the semester.
3.5.3.8 Several approaches that can be applied to increase reliability
in assessment are illustrated below:

i. Provide clear instructions on how to answer questions in all tests.
Ambiguous questions and unclear directions must be avoided. For
assignments or projects, provide students with specific guidelines on
requirements and expectations, including information on how to ensure
authenticity.

ii. Develop marking schemes/rubrics as a guide to
ensure standardisation in marking. Vague scoring
criteria threaten reliability.
iii. Ensure a fair distribution of marks for each
question/task.
iv. Provide clear guides for observing and recording
evidence.
v. Ensure that the test venue is conducive and that the
tests are administered lawfully.
vi. In cases of multiple examiners, conduct moderation in marking; the
appointed moderators determine the appropriateness of the standards and
markings (a note on quantifying inter-rater agreement follows this list).
vii. In order to maintain the validity and reliability of
assessments, students undertaking a particular
course at all sites must get the same opportunities in
terms of contents, coverage, resources, and expertise
from academic staff.
viii. Tests and examinations should be given, submitted,
and administered at the same time and under the
same conditions.
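Note: In connection with item vi above, inter-rater agreement between markers can be quantified. One widely used statistic (offered here for reference, not as an MQA requirement) is Cohen's kappa:

\[
\kappa = \frac{p_o - p_e}{1 - p_e},
\]

where \(p_o\) is the observed proportion of agreement between two markers and \(p_e\) is the proportion of agreement expected by chance. Values close to 1 indicate highly consistent marking, while values near 0 indicate agreement no better than chance.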

3.5.3.9 Some key factors to ascertain validity in an assessment are as
follows:

i. Assessment methods and instruments must be appropriate to the desired
levels of learning outcomes to be attained.
ii. Assessments throughout the semester should be in
various forms (such as tests, assignments, and
presentations) to assess the learning domains and the
CLOs determined for the course. More than one task
and source of evidence are needed to judge students'
competence.
iii. Test coverage has to be balanced, covering most of
the main ideas and important concepts in proportion
to the emphasis they receive in class.

iv. Another person should validate the examination and
test questions with expertise in the area assessed.

3.6 THE PRESENCE OF ARTIFICIAL INTELLIGENCE (AI) TOOLS

3.6.1 HEPs must also be aware of the use of technology and artificial
intelligence, which requires innovative measures, such as asking
questions that are heavily contextual, to avoid plagiarism.
3.6.2 HEPs are highly encouraged to advocate the use of plagiarism detection
software combined with artificial intelligence algorithms that analyse
texts and compare them to a large database of existing content, looking
for similarities and matches (a simplified sketch of this idea follows
the list below).
3.6.3 These tools can quickly identify instances of plagiarism and provide
detailed reports to instructors, helping them to identify and address
academic misconduct.
3.6.4 Preventive measures and methods to address these integrity issues:

i. Assignments and assessments should be designed to encourage higher-order
thinking and critical analysis, which makes it more difficult for
students to engage in academic dishonesty using AI tools.
ii. Plagiarism detection software can help HEPs identify instances of
cheating and ensure that students are submitting original work.
iii. Instructors can randomise assessments and questions to reduce
the likelihood of students sharing answers with one another.
iv. HEPs must monitor student activity during assignments, and the
processes from draft to final output of works must be genuinely
worked on by the students themselves in the process of learning.
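The similarity-matching idea sketched in 3.6.2 can be illustrated in miniature. The Python below is a deliberately crude sketch using bag-of-words cosine similarity over a toy "database"; real detectors use far richer models, and the vectorise and cosine_similarity helpers and sample texts here are hypothetical.

```python
import math
import re
from collections import Counter

def vectorise(text: str) -> Counter:
    """Bag-of-words term frequencies (a crude stand-in for TF-IDF)."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine of the angle between two term-frequency vectors (0 to 1)."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values()))
    norm *= math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Toy "database" of existing content to compare a submission against.
DATABASE = {
    "source-1": "Assessment must be valid, reliable, fair and flexible.",
    "source-2": "Constructive alignment links outcomes, teaching and assessment.",
}

submission = "Assessment should be valid, reliable, flexible and fair."
sub_vec = vectorise(submission)
for doc_id, text in DATABASE.items():
    score = cosine_similarity(sub_vec, vectorise(text))
    print(doc_id, round(score, 2))  # high scores flag passages for human review
```

A high score only flags a passage for review; the judgement on academic misconduct remains with the HEP's disciplinary processes described above.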

Figure 12: Preventive Methods to Address Integrity Issues

Notes for further reading:

1. ChatGPT and artificial intelligence in higher education: quick start guide. Published in 2023 by the United
Nations Educational, Scientific and Cultural Organization, https://etico.iiep.unesco.org/en/chatgpt-and-artificial-
intelligence-higher-education-quick-start-guide
2. AI and education: guidance for policy-makers. Published in 2021 by the United Nations Educational, Scientific
and Cultural Organization. https://unesdoc.unesco.org/ark:/48223/pf0000376709. Accessed: March 2023.

PART 4
ASSESSMENT IN DIVERSE CONTEXTS

4.1 OVERVIEW
4.1.1 The assessment for flexible education should still uphold the principles
to ensure integrity and credibility in the process of evaluating learners.
4.1.2 Equally, there must be a transparent system that is valid, reliable,
efficient, and equitable, and that is able to evaluate the ability of
learners to demonstrate learning outcomes within a set duration of time.
4.1.3 Even though the assessment can be considered as the final stage of
constructive alignment, the assessment process should be reflected
directly in the teaching processes.
4.1.4 Assessments that use rubrics require the assessor to clarify the areas
to be assessed through a briefing at the outset, so that students know
what they are being assessed on.
4.1.5 UNESCO's Sustainable Development Goal 4 (SDG 4) is incorporated into
the Twelfth Malaysia Plan 2021-2025, which aims to ensure inclusive
and equitable quality education and promote lifelong learning
opportunities for all.
4.1.6 This supports learners not only in access but also in a smooth
transition to the labour market.
4.1.7 The assessment should be designed not only for assessing whether
the student attains the outcomes but also as a form of formal feedback
to the student on their learning performances as well as a form of
feedback to lecturers/instructors on how effective their teaching
approaches are.
4.1.8 The recent pandemic has accelerated the transition of HEPs towards more
flexible educational pathways similar to the frameworks arising from the
Open and Distance Learning (ODL) system.
4.1.9 One of the main advantages of the ODL system is that it can be paired
with Mixed Reality (MR) technology, which has emerged as a promising tool
in the field of education, offering immersive and interactive learning
experiences for students.
4.1.10 In order to obtain ODL licenses for a particular programme, institutions
will have to create self-instructional materials (SIM).

Figure 13: Self-Instructional Materials (SIM) Features

4.2 STUDENT DIFFERENCES


4.2.1 HEPs must consider that marginalised groups, such as senior citizens,
inmates, rural residents, indigenous people, the differently-abled, the
conditionally challenged, single parents, or unemployed students, may
need to be assessed differently due to their different challenges.
4.2.2 To empower students with respect to their learning capabilities, renowned HEPs around the world have long applied different methods of assessing their students.
4.2.3 For example, most students may be given 2 hours for a written final
examination in the hall, but HEPs may allocate extended hours for
differently-abled students certified by medical professionals to ensure
fairness and inclusiveness of the differences.

4.2.4 In other cases, HEPs may use interview sessions for particular students in the same courses, where a differently-abled student is unable to properly answer the assessment paper in written form.
4.2.5 Some HEPs may also provide computerised system assessments for
differently-abled students who are unable to write/spell words properly,
with recommendations from certified medical professionals.
4.2.6 Lecturers/Instructors should also be aware that different generations show major differences in learning capabilities; thus, the assessment may be designed differently.

4.3 CROSS-CULTURAL
4.3.1 Globalisation has encouraged the mobility of people, migration, urbanisation, and increasing social and cultural diversity, reshaping countries and communities.
4.3.2 Assessors should be aware of the differences between societies and cultures. For example, when assessing communication skills, the abilities of students from urban and rural areas may differ; hence, the rubrics for assessing communication skills should consider the advantages and disadvantages of the learners' previous education systems.
4.3.3 Students from different countries may have different expressions when
conversing; hence, the assessment should be fair to all the students.

4.4 PROGRAMME CONDUCTED


4.4.1 Considering the future education system, the Ministry of Higher
Education (MOHE) has launched various frameworks and working
models to be adopted in the Malaysian HEP's education system.
4.4.2 Figure 14 and Figure 15 show examples of different frameworks or
working models that may be adopted specifically in relation to TVET
setting in Malaysia (Figure 14) and providing experiential learning and
competency-based education in Malaysian Higher Education contexts
(Figure 15).

Figure 14: Future-Ready Framework (2020)

Figure 15: Experience Learning and Competency-Based Education Landscape

4.5 COURSEWORK MODE
4.5.1 Teaching and Learning (T&L) involves a combination of assignments
(coursework) with practicum or the production of an assessed project
paper to award students' grades.
4.5.2 Assignments/course work can be in the form of writing, presentations,
or demonstrations. Examples of dominant assessment activities in this
coursework mode are coursework assessment, quizzes, tests, and
examinations.
4.5.3 Alternative assessment can also be used. (refer to 3.3.4)

4.6 RESEARCH MODE


4.6.1 Dedicated entirely to research work leading to the production of a thesis, exegesis, hermeneutic, or dissertation (refer to Appendix 10 for the supervisor's assessment).
4.6.2 HEPs would be required to produce the appropriate guidelines and requirements for the production of the written and/or produced works according to the requirements in the respective programme standards.
4.6.3 The processes may be evaluated through proposal defence,
colloquium, poster presentation, paper publications, viva-voce, and
written components.
4.6.4 The research under examination should demonstrate:

Figure 16: The Research Under Examinations Characteristics

4.6.5 Please refer to the following suggested Table 15 on the appropriate
evaluation criteria:

Table 15: Example Criteria for Thesis Evaluation

Section: Incorporated Component

Abstract:
- Rationale for the study and problem statement.
- Hypothesis(es) and objective(s).
- Methodology employed.
- Findings and conclusions.

Literature Review:
- Relevant background on what is known on the topic.
- Existing information gap, and the importance of bridging the gap.
- Appropriate citations.
- Literature review leads to research question(s), hypothesis(es), and objective(s).
- The section presents different possible methodologies and the reason for selecting the one that was used in the study.

Methodology:
- Detail of what, when, where, and how the research was performed.
- The method relevant to each objective, hypothesis, or research question presented.

Results:
- Results presented clearly, concisely, and in logical order for each objective, hypothesis, or research question.

Discussion:
- Presented in a logical order for each objective, hypothesis, or research question (in case of multiple objectives, hypotheses, and/or research questions).
- Does the student answer the research question(s), or accept or fail to accept the null hypothesis(es) proposed for the study?
- Relates the findings to relevant literature with proper citation.
- Presents satisfactory reasons for the findings reported.
- Suggests directions for future research.

4.7 EXEGESIS AND CREATIVE OUTPUT
4.7.1 The Exegesis is a theorised and analytical discourse that presents
fresh and authoritative insights into the field by analysing and situating
the creative component and thereby setting the stage for the ideas and
models that guide the creation of the postgraduate works (by creative
project and the Exegesis written component).
4.7.2 The creative output written component shows that the candidate understands the relationship of the investigation to the wider context of the knowledge to which it belongs. The examination would consider:

i. the calibre of the creative output and its capacity to aesthetically and conceptually contribute to discussions and praxis in its area;
ii. the ability of the candidate to review pertinent literature and adequately cite statements;
iii. whether the written work is done as a critical analysis that rigorously argues the case of the overall thesis and provides a critical context for the contribution to knowledge made through the creative component;
iv. the effectiveness of the Exegesis in reflecting on and situating the creative output; and
v. the degree to which the candidate's attitude towards their own work and the work of others is critical and perceptive, and whether the literary/written presentation of the exegesis is satisfactory.

4.8 MIXED MODE


4.8.1 A combination of coursework and research leading to the production of a project paper or dissertation.
4.8.2 The ratio of coursework to research is commonly balanced between
50:50 and 30:70.
4.8.3 The assessment of a particular course may be the same as in the coursework mode.
4.8.4 However, the weightage of the project paper or dissertation may be close to that of the research mode, with the expectation of mastery of knowledge or a new contribution to knowledge.

4.8.5 Methods of assessment might include any of the following, selected as
appropriate to the discipline or field of study and the programme's
aims, mode of delivery, and typical entrants (refer to Appendix 11):

Figure 17: Methods of Assessments

4.9 INDUSTRIAL MODE (WBL, APPRENTICESHIP, 2U2I)


4.9.1 A combination of on-campus and off-campus learning (real-world
learning applications in the workplace) throughout the study involving
HEPs and industry experts in curriculum development and delivery.
4.9.2 It can be offered in various combinations, such as 3u1i, 2u2i, 2u1i and
1½u1i.
4.9.3 For the industrial-based learning or programme conducted through the
Industrial Mode/Apprenticeship, HEPs must have a proper mutual
agreement with the respective industry.
4.9.4 A suitable industry mentor should be appointed to assist the students
with experiential learning in the industry.

4.9.5 HEPs should involve and, if necessary, train the industry mentor to ensure learning takes place, as well as to validate the assessment and grading instruments for outcome attainment.
4.9.6 In this regard, a systematic buddy system should be established by
HEPs to ensure the validity and reliability of assessment during the
learning process, within the industrial model.

Table 16: Example Criteria for Practical Evaluation

Practical Training
- Outcomes to be measured and examples of assessment tasks:
  - Ability to solve problems in the workplace: solve specific problems and prepare the reported tasks.
  - Ability to communicate orally and in writing: reports, presentations.
  - Ability to plan and execute projects assigned: proposal, reports, presentations, development of product (if applicable).
- Suggested grading instrument: rubrics, with the assessor rating the student through observation, discussion with peers/management, and the effectiveness of decisions.

Clinical Training
- Outcomes to be measured and examples of assessment tasks:
  - Ability to solve clinical problems: written tests, oral tests.
  - Ability to show analytical skills: Objective Structured Clinical Examination (OSCE).
  - Ability to demonstrate critical thinking skills: OSCE, long case examinations.
  - Ability to communicate effectively: OSCE, long case examinations, mini clinical evaluation exercise.
- Suggested grading instruments: answer schemes (for clinical problem-solving, analytical and critical thinking tasks); answer schemes and rubrics (for communication tasks).
4.10 COURSES OFFERED
4.10.1 Semester-based

4.10.1.1 Semester-based assessment may be implemented within a normal semester system (refer to GGP: PDD).
4.10.1.2 The assessment activities may include quizzes, tests,
projects, and assignments throughout the semester.
4.10.1.3 The department may synchronise the schedule of
assessment activities to ensure the assessment is
distributed throughout the semester to lessen the students’
burden.
4.10.1.4 The assessment activities must also consider the allocated
SLTs for the semesters to ensure the assessment is reflected
in the intended LOs within the specified SLTs.

4.10.2 Module-based

4.10.2.1 Module-based, also referred to as modular-based, is not tied to the semester (14-week/8-week/12-week) system.
4.10.2.2 The assessment can be conducted within the period of the
modules according to the specified SLT credits.
4.10.2.3 For example, one module may be conducted over six weeks at 8 hours per day, five days a week (40 hours of learning time per week), which can be reflected as 6 SLT credits (see the worked calculation below).
4.10.2.4 The assessment can be implemented during the module's
period without additional weeks for the final examination,
compared to a semester-based system.
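A minimal worked version of that example, assuming the MQF convention of 40 notional hours of student learning time per credit:

```latex
\text{credits} = \frac{\text{total notional hours}}{40\ \text{h/credit}}
= \frac{6\ \text{weeks} \times 40\ \text{h/week}}{40\ \text{h/credit}}
= \frac{240\ \text{h}}{40\ \text{h/credit}} = 6\ \text{credits}
```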

4.10.3 Continuing Professional Development (CPD)/Portfolio-based

4.10.3.1 The Continuing Professional Development (CPD)/portfolio-based system has become popular of late, as the assessment can be implemented within one semester or throughout the programme. For example, previous co-curriculum courses were normally implemented within one semester.

4.10.3.2 HEPs may implement a CPD/portfolio-based system for
students at their own pace to prepare all the records
needed to collect hours and badges to claim credits for
particular courses.
4.10.3.3 Students may be evaluated by various approaches,
including interviews or challenge sessions, to prove that
they have attained the intended learning outcomes.

4.10.4 Micro-credentials

4.10.4.1 Micro-credentials is a "…term that encompasses various forms of certifications, including 'nano-degrees', 'micro-masters', 'credentials', 'certificates', 'badges', 'licences' and 'endorsements'" (UNESCO, 2018:10).
4.10.4.2 As the name implies, micro-credentials focus on a much
smaller learning volume than conventional awards, allowing
learners to complete the required study over a shorter
duration.
4.10.4.3 A micro-credential can lead to an academic award in Malaysia, and there are three ways to do it:

i. Component of an accredited programme (one HEP);
ii. Component of accredited programmes (multiple HEPs); and
iii. Stand-alone courses.

4.10.4.4 The attainment of the outcomes should be demonstrated through suitable assessment methods and reported in a user-friendly manner.
4.10.4.5 The mode of delivery, the pace of learning, and assessment
methods should be appropriately personalised for optimal
learning by different learners.
4.10.4.6 When implementing micro-credentials from accredited
programmes, the HEP can adjust the teaching, learning and
assessments of course(s) offered via micro-credentials,
provided that constructive alignment is always maintained
and demonstrated.

4.10.4.7 Information on the type of assessments (examinations, tests,
projects, etc.), grading (marks, grade points, alphabetical
grades, etc.), and quality assurance should be stated in the
Malaysian Micro-credential Statement (MMS).
4.10.4.8 For more information on micro-credentials, refer to Guidelines
to Good Practices: Micro-Credentials by MQA.

4.10.5 Exam on-Demand

Please refer to the Consideration of 'Exam On-Demand' in Appendix 13.

4.10.5.1 Exam on demand (self-paced assessment) has been one of the most popular approaches in the flexible education system. Learning today extends beyond the traditional classroom (learning without walls) and instruction (learning without lectures).
4.10.5.2 With the latest developments, such as online learning,
students can understand a theory or concept through videos
on mass media learning such as massive online open
courses (MOOC).
4.10.5.3 This self-paced learning can give students the freedom to
determine their learning time.
4.10.5.4 Fast learners may finish one course earlier than others,
depending on their time spent for the assessment.
4.10.5.5 To ensure the validity of the assessment, the lecturer must prepare multiple exam questions with the same level of difficulty and the same outcomes (a sketch of one way to assemble such parallel forms follows).
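The following is a minimal sketch, not a mandated method, of assembling parallel exam forms for exam-on-demand: the item bank is grouped by difficulty band, and each form draws the same number of items from each band, so every sitting has a comparable difficulty profile. All names (item bank structure, band labels, counts) are illustrative assumptions.

```python
import random
from collections import defaultdict

def parallel_form(bank: list[dict], per_band: dict[str, int], seed: str) -> list[dict]:
    """Draw one exam form with a fixed number of items per difficulty band."""
    by_band = defaultdict(list)
    for item in bank:
        by_band[item["difficulty"]].append(item)
    rng = random.Random(seed)                 # one seed per sitting/student
    form = []
    for band, n in per_band.items():
        if len(by_band[band]) < n:
            raise ValueError(f"item bank has too few '{band}' items")
        form.extend(rng.sample(by_band[band], n))
    rng.shuffle(form)                         # mix the bands in the final paper
    return form

# Example: every sitting gets 3 easy, 5 average and 2 difficult items
bank = [{"id": f"Q{i}", "difficulty": d}
        for i, d in enumerate(["easy"] * 10 + ["average"] * 15 + ["difficult"] * 8)]
print([q["id"] for q in parallel_form(bank, {"easy": 3, "average": 5, "difficult": 2}, "sitting-001")])
```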

4.10.6 Community-based

4.10.6.1 Community-based assessment can be implemented through the semester-based system or the module-based system.
4.10.6.2 HEPs may also implement the community-based
assessment outside of the semester system.

4.10.6.3 The assessment can also be evaluated through a portfolio, case study report, project report, etc., depending on the intended outcomes. For example, a student may claim credit for a co-curriculum course by submitting a report after attending a community-based programme during the semester break.

4.10.7 Mobility-based

4.10.7.1 Mobility-based assessment can be implemented for a student participating in a mobility programme.
4.10.7.2 For all students who participated in the mobility programme,
the assessment depends on the agreement between the
mobility institutions.
4.10.7.3 HEPs may have different mechanisms for transferring credit
courses from those mobility programmes.

4.10.8 TVET Practical-based

4.10.8.1 UNESCO-UNEVOC has defined Technical and Vocational Education and Training (TVET) as an educational process involving, in addition to general education, the study of technologies and related sciences and the acquisition of practical skills, attitudes, understanding, and knowledge relating to occupations in various sectors of economic life.
4.10.8.2 The TVET practical-based assessment should reflect the ability of a student, in terms of competencies within certain areas, to perform a task that is directly occupation-based (refer to Figure 14, 4.4).
4.10.8.3 The assessment could involve the quality of finishing, the speed of completing the task, and skills such as creativity, problem-solving, and communication, among others, using appropriate tools of measurement based on the competencies and standards set.

4.10.9 APEL (A), (C), (Q) and (M)

4.10.9.1 APEL (Accreditation of Prior Experiential Learning) is a systematic process that involves the identification, documentation, and assessment of prior experiential learning, i.e., knowledge, skills, and attitudes, to determine the extent to which an individual has achieved the desired learning outcomes for access to a programme of study and/or the award of credits.
4.10.9.2 Table 17 depicts the details of APEL (A), APEL (C), APEL (Q)
and APEL (M).

Table 17: APEL Assessment Mechanism

APEL: Assessment Mechanism
- Access (APEL.A): aptitude test; portfolio; and interview.
- Credit (APEL.C): challenge test and/or portfolio.
- Qualification (APEL.Q): portfolio; field and validation visit; challenge test; and capstone courses.
- Micro-credentials (APEL.M): portfolio.

4.10.9.3 There are three principles of assessment that can be related to APEL:

Figure 18: Three (3) Principles of Assessment that can be Related to APEL

4.10.9.4 The assessor appointed would be a subject matter expert/specialist
who can evaluate the evidence submitted based on the assessment
criteria.
4.10.9.5 In addition, he/she should demonstrate the following:

Figure 19: The Assessor’s Characteristics

4.10.9.6 Table 18 describes the types of instruments and descriptions available for APEL assessment.

More information on the detailed instruments and descriptions available for APEL assessment can be obtained from the MQA's GGP for APEL implementation in Malaysia.

Table 18: APEL Assessment Instruments

Written Test: multiple-choice, true/false, matching, fill in the blanks, short answer, essay, and situation-based problem solving.

Oral Exam: structured oral test, one-to-one interview, and panel interview.

Performance Assessment: simulation, skills demonstration, role play, and observation.

PART 5
COMMUNICATING ASSESSMENT AND OUTCOMES

5.1 CONTINUOUS QUALITY IMPROVEMENTS (CQI)


5.1.1 Quality in higher education comes with continuous quality
improvements (CQI) at course levels to provide information for the
continual quality improvement at the programme level.
5.1.2 The manner in which assessment (including online assessment) and evaluation processes are utilised provides critical information to the faculty and administrators of the HEPs.
5.1.3 HEPs that use a CQI system with tools based on digital feedback loops can gather course/faculty evaluations easily.
5.1.4 These allow for appropriate real-time reports on the effectiveness of
the programme design, delivery, and objectives.
5.1.5 They will enable the institution to enhance the quality of education, and
the process will continue year after year.
5.1.6 Thus, the HEP should encourage an environment of safe engagement as well as discussion sessions.
5.1.7 Stakeholders are called upon to ask questions and equally to show appreciation for the diverse views expressed during the discussion sessions.
5.1.8 Apart from CLOs and PLOs, PEOs must also be systematically
assessed.
5.1.9 The assessment of PEOs should be done in about one cycle after
graduation, normally within three to five years.
5.1.10 The faculty or department must create a proper database on their
alumni according to the programmes attended.
5.1.11 The outcomes of the PEO assessment are to be used for CQI, which
will further improve the academic programme.
5.1.12 Apart from employer and alumni surveys, other assessment methods
to assess the attainment of PEO can be adopted.
5.1.13 These methods include interviews (face-to-face or online) with alumni
and employers and focus group interviews.

5.1.14 Performance criteria or targets can be set to determine the level of
achievement.
5.1.15 Such evidence can only be captured by taking a systematic approach
to assessment.
5.1.16 Hence, at the programme level, a programme’s impact is assessed by
finding evidence of the attainment of PLOs.
5.1.17 This allows HEPs to plan for continuous quality improvement (CQI) based on the attainment levels of the CLOs and PLOs, as shown in Figure 20.

Figure 20: Overview of the CQI Process

5.1.18 With systematic monitoring that began at the course levels, students
should be able to demonstrate attainment of all PLOs by the end of
the academic programme, acquiring the full skills set to perform as
functional graduates.
5.1.19 Assessment of students’ learning provides evidence of their level of
attainment.
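To make the attainment roll-up concrete, here is a minimal sketch, under assumed names and thresholds rather than any prescribed MQA formula, of how course-level marks mapped to CLOs might be aggregated into CLO and PLO attainment rates for CQI reporting. The 50% pass threshold, the sample marks, and the CLO-to-PLO mapping are illustrative assumptions.

```python
# Illustrative attainment roll-up: marks -> CLO attainment -> PLO attainment.
PASS_MARK = 50.0                      # assumed attainment threshold (%)

# Each student's percentage score per CLO for one course (hypothetical data)
scores = {
    "CLO1": [72, 48, 65, 81, 55],
    "CLO2": [60, 45, 70, 52, 49],
}
clo_to_plo = {"CLO1": "PLO1", "CLO2": "PLO3"}   # assumed mapping

def attainment(marks: list[float]) -> float:
    """Percentage of students at or above the pass threshold."""
    return 100.0 * sum(m >= PASS_MARK for m in marks) / len(marks)

clo_attainment = {clo: attainment(m) for clo, m in scores.items()}

plo_attainment: dict[str, list[float]] = {}
for clo, rate in clo_attainment.items():
    plo_attainment.setdefault(clo_to_plo[clo], []).append(rate)

for plo, rates in plo_attainment.items():
    print(plo, round(sum(rates) / len(rates), 1), "% of students attaining")
```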

5.2 REVIEW OF ASSESSMENT AND DEVELOPMENT


5.2.1 This attainment is to provide HEPs with data on student performance
and identify areas for improvement.
5.2.2 The area of improvement is not only focused on the students but also
important for the academic courses and programmes offered.

5.2.3 The attainment data allow the programme or course developer to reflect on the quality of curriculum, instruction, and assessment, for example to:

i. Embed evaluation of teaching quality within broader evaluation processes.
ii. Ensure that assessments of assessment quality, and
evaluations of initiatives to foster quality, are included in
broader quality assurance processes and assessments of
overall institutional performance.
iii. Articulate the inter-connections between different types of
internal and external evaluations in use to promote
coherence across them and develop a clearer understanding
of the contribution each one makes to quality assessment.
iv. Eliminate those evaluation processes that do not contribute
significantly to achieving the institution’s objectives, and
verify that the data collected is appropriately and fully used
as well as relevant to the strategic goals of the institution.
v. Build evaluation into the design of every assessment
initiative, specify the criteria and evidence for judging
success, and communicate these publicly.
vi. Develop benchmarks for assessment quality and seek to
build a knowledge base of evidence, connecting with real-
world and industry needs, with real improvements that have
impacts on learning outcomes.
vii. Encourage a culture of evidence-informed practice and use
evaluations to deepen understanding of the relationships
between inputs and processes and learning outcomes as
well as identifying external factors likely to affect them.

CONCLUSION
Assessment that is constructively aligned to the intended learning outcomes has a formative function, providing 'feed-forward' for future learning that can be acted upon for continuous improvement.

The amplified use of technology, distance learning, and flexible education came about under conditions of necessity; running in parallel with conventional ways of assessment, it underscores the significance of alternative assessment.

This method for assessing the academic achievement of a learner includes activities
requiring the application of acquired knowledge and skills to real-world situations, and
it is often seen as an alternative to standardised testing.

As the tasks should be challenging, demanding higher-order learning and (for employability) the integration of knowledge learned in both university and practical contexts, they should encourage metacognition, promote thinking about the learning process, and encourage thinking beyond merely settling the learning outcomes.

Whilst ensuring fairness and quality of assessment when managing assessment for various contexts, there should be an opportunity and a safe context for diverse students to raise problems with their studies, and for HEPs to gather appropriate methods for improvement through the appropriate medium of learning facilities.

Engagement with related stakeholders (industry, government and non-government bodies, alumni, etc.) strengthens the updating of the required bodies of knowledge, current industry practices, and professional practices.

Input from these segmented stakeholders, gathered through appropriate processes such as surveys, interviews, case studies, industrial engagements, and feedback, will ensure an active exchange of discourse on the nature of teaching and learning and on relevant ways of assessing students in achieving the course learning outcomes (CLOs), which align with the programme learning outcomes (PLOs).

This creates a balanced, structured approach while still allowing flexibility as needed
to improve the programme’s approach and relevance to the needs of the industry.

These periodic assessments will pivot on the feedback, surveys, correspondence, and meetings, which will eventually affect the course learning outcomes as newer texts and literature, current practices, case studies, and an updated technological presence are incorporated, eventually shifting the fulcrum of the programme learning outcomes.

With clearly defined data and the tasks arising from these resources, HEPs will need to execute the stages of reviewing or revamping their courses and programmes to remain relevant.

The focus at this phase would be to generate knowledge that would inform
development in the HEP in making informed decisions at the planning and policy levels.

The mutual benefit is to produce competent graduates equipped with the necessary skills to deal with community challenges.

REFERENCES
Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing.

Anderson, L. W. & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

Aris, et al. (2022). Digital skills framework in higher education. Proceedings, 82, 61. https://doi.org/10.3390/proceedings2022082061

Assessment Reform Group. (2002). Assessment for learning: Research-based principles to guide classroom practice. London, UK: Assessment Reform Group.

Biggs, J. (1999). Teaching for quality learning at university. Buckingham, UK: SRHE and Open University Press.

Biggs, J. (2003). Aligning teaching and assessment to curriculum objectives. York, UK: LTSN Generic Centre.

Bloom, B. S. (1956). Taxonomy of educational objectives, Handbook I: The cognitive domain. New York: David McKay Co Inc.

Carmines, E. G. & Zeller, R. A. (1991). Reliability and validity of assessment. Newbury Park, CA: Sage Publications.

Dave, R. H. (1975). Psychomotor levels. In R. J. Armstrong (Ed.), Developing and writing behavioural objectives, pp. 33 - 34. Tucson, AZ: Educational Innovators Press.

Department of Education and Training. (2008). Guidelines for assessing competence in VET (2nd ed.). Perth, WA: Department of Education and Training.

Earl, L. M. (2003). Assessment as learning: Using classroom assessment to maximise student learning. Thousand Oaks, CA: Corwin Press.

Felder, R. M. & Brent, R. (2003). Designing and teaching courses to satisfy the ABET engineering criteria. Journal of Engineering Education, 92(1), 7 - 25.

Entrepreneurship assessment in higher education: A research review for engineering education researchers. (2018). Journal of Engineering Education, 107(2). https://doi.org/10.1002/jee.20197

RELATED LINKS

APEL MQA. - https://www2.mqa.gov.my/apel/

MQA's GGP: Assessment of Student Learning, MQA - https://shorturl.at/bhmH3

Garis Panduan Pembangunan Program Akademik - https://shorturl.at/oxyzM

SDG UNESCO - https://en.unesco.org/sustainabledevelopmentgoals

TTAC Manual MBOT - https://ttasmbot.org.my/srr.php

High Technology High Value (HTHV) KPT - https://shorturl.at/bRV58

Future Ready Framework KPT - https://shorturl.at/abhsv

ExCEL KPT -https://shorturl.at/O1679

GGP 2u2i KPT - https://shorturl.at/glC28

GGP: APEL.A and APEL.M: https://www2.mqa.gov.my

GGP: Micro-credentials: https://www2.mqa.gov.my

FURTHER READINGS
Darling-Hammond, L., Herman, J., Pellegrino, J., et al. (2013). Criteria for high-quality assessment. Stanford, CA: Stanford Center for Opportunity Policy in Education.

Donnan (2007), Conducting assessment online: Educational developers' perspectives

McMorran, C., Ragupathi, K., and Luo, S. (2015). Assessment and learning without
grades? Motivations and concerns with implementing gradeless learning in higher
education. Assessment & Evaluation in Higher Education: 1-17.
http://dx.doi.org/10.1080/02602938.2015.1114584

Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher, 20, 15 - 21.

Ragupathi, K. (2020). Designing effective online assessments: Resource guide. National University of Singapore. https://www.nus.edu.sg/odt/docs/default-source/professional-development-docs/resources/designing-online-assessments.pdf

APPENDIX 1

Table 19: Example of SOLO Taxonomy Aligned Assessment Plans

Level 1 - Prestructural (misses the point)
- Solution category: completely incorrect solution.
- Structure of essay: inappropriate or a few issues identified; no framework for discussion and little relevant material selected; poor structure to the essay; irrelevant detail and some misinterpretation of the question; little logical relationship to the topic and poor use of examples.

Level 2 - Unistructural (single point)
- Typical verbs: state, recognise, recall, quote, note, name.
- Solution category: correct answer to a simple algorithmic problem requiring substitution of data into a formula; correct solution of one part of a more complex problem.
- Structure of essay: poor essay structure; one issue is identified and this becomes the sole focus; no framework for organising discussion; dogmatic presentation of a single solution to the set task; this idea may be restated in different ways; little support from the literature.

Level 3 - Multi-structural (multiple unrelated points)
- Typical verbs: explain, define, list, solve, describe, interpret.
- Solution category: correct solution to a multiple-part problem requiring the substitution of data from one part to the next; poorly-structured project report or practical report on open tasks.
- Structure of essay: essay poorly structured; a range of material has been selected and most of the material selected is appropriate; weak introduction and conclusion; little attempt to provide a clear logical structure; focus on a large number of facts with little attempt at conceptual explanations; very little linking of material between sections in the essay or report.

Level 4 - Relational (logically related answer)
- Typical verbs: apply, outline, distinguish, analyse, classify, contrast, summarise, categorise.
- Solution category: elegant solution to a complex problem requiring the identification of variables to be evaluated or hypotheses to be tested; well-structured project or practical report on an open task.
- Structure of essay: essay well-structured with a clear introduction and conclusion; a well-developed framework exists; appropriate material; content has logical flow, with ideas clearly expressed; clearly identifiable structure to the argument with discussion of differing views.

Level 5 - Extended abstract (unanticipated extension)
- Typical verbs: create, synthesise, hypothesise, validate, predict, debate, theorise.
- Solution category: solution to a problem that goes beyond the anticipated answer; project or practical report dealing with a real-world ill-defined topic.
- Structure of essay: well-structured essay with a clear introduction and conclusion; issues clearly identified; clear framework for organising discussion; appropriate material selected; evidence of wide reading from many sources; clear evidence of sophisticated analysis or innovative thinking.

Note: Biggs and Collis’ (1982) Structure of Observed Learning Outcome (SOLO) taxonomy is
another considered assessment for cognitive learning.

It is especially beneficial when setting cognitive tasks or assessment items and designing
rubrics (performance standards) for grading the task.

When using this taxonomy for writing learning outcomes and grading, it informs learners and
faculty staff on the criteria and the standards of answers required to show evidence of
attainment at the various competency levels or levels of cognitive performance.

The QR reference shows a representation of the SOLO taxonomy, which has five levels,
starting from no knowledge (pre-structural), through surface learning (uni-structural and multi-
structural), to deep learning (relational and extended abstract).

The psychomotor domain (Simpson, 1972) includes physical movement, coordination, and use
of the motor-skill areas. The development of these skills requires practice and is measured in
terms of speed, precision, distance, procedures, or techniques in execution. This domain
includes seven major categories that are listed from the simplest behaviour to the most
complex namely perception, set, guided response, mechanism, complex overt response,
adaptation and origination.

The MQA and MOHE LO domains belonging to the psychomotor taxonomy include practical
skills and entrepreneurship.

The Affective Domain addresses interests, attitudes, opinions, appreciations, values, and
emotional sets. This domain includes the manner in which we deal with things emotionally,
such as feeling, value, appreciation, enthusiasm, motivation, and attitude. The five categories
in the affective domain include receiving, responding, valuing, organisation and characterisation
by value. The MQA and MOHE LO domains belonging to the affective taxonomy include
communication, teamwork and social responsibilities, ethics, morality, professionalism, lifelong
learning, management, and leadership. Other taxonomy examples could also be considered
by HEPs appropriated into their programme design.

APPENDIX 2

Table 20: Example of the Forms of Online Assessment (Multiple-Choice Questions and Fill in the Blanks)

Multiple-Choice Questions
- Grading type: automatically graded.
- Common verbs in Bloom's: knowledge, recall, comprehension, application, analysis.
- Advantages: MCQs are the most versatile of the closed-ended question types. This versatility stems from the fact that the questions can contain more elaborate scenarios that require careful consideration on the part of the student. The probability of students guessing is also relatively low.
- Disadvantages: when compared to true/false and matching, multiple-choice items can be more challenging to write. They also require the creation of plausible "distractors" or incorrect answer options. As with other closed-ended questions, multiple-choice assesses recognition over recall.

Fill-in-the-Blanks
- Grading type: automatically graded.
- Common verbs in Bloom's: knowledge, recall, comprehension, application.
- Advantages: fill-in-the-blank (FIB) questions assess unassisted recall of information rather than recognition. They are relatively easy to write.
- Disadvantages: FIB questions are only suitable for questions that can be answered with short responses. Additionally, because students are free to answer any way they choose, FIB questions can lead to difficulties in scoring if the question is not worded carefully.

APPENDIX 3

Table 21: Example of The Forms of Online Assessment (True/False and Essay
Questions)

True/False Questions
- Grading type: automatically graded.
- Common verbs in Bloom's: knowledge, comprehension, recognition.
- Advantages: true/false questions are among the easiest to prepare.
- Disadvantages: true/false questions are limited in the kinds of student mastery they can assess. They have a relatively high probability of students guessing the correct answer (50%). True/false also assesses recognition of information, as opposed to recall.

Essays/Short Answer
- Grading type: manually graded.
- Common verbs in Bloom's: knowledge, analysis, comprehension, application, evaluation.
- Advantages: essay questions are the only type of question that can effectively assess all six levels of Bloom's Taxonomy. They allow students to express their thoughts and opinions in writing, granting a clearer picture of the level of student understanding. Finally, as open-ended questions, they assess recall over recognition.
- Disadvantages: there are two main disadvantages to essay questions: time requirements and grading consistency. Scoring can be difficult due to the variety of answers, as well as the "halo effect" (students are rewarded for strong writing skills or mastery of the right vocabulary, as opposed to demonstrated mastery of the content).

APPENDIX 4

Table 22: Online Engagement, Platform and Assessment (Asynchronous)

Traditional Paper/Report Submission (asynchronous)
- Examples: essays, case studies, article reviews, proposal writing, report writing.
- Platform: CN, Google Classroom, Canvas, Blackboard Learning, emails, cloud-sharing drives.
- Considered tools: Turnitin, iThenticate, Quetext, Grammarly, online marking and feedback, ExamSoft.

Automated Online Assessment (asynchronous)
- Examples: online quizzes (MCQs, MRQs, FIBs, T/F, matching), in-video quizzes, in-class live answers (assessment of prior knowledge).
- Platform: CN, Google Forms, Plickers, Poll Everywhere, Mentimeter, Nearpod, Goformative.com, Flipgrid, Kahoot.
- Considered tools: asynchronous (time-based); automated results available in situ for the students.

Critical Reflection & Meta-Cognition (asynchronous)
- Examples: electronic portfolios; online journals, logs, diaries, blogs, wikis; embedded reflective activities; peer and self-assessment.
- Platform: e-portfolio, wikis, blogs, academics' preferred peer assessment platforms.
- Considered tools: requires the examiners to apply the appropriate rubrics to the submitted works.

APPENDIX 5

Table 23: Online Engagement, Platform and Assessment for Continuous and
Authentic Assessment

Continuous Assessment, Individuals (asynchronous)
- Examples: contributions to forums, chats, blogs and wikis; reading summaries; collaborative learning; critical reviews.
- Platform: blogs, wikis, Google Docs.
- Considered tools: requires the examiners to apply the appropriate rubrics to the submitted works.

Continuous Assessment, Group (asynchronous)
- Examples: online presentations, group online projects, role play, online debates.
- Platform: screencasts (Ink2Go), blog platforms, video-based platforms (Vimeo, YouTube, Instagram), Loom, Google Docs.

Authentic Assessment (asynchronous)
- Examples: scenario-based learning, laboratory/field trip reports, simulations, case studies/role play, online oral presentations and/or debates.
- Platform: Google Docs, Google Forms, Plickers, Poll Everywhere, Mentimeter, Nearpod, Goformative.com, Flipgrid, Kahoot.
- Considered tools: the assessments are activity-based, continuous assessments in variant forms of academic-student engagement; elements of academic dishonesty are low, and they require minimal invigilation.

Invigilated Online Exam Assessment (synchronous)
- Examples: mid-semester exams, final exams.
- Platform: assigned online video platforms (e.g., MS Teams, Zoom, Google Meet), Honorlock, Proctorio, Talview, RPNow, Examus AI Proctoring, Questionmark, Mercer Mettl Online Examination and Proctoring Solutions, Think Exam.
- Considered tools: AI proctoring may be required (utilising AI to assist facial recognition as well); multiple invigilators may be required to assess cases or conditions flagged as suspicious with a high percentage; a cloud-based, secure online biometric scan is recommended for students' identity verification; standard requirements apply for lights, camera, and microphone settings for viewing and monitoring the spacing of candidates.

Note: This list is not exhaustive. HEPs are advised to explore other AI proctoring tools that may provide more comprehensive and secure data encryption.

APPENDIX 6: Data Encryption During Transmission of Examination

a) Data encryption plays an important role in preventing unauthorised access to question banks. It also helps to avoid result manipulation and blocks access without valid credentials. It is a vital feature to ensure the security of the examination.
b) The online examination system's data should be encrypted to prevent any kind of misuse.
c) Question bank and exam data are stored in a highly secure and encrypted
manner.
d) The entire communication between the server and examination client is also
encrypted with a secure mode of communication.
e) This ensures the confidentiality of the question papers being exchanged
between the server and the client.
f) A timer may be set for each section and each question in the exam, and if any
student is unable to answer within the specified time, the system moves on to
the next question.
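As an illustration of points (c) and (d), the following is a minimal sketch, assuming the third-party Python cryptography package, of symmetric encryption of an exam payload before transmission between server and client. Key management (how the key is generated, stored, and shared) is out of scope here and must follow the HEP's own security policy; all names are illustrative.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key is created once, stored securely (e.g., in a key vault),
# and never transmitted alongside the ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

exam_payload = b'{"paper_id": "SPGP3023-F", "questions": ["..."]}'

# Server side: encrypt before storing or sending to the examination client.
token = cipher.encrypt(exam_payload)

# Client side: decrypt with the same key; tampered tokens raise InvalidToken.
assert cipher.decrypt(token) == exam_payload
```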

APPENDIX 7

Conditions and Requirements for Online Assessment

a) Stable Internet Line with the acceptable minimum appropriate speed.


b) The data are secure, while prepared, stored, delivered, and retrieved.
c) Exam materials are optimised to be on low data to ease the high bandwidth.
d) Appropriate hardware and the application platform to conduct the examination.
e) Preparations of the set exams by the HEPs are appropriate to the levels of the MQF 2nd Edition (2018) and Bloom's Taxonomy of Educational Objectives.

APPENDIX 8

Example of Approximation in Assessment Tasks

APPENDIX 9

Example of Technical Time Allowance for Submission in Online Exams

Exam Length Technical Time Allowance

Up to and including 75 minutes 15 minutes technical time

76 minutes to 179 minutes 30 minutes technical time

180 minutes or more 60 minutes technical time

Note:

• For students sitting for online examination outside of HEP vicinities and with
moderate internet coverage.
• Technical time allowance refers to the acceptable period for submission
(depending on the length of the examination) after the online examination
ends.
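A minimal sketch of how the table above could be applied automatically when closing submissions; the bands are taken directly from the table, while the function itself (name and use) is an illustrative assumption.

```python
def technical_time_allowance(exam_minutes: int) -> int:
    """Map exam length to the technical time allowance from Appendix 9."""
    if exam_minutes <= 75:
        return 15
    if exam_minutes <= 179:
        return 30
    return 60

# Example: a 3-hour (180-minute) paper closes 60 minutes after the exam ends.
print(technical_time_allowance(180))  # 60
```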

APPENDIX 10

Example for Supervisor’s Key Areas of Assessment

APPENDIX 11

Example for Postgraduate Level of Assessment

Evidence of PLOs achieved or in process can be observed through these processes and modes of assessment.

Student processes include their:
- Participation
- Preparation
- Engagements with supervisor
- Engagements with stakeholders
- Process of creation
- Presentation of creative works
- Presentations
- Writings
- Prepared journals (whichever media form)
- Preparation for viva voce

Modes of assessment:
- Project paper and oral presentation
- Reviews and critiques
- Presentation paper/seminar work
- Exhibition
- Survey analysis
- Exegesis/thesis/dissertation
- Journal (written, audio/visual)
- Industry report
- Research progress report
- Graduate seminar presentation
- Assignments
- Capstone project
- Graduate studio work
- Proposal/design defence
- Publication works (during study)
- Viva voce

APPENDIX 12

Course Assessment Plan, Instruction and Rubric

Exemplar of a Course Assessment Plan

Please note that this is merely to show an example to help understand what is
discussed in terms of constructive alignment in Part 2 and managing assessment in
Part 3 (Weightage, Table of Specification and Rubric). HEPs are required to apply the
knowledge appropriately in their own contexts. This exemplar is for two of three course
learning outcomes in the course.

To prepare your course assessment plan and to calculate the weightage, you need to refer
to the information in your syllabus.

Based on the course information from your syllabus:

A. State the credit hour for your course.

B. State the total Teaching time allocated for the course or Total Student Learning
Time.

Note: for distance learning context, Total Student Learning Time can be used.

C. State your method of assessment.

D. Determine the teaching time allocated by CLO (D1) or the SLT by CLO (D2), based on the emphasis placed on each CLO and the method of assessment.

E. Calculate the weight of each assessment:

Weightage (%) = (Time spent teaching/learning for each CLO / Total teaching/learning time for the course) x 100

The sum of the assessment weights must equal 100%.
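A minimal sketch of steps D and E, using the SLT figures from the exemplar below; the dictionary of hours per CLO is the only input, and rounding the result to a convenient band (as the exemplar does) is left as a manual judgement.

```python
# SLT hours per CLO for the exemplar course (total SLT = 120 hours)
slt_by_clo = {"CLO1": 56, "CLO2": 40, "CLO3": 24}
total = sum(slt_by_clo.values())

# Step E: weightage of each CLO's assessment as a share of total SLT
weightage = {clo: round(100 * hrs / total) for clo, hrs in slt_by_clo.items()}
print(weightage)            # {'CLO1': 47, 'CLO2': 33, 'CLO3': 20}
assert sum(weightage.values()) == 100
```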

The following is an example.

Course Code: SPGP3023


Course Name: Assessment in Digital Context

Brief Synopsis
This is an undergraduate course designed for an education programme specialising in IT. Exposure to various aspects of assessing in a digital context includes theories and principles on visual arts and digital technologies that can be used when assessing and evaluating learning. Fundamental concepts of assessment and evaluation using quantitative and qualitative data are introduced before opportunities to integrate digital technologies and relevant software for various types of assessment are provided.

Description of Assessment:
This course consists of 50% coursework (covering CLO1, CLO2 and CLO3) and 50% final
examination (covering CLO1 and CLO2 only). There will be two assignments for the
coursework:

1) Case Study and Analysis - 20% + 10% and


2) Group e-portfolio Project – 20%

A) Credit Hours: 3

B) Total SLT / Teaching Time: 120 / 42

CLO1: Discuss the concept, principles, issues and challenges in visual art assessment and evaluation (C2).
- C) Method of assessment: Case Study (20%); Final Examination (30%).
- D1) Teaching time by CLO: 21 hrs; D2) SLT for CLO: 56 hrs.
- E) Weightage: 1) 21/42 x 100 = 50%; 2) 56/120 x 100 = 46% (within ±40 - 50%). Weightage: 50%.

CLO2: Evaluate quantitative and qualitative data in measuring learning outcomes through assessment and evaluation (C5).
- C) Method of assessment: Case Analysis (10%); Final Examination (20%).
- D1) Teaching time by CLO: 12.6 hrs; D2) SLT for CLO: 40 hrs.
- E) Weightage: 1) 12.6/42 x 100 = 30%; 2) 40/120 x 100 = 33% (within ±30 - 35%). Weightage: 30%.

CLO3: Integrate digital technologies and appropriate software for diagnostic, formative and summative assessment and evaluation (A4, P1).
- C) Method of assessment: Group Project (e-Portfolio) (20%).
- D1) Teaching time by CLO: 8.4 hrs; D2) SLT for CLO: 24 hrs.
- E) Weightage: 1) 8.4/42 x 100 = 20%; 2) 24/120 x 100 = 20%. Weightage: 20%.

Course Assessment Plan Instruction and Rubric

Once the weightage is determined, it can be used to plan the number of items based on the related CLO and topics for the final examination. However, since only CLO1 and CLO2 in this exemplar are used in the final examination, there is a need to also determine the weight of the final examination questions to determine the number of items by CLO. The total weightage for CLO1 and CLO2 is 80%, so each CLO's weight must be re-normalised against the final examination alone.

For example, CLO1 (50/80 x 100 = 62.5%, rounded to 60%) and CLO2 (30/80 x 100 = 37.5%, rounded to 40%); a sketch of this step follows.
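A minimal sketch of that re-normalisation step, reusing the hypothetical weightage dictionary from the earlier sketch:

```python
weightage = {"CLO1": 50, "CLO2": 30, "CLO3": 20}
exam_clos = ["CLO1", "CLO2"]                 # CLOs covered by the final exam

exam_total = sum(weightage[c] for c in exam_clos)                  # 80
exam_share = {c: 100 * weightage[c] / exam_total for c in exam_clos}
print(exam_share)                            # {'CLO1': 62.5, 'CLO2': 37.5}

# The exemplar rounds these to the nearest 10% for item planning (60% / 40%).
planned = {c: round(s, -1) for c, s in exam_share.items()}
print(planned)                               # {'CLO1': 60.0, 'CLO2': 40.0}
```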

A Table of Specifications for the number of items based on the related CLO and topics
for the final examination can also be determined with the weightage.

Below is an example of two cases when planning the final examination types of
questions. Case 1 is used when the questions are structured, or essay type and Case
2 is used when the questions are in the multiple-choice type of questions.

Case 1: Final Examination - structured and essay (CLO1 and CLO2)

(Refer to Appendix 8 for the duration by type of questions.)

Duration of the exam is 3 hours.

Item / No. of items / Approximate time:
- Short answer: 3 min x 0 = 0 minutes
- Short essay: 15 min x 9 = 2 hours and 15 minutes
- Long essay: 45 min x 1 = 45 minutes
- Total: 10 items = 3 hours

60% come from CLO1: 6 items (Part A - 4 questions; Part B - 2 questions).
40% come from CLO2: 4 items (Part B - 3 short essay questions and 1 long essay question).
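A minimal sketch of the timing check behind Case 1; the per-item durations follow the list above (and Appendix 8), while the function itself is an illustrative convenience, not a prescribed tool.

```python
MINUTES_PER_ITEM = {"short_answer": 3, "short_essay": 15, "long_essay": 45}

def exam_minutes(counts: dict[str, int]) -> int:
    """Total exam time implied by the planned item counts."""
    return sum(MINUTES_PER_ITEM[kind] * n for kind, n in counts.items())

# Case 1: 0 short answers, 9 short essays, 1 long essay = 180 minutes
print(exam_minutes({"short_answer": 0, "short_essay": 9, "long_essay": 1}))
```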

To determine the Table of Specification based on weightage for CLO1 and CLO2 (final examination). (Note: the highest cognitive level targeted is C5.)

- Topic 1: 11 hrs (11.5% of hrs), 10 marks, CLO1 (C2): Q A1 (5 m), Q A2 (5 m).
- Topic 2: 15 hrs (15.6%), 20 marks, CLO1 (C2): Q A3 (10 m), Q A4 (10 m).
- Topic 3: 30 hrs (31.3%), 30 marks, CLO1 (C2): Q B1 (10 m), Q B2 (20 m).
- Topic 4: 20 hrs (20.8%), 20 marks, CLO2 (C5): Q B3 (10 m), Q B4a (10 m).
- Topic 5: 20 hrs (20.8%), 20 marks, CLO2 (C5): Q B4b (5 m), Q B5 (15 m).
- Total: 96/120 hrs, 100%, 100 marks; marks distributed across the C1 - C6 columns as 25, 35, 10, 15, 15.

Case 2: Final Examination - MCQ (CLO1 and CLO2)

Duration of MCQ - 1 hour 30 minutes

(Refer to Appendix 8 for the duration by type of questions)

Note: Assuming there will be 60 items in the MCQ, based on the weightage 60% of 60 items
(36 items) come from CLO1 (C2) and 40% of 60 items (24 items) come from CLO2(C5). The
highest cognitive level is C5.

- Topics A - F; PLO (MQF 2.0, 2018): cognitive skill; CLO1; weightage 60% (36 items):
  - Easy (C1): Q1 - Q14 (14 items)
  - Average (C2): Q15 - Q36 (22 items)
- Topics G - I; PLO: numeracy skill; CLO2; weightage 40% (24 items):
  - Average (C3): Q37 - Q44 (8 items)
  - Average (C4): Q45, Q46, Q50 - Q54 (7 items)
  - Difficult (C5): Q47 - Q49, Q55 - Q60 (9 items)
- Item counts by cognitive level (C1 - C5): 14, 22, 8, 7, 9.

Coursework Assignment for CLO1 and CLO2
Instruction:
Given a case of one course and its course assessment plan, you are to discuss the relevance of the design and evaluate its effectiveness in measuring learning.

Example of Rubric for CLO1 and CLO2:

Rubric for Case Study/Analysis

CLO1 (Cognitive Skills, C2): Discuss the concept, principles, issues and challenges in visual art assessment and evaluation. [Ability to discuss] (20%, weight x2)

Explanation in discussion and account:
- Exemplary (5): excellent explanatory, inventive account; fully supported, verified, and justified; deep and broad.
- Proficient (4): good explanatory, inventive account; fully supported, verified, and justified; deep and broad.
- Partially Proficient (3): sufficient explanatory, inventive account; fully supported, verified, and justified; deep and broad.
- Incomplete (2): poor explanatory, inventive account; fully supported, verified, and justified; deep and broad.

Idea:
- Exemplary (5): guiding idea is rich and novel, with compelling statements that lead to strong forms.
- Proficient (4): guiding idea is well written and conceived, with some allusion to form.
- Partially Proficient (3): guiding idea is basically explained.
- Incomplete (2): guiding idea is basically poor.

CLO2 (Numeracy Skills, C5): Evaluate assessment strategy and approach in measuring learning outcomes through assessment and evaluation. [Ability to evaluate using the appropriate measuring tool and reflect on implications in practice] (10%, weight x1)

Reflection:
- Exemplary (5): demonstrates strong reflection on personal experiences with assessment strategy, and critically evaluates the effectiveness of these practices.
- Proficient (4): demonstrates some reflection on personal experiences with assessment strategy but lacks critical evaluation of the effectiveness of these practices.
- Partially Proficient (3): demonstrates limited reflection on personal experiences with assessment strategy and lacks critical evaluation of the effectiveness of these practices.
- Incomplete (2): fails to demonstrate reflection on personal experiences with assessment strategy, and lacks evaluation of the effectiveness of these practices.

Critical thinking (use of an appropriate measuring tool for measuring learning):
- Exemplary (5): demonstrates high-level thinking skills in the analysis and evaluation of formative or summative assessment practices using an appropriate measuring tool to measure learning.
- Proficient (4): demonstrates some level of thinking skills in the analysis and evaluation of formative or summative assessment practices using an appropriate measuring tool to measure learning.
- Partially Proficient (3): demonstrates limited thinking skills in the analysis and evaluation of formative or summative assessment with an inappropriate measuring tool to measure learning.
- Incomplete (2): fails to demonstrate acceptable thinking skills in the analysis and evaluation of formative and summative assessment practices, with misconceptions.

Total: 30% (percentage obtained)

APPENDIX 13
Consideration for Exam on Demand
a) The Exam-on-Demand System
For HEPs considering an "exam-on-demand" approach, careful planning, technological infrastructure, and other considerations are required to ensure fairness, security, and effective assessment. The following steps can be taken to conduct exams on demand:

b) Determine Feasibility:
Assess whether an "exam on demand" approach is suitable for your course,
subject, and educational institution. Consider factors such as the subject's
nature, assessment requirements, available technology, and the willingness of
instructors and students to adapt to this approach.

c) Choose a Platform:
Select or develop a suitable online platform or learning management system
(LMS) that can handle exam-on-demand scheduling, submission, grading, and
feedback. The platform should also ensure the security and integrity of the
assessment process.

d) Design Assessments:
Create exam questions that assess the intended learning outcomes effectively.
Ensure a mix of question types, such as multiple-choice, short answer, and
essay questions, to cater to various types of knowledge and skills.

e) Set Guidelines and Policies:


Establish clear guidelines for taking exams on demand. Communicate rules,
expectations, and any restrictions to students, such as time limits, use of
resources, and guidelines for submitting answers.

f) Prepare Technology:
Ensure that both instructors and students have access to the necessary
technology and resources to participate in the "exam on demand" system.
Provide technical support and training as needed.

g) Create an Exam Repository:
Develop a repository of exam questions that can be randomly selected for each
student to ensure fairness and prevent academic dishonesty.

h) Implement Scheduling:
Set up a scheduling system where students can choose a suitable time slot to
take the exam. This can be done through the LMS or an online scheduling tool.

i) Provide Flexibility:
Allow students a window of time during which they can start and complete the
exam. This accommodates different time zones and personal schedules.

j) Submission and Grading:


Design a process for students to submit their completed exams through the
online platform. Develop an automated or efficient grading system, especially
for multiple-choice questions, to provide prompt feedback.

k) Feedback and Support:


Provide personalised feedback to students after the exam, highlighting
strengths and areas for improvement. Offer opportunities for students to ask
questions or seek clarification about their performance.

l) Maintain Integrity:
Implement measures to ensure exam security and prevent cheating. This might
include randomising question orders, using online proctoring tools, and setting
time limits for individual questions.

m) Continuous Improvement:
Gather feedback from instructors and students about their experiences with the
"exam on demand" system. Use this feedback to refine the process, address
challenges, and enhance the overall effectiveness of the approach.

n) Monitor and Evaluate:
Continuously monitor the effectiveness of the "exam-on-demand" approach in
terms of student performance, engagement, and satisfaction. Make adjustments
as needed to improve the process.

APPENDIX 14
Example of Statements
Definition
- PEO: broad statements that describe the career and professional accomplishments of graduates within five (5) years upon graduation.
- PLO: the abilities (cognitive, psychomotor, and affective) that a graduate should be able to demonstrate at the time of graduation.
- CLO: specific statements of what the learners are expected to achieve at the end of the courses.

Example of Statement (Cognitive Domain)
- PEO: IT instructors who apply fundamental knowledge and practical skills in providing services to the IT industries locally and globally.
- PLO: at the end of the programme, students should be able to: apply mathematics and science concepts, principles, theories and laws essential to IT; and perform algorithm, programming and diagnostic procedures essential to IT.
- CLO: at the end of the course, students can: explain differentiation and integration concepts, principles and algorithms; and perform second-order differentiation and triple integration techniques to determine slopes, signs of the slopes, and the area and volume of mathematical functions.

Example of Statement (Psychomotor Domain)
- PEO: IT professionals who can provide technology solutions and services to meet the evolving needs of the industry.
- PLO: at the end of the programme, graduates are able to diagnose and troubleshoot technical issues related to computing software, and design and implement databases.
- CLO: at the end of the course, students are able to: implement database security measures including user authentication, access control and data encryption; and design and develop data schemas using industry-standard database management tools.

Example of Statement (Affective Domain)
- PEO: IT professionals who lead and effectively communicate with team members in solving workplace professional issues.
- PLO: at the end of the programme, students should be able to: demonstrate effective communication skills; demonstrate effective teamwork in a multidisciplinary team; and demonstrate leadership skills.
- CLO: at the end of the course, students should be able to: give a verbal presentation utilising ICT technology; support and respect team members' opinions and ideas during team-related tasks; and demonstrate leadership skills in team-related tasks.

Example of Assessment Methods/Outcome Indicators
- PEO: alumni surveys; alumni interviews; employer surveys; employer interviews; job offers; starting salaries (relative to national benchmarks); admissions to graduate school.
- PLO: entrance survey; exit survey; exit interviews; exit exam; standardised tests (e.g., the Graduate Record Examination (GRE) and the Collegiate Learning Assessment (CLA)); and the National Survey of Student Engagement (NSSE).
- CLO: tests; projects; reports; oral presentations; proposals; summaries; critiques; assignments; journals; portfolios.

LIST OF PANEL MEMBERS

1. Prof. Dr. Fauziah Abdul Rahim (Chairman) - Universiti Utara Malaysia (UUM)
2. Prof. Madya Ts. Dr. Khairil Azril Ismail (Standard Writer) - Akademi Seni Budaya dan Warisan Kebangsaan (ASWARA)
3. Prof. Dr. Suria Baba - Universiti Malaysia Kelantan (UMK)
4. Prof. Madya Dr. Azidah Abu Ziden - Universiti Sains Malaysia (USM)
5. Prof. Ts. Dr. Mohd Ruslim Mohamed - Universiti Malaysia Pahang Al-Sultan Abdullah (UMPSA)
6. Prof. Madya Ts. Dr. Syamsul Nor Azlan Mohamed - Universiti Teknologi MARA (UiTM)
7. Dr. Aida Suraya Mohd Yunus - Universiti Putra Malaysia (UPM)
8. Prof. Dr. Khairiyah Mohd Yusof - Universiti Teknologi Malaysia (UTM)
9. Puan Haslinda Abdul Hamid - Kolej Komuniti Langkawi
10. En. Zulkifli Mat Som - Institut Pendidikan Guru, Kementerian Pendidikan Malaysia
11. Dr. Airil Haimi Mohd Adnan - Jabatan Pendidikan Tinggi, KPT
12. Dr. Norhayati Ibrahim - Jabatan Pendidikan Politeknik dan Kolej Komuniti