Designing Assessment Tasks
Rita Kizito, March 2014
Workshop for Earth Sciences Department, April 2014
In this workshop we intend to…
• Link assessment to module learning outcomes
• Learn how to set assessment activities at the appropriate NQF level
• Briefly interrogate assessment (purpose, types, features)
• Refine/develop assessment tasks
hand-out #1 – UWC Assessment policy
Integrate outcomes and assessment
Knowledge reproduction/creation:
• Recall
• Comprehension
• Application
• Analysis
• Synthesis
• Evaluation
Attitudes:
• Inquiry focused
• Ethically, environmentally and socially aware
• Motivated
Skills:
• Skilled communicators
• Autonomous and collaborative
• Problem solving
Are your outcomes well designed?
Specific – provide details of the aspect of expectation
Meaningful – written in understandable language
Appropriate – suit learners’ abilities and experiences
Realistic – achievable within the given time constraints
Testable – offer some measure of progress/achievement
(Butcher et al., 2006, p. 41)
Bloom’s taxonomy for generating outcomes and assessment tasks
Biggs’ (2003) SOLO taxonomy:
single point → multiple unrelated points → logically related answer → unanticipated extension
(lower-level outcomes to higher-level outcomes)
hand-out #2 – Learning taxonomy
SOLO Taxonomy – Structure of the Observed Learning Outcome:
Prestructural → Unistructural → Multistructural → Relational → Extended abstract
(The unistructural and multistructural levels mark quantitative growth in learning; the relational and extended abstract levels mark qualitative growth.)
Level Descriptors – SA Qualifications Framework
To ensure coherence in learning achievement and to facilitate assessment at the appropriate levels.

NQF Level | Qualification            | Education band
10        | Doctorate                | Higher Education
9         | Masters                  | Higher Education
8         | PG Diploma/Cert, Honours | Higher Education
7         | Bachelor (ord.) degree   | Higher Education
6         | Diploma                  | Higher Education
5         | Certificate              | Higher Education
4         | Matric                   | Further Education
3         |                          | Further Education
2         |                          | Further Education
1         |                          | General Education
Level Descriptors
KNOWLEDGE (Year 1): list, name, identify, show, define, recognize, recall, state
COMPREHENSION (Years 1, 2): summarize, explain, put into your own words, interpret, describe, compare, paraphrase, differentiate, demonstrate, visualize, find more information about, restate
APPLICATION (Years 2, 3): solve, illustrate, calculate, use, interpret, relate, manipulate, apply, classify, modify, put into practice
ANALYSIS (Year 3): analyze, organize, deduce, choose, contrast, compare, distinguish
SYNTHESIS (Years 3, 4): design, hypothesize, support, schematize, write, report, discuss, plan, devise, compare, create, construct
EVALUATION (Year 4): evaluate, choose, estimate, judge, defend, criticize, justify
Level Descriptors – SA Qualifications Framework
What should the students know about the learning area/subject?
What types of problems should the student be able to solve? …in what contexts? …with what methods/procedures?
How should students obtain and process information?
How should students communicate?
How independent should the students be?
Which values should students uphold?
hand-out #3 – level descriptors
“We first have to be clear about what
we want students to learn, and then
teach and assess accordingly in an
aligned system of instruction”
(Biggs, 1996).
Curriculum alignment

Unaligned course:
Teacher’s intentions – e.g. explain, relate, prove, apply
Student’s activity – e.g. memorize, describe
Exam’s assessment – e.g. memorize, describe

Aligned course:
Teacher’s intentions – e.g. explain, relate, prove, apply
Student’s activity – e.g. explain, relate, prove, apply
Exam’s assessment – e.g. explain, relate, prove, apply
Why assess? – two key purposes
• Summative assessment: making judgements about student learning for certification/grading
• Formative assessment: helping to prompt and promote further learning, and monitoring learning progress
What does assessment do?
• Defines what students will concentrate on when learning
• Affects how they learn
• Specifies what counts as learning
• Provides information about shortfalls between performance and specification
• Stimulates conversation about, and reflection on, improvement
(Knight, 2001)
Types of assessment
• Laboratory work
• Class presentations
• Essays / research
• Online tests
• Problem-based learning
• Case studies
• Projects
• Reflective journal
• Multiple choice questions (MCQs)
• Portfolio
• Group work
Assessment features
Reliability refers to the degree to which an assessment tool produces stable and consistent results.
• Test-retest reliability: correlating tests given twice, over a period of time, to the same group of individuals.
• Parallel forms reliability: correlating different versions of an assessment tool (probing the same construct, skill, knowledge base, etc.) given to the same group of individuals.
• Inter-rater reliability: a measure of the degree to which different judges or raters agree in their assessment decisions.
• Internal consistency reliability: an evaluation of the degree to which different test items that probe the same construct produce similar results.
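Two of these checks reduce to simple statistics once marks are in a table. Below is a minimal Python sketch (not part of the workshop materials) of test-retest reliability as a Pearson correlation and internal consistency as Cronbach’s alpha; it assumes NumPy is available, and the student marks are invented for illustration.

```python
import numpy as np

def test_retest_reliability(scores_t1, scores_t2):
    """Pearson correlation between two sittings of the same test."""
    return np.corrcoef(scores_t1, scores_t2)[0, 1]

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: 2-D array, rows = students, columns = test items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical marks for 5 students on a 4-item quiz, sat twice.
sitting1 = np.array([[3, 2, 3, 4],
                     [1, 1, 2, 1],
                     [4, 3, 4, 4],
                     [2, 2, 1, 2],
                     [3, 3, 3, 2]])
sitting2_totals = np.array([11, 6, 14, 8, 10])

print(f"test-retest r = {test_retest_reliability(sitting1.sum(axis=1), sitting2_totals):.2f}")
print(f"Cronbach's alpha = {cronbach_alpha(sitting1):.2f}")
```

Values near 1 suggest stable, internally consistent marks; low values flag items or sittings worth reviewing.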
Assessment features
Validity refers to how well a test measures what it purports to measure.
• Face validity: ascertains that the measure appears to be assessing the intended construct under study.
• Construct validity: ensures that the measure is actually measuring what it is intended to measure (i.e. the construct), and not other variables.
Ways to improve validity
• Make sure your goals and outcomes are clearly defined and operationalized.
• Match your assessment measures to your learning outcomes. Use outside reviewers.
• If possible, compare your measure with other measures, or data that may be available.
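The last suggestion, comparing your measure against other available measures, is again just a correlation. A minimal sketch, assuming NumPy and invented marks:

```python
import numpy as np

# Hypothetical data: module exam marks vs. an independent measure of the
# same construct (e.g. continuous-assessment marks) for the same students.
exam_marks = np.array([62, 48, 75, 55, 81, 67, 59])
other_measure = np.array([58, 51, 70, 60, 78, 65, 63])

# A high positive correlation is evidence the two measures rank students
# in roughly the same order, supporting the validity of the exam.
r = np.corrcoef(exam_marks, other_measure)[0, 1]
print(f"correlation with external measure: r = {r:.2f}")
```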
Rubrics/ Assessment criteria
• A rubric is a scoring guide / set of expectations used to judge student performance. It shows students how well they have performed on an assessment task.
• It uses assessment criteria and levels of performance to break a task into parts, explaining what is required (see the sketch below).
• It can be used for a large number of tasks (essays, research projects, oral presentations, portfolios, etc.) and is especially useful for assessing complex, subjective work.
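As an illustration of that criteria-by-levels structure, here is a minimal Python sketch of a two-criterion rubric and a scorer. The criteria, level descriptors, and marks are invented for the example, not taken from the workshop hand-outs.

```python
# A rubric as data: each criterion maps ordered performance levels to marks.
rubric = {
    "Argument": {
        "No clear claim": 0,
        "Claim stated but unsupported": 1,
        "Claim supported with some evidence": 2,
        "Claim systematically supported and qualified": 3,
    },
    "Use of sources": {
        "No sources": 0,
        "Sources listed but not used": 1,
        "Sources integrated into the argument": 2,
    },
}

def score(rubric, judgements):
    """Total a student's marks given the level chosen for each criterion."""
    return sum(rubric[criterion][level] for criterion, level in judgements.items())

student = {"Argument": "Claim supported with some evidence",
           "Use of sources": "Sources integrated into the argument"}
max_marks = sum(max(levels.values()) for levels in rubric.values())
print(f"{score(rubric, student)}/{max_marks}")  # -> 4/5
```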
Rubrics/ Assessment criteria
Examples of rubrics.
hand-outs # 4 & 5 – Rubrics
Task 1
1. Identify one outcome from your module outline and develop an assessment task (you can develop more than one task).
2. Create a rubric for one of the tasks, in which you develop criteria (two or more) and levels of expected achievement for that task.
3. Allow a colleague to review the tasks and make comments.
Task 2
Compare the two past question papers and answer the following questions:
1. Are they appropriate for the grade levels they have been prepared for?
2. Which one would you give to your students? Why?
3. How would you modify each one to fit your own context?
hand-outs #6 & 7 – Past papers
Hints for writing exam papers
1. Don’t do it on your own! Get one or two colleagues to attempt your questions.
2. Have your intended learning outcomes in front of you as you draft your questions.
3. Keep your sentences short.
4. Work out what you’re really testing.
5. Don’t measure the same things again and again.
6. Include data or information in questions to reduce the emphasis on memory.
Hints for writing exam papers
7. Make the question layout easy to follow.
8. Write out an answer to your own question.
9. Decide what the assessment criteria will be.
10. Work out a tight marking scheme.
11. Use the question itself to show how marks are to be allocated.
12. Try your questions out.
13. Proof-read your exam questions carefully.
hand-out #8 – A moderation checklist for exam papers
hand-out #9 – Moderation checklist RGU
References
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347-364.
Butcher, K. R. (2006). Learning from text with diagrams: Promoting mental model development and inference generation. Journal of Educational Psychology, 98(1), 182.
Knight, P. (2001). A briefing on key concepts: Formative and summative, criterion and norm-referenced assessment. Learning and Teaching Support Network.
SAQA (2000). The South African Qualifications Authority level descriptors for the South African National Qualifications Framework. http://www.saqa.org.za/docs/misc/level_descriptors.pdf
Thank you