
SBM Processes


SBM ASSESSMENT

Purpose of the assessment


SBM Assessment may be conducted by the school to determine its level of practice and identify its best practices. It may be conducted by the Division:

a. to get a division profile of SBM implementation;
b. to identify schools that need assistance and those with best practices for benchmarking by other schools; and
c. to validate assessment results of a requesting school.
Basic Steps in the Assessment Process

Step 1: Organize a team of at least ten (10) members: five (5) teachers and five (5) community representatives. One member shall be elected as team leader and another as team secretary; the remaining eight shall compose the group that obtains and validates evidence.

Step 2: Let team members select the Principle they want to assess. There should be at least two members assigned to each Principle.

Basic Steps in the Assessment Process

Step 3: In the pre-assessment meeting, decide whether to use the Whole or the Part method. In the former (whole method), all team members shall work as a group to validate one principle after another. In the latter (part method), at least two (2) members shall be assigned to every Principle. The team leader acts as coordinator and facilitator while the secretary acts as documenter. The team should study the Assessment Manual, especially the D-O-D process.

Step 4: Assessment Proper (school visit if the assessment is done by an external team). Proper courtesies must be accorded to the School Head in planning the assessment process.
During the assessment proper, have a schedule of activities for documentary analysis (1 day), observation analysis (1 day), and discussion of data (1/2 day). Classify documents by Principle. Note that one document, like the SIP, can be a source of evidence for several indicators across principles. Gather and analyze evidence using the D-O-D process, and select samples of documents using emergent saturation sampling and snowballing. Summarize the evidence and arrive at a consensus on what rating to give each indicator based on the documented evidence.

Basic Steps in the Assessment Process

Step 5: Conduct process validation. The purpose is to gather process evidence to validate documentary evidence through observation of classes and activities. Follow the D-O-D process.

Step 6: Discussion of Documents and Process Evidence. Summarize data for each Principle/indicator. Clarify issues, problems, opportunities, etc. The team scores the indicators.

Step 7: Closure or Exit Conference/Meeting.

Step 8: Report writing by the team.


Validation Procedure: Using DOD to Validate Evidence of SBM Practice

The SBM Assessment Tool uses evidence to determine a school's level of practice. DOD is a means of evaluating the validity or truthfulness of the evidence.

Three essential steps in evaluating the validity of evidence of an SBM practice.

1. Conduct Documentary Analysis. Obtain and assemble all existing artifacts related to the indicator being assessed. Artifacts are the things used by the school community to achieve educational goals. Evaluate the validity or truthfulness of each artifact against the four RACS criteria, namely:

Relevance. The evidence must be appropriate to the indicator being assessed. It is appropriate if the artifact or document is a tool or a product of a practice expressed in the indicator.
Accuracy. The evidence must be correct. If it is a lesson plan, then both content and procedure must be correct.
Currency. The evidence must be present, existing or actual.
Sufficiency. The evidence must be adequate or enough. If a student learning portfolio is presented as evidence of self-directed learning, its presence in only two or three classes is not adequate evidence of school-wide implementation.

Synthesize the results of the documentary analysis.


Three essential steps in evaluating the validity of evidence of an SBM practice.

2. Conduct observations to obtain process evidence. Documentary evidence may show the school's focus on learner-centered learning like cooperative, interactive, problem solving and decision making. But are these being practiced? There is a need to obtain process evidence to answer the question.

Process evidence is obtained by scrutinizing instructional, leadership, and management styles, methods, techniques, approaches, and activities used by the school community to achieve the SBM goal. Evidence is identified through participant or nonparticipant observations, which may be conducted formally or informally. Individual or group interviews are held to verify or clarify the evidence. Evidence is scrutinized for validity using the RACS criteria.

Determining the number of observations, interviews, and documents to be scrutinized is a sampling problem in conducting DOD. The problem is commonly addressed by using saturation sampling. The technique is described in the Attachment of the SBM Assessment Tool. Use the process evidence to cross-validate documentary evidence.
Three essential steps in evaluating the validity of evidence of an SBM practice.

3. Discuss the synthesized documentary and process evidence. Conduct the discussion as a friendly, non-confrontational conversation to explain, verify, clarify and augment the evidence.

Invite members of the school community who were engaged in the collection and presentation of evidence to participate in the discussion.

As a team, arrive at a consensus on the level of practice of the indicator being assessed, and indicate it in the scale with a check mark (✔) in the appropriate box. Continue the process until all four dimensions are assessed.

Practices vary in establishing the level of practice of an indicator. The most common is the integrative approach, in which the entire body of evidence for all indicators of a standard is assembled first, scrutinized for internal consistency, and finally used as a guide in making a consensual decision on which level of practice an indicator belongs to.

The other practice is non-integrative. Indicators of a standard are scrutinized one by one for evidence and classified one by one for level of practice. Relationships among indicators are given less attention.
Who conducts the DOD?

A school assessment committee conducts the DOD if the assessment is school-initiated. A Division assessment committee conducts the DOD if the assessment is Division-initiated, or if the assessment is requested by a school.

Who constitutes the assessment committee?

A leader, assisted by a secretary, heads the assessment committee. Four subcommittees are organized, and each one is assigned to assess an SBM standard. Four to five members may compose one subcommittee.

What operational principles guide the DOD process?
Collaboration. The assessors work as a team. Leadership is shared. Decisions are made by consensus. Every member is accountable for the performance of the team.

Transparency. The validation of evidence is open to stakeholders’ view and review.

Confidentiality. Information obtained from the DOD process that may prejudice individuals, groups or the
school is handled judiciously.

Validity. Documentary analyses and observations are rigorous in procedure and demanding in quality of
results.

Reform-oriented. DOD comes up with informed recommendations and action programs that continuously
move the school to higher levels of practice.

Principle-oriented. DOD is guided by the ACCESs principles.

Stakeholder satisfaction. DOD is an exciting growth experience.


Analysis of documents, artifacts, and processes unfolds the progress made, objectives achieved, new techniques developed, best practices mainstreamed, and prizes won, despite limited resources and physical, social and political constraints.
The Revised SBM Assessment Tool

The Revised School-Based Management (SBM) Assessment Tool is guided by the four principles of ACCESs (A Child- and Community-Centered Education System). The indicators of SBM practice were contextualized from the ideals of an ACCESs school system. The unit of analysis is the school system, which may be classified as beginning, developing or advanced (accredited level). The SBM practice is ascertained by the existence of structured mechanisms, processes and practices in all indicators. A team of practitioners and experts from the district, division, region and central office validates the self-study/assessment before a level of SBM practice is established. The highest level, "advanced," is a candidacy for accreditation after a team of external validators has confirmed that the evidence of practices and procedures satisfies quality standards.

The Revised SBM Assessment Tool

Characteristics and Features. The revised tool is systems-oriented, principle-guided, evidence-based, learner-centered, process-focused, non-prescriptive, user-friendly, collaborative in approach and results/outcomes-focused.

Parts of the Tool. The tool shall contain the following parts: a) basic school/learning center information; b) principle-guided indicators; c) description of SBM practice scaled in terms of extent of community involvement and learner-centeredness; and d) the scoring instructions.

Users. The users of the tool are: a) internal stakeholders – teachers and school heads; and b) external stakeholders – learners, parents, LGU, Private Sector and NGO/PO.

Scoring Instructions

1. The four (4) principles were assigned percentage weights on the basis of their relative importance to the aim of the school (improved learning outcomes and school operations):

Leadership and Governance – 30%
Curriculum and Learning – 30%
Accountability and Continuous Improvement – 25%
Management of Resources – 15%
Scoring Instructions

2. Each principle has several indicators. Based on the results of the D-O-D (Document Analysis, Observation, Discussion), summarize the evidence and arrive at a consensus on what rating to give to each indicator.
Scoring Instructions

3. Rate the items by checking the appropriate boxes. These are the points earned by the school for the specific indicator. The rating scale is:

0 – No evidence
1 – Evidence indicates early or preliminary stages of implementation
2 – Evidence indicates planned practices and procedures are fully implemented
3 – Evidence indicates practices and procedures satisfy quality standards
Scoring Instructions

4. Assemble the Rubrics rated by the respondents; edit them for errors like double entries or incomplete responses.

5. Count the number of check marks in each indicator and record it in the appropriate box in the summary table for the area/standard rated.

6. Multiply the number of check marks in each column by the points (1-3).

Scoring Instructions

7. Get the average rating for each principle by dividing the total score by the number of indicators of the principle.

8. Record the average ratings for the principles in the Summary Table for the computation of the General Average.

9. Multiply the rating for each principle by its percentage weight to get the weighted average rating.

10. To get the total rating for the four principles, get the sum of all the weighted ratings. The value derived is the school rating based on DOD (see the sketch after this list).
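For illustration, here is a minimal Python sketch of the tally described in steps 5 to 10. The principle names and percentage weights come from step 1 above; the check-mark counts, variable names, and functions are hypothetical and only show the arithmetic, not actual assessment data.

# Percentage weights of the four principles (Scoring Instructions, step 1).
WEIGHTS = {
    "Leadership and Governance": 0.30,
    "Curriculum and Learning": 0.30,
    "Accountability and Continuous Improvement": 0.25,
    "Management of Resources": 0.15,
}

# Hypothetical tallies: for each principle, the number of check marks
# recorded under each rating point (0, 1, 2, 3) in the summary table.
check_marks = {
    "Leadership and Governance": {0: 0, 1: 2, 2: 5, 3: 3},
    "Curriculum and Learning": {0: 1, 1: 3, 2: 6, 3: 2},
    "Accountability and Continuous Improvement": {0: 0, 1: 1, 2: 3, 3: 1},
    "Management of Resources": {0: 0, 1: 2, 2: 2, 3: 1},
}

def principle_average(tally):
    # Steps 6-7: multiply check marks by their points, then divide the total
    # score by the number of indicators rated under the principle.
    total_score = sum(point * count for point, count in tally.items())
    number_of_indicators = sum(tally.values())
    return total_score / number_of_indicators

# Steps 8-10: take the average per principle, apply the percentage weights, and sum.
school_dod_rating = 0.0
for principle, tally in check_marks.items():
    average = principle_average(tally)
    school_dod_rating += average * WEIGHTS[principle]
    print(f"{principle}: average rating {average:.2f}")

print(f"School rating based on DOD: {school_dod_rating:.2f}")  # falls between 0 and 3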
Scoring Instructions

11. The level of practice will be computed based on the criteria below:

60% based on improvement of learning outcomes:
– 10% increment: 25 points
– 20% increment: 50 points
– 50% increment: 100 points

40% according to the validated practices using DOD:
– 0.00 – 0.50: 25 points
– 0.51 – 1.50: 50 points
– 1.51 – 2.50: 75 points
– 2.51 – 3.00: 100 points
Scoring Instructions

12. The resulting score will be interpreted as (see the sketch after this list):

Level III: 150-200 points
Level II: 100-149 points
Level I: 99 points and below
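As an illustration of steps 11 and 12, the Python sketch below converts a learning-outcomes increment and a DOD rating into points and then into a level. The point tables and the 0-200 score bands are taken from the slides, but the slides do not spell out exactly how the 60%/40% weights enter the arithmetic; the combining rule used here (learning outcomes contributing at most 60% of the 200 points, DOD at most 40%) is an assumption, as are the sample inputs and function names.

def learning_outcome_points(increment_pct):
    # 60% component: points for improvement of learning outcomes (step 11).
    if increment_pct >= 50:
        return 100
    if increment_pct >= 20:
        return 50
    if increment_pct >= 10:
        return 25
    return 0  # assumption: no points for an increment below 10%

def dod_points(dod_rating):
    # 40% component: points for validated practices using DOD (step 11).
    if dod_rating >= 2.51:
        return 100
    if dod_rating >= 1.51:
        return 75
    if dod_rating >= 0.51:
        return 50
    return 25  # covers the 0.00 - 0.50 band

def level_of_practice(increment_pct, dod_rating):
    # Assumption: each component's 0-100 points are scaled so that learning
    # outcomes account for 60% and DOD for 40% of the 200-point total.
    score = 2 * (0.60 * learning_outcome_points(increment_pct)
                 + 0.40 * dod_points(dod_rating))
    if score >= 150:
        return score, "Level III"
    if score >= 100:
        return score, "Level II"
    return score, "Level I"

# Hypothetical example: a 20% increment in learning outcomes and a DOD rating of 1.93.
score, level = level_of_practice(increment_pct=20, dod_rating=1.93)
print(f"Score: {score:.0f} -> {level}")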
Description of SBM Level of Practice

The resulting levels are described as follows:

Level I: BEGINNING - Establishing and developing structures and mechanisms with an acceptable level and extent of community participation and impact on learning outcomes.

Level II: DEVELOPING - Introducing and sustaining continuous improvement processes that integrate wider community participation and significantly improve performance and learning outcomes.

Level III: ADVANCED (ACCREDITED LEVEL) - Ensuring the production of intended outputs/outcomes and meeting all standards of a system that is fully integrated in the local community, self-renewing and self-sustaining.
Recognition and Incentives

To accelerate the implementation and reward best practices, the revised SBM approaches assessment using a systematic recognition and incentives program in terms of higher grants, capital outlay and performance-based bonuses.

