Equitable Job Eval Project Overview Report
Background
    History and rationale
    Development and testing
    Key EJE design features
Implementation
    Beta release
    Conditions of use
    EJE Review
    Training modules
Pay investigations
    EJE projects
        Special Education Support Workers Pay Investigation
        Community Support Workers
        Department of Conservation (DOC) roles
        PSA project
        Roles evaluated with EJE
        Project terminated
Implementation issues for consideration in the review
    Training
    Process
    Factor plan
    Comparisons with other systems
    Becoming more efficient in implementation
    Identifying comparator occupations
    The time projects have taken
    Using EJE without its own market database of points and pay
    The future of EJE
Conclusion
Appendix 1: The Gender-inclusive Job Evaluation Standard, and Spotlight: A Skills Recognition Tool
Background
[1] There was no existing Standard on gender bias and job evaluation, and the Standard has since attracted interest in South Africa, Spain, Australia and the United Kingdom. The Pay and Employment Equity Unit developed three resources on meeting the Standard: A Guide to the Gender-inclusive Job Evaluation Standard, Gender Bias in Job Evaluation: A Resource Collection, and Dorfox Meets the Standard: Gender-inclusive Job Evaluation.

[2] For example, in the United Kingdom, the job evaluation systems developed for local government and for the National Health Service were specifically designed to minimise gender bias. In April 2009, a decision was handed down in the United Kingdom case Hartley and Others v Northumbria Healthcare NHS Foundation Trust. It was found that Agenda for Change, the job evaluation system used in the health sector, was not affected by sex discrimination. The decision provides a comprehensive consideration of gender bias in job evaluation.
scope of some factors and to consider any issues of the interrelationship of the factors.

3. The third trial included testing the data gathering process, the application of the factor plan to 17 jobs within a single organisation, and the use of the system by a job evaluation committee from the organisation itself. It also allowed the rank order of jobs produced by EJE to be compared with the rank order established using another system within the organisation, and consideration of whether any differences were explainable.

4. The final trial examined the performance of EJE in evaluating occupational hierarchies in job families within the health and education sectors. This trial also allowed a final review of language and level distinctions.
The project team was confident that the final version of EJE met the design principles of gender neutrality developed at the outset. Further modifications could only be made after real-life use of the tool. While EJE is a New Zealand system built for use in contemporary organisations [4], the 12 factors are not a complete departure from other job evaluation systems.
Implementation
Beta release
Cabinet originally agreed that EJE would be released in a beta release, a testing phase that could be concluded following an evaluation of the system after a sufficiently large number of jobs had been evaluated [Cab 06 34/8]. That decision and others relating to the Pay and Employment Equity Plan of Action have since been rescinded [Cab 09 16/12]. A beta release phase is common when implementing a new job evaluation system: it recognises that some features of the scheme may need to be modified in the light of its operation and results. In the case of EJE, the beta release phase accommodates the reality that, while four trials had been conducted, there is some unavoidable artificiality in how the job evaluation process operates when its results are not being implemented. This artificiality affects data gathering, the operation of the job evaluation process, the way job holders, managers and evaluators participate in the process, and the management of the results of the evaluation. Certain aspects of the operation of the scheme can only become apparent when a sufficiently large and diverse range of jobs is evaluated. The beta release allows the tool to be used, while recognising that the tool may change and some results may need to be revisited, without obstructing either the use of the tool or the implementation of results. Since the beta release phase involves monitoring of results, and is an important phase in quality assurance of the tool, standards were set for how the tool would be used. The beta release process was agreed in letters between the Department of Labour and the State Services Commission.

[3] The fixed weightings are set out in the Factor Plan. Consistent with other job evaluation systems, the weightings are based on full-time equivalent staff numbers and operational budgets.

[4] For example, the system recognises job features such as multicultural skills and leadership through influence.
Conditions of use
EJE has been provided to users with a Conditions of Use Agreement which requires, among other things, that the tool be used in its entirety, that users provide the results of EJE evaluations to the Department of Labour, and that participants be appropriately trained for EJE projects. In response to a high level of interest in the tool, EJE was also provided on a "for information" basis to interested people, with the proviso that using EJE for job evaluation without a Conditions of Use Agreement would contravene the basis on which the "for information" copy was made available [5].
EJE Review
A monitoring committee comprising employer and union representatives from the Public Service and the public health and education sectors was established by the Department of Labour to review EJE results as they emerged and to conduct the evaluation concluding the beta release. The committee met once, on 10 May 2007. A review and monitoring framework was commissioned from Strategic Pay, using an Excel-based tool to record scores, evaluation rationales and remuneration information. The committee agreed to the EJE monitoring and review framework.
Training modules
Top Drawer Consultants and Working Wisdom were contracted to develop the EJE training modules. Three modules were initially developed:

1. Introduction to EJE (half day)
2. Data Gatherer training (one day)
3. Evaluator training (one day)

The training is a mix of presentation and practical experience of, for example, data gathering. A further stand-alone module, Minimising Bias in Human Resources Practice, was later developed by Top Drawer Consultants and Pulse HR. The three core EJE modules were piloted with volunteers, mainly from the Public Service Association (PSA) and HR staff from Government departments. A training opportunity was provided for consultants and others to become competent in providing EJE training or assisting with EJE implementation within organisations. As required by the EJE Conditions of Use Agreement, training has been provided to data gatherers and the job evaluation committee for the Ministry of Education pay investigation of education support workers. Training for data gatherers was provided to the PSA for its internal project evaluating the jobs of its own staff, and for the Child, Youth and Family pay investigation of social workers.
[5] Following the disestablishment of the Pay and Employment Equity Unit, the basis on which EJE is provided has changed and is set out at the end of this report.
In addition to formal training sessions, there have been numerous brief presentations to seminars and groups to introduce EJE and outline the system and its implementation. It became clear that some flexibility is needed in the content and duration of the training, depending on existing levels of skill and knowledge about gender and job evaluation.
Pay investigations
The Plan of Action included provision for pay investigations and remedial pay settlements [6]. A pay investigation was a process of systematic enquiry into all the factors affecting remuneration of female-dominated occupations, that is, occupations in which 70% or more of the people are women. The investigation focused on factors that influence job size, such as skills, knowledge, responsibilities, demands and conditions. It also included other factors that affect pay rates, such as market influences, performance payments and other employment conditions.

Cabinet had determined that pay investigations could arise from agency or sector Pay and Employment Equity Review response or action plans, from a request to participate in a sector or cross-sector pay investigation, or in bargaining. Cabinet had approved terms of reference and guidelines for conducting a pay investigation [Cab 05 34/8]. Comparator occupations from outside the organisation or bargaining unit, within the sector or across sectors, could be used if suitable internal comparators could not be identified. The conduct of a pay investigation was seen as likely to include evaluating a sample of jobs to assess the relative size, content and contribution of target and comparator roles. It was anticipated that EJE would be the evaluation instrument, although by agreement of the parties to the pay investigation it was possible to use another job evaluation system provided it met the Gender-inclusive Job Evaluation Standard. It was intended that any remedial pay settlement arising from a pay investigation would be delivered mainly through existing budget planning and management mechanisms.

Two pay investigations were initiated as a result of organisational pay and employment equity review response plans: of social workers in Child, Youth and Family (CYF, part of the Ministry of Social Development) and of special education support workers in the Ministry of Education. Both investigations used EJE as the key evaluation tool, with another job evaluation system used to provide a comparison with the EJE outcomes. The Ministry of Education, with the NZEI, completed the investigation process in January 2009; the results are now being used within the bargaining process. The investigation in CYF had not been completed by the time the Government discontinued pay investigations early in February 2009. The Ministry of Education pay investigation (as it relates to EJE) is described in more detail below.
EJE projects
Special Education Support Workers Pay Investigation
The Pay and Employment Equity Review in the Ministry of Education suggested that the work of Special Education Support Workers may be undervalued because the work is primarily undertaken by women and has many of the characteristics found in research and pay equity cases to be associated with gender-related undervaluation.

[6] Pay investigations have been discontinued and the related Cabinet decisions have been rescinded [Cab 09 16/12].

There are three support worker roles: Behavioural Support Workers, Communication Support Workers and Educational Support Workers. The comparator roles chosen were Corrections Officers and Hospital Orderlies. These were male-dominated occupations in the public sector with the same skill level as education support workers, as measured by the Australian and New Zealand Standard Classification of Occupations (ANZSCO). The pay investigation was conducted by Janice Burns (Top Drawer Consultants) and Lyndy Young (PulseHR), both of whom had been on the EJE development team. The full report can be found on NZEI's website: http://www.nzei.org.nz/Group+Special+Education/Support+Workers+Pay+Investigation.html

The pay investigation used all the components of the EJE system, including the EJE questionnaire. Once the data had been gathered and validated, the information was evaluated using EJE. The job data was also provided to Mercer to undertake an evaluation using the Compers system [7]. This allowed comparison of the rank order generated by the EJE evaluation with that generated by the Compers evaluation. If the rank orders had been substantially different, there would have been an opportunity to analyse the ways in which the EJE factors (in terms of what they capture and measure) could have affected the scores. The EJE and Mercer evaluations used the same job information and produced substantially similar results. Both evaluations supported the hypothesis that the Special Education Support Worker jobs were of similar size to those of Corrections Officers, who are paid considerably more, and larger than those of Hospital Orderlies, who are also paid more.
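The report does not publish the underlying scores, but the rank-order check it describes is straightforward to illustrate. The sketch below uses the role names from the investigation with entirely hypothetical scores, ranks the same set of roles under two systems, and measures agreement with Spearman's rank correlation; a value near 1 indicates the kind of substantially similar rank orders the investigation found.

```python
# Sketch: comparing the rank orders produced by two job evaluation systems.
# All scores below are hypothetical; the report does not publish the
# underlying EJE or Compers scores.

def rank_order(scores: dict[str, float]) -> dict[str, int]:
    """Assign rank 1 to the largest score, 2 to the next, and so on."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {job: i + 1 for i, job in enumerate(ordered)}

def spearman_rho(a: dict[str, int], b: dict[str, int]) -> float:
    """Spearman rank correlation for two rankings over the same jobs (no ties)."""
    n = len(a)
    d_sq = sum((a[j] - b[j]) ** 2 for j in a)
    return 1 - (6 * d_sq) / (n * (n ** 2 - 1))

eje = {"Corrections Officer": 540, "Behaviour Support Worker": 520,
       "Communication Support Worker": 515, "Education Support Worker": 500,
       "Hospital Orderly": 430}
compers = {"Corrections Officer": 610, "Behaviour Support Worker": 590,
           "Communication Support Worker": 570, "Education Support Worker": 575,
           "Hospital Orderly": 450}

rho = spearman_rho(rank_order(eje), rank_order(compers))
print(f"rank agreement (Spearman's rho): {rho:.2f}")  # 0.90 for these made-up scores
```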
[7] Compers is owned by Mercer through a merger with Watson Wyatt. The Ministry of Education used Compers as its job evaluation system. Compers is no longer supported by Mercer.

[8] Community support workers and the comparator occupations are classified at level 4. There are five skill levels in ANZSCO, defined in terms of formal education and training, previous experience and on-the-job training. Skill level 4 is commensurate with an NZ level 2 or 3 qualification or at least one year of relevant experience.
involving nine jobs and the data gathering and evaluation was done by the consultant. However, in line with the Gender-inclusive Job Evaluation Standard, the evaluation was independently validated by another consultant.
PSA project
In 2007 the PSA undertook a Pay and Employment Equity review. The review was committed to in bargaining between the PSA staff union representatives and the PSA, and was sponsored by the PSA Joint Union Management Committee (JUMC). The JUMC is made up of the National Secretariat, a representative of the Assistant Secretaries, the Union Group (UG) and the PSA Member Employee Group (MEG). The JUMC agreed to the following recommendations in the Pay and Employment Equity report:

1. developing job descriptions for all 20 PSA positions using the EJE questionnaire and the job evaluation record
2. EJE training for participants on the job evaluation committee
3. undertaking an Equitable Job Evaluation (EJE) exercise of all 20 positions within the PSA.
A working party was established to carry out these tasks.

Developing job descriptions

The first phase of the project involved developing gender-neutral job descriptions for every job in the PSA covered by the Collective Agreement. The working party drafted gender-neutral job descriptions for each of the following positions:

Assistant Secretary
Financial Controller
Human Resources Advisor
Policy Advisor
Legal Officer
Asset Manager
Communications Advisor
Organiser
Knowledge Management and Information Advisor
IT Database Development
Senior Finance Officer
IT Systems Officer
IT Service Network Database
PSA Asset and Administration Officer
On-line Organiser
Finance Systems and Processes Officer
Membership Officer
Organising Administrator
Database Development Administrator
Support Administration

Collecting job data

After consultation with staff on the job descriptions, the new job descriptions were approved and job holders were interviewed using the Equitable Job Evaluation questionnaire. The outcome of the interviews was reviewed by the National Secretaries and Assistant Secretaries against two questions:

Is the material an accurate reflection of the job?
Is there anything missing that should be added?

Their comments were available to the working party.

Evaluation committee

The evaluation committee was made up of the members of the working party and an external consultant with in-depth knowledge and experience of EJE. In line with the EJE Conditions of Use Agreement with the Department of Labour, the evaluation committee was trained in using the Equitable Job Evaluation factor plan. The training covered:

the ground rules for working together on the evaluation team
an in-depth discussion of EJE factors and their underlying concepts
avoiding bias in the evaluation process
the evaluation process.
The committee did not evaluate their own roles; these were evaluated by the external consultant. In addition, the committee took each evaluation only to the point of determining the level scores. This meant that, prior to applying the factor weightings [9], it was not possible to see the final relationships between the jobs, avoiding any tendency to make jobs come out where it "feels right". The factor weights were applied by the external consultant, a simple mechanical process, after the committee had finished all the evaluations. On completion of the evaluations the committee reviewed all the level scores for consistency.

The committee evaluated all jobs on the information available from the job descriptions and the questionnaires. The committee evaluated the jobs on the basis of this information and what the PSA would require of a competent job holder, not on the job holder who currently occupies the job. Job evaluation does not factor in length of service; that is a separate process determined by the progression structure of the remuneration system. The evaluation process took four days. It is important to check that jobs evaluated at different points in time have been treated consistently: often, over time, evaluation committees increase their understanding of the factors and the way levels are assigned.

All staff received a copy of the job scores for their own role, a copy of the weighted scores chart for all jobs, and the evaluators' written rationales for reaching those scores.

Review process

In line with the requirements of the Gender-inclusive Job Evaluation Standard, a review/appeal process was established for job holders. This allowed staff to request a review of their job evaluation score if they felt key information had been omitted. Four job holders sought a review, and as a result some job scores were amended.

Outcome

The final job scores were discussed at the JUMC. The final EJE results showed that some jobs, mainly female-dominated ones, had substantially changed their internal relativity, and their pay ranges were adjusted to reflect the EJE results. Some jobs moved downwards in relation to other jobs; this may be addressed if a vacancy arises. Another outcome was that the administrative jobs all turned out to be about the same size.

[9] These weightings are transparent and are documented in the EJE Factor Plan.
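As a rough illustration of the mechanical weighting step described above, the sketch below applies fixed factor weights to a committee's agreed level scores. The factor names, weights and level scores are all hypothetical placeholders; the actual factors and weightings are documented in the EJE Factor Plan.

```python
# Sketch of the 'mechanical' weighting step. The factor names, weights and
# the level-score scale are hypothetical; the real values are documented
# in the EJE Factor Plan.

FACTOR_WEIGHTS = {            # hypothetical weights
    "Knowledge": 15,
    "Problem solving": 12,
    "Interpersonal skills": 10,
    "Physical skills": 8,
    # ... the remaining EJE factors would appear here
}

def weighted_total(level_scores: dict[str, int]) -> float:
    """Apply the fixed factor weights to a job's agreed level scores."""
    return sum(FACTOR_WEIGHTS[f] * level for f, level in level_scores.items())

# Level scores as a committee might agree them (illustrative only).
job = {"Knowledge": 4, "Problem solving": 3,
       "Interpersonal skills": 5, "Physical skills": 2}
print(weighted_total(job))    # 15*4 + 12*3 + 10*5 + 8*2 = 162
```

The separation the report describes falls out naturally: the committee produces only the level scores, and the weighted totals that determine final relativities are computed afterwards.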
Roles evaluated with EJE

Communications Advisor
Organiser
Knowledge Management and Information Advisor
IT Database Development
Senior Finance Officer
IT Systems Officer
IT Service Network Database
PSA Asset and Administration Officer
On-line Organiser
Finance Systems and Processes Officer
Membership Officer
Organising Administrator
Database Development Administrator
Support Administration
Project terminated
As mentioned above, the CYF social worker pay investigation was terminated in February 2009. Interviews had taken place with a sample of job holders in each of the five social work roles, and planning was underway to establish the comparator roles.
Implementation issues for consideration in the review

Training
Experience has shown that the content and duration of the EJE training need to be sensitive to the skills and experience of the audience. However, it was also clear that people may not know what they don't know about the potential for gender bias in job evaluation and HR processes generally. Examples of adaptations made to the material are:

Training Ministry of Education data gatherers, who were HR advisors with interviewing experience and some general job evaluation knowledge. More emphasis was placed on understanding gender bias, and the session time was reduced to a half day on condition that participants had personally completed the questionnaire.

Training PSA data gatherers, who also had good interviewing skills and some familiarity (in some cases in depth) with job evaluation and with issues of bias. The training took half a day, with an hour of follow-up later. None of the participants had completed the EJE questionnaire, so extra time was allocated to going through the questionnaire.
The modules cover a lot of material and have relied on participants doing substantial homework prior to the course, such as reading the factor plan and/or completing the EJE questionnaire for their own job. This homework did not happen. The modules may need to be revised to allow for this reality, perhaps by spending more time on the meanings of the factors or going through the questionnaire to demonstrate why particular questions are being asked.
The success of the data gatherer and the evaluation training depends on having sufficient time for practice. In any revision or adaptation this should not be compromised.
Process
In the completed EJE projects, the process from data gathering to evaluation as specified in the EJE Users' Guide generally worked well. Some particular practices worked well and could be emphasised or added to the Guide as tips:

Starting the data gathering interview by asking the job holder for a general overview of their job, perhaps by asking about a typical day, provides the interviewer with some useful vocabulary and a slightly wider knowledge base from which to probe.

Having the job information validated and signed off meant that there was confidence that it was complete and accurate. This supported the practice of using only the validated information in decision-making. It also meant that it was not necessary to ask the job holder to answer questions from the committee.

The chair of the committee was one of the consultants. They did not have voting rights but provided independent clarification of factors and factor level distinctions when required.

The committee adopted a practice of "parking" a decision on a particular factor when there was not sufficient agreement. Often, in discussion of other factors or on further reading of the questionnaire or job descriptions, agreement became possible.

On the initial reading of the questionnaire, committee members were encouraged to use Post-it notes to mark text where the information provided for one factor was pertinent to another factor.

While the knowledge factor is the first one in the questionnaire, it is also one of the more conceptually complex factors. The committee decided to score this factor after the other skills factors had been scored.

It is important to read and re-read the guidance notes for the factors; this allows correction of any misunderstanding or groupthink about what a factor really means.

The committee confirmed that it is essential to take the time to develop the scoring rationale once the factor has been evaluated; it is impossible to re-create this later. The committee articulated its reasons and one person wordsmithed the final text, which was read back to the committee for agreement or modification.
EJE is a new system and it took the committee some time to come to grips with the factors and their meaning. To ensure that everyone understood the factors, it was decided that the initial reading of the job information would occur at the committee. Members became more confident during the process. At this point in a committee's development it becomes possible to read the material independently prior to the meeting, and even to begin scoring, as long as clear rationales are kept at the point of scoring. This would substantially reduce the time it takes to evaluate the jobs. Independent scoring does, however, increase the requirement on the chair to be alert to bias in individual decision-making by observing patterns of scoring and rationales.
Factor plan
Experience working with EJE is limited. However, that experience does suggest consideration of changes to the factors or the factor guidelines. Some of the changes are in factor wording or in refining factor concepts; others are in the number of, or distinction between, factor levels. Some of the minor changes could be made in advance of any review because they provide greater clarity for use rather than any substantive change.
Identifying comparator occupations
A difficulty that arises in choosing appropriate comparators is that the more dissimilarities there are in the work of the male- and female-dominated occupations, the more likely it is that the differences in remuneration will be seen as appropriately and legitimately attributable to differences in the nature of the work, irrespective of comparability of measured job size. However, significant differences between female- and male-dominated jobs are likely because of the way occupational segregation has developed in the labour market, and has been reflected in job evaluation and wage-fixing systems over time.

The use of specific male-dominated comparators to establish gender-neutral job evaluation may not be necessary, since job evaluation systems themselves build in comparisons among jobs of similar size irrespective of similarity of job content. Once there is confidence in the job evaluation system, and it can be demonstrated to be gender neutral, it can be sufficient that evaluations are carried out using the gender-neutral job evaluation system. In the UK, Canada and the USA, among other countries, the use of a gender-neutral job evaluation system on its own provides evidence about the relative value of jobs. In the equal remuneration principles established in the Australian states of New South Wales and Queensland, comparators were specifically not required.

In operational terms, some delays were occasioned in seeking the participation of external organisations in providing comparator occupations. This is probably related to concerns about the possible industrial implications for the organisation providing the comparators, as well as about the resources involved in participating. The demands on comparator organisations were very similar to those on the target organisation: provide contact people to locate job holders, provide time for interviews (up to two hours), provide other people (managers and staff) to validate the job information, and provide details of remuneration setting and any factors that impact on it. Once comparator sites were located, job holders themselves were extremely cooperative and helpful. Choosing comparators at the national level could not recognise or take account of organisational or local complexities, such as (in the case of one DHB) the pressure on orderlies to manage the move to the new hospital block, or the media attention to a series of events at one prison that made staff wary of extra engagement.
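The screening logic used in the pay investigations (male-dominated occupations at the same ANZSCO skill level as the female-dominated target) can be sketched as below. The occupation records, field names and percentage figures are illustrative assumptions, and treating "male-dominated" as 30% or fewer women simply mirrors the 70% female-dominance definition earlier in this report; real screening would draw on ANZSCO classifications and workforce statistics.

```python
# Sketch of screening candidate comparator occupations. The data and the
# mirrored 30% threshold are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Occupation:
    name: str
    anzsco_skill_level: int   # 1 (highest) to 5
    pct_female: float         # share of the workforce who are women, 0-100

def candidate_comparators(target: Occupation,
                          pool: list[Occupation]) -> list[Occupation]:
    """Male-dominated occupations at the same ANZSCO skill level as the target."""
    return [o for o in pool
            if o.anzsco_skill_level == target.anzsco_skill_level
            and o.pct_female <= 30.0]   # mirror of the 70% female-dominance test

support_worker = Occupation("Education Support Worker", 4, 92.0)  # % is illustrative
pool = [Occupation("Corrections Officer", 4, 28.0),
        Occupation("Hospital Orderly", 4, 25.0),
        Occupation("Registered Nurse", 1, 90.0)]

print([o.name for o in candidate_comparators(support_worker, pool)])
# ['Corrections Officer', 'Hospital Orderly']
```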
The time projects have taken
holders that the process was inclusive. The result was that a great deal of travel was necessary, sometimes for only one appointment. This increased the time and cost of the projects.

An underlying principle of EJE is that gender neutrality in job evaluation requires a significant level of participation by employees, in both data gathering and evaluation. Some job evaluation practitioners and/or systems rely on existing job descriptions, which may be incomplete, out of date and/or not endorsed as accurate by job holders and their managers. Using the EJE job information questionnaire to collect job information is time-consuming. Experience with the major UK job evaluation schemes aimed at improving gender neutrality is that the quality of the evaluations depends heavily on the quality of the job information, and endorsement of job information by job holders and their managers is a critical element of that.

Some job evaluations are carried out solely by a single consultant. While that can be quick, there is a potential for cutting corners and for bias. The Gender-inclusive Job Evaluation Standard P8007:2006 advises that a committee be used or, if the project is a small one, that there be a quality assurance process involving another evaluator [10].
Using EJE without its own market database of points and pay
EJE can be used without a database of its own by drawing on a range of market information for some jobs to provide anchor points for an EJE points-to-pay line. For example, some jobs considered unlikely to be affected by the specific features of EJE can be used as reference points against one or more market surveys; organisations often do refer to more than one market survey in setting pay rates. Concerns about how to use the results of EJE while its own market database was being developed, together with an interest in gathering data on how EJE evaluations compare with evaluations using other systems, meant that EJE was used in conjunction with other systems in most projects. The community support worker project involved the use of EJE alone, with the results intended for use in bargaining and in discussions with funding bodies.
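One simple way to read "anchor points for a points-to-pay line" is an ordinary least-squares line fitted through the anchor jobs' EJE points and market pay, which can then convert any EJE score to an indicative salary. The sketch below assumes exactly that; the anchor jobs, point scores and salaries are invented for illustration, and in practice the anchors would come from one or more market surveys.

```python
# Sketch: anchoring an EJE points-to-pay line on market data.
# All point scores and salaries below are invented for illustration.

def fit_line(points: list[float], pay: list[float]) -> tuple[float, float]:
    """Ordinary least-squares fit of pay = slope * points + intercept."""
    n = len(points)
    mx = sum(points) / n
    my = sum(pay) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(points, pay))
             / sum((x - mx) ** 2 for x in points))
    return slope, my - slope * mx

# Anchor jobs judged unlikely to be affected by EJE-specific features.
anchor_points = [280.0, 420.0, 560.0]
anchor_pay = [42_000.0, 58_000.0, 76_000.0]

slope, intercept = fit_line(anchor_points, anchor_pay)
eje_score = 500.0                          # a newly evaluated job
print(f"indicative pay: ${slope * eje_score + intercept:,.0f}")
```

With several anchor jobs from more than one survey, outliers in any single survey carry less weight, which matches the report's observation that organisations often refer to multiple surveys.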
[10] Sue Hastings, UK job evaluation expert, has questioned whether it is possible to support a claim that a job evaluation process is gender neutral if a joint employer/employee job evaluation committee has not been used.
The future of EJE

The ongoing arrangements for EJE cover:

Conditions of Use Agreement
Providing the EJE materials
Training
Review/refinement
Building up a database
Monitoring EJE outcomes
The Department of Labour will continue to deliver the EJE tool. The Department's website will display information about the tool and a Conditions of Use Agreement to complete when ordering the tool; the tool will then be delivered by email. Periodically, users will be asked to provide the information they have agreed to supply about the project(s) they are undertaking, and this information will be entered into the EJE monitoring tool [11]. The Monitoring and Review Committee will be convened annually to review the evaluation records and scores. If specialist job evaluation advice is needed, it can be requested from the panel of job evaluation providers who have experience and/or training with EJE and have agreed to provide advice. The Monitoring and Review Committee will be responsible for conducting a review of the beta release of the tool once there have been several hundred evaluations, and for recommending to the Department of Labour any changes the Committee considers necessary.
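The report describes the monitoring tool only as an Excel-based record of scores, evaluation rationales and remuneration information, so the structure below is an assumption about the kind of record it holds, sketched in Python rather than Excel for concreteness; every field name is hypothetical.

```python
# Sketch of the kind of record the EJE monitoring tool might hold.
# Field names are assumptions based on the report's description (scores,
# evaluation rationales and remuneration information).

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EvaluationRecord:
    organisation: str
    role_title: str
    weighted_score: float
    factor_rationales: dict[str, str] = field(default_factory=dict)  # factor -> written rationale
    remuneration_range: Optional[tuple[float, float]] = None         # current pay band, if provided

record = EvaluationRecord(
    organisation="Example Agency",           # hypothetical
    role_title="Support Administrator",
    weighted_score=310.0,
    factor_rationales={"Knowledge": "Requires working knowledge of ..."},
    remuneration_range=(44_000.0, 52_000.0),
)
print(record.role_title, record.weighted_score)
```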
Conclusion
There has been limited use of EJE to date. Experience with the tool indicates that it can contribute to full and fair description and analysis of jobs, especially in service sector occupations. Participants in the job evaluation projects have valued the contribution of EJE language and concepts to capturing job elements in better ways. The extent of that contribution will depend on the level of ongoing use and on the management of the quality and integrity of the system.
[11] The EJE Monitoring Tool is stored by the Department of Labour; the current database of job evaluation scores (for ongoing updates and committee monitoring) includes thirty-one roles.
Appendix 1: The Gender-inclusive Job Evaluation Standard, and Spotlight: A Skills Recognition Tool
The Gender-inclusive Job Evaluation Standard was developed during 2006, adopted by the New Zealand Standards Council, and published by Standards New Zealand in December 2006. The Standard's development committee included representatives of employers and unions, experts in job evaluation and in gender issues, and the Human Resources Institute of New Zealand. The job evaluation providers on the committee (Hay, Mercer and Strategic Pay) included the major providers of job evaluation in New Zealand. They have all undertaken to meet the Standard, as have other providers. The companies have provided statements on how their systems meet the Standard and, where clients request it, statements on how the processes for particular evaluation projects meet the Standard. These statements provide valuable input for clients. As the Standard is a voluntary one, responsibility for demonstrating how the Standard is met lies with those who claim to meet it, and responsibility for assessing their claims with those to whom they make the claim (most commonly human resources managers). Some job evaluation providers have advised that they now provide training and/or briefing on gender-neutral job evaluation for their own consultants and for participants in job evaluation projects.

The Standard is presented in four sections:

a description of job evaluation and of the Standard
an outline of how gender bias can arise in job evaluation
requirements and optional guidance for planning and preparing job evaluation projects
requirements and optional guidance for evaluating jobs and reviewing evaluations, including appeal and review procedures, and the issue of slotting.

Spotlight: A Skills Recognition Tool was developed and tested in New Zealand public sector workplaces by an Australian and New Zealand team led by Dr Anne Junor, University of New South Wales. The tool aims to improve recognition of skills, especially in service sector occupations, and to inform a range of human resources management processes including recruitment, writing position descriptions, learning and development, and job evaluation. It complements other skills, job description and job analysis instruments, and focuses specifically on the types of skills that are often overlooked, especially in human services work and in jobs in the lower levels of organisational hierarchies. The main types of skills often overlooked are the skills of combining activities in work streams, and those involved in the sensitive, responsive and integrated delivery of appropriate services to people.

Spotlight provides a taxonomy of three sets of under-recognised tacit work skills, each divided into three skill elements, with five experience-based skill levels at which each skill element can be used. It can be used to describe the performance of work in any job at any functional level. It has a set of pre-classified, empirically derived work activity descriptors through which the skill elements and levels can be recognised. Based on this set of descriptors, it provides a job analysis questionnaire for identifying the implicit demand for those skills in any job, and a skills audit questionnaire for use by individuals and teams to identify their level of proficiency in using these skills. It includes a cross-referencing system whereby personal attributes and employability skills can be defined more precisely and at different levels of workplace learning (the skills of experience), specifically focusing on attributes, customer focus, problem-solving, teamwork and leadership. It also incorporates a succinct graphical technique for representing the combination of tacit work process skills and levels required by a job and/or within an individual's capabilities at a point in time. Several briefings on the tool have been provided and its application is being explored in some community sector settings.