Monitoring and Evaluating Knowledge Management Strategies
Background of the work: Paper commissioned by IKM Emergent, a research program. Work done in 2008 by Sibrenne Wagenaar, Mark Turpin and Joitske Hulsebosch.
Core question: How to monitor and evaluate knowledge management strategies to generate conclusive evidence of their value for development efforts? Or: “How to measure the intangible”? Method: literature review and interviews with 15 leading thinkers from within the development sector and experts from other sectors.
Some common elements of a KM strategy: putting in place knowledge-sharing systems; strengthening communities of practice (CoPs) or learning networks; using stories to make worthwhile experiences explicit; encouraging cultural change within the organization; and creating knowledge-sharing relationships with partners (based upon an overview by Hovland, 2003).
The ripple model: knowledge process-enhancing activities at the centre ripple outward into knowledge capital, then changed practices, and ultimately performance improvement.
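To make the model easier to work with in an M&E design, here is a minimal sketch (in Python, not from the paper) that tags indicators with the ripple level they evidence; the RippleLevel names and all indicator examples are illustrative assumptions.

```python
from enum import IntEnum

class RippleLevel(IntEnum):
    """Levels of the ripple model, from the inner ripple outward."""
    KNOWLEDGE_PROCESS_ACTIVITIES = 1   # knowledge process-enhancing activities
    KNOWLEDGE_CAPITAL = 2              # knowledge and skills built up
    CHANGED_PRACTICES = 3              # new ways of working taking hold
    PERFORMANCE_IMPROVEMENT = 4        # effects on development performance

# Hypothetical indicators, each tagged with the ripple level it evidences.
indicators = [
    ("shorter project start-up time", RippleLevel.PERFORMANCE_IMPROVEMENT),
    ("number of active CoP members", RippleLevel.KNOWLEDGE_PROCESS_ACTIVITIES),
    ("teams reuse documented lessons", RippleLevel.CHANGED_PRACTICES),
    ("staff report new skills after exchanges", RippleLevel.KNOWLEDGE_CAPITAL),
]

# Sort from the inner ripple outward to see how far the evidence reaches.
for name, level in sorted(indicators, key=lambda pair: pair[1]):
    print(f"level {level.value} ({level.name.lower()}): {name}")
```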
Main challenges: the lag time between cause and effect; demonstrating causality and attribution; quantifying the unquantifiable; power relations and ownership; reflection, critical thinking and documenting experiences; finding the right balance between the cost and the results of the assessment; working across multicultural settings and contexts; and proving results versus risk-taking and innovation.
Lessons from M&E in the development sector: Every knowledge management strategy is unique and requires its own M&E approach and methods. Methods alone, however, cannot ensure a valuable M&E cycle; the design needs to be well thought through, and care should be taken to ensure that the right people are engaged.
Lessons from the profit sector: combine narrative techniques, which may capture causality between the various levels of the ripple model, with efforts to measure change using rating sheets.
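As a hedged sketch of what that combination could look like in practice (the data structure and every example record below are assumptions, not content from the paper): keep each rating-sheet score paired with the narrative that explains it, aggregate the scores per ripple level, and hold the stories back for sense-making.

```python
from statistics import mean

# Hypothetical assessment records: each pairs a rating-sheet score (1-5)
# with the narrative fragment that explains it. All records are invented.
assessments = [
    {"level": "changed practices", "rating": 4,
     "story": "The field office now runs an after-action review on every project."},
    {"level": "changed practices", "rating": 3,
     "story": "Lessons are documented but rarely consulted at project start-up."},
    {"level": "performance improvement", "rating": 2,
     "story": "Too early to see effects on programme outcomes."},
]

# Quantitative side: aggregate ratings per ripple level.
by_level: dict = {}
for a in assessments:
    by_level.setdefault(a["level"], []).append(a["rating"])
for level, ratings in by_level.items():
    print(f"{level}: mean rating {mean(ratings):.1f} (n={len(ratings)})")

# Qualitative side: keep the narratives alongside for sense-making sessions.
for a in assessments:
    print(f"[{a['level']}] {a['story']}")
```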
General key lessons: The core question itself needs to be questioned: do we always need evidence? Decide on inherent versus extractive assessments (and avoid the risk of success story-telling). Use informal methods and the ability of people to read what is going on. Combine measurements with narratives and sense-making. Link and use available data.
Design considerations: decisions we have to make
Phase 1: Questions to consider at the start. Is it important to formally monitor and evaluate the impact or not? To monetize or not? What is the main purpose of the impact assessment? Who is the owner? Who judges? Who is involved in the design process?
Phase 2: Focus of the M&E process. What kind of change processes are we measuring? Do we measure all the way down? Which indicators do we use? What can we measure? What do we assess?
Phase 3: Selection of methods. Do we use retrospective techniques or baseline studies? What mix of methods is appropriate?
Phase 4: Presenting and learning from the results. How are we going to present the results? How are we going to read the assessment?
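To keep the answers from all four phases in one place, here is a minimal sketch of a design record; the MEDesign structure and every field value are hypothetical, chosen only to mirror the questions above.

```python
from dataclasses import dataclass

@dataclass
class MEDesign:
    """Illustrative record of the decisions taken in the four design phases."""
    # Phase 1: questions to consider at the start
    formal_assessment: bool        # formally monitor and evaluate, or not?
    monetize: bool                 # to monetize or not?
    purpose: str                   # main purpose of the impact assessment
    owner: str                     # who owns, judges, and designs the process
    # Phase 2: focus of the M&E process
    change_processes: list         # what kind of change processes we measure
    indicators: list               # which indicators we use
    # Phase 3: selection of methods
    retrospective: bool            # retrospective techniques vs. baseline study
    methods: list                  # the mix of methods chosen
    # Phase 4: presenting and learning from the results
    presentation: str              # how the results will be presented and read

design = MEDesign(
    formal_assessment=True,
    monetize=False,
    purpose="learning with partners, not proof",
    owner="KM team together with programme staff",
    change_processes=["community of practice participation"],
    indicators=["changed practices in project teams"],
    retrospective=True,
    methods=["most significant change stories", "rating sheets"],
    presentation="sense-making workshop with stakeholders",
)
print(design)
```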
