
An Overview of Existing Approaches and Tools for Performance Assessment and Benchmarking
Bjørn Andersen
Professor, Norwegian University of Science and
Technology
Research Director, SINTEF Technology and Society

Presentation Structure

 Holistic view
 The development of performance measurement
 Performance measurement approaches
 Performance measurement enables benchmarking
 The benchmarking concept
 Benchmarking approaches and initiatives
 Experiences with benchmarking
 Selected benchmarking tools
 Recommendations for performance measurement and
benchmarking

A Holistic Performance Management Model

[Figure: a compass-shaped holistic model linking stakeholders and their requirements, strategic planning, business processes, and day-to-day business process management; a performance measurement system provides external reference through benchmarking and supports organizational self-assessment, feeding an improvement toolbox (BPR, benchmarking, SPC, streamlining, idealizing).]

Performance Measurement Approaches

 Some different schools:


 The economics-driven initiatives, often at a national/super-national
level
 The more operational PI (performance indicator) focused
initiatives, often derived from manufacturing
 The Balanced Scorecard movement, which is perhaps more
management-oriented
 Public sector performance measurement initiatives
 The ERP system angle into performance measurement
 These have their distinct approaches and purposes, but
also overlap slightly

Economics-Driven

 Typically still quite productivity-focused (as opposed to performance-focused).
 Often developed and used by government (agencies),
international groups (UN, OECD).
 Normally consists of higher-level PIs that can be used for overall financial diagnosis (and benchmarking).
 The specific PIs used are less relevant for infrastructure
services, but some of the calculation models can be
useful.

Operational PI Focus
 Targeted at the enterprise level.
 Started as a rebellion against the economists, aiming to capture operational aspects of performance in the belief that these drive financial performance.
 Has led the development toward performance and the realization that performance is a many-splendored thing.
 Typical performance dimensions defined:
 Hard and soft measures.
 Financial and non-financial measures.
 Result and process measures.
 Measures for predicting future performance.
 The four classic dimensions (cost, time, quality, and flexibility).
 More modern dimensions (SHE, environmental impact, business ethics).

The ENAPS/APM Performance
Measurement Cube
[Figure: a cube with three axes. Measurement levels: business level, process level, function level, environment. Typology: engineer-to-order, make-to-order, assemble-to-order, make-to-stock. Measurement dimensions: cost, time, quality, volume, flexibility.]

The Balanced Scorecard Approach


 To a very large extent built on the foundations of the
economics-driven development and the manufacturing-
based work.
 A stroke of genius in packaging and marketing!
 Kaplan & Norton have sold millions of copies of their
books and there are BSC forums all over the world.
 Basically nothing new beyond showing in a clear manner
how strategy and measurement are linked.
 Defines a similar (albeit more limited) set of performance dimensions to that of the PI-focused school.

Public Sector Approaches
 Not as organized a school as the others.
 More about transforming private sector approaches to
public sector applications.

ERP-Based Development
 MRP (Manufacturing Resource Planning) => ERP
(Enterprise Resource Planning), largest software sector
beyond Microsoft.
 All-encompassing applications for company operations management and administration, which needed performance management as well.
 Based on available data in the ERP systems.

In Summary

 Performance measurement has come a long way in about 30 years:
 Large number of application areas.
 Used at many different levels.
 Key learnings:
 Balanced set of PIs, both dimensions and levels.
 Tailor the PMS (performance measurement system) to the intended applications.
 Use as few PIs as possible.
 For management overview, use traffic lights; for more operational applications, use instruments (a small sketch of the traffic-light idea follows below).
 Data collection costs and data quality almost demand computer-
based systems.
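As an illustration of the traffic-light idea above, here is a minimal sketch (not from the presentation) that classifies a PI value against a target and a tolerance band as green, yellow, or red; the PI names, targets, and thresholds are invented examples.

```python
# Minimal traffic-light sketch: classify each PI against its target.
# PI names, targets, and tolerance bands below are invented examples.

def traffic_light(value, target, tolerance, higher_is_better=True):
    """Return 'green', 'yellow', or 'red' for a PI value.

    green  : target met
    yellow : within the tolerance band around the target
    red    : outside the tolerance band
    """
    gap = (value - target) if higher_is_better else (target - value)
    if gap >= 0:
        return "green"
    if abs(gap) <= tolerance:
        return "yellow"
    return "red"

pis = {
    "on-time delivery (%)": (92.0, 95.0, 3.0, True),
    "indirect costs as % of price": (14.0, 12.0, 2.0, False),
    "customer satisfaction (1-10)": (8.4, 8.0, 0.5, True),
}

for name, (value, target, tol, higher) in pis.items():
    print(f"{name:32s} {traffic_light(value, target, tol, higher)}")
```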

Types of Benchmarking
Depending on whom it is being benchmarked against:
 Internal benchmarking (within one's own class).
 Competitive benchmarking (within the parallel class).
 Functional benchmarking (against a different school of the same type).
 Generic benchmarking (against a totally different school).
[Figure: concentric circles placing the own company inside competitors, inside other industries.]

Types of Benchmarking

Depending on what is being benchmarked:


 Performance benchmarking (how high should we jump?).
 Process benchmarking (how to get over?).
 Strategic benchmarking (where to jump?).

Benchmarking: Many Different Applications!

 Ranking; comparing organizations and their results to see which is better (possibly in pillory and stocks).
 Motivation; showing that others achieve good results and thus motivating improvement (or the 3 Ds).
 Learning and improvement; learn what the best do to
reach the benchmark and copy their practices
(benchlearning, enhanced by the strawman model).
 Regulation; using benchmarking data to define income
limits or improvement requirements.


SOME Benchmarking Initiatives


 Questionnaire driven:
 TOPP, Norwegian research project to establish self-audit and external
assessment tools.
 Operational/PI driven:
 ENAPS/APM, EU projects to develop a comprehensive performance
measurement approach and an online benchmarking database.
 UK Benchmarking Index, a DTI benchmarking service for SMEs in the UK.
 CII Benchmarking, a benchmarking service available to members of the
Construction Industry Institute for comparing projects.
 Criteria driven:
 Different quality awards (e.g., MBNQA, EQA, etc.).
 Regulatory/statistics driven:
 OFWAT, UK water regulation benchmarking.
 NVE, Norwegian power regulation benchmarking.

Lessons Learned from Benchmarking Initiatives

 TOPP Benchmarking:
 Self-audit: an extensive questionnaire for quantitatively ranking current and future importance and performance for a number of areas/business processes in the company. The data were analyzed by TOPP and a report with recommendations produced.
 External assessment: an equally extensive questionnaire used by an external assessor to qualitatively evaluate the performance of various business processes, summarized in a report with more detailed recommendations.
 Both data sets used for benchmarking to understand trends in the
industry, identify common problem areas, and target research.
 Learning:
 Gave by far the most detailed insight into the companies.
 Was very time-consuming and expensive to carry out.
 Benchmarking based on self-audit data was rather inaccurate; benchmarking based on external assessments was impossible due to their qualitative nature.


Lessons Learned from Benchmarking Initiatives

 ENAPS developed (in 1998, very early!) an online benchmarking system for manufacturing companies:
 Comprehensive set of business processes and corresponding PIs.
 Excel-based PI questionnaire used by companies to enter data and upload to the ENAPS database.
 Contributing data gave the right to do benchmarking queries into the database.
 Queries were defined by country, sector, and company size, with results returned as numbers or graphs (see the query sketch below).
 Reached about 500 entries, but attempts at establishing ENAPS as a commercial service failed, due
to:
 Cumbersome manual data collection and registration for the companies.
 Doubts about the quality of the data (many obvious errors).
 Confidentiality issues: one main selling point was the potential for follow-up process benchmarking, but companies were concerned about sharing information more openly.
 Too few entries into the database to return useful samples for queries.
 As a result, APM succeeded ENAPS by:
 Automating the data collection in the companies by adding an application on top of their ERP systems that captured data automatically from the systems' extensive databases.
 Simplifying the uploading and querying procedure by integrating everything into the APM tool.
 As with ENAPS, it turned out that many of the basic PIs that can fairly easily be measured are trivial and not very useful.
 Allowed users to also manually enter customized PIs, but these were only useful for internal purposes and not
possible to benchmark.
 Main learnings:
 PI standardization and easy, reliable data collection are musts; voluntary benchmarking is hard to sell.
 Other PI-driven initiatives have had similar experiences.
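To make the query mechanism mentioned above concrete, here is a minimal sketch of an ENAPS-style benchmarking query over a flat PI table, filtered by country, sector, and size class and returning summary statistics. The field names, sample records, and PI are invented; they do not reflect the actual ENAPS schema.

```python
# Sketch of an ENAPS-style benchmarking query: filter the database by
# country, sector, and size class, then summarize a chosen PI.
# Field names and sample records are invented for illustration.
from statistics import mean, median

database = [
    {"country": "NO", "sector": "furniture", "size": "SME",
     "on_time_delivery": 91.0},
    {"country": "NO", "sector": "furniture", "size": "SME",
     "on_time_delivery": 87.5},
    {"country": "DE", "sector": "automotive", "size": "large",
     "on_time_delivery": 96.2},
]

def benchmark_query(db, pi, country=None, sector=None, size=None):
    """Return count, mean, median, and best value of a PI for the sample."""
    sample = [row[pi] for row in db
              if (country is None or row["country"] == country)
              and (sector is None or row["sector"] == sector)
              and (size is None or row["size"] == size)]
    if not sample:
        return None  # too few entries to return a useful sample
    return {"n": len(sample), "mean": mean(sample),
            "median": median(sample), "best": max(sample)}

print(benchmark_query(database, "on_time_delivery",
                      country="NO", sector="furniture"))
```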

The ENAPS Approach

[Figure: companies submit PI data through a consultant; the data are anonymized and transferred via the Internet to the ENAPS database.]

APM+ENAPS: Real-time Benchmarking

[Figure: operational data are captured automatically from the company's ERP system by the APM tool, sent via the Internet to the benchmarking database, and returned to the benchmarking network as PI comparisons and best practices.]
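A minimal sketch of the automated flow the figure shows: compute a PI directly from (a stand-in for) the ERP database and prepare it for upload to the benchmarking database. The table layout, sample orders, and PI are assumptions for illustration, not the actual APM implementation.

```python
# Sketch of automated PI capture on top of an ERP database (APM-style).
# Table layout, sample data, and the upload target are invented for illustration.
import json
import sqlite3

conn = sqlite3.connect(":memory:")          # stand-in for the ERP database
conn.execute("CREATE TABLE orders (promised_date TEXT, delivered_date TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("2024-05-01", "2024-04-30"),
                  ("2024-05-03", "2024-05-06"),
                  ("2024-05-10", "2024-05-10")])

def compute_on_time_delivery(db):
    """Share of orders delivered on or before the promised date."""
    total, on_time = db.execute(
        "SELECT COUNT(*), SUM(delivered_date <= promised_date) FROM orders"
    ).fetchone()
    return on_time / total if total else None

# In a real APM-style setup this payload would be uploaded over the Internet
# to the benchmarking database; here it is only printed.
payload = json.dumps({"pi": "on_time_delivery",
                      "value": compute_on_time_delivery(conn)})
print(payload)
```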

Lessons Learned from Benchmarking Initiatives

 Quality awards:
 Performance divided into a certain number of areas (e.g., for EQA:
leadership and constancy of purpose, customer focus, corporate social
responsibility, people development and involvement, results orientation,
management by processes and facts, continuous learning, innovation and
improvement, partnership development).
 Points are awarded to each area, typically summing to a maximum total score of 1,000.
 First stage: self-assessment.
 Second stage (for scores above, e.g., 400): assessor review to identify the award winner (see the scoring sketch below).
 Benchmarks (scores) can be used to identify areas that need
improvement, find benchmarking partners, and see trends.
 Learning:
 Such criteria are too general to be very helpful in more detailed benchmarking.
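A small sketch of the two-stage scoring logic described above: area scores are summed toward a maximum of 1,000, and self-assessments above a threshold (400 is used here, as in the example) go on to assessor review. The areas and maximum points per area are invented placeholders, not the official award weights.

```python
# Sketch of a quality-award style scoring scheme: points per area sum to a
# maximum of 1,000, and scores above a threshold go to assessor review.
# The areas and maximum points below are placeholders, not official weights.

MAX_POINTS = {
    "leadership": 100,
    "customer focus": 200,
    "people development and involvement": 150,
    "results orientation": 300,
    "management by processes and facts": 250,
}  # sums to 1,000

REVIEW_THRESHOLD = 400  # e.g. scores above this trigger an assessor review

def total_score(self_assessment):
    """Sum the area scores, capping each area at its maximum."""
    return sum(min(score, MAX_POINTS[area])
               for area, score in self_assessment.items())

assessment = {"leadership": 70, "customer focus": 120,
              "people development and involvement": 90,
              "results orientation": 110,
              "management by processes and facts": 95}

score = total_score(assessment)
print(score, "-> assessor review" if score > REVIEW_THRESHOLD else "-> stop")
```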


Lessons Learned from Benchmarking Initiatives

 Regulatory/statistics driven:
 Examples presented earlier.
 Challenges also here to capture data easily and with sufficient
data quality.
 Added complexity in the analysis phase: simple gap analysis is no longer sufficient.
 Learning:
 This application of benchmarking shares the data collection and data quality challenges of company-oriented benchmarking.
 Analysis and improvement tools must be different.
 However, this type of benchmarking can also be used at the single
organization level based on the same performance data, but with
different analysis.

Benchmarking Analysis Approaches

 Partial methods:
  PI-based gap analysis
  Performance matrix
  M2 analysis
  Relations diagram
 Total methods:
  Index methods (e.g., Total Factor Productivity)
  Limit methods:
   Non-parametric: Data Envelopment Analysis
   Parametric: stochastic limit, least squares, EFFOMETER
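Of the total methods listed, Data Envelopment Analysis is frequently used in regulatory benchmarking; below is a minimal sketch of the input-oriented CCR multiplier model solved with scipy.optimize.linprog. The single-input, single-output data for four units are invented.

```python
# Minimal DEA sketch (input-oriented CCR, multiplier form) with invented data.
# For each unit: maximize weighted outputs, subject to weighted inputs = 1
# and no unit scoring above 1 with the same weights.
import numpy as np
from scipy.optimize import linprog

inputs = np.array([[20.0], [30.0], [40.0], [25.0]])      # e.g. staff
outputs = np.array([[100.0], [120.0], [200.0], [90.0]])  # e.g. deliveries

def dea_efficiency(unit):
    n_out, n_in = outputs.shape[1], inputs.shape[1]
    # Decision variables: output weights u, then input weights v.
    c = np.concatenate([-outputs[unit], np.zeros(n_in)])  # maximize u.y_o
    A_ub = np.hstack([outputs, -inputs])                  # u.y_j - v.x_j <= 0
    b_ub = np.zeros(len(inputs))
    A_eq = np.concatenate([np.zeros(n_out), inputs[unit]]).reshape(1, -1)
    b_eq = [1.0]                                          # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None))
    return -res.fun

for unit in range(len(inputs)):
    print(f"unit {unit}: efficiency = {dea_efficiency(unit):.2f}")
```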


Normalizing
A common argument against benchmarking is:
"We are so different from everybody else that any
comparison against other companies would be a
waste of time."

Most things can be compared by transforming pears into something similar to apples. Normalizing often consists of recalculating into:
 Per year.
 Per employee.
 Etc., often expressed as a ratio or percentage.
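A minimal sketch of this normalization step: raw figures from two differently sized (invented) companies are recalculated into per-employee figures and percentages so they can be compared.

```python
# Normalizing sketch: recalculate raw figures into comparable ratios
# (per employee, as a percentage of turnover). All numbers are invented.

companies = {
    "Company A": {"turnover": 50_000_000, "employees": 120,
                  "indirect_costs": 6_500_000},
    "Company B": {"turnover": 400_000_000, "employees": 950,
                  "indirect_costs": 44_000_000},
}

for name, c in companies.items():
    turnover_per_employee = c["turnover"] / c["employees"]
    indirect_cost_share = 100 * c["indirect_costs"] / c["turnover"]
    print(f"{name}: {turnover_per_employee:,.0f} turnover/employee, "
          f"{indirect_cost_share:.1f}% indirect costs")
```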

Gap Analysis (Per PI)
[Figure: performance over time for own performance and the partner's performance, showing the former gap (at -T1), the current gap (today), and the future gap (at T1).]
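A minimal sketch of the PI-based gap analysis pictured above: fit straight trend lines to own and partner performance histories and read off the former, current, and projected future gap. The data points are invented.

```python
# Gap analysis sketch: compare own and partner performance trends and read
# off the former, current, and projected future gap. Data points are invented.
import numpy as np

years = np.array([-2, -1, 0])           # two historical points plus "today"
own = np.array([70.0, 73.0, 75.0])      # own performance on some PI
partner = np.array([80.0, 84.0, 88.0])  # benchmarking partner's performance

own_fit = np.polyfit(years, own, 1)         # linear trend: slope, intercept
partner_fit = np.polyfit(years, partner, 1)

for label, t in [("former gap", -1), ("current gap", 0), ("future gap", 1)]:
    gap = np.polyval(partner_fit, t) - np.polyval(own_fit, t)
    print(f"{label:12s} (t={t:+d}): {gap:.1f}")
```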

Performance Matrix
[Figure: a matrix of current performance (vertical axis, 1 to 9) against importance (horizontal axis, 1 to 9), split at 5 into four quadrants: unimportant (low importance, low performance), overkill (low importance, high performance), must be improved (high importance, low performance), and OK (high importance, high performance).]
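A small sketch of the performance matrix above: each PI is scored 1 to 9 on importance and current performance and assigned to one of the four quadrants, with 5 as the dividing line. The PIs and scores are invented.

```python
# Performance matrix sketch: place each PI in a quadrant based on its
# importance and current performance (both scored 1-9). Scores are invented.

def quadrant(importance, performance, cutoff=5):
    if importance < cutoff and performance < cutoff:
        return "unimportant"
    if importance < cutoff:
        return "overkill"
    if performance < cutoff:
        return "must be improved"
    return "OK"

pis = {
    "time from order to delivery": (9, 4),
    "indirect costs as % of price": (6, 7),
    "number of internal reports": (2, 8),
}

for name, (importance, performance) in pis.items():
    print(f"{name:32s} -> {quadrant(importance, performance)}")
```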

M2 Analysis
[Figure: a spider chart (scale 0.0 to 1.0) comparing us and Partner A on five PIs: time from order to start of manufacturing, time from order to delivery, indirect costs as % of price, customer satisfaction, and complaints.]
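A minimal sketch of the normalization behind an M2-style spider chart: each PI is rescaled to a 0-1 scale across the partners (1 = best), with "lower is better" PIs inverted, so that "us" and "Partner A" can be plotted on the same axes. The values are invented.

```python
# M2 analysis sketch: rescale each PI to a 0-1 scale across the partners
# (1 = best) so they can be drawn on one spider chart. Values are invented.

pis = {  # name: (our value, Partner A's value, higher_is_better)
    "time from order to delivery (days)": (12.0, 8.0, False),
    "indirect costs as % of price": (15.0, 11.0, False),
    "customer satisfaction (1-10)": (8.2, 7.5, True),
}

def rescale(values, higher_is_better):
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    scaled = [(v - lo) / span for v in values]
    return scaled if higher_is_better else [1.0 - s for s in scaled]

for name, (us, partner, higher) in pis.items():
    us_s, partner_s = rescale([us, partner], higher)
    print(f"{name:36s} us={us_s:.1f}  Partner A={partner_s:.1f}")
```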

Relations Diagram
[Figure: a relations diagram connecting the PIs, with arrow counts: time from order to start of manufacturing (0 in, 4 out), time from order to delivery (1 in, 3 out), indirect costs as % of price (2 in, 0 out), customer satisfaction (2 in, 0 out), customer complaints (2 in, 0 out).]
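A small sketch of how the in/out counts in the diagram above can be derived: represent the cause-and-effect arrows between PIs as pairs and count the arrows out of and into each PI; a PI with many outgoing arrows is a candidate driver. The arrow list mirrors the figure.

```python
# Relations diagram sketch: count incoming and outgoing cause-effect arrows
# per PI. The arrows mirror the figure above.
from collections import Counter

arrows = [  # (cause, effect)
    ("time from order to start manufacturing", "time from order to delivery"),
    ("time from order to start manufacturing", "indirect costs as % of price"),
    ("time from order to start manufacturing", "customer satisfaction"),
    ("time from order to start manufacturing", "customer complaints"),
    ("time from order to delivery", "indirect costs as % of price"),
    ("time from order to delivery", "customer satisfaction"),
    ("time from order to delivery", "customer complaints"),
]

out_count = Counter(cause for cause, _ in arrows)
in_count = Counter(effect for _, effect in arrows)

pis = sorted(set(out_count) | set(in_count))
for pi in pis:
    print(f"{pi:40s} {in_count[pi]} in, {out_count[pi]} out")

root = max(pis, key=lambda pi: out_count[pi])
print("most likely driver:", root)
```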

Analysis Method Suitability

 Enterprise-oriented benchmarking:
 1-1 analysis approaches are best suited.
 Larger-population benchmarking:
 More comprehensive statistical approaches are more relevant.
 1-1 and process benchmarking, with their corresponding tools, can also be used in this case.


Three Levels of Information

[Figure: a pyramid with three levels of information: performance level, practice level, and enablers.]

Comparison of Flow Charts
[Figure: our process and the partner's process drawn as flow charts side by side, with differing activities and branches revealing where the practices differ.]

Root Cause Analysis

Low level of Work-in-Progress
 Why? Maintains no stock of finished goods
 Why? Short manufacturing time
 Why? Runs small batch sizes
 Why? Frequent and swift deliveries from the suppliers
 Why? Extremely good relationships with the suppliers

Final Personal Recommendations
 Integrate performance measurement and benchmarking to make performance management an integral part of the organization.
 Especially the introduction of systematic performance measurement
has many benefits beyond benchmarking:
 Leads to a debate about strategy and why we are here.
 Creates a need for cleaning up management systems and procedures
for follow-up.
 Performance data can be used in communication with the public and create a positive attitude.
 Focus on performance data will normally trigger improvements.
 Increases the awareness of stakeholder satisfaction.
 Balance the use of benchmarking for regulatory purposes with improvement-oriented 1-1 and process benchmarking (possibly through CIGs).
 Use insight created through benchmarking to generate best practice
libraries.

