
SYSTEMATIC REVIEW

BMJ Qual Saf: first published as 10.1136/bmjqs-2013-001862 on 11 September 2013. Downloaded from http://qualitysafety.bmj.com/ on May 28, 2023 by guest. Protected by copyright.
Systematic review of the application
of the plan–do–study–act method to
improve quality in healthcare
Michael J Taylor,1,2 Chris McNicholas,2 Chris Nicolay,1 Ara Darzi,1
Derek Bell,2 Julie E Reed2

▸ Additional material is published online only. To view please visit the journal online (http://dx.doi.org/10.1136/bmjqs-2013-001862).

1Department of Surgery and Cancer, Imperial College London, London, UK
2National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for North-West London, London, UK

Correspondence to Michael J Taylor, Academic Surgical Unit, 10th Floor, QEQM building, St Mary's Hospital, Paddington, London W2 1NY, UK; mtaylor3@imperial.ac.uk

Received 29 January 2013
Revised 25 June 2013
Accepted 4 July 2013
Published Online First 23 August 2013

▸ http://dx.doi.org/10.1136/bmjqs-2013-002703

To cite: Taylor MJ, McNicholas C, Nicolay C, et al. BMJ Qual Saf 2014;23:290–298.

ABSTRACT
Background Plan–do–study–act (PDSA) cycles provide a structure for iterative testing of changes to improve quality of systems. The method is widely accepted in healthcare improvement; however there is little overarching evaluation of how the method is applied. This paper proposes a theoretical framework for assessing the quality of application of PDSA cycles and explores the consistency with which the method has been applied in peer-reviewed literature against this framework.
Methods NHS Evidence and Cochrane databases were searched by three independent reviewers. Empirical studies were included that reported application of the PDSA method in healthcare. Application of PDSA cycles was assessed against key features of the method, including documentation characteristics, use of iterative cycles, prediction-based testing of change, initial small-scale testing and use of data over time.
Results 73 of 409 individual articles identified met the inclusion criteria. Of the 73 articles, 47 documented PDSA cycles in sufficient detail for full analysis against the whole framework. Many of these studies reported application of the PDSA method that failed to accord with primary features of the method. Less than 20% (14/73) fully documented the application of a sequence of iterative cycles. Furthermore, a lack of adherence to the notion of small-scale change is apparent and only 15% (7/47) reported the use of quantitative data at monthly or more frequent data intervals to inform progression of cycles.
Discussion To progress the development of the science of improvement, a greater understanding of the use of improvement methods, including PDSA, is essential to draw reliable conclusions about their effectiveness. This would be supported by the development of systematic and rigorous standards for the application and reporting of PDSAs.

INTRODUCTION
Delivering improvements in the quality and safety of healthcare remains an international challenge. In recent years, quality improvement (QI) methods such as plan–do–study–act (PDSA) cycles have been used in an attempt to drive such improvements. The method is widely used in healthcare improvement; however there is little overarching evaluation of how the method is applied. This paper proposes a theoretical framework for assessing the quality of application of PDSA cycles and explores the quality and consistency of PDSA cycle application against this framework as documented in peer-reviewed literature.

Use of PDSA cycles in healthcare
Despite increased investment in research into the improvement of healthcare, evidence of effective QI interventions remains mixed, with many systematic reviews concluding that such interventions are only effective in specific settings.1–4 To make sense of these findings, it is necessary to understand that delivering improvements in healthcare requires the alteration of processes within complex social systems that change over time in predictable and unpredictable ways.5 Research findings highlight the influential effect that local context can have on the success of an intervention6 7 and, as such, 'single-bullet' interventions are not anticipated to deliver consistent improvements. Instead, effective interventions need to be complex and multi-faceted8–11 and developed iteratively to adapt to the local context and respond to unforeseen obstacles and unintended effects.12 13 Finding effective QI methods to support iterative development to test and evaluate

290 Taylor MJ, et al. BMJ Qual Saf 2014;23:290–298. doi:10.1136/bmjqs-2013-001862


interventions to care is essential for delivery of high-quality and high-value care in a financially constrained environment.
PDSA cycles provide one such method for structuring iterative development of change, either as a standalone method or as part of wider QI approaches, such as the Model for Improvement (MFI), Total Quality Management, Continuous QI, Lean, Six Sigma or 'Quality Improvement Collaboratives'.3 4 14 Despite increased use of QI methods, the evidence base for their effectiveness is poor and under-theorised.15–17 PDSA cycles are often a central component of QI initiatives; however, few formal objective evaluations of their effectiveness or application have been carried out.18 Some PDSA approaches have been demonstrated to result in significant improvements in care and patient outcomes,19 while others have demonstrated no improvement at all.20–22
Although at the surface level these results appear disheartening for those involved in QI, there is a need to explore the extent to which the PDSA method has been successfully deployed to draw conclusions from these studies. Rather than see the PDSA method as a 'black box' of QI,23 it is important to understand that the use of PDSA cycles is, itself, a complex intervention made up of a series of interdependent steps and key principles that inform its application5 24 25 and that this application is also affected by local context.26 To interpret the results regarding the outcome(s) from the application of PDSA cycles (eg, whether processes or outcomes of care improved) and gauge the effectiveness of the method, it is necessary to understand how the method has been applied.
No formal criteria for evaluating the application or reporting of PDSA cycles currently exist. It is only in recent years, through SQUIRE guidelines, that frameworks for publication have been developed that explicitly consider description of PDSA application.27 28 We consider that such criteria are necessary to support and assess the effective application of PDSA cycles and to increase their legitimacy as a scientific method for improvement. We revisited the origins and theory of the method to develop a theoretical framework to evaluate the application of the method.

The origins and theory of PDSA cycles
The PDSA method originates from industry and Walter Shewhart and Edward Deming's articulation of iterative processes, which eventually became known as the four stages of PDSA.25 PDCA (plan–do–check–act) terminology was developed following Deming's early teaching in Japan.29 The terms PDSA and PDCA are often used interchangeably in reference to the method. This distinction is rarely referred to in the literature and, for the purpose of this article, we consider both PDSA and PDCA but refer to the methodologies generally as 'PDSA' cycles unless otherwise stated.
Users of the PDSA method follow a prescribed four-stage cyclic learning approach to adapt changes aimed at improvement. In the 'plan' stage a change aimed at improvement is identified, the 'do' stage sees this change tested, the 'study' stage examines the success of the change and the 'act' stage identifies adaptations and next steps to inform a new cycle. The MFI30 and FOCUS31 (see figure 1) frameworks have been developed to precede the use of PDSA and PDCA cycles,30 31 respectively (table 1).
In comparison to more traditional healthcare research methods (such as randomised controlled trials, in which the intervention is determined in advance and attempts are made to eliminate or control variation), the PDSA cycle presents a pragmatic scientific method for testing changes in complex systems.32 The four stages mirror the scientific experimental method33 of formulating a hypothesis, collecting data to test this hypothesis, analysing and interpreting the results and making inferences to iterate the hypothesis.
The pragmatic principles of PDSA cycles promote the use of a small-scale, iterative approach to test interventions, as this enables rapid assessment and provides flexibility to adapt the change according to feedback to ensure fit-for-purpose solutions are developed.10 12 13 Starting with small-scale tests provides users with freedom to act and learn: minimising risk to patients, the organisation and resources required, and providing the opportunity to build evidence for change and engage stakeholders as confidence in the intervention increases.
In line with the scientific experimental method, the PDSA cycle promotes prediction of the outcome of a test of change and subsequent measurement over time (quantitative or qualitative) to assess the impact of an intervention on the process or outcomes of interest. Thus, learning is primarily achieved through interventional experiments designed to test a change. In recognition of working in complex settings with inherent variability, measurement of data over time helps understand natural variation in a system, increase awareness of other factors influencing processes or outcomes, and understand the impact of an intervention.
As with all scientific methods, documentation of each stage of the PDSA cycle is important to support scientific quality, local learning and reflection, and to ensure knowledge is captured to support organisational memory and transferability of learning to other settings.
This review examines the application of PDSA cycles as determined by these principal features of the PDSA method described above. We recognise that a number of health and research related contextual factors may affect application of the method, but these factors are beyond the scope of this review. The review intends to improve the understanding of

Figure 1 The Model for Improvement; FOCUS.

whether the PDSA method is being used and reported in line with the literature-informed criteria and therefore inform the interpretation of studies that have used PDSA cycles to facilitate iterative development of an intervention.

METHODS
A systematic narrative review was conducted in adherence to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement.34

Search
The search was designed to identify peer-reviewed publications describing empirical studies that applied the PDSA method. Taking into account the development of the method and terminology, the search terms used were 'PDSA', 'PDCA', 'Deming Cycle', 'Deming Circle', 'Deming Wheel' and 'Shewhart Cycle'. No year of publication restrictions were imposed.

Information sources
The following databases were searched for articles: Allied and Complementary Medicine Database (AMED; 1985 to present), British Nursing Index (BNI; 1985 to present), Cumulative Index to Nursing and Allied Health Literature (CINAHL; 1981 to present), Embase (1980 to present), Health Business Elite (EBSCO Publishing, Ipswich, Massachusetts, USA), the Health Management Information Consortium (HMIC), MEDLINE from PubMed (1950 to present) and PsycINFO (1806 to present) using the NHS Evidence online library (REF), and the Cochrane Database of Systematic Reviews. The last search date was 25 September 2012.

Data collection process and study selection
Data were collected and tabulated independently by MJT, CM and CN in a manner guided by the Cochrane Handbook. Eligibility was decided independently, in a standardised manner, and disagreements were resolved by consensus. If an abstract was not available from the database, the full-text reference was accessed.
Inclusion criteria for articles were as follows: published in a peer-reviewed journal; describes the PDSA method being applied to improve quality in a healthcare setting; published in English. Editorial letters, conference abstracts, opinion and audit articles were excluded from the study selection.

Data items
A theoretical framework was constructed by compartmentalising the key features of the PDSA method into observable variables for evaluation (table 2). This framework was developed in accordance with recommendations for PDSA use cited in the literature describing the origins and theory of the method. Face validity of the framework was achieved through discussion among authors, with QI facilitators and at local research meetings.
Data were collected regarding application of the PDSA method in line with the theoretical framework. Other data collected included first author, year of publication, country, area of healthcare, use of PDSA or PDCA terminology, and use of MFI or FOCUS as

Table 1 Description of the plan–do–study–act (PDSA) cycle method according to developers and commentators

Plan
▸ Deming (1986)25 (original description, relating to manufacturing): Plan a change or test aimed at improvement.
▸ Langley (1996)30 (how the method may be adapted for use in healthcare contexts): Identify objective; identify questions and predictions; plan to carry out the cycle (who, what, where, when).
▸ Speroff and O'Connor (2004)33 (how the method is analogous to scientific methodology): Formation of a hypothesis for improvement.

Do
▸ Deming: Carry out the change or test (preferably on a small scale).
▸ Langley: Execute the plan; document problems and unexpected observations; begin data analysis.
▸ Speroff and O'Connor: Conduct study protocol with collection of data.

Study
▸ Deming: Examine the results. What did we learn? What went wrong?
▸ Langley: Complete the data analysis; compare data to predictions; summarise what was learnt.
▸ Speroff and O'Connor: Analysis and interpretation of the results.

Act
▸ Deming: Adopt the change, abandon it, or run through the cycle again.
▸ Langley: What changes are to be made? What will the next cycle entail?
▸ Speroff and O'Connor: Iteration for what to do next.
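The four stages in table 1 can be sketched as a simple loop in which each 'act' stage feeds the next 'plan' stage. This is an illustrative sketch only: the names (`run_pdsa`, `CycleRecord`) and the adopt/adapt/abandon decision rule are assumptions for demonstration, not part of the method as specified by the developers cited above.

```python
from dataclasses import dataclass

@dataclass
class CycleRecord:
    """Documentation of one PDSA cycle (all four stages captured)."""
    change: str
    prediction: str
    observed: str
    decision: str  # "adopt", "adapt" or "abandon"

def run_pdsa(change, predict, test, decide, max_cycles=5):
    """Iterate plan-do-study-act: the 'act' of one cycle informs the next 'plan'."""
    records = []
    for _ in range(max_cycles):
        prediction = predict(change)             # plan: state the expected outcome
        observed = test(change)                  # do: small-scale test of the change
        decision = decide(prediction, observed)  # study: compare result with prediction
        records.append(CycleRecord(change, prediction, observed, decision))
        if decision in ("adopt", "abandon"):     # act: stop iterating
            break
        change = change + " (revised)"           # act: adapt the change for the next cycle
    return records
```

Keeping one record per cycle mirrors the documentation feature the review assesses: every stage of every cycle is captured, so learning survives even when a change is abandoned.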

supporting frameworks. Ratios were used to analyse the results regarding the majority of variables, and mean scores regarding data associated with length of study, length of PDSA cycle and sample size were also used for analysis. Data were analysed independently by MJT and CM. Discrepancies (which occurred in less than 3% of data items) were resolved by consensus.

Risk of bias in individual studies
The present review aimed to assess the reported application of the PDSA method and the results of individual studies were not analysed in this review.

Risk of bias across studies
Despite our review being focused on reported application, rather than success of interventions, it may still be possible that publication bias affected the results of this study. Research that used PDSA methodology, but did not yield successful results, may be less likely to get published than reports of successful PDSA interventions.

RESULTS
Study selection
A search of the databases yielded 942 articles. After removal of duplicates, 409 remained; 216 and 120

Table 2 Theoretical framework based on key features of the plan–do–study–act (PDSA) cycle method

Iterative cycles
Description: To achieve an iterative approach, multiple PDSA cycles must occur. Lessons learned from one cycle link to and inform cycles that follow. Depending on the knowledge gained from a PDSA cycle, the following cycle may seek to modify, expand, adopt or abandon a change that was tested.
How measured: Were multiple cycles used? Were multiple cycles linked to one another (ie, does the 'act' stage of one cycle inform the 'plan' stage of the cycle that follows)? When isolated cycles were used, were future actions postulated in the 'act' stage?

Prediction-based test of change
Description: A prediction of the outcome of a change is developed in the 'plan' stage of a cycle. This change is then tested and examined by comparison of results with the prediction.
How measured: Was a change tested? Was an explicit prediction articulated?

Small-scale testing
Description: As certainty of success of a test of change is not guaranteed, PDSAs start small in scale and build in scale as confidence grows. This allows the change to be adapted according to feedback, minimises risk and facilitates rapid change and learning.
How measured: Sample size per cycle? Temporal duration of cycles? Number of changes tested per cycle? Did sequential cycles increase scale of testing?

Use of data over time
Description: Data over time increases understanding regarding the variation inherent in a complex healthcare system. Use of data over time is necessary to understand the impact of a change on the process or outcome of interest.
How measured: Were data collected over time? Were statistics used to test the effect of changes and/or understand variation?

Documentation
Description: Documentation is crucial to support local learning and transferability of learning to other settings.
How measured: How thoroughly was the application of the PDSA method detailed in the reports? Was each stage of the PDSA cycles documented?
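The framework in table 2 reduces each article to a set of observable yes/no variables. A minimal sketch of such a scoring record might look as follows; the field and method names are illustrative assumptions, not the authors' actual data-extraction instrument.

```python
from dataclasses import dataclass

@dataclass
class PDSAAssessment:
    """One article scored against the five framework features (table 2).
    Iterative cycles are split into two observables, as in the framework."""
    multiple_cycles: bool       # iterative cycles: more than one cycle reported
    cycles_linked: bool         # iterative cycles: 'act' of one informs next 'plan'
    explicit_prediction: bool   # prediction-based test of change
    started_small_scale: bool   # initial small-scale testing
    data_over_time: bool        # three or more data points over time
    stages_documented: bool     # each PDSA stage documented

    def compliant_features(self) -> int:
        """Count how many of the five key features the article satisfies;
        the iterative-cycles feature needs both multiple AND linked cycles."""
        return sum([
            self.multiple_cycles and self.cycles_linked,
            self.explicit_prediction,
            self.started_small_scale,
            self.data_over_time,
            self.stages_documented,
        ])
```

A tally of `compliant_features() == 5` would correspond to the full compliance that, per the discussion below, only 2 of 73 reviewed articles demonstrated.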

were further discarded following review of abstracts and full texts, respectively. Excluded articles did not apply the PDSA method as part of an empirical study, coincidentally used the acronyms PDSA or PDCA for different terms, or were abstracts for conferences or poster presentations. A total of 73 articles met the inclusion criteria and were included in the review (see figure 2).

Figure 2 PRISMA diagram.

General study characteristics
Country of study
The retrieved articles describe studies conducted in the USA (n=46), the UK (n=13), Canada (n=3), Australia (n=3), the Netherlands (n=2) and one each from six other countries (see online supplementary appendix A for a complete synthesis of results).

Healthcare discipline to which method was applied
This varied across acute and community care and clinical and organisational settings. The most common settings were those of pain management and surgery (six articles each).

Method terminology
Of the 73 articles identified, 42 used 'PDSA' terminology and 31 referred to the method as 'PDCA'. Eight of the PDSA articles reported using the MFI, and 20 of the PDCA articles used the preceding FOCUS framework. One article described use of both FOCUS and MFI. Over time there was an increase in the prevalence of PDSA use with PDCA use diminishing (see online supplementary figure S1). The earliest reported use of PDCA and PDSA in healthcare was 1993 and 2000, respectively.

Documentation
The following four categories were used to describe the extent to which cycles were documented in articles (n=73): no detail of cycles (n=16); themes of cycles (but no additional details) (n=8); details of individual cycles, but not of stages within cycles (n=8); details of cycles including separated information on stages of cycles (n=41).
Analysis of articles against the developed framework was dependent on the extent to which the application of PDSA cycles was reported. Articles that provided no details of cycles or only themes of cycles were insufficient for full review and excluded from analysis against all features. Articles that provided further details of cycles completed (n=49) were included for analysis against the remaining four features of the framework. A full breakdown of findings can be viewed in online supplementary appendix B.

Application of method
Iterative cycles (n=49)
Fourteen articles described a sequence of iterative cycles (two or more cycles with lessons learned from one cycle linking to and informing a subsequent cycle), 33 described isolated cycles that are not linked, and 2 articles described cycles that used PDSA stages in the incorrect order (in one article, one plan, one do, two checks and three acts were described, PDACACA35; a further study did not report use of a 'check' stage, PDA36); these are excluded from further review. Of the 33 articles that described non-iterative cycles, 29 reported a single cycle being used, and 4 described multiple, isolated (non-sequential) cycles. Although future actions are often suggested in articles that reported a single cycle, only three explicitly mentioned the possibility of further cycles taking place. A total of 13.6% (3/22) of PDCA studies described the application of iterative cycles compared with 44% (11/25) of PDSA studies (see figure 3).

Prediction-based testing of change (n=47)
The aims of the cycles adhered to one of two themes: tests of a change; and collection or review of data without a change made. Of the 33 articles with single cycles, 30 aimed to test a change while 3 used the PDSA method to collect or review data. Of the 14 articles demonstrating sequential cycle use, 8 solely used their cycles to test change while 5 began with a cycle collecting or reviewing data followed by cycles testing change. One article described a mixture of cycles testing changes and cycles that involved collection/review of data. Four of the 47 studies contained an explicit prediction regarding the outcome of a

change; all 4 aimed to test a change (see online supplementary table S1).

Figure 3 Iterative nature of cycles for all articles and split by plan–do–check–act and plan–do–study–act terminology.

Small-scale testing (n=47)
Scale was assessed in three ways: sample size, duration and complexity. Sample size refers to the quantity of observations used to measure the change; duration refers to the length of PDSA cycle application; and complexity refers to the quantity of changes administered per cycle.

Sample size
Patient data, staff data and case data were used as samples within PDSA cycles. Twenty-seven articles reported a sample size from at least one of their cycles. Twenty-one of these were isolated cycle studies with sample size ranging from 7 to 2079 (mean=323.33, SD=533.60). The remaining six studies reporting individual cycle sample sizes used iterative cycles; the sample size of the first cycles of these ranged from 1 to 34 (mean=16.75, SD=11.47). Two of these studies described the use of incremental sample sizes across cycles, three used non-incremental sample sizes across cycles, and one changed the type of sample. Of the eight iterative cycle articles that did not report individual cycle sample sizes, two did not differentiate sample sizes between cycles and instead gave an overall sample for the chain of cycles, and six did not report sample size.

Duration
Reported study duration of isolated cycles ranged from 2 weeks to 5 years (mean=11.91 months, SD=12.81). Only five articles describing iterative cycles explicitly reported individual cycle duration. Individual cycle duration could be estimated from the total duration of the PDSA cycle chain and the number of cycles conducted, resulting in approximate cycle lengths ranging from three cycles in 1 day to one cycle in 16 months (mean=5.41 months, SD=4.80; see online supplementary figure S2). The total PDSA cycle duration for series of iterative cycles (first to last cycle of one chain) ranged from 1 day to 4 years (mean=20.38 months, SD=20.39).

Complexity
Twenty-two articles reported more than one change being tested within a single cycle. Of the articles describing iterative cycles, 42% administered more than one change per cycle compared with 48% of the articles describing non-iterative PDSA cycles.

Data over time (n=47)
All studies used a form of qualitative or quantitative data to assess cycles. Studies were categorised according to four types of reporting quantitative data: regular (n=15), three or more data points with consistent time intervals; non-regular (n=16), before and after or per PDSA cycle; single data point (n=8), a single data point after PDSA cycle(s); and no quantitative data reported (n=8). Of the 15 articles that used regular data, only 7 used monthly or more frequent data intervals (see online supplementary figure S3 for full frequency of regular quantitative data reporting). No studies reported using statistical process control to analyse data collected from PDSA cycles. Eleven included analysis of data using inferential statistical tests (five of these studies collected isolated data, six involved continuous data collection).
Of the eight articles that did not report any quantitative data, two reported that quantitative analyses had taken place but did not present the findings, and six described the use of qualitative feedback only (one non-regular, five single data point). Qualitative data were gathered through a range of mechanisms, from informal staff or patient feedback to structured focus groups.

DISCUSSION
PDSA cycles offer a supporting mechanism for iterative development and scientific testing of improvements in complex healthcare systems. A review of the historic development and rationale behind PDSA cycles has informed the development of a theoretical framework to guide the evaluation of PDSA cycles against use of iterative cycles, initial small-scale testing, prediction-based testing of change, use of data over time and documentation.
Using these criteria to assess peer-reviewed publications of PDSA cycles demonstrates an inconsistent approach to the application and reporting of PDSA cycles and a lack of adherence to key principles of the method. Only 2/73 articles37 38 demonstrated compliance with criteria in all five principles. Assessment of compliance was problematic due to the marked variation in reporting of this method, which reflects a lack of standardised reporting requirements for the PDSA method.
From the articles that reported details of PDSA cycles it was possible to ascertain that variation is

inherent not just in reporting standards, but in the conduct of the method, implying that the key principles of the PDSA method are frequently not followed. Less than 20% (14/73) of reviewed articles reported the conduct of iterative cycles of change, and of these, only 15% (2/14) used initial small-scale tests with increasing scale as confidence in the intervention developed. These results suggest that the full benefits of the PDSA method would probably not have been realised in these studies. Without an iterative approach, learning from one cycle is not used to inform the next cycle, and therefore it is unlikely that interventions will be adapted and optimised for use in a particular setting. Furthermore, large-scale cycles risk significant resource investment in an intervention that has not been tested and optimised within that environment, and risk producing 'false' negatives.
Only 14% (7/47) of articles reported use of regular data over time at monthly or more frequent intervals, indicating a lack of understanding around the use of the PDSA method to track change within a 'live' system, and limiting the ability to interpret the results from the study. Cycles that included an explicit prediction of outcomes were reported in only 9% (4/47) of articles, suggesting that PDSA cycles were not used as learning cycles to test and revise theory-based predictions.
Overall these results demonstrate poor compliance with key principles of the PDSA method, suggesting that it is not being used optimally. The increasing trend in using PDSA (as opposed to 'PDCA') cycles in recent years, however, does seem to have been accompanied by an increase in compliance with some key principles, such as use of iterative cycles. Deming was cautious over the use of the 'PDCA' terminology and warned it referred to an explicitly different process, referring to a quality control circle for dealing with faults within a system, rather than the PDSA process, which was intended for iterative learning and improvement of a product or a process.39 This subtle difference in terminologies may help to explain the better compliance with key methodological principles in studies that refer to the method as 'PDSA'.
One of the articles identified in the search included comments by the authors that the PDSA method should be 'more realistically represented',40 as ineffective cycles can be 'abandoned' early on, making it needless to go through all four stages in each iteration. These comments may provide insight into an important potential misunderstanding of the PDSA methodology. Ineffective changes will result in learning, which is a fundamental principle behind a PDSA cycle. However minor this abandoned trial may have been, it can still be usefully described as a PDSA cycle. A minor intervention may be planned (P) and put into practice (D). A barrier may be encountered (S), resulting in a decision being made to retract the intervention, and to do something differently (A).
The theoretical framework presented in this paper highlights the complexity of PDSA cycles and the underpinning knowledge required for correct application. The considerable variation in application observed in the reported literature suggests that caution should be taken in interpreting results from evaluations in which PDSAs are used in a controlled setting and as a 'black box' of QI. This review did not compare the effectiveness of use to reported outcomes and therefore this study does not conclude whether better application of the PDSA method results in better outcomes, but instead draws on theoretical principles of PDSAs to rationalise why this would be expected. Prospective mechanistic studies exploring the effective application of the method as well as study outcomes would be of greater use in drawing conclusions regarding the effectiveness of the method. The framework presented in this paper could act as a good starting point for such studies.
The fact that only peer-reviewed publications were assessed in this study means that results may be affected by publication bias. This is anticipated both in terms of what is accepted for publication and also the level and type of detail that is requested and allowed in typical publications (eg, before and after studies are more common than presenting data over time, and this may make these types of studies easier to publish). Though QI work may be easier to publish now through recent changes in publication guidelines,27 possible publication outlets continue to be relatively limited.
To support systematic reporting and encourage appropriate usage, we suggest that reporting guidelines be produced for users of the PDSA method to increase transparency as to the issues that were encountered and how they were resolved. While PDSA is analogous to a scientific method, it appears to be rarely used or reported with scientific rigour, which, in turn, inhibits perceptions of PDSA as a scientific method. Such guidelines are essential to increase the scientific legitimacy of the PDSA method as well as to improve scientific rigour of application and reporting. Although the SQUIRE guidelines make reference to the potential use of PDSA cycles, further support to users and teachers, and publication of this improvement method, seems necessary. Consistent reporting of PDSA structure would allow meta-evaluation and systematic reviews to further build the knowledge of how to use such methods effectively and the principles to apply to increase chances of success.
It is clear from these findings that there is much room for improvement in the application and use of the PDSA method. Previous studies have discussed the influence of different context factors on the use of QI methods, such as motivation, data support infrastructure and leadership.20 22 41–43 Understanding how high-quality usage can be promoted and supported needs to become the focus of further research if such

QI methods are going to be used effectively in mainstream healthcare.

CONCLUSIONS
There is varied application and reporting of PDSA cycles and a lack of compliance with the principles that underpin the method's design as a pragmatic scientific method. This varied practice compromises its effectiveness as a method for improvement and cautions against studies that view QI or PDSA as a 'black box' intervention. There is an urgent need for greater scientific rigour in the application and reporting of these methods to advance the understanding of the science of improvement and the efficacy of the PDSA method. The PDSA method should be applied with greater consistency and in greater accordance with the guidelines provided by its founders and commentators.25 30 44 45

Acknowledgements The authors would like to thank Dr Thomas Woodcock for his valuable input into the theoretical framework and data analysis.

Contributors All listed authors qualify for authorship based on making one or more of the substantial contributions to the intellectual content: conceptual design (MJT, CM, CN, DB, AD and JR), acquisition of data (MJT, CM and CN) and/or analysis and interpretation of data (MJT, CM, CN and JR). Furthermore, all authors participated in drafting the manuscript (MJT, CM, CN, DB, AD and JR) and critical revision of the manuscript for important intellectual content (MJT, CM, CN, DB, AD and JR).

Disclaimer This article presents independent research commissioned by the National Institute for Health Research (NIHR) under the Collaborations for Leadership in Applied Health Research and Care (CLAHRC) programme for North West London. The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Competing interests The authors declare no conflict of interest.

Provenance and peer review Not commissioned; externally peer reviewed.

Open Access This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 3.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/3.0/

REFERENCES
1 Øvretveit J. Does improving quality save money? A review of evidence of which improvements to quality reduce costs to health service providers. London: The Health Foundation, 2009.
2 Walshe K, Freeman T. Effectiveness of quality improvement: learning from evaluations. Qual Saf Health Care 2002;11:85–7.
3 Schouten LMT, Hulscher MEJL, van Everdingen JJE, et al. Evidence for the impact of quality improvement collaboratives: systematic review. BMJ 2008;336:1491–4.
4 Nicolay CR, Purkayastha S, Greenhalgh A, et al. Systematic review of the application of quality improvement methodologies from the manufacturing industry to surgical healthcare. Br J Surg 2012;99:324–35.
5 Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med 1998;128:651–6.
6 McCormack B, Kitson A, Harvey G, et al. Getting evidence into practice: the meaning of context. J Adv Nurs 2002;38:94–104.
7 Kaplan HC, Brady PW, Dritz MC, et al. The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Q 2010;88:500–59.
8 Oxman AD, Thomson MA, Davis DA, et al. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ 1995;153:1423.
9 Department of Health. Report of the High Level Group (HLG) on clinical effectiveness. London: Department of Health, 2007.
10 Greenhalgh T, Robert G, Macfarlane F, et al. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004;82:581–629.
11 Plsek PE, Wilson T. Complexity science: complexity, leadership, and management in healthcare organisations. BMJ 2001;323:746.
12 Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50.
13 Powell AE, Rushmer RK, Davies HTO. A systematic narrative review of quality improvement models in health care. NHS Quality Improvement Scotland, 2009. Report No. 1844045242.
14 Boaden R, Harvey J, Moxham C, et al. Quality improvement: theory and practice in healthcare. NHS Institute for Innovation and Improvement, 2008.
15 Walshe K. Understanding what works—and why—in quality improvement: the need for theory-driven evaluation. Int J Qual Health Care 2007;19:57–9.
16 Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Aff 2005;24:138–50.
17 Auerbach AD, Landefeld CS, Shojania KG. The tension between needing to improve care and knowing how to do it. N Engl J Med 2007;357:608–13.
18 Ting HH, Shojania KG, Montori VM, et al. Quality improvement science and action. Circulation 2009;119:1962–74.
19 Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med 2006;355:2725–32.
20 Benning A, Ghaleb M, Suokas A, et al. Large scale organisational intervention to improve patient safety in four UK hospitals: mixed method evaluation. BMJ 2011;342:d195.
21 Landon BE, Wilson IB, McInnes K, et al. Effects of a quality improvement collaborative on the outcome of care of patients with HIV infection: the EQHIV study. Ann Intern Med 2004;140:887–96.
22 Vos L, Duckers ML, Wagner C, et al. Applying the quality improvement collaborative method to process redesign: a multiple case study. Implement Sci 2010;5:19.
23 Grol R, Baker R, Moss F. Quality improvement research: understanding the science of change in health care. Qual Saf Health Care 2002;11:110–11.
24 Walshe K. Pseudoinnovation: the development and spread of healthcare quality improvement methodologies. Int J Qual Health Care 2009;21:153–9.
25 Deming WE. Out of the crisis. Cambridge, MA: Massachusetts Institute of Technology Center for Advanced Engineering Study, 1986.
26 Øvretveit J. Understanding the conditions for improvement: research to discover which context influences affect improvement success. BMJ Qual Saf 2011;20(Suppl 1):i18–23.

27 Davidoff F, Batalden P, Stevens D, et al. Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care 2008;17(Suppl 1):i3–9.
28 Ogrinc G, Mooney S, Estrada C, et al. The SQUIRE (Standards for Quality Improvement Reporting Excellence) guidelines for quality improvement reporting: explanation and elaboration. Qual Saf Health Care 2008;17(Suppl 1):i13–32.
29 Imai M. Kaizen: the key to Japan's competitive success. New York: McGraw-Hill, 1986.
30 Langley GJ. The improvement guide: a practical approach to enhancing organizational performance. 1st edn. San Francisco: Jossey-Bass Publishers, 1996.
31 Batalden P. Building knowledge for improvement—an introductory guide to the use of FOCUS-PDCA. Nashville, TN: Quality Resource Group, Hospital Corporation of America, 1992.
32 Moen R, Norman C. Evolution of the PDCA cycle. 2006.
33 Speroff T, O'Connor GT. Study designs for PDSA quality improvement research. Qual Manag Health Care 2004;13:17–32.
34 Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med 2009;6:e1000100.
35 Bader MK, Palmer S, Stalcup C, et al. Using a FOCUS-PDCA quality improvement model for applying the severe traumatic brain injury guidelines to practice: process and outcomes. Reflect Nurs Leadersh 2002;28:34–5.
36 Reid D, Glascott G, Woods D. Improving referral information in community mental health. Nurs Times 2005;101:34–5.
37 Lynch-Jordan AM, Kashikar-Zuck S, Crosby LE, et al. Applying quality improvement methods to implement a measurement system for chronic pain-related disability. J Pediatr Psychol 2010;35:32–41.
38 Varkey P, Sathananthan A, Scheifer A, et al. Using quality-improvement techniques to enhance patient education and counselling of diagnosis and management. Qual Prim Care 2009;17:205–13.
39 Moen R, Norman C. Circling back: clearing up the myths about the Deming cycle and seeing how it keeps evolving. Qual Progress 2010;42:23–8.
40 Tomolo AM, Lawrence RH, Aron DC. A case study of translating ACGME practice-based learning and improvement requirements into reality: systems quality improvement projects as the key component to a comprehensive curriculum. Postgrad Med J 2009;85:530–7.
41 Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med 1998;128:651.
42 Benn J, Burnett S, Parand A, et al. Perceptions of the impact of a large-scale collaborative improvement programme: experience in the UK Safer Patients Initiative. J Eval Clin Pract 2009;15:524–40.
43 Parand A, Burnett S, Benn J, et al. Medical engagement in organisation-wide safety and quality-improvement programmes: experience in the UK Safer Patients Initiative. Qual Saf Health Care 2010;19:e44.
44 Berwick D. Broadening the view of evidence-based medicine. Qual Saf Health Care 2005;14:315–16.
45 Speroff T, James BC, Nelson EC, et al. Guidelines for appraisal and publication of PDSA quality improvement. Qual Manag Health Care 2004;13:33–9.
