NOTICE: The project that is the subject of this report was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the committee responsible for the report were chosen for their special competences and with regard for appropriate balance.
This study was supported by Contract No. HHSN26300037 between the National Academy of Sciences and the National Institutes of Health, Contract No. VA741-13-P-0317 between the National Academy of Sciences and the U.S. Department of Veterans Affairs, and Contract No. HHSP233201300183A between the National Academy of Sciences and the U.S. Department of Health and Human Services (Office of the Assistant Secretary of Planning and Evaluation and the Substance Abuse and Mental Health Services Administration), and grants from the American Psychiatric Association, American Psychological Association, Association for Behavioral Health and Wellness, and the National Association of Social Workers. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the organizations or agencies that provided support for the project.
Additional copies of this report are available for sale from the National Academies
Press, 500 Fifth Street, NW, Keck 360, Washington, DC 20001; (800) 624-6242 or
(202) 334-3313; http://www.nap.edu.
For more information about the Institute of Medicine, visit the IOM home page
at: www.iom.edu.
The serpent has been a symbol of long life, healing, and knowledge among almost all cultures and religions since the beginning of recorded history. The serpent adopted as a logotype by the Institute of Medicine is a relief carving from ancient Greece, now held by the Staatliche Museen in Berlin.
The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. C. D. Mote, Jr., is president of the National Academy of Engineering.
The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy’s purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Ralph J. Cicerone and Dr. C. D. Mote, Jr., are chair and vice chair, respectively, of the National Research Council.
www.national-academies.org
COMMITTEE ON DEVELOPING EVIDENCE-BASED STANDARDS
FOR PSYCHOSOCIAL INTERVENTIONS FOR MENTAL DISORDERS
SARAH HUDSON SCHOLLE, Vice President of Research and Analysis,
National Committee for Quality Assurance
JOHN T. WALKUP, Professor of Psychiatry, DeWitt Wallace Senior
Scholar, Vice Chair of Psychiatry, and Director, Division of Child
and Adolescent Psychiatry, Department of Psychiatry, Weill Cornell
Medical College, Cornell University
MYRNA WEISSMAN, Diane Goldman Kempner Family Professor
of Epidemiology and Psychiatry, Columbia University College of
Physicians and Surgeons; Chief, Division of Epidemiology, New York
State Psychiatric Institute
IOM Staff
ADRIENNE STITH BUTLER, Study Director
MONICA L. GONZALEZ, Associate Program Officer
THELMA L. COX, Administrative Assistant
LORA K. TAYLOR, Financial Associate
ANDREW M. POPE, Director, Board on Health Sciences Policy
Consultants
GARY BOND, Professor of Psychiatry, Dartmouth University
RONA BRIERE, Editor
BRUCE CHORPITA, Professor of Psychology, University of California,
Los Angeles
MIRIAM DAVIS, Writer
Reviewers
Contents
ACRONYMS xix
GLOSSARY xxiii
SUMMARY 1
1 INTRODUCTION 21
Study Context, 23
Psychosocial Interventions, 31
Key Findings, 40
References, 43
PSYCHOSOCIAL INTERVENTIONS 47
Guidelines, 51
References, 54
Interventions, 57
Summary, 69
References, 70
References, 89
5 QUALITY MEASUREMENT 95
References, 124
Consumers, 134
Providers, 136
References, 152
APPENDIXES
BOXES
S-1 Statement of Task, 4
Measures, 97
FIGURES
S-1 Framework for developing standards for psychosocial
interventions, 8
interventions, 48
elements, 61
interventions, 63
TABLES
1-1 Leading Causes of Disease Burden, 30
1-2 Elements of the Statement of Task and Chapters Where They Are
Addressed, 41
6-1 Stakeholders and Their Levers for Influencing the Quality of Care
Acronyms
P4P pay-for-performance
PAR participatory action research
PCORI Patient-Centered Outcomes Research Institute
PCORnet National Patient-Centered Clinical Research Network
PCPI Physician Consortium for Performance Improvement
PICOT population/disease, intervention or variable of interest,
comparison, outcome, time
ACA: The Patient Protection and Affordable Care Act (ACA), known colloquially as health care reform or “Obamacare,” was designed to increase the quality and affordability of health care for all Americans. The law’s provisions focus on expanding coverage, controlling health care costs, and improving the health care delivery system (KFF, 2013). The law became effective on March 23, 2010. Several major provisions, including the individual mandate, guaranteed access to insurance for those with preexisting conditions, minimum standards for health insurance policies, federal subsidies, and the implementation of health insurance exchanges, were phased in through 2014.2
Effect size: The difference between treatment and control groups, generally
expressed in standard deviation units.
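For readers who want the arithmetic behind this definition, a standard formulation (Cohen's d, supplied here as a conventional illustration rather than a formula quoted from this report) is:

d = \frac{\bar{x}_{T} - \bar{x}_{C}}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} = \sqrt{\frac{(n_T - 1)s_T^2 + (n_C - 1)s_C^2}{n_T + n_C - 2}}

where the subscripts T and C denote the treatment and control groups, \bar{x} the group means, s the group standard deviations, and n the group sizes.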
External validity: “The extent to which the results of a study can be generalized to other situations and to other populations” (Brewer, 2000, p. 4).
Family: “Not only people related by blood or marriage, but also close
friends, partners, companions, and others whom patients would want as
part of their care team” (IOM, 2015, p. 28).
HITECH Act: The Health Information Technology for Economic and Clinical Health (HITECH) Act was enacted under Title XIII of the American Recovery and Reinvestment Act of 2009 and officially established the Office of the National Coordinator for Health Information Technology at the U.S. Department of Health and Human Services. The act includes incentives designed to accelerate the adoption of health information technology by the health care industry, health care providers, consumers, and patients, largely through the promotion of electronic health records and secure electronic exchange of health information.3
Learning health care system: A health care system in which science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices being seamlessly embedded in the care process, patients and families being active participants in all elements of care, and new knowledge being captured as an integral by-product of the care experience (IOM, 2012).
3 Health Information Technology for Economic and Clinical Health (HITECH) Act, Title
XIII of Division A and Title IV of Division B of the American Recovery and Reinvestment
Act of 2009 (ARRA), Public Law 111-5, 111th Congress, 1st session (February 17, 2009).
MHPAEA: The Mental Health Parity and Addiction Equity Act (MHPAEA) is a federal law that requires group health plans and health insurance issuers offering mental health or substance use disorder benefits to provide coverage for those benefits that is no more restrictive than the coverage they provide for medical and surgical benefits.4
Peer specialists: People with lived experience of mental illness and/or chemical dependency who act formally in roles that entail helping their peers to overcome and recover from mental illness and/or chemical dependency. They are also known as “peer mentors,” “recovery support specialists,” and “peer navigators.”
Precision medicine: “An emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle for each person” (NIH, 2015).
4 Mental Health Parity and Addiction Equity Act (MHPAEA), amending section 712 of the
Employee Retirement Income Security Act of 1974, section 2705 of the Public Health Service
Act, and section 9812 of the Internal Revenue Code of 1986, H.R. 6983, 110th Congress,
2nd session (September 23, 2008).
Quality of evidence: “The extent to which one can be confident that the
estimate of an intervention’s effectiveness is correct” (IOM, 2011, p. 158).
REFERENCES
Addis, M. E., and J. Waltz. 2002. Implicit and untested assumptions about the role of psychotherapy treatment manuals in evidence-based mental health practice. Clinical Psychology: Science and Practice 9(4):421-424.
Baron, R. M., and D. A. Kenny. 1986. The moderator-mediator variable distinction in social
psychological research: Conceptual, strategic, and statistical considerations. Journal of
Personality and Social Psychology 51:1173-1182.
The Bill & Melinda Gates Foundation. 2015. Clinical trials. https://docs.gatesfoundation.org/
documents/clinical_trials.pdf (accessed May 12, 2015).
Brewer, M. B. 2000. Research design and issues of validity. In Handbook of research methods
in social and personality psychology, edited by H. T. Reis and C. M. Judd. Cambridge,
MA: Cambridge University Press. Pp. 3-16.
CBO (Congressional Budget Office). 2013. Dual-eligible beneficiaries of Medicare and Medicaid: Characteristics, health care spending, and evolving policies. CBO publication no. 4374. Washington, DC: U.S. Government Printing Office.
CDC (Centers for Disease Control and Prevention). 2012. Meaningful use: Introduction.
http://www.cdc.gov/ehrmeaningfuluse/introduction.html (accessed May 8, 2015).
CMS (Centers for Medicare & Medicaid Services). 2014. EHR incentive programs. http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/index.html?redirect=/ehrincentiveprograms (accessed November 7, 2014).
_____. 2015. The Mental Health Parity and Addiction Equity Act. https://www.cms.gov/CCIIO/Programs-and-Initiatives/Other-Insurance-Protections/mhpaea_factsheet.html (accessed June 23, 2015).
Fairburn, C. G., and Z. Cooper. 2011. Therapist competence, therapy quality, and therapist
training. Behaviour Research and Therapy 49(6):373-378.
Fink, A., J. Kosecoff, M. Chassin, and R. Brook. 1984. Consensus methods: Characteristics
and guidelines for use. American Journal of Public Health 74(9):979-983.
Hickey, J. V., L. Y. Unruh, R. P. Newhouse, M. Koithan, M. Johantgen, R. G. Hughes, K. B.
Haller, and V. A. Lundmark. 2014. Credentialing: The need for a national research
agenda. Nursing Outlook 62(2):119-127.
Holmboe, E. 2014. Competency-based medical education (CBME) and transformation. Presentation at the IOM’s Future Directions of Credentialing Research in Nursing: A Workshop, Washington, DC. http://www.iom.edu/~/media/Files/ActivityFiles/Workforce/FutureDirectionsCNRworkshop/NCRWorkshopPresentations/DRAFT-EricHolmboeFutureofNursingCredentialing.ppt (accessed December 18, 2014).
International Council of Nurses. 2009. Credentialing: Fact sheet. http://www.icn.ch/images/
stories/documents/publications/fact_sheets/1a_FS-Credentialing.pdf (accessed November
7, 2014).
IOM (Institute of Medicine). 1991. Disability in America: Toward a national agenda for
prevention. Washington, DC: National Academy Press.
_____. 2001. Envisioning the national health care quality report. Washington, DC: National
Academy Press.
_____. 2009. Initial national priorities for comparative effectiveness research. Washington,
DC: The National Academies Press.
_____. 2011. Finding what works in health care: Standards for systematic reviews. Washington, DC: The National Academies Press.
_____. 2012. Best care at lower cost: The path to continuously learning health care in America.
Washington, DC: The National Academies Press.
_____. 2015. Dying in America: Improving quality and honoring individual preferences near
the end of life. Washington, DC: The National Academies Press.
KFF (Kaiser Family Foundation). 2013. Summary of the Affordable Care Act. http://kff.org/
health-reform/fact-sheet/summary-of-the-affordable-care-act (accessed May 19, 2015).
Kraemer, H. C. 2002. Mediators and moderators of treatment effects in randomized clinical
trials. Archives of General Psychiatry 59(10):877-883.
Lauzon Clabo, L. 2014. Core competencies in nursing credentialing and certification. Presentation at the IOM’s Future Directions of Credentialing Research in Nursing: A Workshop, Washington, DC. http://www.iom.edu/~/media/Files/ActivityFiles/Workforce/FutureDirectionsCNRworkshop/NCRWorkshopPresentations/3CoreCompetenciesinNursingCredentialingandCertificationpublicversion.pptx (accessed December 30, 2014).
Luborsky, L., and R. J. DeRubeis. 1984. The use of psychotherapy treatment manuals: A
small revolution in psychotherapy research style. Clinical Psychology Review 4(1):5-14.
McHugh, M. D., R. E. Hawkins, P. E. Mazmanian, P. S. Romano, H. L. Smith, and J. Spetz.
2014. Challenges and opportunities in nursing credentialing research design. Discussion
Paper, Institute of Medicine, Washington, DC. http://iom.edu/~/media/Files/Perspectives
Files/2014/Discussion-Papers/CredentientialingResearchDesign.pdf (accessed November
4, 2014).
NAMI (National Alliance on Mental Illness). 2015. Psychotherapy. https://www.nami.org/
Learn-More/Treatment/Psychotherapy#sthash.kgOfTezP.dpuf (accessed May 8, 2015).
Needleman, J., R. S. Dittus, P. Pittman, J. Spetz, and R. Newhouse. 2014. Nurse credentialing
research frameworks and perspectives for assessing a research agenda. Discussion Paper,
Institute of Medicine, Washington, DC. http://www.iom.edu/~/media/Files/Perspectives
Files/2014/DiscussionPapers/CredentialingResearchFrameworks.pdf (accessed November
4, 2014).
Newhouse, R. 2014. Understanding the landscape and state of science in credentialing research in nursing. Presentation at the IOM’s Future Directions of Credentialing Research in Nursing: A Workshop, Washington, DC. http://www.iom.edu/~/media/Files/ActivityFiles/Workforce/FutureDirectionsCNRworkshop/NCRWorkshopPresentations/Workshop_IOM_Newhouse.pdf (accessed December 18, 2014).
NIH (National Institutes of Health). 2015. Precision medicine initiative. http://www.nih.gov/
precisionmedicine (accessed May 8, 2015).
NOCA (National Organization for Competency Assurance). 2005. The NOCA guide to understanding credentialing concepts. Washington, DC: NOCA.
Powell, C. 2003. The Delphi technique: Myths and realities. Journal of Advanced Nursing
41(4):376-382.
SAMHSA (Substance Abuse and Mental Health Services Administration). 2010. SAMHSA’s
working definition of recovery: 10 guiding principles of recovery. http://content.samhsa.
gov/ext/item?uri=/samhsa/content/item/10007447/10007447.pdf (accessed May 8, 2015).
______. 2014. Peer support and social inclusion. http://www.samhsa.gov/recovery/peer
support-social-inclusion (accessed May 8, 2015).
U.S. Department of Labor. 2014. Credential resource guide. http://wdr.doleta.gov/directives/
attach/TEGL15-10a2.pdf (accessed November 7, 2014).
Weissman, D. E., and D. E. Meier. 2011. Identifying patients in need of a palliative care assessment in the hospital setting: A consensus report from the Center to Advance Palliative Care. Journal of Palliative Medicine 14(1):17-23.
Summary1
ABSTRACT
Approximately 20 percent of Americans are affected by mental health and substance use disorders, which are associated with significant morbidity and mortality. While the evidence base for the effectiveness of interventions to treat these disorders is sizable, a considerable gap exists between what is known to be effective and interventions that are actually delivered in clinical care. Addressing this quality chasm in mental health and substance use care is particularly critical given the recent passage of the Patient Protection and Affordable Care Act (ACA) and the Mental Health Parity and Addiction Equity Act, which are changing the delivery of care and access to treatments for mental health and substance use disorders. Increasing emphasis on accountability and performance measurement, moreover, will require strategies to promote and measure the quality of psychosocial interventions.
In this report, the study committee develops a framework that can be used to chart a path toward the ultimate goal of improving the outcomes of psychosocial interventions for those with mental health and substance use disorders. This framework identifies the key steps entailed in successfully bringing an evidence-based psychosocial intervention into clinical practice. It highlights the need
1 This summary does not include references. Citations for the discussion presented in the
summary appear in the subsequent report chapters.
BOX S-1
Statement of Task
The Institute of Medicine will establish an ad hoc committee that will develop a framework to establish efficacy standards for psychosocial interventions used to treat mental disorders. The committee will explore strategies that different stakeholders might take to help establish these standards for psychosocial treatments. Specifically, the committee will:
• Characterize the types of scientific evidence and processes needed to establish the effectiveness of psychosocial interventions.
  – Define levels of scientific evidence based on their rigor.
  – Define the types of studies needed to develop quality measures for monitoring quality of psychosocial therapies and their effectiveness.
  – Define the evidence needed to determine active treatment elements as well as their dose and duration.
• Using the best available evidence, identify the elements of psychosocial treatments that are most likely to improve a patient’s mental health and can be tracked using quality measures. In addition, identify features of health care delivery systems involving psychosocial therapies that are most indicative of high-quality care that can be practically tracked as part of a system of quality measures. The following approaches to quality measurement should be considered:
  – Measures to determine if providers implement treatment in a manner that is consistent with evidence-based standards;
  – Measures that encourage continuity of treatment;
  – Measures that assess whether providers have the structures and processes in place to support effective psychotherapy;
  – Consumer-reported experiences of evidence-based psychosocial care; and
  – Consumer-reported outcomes using a measurement-based care approach.
2 The committee’s recommendations are numbered according to the chapter of the report in which they appear. Thus, for example, recommendation 2-1 is the first recommendation in Chapter 2. For purposes of clarity, some recommendations are presented in this summary in a different sequence from that in which they appear in the full report; however, their numeric designation remains the same.
Key Findings
The information gathered for this study led to the following key findings concerning mental health and substance use disorders and the interventions developed to treat them:
The mental health and substance use care delivery system needs a framework for applying strategies to improve the evidence base for and increase the uptake of high-quality evidence-based interventions in the delivery of care.
laborious and costly, and rarely keep pace with advances in the field. To avoid the cost and timeliness problems inherent in systematic reviews, an entity charged with overseeing the reviews and their products could explore the potential for technology (e.g., the use of machine learning to augment and streamline the systematic review process) and clinical and research networks and learning environments to expedite the process and the development of updates to recommendations. In 2011, the IOM offered a set of recommendations for conducting high-quality systematic reviews. The guidelines broadly identify evidence-based treatments and approaches but generally are not designed to provide the level of detail needed to inform clinicians in the delivery of treatments to ensure reproducibility and a consistent level of quality outcomes. As a result, these guidelines would need to be modified to be more specific and ensure that information beyond intervention impact is available.
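To illustrate one way machine learning might augment the screening stage of a systematic review, the hedged Python sketch below ranks unscreened abstracts by predicted relevance so that reviewers can examine the most likely inclusions first. The abstracts, labels, and model choice are hypothetical placeholders, not part of this report or of any existing review pipeline.

# Illustrative only: a toy relevance ranker for abstract screening.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# A handful of abstracts already screened by human reviewers (hypothetical).
labeled_abstracts = [
    "randomized controlled trial of cognitive-behavioral therapy for depression",
    "editorial commentary on clinic staffing and reimbursement",
]
labels = [1, 0]  # 1 = include in the review, 0 = exclude

# Fit a simple text classifier on the labeled abstracts.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(labeled_abstracts)
classifier = LogisticRegression().fit(X, labels)

# Score the not-yet-screened abstracts and rank them for human review.
unscreened = ["pragmatic trial of supported employment for serious mental illness"]
scores = classifier.predict_proba(vectorizer.transform(unscreened))[:, 1]
for score, abstract in sorted(zip(scores, unscreened), reverse=True):
    print(round(float(score), 2), abstract)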
Having a process for systematically reviewing evidence is particularly important given the changes introduced under the ACA and the Mental Health Parity and Addiction Equity Act. Now more than ever, a standardized evaluation process is needed to enable the generation of reliable information to form the basis for policy and coverage decisions, curriculum development and training of clinicians, and other efforts to improve the quality of psychosocial care. Absent such a standardized process, the quality of care will continue to vary considerably. Systematic reviews need to address intervention efficacy, effectiveness, and implementation needs. Equally important is identifying the best information with which to answer these questions.
Two examples of the benefits of having a standardized, coordinated process for determining which interventions are evidence based are the National Institute for Health and Care Excellence (NICE) in the United Kingdom and the U.S. Department of Veterans Affairs’ (VA’s) Evidence-Based Synthesis Program (ESP). Both employ a coordinated process for conducting systematic reviews and creating guidelines based on internationally agreed-upon standards, and both have a process for evaluating the impact of guidelines on practice and outcomes. Based on the successes of NICE and the ESP, it is possible to develop a process for conducting systematic reviews and creating guidelines and implementation materials for psychosocial interventions, as well as a process for evaluating the impact of these tools, by leveraging existing resources.
The committee envisions a process that involves input from consumers and clinicians at every step. A potential direction is for the U.S. Department of Health and Human Services, in partnership with professional and consumer organizations, to develop a coordinated process for conducting systematic reviews of the evidence for psychosocial interventions and creating guidelines and implementation materials in accordance with the IOM
A Research Agenda
Additional research is needed to expand the evidence base on the effectiveness of psychosocial interventions, validate strategies for applying elements approaches, develop and test quality measures, and design and evaluate implementation strategies and policies. The committee offers the following recommendations as a research agenda to further progress in each phase of the framework.
CONCLUSION
The prevalence of mental health and substance use disorders and the impacts of these disorders on morbidity and mortality are well documented. The gap between what interventions are known to be effective and the care that is delivered, together with the changing landscape in health care, demands fundamental changes in processes used to ensure the availability and delivery of high-quality evidence-based psychosocial interventions. Determining the best ways to strengthen the evidence base, identify elements that underpin interventions, conduct systematic reviews to inform clinical guidelines, develop quality measures to track the effectiveness of interventions, and implement quality interventions to improve patient outcomes has been remarkably challenging for the field of mental health. The process of moving through each step of the committee’s framework is complex, requires evidence, and should be iterative. The committee believes that its framework and its recommendations for action can help achieve the goal of improved outcomes from psychosocial interventions for individuals suffering from mental health and substance use disorders.
1
Introduction
1 Mental Health Parity and Addiction Equity Act (MHPAEA), amending section 712 of the Employee Retirement Income Security Act of 1974, section 2705 of the Public Health Service Act, and section 9812 of the Internal Revenue Code of 1986, Division C of Public Law 110-343, 110th Congress, 2nd session (October 3, 2008).
STUDY CONTEXT
This study comes at a time of significant policy change. The enactment of the ACA is creating fundamental changes in the organization, financing, and delivery of health care. The act is intended to make care less fragmented, more efficient, and higher-quality through a number of provisions. Of particular relevance to the subject of this report, through the ACA, several million previously uninsured people have gained coverage for services to treat their mental health and substance use disorders. Health plans offered on the health insurance exchanges must include mental health and substance use services as essential benefits. One early model, developed prior to the ACA’s full enactment, indicated that 3.7 million people with serious mental illness would gain coverage, as would an additional 1.15 million new users with less severe disorders (Garfield et al., 2011).2
In its broadest sense, the goal of the ACA is to achieve patient-centered, more affordable, and more effective health care. One prominent provision is a mandate for a National Quality Strategy,3 which is focused on measuring performance, demonstrating “proof of value” provided by the care delivery system, exhibiting transparency of performance to payers and consumers, linking payment and other incentives/disincentives to performance, establishing provider accountability for the quality and cost of care, and reforming payment methodology (AHRQ, 2011). The National Quality Forum (NQF) was charged by the Centers for Medicare & Medicaid Services to compile, review, and endorse quality measures for use in gauging the quality and effectiveness of health care across many sectors of the health care system (CMS, 2014). Under certain provisions of the ACA, meeting the targets for these quality measures will serve as the basis for payment and for the application of other incentives/disincentives. Among those quality measures addressing mental health and substance use disorders, only two that focus on psychosocial interventions are NQF-endorsed.4
The ACA includes reforms with the potential to mitigate the division of
mental health and substance use care between primary and specialty care.
The act creates opportunities for large networks of providers to become
accountable care organizations (ACOs)5—a care model that directly links
2 This model assumed that Medicaid expansion would occur in all states, but because of a
Supreme Court ruling in 2012, several states have opted out of Medicaid expansion.
3 The National Quality Strategy is a strategic framework for policies designed to improve
BOX 1-1
Statement of Task
The Institute of Medicine will establish an ad hoc committee that will develop a framework to establish efficacy standards for psychosocial interventions used to treat mental disorders. The committee will explore strategies that different stakeholders might take to help establish these standards for psychosocial treatments. Specifically, the committee will:
• Characterize the types of scientific evidence and processes needed to establish the effectiveness of psychosocial interventions.
  – Define levels of scientific evidence based on their rigor.
  – Define the types of studies needed to develop quality measures for monitoring quality of psychosocial therapies and their effectiveness.
  – Define the evidence needed to determine active treatment elements as well as their dose and duration.
• Using the best available evidence, identify the elements of psychosocial treatments that are most likely to improve a patient’s mental health and can be tracked using quality measures. In addition, identify features of health care delivery systems involving psychosocial therapies that are most indicative of high-quality care that can be practically tracked as part of a system of quality measures. The following approaches to quality measurement should be considered:
  – Measures to determine if providers implement treatment in a manner that is consistent with evidence-based standards;
  – Measures that encourage continuity of treatment;
  – Measures that assess whether providers have the structures and processes in place to support effective psychotherapy;
  – Consumer-reported experiences of evidence-based psychosocial care; and
  – Consumer-reported outcomes using a measurement-based care approach.
the study charge (see Appendix A for further information). The committee’s
conclusions and recommendations are based on its review of the scientific
evidence, information gathered in its public workshops, and the expert
judgment of its members.
From the outset, it was clear to the committee that there is no generally accepted definition of psychosocial interventions in the literature. The
committee offers a definition in this report that includes psychotherapies
of various orientations for specific disorders (e.g., interpersonal, cognitive-
behavioral, brief psychodynamic) and interventions that enhance outcomes
across disorders (e.g., supported employment, supported housing, family
6 Given the rigor and time involved in conducting a systematic review of the evidence for psychosocial interventions, this task is beyond the purview of the committee. Chapter 4 provides recommendations regarding how these systematic reviews should be conducted. This report also includes discussion of reviews conducted by the Agency for Healthcare Research and Quality, the Veterans Health Administration, and the U.K. National Institute for Health and Care Excellence that meet the standards put forth in the IOM (2011) report Finding What Works in Health Care: Standards for Systematic Reviews.
lishing the evidence base for psychosocial interventions and the systems in
which those interventions are delivered.
The committee was charged “to identify the evidence needed to determine active treatment elements as well as their dose and duration.” The effort to identify the active elements of psychosocial interventions has a long tradition in intervention development and research in the field of mental health and substance use disorders. Two perspectives emerge from this literature, focused on (1) the nature and quality of the interpersonal relationship between the interventionist and the client/patient, and (2) the content of the interchange between the interventionist and client/patient. Both of these perspectives have been demonstrated to be important components of evidence-based care. The charge to the committee thus requires that both of these traditions be included in its discussion of the active components of evidence-based interventions.
The recommendations offered in this report are intended to assist policy makers, health care organizations, and payers who are organizing and overseeing the provision of care for mental health and substance use disorders while navigating a new health care landscape. The recommendations also target providers, professional societies, funding agencies, consumers, and researchers, all of whom have a stake in ensuring that evidence-based, high-quality care is provided to individuals receiving mental health and substance use services.
7 DALYs denote the number of years of life lost due to ill health, disability, or early death, including suicide. A DALY represents the sum of years lost to disability (YLDs) and years of life lost (YLLs).
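As a worked illustration of this footnote (using standard WHO conventions rather than a formula quoted from the report), the undiscounted calculation can be written as:

\text{DALY} = \text{YLL} + \text{YLD}, \qquad \text{YLL} = N \times L, \qquad \text{YLD} = I \times DW \times L_d

where N is the number of deaths, L the standard life expectancy at the age of death, I the number of incident cases, DW the disability weight, and L_d the average duration of the disability.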
TABLE 1-1 Leading Causes of Disease Burden

Condition | Proportion of Total DALYs (95% UI) | Proportion of Total YLDs (95% UI) | Proportion of Total YLLs (95% UI)
Cardiovascular and circulatory diseases | 11.9% (11.0-12.6) | 2.8% (2.4-3.4) | 15.9% (15.0-16.8)
Diarrhea, lower respiratory infections, and other common infectious diseases | 11.4% (10.3-12.7) | 2.6% (2.0-3.2) | 15.4% (14.0-17.1)
Mental and substance use disorders | 7.4% (6.2-8.6) | 22.9% (18.6-27.2) | 0.5% (0.4-0.7)
Diabetes and urogenital, blood, and endocrine diseases | 4.9% (4.4-5.5) | 7.3% (6.1-8.7) | 3.8% (3.4-4.3)
Unintentional injuries other than transport injuries | 4.8% (4.4-5.3) | 3.4% (2.5-4.4) | 5.5% (4.9-5.9)

NOTE: DALYs = disability-adjusted life-years; UI = uncertainty interval; YLDs = years lived with a disability; YLLs = years of life lost.
SOURCE: Whiteford et al., 2013.
bid with mental disorders, is similarly high among prison inmates (Peters et al., 1998). Still, only 39 percent of the 45.9 million adults with mental disorders used mental health services in 2010 (SAMHSA, 2012a). And according to the National Comorbidity Survey Replication, conducted in 2001-2003, a similarly low percentage of adults with comorbid substance use disorders used services (Wang et al., 2005). States bear a large proportion of the indirect costs of mental health and substance use disorders through their disability, education, child welfare, social services, and criminal and juvenile justice systems.
PSYCHOSOCIAL INTERVENTIONS
Definition
To guide our definition of psychosocial interventions, the committee
built on the approach to defining interventions used in the Consolidated
Standards of Reporting Trials for Social and Psychological Interventions
(CONSORT-SPI; Grant, 2014).8
The term “intervention” means “the act or . . . a method of interfering with the outcome or course especially of a condition or process (as to prevent harm or improve functioning)” (Merriam-Webster Dictionary) or “acting to intentionally interfere with an affair so to affect its course or issue” (Oxford English Dictionary). These definitions emphasize two constructs—an action and an outcome. Psychosocial interventions capitalize on psychological or social actions to produce change in psychological, social, biological, and/or functional outcomes. CONSORT-SPI emphasizes the construct of mediators, or the ways in which the action leads to an outcome, as a way of distinguishing psychosocial from other interventions, such as medical interventions (Montgomery et al., 2013). Based on these sources, modified for mental health and substance use disorders, the committee proposes the following definition of psychosocial interventions:
8 This text has been updated since the prepublication version of this report.
[FIGURE 1-1 Illustration of the three main concepts in the committee’s definition of psychosocial interventions. Recoverable content from the figure: (1) Intervention: activities, techniques, or strategies delivered interpersonally or by presenting information; nonspecific elements are generic to all effective psychosocial interventions (e.g., therapeutic alliance), while specific elements are unique to a particular theoretical orientation or approach (e.g., cognitive restructuring, identification of interpersonal triggers). (2) How the intervention might affect change: changes in biological, behavioral, cognitive, emotional, interpersonal, social, or environmental factors; the intervention influences outcomes through changes across an array of mediating bio-psychosocial factors, and the mechanisms underlying these mediating factors are likely to extend from basic central nervous system function to perceptions and beliefs.]
Providers
Providers who deliver psychosocial interventions include psychologists, psychiatrists, social workers, counselors/therapists, primary care and other nonpsychiatric physicians, nurses, physical and occupational therapists, religious leaders, lay and peer providers, paraprofessionals and caregivers
Populations
The population targeted by psychosocial interventions is varied. It includes individuals at risk of or experiencing prodromal symptoms of an illness; individuals with acute disorders; individuals in remission, maintenance, or recovery phases of disorders; and individuals who are not ill but are challenged by daily functioning, relationship problems, life events, or psychological adjustment.
BOX 1-2
9 Although this report uses the more familiar word “terminology,” the committee recognizes that the term “ontology” may be helpful in that it describes an added dimension of interconnectedness among elements, beyond simply defining them. This is supported by the IOM (2014) report Capturing Social and Behavioral Domains and Measures in Electronic Health Records: Phase 2.
receiving the best possible treatment (see Chapter 5). Research to develop
quality measures from electronic health records is one potential means of
improving how quality is determined. Research is needed as well to identify
practice patterns associated with performance quality. A systematic way to
review quality also needs to be established.
KEY FINDINGS
The committee identified the following key findings about mental health
and substance use disorders and the interventions developed to treat them:
TABLE 1-2 Elements of the Statement of Task and Chapters Where They Are Addressed

Element of the Statement of Task | Chapters
Using the best available evidence, identify the elements of psychosocial treatments that are most likely to improve a patient’s mental health and can be tracked using quality measures. | Chapter 3: The Elements of Therapeutic Change • An Elements Approach to Evidence-Based Psychosocial Interventions
REFERENCES
AHRQ (Agency for Healthcare Research and Quality). 2011. Report to Congress: National
strategy for quality improvement in health care. http://www.ahrq.gov/workingforquality/
nqs/nqs2011annlrpt.pdf (accessed May 27, 2015).
APA (American Psychological Association). 2015. Standards of accreditation for health service psychology. http://www.apa.org/ed/accreditation/about/policies/standards-of-accreditation.pdf (accessed June 18, 2015).
Barth, J., T. Munder, H. Gerger, E. Nuesch, S. Trelle, H. Znoj, P. Juni, and P. Cuijpers. 2013. Comparative efficacy of seven psychotherapeutic interventions for patients with depression: A network meta-analysis. PLoS Medicine 10(5):e1001454.
Bauer, M. S. 2002. A review of quantitative studies of adherence to mental health clinical
practice guidelines. Harvard Review of Psychiatry 10(3):138-153.
Beck, A., A. Rush, B. Shaw, and G. Emery. 1979. Cognitive therapy of depression. New York:
Guilford Press.
BJS (Bureau of Justice Statistics). 2006. Mental health problems of prison and jail inmates.
http://www.bjs.gov/content/pub/pdf/mhppji.pdf (accessed March 17, 2014).
Bush, D. E., R. C. Ziegelstein, U. V. Patel, B. D. Thombs, D. E. Ford, J. A. Fauerbach, U. D. McCann, K. J. Stewart, K. K. Tsilidis, and A. L. Patel. 2005. Post-myocardial infarction depression: Summary. AHRQ publication number 05-E018-1. Evidence reports/technology assessment number 123. Rockville, MD: AHRQ.
Cherry, D. K., D. A. Woodwell, and E. A. Rechtsteiner. 2007. National Ambulatory Medical
Care Survey: 2005 summary. Hyattsville, MD: National Center for Health Statistics.
CMS (Centers for Medicare & Medicaid Services). 2014. CMS measures inventory. http://www.
cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/
CMS-Measures-Inventory.html (accessed May 20, 2014).
Colton, C. W., and R. W. Manderscheid. 2006. Congruencies in increased mortality rates,
years of potential life lost, and causes of death among public mental health clients in
eight states. Preventing Chronic Disease 3(2):A42.
CSWE (Council on Social Work Education). 2008. Educational policy and education standards. http://www.cswe.org/File.aspx?id=13780 (accessed June 18, 2015).
Cuijpers, P., F. Smit, E. Bohlmeijer, S. D. Hollon, and G. Andersson. 2010a. Efficacy of cognitive-behavioural therapy and other psychological treatments for adult depression: Meta-analytic study of publication bias. The British Journal of Psychiatry 196(3):173-178.
Cuijpers, P., A. van Straten, J. Schuurmans, P. van Oppen, S. D. Hollon, and G. Andersson.
2010b. Psychotherapy for chronic major depression and dysthymia: A meta-analysis.
Clinical Psychological Review 30(1):51-62.
Cuijpers, P., A. S. Geraedts, P. van Oppen, G. Andersson, J. C. Markowitz, and A. van Straten.
2011. Interpersonal psychotherapy for depression: A meta-analysis. American Journal of
Psychiatry 168(6):581-592.
Cuijpers, P., M. Sijbrandij, S. L. Koole, G. Andersson, A. T. Beekman, and C. F. Reynolds. 2013. The efficacy of psychotherapy and pharmacotherapy in treating depressive and anxiety disorders: A meta-analysis of direct comparisons. World Psychiatry 12(2):137-148.
Drake, R. E., and K. T. Mueser. 2000. Psychosocial approaches to dual diagnosis. Schizophrenia Bulletin 26(1):105-118.
Garfield, R. L., S. H. Zuvekas, J. R. Lave, and J. M. Donohue. 2011. The impact of national health care reform on adults with severe mental disorders. American Journal of Psychiatry 168(5):486-494.
Grant, S. 2014. Development of a CONSORT extension for social and psychological interventions. DPhil. University of Oxford, U.K. http://ora.ox.ac.uk/objects/uuid:c1bd46df-eb3f-4dc6-9cc1-38c26a5661a9 (accessed August 4, 2015).
Wood, E., J. H. Samet, and N. D. Volkow. 2013. Physician education in addiction medicine.
Journal of the American Medical Association 310(16):1673-1674.
Young, A. S., R. Klap, C. D. Sherbourne, and K. B. Wells. 2001. The quality of care for
depressive and anxiety disorders in the United States. Archives of General Psychiatry
58(1):55-61.
2
of Psychosocial Interventions
0.28-0.52]) (Huhn et al., 2014).1 The effect sizes for psychotherapies varied
across mental disorders. The largest effect sizes were for bulimia nervosa
(SMD2 = 1.61, CI3 = 0.96-2.29), obsessive compulsive disorder (SMD =
1.37, CI = 0.64-2.24), trichotillomania (SMD = 1.14, CI = 0.38-1.89),
anorexia nervosa (SMD = 0.99, CI = 0.38-1.6), and binge eating disorder
(SMD = 0.86, CI = 0.42-1.3). The effect sizes were still moderate or greater
(SMD >0.5) for major depressive disorder, generalized anxiety disorder,
social anxiety disorder, posttraumatic stress disorder, and insomnia. The
lowest effect sizes were for schizophrenia with psychodynamic therapy
(SMD = −0.25, CI = −0.59-0.11) and alcohol use disorders (SMD = 0.17,
CI = 0.08-0.26) (Huhn et al., 2014). These effect sizes are based on a variety
of different psychotherapies from different theoretical orientations. Several other meta-analyses have been conducted for specific psychotherapies (e.g., cognitive-behavioral therapy, interpersonal psychotherapy, and problem-solving therapy), indicating that some therapies are specifically indicated for particular disorders, while others appear to be effective for many different disorders.
Few meta-analyses exist for other types of psychosocial interventions, such as suicide prevention programs, vocational rehabilitation, and clinical case management. However, these interventions have been subjected to randomized controlled trials (RCTs) and have been shown to have positive effects on the intended intervention target.
Although meta-analyses support the use of psychosocial interventions in the treatment of mental health and substance use problems, other studies are needed to further determine the utility of these interventions in different populations and settings. An argument can be made for emphasizing new study designs that yield immediately actionable results relevant to a variety of stakeholders. Tunis and colleagues (2003) describe the need for “practical clinical trials” that address issues of effectiveness—whether interventions work under real-world conditions—as a second step following efficacy studies under the ideal circumstances of an RCT. Pragmatic or practical trials focus on engaging stakeholders in all study phases to address questions related to intervention effectiveness, implementation strategies, and the degree to which an intervention can be conducted with fidelity in a variety of service settings. These studies also address the resources required
1 The effect size is the difference between treatment and control groups and is expressed in standard deviation units. An effect size of 1 indicates that the average treated patient is 1 standard deviation healthier than the average untreated patient. An effect size of 0.8 is considered a large effect, an effect size of 0.5 is considered a moderate effect, and an effect size of 0.2 is considered a small effect.
2 Huhn and colleagues (2014) measured standardized between-group mean differences (SMDs).
3 Reported data include CIs.
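To make these footnotes concrete, the short Python sketch below computes a standardized mean difference and an approximate 95 percent confidence interval from group summary statistics. The formulas are standard large-sample approximations, and the example numbers are invented; none of this is code or data from Huhn and colleagues (2014).

import math

def smd_with_ci(mean_t, sd_t, n_t, mean_c, sd_c, n_c, z=1.96):
    """Standardized mean difference (Cohen's d) with an approximate 95% CI."""
    # Pooled standard deviation across the treatment and control groups
    s_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / s_pooled
    # Large-sample variance approximation for d
    se = math.sqrt((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
    return d, d - z * se, d + z * se

# Hypothetical symptom-scale summaries: the treated group improved 3 points more on average.
print(smd_with_ci(mean_t=12.0, sd_t=5.0, n_t=50, mean_c=9.0, sd_c=5.5, n_c=50))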
were carrying out the treatment under study with fidelity, and to ensure consistency among therapists in how the treatment was delivered. However, manuals have become an important aspect of the implementation of interventions. Without a manual, a guideline, or documentation of how an intervention works, the intervention cannot be deployed as it was developed. Most manuals have been adapted for different age groups, cultures, disorders, and delivery formats. Sometimes the adaptations have been newly tested, but often they have not. The proliferation of manuals has caused some confusion, and as a result, the manuals often are not widely accepted in clinical practice (Addis and Waltz, 2002). Among the reasons for limited acceptance is the view that the manuals are overly prescriptive and too complicated to follow, and most are not accompanied by evidence-based trainings. When providers are properly trained and supported in an intervention, however, manuals can be useful resources.
Standardization of psychosocial interventions provides an opportunity for identifying the potential nonspecific and specific elements of these treatments. As discussed in Chapter 3, a process for specifying elements will be necessary to improve the impact of psychosocial interventions.
The mental health and substance use care delivery system needs a framework for applying strategies to improve the evidence base for and increase the uptake of high-quality evidence-based interventions in the delivery of care.
REFERENCES
Addis, M. E., and J. Waltz. 2002. Implicit and untested assumptions about the role of psychotherapy treatment manuals in evidence-based mental health practice. Clinical Psychology: Science and Practice 9(4):421-424.
Beinecke, R., and J. Delman. 2008. Commentary: Client involvement in public administration
research and evaluation. The Innovation Journal: The Public Sector Innovation Journal
13(1). http://www.innovation.cc/peer-reviewed/beinicke_7_commenta-_client_public_
admin.pdf (accessed February 18, 2009).
Berwick, D. M. 2009. What “patient-centered” should mean: Confessions of an extremist.
Health Affairs 28(4):w555-w565.
Checkoway, B. 2011. What is youth participation? Children and Youth Services Review
33(2):340-345.
Deegan, P. E. 1993. Recovering our sense of value after being labeled mentally ill. Journal of
Psychosocial Nursing and Mental Health Services 31(4):7-11.
Graham, T., D. Rose, J. Murray, M. Ashworth, and A. Tylee. 2014. User-generated quality
standards for youth mental health in primary care: A participatory research design using
mixed methods. BMJ Quality & Safety 10.1136/bmjqs-2014-002842.
Huhn, M., M. Tardy, L. M. Spineli, W. Kissling, H. Forstl, G. Pitschel-Walz, C. Leucht, M.
Samara, M. Dold, J. M. Davis, and S. Leucht. 2014. Efficacy of pharmacotherapy and
psychotherapy for adult psychiatric disorders: A systematic overview of meta-analyses.
JAMA Psychiatry 71(6):706-715.
IOM (Institute of Medicine). 2010. Provision of mental health counseling services under
TRICARE. Washington, DC: The National Academies Press.
_____. 2011a. Finding what works in health care: Standards for systematic reviews. Washington, DC: The National Academies Press.
_____. 2011b. Clinical practice guidelines we can trust. Washington, DC: The National
Academies Press.
Israel, B. A., A. J. Schulz, E. A. Parker, A. B. Becker, A. J. Allen, and J. R. Guzman. 2003.
Critical issues in developing and following community-based participatory research
principles. In Community-based participatory research for health, edited by M. Minkler
and N. Wallerstein. San Francisco, CA: Jossey-Bass. Pp. 53-76.
Krist, A. H., D. Shenson, S. H. Woolf, C. Bradley, W. R. Liaw, S. F. Rothemich, A. Slonim,
W. Benson, and L. A. Anderson. 2013. Clinical and community delivery systems for
preventive care: An integration framework. American Journal of Preventive Medicine
45(4):508-516.
Pincus, H. A. 2010. From PORT to policy to patient outcomes: Crossing the quality chasm.
Schizophrenia Bulletin 36(1):109-111.
SAMHSA (Substance Abuse and Mental Health Services Administration). 2015. NREPP reviews and submissions. http://www.nrepp.samhsa.gov/ReviewSubmission.aspx (accessed May 28, 2015).
Tunis, S. R., D. B. Stryer, and C. M. Clancy. 2003. Practical clinical trials: Increasing the
value of clinical research for decision making in clinical and health policy. Journal of the
American Medical Association 290(12):1624-1632.
WHO (World Health Organization). 2010. mhGAP intervention guide. http://www.paho.org/
mhgap/en (accessed January 6, 2015).
3
The Elements of Therapeutic Change
BOX 3-1
Obviously, the further apart the theoretical orientations, the less likely it is that shared elements function in the same way across two interventions. For example, exploration of attempts to avoid distressing thoughts and feelings within psychodynamic therapy functions to identify unresolved conflicts, whereas exploration of avoidance of unwanted thoughts or images in cognitive-behavioral therapy provides the rationale for exposure therapy to reduce discomfort and improve functioning. The discussion returns to this issue below.
At the same time, some specific elements differentiate among manualized psychosocial interventions or are unique to a given manual. For example, the element of “the dialectic between acceptance and change” is
[FIGURE 3-1 An example of nonspecific and unique and shared specific elements. Recoverable label from the figure: Nonspecific Elements (e.g., engaging the client). NOTE: PTSD = posttraumatic stress disorder.]
Terminology
Recognition of the elements of evidence-based psychosocial interventions highlights the similarities across interventions as well as the true differences. However, this process of discovery is somewhat hampered by the lack of a common language for describing elements across different
Moderators
An elements approach for psychosocial interventions may advance the study of moderators of outcome, or what intervention is most effective for a given patient subgroup or individual. The study of moderation is consistent with the National Institute of Mental Health’s (NIMH’s) Strategic Plan for Research, in which a priority is to “foster personalized interventions and strategies for sequencing, or combining existing and novel interventions which are optimal for specific phases of disease progression (e.g.,
Mechanisms
Mechanisms of action could be investigated for each element or sequence of elements across multiple units of analysis (from genes to behavior), consistent with NIMH’s Research Domain Criteria Initiative (Insel et al., 2010) and its Strategic Plan for Research, which calls for mechanistic research for psychological treatments. For example, an aim of the Strategic Plan is to “develop objective surrogate measures of outcome and clinical change that extend beyond symptoms, to assess if target mechanisms underlying function, general health, and quality of life have been modified by treatments” (NIMH, 2015). The elements of psychosocial interventions themselves are not mediators or mechanisms. However, elements may have the capacity to be tied more precisely to mechanisms than is the case for a complex psychosocial intervention comprising multiple elements. For example, the element of “cognitive restructuring” relates more closely to the mechanism of attentional bias than does a manual comprising cognitive restructuring, relaxation training, and exposure techniques for anxiety disorders. Similarly, the mechanism of social cognition in schizophrenia may be linked more closely to the element of “social skills training” than to the effects of broader intervention packages such as assertive community treatment or supported employment. Knowledge of mechanisms can be used to hone psychosocial interventions to be optimally effective (Kazdin, 2014). In addition, an elements approach could encourage investigation of the degree to which outcomes are mediated by nonspecific versus specific elements. Although both are critical to intervention success, the debate noted earlier regarding the relative importance of each could be advanced by this approach.
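As a schematic of how mediation by a purported mechanism might be examined, the Python sketch below runs the regression steps popularized by Baron and Kenny (1986) on simulated data. It is purely illustrative: the variable names, effect sizes, and data are hypothetical and are not drawn from the report.

# Illustrative mediation check on simulated data (hypothetical variables).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
treatment = rng.integers(0, 2, n)                      # 1 = received the element, 0 = control
mediator = 0.5 * treatment + rng.normal(size=n)        # e.g., change in attentional bias
outcome = 0.4 * mediator + 0.1 * treatment + rng.normal(size=n)  # e.g., symptom improvement

X1 = sm.add_constant(treatment)
total_effect = sm.OLS(outcome, X1).fit()               # step 1: treatment -> outcome
a_path = sm.OLS(mediator, X1).fit()                    # step 2: treatment -> mediator
X2 = sm.add_constant(np.column_stack([treatment, mediator]))
direct_effect = sm.OLS(outcome, X2).fit()              # step 3: outcome on treatment + mediator

print(total_effect.params, a_path.params, direct_effect.params)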
A mechanistic approach is not without constraints. The degree to which mechanisms can be tied to particular elements alone or presented in sequence is limited, especially given the potential lag time between the delivery of an intervention and change in either the mediator or the outcome—although this same limitation applies to complex psychosocial interventions comprising multiple elements. Nonetheless, emerging evidence on the role of neural changes as mechanisms of psychological interventions (e.g., Quide et al., 2012) and rapidly expanding technological advances for recording real-time moment-to-moment changes in behavior (e.g., passive recording
of activity levels and voice tone) and physiology (e.g., sleep) hold the potential for much closer monitoring of purported mediators and outcomes that may offer more mechanistic precision than has been available to date.
Intervention Development
The elements approach would not preclude the development of new psychosocial interventions using existing or novel theoretical approaches. However, the approach could have an impact on the development of new interventions in several ways. First, any new intervention could be examined in the context of existing elements that can be applied to new populations or contexts. This process could streamline the development of new interventions and provide a test of how necessary it is to develop entirely novel interventions. Second, for the development of new psychosocial interventions, elements would be embedded in a theoretical model that specifies (1) mechanisms of action for each element (from genes to brain to behavior), recognizing that a given element may exert its impact through more than one mechanism; (2) measures for establishing fidelity; and (3) measures of purported mechanisms and outcomes for each element. Also, new interventions could be classified into their shared and unique elements, providing a way to justify the unique elements theoretically. Finally, the development of fidelity measures could be limited to those unique elements in any new intervention.
Training
When elements are presented together in a single manual, an interven
tion can be seen as quite complex (at least by inexperienced practitioners).
The implementation of complex interventions in many mental health care
delivery centers may prove prohibitive, since many such interventions do
not get integrated regularly into daily practice (Rogers, 2003). Training in
the elements has the potential to be more efficient as practitioners would
learn strategies and techniques that can be applied across target problems/
disorders or contexts. This approach could lead to greater uptake com
pared with a single complex intervention (Rogers, 2003), especially for
disciplines with relatively less extensive training in psychosocial interven
tions. Furthermore, many training programs for evidence-based psychoso
cial interventions already use an elements framework, although currently
these frameworks are tied to specific theoretical models and approaches.
For example, the comprehensive Improving Access to Psychological Therapies (IAPT) program in the United Kingdom trains clinicians in the elements of cognitive-behavioral therapy, interpersonal psychotherapy, and brief
psychodynamic therapy (NHS, 2008). Conceivably, an elements approach could be extended across such training programs, cutting across specific theoretical orientations.
Implementation
Attempts recently have been made to implement an elements approach
for evidence-based psychosocial interventions for children, adolescents, and
adults (e.g., Chorpita et al., 2005). One such approach—the Distillation
and Matching Model of Implementation (Chorpita et al., 2005) (described
in more detail in Chapter 4)—involves an initial step of coding and iden
tifying the elements (i.e., specific activities, techniques, and strategies) that
make up evidence-based treatments for childhood mental disorders. For
example, evaluation of 615 evidence-based psychosocial treatment manu
als for youth yielded 41 elements (Chorpita and Daleiden, 2009). After the
elements were identified, they were ranked in terms of how frequently they
occurred within evidence-based psychosocial intervention manuals in rela
tion to particular client characteristics (e.g., target problem, age, gender,
ethnicity) and treatment characteristics (e.g., setting, format). Focusing on
the most frequent elements has the advantage of identifying elements that
are the most characteristic of evidence-based psychosocial interventions.
Figure 3-3 shows a frequency listing for an array of elements for interven
tions for anxiety disorders, specific phobia, depression, and disruptive
behavior in youth. Figure 3-4 ties the frequency listing for specific phobia
to further characteristics of the sample.
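The distillation step can be sketched computationally. The following is a minimal Python illustration, assuming a handful of hypothetical coded manuals and invented element names (it is not the actual Chorpita and Daleiden coding scheme): elements are tallied across manuals that match a given client profile, yielding the kind of frequency ranking shown in Figures 3-3 and 3-4.

from collections import Counter

# Hypothetical coded manuals: each record lists the target problem,
# client age group, and the practice elements identified in the manual.
coded_manuals = [
    {"problem": "anxiety", "age": "6-12", "elements": ["exposure", "relaxation", "psychoeducation"]},
    {"problem": "anxiety", "age": "13-17", "elements": ["exposure", "cognitive restructuring"]},
    {"problem": "depression", "age": "13-17", "elements": ["cognitive restructuring", "activity scheduling"]},
]

def element_frequencies(manuals, problem=None, age=None):
    """Count how often each element appears in manuals matching the given
    client characteristics (the 'distillation' tally)."""
    counts = Counter()
    for manual in manuals:
        if problem and manual["problem"] != problem:
            continue
        if age and manual["age"] != age:
            continue
        counts.update(manual["elements"])
    return counts.most_common()

# Rank elements for evidence-based anxiety treatments for 6- to 12-year-olds.
print(element_frequencies(coded_manuals, problem="anxiety", age="6-12"))

Run for each combination of target problem and client characteristics, such tallies yield the kind of practitioner-facing matrix described below.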
In terms of implementation, the matrix of elements (ranked by frequency
for different patient characteristics) functioned as a guide for community
practitioners, who chose the elements that matched their sample. Whereas
Chorpita and colleagues (2005) do not address nonspecific elements (i.e.,
common factors), an elements approach could encourage practitioners to
select nonspecific elements as the foundation of their intervention, and to
select specific elements from among those occurring most frequently that
have an evidence base for their population (i.e., a personalized approach).
With the accrual of evidence, the personalized selection of elements could
increasingly be based on research demonstrating which elements, or se
quence of elements, are most effective for specific clinical profiles.
FIGURE 3-4 Intervention element profiles by patient characteristics for the example of specific phobia. (Figures 3-3 and 3-4 reprinted with permission from Springer Science.)
Initial evidence suggests that a modular, elements-based approach can produce outcomes as good as or better than those of standard manuals in both the short term (Weisz et al., 2012) and long term (Chorpita
et al., 2013). Also, implementation of an elements approach to training in
the Child and Adolescent Mental Health Division of the Hawaii Depart
ment of Health resulted in decreased time in treatment and increased rate of
improvement (Daleiden et al., 2006). The training in Hawaii was facilitated
by a Web-based system that detailed the research literature to help clinicians
gather information relevant to their particular needs (i.e., which elements
are most frequent in evidence-based treatments for a targeted problem
with certain sample characteristics). Because the investigative team derived
elements from manualized interventions that are evidence based, and be
cause by far the majority of such interventions for child mental health fall
under the rubric of cognitive-behavioral therapy, the elements focused on
cognitive-behavioral approaches. However, application of a matrix of ele
ments for all evidence-based psychosocial interventions across all targeted
problems/disorders and various sample characteristics (e.g., age, gender,
ethnicity/race) is likely to provide a larger array of elements that are not
restricted to cognitive-behavioral therapies.
A further limitation is that an element's frequency ranking depends in part on the number of studies using a given element. The result can be a “frequency bias” when one is making general statements about the importance of any given element.
Finally, only those psychosocial interventions deemed evidence based
would be included in efforts to identify elements. Consequently, some
potentially effective interventions for which efficacy has not been demon
strated would be omitted from such efforts. Also, because some psycho
therapy traditions have not emphasized the demonstration of efficacy, the
full range of potentially effective elements might not be identified.
SUMMARY
The committee recognizes the major gains that have been made to date
in demonstrating the efficacy of manualized psychosocial interventions
through randomized controlled clinical trials. The committee also recog
nizes that evidence-based psychosocial interventions comprise therapeutic strategies, activities, and techniques (i.e., elements), some of which are nonspecific and common to most if not all interventions and others of which are specific to a particular theoretical model and approach to intervention. Furthermore, some ele
ments denoted as specific are actually shared among certain manualized
psychosocial interventions, although not always referred to using the same
terminology, whereas others are unique. The lack of a common terminol
ogy is an impediment to research. The committee suggests the need for
research to develop a common terminology that elucidates the elements
of evidence-based psychosocial interventions, to evaluate the elements’
optimal sequencing and dosing in different populations and for different
target problems, and to investigate their mechanisms. This research agenda
may have the potential to inform training in and the implementation of an
elements approach in the future. However, it should not be carried out to
the exclusion of other research agendas that may advance evidence-based
psychosocial interventions.
REFERENCES
Carroll, K. M., and L. S. Onken. 2005. Behavioral therapies for drug abuse. The American Journal of Psychiatry 162(8):1452-1460.
Chorpita, B. F., and E. L. Daleiden. 2009. Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology 77(3):566-579.
Chorpita, B. F., E. L. Daleiden, and J. R. Weisz. 2005. Identifying and selecting the common
elements of evidence-based interventions: A distillation and matching model. Mental
Health Services Research 7(1):5-20.
Chorpita, B. F., J. R. Weisz, E. L. Daleiden, S. K. Schoenwald, L. A. Palinkas, J. Miranda, C. K. Higa-McMillan, B. J. Nakamura, A. A. Austin, and C. F. Borntrager. 2013. Long-term outcomes for the Child STEPs randomized effectiveness trial: A comparison of modular and standard treatment designs with usual care. Journal of Consulting and Clinical Psychology 81(6):999-1009.
Clark, D. M. 2011. Implementing NICE guidelines for the psychological treatment of depres
sion and anxiety disorders: The IAPT experience. International Review of Psychiatry
23(4):318-327.
Daleiden, E. L., B. F. Chorpita, C. Donkervoet, A. M. Arensdorf, and M. Brogan. 2006. Get
ting better at getting them better: Health outcomes and evidence-based practice within
a system of care. Journal of the American Academy of Child and Adolescent Psychiatry
45(6):749-756.
DCOE (Defense Centers of Excellence). 2011. Best practices identified for peer support
programs: White paper. http://www.dcoe.mil/content/Navigation/Documents/Best_
Practices_Identified_for_Peer_Support_Programs_Jan_2011.pdf (accessed May 29, 2015).
Dixon, L. B., F. Dickerson, A. S. Bellack, M. Bennett, D. Dickinson, R. W. Goldberg, A.
Lehman, W. N. Tenhula, C. Calmes, R. M. Pasillas, J. Peer, and J. Kreyenbuhl. 2010.
The 2009 schizophrenia PORT psychosocial treatment recommendations and summary
statements. Schizophrenia Bulletin 36(1):48-70.
Ehlers, A., J. Bisson, D. M. Clark, M. Creamer, S. Pilling, D. Richards, P. P. Schnurr, S. Turner,
and W. Yule. 2010. Do all psychological treatments really work the same in posttraumatic stress disorder? Clinical Psychology Review 30(2):269-276.
Insel, T., B. Cuthbert, M. Garvey, R. Heinssen, D. S. Pine, K. Quinn, and P. Wang. 2010.
Research Domain Criteria (RDoC): Toward a new classification framework for research
on mental disorders. The American Journal of Psychiatry 167(7):748-751.
4
Standards for Reviewing the Evidence
Conflicting recommendations can arise even when well-respected organizations have reviewed the same body
of evidence. For example, two independent organizations reviewed behav
ioral treatments for autism spectrum disorders and produced very different
recommendations on the use of behavioral interventions for these disorders.
The National Standards Project (NSP) reviewed more than 700 studies using
a highly detailed rating system—the Scientific Merit Rating Scale—and de
termined that 11 treatments had sufficient evidence to be considered effica
cious (NAC, 2009). During the same time period, however, the Agency for
Healthcare Research and Quality (AHRQ) sponsored a systematic review of
the same literature and concluded that the evidence was not strong enough
to prove the efficacy of any treatments for these disorders (AHRQ, 2011).
The reason for these differing recommendations lies in how studies were se
lected and included in the review: the NSP included single case studies using
a special process to rank their validity and quality, while AHRQ eliminated
more than 3,406 articles based on its selection criteria, according to which
only randomized controlled trials (RCTs) were included, and single case
studies with sample sizes of less than 10 were excluded.
Having a standardized, coordinated process for determining which
interventions are evidence based for given disorders and conditions could
mitigate this problem. Two examples of the benefits of such coordination
are NICE in the United Kingdom and the VA’s Evidence-Based Synthesis
Program (ESP). Both employ a coordinated process for conducting system
atic reviews and creating guidelines based on internationally agreed-upon
standards, and both have a process for evaluating the impact of guidelines
on practice and outcomes.
NICE is a nondepartmental public body that is responsible for developing guidance and quality standards (NICE, 2011; Vyawahare et al., 2014). It was established to overcome inconsistencies in the delivery of health care across regional health authorities in England and Wales. NICE
works with the National Health Service (NHS) to ensure high-quality
health care, and is responsible for conducting systematic reviews, develop
ing guidelines and recommendations, and creating tools for clinicians to
assist in the implementation of care that adheres to the guidelines. NICE’s
recommendations encompass health care technologies, treatment guidelines,
and guidance in the implementation of best practices. Its guideline process
involves a number of steps, with consumers actively engaged at each step
(NICE, 2014). A systematic review is called for when the U.K. Department
of Health refers a topic for review. A comment period is held so that con
sumers and clinicians can register interest in the topic. Once there is ample
interest, the National Collaborating Center prepares the scope of work and
key questions for the systematic review, which are then made available for
consumer input. Next, an independent guideline group is formed, consisting
of health care providers, experts, and consumers. Internal reviewers within
NICE conduct the systematic review, and the guideline group creates guide
lines based on the review. A draft of the guidelines undergoes at least one
public comment period, after which the final guidelines are produced, and
implementation materials are made available through NHS.
Preliminary reviews of the impact of the NICE process have indicated
that it has resulted in positive outcomes for many health disorders (Payne
et al., 2013), and in particular for mental health and behavioral problems
(Cairns et al., 2004; Pilling and Price, 2006). Recommendations from this
body also have informed the credentialing of providers who deliver psy
chosocial interventions, ensuring that there is a workforce to provide care
in accordance with the guidelines (Clark, 2011). In the psychosocial inter
vention realm, NICE has identified several interventions as evidence based
(e.g., brief dynamic therapy, cognitive-behavioral therapy, interpersonal
psychotherapy) for a variety of mental health and substance use problems.
One result has been the creation of the Improving Access to Psychological Therapies (IAPT) program, charged with credentialing providers in these practices (see
Chapter 6 for a full description of this program and associated outcomes).
The VA follows a similar process in creating evidence-based standards
through the ESP (VA, 2015). The ESP is charged with conducting system
atic reviews and creating guidelines for nominated health care topics. It is
expected to conduct these reviews to the IOM standards and in a timely
fashion. The VA’s Health Services Research and Development division funds
four ESP centers, which have joint Veterans Health Administration (VHA)
and university affiliations. Each center director is an expert in the conduct
of systematic reviews, and works closely with the AHRQ Evidence-based
Practice Centers (EPCs) to conduct high-quality reviews and create guid
ance and implementation materials for clinicians and VA managers.
The process is overseen by a steering committee whose mission is to ensure
that the program is having an impact on the quality of care throughout the
VA. Regular reviews of impact are conducted with the aim of continuing to
improve the implementation process. A coordinating center monitors and
oversees the systematic review process, coordinates the implementation of
guidelines, and assists stakeholders in implementation and education.
The ESP model has been highly effective in improving the implemen
tation of psychosocial interventions in the VA system (Karlin and Cross,
2014a,b). To date, several evidence-based psychotherapies have been identi
fied and subsequently implemented in nearly every VA facility throughout
the United States (see Chapter 6 for details). Program evaluation has re
vealed that not only are clinicians satisfied with the training and support
they receive (see Chapter 5), but they also demonstrate improved competen
cies, and patients report greater satisfaction with care (Chard et al., 2012;
Karlin et al., 2013a,b; Walser et al., 2013).
Based on the successes of NICE and the VA, it is possible to develop a
process for conducting systematic reviews and creating guidelines and im
plementation materials for psychosocial interventions, as well as a process
for evaluating the impact of these tools, by leveraging existing resources.
The committee envisions a process that entails the procedures detailed
below and, as with NICE, involves input from consumers, professional
organizations, and clinicians at every step. The inclusion of consumers in
guideline development groups is important, although challenging (Harding
et al., 2011). In their review of consumer involvement in NICE’s guideline
development, Harding and colleagues (2011) recommend a shared decision-
making approach to consumer support: consumers may receive support
from consumer organizations, and should be provided with “decision sup
port aids” for grading and assessment purposes and given an opportunity
to discuss with other stakeholders any of their concerns regarding the
content of the proposed guidelines, with clear direction on how to initiate
those discussions. This approach can be supported by participatory action
research training as discussed in Chapter 2 (Graham et al., 2014; Scharlach
et al., 2014).
A potential direction for the United States is for the U.S. Department
of Health and Human Services (HHS), in partnership with professional
and consumer organizations, to develop a coordinated process for con
ducting systematic reviews of the evidence for psychosocial interventions
and creating guidelines and implementation materials in accordance with
the IOM standards for guideline development. Professional and consumer
organizations, which are in the best position to inform the review process,
could work collaboratively with representation from multiple stakeholders,
including consumers, researchers, professional societies and organizations,
policy makers, health plans, purchasers, and clinicians. This body would
recommend guideline topics, appoint guideline development panels (also
including consumers, researchers, policy makers, health plans, purchasers,
and clinicians), and develop procedures for evaluating the impact of the
guidelines on practice and outcomes. When a topic for review was nomi
nated, a comment period would be held so that consumers and clinicians
could register interest in the topic. Once the body had recommended a
topic for review and the guideline panel had been formed, the panel would
identify the questions to be addressed by the systematic review and create
guidelines based on the review. For topics on which systematic reviews
and guidelines already exist, a panel would review these guidelines and
recommend whether they should be disseminated or require update and/
or revision.
2 Current EPCs include Brown University, Duke University, ECRI Institute–Penn Medicine, Johns Hopkins University, Kaiser Permanente Research Affiliates, Mayo Clinic, Minnesota Evidence-based Practice Center, Pacific Northwest Evidence-based Practice Center–Oregon Health and Science University, RTI International–University of North Carolina, Southern California Evidence-based Practice Center–RAND Corporation, University of Alberta, University of Connecticut, and Vanderbilt University. http://www.ahrq.gov/research/findings/evidence-based-reports/centers/index.html (accessed June 21, 2015).
FIGURE 4-1 Proposed process for conducting systematic reviews and developing guidelines for psychosocial interventions. The figure depicts NREPP, NGC, and professional societies disseminating guidelines and implementation tools, and SAMHSA, NIH, PCORI, AHRQ, and related agencies evaluating the impact of those tools on care and practice.
NOTE: AHRQ = Agency for Healthcare Research and Quality; EPC = Evidence-Based Practice Center; HHS = U.S. Department of Health and Human Services; NGC = National Guideline Clearinghouse; NIH = National Institutes of Health; NREPP = National Registry of Evidence-based Programs and Practices; PCORI = Patient-Centered Outcomes Research Institute; SAMHSA = Substance Abuse and Mental Health Services Administration.
The IOM standards for systematic reviews have been adopted globally,
and are now employed in countries with a formal process for determin
ing whether a psychosocial intervention is indicated for a given problem
(Qaseem et al., 2012). They also are currently used for guideline develop
ment by professional organizations such as the American Psychiatric As
sociation and the American Psychological Association (Hollon et al., 2014).
Briefly, the process entails establishing a guideline panel to identify critical
questions that guide the systematic review, and ensuring that consumers are
represented throughout the process. As noted earlier, the review should be conducted by a group that is separate from and independent of the guideline developers.
This group collects information from a variety of sources; grades the qual
ity of that information using two independent raters; and then presents the
evidence to the guideline panel, which is responsible for developing recom
mendations based on the review.
The systematic review process is guided by the questions asked. Typi
cally, reviews focus on determining the best assessment and treatment pro
tocols for a given disorder. Reviews usually are guided by what are called
PICOT questions: In (Population U), what is the effect of (Intervention W)
compared with (Control X) on (Outcome Y) within (Time Z) (Fineout-
Overholt et al., 2005)? Other questions to be addressed derive from the
FDA. When the FDA approves a drug or device for marketing, the existing
data must provide information on its effective dose range, safety, toler
ability/side effects, and effectiveness (showing that the drug/device has an
effect on the mechanism underlying the disease being treated and is at least
as efficacious as existing treatments) (FDA, 2014).
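To make the PICOT structure described above concrete, a review team's key questions might be stored as small structured records. The sketch below is illustrative only; the population, intervention, comparator, outcome, and time frame shown are hypothetical examples rather than questions drawn from any actual review.

from dataclasses import dataclass

@dataclass
class PICOTQuestion:
    """Structured representation of a PICOT review question."""
    population: str    # P: In (Population U)
    intervention: str  # I: what is the effect of (Intervention W)
    comparison: str    # C: compared with (Control X)
    outcome: str       # O: on (Outcome Y)
    time: str          # T: within (Time Z)

    def as_sentence(self) -> str:
        return (f"In {self.population}, what is the effect of {self.intervention} "
                f"compared with {self.comparison} on {self.outcome} within {self.time}?")

# Hypothetical example question for a systematic review.
question = PICOTQuestion(
    population="adults with major depressive disorder",
    intervention="cognitive-behavioral therapy",
    comparison="usual care",
    outcome="symptom remission",
    time="12 weeks",
)
print(question.as_sentence())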
Although the PICOT and FDA questions are important in determining
the effectiveness of psychosocial interventions, they are not sufficient to
ensure appropriate adoption of an intervention. Often, questions related
to moderators that facilitate or obstruct an intervention’s success, such as
intervention characteristics, required clinician skill level, systems needed
to support intervention fidelity, and essential treatment elements, are not
included in systematic reviews, yet their inclusion is necessary to ensure that
the intervention and its elements are implemented appropriately by health
plans, clinicians, and educators.
It is well known that interventions such as assertive community treat
ment and psychotherapies such as cognitive-behavioral therapy are complex
and may not need to be implemented in their entirety to result in a positive
outcome (Lyon et al., 2015; Mancini et al., 2009; Salyers et al., 2003).
Beyond the PICOT and FDA questions, then, important additional ques
tions include the minimal effective dose of an intervention and the essential
elements in the treatment package. As discussed in Chapter 3, instead of
having to certify clinicians in several evidence-based interventions, a more
economical approach may be to identify their elements and determine the
effectiveness of those elements in treating target problems for different
populations and settings (Chorpita et al., 2005, 2007). The review process
also should address the acceptability of an intervention to consumers. For
example, cognitive-behavioral therapy for depression is a well-established,
evidence-based psychosocial intervention that many health plans already
cover; however, it is an intervention with high consumer dropout early in
treatment, and early dropout is associated with poorer outcomes (Bados et
al., 2007; Schindler et al., 2013; Schnicker et al., 2013).
Reviews also should extract information on the practicalities of implementing psychosocial interventions and their elements. The types of evidence included in a review, then, should depend on the question being asked, the intervention type, the desired outcome, and the quality with which the methodology of the intervention was applied.
Further, data from field trials and observational studies can comple
ment data from RCTs and mechanistic trials, yet there is little support
for this type of research in the arena of psychosocial interventions. While
pharmaceutical companies historically have had the resources to field test
their interventions, psychosocial interventions often are developed in the
field and in academia, rather than by large companies. Whereas agencies
such as the National Institutes of Health have served as the primary funders
of research evaluating psychosocial interventions, funds for field and ob
servational studies have been constrained by budgetary limitations. More
funding is needed to evaluate these interventions so that systematic reviews
can be conducted comprehensively.
Clinical registries can serve a range of purposes, including the following:
• Determining the clinical effectiveness, cost-effectiveness, or comparative effectiveness of a test or treatment, including evaluating the acceptability of drugs, devices, or procedures for reimbursement.
• Measuring or monitoring the safety and harm of specific products and treatments, including comparative evaluation of safety and effectiveness.
• Measuring or improving the quality of care, including conducting programs to measure and/or improve the practice of medicine and/or public health.
• Assessing natural history, including estimating the magnitude of a problem, determining an underlying incidence or prevalence rate, examining trends of disease over time, conducting surveillance, assessing service delivery and identifying groups at high risk, documenting the types of patients served by a health care provider, and describing and estimating survival.
Examples of professional societies that maintain such registries are the Society for Thoracic Surgeons, the American College of Cardiology,
and the American Society of Anesthesiologists. Both the Health Informa
tion Technology for Economic and Clinical Health (HITECH) Act and the
ACA support the creation of online registries to improve the quality and
reduce the cost of behavioral health interventions, as do health plans, pur
chasers, hospitals, physician specialty societies, pharmaceutical companies,
and patients. As an example, the Patient-Centered Outcomes Research
Institute’s (PCORI’s) PCORnet program has the aim of developing a large
and nationally representative registry to conduct comparative effectiveness
research.
These approaches to data synthesis when information on psychosocial
interventions is not readily available are particularly helpful in identify
ing directions for future research. When faced with minimal information
about the utility of psychosocial interventions in understudied settings and
populations, the entity conducting systematic reviews could employ these
models to identify candidate best practices and to generate hypotheses
about candidate interventions, and could work with research funding agen
cies (e.g., NIMH, PCORI) to deploy the candidate best practices and study
their impact and implementation.
A number of studies have applied machine learning and natural-language processing to clinical and research text (e.g., Li et al., 2013; Patrick et al., 2011; Tang et al., 2013; Xu et al., 2012). Machine learning refers to training computers to detect patterns in data using statistical (often Bayesian) modeling and then to develop decision algorithms based on those patterns. One study has demonstrated that machine-learning
technology not only reduces the workload of systematic reviewers but also
results in more reliable data extraction than is obtained with manual review
(Matwin et al., 2010). Another study employed natural-language process
ing techniques, preprocessing key terms from study abstracts to create a
semantic vector model for prioritizing studies according to relevance to
the review. The researchers found that this method reduced the number of
publications that reviewers needed to evaluate, significantly reducing the
time required to conduct reviews (Jonnalagadda and Petitti, 2013). The
application of this technology to ongoing literature surveillance also could
result in more timely updates to recommendations. To be clear, the commit
tee is not suggesting that machine learning be used to replace the systematic
review process, but rather to augment and streamline the process, as well
as potentially lower associated costs.
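As a hedged illustration of the kind of semantic prioritization described above (not the specific system evaluated by Jonnalagadda and Petitti, 2013), the sketch below uses TF-IDF vectors and cosine similarity to rank candidate abstracts by relevance to a review question; the abstracts and the query are invented for this example.

# A minimal sketch of relevance ranking for screening support, assuming
# scikit-learn is available; abstracts and the review question are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "Randomized trial of cognitive-behavioral therapy for adolescent depression.",
    "Survey of clinician attitudes toward electronic health records.",
    "Meta-analysis of exposure-based treatments for anxiety disorders in youth.",
]
review_question = "psychosocial interventions for depression and anxiety in youth"

vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(abstracts + [review_question])

# Similarity of each abstract to the review question (last vector).
scores = cosine_similarity(doc_vectors[:-1], doc_vectors[-1]).ravel()

# Present abstracts to reviewers in descending order of estimated relevance.
for score, abstract in sorted(zip(scores, abstracts), reverse=True):
    print(f"{score:.2f}  {abstract}")

In practice, a screening tool would be trained on reviewers' include/exclude decisions rather than a single query, but the ranking principle is the same.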
The use of clinical and research networks and learning environments
to collect data on outcomes for new interventions and their elements is
another potential way to ensure that information on psychosocial interven
tions is contemporary. As an example, the Mental Health Research Net
work (MHRN), an NIMH-funded division of the HMO Research Network
and Collaboratory, consists of 13 health system research centers across
the United States that are charged with improving mental health care. It
comprises research groups, special interest groups, and a large research-
driven infrastructure for conducting large-scale clinical trials and field trials
(MHRN, n.d.). The MHRN offers a unique opportunity to study innova
tions in psychosocial interventions, system- and setting-level challenges to
implementation, and relative costs. HHS could partner with consortiums
such as the MHRN to obtain contemporary information on psychosocial
interventions, as well as to suggest areas for research.
REFERENCES
AHRQ (Agency for Healthcare Research and Quality). 1996. Clinical practice guidelines archive.
http://www.ahrq.gov/professionals/clinicians-providers/guidelines-recommendations/
archive.html (accessed June 11, 2015).
_____. 2011. Therapies for children with autism spectrum disorders. Comparative effectiveness
review no. 26. AHRQ publication no. 11-EHC029-EF. http://www.effectivehealthcare.
ahrq.gov/ehc/products/106/656/CER26_Autism_Report_04-14-2011.pdf (accessed May
29, 2015).
_____. 2014. About National Guideline Clearinghouse. http://www.guideline.gov/about/index.
aspx (accessed February 5, 2015).
_____. n.d. Registries for evaluating patient outcomes: A user’s guide. http://effectivehealthcare.
ahrq.gov/index.cfm/search-for-guides-reviews-and-reports/?productid=12&pageaction=
displayproduct (accessed June 18, 2015).
Arce, J. M., L. Hernando, A. Ortiz, M. Díaz, M. Polo, M. Lombardo, and A. Robles. 2014.
Designing a method to assess and improve the quality of healthcare in Nephrology by
means of the Delphi technique. Nefrologia: Publicacion Oficial de la Sociedad Espanola
Nefrologia 34(2):158-174.
Bados, A., G. Balaguer, and C. Saldana. 2007. The efficacy of cognitive-behavioral therapy and
the problem of drop-out. Journal of Clinical Psychology 63(6):585-592.
Barry, C. L., J. P. Weiner, K. Lemke, and S. H. Busch. 2012. Risk adjustment in health in
surance exchanges for individuals with mental illness. American Journal of Psychiatry
169(7):704-709.
Becker, K. D., B. R. Lee, E. L. Daleiden, M. Lindsey, N. E. Brandt, and B. F. Chorpita. 2015.
The common elements of engagement in children’s mental health services: Which ele
ments for which outcomes? Journal of Clinical Child and Adolescent Psychology: The
Official Journal for the Society of Clinical Child and Adolescent Psychology, American
Psychological Association, Division 53 44(1):30-43.
Cairns, R., J. Evans, and M. Prince. 2004. The impact of NICE guidelines on the diagnosis and
treatment of Alzheimer’s disease among general medical hospital inpatients. International
Journal of Geriatric Psychiatry 19(8):800-802.
Chard, K. M., E. G. Ricksecker, E. T. Healy, B. E. Karlin, and P. A. Resick. 2012. Dissemina
tion and experience with cognitive processing therapy. Journal of Rehabilitation Research
and Development 49(5):667-678.
Chorpita, B. F., and E. L. Daleiden. 2009. Mapping evidence-based treatments for children and
adolescents: Application of the distillation and matching model to 615 treatments from
322 randomized trials. Journal of Consulting and Clinical Psychology 77(3):566-579.
Chorpita, B. F., E. L. Daleiden, and J. R. Weisz. 2005. Identifying and selecting the common
elements of evidence-based interventions: A distillation and matching model. Mental
Health Services Research 7(1):5-20.
Chorpita, B. F., K. D. Becker, and E. L. Daleiden. 2007. Understanding the common elements
of evidence-based practice: Misconceptions and clinical examples. Journal of the Ameri
can Academy of Child and Adolescent Psychiatry 46(5):647-652.
Chorpita, B. F., J. R. Weisz, E. L. Daleiden, S. K. Schoenwald, L. A. Palinkas, J. Miranda,
C. K. Higa-McMillan, B. J. Nakamura, A. A. Austin, C. F. Borntrager, A. Ward, K. C.
Wells, and R. D. Gibbons. 2013. Long-term outcomes for the Child STEPs randomized
effectiveness trial: A comparison of modular and standard treatment designs with usual
care. Journal of Consulting and Clinical Psychology 81(6):999-1009.
Clark, D. M. 2011. Implementing NICE guidelines for the psychological treatment of depres
sion and anxiety disorders: The IAPT experience. International Review of Psychiatry
23(4):318-327.
D’Avolio, L. W., T. M. Nguyen, S. Goryachev, and L. D. Fiore. 2011. Automated concept-level
information extraction to reduce the need for custom software and rules development.
Journal of the American Medical Informatics Association 18(5):607-613.
de Bruijn, B., C. Cherry, S. Kiritchenko, J. Martin, and X. Zhu. 2011. Machine-learned solu
tions for three stages of clinical information extraction: The state of the art at i2b2 2010.
Journal of the American Medical Informatics Association 18(5):557-562.
Decker, S. L., D. Kostova, G. M. Kenney, and S. K. Long. 2013. Health status, risk factors,
and medical conditions among persons enrolled in Medicaid vs. uninsured low-income
adults potentially eligible for Medicaid under the Affordable Care Act. Journal of the
American Medical Association 309(24):2579-2586.
FDA (U.S. Food and Drug Administration). 2014. FDA fundamentals. http://www.fda.gov/
AboutFDA/Transparency/Basics/ucm192695.htm (accessed February 5, 2015).
Fineout-Overholt, E., S. Hofstetter, L. Shell, and L. Johnston. 2005. Teaching EBP: Getting to
the gold: How to search for the best evidence. Worldviews on Evidence-Based Nursing
2(4):207-211.
Fortney, J., M. Enderle, S. McDougall, J. Clothier, J. Otero, L. Altman, and G. Curran. 2012.
Implementation outcomes of evidence-based quality improvement for depression in VA
community-based outpatient clinics. Implementation Science: IS 7:30.
Graham, T., D. Rose, J. Murray, M. Ashworth, and A. Tylee. 2014. User-generated quality
standards for youth mental health in primary care: A participatory research design using
mixed methods. BMJ Quality & Safety Online 1-10.
Guyatt, G. H., A. D. Oxman, G. E. Vist, R. Kunz, Y. Falck-Ytter, and H. J. Schünemann.
2008. Grade: What is “quality of evidence” and why is it important to clinicians? British
Medical Journal 336(7651):995-998.
Harding, K. J., A. J. Rush, M. Arbuckle, M. H. Trivedi, and H. A. Pincus. 2011. Measurement-
based care in psychiatric practice: A policy framework for implementation. Journal of
Clinical Psychiatry 72(8):1136-1143.
Hennessy, K. D., and S. Green-Hennessy. 2011. A review of mental health interventions in
SAMHSA’s National Registry of Evidence-based Programs and Practices. Psychiatric
Services 62(3):303-305.
Hollon, S. D., P. A. Arean, M. G. Craske, K. A. Crawford, D. R. Kivlahan, J. J. Magnavita,
T. H. Ollendick, T. L. Sexton, B. Spring, L. F. Bufka, D. I. Galper, and H. Kurtzman.
2014. Development of clinical practice guidelines. Annual Review of Clinical Psychol
ogy 10:213-241.
IOM (Institute of Medicine). 2011. Clinical practice guidelines we can trust. Washington, DC:
The National Academies Press.
Jonnalagadda, S., and D. Petitti. 2013. A new iterative method to reduce workload in system
atic review process. International Journal of Computational Biology and Drug Design
6(1-2):5-17.
Karlin, B. E., and G. Cross. 2014a. Enhancing access, fidelity, and outcomes in the na
tional dissemination of evidence-based psychotherapies. The American Psychologist
69(7):709-711.
_____. 2014b. From the laboratory to the therapy room: National dissemination and imple
mentation of evidence-based psychotherapies in the U.S. Department of Veterans Affairs
health care system. The American Psychologist 69(1):19-33.
Karlin, B. E., R. D. Walser, J. Yesavage, A. Zhang, M. Trockel, and C. B. Taylor. 2013a. Ef
fectiveness of acceptance and commitment therapy for depression: Comparison among
older and younger veterans. Aging & Mental Health 17(5):555-563.
Karlin, B. E., M. Trockel, C. B. Taylor, J. Gimeno, and R. Manber. 2013b. National dissemina
tion of cognitive-behavioral therapy for insomnia in veterans: Therapist- and patient-level
outcomes. Journal of Consulting and Clinical Psychology 81(5):912-917.
Kong, E. H., L. K. Evans, and J. P. Guevara. 2009. Nonpharmacological intervention for
agitation in dementia: A systematic review and meta-analysis. Aging & Mental Health
13(4):512-520.
Lang, L. A., and S. T. Teich. 2014. A critical appraisal of the systematic review process: Sys
tematic reviews of zirconia single crowns. The Journal of Prosthetic Dentistry 111(6):
476-484.
Li, Q., H. Zhai, L. Deleger, T. Lingren, M. Kaiser, L. Stoutenborough, and I. Solti. 2013. A
sequence labeling approach to link medications and their attributes in clinical notes and
clinical trial announcements for information extraction. Journal of the American Medical
Informatics Association 20(5):915-921.
Lindsey, M. A., N. E. Brandt, K. D. Becker, B. R. Lee, R. P. Barth, E. L. Daleiden, and B. F.
Chorpita. 2014. Identifying the common elements of treatment engagement interven
tions in children’s mental health services. Clinical Child and Family Psychology Review
17(3):283-298.
London, P., and G. L. Klerman. 1982. Evaluating psychotherapy. American Journal of Psy
chiatry 139(6):709-717.
Lyon, A. R., S. Dorsey, M. Pullmann, J. Silbaugh-Cowdin, and L. Berliner. 2015. Clinician
use of standardized assessments following a common elements psychotherapy training
and consultation program. Administration and Policy in Mental Health 42(1):47-60.
Mancini, A. D., L. L. Moser, R. Whitley, G. J. McHugo, G. R. Bond, M. T. Finnerty, and B. J.
Burns. 2009. Assertive community treatment: Facilitators and barriers to implementation
in routine mental health settings. Psychiatric Services 60(2):189-195.
Matwin, S., A. Kouznetsov, D. Inkpen, O. Frunza, and P. O’Blenis. 2010. A new algorithm
for reducing the workload of experts in performing systematic reviews. Journal of the
American Medical Informatics Association 17(4):446-453.
MHRN (Mental Health Research Network). n.d. About. https://sites.google.com/a/mhresearch
network.org/mhrn/home/about-us (accessed February 6, 2015).
Montgomery, E. C., M. E. Kunik, N. Wilson, M. A. Stanley, and B. Weiss. 2010. Can parapro
fessionals deliver cognitive-behavioral therapy to treat anxiety and depressive symptoms?
Bulletin of the Menninger Clinic 74(1):45-62.
Mynors-Wallis, L. 1996. Problem-solving treatment: Evidence for effectiveness and feasibility
in primary care. International Journal of Psychiatry in Medicine 26(3):249-262.
NAC (National Autism Center). 2009. National standards report: Phase 1. http://dlr.sd.gov/
autism/documents/nac_standards_report_2009.pdf (accessed February 21, 2015).
NICE (National Institute for Health and Clinical Excellence). 2011. Common mental health
disorders: Identification and pathways to care. Leicester, UK: British Psychological
Society.
_____. 2014. Developing NICE guidelines: The manual. https://www.nice.org.uk/article/
pmg20/chapter/1-Introduction-and-overview (accessed February 5, 2014).
_____. 2015. Guidance list. http://www.nice.org.uk/guidance/published?type=Guidelines (ac
cessed June 11, 2015).
Palinkas, L. A., J. R. Weisz, B. F. Chorpita, B. Levine, A. F. Garland, K. E. Hoagwood, and
J. Landsverk. 2013. Continued use of evidence-based treatments after a randomized
controlled effectiveness trial: A qualitative study. Psychiatric Services 64(11):1110-1118.
Patel, V. L., J. F. Arocha, M. Diermeier, R. A. Greenes, and E. H. Shortliffe. 2001. Methods of
cognitive analysis to support the design and evaluation of biomedical systems: The case
of clinical practice guidelines. Journal of Biomedical Informatics 34(1):52-66.
Patrick, J. D., D. H. Nguyen, Y. Wang, and M. Li. 2011. A knowledge discovery and reuse
pipeline for information extraction in clinical notes. Journal of the American Medical
Informatics Association 18(5):574-579.
Payne, H., N. Clarke, R. Huddart, C. Parker, J. Troup, and J. Graham. 2013. Nasty or nice?
Findings from a U.K. survey to evaluate the impact of the National Institute for Health
and Clinical Excellence (NICE) clinical guidelines on the management of prostate cancer.
Clinical Oncology 25(3):178-189.
Pilling, S., and K. Price. 2006. Developing and implementing clinical guidelines: Lessons from
the NICE schizophrenia guideline. Epidemiologia e Psichiatria Sociale 15(2):109-116.
Qaseem, A., F. Forland, F. Macbeth, G. Ollenschläger, S. Phillips, and P. van der Wees. 2012.
Guidelines international network: Toward international standards for clinical practice
guidelines. Annals of Internal Medicine 156(7):525-531.
Salyers, M. P., G. R. Bond, G. B. Teague, J. F. Cox, M. E. Smith, M. L. Hicks, and J. I. Koop.
2003. Is it ACT yet? Real-world examples of evaluating the degree of implementation
for assertive community treatment. Journal of Behavioral Health Services & Research
30(3):304-320.
SAMHSA (Substance Abuse and Mental Health Services Administration). 2015. About
SAMHSA’s National Registry of Evidence-based Programs and Practices. http://www.
nrepp.samhsa.gov/AboutNREPP.aspx (accessed February 5, 2015).
Scharlach, A. E., C. L. Graham, and C. Berridge. 2014. An integrated model of co-ordinated
community-based care. Gerontologist doi:10.1093/geront/gnu075.
Schindler, A., W. Hiller, and M. Witthoft. 2013. What predicts outcome, response, and
drop-out in CBT of depressive adults? A naturalistic study. Behavioural and Cognitive
Psychotherapy 41(3):365-370.
Schnicker, K., W. Hiller, and T. Legenbauer. 2013. Drop-out and treatment outcome of outpa
tient cognitive-behavioral therapy for anorexia nervosa and bulimia nervosa. Compre
hensive Psychiatry 54(7):812-823.
Tang, B., Y. Wu, M. Jiang, Y. Chen, J. C. Denny, and H. Xu. 2013. A hybrid system for
temporal information extraction from clinical text. Journal of the American Medical
Informatics Association 20(5):828-835.
Tol, W. A., I. H. Komproe, D. Susanty, M. J. Jordans, R. D. Macy, and J. T. De Jong. 2008.
School-based mental health intervention for children affected by political violence in
Indonesia: A cluster randomized trial. Journal of the American Medical Association
300(6):655-662.
VA (U.S. Department of Veterans Affairs). 2015. Health Services Research and Development:
Evidence-Based Synthesis Program. http://www.hsrd.research.va.gov/publications/esp
(accessed February 5, 2015).
Victora, C. G., J. P. Habicht, and J. Bryce. 2004. Evidence-based public health: Moving beyond
randomized trials. American Journal of Public Health 94(3):400-405.
Vyawahare, B., N. Hallas, M. Brookes, R. S. Taylor, and S. Eldabe. 2014. Impact of the Na
tional Institute for Health and Care Excellence (NICE) guidance on medical technology
uptake: Analysis of the uptake of spinal cord stimulation in England 2008-2012. BMJ
Open 4(1):e004182.
Walser, R. D., B. E. Karlin, M. Trockel, B. Mazina, and C. Barr Taylor. 2013. Training in
and implementation of Acceptance and Commitment Therapy for depression in the Vet
erans Health Administration: Therapist and patient outcomes. Behaviour Research and
Therapy 51(9):555-563.
Wen, H., J. R. Cummings, J. M. Hockenberry, L. M. Gaydos, and B. G. Druss. 2013. State
parity laws and access to treatment for substance use disorder in the United States: Im
plications for federal parity legislation. JAMA Psychiatry 70(12):1355-1362.
Xu, Y., K. Hong, J. Tsujii, and E. I. Chang. 2012. Feature engineering combined with machine
learning and rule-based methods for structured information extraction from narrative
clinical discharge summaries. Journal of the American Medical Informatics Association
19(5):824-832.
5
Quality Measurement
The Patient Protection and Affordable Care Act (ACA) has set the stage
for transformation of the health care system. This transformation includes
change in what the nation wants from health care as well as in how care is
paid for. New care delivery systems and payment reforms require measures
for tracking the performance of the health care system. Quality measures
are among the critical tools for health care providers and organizations dur
ing the process of transformation and improvement (Conway and Clancy,
2009). Quality measures also play a critical role in the implementation and
monitoring of innovative interventions and programs. This chapter begins
by defining what constitutes a good quality measure. It then reviews the process for mea
sure development and endorsement and the existing landscape of quality
measures for treatment of mental health and substance use (MH/SU) disor
ders. Next, the chapter details a framework for the development of quality
measures—structural, process, and outcome measures—for psychosocial
interventions, including the advantages, disadvantages, opportunities, and
challenges associated with each. The final section presents the committee’s
recommendations on quality measurement.
To date, quality measures are lacking for key areas of MH/SU treat
ment. Of the 55 nationally endorsed measures related to MH/SU, just 2
address a psychosocial intervention (both dealing with intervention for
substance use) (see Table 5-1). An international review of quality measures
in mental health similarly showed the lack of measures for psychosocial
interventions, with fewer than 10 percent of identified measures being
considered applicable to these interventions (Fisher et al., 2013). The small
number of nationally endorsed quality measures addressing MH/SU reflects
both limitations in the evidence base for what treatments are effective at
achieving improvements in patient outcomes and challenges faced in obtaining the detailed information necessary to support quality measurement from existing clinical data (Byron et al., 2014; Kilbourne et al., 2010; Pincus et al., 2011).
TABLE 5-1 (excerpt), Medications:
• Diabetes Monitoring for People with Diabetes and Schizophrenia (SMD) (NQF #1934; Process)
• Diabetes Care for People with Serious Mental Illness: Hemoglobin A1c (HbA1c) Testing (NQF #2603; Process)
Most of the endorsed measures listed in Table 5-1 are used to evaluate
processes of care. Of the 13 outcome measures, 4 are focused on depres
sion. The endorsed measures address care in inpatient and outpatient set
tings, and several address screening and care coordination. Few address
patient-centeredness.
While the NQF endorsement process focuses on performance measures
for assessing processes and outcomes of care, measures used for accredita
tion or certification purposes often articulate expectations for structural
capabilities and how those resources are used. Examples of such structure-related standards include the following:
• State of New York, Standards for Health Homes: “The health home provider is accountable for engaging and retaining health home enrollees in care; coordinating and arranging for the provision of services; supporting adherence to treatment recommendations; and monitoring and evaluating a patient’s needs, including prevention, wellness, medical, specialist, and behavioral health treatment, care transitions, and social and community services where appropriate through the creation of an individual plan of care” (New York State Health Department, 2012).
• NCQA, The Medical Home System Survey (MHSS) (NQF #1909): The MHSS is used to assess the degree to which an individual primary care practice or provider has in place the structures and processes of an evidence-based patient-centered medical home. The survey comprises six composite measures, each used to assess a particular domain of the patient-centered medical home.
Structure Measures
“Structural components have a propensity to influence the process of
care . . . changes in the process of care, including variations in quality,
will influence the outcomes of care, broadly defined. Hence, structural
effects on outcomes are mediated through process.”
—Donabedian, 1980, p. 84
Opportunities
The committee envisions important opportunities to develop and apply
structure measures as part of a systematic, comprehensive, and balanced
strategy for enhancing the quality of psychosocial interventions. Structure
measures can be used to assess providers’ training and capacity to offer
evidence-based psychosocial interventions. They provide guidance on in
frastructure development and best practices. They support credentialing
and payment, thereby allowing purchasers and health plans to select clin
ics or provider organizations that are equipped to furnish evidence-based
psychosocial interventions. Finally, they can support consumers in select
ing providers with expertise in interventions specific to their condition or
adapted to their cultural expectations (Brown et al., 2014). These structural concepts could provide the foundation of a framework for developing quality measures for psychosocial interventions.
Challenges
A number of challenges must be considered in exploiting the opportuni
ties for developing and implementing structure measures described above:
• While there is strong face validity for these concepts, and most of
them are key components of evidence-based chronic care models,
they have not been formally tested individually or together.
• Resources would be needed to support both the documentation and
the verification of structures.
• Clinical organizations providing care for MH/SU disorders have
less well developed information systems compared with general
health care and also are excluded from the incentive programs
in the Health Information Technology for Economic and Clinical
Health (HITECH) Act (CMS, 2015b). The costs of developing the
health information technology and other capacities necessary to
meet the structural criteria discussed above will require additional
resources.
• The infrastructure for clinician training, competency assessment,
and certification in evidence-based psychosocial interventions is
neither well developed nor standardized at the local or national
level. For MH/SU clinical organizations to implement their own
clinician training and credentialing programs would be highly
inefficient.
• Many providers of care for MH/SU disorders work in solo or
small practices and lack access to the infrastructure assumed for
the concepts discussed above. There would need to be a substantial
restructuring of the practice environment and shift of incentives to
encourage providers to link with organizations that could provide
this infrastructure support. Incentive strategies would need to go
beyond those associated with reimbursement (perhaps involving
licensure and certification), because a significant proportion of
providers of MH/SU care do not accept insurance (Bishop et al.,
2014).
Process Measures
“[Measuring the process of care] is justified by the assumption that . . .
what is now known to be ‘good’ medical care has been applied. . . .
The estimates of quality that one obtains are less stable and less final
than those that derive from the measurement of outcomes. They may . . .”
Opportunities
The committee sees important opportunities to develop and apply
process measures as part of a systematic, comprehensive, and balanced
strategy for enhancing the quality of psychosocial interventions. Defining
the processes of care associated with evidence-based psychosocial interven
tions is complicated. However, effective and efficient measures focused on
the delivery of evidence-based psychosocial interventions are important
opportunities for supporting the targeting and application of improvement
strategies (Brown et al., 2014), and currently used data sources offer several
opportunities to track the processes of care (see Table 5-4).
Challenges
A number of challenges need to be considered in the design of process
measures, many related to the nature of the data source itself. Claims,
EHRs, and consumer surveys all pose challenges as data sources for these
measures.
Claims, while readily available, exist for the purpose of payment, not
tracking the content of treatment. Procedure codes used for billing lack
detail on the content of psychotherapy; the codes have broad labels such
as “individual psychotherapy” and “group psychotherapy” (APA, 2013). A
further complication is that state Medicaid programs have developed their
own psychotherapy billing codes, and these, too, provide no detail on the content of the treatment delivered.
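Despite these limitations, claims can support simple process measures. The following is a generic sketch, not an NQF-endorsed specification: the visit records, broad code labels, and 90-day window are hypothetical assumptions used only to show how such a measure is computed.

from datetime import date, timedelta

# Hypothetical claims: (patient_id, service_date, procedure_description)
claims = [
    ("p1", date(2015, 1, 5),  "individual psychotherapy"),
    ("p2", date(2015, 2, 10), "medication management"),
    ("p3", date(2015, 3, 1),  "group psychotherapy"),
]

# Hypothetical index dates of a new depression diagnosis per patient.
index_dates = {"p1": date(2015, 1, 1), "p2": date(2015, 1, 15), "p3": date(2014, 11, 1)}

PSYCHOTHERAPY = {"individual psychotherapy", "group psychotherapy"}
WINDOW = timedelta(days=90)

def met_measure(patient_id):
    """True if the patient has any psychotherapy claim within 90 days of diagnosis."""
    start = index_dates[patient_id]
    return any(
        pid == patient_id and svc in PSYCHOTHERAPY and start <= d <= start + WINDOW
        for pid, d, svc in claims
    )

numerator = sum(met_measure(pid) for pid in index_dates)
rate = numerator / len(index_dates)
print(f"Process measure rate: {rate:.0%}")  # share of patients meeting the measure

Note that such a measure can establish only that some psychotherapy was billed, not which elements were delivered, which is precisely the limitation of claims data discussed above.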
Outcome Measures
“Outcomes do have . . . the advantage of reflecting all contributions
to care, including those of the patient. But this advantage is also a
handicap, since it is not possible to say precisely what went wrong
unless the antecedent process is scrutinized.”
—Donabedian, 1988, p. 1746
Opportunities
The committee sees important opportunities to develop and apply
quality measures based on patient-reported outcomes as part of a system
atic, comprehensive, and balanced strategy for enhancing the quality of
psychosocial interventions. Priority domains for these quality measures
include symptom reduction/remission, functional status, patient/consumer
perceptions of care, and recovery outcomes.
SAMHSA identifies four major dimensions that support recovery: (1) health, wherein an individual manages his or her condition and makes informed, healthy choices that support physical and emotional well-being; (2) home, where
an individual has a stable, safe place to live; (3) purpose, with an individual
engaging in meaningful daily activities (e.g., job, school, volunteering);
and (4) community, wherein an individual builds relationships and social
networks that provide support (SAMHSA, 2011).
Measure developers have made different assumptions regarding the
underlying mechanisms of recovery and included different domains in their
recovery outcome measures (Scheyett et al., 2013). Several instruments—
including the Consumer Recovery Outcomes System (Bloom and Miller,
2004), the Recovery Assessment Scale (RAS) (Corrigan et al., 1999; Salzer
and Brusilovskiy, 2014), and the Recovery Process Inventory (Jerrell et al.,
2006)—have strong psychometric properties. The RAS in particular has
been used in the United States with good results. It is based on five domains:
(1) confidence/hope, (2) willingness to ask for help, (3) goal and success
orientation, (4) reliance on others, and (5) no domination by symptoms
(Corrigan et al., 1999; Salzer and Brusilovskiy, 2014).
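As a purely illustrative sketch (the item identifiers and their domain assignments below are invented and are not the actual RAS item mapping), domain scores for a recovery instrument of this kind can be computed by averaging the Likert-type item responses assigned to each domain.

from statistics import mean

# Hypothetical mapping of item IDs to RAS-style domains (not the real instrument).
DOMAIN_ITEMS = {
    "confidence/hope": ["q1", "q2"],
    "willingness to ask for help": ["q3"],
    "goal and success orientation": ["q4", "q5"],
    "reliance on others": ["q6"],
    "no domination by symptoms": ["q7", "q8"],
}

# Hypothetical 1-5 Likert responses from one consumer.
responses = {"q1": 4, "q2": 5, "q3": 3, "q4": 4, "q5": 4, "q6": 2, "q7": 3, "q8": 4}

domain_scores = {
    domain: mean(responses[item] for item in items)
    for domain, items in DOMAIN_ITEMS.items()
}

for domain, score in domain_scores.items():
    print(f"{domain}: {score:.1f}")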
Challenges
A number of challenges are entailed in measuring MH/SU outcomes.
These involve (1) determination of which measures and which outcomes to
use; (2) accountability and the lack of a standardized methodology for risk
adjustment related to complexity, risk profile, and comorbidities; and (3) the gap in available outcome measures for the other major MH/SU disorders, as well as for quality of life and full recovery.
3 See Table 5-1 for information on outcome measures NQF #1884, #1885, #0710, #0711, and #0726.
The focus on symptom response/remission measures also does not take
into account the fact that consumers with an MH/SU disorder often have
multiple comorbid conditions. They also rarely receive only one psycho
social intervention, more often receiving a combination of services, such
as medication management and one or more psychosocial interventions,
making assessment of overall response to MH/SU services appealing. Out
come measures look at overall impact on the consumer and are particularly
relevant for psychosocial interventions that have multifactorial, person-
centered dimensions.
The large number of tools available for assessing diverse outcomes
makes comparisons across organizations and populations highly challeng
ing. In the CMS EHR incentive program, specification of quality mea
sures that use patient-reported outcomes requires specific code sets (CMS,
2015b). Use of measures in the public domain can reduce the burden on
health information technology vendors and providers. Consensus on tools
for certain topics (e.g., the PHQ-9 for monitoring depression symptoms)
allows for relative ease of implementation; however, other tools are pre
ferred for specific populations. An initiative called PROsetta stone is under
way to link the PROMIS scales with other measures commonly used to
assess patient-reported outcomes (Choi et al., 2012). In addition, efforts to
develop a credible national indicator for subjective well-being that reflects
“how people experience and evaluate their lives and specific domains and
activities in their lives” (NRC, 2013, p. 15) have led to several advances
that may be worth considering for quality measurement.
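For example, a symptom-based outcome measure such as depression remission can be computed from repeated PHQ-9 scores. The sketch below is illustrative only: the patients and scores are hypothetical, and the assumption that remission means a follow-up PHQ-9 score below 5 is a common operationalization rather than a statement of any particular endorsed measure specification.

# Illustrative remission-rate calculation from PHQ-9 scores; patients and
# scores are hypothetical, and remission is assumed here to mean a
# follow-up PHQ-9 score below 5.
REMISSION_THRESHOLD = 5

# (baseline PHQ-9, follow-up PHQ-9) for patients who began treatment.
phq9_scores = {
    "p1": (18, 4),
    "p2": (15, 9),
    "p3": (21, 3),
    "p4": (12, None),  # lost to follow-up
}

followed_up = {pid: s for pid, s in phq9_scores.items() if s[1] is not None}
remitted = [pid for pid, (baseline, follow_up) in followed_up.items()
            if follow_up < REMISSION_THRESHOLD]

print(f"Follow-up rate: {len(followed_up) / len(phq9_scores):.0%}")
print(f"Remission rate among those followed: {len(remitted) / len(followed_up):.0%}")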
In one analysis, an average of 6.7 percent of the variance in patient outcomes was explained by models using
diagnostic and sociodemographic data, while an average of 22.8 percent
of the variance was explained by models using more detailed clinical and
quality-of-life data (Hermann et al., 2007). Risk adjustment models based
on administrative or claims data explained less than one-third of the vari
ance explained by models that included clinical assessment or medical
records data (Hermann et al., 2007). Consensus on a reasonable number
of clinical outcome and quality indicators is needed among payers, regula
tors, and behavioral health organizations to enable the development of risk
adjustment models that can account for the interactions among different
risk factors.
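The point about explained variance can be illustrated with a toy comparison (synthetic data and made-up predictors, not the Hermann et al. models): fit one regression using only administrative-style predictors and another that adds a clinical severity variable, and compare their R-squared values.

# Synthetic illustration of why richer clinical data improve risk adjustment;
# the data-generating process and predictors are invented for this sketch.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 1000
age = rng.normal(45, 12, n)                  # administrative predictor
diagnosis = rng.integers(0, 2, n)            # administrative predictor
baseline_severity = rng.normal(0, 1, n)      # clinical predictor (e.g., baseline score)

# Outcome driven mostly by clinical severity, plus noise.
outcome = 0.2 * diagnosis + 0.01 * age + 2.0 * baseline_severity + rng.normal(0, 1, n)

admin_X = np.column_stack([age, diagnosis])
full_X = np.column_stack([age, diagnosis, baseline_severity])

admin_r2 = r2_score(outcome, LinearRegression().fit(admin_X, outcome).predict(admin_X))
full_r2 = r2_score(outcome, LinearRegression().fit(full_X, outcome).predict(full_X))

print(f"R^2, administrative predictors only: {admin_r2:.2f}")
print(f"R^2, with clinical severity added:   {full_r2:.2f}")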
A further challenge is the lack of a cohesive and comprehensive plan requiring the use of standardized MH/SU outcome measures. Comprehensive approaches such as the Mental Health Statistics Improvement Program (MHSIP) could serve as a model for standardizing measures for MH/SU
disorders; however, even that program does not extend to outcomes other
than consumer satisfaction, nor does it cover individuals or care outside
of the public sector. Efforts to encourage the use of outcome measurement
need to be carried out at multiple levels and to involve multiple stake
holders. Consumers need to be encouraged to track their own recovery;
clinicians to monitor patient responses and alter treatment strategies based
on those responses; and organizations to use this information for quality
improvement, network management, and accountability.
Coordination is needed among federal agencies, including the U.S. Department of Veterans Affairs (VA) and the U.S. Department of Defense (DoD), in order to make sufficient resources available and avoid duplication of effort. Also essential is coordination with relevant nongovernmental organizations, such as NQF, NCQA, and the Patient-Centered Outcomes Research Institute (PCORI), as well as professional associations and private payers, to support widespread adoption of the measures developed in multipayer efforts. The designated entity needs to be responsible for using a multistakeholder process to develop strategies for identifying measure gaps, establishing priorities for measure development, and determining mechanisms for evaluating the impact of measurement activities. In these efforts, representation and consideration of the multiple disciplines involved in the delivery of behavioral health care treatment are essential. Consumer/family involvement needs to encompass participation in multistakeholder panels that guide measure development; efforts to garner broad input, such as focus groups; and specific efforts to obtain input on how to present the findings of quality measurement in ways that are meaningful to consumers/families.
In the short term, priority needs to be given to structure measures that set expectations for the infrastructure needed to support outcome measurement and the delivery of evidence-based psychosocial interventions, so as to establish the capacity for expanded routine clinical use of outcome measures. A second priority is the development of process measures that can be used to assess access to care (in light of concerns about expanded populations with access to MH/SU care under the ACA and the limited availability of specialty care and evidence-based services). Other process measures addressing the content of care can be used for hypothesis generation and testing with regard to quality improvement. The measurement strategy needs to take into account how performance measures can be used to support patient care in real time, as well as the quality improvement efforts of care teams, organizations, plans, and states, and to encompass efforts to assess the impact of policies concerning the application of quality measures at the local, state, and federal levels. HHS is best positioned to lead efforts to gain consensus on the priority of developing and applying patient-reported outcome measures for use in quality assessment and of validating patient-reported outcome measures for gap areas such as recovery. Standardized and validated patient-reported outcome measures are necessary for performance measurement.
The committee drew the following conclusion about the development of approaches to measure the quality of psychosocial interventions:
REFERENCES
AHRQ (Agency for Healthcare Research and Quality). 2010. National healthcare dispari
ties report. http://www.ahrq.gov/research/findings/nhqrdr/nhdr11/nhdr11.pdf (accessed
February 17, 2015).
_____. 2015a. CAHPS surveys and tools to advance patient-centered care. https://cahps.ahrq.
gov (accessed June 15, 2015).
_____. 2015b. Experience of Care and Health Outcomes (ECHO). https://www.cahps.ahrq.
gov/surveys-guidance/echo/index.html (accessed June 15, 2015).
CMS (Centers for Medicare & Medicaid Services). 2015d. Annual update of 2014 eligible hospitals and eligible professionals Electronic
Clinical Quality Measures (eCQMs). http://www.cms.gov/Regulations-and-Guidance/
Legislation/EHRIncentivePrograms/Downloads/eCQM_TechNotes2015.pdf (accessed
June 15, 2015).
_____. n.d. CMS measures management system blueprint (the Blueprint) v 11.0. http://www.
cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Measures
ManagementSystemBlueprint.html (accessed October 1, 2014).
Conway, P. H., and C. Clancy. 2009. Transformation of health care at the front line. Journal
of the American Medical Association 301(7):763-765.
Corrigan, P. W., D. Giffort, F. Rashid, M. Leary, and I. Okeke. 1999. Recovery as a psycho
logical construct. Community Mental Health Journal 35(3):231-239.
Deegan, P. E. 1988. Recovery: The lived experience of rehabilitation. Journal of Psychosocial
Rehabilitation 11(4):11-19.
Donabedian, A. 1980. Explorations in quality assessment and monitoring. Vol. I. Ann Arbor,
MI: Health Administration Press.
_____. 1988. The quality of care. How can it be assessed? Journal of the American Medical
Association 260(12):1743-1748.
_____. 2005. Evaluating the quality of medical care. 1966. Milbank Quarterly 83(4):691-729.
Ettner, S. L., R. G. Frank, T. G. McGuire, J. P. Newhouse, and E. H. Notman. 1998. Risk
adjustment of mental health and substance abuse payments. Inquiry 35(2):223-239.
Fillenbaum, G. G., and M. A. Smyer. 1981. The development, validity, and reliability of the
OARS multidimensional functional assessment questionnaire. Journal of Gerontology
36(4):428-434.
Fisher, C. E., B. Spaeth-Rublee, and H. A. Pincus. 2013. Developing mental health-care quality
indicators: Toward a common framework. International Journal for Quality in Health
Care 25(1):75-80.
Fortney, J., K. Rost, M. Zhang, and J. Pyne. 2001. The relationship between quality and
outcomes in routine depression care. Psychiatric Services 52(1):56-62.
Frank, R. 2014. Presentation to Committee on Developing Evidence-Based Standards for
Psychosocial Interventions for Mental Disorders. Workshop on Approaches to Quality
Improvement, July 24, Washington, DC.
Garnick, D. W., C. M. Horgan, M. T. Lee, L. Panas, G. A. Ritter, S. Davis, T. Leeper, R.
Moore, and M. Reynolds. 2007. Are Washington Circle performance measures associ
ated with decreased criminal activity following treatment? Journal of Substance Abuse
Treatment 33(4):341-352.
Glasgow, R. E., R. M. Kaplan, J. K. Ockene, E. B. Fisher, and K. M. Emmons. 2012. Patient-
reported measures of psychosocial issues and health behavior should be added to elec
tronic health records. Health Affairs (Millwood) 31(3):497-504.
Harding, C. M., G. W. Brooks, T. Ashikaga, J. S. Strauss, and A. Breier. 1987. The Vermont
longitudinal study of persons with severe mental illness. II: Long-term outcome of sub
jects who retrospectively met DSM-III criteria for schizophrenia. American Journal of
Psychiatry 144(6):727-735.
Harding, K. J., A. J. Rush, M. Arbuckle, M. H. Trivedi, and H. A. Pincus. 2011. Measurement-
based care in psychiatric practice: A policy framework for implementation. Journal of
Clinical Psychiatry 72(8):1136-1143.
Harrow, M., T. H. Jobe, and R. N. Faull. 2012. Do all schizophrenia patients need antipsy
chotic treatment continuously throughout their lifetime? A 20-year longitudinal study.
Psychological Medicine 42(10):2145-2155.
Hepner, K. A., F. Azocar, G. L. Greenwood, J. Miranda, and M. A. Burnam. 2010. Develop
ment of a clinician report measure to assess psychotherapy for depression in usual care
settings. Administration and Policy in Mental Health 37(3):221-229.
NQF (National Quality Forum). 2009. Nursing staff skill mix. NQF #0204. http://www.
qualityforum.org/QPS/0204 (accessed January 27, 2015).
_____. 2011. Medical Home System Survey (MHSS). NQF #1909. http://www.qualityforum.
org/QPS/1909 (accessed January 27, 2015).
_____. 2014a. Consensus development process. http://www.qualityforum.org/Measuring_
Performance/Consensus_Development_Process.aspx (accessed April 15, 2015).
_____. 2014b. Measure applications partnership. http://www.qualityforum.org/map (accessed
January 27, 2015).
_____. 2014c. Measure evaluation criteria. http://www.qualityforum.org/docs/measure_
evaluation_criteria.aspx (accessed April 27, 2015).
_____. 2015. Quality positioning system. http://www.qualityforum.org/QPS/QPSTool.aspx
(accessed June 15, 2015).
NQMC (National Quality Measures Clearinghouse). 2014. Tutorials on quality measures:
Desirable attributes of a quality measure. http://www.qualitymeasures.ahrq.gov (accessed
November 6, 2014).
NRC (National Research Council). 2013. Subjective well-being: Measuring happiness, suf
fering, and other dimensions of experience. Washington, DC: The National Academies
Press.
Pincus, H. A., B. Spaeth-Rublee, and K. E. Watkins. 2011. Analysis & commentary: The
case for measuring quality in mental health and substance abuse care. Health Affairs
(Millwood) 30(4):730-736.
Richmond, T., S. T. Tang, L. Tulman, J. Fawcett, and R. McCorkle. 2004. Measuring function.
In Instruments for clinical health-care research, 3rd ed., edited by M. Frank-Stromborg
and S. J. Olsen. Sudbury, MA: Jones & Bartlett. Pp. 83-99.
Salzer, M. S., and E. Brusilovskiy. 2014. Advancing recovery science: Reliability and validity
properties of the Recovery Assessment Scale. Psychiatric Services 65(4):442-453.
SAMHSA (Substance Abuse and Mental Health Services Administration). 2011. SAMHSA’s
working definition of recovery. https://store.samhsa.gov/shin/content/PEP12-RECDEF/
PEP12-RECDEF.pdf (accessed January 8, 2015).
Scheyett, A., J. DeLuca, and C. Morgan. 2013. Recovery in severe mental illnesses: A literature
review of recovery measures. Social Work Research 37(3):286-303.
Schoenwald, S. K., J. E. Chapman, A. J. Sheidow, and R. E. Carter. 2009. Long-term youth
criminal outcomes in MST transport: The impact of therapist adherence and orga
nizational climate and structure. Journal of Clinical Child & Adolescent Psychology
38(1):91-105.
Schoenwald, S. K., A. F. Garland, M. A. Southam-Gerow, B. F. Chorpita, and J. E. Chapman.
2011. Adherence measurement in treatments for disruptive behavior disorders: Pursu
ing clear vision through varied lenses. Clinical Psychology (New York) 18(4):331-341.
Slade, M., M. Amering, and L. Oades. 2008. Recovery: An international perspective. Epide
miologia e Psichiatria Sociale 17(2):128-137.
Spitzer, R. L., K. Kroenke, J. B. W. Williams, and B. Löwe. 2006. A brief measure for assess
ing generalized anxiety disorder: The GAD-7. Archives of Internal Medicine 166(10):
1092-1097.
Steinman, K. J., K. Kelleher, A. E. Dembe, T. M. Wickizer, and T. Hemming. 2012. The use of
a “mystery shopper” methodology to evaluate children’s access to psychiatric services.
Journal of Behavioral Health Services & Research 39(3):305-313.
Strong, D. M., Y. W. Lee, and R. Y. Wang. 1997. Data quality in context. Communications
of the ACM 40(5):103-110.
Sullivan, G. 2008. Complacent care and the quality gap. Psychiatric Services 59(12):1367-1367.
Tu, S. W., C. Nyulas, M. Tierney, A. Syed, R. Musacchio, and T. B. Üstün. 2014. A content
model for health interventions. Presented at WHO—Family of International Classifica
tions Network Annual Meeting 2014, October 11-17, Barcelona, Spain.
UW (University of Washington). 2013. Mental Health Statistics Improvement Program
(MHSIP) surveys. Seattle, WA: University of Washington Department of Psychiatry and
Behavioral Sciences, Public Behavioral Health and Justice Policy. https://depts.washington.
edu/pbhjp/projects-programs/page/mental-health-statistics-improvement-program-adult
consumer-survey-acs (accessed June 15, 2015).
Velentgas, P., N. A. Dreyer, and A. W. Wu. 2013. Outcome definition and measurement. In
Developing a protocol for observational comparative effectiveness research: A user’s
guide, Ch. 6, edited by P. Velentgas, N. A. Dreyer, P. Nourjah, S. R. Smith, and M. M.
Torchia. Rockville, MD: AHRQ. Pp. 71-92.
Vinik, A. I., and E. Vinik. 2003. Prevention of the complications of diabetes. American Journal
of Managed Care 9(Suppl. 3):S63-S80; quiz S81-S84.
Ward, J. C., M. G. Dow, K. Penner, T. Saunders, and S. Halls. 2006. Manual for using the
Functional Assessment Rating Scale (FARS). http://outcomes.fmhi.usf.edu/FARSUser
Manual2006.pdf (accessed September 26, 2014).
Ware, J. E. 2014. SF-36 health survey update. http://www.sf-36.org/tools/sf36.shtml (accessed
January 28, 2015).
Williams, B. 1994. Patient satisfaction: A valid concept? Social Science and Medicine 38:
509-516.
Wilson, I. B., and P. D. Cleary. 1995. Linking clinical variables with health-related quality of
life. A conceptual model of patient outcomes. Journal of the American Medical Associa
tion 273:59-65.
Wolraich, M. L., W. Lambert, M. A. Doffing, L. Bickman, T. Simmons, and K. Worley. 2003.
Psychometric properties of the Vanderbilt ADHD diagnostic parent rating scale in a
referred population. Journal of Pediatric Psychology 28(8):559-568.
6
Quality Improvement
TABLE 6-1 Stakeholders and Their Levers for Influencing the Quality of Care for Mental Health and Substance Use Disorders

Stakeholder: Consumers
Levers for Influencing Quality of Care:
• Evaluation
• Service provision
• Participation in governance
• Shared decision making
Examples:
• Participation/leadership in evaluation
• Participation in surveys
• Serving as administrators, members of advisory boards
• Serving as peer support specialists
CONSUMERS
Substantive consumer participation—the formal involvement of consumers in the design, implementation, and evaluation of interventions—is known to improve the outcomes of psychosocial interventions (Delman, 2007; Taylor et al., 2009). The unique experience and perspective of consumers also make their active involvement essential to quality management and improvement for psychosocial interventions (Linhorst et al., 2005). To be meaningful, the participation must be sustained over time and focused on crucial elements of the program (Barbato et al., 2014). Roles for consumers include involvement in evaluation, training, management, and service provision, as well as active participation in their own care, such as through shared decision making, self-management programs, and patient-centered medical homes. As noted in Chapter 2, participatory action research (PAR) methods engage consumers. The PAR process necessarily includes resources and training for consumer participants and cross-training among stakeholders (Delman and Lincoln, 2009).
Evaluation
Evidence supports the important role of consumers in program evaluation (Barbato et al., 2014; Drake et al., 2010; Hibbard, 2013). Consumers have been involved at all levels of evaluation, from evaluation design to data collection (Delman, 2007). At the design level, consumer participation helps organizations understand clients’ views and expectations for mental health care (Linhorst and Eckert, 2002), and ensures that outcomes meaningful to consumers are included in evaluations and that data are collected in a way that is acceptable to and understood by consumers (Barbato et al., 2014). Further, Clark and colleagues (1999) found that mental health consumers often feel free to talk openly to consumer interviewers, thus providing more honest and in-depth data than can otherwise be obtained. Personal interviews maximize consumer response rates overall and in populations frequently excluded from evaluation (e.g., homeless persons) (Barbato et al., 2014).
Training
Consumers can be valuable members of the workforce training team.
The active involvement of consumers in the education and training of health
Participation in Governance
Consumer participation in decisions about a provider organization’s
policy direction and management supports the development of psychosocial
interventions that meet the needs of consumers (Grant, 2010; Taylor et
al., 2009). Consumers’ increasing assumption of decision-making roles in
provider organizations and governmental bodies has resulted in innovations
that have improved the quality of care (e.g., peer support services) (Allen
et al., 2010). Consumer participation in managing services directly informs
organizations about consumer needs and has been strongly associated with
consumers’ having information about service quality and how to access
services (Omeni et al., 2014).
Consumer councils are common, and can be effective in involving
clients in formal policy reviews and performance improvement projects
(Taylor et al., 2009). Consumer council involvement provides staff with
a better understanding of consumers’ views and expectations, increases
clients’ involvement in service improvement, and can impact management
decisions (Linhorst et al., 2005). Clients are more likely to participate when
their program (e.g., group homes, hospitals) encourages their independence
and involvement in decision making (Taylor et al., 2009).
Service Provision
By actively participating in discussions within treatment teams and
with staff more generally, consumers bring a lived experience that can
round out a more clinical view, improving the treatment decision-making
process. Consumers take on a wide variety of service delivery roles as peer
support workers, a general term applying to people with a lived experience
of mental illness who are empathetic and provide direct emotional support
for a consumer. Operating in these roles, peers can play an important part
in quality management and transformation (Drake et al., 2010). In August 2007, the Centers for Medicare & Medicaid Services (CMS) issued a letter to state Medicaid directors recognizing peer support as a reimbursable Medicaid service (CMS, 2007).
PROVIDERS
Behavioral health providers bring commitment and training to their work. Many, if not most, efforts to improve the quality of psychosocial interventions have focused on providers, reflecting their key role in helping clients achieve recovery and quality of life. Provider-focused efforts to improve quality of care include dissemination of treatment information, such as through manuals and guidelines; various forms of training, coaching, expert consultation, peer support, and supervision; fidelity checks; and provider profiling and feedback on performance. The Cochrane Effective Practice and Organisation of Care (EPOC) Group has conducted systematic reviews documenting the effectiveness of various provider-focused strategies for quality improvement, such as printed educational materials (12 randomized controlled trials [RCTs], 11 nonrandomized studies), educational meetings (81 RCTs), educational outreach (69 RCTs), local opinion leaders (18 RCTs), audit and feedback (118 RCTs), computerized reminders (28 RCTs), and tailored implementation (26 RCTs) (Cochrane, 2015; Grimshaw et al., 2012). Research on the implementation of evidence-based psychosocial interventions has focused overwhelmingly on strategies that entail monitoring fidelity (also referred to as adherence and compliance) and assessing provider attitudes toward or satisfaction with the interventions (Powell et al., 2014). Other clinician-level factors that can influence and improve quality include competence, motivation, and access to diagnostic and decision-making tools. Importantly, as noted above with regard to consumers, providers actively working in clinical settings should be engaged in the quality improvement culture and the design and application of these levers.
However, studies have shown that wait times in the VHA are not substantially longer than those in other health services settings (Brandenburg et al., 2015).
Utilization Management
Plans use a variety of utilization management techniques to influence the use of health care services by members. A plan’s goals for these techniques include controlling growth in health care spending and improving the quality of care—for example, by discouraging treatment overuse or misuse. Common utilization management techniques include prior authorization requirements, concurrent review, and fail-first policies (i.e., requiring an enrollee to “fail” on a lower-cost therapy before obtaining approval for coverage of a higher-cost therapy). These review processes can be burdensome for clinicians, and may encourage them to provide alternative treatments that are not subject to these techniques.
A large literature documents decreases in the use of health care services associated with utilization management techniques, with some studies suggesting that the quality of care could be adversely affected for some individuals (Newhouse et al., 1993). In the case of mental health–related prescription drugs, for example, the implementation of prior authorization requirements has been associated with reductions in the use of medications subject to prior authorization and lower medication expenditures, but also with reduced medication compliance and sometimes higher overall health care expenditures (e.g., Adams et al., 2009; Law et al., 2008; Lu et al., 2010; Motheral et al., 2004; Zhang et al., 2009). Similarly, the use of fail-first policies for prescription drugs (sometimes referred to as “step therapy”) has been associated with lower prescription drug expenditures (e.g., Farley et al., 2008; Mark et al., 2010); however, one study of a fail-first policy for antidepressant medications found that adoption of this policy was associated with an increase in mental health–related inpatient admissions, outpatient visits, and emergency room visits for antidepressant users in affected plans (Mark et al., 2010). Thus, the use of these tools can have both intended and unintended outcomes, and these outcomes can be linked to quality of care. However, a carefully constructed utilization management strategy could serve to improve the quality of psychosocial interventions if it resulted in more appropriate use of these interventions among those most likely to benefit from them. On the other hand, as with benefit design, the differential application of utilization management across treatment modalities could affect treatment decision making (i.e., individuals might be less likely to use services subject to stricter utilization management).
Selective contracting and network management is another utilization management tool used by plans that can influence the provision of psychosocial interventions. Plans typically form exclusive provider networks, contracting with a subset of providers in the area. Under this approach, plans generally provide more generous coverage for services delivered by network providers than for those delivered by providers outside the network. As a result, plans often can negotiate lower fees in exchange for the patient volume that will likely result from being part of the plan’s network. To ensure the availability of evidence-based psychosocial interventions, a plan’s provider network must include adequate numbers of providers who have skills in delivering these interventions and are accepting new patients. Importantly, plans will need tools with which to determine the competence of network providers in delivering evidence-based treatments. Network adequacy has been raised as a concern in the context of new insurance plans offered on the state-based health insurance exchanges under the Patient Protection and Affordable Care Act (Bixby, 1998).
As discussed in Chapter 1, the Mental Health Parity and Addiction Equity Act requires parity in coverage for behavioral health and general medical services. Parity is required in both quantitative treatment limitations (e.g., copays, coinsurance, inpatient day limits, outpatient visit limits) and nonquantitative treatment limitations, including the use of utilization management techniques by plans. Thus, plans are prohibited from using more restrictive utilization management for mental health and substance use services than for similar types of general medical services. However, the regulations would not govern differential use of utilization management techniques across different mental health/substance use treatment modalities (e.g., drugs versus psychosocial treatments).
Provider Payment
The methods used to pay health care providers for the services they deliver influence the types, quantity, and quality of care received by consumers. Historically, providers typically were paid on a fee-for-service (FFS) basis, with no explicit incentives for performance or quality of care. FFS payment creates incentives for the delivery of more services, as each service brings additional reimbursement, but does not encourage the coordination of care or a focus on quality improvement. Since their introduction more than 20 years ago, managed behavioral health care carve-outs—a dominant method of financing mental health/substance use care whereby specialty benefits for this care are separated from the rest of health care benefits and managed by a specialty managed care vendor—also have shaped the financing and delivery of behavioral health services. These arrangements allow the application of specialty management techniques for behavioral health care and help protect a pool of funds for behavioral health services (because a separate budget and contract are established just for these services). By definition, however, carve-out contracts increase fragmentation in service delivery and distort clinical decision making to some extent. For example, risk-based carve-out contracts have traditionally excluded psychiatric medications, giving carve-out organizations an incentive to encourage the use of medications, which are paid for outside the carve-out contract, over psychosocial services.
Bundled Payments
Instead of reimbursing each provider individually for every service delivered to a patient under an FFS model, bundled payment models involve fixed payments for bundles of related services. The bundle of services can be defined relatively narrowly (e.g., all physician and nonphysician services delivered during a particular inpatient stay) or more broadly, with the broadest bundle including all services provided to an individual over the course of a year (i.e., a global budget). The current Medicare accountable care organization (ACO) demonstration programs fall somewhere in the middle of this continuum, including almost all services in the bundle but placing the large provider organizations that serve as ACOs at only limited—not full—financial risk for total health care spending.
Bundled payment arrangements create incentives for efficiency in the delivery of all services included in the bundle and for greater coordination of care, in addition to providing incentives to substitute services not included in the bundle (and thus reimbursed outside of the bundled payment) where possible. These arrangements also raise concerns about stinting and poor quality of care to the extent that maintaining or improving quality can be costly. In the case of psychosocial interventions, the concern is that provider organizations operating under a global, full-risk payment contract, with strong incentives for efficiency in service delivery, could reduce the delivery of effective psychosocial interventions, for which quality measurement is often problematic and for which payment systems typically include no incentive for quality (Mechanic, 2012).
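The incentive contrast can be shown with toy arithmetic. In the sketch below, every dollar figure is hypothetical: under FFS, each additional psychosocial visit adds to the provider’s margin, while under a fixed bundled payment, each additional visit subtracts from it.

    # Toy comparison of provider margin under FFS versus a bundled payment.
    # All dollar amounts are hypothetical.
    FFS_RATE_PER_VISIT = 120.0   # hypothetical fee-for-service payment per visit
    COST_PER_VISIT = 90.0        # hypothetical provider cost per visit
    BUNDLED_PAYMENT = 1_800.0    # hypothetical fixed payment per episode of care

    def margin_ffs(visits: int) -> float:
        return visits * (FFS_RATE_PER_VISIT - COST_PER_VISIT)

    def margin_bundled(visits: int) -> float:
        return BUNDLED_PAYMENT - visits * COST_PER_VISIT

    for visits in (8, 12, 16):
        print(f"{visits:2d} visits | FFS margin: ${margin_ffs(visits):7.2f}"
              f" | bundled margin: ${margin_bundled(visits):7.2f}")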
Pay-for-Performance (P4P)
Public and private purchasers and plans also have embraced P4P approaches to encouraging quality improvement. Under P4P, clinicians or provider organizations receive bonuses if they meet or exceed certain quality thresholds that are specified in provider contracts. While the literature on P4P strategies suggests that they often result in improved quality as measured by the metrics used, the improvements often are relatively small in magnitude and may be somewhat narrowly focused on the clinical areas that are targeted through the measures (Colla et al., 2012; Mullen et al., 2010; Werner et al., 2013; Wilensky, 2011).
Risk-based payment models currently in use for Medicare and some commercial payers include a P4P component, with a set of performance metrics and associated financial incentives. The P4P components are included in the risk-based contracts in an effort to ensure that quality of care is maintained or improved in the face of greater provider financial risk for expenditures. Given such financial risk, provider organizations may be more likely to discourage the use of treatments with no associated quality metrics, or may be less focused on ensuring the quality of those treatments, relative to treatments for which financial incentives are included in the contract. This concern underscores the importance of incorporating validated quality metrics for psychosocial treatments in P4P systems. For any metrics based on outcome measures, it will be important for the P4P methodology to account for differences in patient case mix so as to counteract incentives for selection behavior on the part of clinicians and provider organizations.
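A minimal sketch of the case-mix point, using hypothetical providers, counts, and bonus threshold: comparing observed outcomes with those expected given each provider’s patient mix (an observed-to-expected ratio) keeps a provider serving sicker patients from being penalized for a lower raw rate.

    # Hypothetical case-mix-adjusted P4P bonus check.
    providers = {
        # name: (observed remissions, expected remission probability per patient)
        "clinic_A": (42, [0.55] * 50 + [0.30] * 30),  # mixed-severity panel
        "clinic_B": (30, [0.30] * 80),                # high-severity panel
    }

    BONUS_THRESHOLD = 1.05  # hypothetical: bonus if observed/expected >= 1.05

    for name, (observed, expected_probs) in providers.items():
        expected = sum(expected_probs)        # expected remissions given case mix
        oe_ratio = observed / expected
        raw_rate = observed / len(expected_probs)
        bonus = oe_ratio >= BONUS_THRESHOLD
        print(f"{name}: raw rate {raw_rate:.2f}, O/E {oe_ratio:.2f}, bonus: {bonus}")

In this hypothetical, clinic_B has the lower raw remission rate but the higher observed-to-expected ratio, so a raw-rate threshold would penalize it while the adjusted comparison would not.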
REFERENCES
Aarons, G. A., and D. H. Sommerfeld. 2012. Leadership, innovation climate, and attitudes
toward evidence-based practice during a statewide implementation. Journal of the Ameri
can Academy of Child and Adolescent Psychiatry 51(4):423-431.
Aarons, G., J. Horowitz, L. Dlugosz, and M. Ehrhart. 2012. The role of organizational pro
cesses in dissemination and implementation research. In Dissemination and implementa
tion research in health: Translating science to practice, edited by R. C. Brownson, G. A. Colditz, and E. K. Proctor. New York: Oxford University Press. Pp. 128-153.
ACF (Administration for Children and Families). n.d. Training and technical assistance. http://
www.acf.hhs.gov/programs/cb/assistance (accessed January 1, 2015).
ACGME (Accreditation Council for Graduate Medical Education) and ABPN (American
Board of Psychiatry and Neurology). 2013. The psychiatry milestone project. http://
acgme.org/acgmeweb/Portals/0/PDFs/Milestones/PsychiatryMilestones.pdf (accessed June
17, 2015).
Adams, A. S., F. Zhang, R. F. LeCates, A. J. Graves, D. Ross-Degnan, D. Gilden, T. J.
McLaughlin, C. Lu, C. M. Trinacty, and S. B. Soumerai. 2009. Prior authorization for
antidepressants in Medicaid: Effects among disabled dual enrollees. Archives of Internal
Medicine 169(8):750-756.
AHRQ (Agency for Healthcare Research and Quality). 2012. Closing the quality gap se
ries: Public reporting as a quality improvement strategy. http://www.effectivehealthcare.
ahrq.gov/search-for-guides-reviews-and-reports/?pageaction=displayproduct&product
ID=1198 (accessed June 17, 2015).
Albrecht, L., M. Archibald, D. Arseneau, and S. D. Scott. 2013. Development of a checklist
to assess the quality of reporting of knowledge translation interventions using the Work
group for Intervention Development and Evaluation Research (WIDER) recommenda
tions. Implementation Science 8(52).
Allen, J., A. Q. Radke, and J. Parks. 2010. Consumer involvement with state mental health
authorities. Alexandria, VA: National Association of Consumer/Survivor Mental Health
Administrators.
APA (American Psychological Association). n.d. Understanding APA accreditation. http://
www.apa.org/ed/accreditation/about (accessed June 17, 2015).
Austin, Z., A. Marini, N. MacLeod Glover, and D. Tabak. 2006. Peer-mentoring workshop
for continuous professional development. American Journal of Pharmaceutical Educa
tion 70(5):117.
Barbato, A., B. D’Avanzo, V. D’Anza, E. Montorfano, M. Savio, and C. G. Corbascio. 2014.
Involvement of users and relatives in mental health service evaluation. The Journal of
Nervous and Mental Disease 202(6):479-486.
Beidas, R. S., and P. C. Kendall. 2010. Training therapists in evidence-based practice: A criti
cal review of studies from a systems contextual perspective. Clinical Psychology: Science
and Practice 17:1-30.
Beidas, R. S., W. Cross, and S. Dorsey. 2014. Show me, don’t tell me: Behavioral rehearsal
as a training and analogue fidelity tool. Cognitive and Behavioral Practice 21(1):1-11.
Bixby, T. D. 1998. Network adequacy: The regulation of HMO’s network of health care
providers. Missouri Law Review 63:397.
Bledsoe, S. E., M. M. Weissman, E. J. Mullen, K. Ponniah, M. Gameroff, H. Verdeli, L.
Mufson, H. Fitterling, and P. Wickramaratne. 2007. Empirically supported psychother
apy in social work training programs: Does the definition of evidence matter? Research
on Social Work Practice 17:449-455.
Brandenburg, L., P. Gabow, G. Steele, J. Toussaint, and B. Tyson. 2015. Innovation and best
practices in health care scheduling. Discussion paper. Washington, DC: Institute of Medi
cine. http://www.iom.edu/schedulingbestpractices (accessed June 15, 2015).
Brown, A. H., A. N. Cohen, M. J. Chinman, C. Kessler, and A. S. Young. 2008. EQUIP: Imple
menting chronic care principles and applying formative evaluation methods to improve
care for schizophrenia: QUERI Series. Implementation Science 3:9.
Brownson, R. C., J. A. Jacobs, R. G. Tabak, C. M. Hoehner, and K. A. Stamatakis. 2012.
Designing for dissemination among public health researchers: Findings from a national
survey in the United States. American Journal of Public Health 103(9):1693-1699.
Chambers, D. A., R. E. Glasgow, and K. C. Stange. 2013. The dynamic sustainability frame
work: Addressing the paradox of sustainment amid ongoing change. Implementation
Science 8:117.
Choudhry, N. K., M. B. Rosenthal, and A. Milstein. 2010. Assessing the evidence for value-
based insurance design. Health Affairs 29(11):1988-1994.
Chun, M. B., and D. M. Takanishi, Jr. 2009. The need for a standardized evaluation method
to assess efficacy of cultural competence initiatives in medical education and residency
programs. Hawaii Medical Journal 68(1):2-6.
Clark, C. C., E. A. Scott, K. M. Boydell, and P. Goering. 1999. Effects of client inter
viewers on client-reported satisfaction with mental health services. Psychiatric Services
50(7):961-963.
CMS (Centers for Medicare & Medicaid Services). 2007. Letter to state Medicaid direc
tors. SMDL #07-011. August 15, 2007. http://downloads.cms.gov/cmsgov/archived
downloads/SMDL/downloads/SMD081507A.pdf (accessed June 16, 2015).
Cochrane. 2015. Cochrane effective practice and organisation of care group: Our reviews.
http://epoc.cochrane.org/our-reviews (accessed June 18, 2015).
Colla, C. H., D. E. Wennberg, E. Meara, J. S. Skinner, D. Gottlieb, V. A. Lewis, C. M.
Snyder, and E. S. Fisher. 2012. Spending differences associated with the Medicare Phy
sician Group Practice Demonstration. Journal of the American Medical Association
308(10):1015-1023.
Cross, W., M. M. Matthieu, J. Cerel, and K. L. Knox. 2007. Proximate outcomes of gatekeeper
training for suicide prevention in the workplace. Suicide and Life-Threatening Behavior
37(6):659-670.
Goldner, E. M., V. Jeffries, D. Bilsker, E. Jenkins, M. Menear, and L. Petermann. 2011. Knowl
edge translation in mental health: A scoping review. Healthcare Policy 7:83-98.
Goscha, R., and C. Rapp. 2014. Exploring the experiences of client involvement in medication
decisions using a shared decision making model: Results of a qualitative study. Commu
nity Mental Health Journal 1-8.
Grant, J. G. 2010. Embracing an emerging structure in community mental health services
hope, respect, and affection. Qualitative Social Work 9(1):53-72.
Grimshaw, J. M., M. P. Eccles, J. N. Lavis, S. J. Hill, and J. E. Squires. 2012. Knowledge
translation of research findings. Implementation Science 7:1-17.
Herschell, A. D., D. J. Kolko, B. L. Baumann, and A. C. Davis. 2010. The role of therapist
training in the implementation of psychosocial treatments: A review and critique with
recommendations. Clinical Psychology Review 30:448-466.
HHS (U.S. Department of Health and Human Services). 1999. Mental health: A report of the
Surgeon General. Rockville, MD: HHS, Substance Abuse and Mental Health Services
Administration, Center for Mental Health Services, National Institutes of Health, Na
tional Institute of Mental Health.
Hibbard, J. H. 2013. What the evidence shows about patient activation: Better health out
comes and care experiences; fewer data on costs. Health Affairs 32(2):207.
Horgan, C. M. 1985. Specialty and general ambulatory mental health services: A comparison
of utilization and expenditures. Archives of General Psychiatry 42:565-572.
_____. 1986. The demand for ambulatory mental health services from specialty providers.
Health Services Research 21(2):291-319.
Huskamp, H. A. 1999. Episodes of mental health and substance abuse treatment under a
managed behavioral health care carve-out. Inquiry 36(2):147-161.
IHI (Institute for Healthcare Improvement). 2003. The breakthrough series: IHI’s collab
orative model for achieving breakthrough improvement. http://www.ihi.org/resources/
Pages/IHIWhitePapers/TheBreakthroughSeriesIHIsCollaborativeModelforAchieving
BreakthroughImprovement.aspx (accessed June 17, 2015).
IOM (Institute of Medicine). 2006. Improving the quality of care for mental and substance
use conditions. Washington, DC: The National Academies Press.
Jamtvedt, G., J. M. Young, D. T. Kristoffersen, M. A. O’Brien, and A. D. Oxman. 2006. Au
dit and feedback: Effects on professional practice and health care outcomes. Cochrane
Database of Systematic Reviews (2):CD000259.
Karlin, B. E., and G. Cross. 2014a. Enhancing access, fidelity, and outcomes in the national
dissemination of evidence-based psychotherapies. American Psychologist 69(7):709-711.
_____. 2014b. From the laboratory to the therapy room: National dissemination and imple
mentation of evidence-based psychotherapies in the U.S. Department of Veterans Affairs
health care system. American Psychologist 69(1):19-33.
Kauth, M. R., G. Sullivan, and K. L. Henderson. 2005. Supporting clinicians in the develop
ment of best practice innovations in education. Psychiatric Services 56(7):786-788.
Klein, K. J., and A. P. Knight. 2005. Innovation implementation: Overcoming the challenge.
Current Directions in Psychological Science 14(5):243-246.
Kullgren, J. T., A. A. Galbraith, V. L. Hinrichsen, I. Miroshnik, R. B. Penfold, M. B. Rosenthal,
B. E. Landon, and T. A. Lieu. 2010. Health care use and decision making among
lower-income families in high-deductible health plans. Archives of Internal Medicine
170(21):1918-1925.
Landsverk, J., C. H. Brown, J. Rolls Reutz, L. A. Palinkas, and S. M. Horwitz. 2011. Design
elements in implementation research: A structured review of child welfare and child
mental health studies. Administration and Policy in Mental Health 38:54-63.
NHS (U.K. National Health Service). 2015. High intensity cognitive behavioural therapy
COMMITTEE EXPERTISE
The IOM formed a committee of 16 experts to respond to the study charge. The committee comprised members with expertise in health care policy, health care quality and performance, health systems research and operation, implementation science, intervention development and evaluation, primary care, professional education, clinical psychology and psychiatry, recovery-oriented care, and peer support services. Appendix B provides biographical information for each committee member.
LITERATURE REVIEW
PUBLIC MEETINGS
The committee deliberated from March 2014 through December 2014 to conduct this expert assessment. During this period, the committee held five 2-day meetings, and committee members also participated in multiple conference calls. Two public meetings were held in conjunction with the committee’s May and July 2014 meetings, which allowed committee members to obtain additional information on specific aspects of the study charge (see Boxes A-1 and A-2).
The first public meeting focused on approaches to quality measurement both in and outside the mental health care field. The second public meeting focused on approaches to quality improvement both in and outside the mental health care field, and included speakers with expertise in the fields of treatment fidelity, implementation, and health technology.
Each open-session meeting included a public comment period in which the committee invited input from any interested party. All open-session meetings were held in Washington, DC. A conference call number and online public comment tool were provided to allow opportunity for input from those unable to travel to the meetings. A link to the public comment tool was made available on the National Academies’ website from January 2014 to March 2015, and all online comments were catalogued in the study’s public access file. Any information provided to the committee from outside sources or through the online comment tool is available by request through the National Academies’ Public Access Records Office. The agendas for the two open-session committee meetings are presented below.
BOX A-1
BOX A-2
Mary Jane England, M.D. (Chair), is professor of health policy and man
agement at the Boston University School of Public Health. Recently, she
successfully completed a term as interim chair of community health sciences
at the Boston University School of Public Health. In 1964, Dr. England
received her medical degree from Boston University and launched an in
ternational career as a child psychiatrist. As an authority on employer and
employee benefits, she has brought multiple informed perspectives to bear
on health care reform. She was the first commissioner of the Massachusetts
Department of Social Services (1979-1983), associate dean and director
of the Littauer Master in Public Administration Program at the John F.
Kennedy School of Government at Harvard University (1983-1987), presi
dent of the American Medical Women’s Association (1986-1987), president
of the American Psychiatric Association (1995-1996), and a corporate
vice president of Prudential (1987-1990) and chief executive officer (CEO)
of the Washington Business Group on Health (1990-2001). A nationally
known expert on health care and mental health parity, Dr. England chaired
the Institute of Medicine (IOM) committee that produced the 2006 Qual
ity Chasm report on care for mental health and substance use disorders. In
2008, she chaired an IOM committee on parental depression and its effect
on children and other family members. In 2011, she chaired an IOM com
mittee on the public health dimensions of the epilepsies. Having recently
completed a term on the Commission on Effective Leadership (2006-2009)
of the American Council on Education and currently participating in the
Advancing Care Together project in Colorado (2009-present), Dr. England
including the Robert Wood Johnson Foundation. She directed the Brandeis/
Harvard Center to Improve the Quality of Drug Abuse Treatment, funded
by NIDA, which is focused on how performance measurement and man
agement and payment techniques can be harnessed more effectively and ef
ficiently to deliver higher-quality substance abuse treatment. For the past 15
years, Dr. Horgan has led a series of NIH-funded nationally representative
surveys of the provision of behavioral health care in private health plans, in
cluding the use of incentives, performance measures, and other approaches
to quality improvement, and how behavioral health parity legislation is af
fecting those services. She is a co-investigator on studies funded by NIDA,
the Robert Wood Johnson Foundation, and the Centers for Medicare &
Medicaid Services (CMS) on the design, implementation, and evaluation of
provider and patient incentive payments to improve care delivery. For more
than 20 years, Dr. Horgan has directed the NIAAA training program to
support doctoral students in health services research, teaching core courses
in the substance use and treatment areas and directing the weekly doctoral
seminar. Currently, she is a member of the National Committee for Qual
ity Assurance’s (NCQA’s) Behavioral Health Care Measurement Advisory
Panel and also serves on the National Quality Forum’s (NQF’s) Behavioral
Health Standing Committee. Dr. Horgan received her doctorate in health
policy and management from Johns Hopkins University and her master’s
degree in demography from Georgetown University.
Harold Alan Pincus, M.D., is professor and vice chair of the Department
of Psychiatry at Columbia University’s College of Physicians and Sur
geons, director of quality and outcomes research at New York-Presbyterian
health services research for 20 years. She also leads the NIMH-funded
Implementation Research Institute, a national training program for imple
mentation science for mental health services. Her peer-reviewed publica
tions address the quality of mental health services and the implementation
of evidence-based interventions. Among her books are Dissemination and
Implementation Research in Health: Translating Science to Practice, pub
lished in 2012 by Oxford University Press, and Developing Practice Guide
lines for Social Work Interventions: Issues, Methods, and Research Agenda,
published in 2003 by Columbia University Press. Dr. Proctor was a member
of NIMH’s National Advisory Council from 2007 to 2011 and served on
two National Advisory Committee workgroups—on research workforce
development and intervention research. She served as editor-in-chief of
the research journal of the National Association of Social Workers, Social
Work Research. Her awards include the Knee Wittman Award for Lifetime
Achievement in Health and Mental Health Practice, National Association
of Social Workers Foundation; the Distinguished Achievement Award from
the Society for Social Work and Research; the President’s Award for Excel
lence in Social Work Research, National Association of Social Workers;
and Mental Health Professional of the Year, Alliance for the Mentally Ill
of Metropolitan St. Louis. Along with university mentoring awards, she
received Washington University’s top honor, the Arthur Holly Compton
Faculty Achievement Award. She also was elected as a founding member of
the American Academy of Social Work and Social Welfare.
major impact on public policy and public health. The summary of her work
on depression appears in a special issue of the Annals of Epidemiology,
Triumphs in Epidemiology. Dr. Weissman received a Ph.D. in epidemiology
from Yale University School of Medicine in 1974.