Curran, 2012 - Effectiveness-Implementation Hybrid Designs
TABLE 1. Key Terms and Definitions* (continued)

(definition continued from previous page) …conduct and quality of implementation but in which data are not used during conduct of the study to influence the process; such data can be collected previous to the study, concurrently, or retrospectively.

Formative evaluation: A rigorous assessment approach, integral to the conduct of an action-oriented implementation study, designed to identify and make parallel use of potential and actual influences on the progress, quality, and potential sustainability of implementation; that is, formative evaluation data are used during the study to refine, improve, and evolve the implementation process and, in some cases, adapt an implementation and/or clinical intervention itself. Formative evaluation involves assessment previous to, concurrent with, and/or after actual implementation activities, and thus provides data both for immediate use to optimize a related study effort and for post hoc interpretation of findings.21

Summative evaluation: A rigorous assessment of the worth or impact of a "clinical" and/or implementation intervention/strategy, including, for example: patient-level health outcomes for a clinical intervention, such as symptoms or mortality; process or quality measures for an implementation strategy, such as adherence to a new practice; population-level health status or indices of system function for a system/organizational-level intervention. Note that an outcome in a summative evaluation may also be part of a process or formative evaluation, for example, rates of performing a clinical or health promotion behavior, used to track implementation progress during a trial. In fact, process-related outcomes in implementation studies are often considered intermediate process variables in "clinical" intervention studies that investigate health outcomes.

Indirect evidence: Clinical efficacy or effectiveness data from populations different from, but associated with, the one(s) that are the subject of a possible hybrid study. If there is no indirect evidence supporting a clinical intervention from an associated population, we do not recommend pursuing a hybrid design based on consensus guidelines or expert opinion.

*Many definitions based on the Quality Enhancement Research Initiative Glossary.22

TABLE 2. Design Characteristics of Clinical Effectiveness and Implementation Trials (Ideal Types)

Design Characteristic | Clinical Effectiveness Trial | Implementation Trial
Test | "Clinical" intervention | Implementation intervention or strategy
Typical unit of randomization | Patient, clinical unit | Provider, clinical unit, or system
Typical unit of analysis | Patient | Provider, clinical unit, or system
Summative outcomes | Health outcomes; process/quality measures typically considered intermediate; costs | Adoption/uptake of the "clinical" intervention; process measures/quality measures typically considered outcomes

…explicitly recognized the hybrid nature of their designs or acknowledged/described the trade-offs entailed in such designs. This is understandable, as there has been a dearth of explicit guidance on how such hybridization can be systematically attempted (Fig. 1).

HYBRID DESIGNS: CONCEPTUAL DEVELOPMENT

The origins of the hybrid designs proposed herein result from our collective experience over many years in writing, reviewing, and conducting research projects across the efficacy-effectiveness-implementation spectrum. Under the umbrella of the VA QUERI and its implementation frameworks,22 we formed a work group to explore the hybrid concept. The work group, consisting of the authors of the manuscript, brought expertise in implementation research, clinical effectiveness trials, cost effectiveness research, qualitative research, formative evaluation, evidence-based practice, and clinical expertise in psychiatry. The work group discussed the following issues: design elements, challenges, and potential areas or opportunities for blending effectiveness with implementation research designs to hasten the movement of interventions from effectiveness testing through implementation to public health impact. We initially articulated a definition of hybrid designs and tested it both logically and against existing studies. We then revised our definition and typology in light of this testing. We subsequently had the opportunity to make formal presentations to audiences of health services and implementation researchers and trainees, and incorporated their feedback.

The remainder of this study builds on our refined definition of hybrid design and its 3 types as articulated below, with the text amplifying and unpacking the information presented in the 2 supporting tables. Table 3 provides a summary of hybrid research aims, design considerations, and trade-offs to be considered within each hybrid type. Table 4 provides funded and published or presented examples of type 1, 2, and 3 hybrid designs, along with a comparison with a published clinical effectiveness randomized controlled trial and a nonhybrid implementation trial. Most of the examples of the proposed hybrids discussed in the text (except for 2) and all presented in Table 4 are from the VA. As noted above, we have approached this work based on our experience in a VA implementation research program, and most of the examples we know best come from the VA setting. In theory, the hybrid designs should not be more or less effective in the VA or any other setting (as was true for our 2 non-VA examples), and the recommended conditions we propose for their use are not exclusive to, or even more ideal in, a VA or other large/single-payer system.

Hybrid Type 1: Testing a clinical intervention while gathering information on its delivery during the effectiveness trial and/or on its potential for implementation in a real-world situation.
TABLE 4 (continued). Columns: Nonhybrid Effectiveness Trial (Bauer et al27–29); Hybrid Type 1 (Hagedorn et al30); Hybrid Type 2 (Brown et al31); Hybrid Type 3 (Kirchner et al32); Nonhybrid Implementation Trial (Lukas et al33)

Study aims

Clinical effectiveness aim/focus:
Bauer et al27–29: Randomized clinical effectiveness trial to determine the effect of the collaborative CCM on clinical, functional, quality of life, and economic outcomes in bipolar disorder
Hagedorn et al30: Randomized clinical effectiveness trial to test the effectiveness of providing incentives for negative urine screens on frequency of substance use, treatment attendance, and service utilization costs for patients attending treatment for alcohol and/or stimulant use disorders
Brown et al31: Controlled trial to evaluate the effectiveness (relative to usual care) of a chronic illness care model on clinical outcomes among persons with schizophrenia
Kirchner et al32: Evaluation of selected clinical outcomes after implementation of a model of integrated primary care and mental health
Lukas et al33: —
…outcomes* (row continued across the page break; recoverable cells):
…involvement, and supported employment
Organizational quality improvement and clinical capacity
Utilization; access; treatment appropriateness; sustainability
Lukas et al33: Sustainability of uptake

Organizational outcomes*:
Bauer et al27–29: Total direct treatment costs from the VHA economic perspective
Hagedorn et al30: Total direct treatment costs from the VHA economic perspective
Brown et al31: Treatment costs, clinician burnout
Kirchner et al32: —
Lukas et al33: —

Key process (PE) or FE measures:
Bauer et al27–29: Not a focus of research; limited to CCM fidelity monitoring and rectification throughout the trial
Hagedorn et al30: Process evaluation of: level of organizational readiness; barriers assessment of: patient receptivity; sustainability (patient, clinic, and medical center level)
Brown et al31: Formative evaluation preimplementation and postimplementation: organizational readiness for change (pre); stakeholder knowledge, attitudes, and beliefs; continuous documentation of organizational change, functioning, and external facilitators as an implementation intervention to tailor implementation (data fed back to providers and managers)
Kirchner et al32: Formative evaluation used by the study staff to tailor the facilitation strategy; preimplementation and during-implementation organizational assessments; stakeholder perspectives of OTM; during implementation, assessment of key uptake measures with feedback to facilitators
Lukas et al33: Preimplementation assessment including site visits at 4- to 6-month intervals to monitor and facilitate; interview data regarding characteristics of implementation; network-level support activities; implementation ratings completed by research site visit teams (facility level)

*Could be either a clinical intervention-related outcome or an implementation outcome depending on the a priori focus of the study.
CCM indicates chronic care model; FE, formative evaluation; OTM, Organizational Transformation Model; VA, Veterans Affairs; VISN, Veterans Integrated Services Network; VHA, Veterans Health Administration.
…achieved in a sequential "intervention-then-preliminary-implementation" study strategy: What are potential barriers and facilitators to "real-world" implementation of the intervention? What problems were associated with delivering the intervention during the clinical effectiveness trial, and how might they translate or not to real-world implementation? What potential modifications to the clinical intervention could be made to maximize implementation? What potential implementation strategies appear promising?

We recommend that the above types of questions be posed to representatives of relevant stakeholder groups, for example, patients, providers, and administrators. Process evaluation data can also help explain/provide context for summative findings from the clinical effectiveness trial.

Recommended conditions for use: Hybrid 1 designs should be considered under the following conditions: (1) there should be strong face validity for the clinical intervention that would support applicability to the new setting, population, or delivery method in question; (2) there should be a strong base of at least indirect evidence (eg, data from different but associated populations) for the intervention that would support applicability to the new setting, population, or delivery method in question; (3) there should be minimal risk associated with the intervention, including both its direct risk and any indirect risk through replacement of a known adequate intervention. These conditions, to varying degrees, are often found in "research-practice networks" such as the National Institute on Drug Abuse clinical trials network, and Hybrid 1 designs should be particularly attractive for these partnerships.

In addition, there are conditions under which a Hybrid 1 study would seem premature or less feasible, for example, in clinical effectiveness trials with major safety issues, complex comparative effectiveness trials, and "pilot" or very early clinical effectiveness trials. In general, however, we argue that a Hybrid 1 is particularly "low risk" with the potential for high reward, given that the implementation research portion is essentially an "add-on" to a routinely designed and powered clinical trial. When moving into the next hybrid type, there are more complexities to consider and trade-offs to be weighed.

Examples: As summarized in Table 4, a recent Hybrid Type 1 study by Hagedorn et al30 included a randomized clinical effectiveness trial of contingency management with a mixed-method, multistakeholder process evaluation of the delivery of the intervention. In another example (not found in Table 4), the National Institute of Mental Health-funded "Coordinated Anxiety Learning and Management (CALM) study"34 tested the clinical effectiveness of a collaborative care intervention for anxiety disorders while also conducting a multistakeholder qualitative process evaluation. Key research questions in the CALM process evaluation were: (1) what were the facilitators/barriers to delivering the CALM intervention? (2) what were the facilitators/barriers to sustaining the CALM intervention after the study was completed? (3) how could the CALM intervention be changed to improve adoption and sustainability?

Hybrid Type 2: Simultaneous testing of a clinical intervention and an implementation intervention/strategy.

Rationale: This hybrid type is a more direct blending of clinical effectiveness and implementation research aims in support of more rapid translation. In this case, interventions in both the clinical and implementation spheres are tested simultaneously. It is important to note that we are using the term "test" in a liberal manner, meaning that the interventions in question need not all be tested with randomized, strongly powered designs. What makes for a "test" of an intervention here is that at least 1 outcome measure is being used and that at least 1 related hypothesis, however preliminary, is being studied. The nature of randomizations/comparisons and power can vary depending on research needs and conditions. Given the reality of research funding limits, it is likely that some design/power compromises will be necessary in one or both of the intervention tests; however, in the case of conditions favorable to this hybrid type (see below), such design compromises need not derail progress toward addressing translation gaps/needs in the literature.

This hybrid type is also motivated by the recognition that conventional effectiveness studies often yield estimates of effectiveness that are significantly different (worse) from the estimates of efficacy studies, because the effectiveness study is often conducted in "worst case" conditions; that is, with little or no research team support of delivery/implementation, without clear understanding of barriers to fidelity, and without efforts to overcome those barriers. In a Hybrid Type 2 study, where an implementation intervention/strategy of some kind is being tested alongside and in support of a clinical intervention under study, it is possible to create and study a "medium case"/pragmatic set of delivery/implementation conditions versus "best" or "worst" case conditions. Therefore, while speeding translation, it is possible with Hybrid Type 2 designs to provide more valid estimates of potential clinical effectiveness.

Recommended conditions for use: The following conditions are recommended to consider a Hybrid Type 2: (1) there should be strong face validity for the clinical and implementation interventions/strategies that would support applicability to the new setting, population, or delivery/implementation methods in question; (2) there should be at least a strong base of indirect evidence (defined above) for the clinical and implementation interventions/strategies that would support applicability to the new setting, population, or delivery/implementation method in question; (3) there should be minimal risk associated with the clinical and implementation interventions/strategies, including both the direct risk of the interventions and any indirect risk through replacement of known adequate interventions; (4) there should be "implementation momentum" within the clinical system and/or the literature toward routine adoption of the clinical intervention in question. The momentum could come from a number of possible scenarios or factors, for example, strong "indirect" efficacy or effectiveness data; advocacy from patient groups, providers, or lawmakers (often in the case of severe clinical consequences/risks from nonaction); and/or health system administrators seeking rapid uptake of an intervention based on the above or other factors, for example, costs. Evidence of such momentum could come from published reports, official policies, or even from discussions with key stakeholders; (5) there should be reasonable expectations that the implementation intervention/strategy being tested is supportable in the clinical and organizational context under study; (6) there is reason to gather more data on the effectiveness of the clinical intervention (eg, it is being provided in a different format or novel setting).

In addition, we have identified other conditions that might be considered "ideal" for this hybrid type. These are: (i) the strong indirect clinical evidence (see #2 above) comes from either a population or clinical setting reasonably close to the population or setting in question, thereby not necessitating a large clinical effectiveness trial. If the clinical effectiveness study can be of moderate size, additional budget and effort can be directed toward the implementation intervention/strategy and its evaluation; (ii) the clinical intervention and/or implementation intervention/strategy to be tested are not overly complex in terms of the changes necessary within the clinic/organization to support them; that is, the testing of clinical and implementation interventions/strategies within the same providers/clinics/systems is not perceived to be, nor actually is, overly taxing to participating stakeholders.

Examples: As summarized in Table 4, the Hybrid Type 2 "Enhancing Quality and Utilization in Psychosis study"31 was an 8-site VA study where 4 sites were randomized to a chronic illness care model for schizophrenia (clinical/delivery system intervention) supported by an implementation strategy (facilitation, quality improvement teams, quality reports, etc.). The study gathered clinical and implementation outcome data at all sites. The investigators recognized a need to test implementation strategies to support recovery-oriented interventions for persons with schizophrenia, as many guidelines and VA directives were encouraging the use of recovery-oriented programs even in the case of less than ideal effectiveness research support. In another example (not in Table 4), the "HIV Translating Initiatives for Depression into Effective Solutions study"35 randomized human immunodeficiency virus (HIV) patients to a clinical/delivery system intervention (collaborative care for depression) at 3 VA clinics, whereas the same clinics also participated in a nonrandomized, exploratory implementation strategy. Although it was clear to the investigators that a patient-level randomized trial of the effectiveness of depression collaborative care in HIV patients was needed (no trials yet in the HIV setting), they also recognized that, given the strong evidence for depression collaborative care in primary care settings (similar in scope to many HIV clinics) and momentum in the system for its uptake, it was also timely to use the study to explore an implementation strategy in this setting as well. An additional Hybrid Type 2 design variant (also not in Table 4) comes from the "Healthy Bones" study,36 where both patient-directed and physician-directed interventions for fracture prevention were simultaneously tested in a 2 x 2 factorial randomized controlled trial. On the basis of the observation that management of osteoporosis and fall prevention was suboptimal and that both patient and provider behaviors needed improvement, the study investigators used this design to simultaneously test patient and provider education interventions on a range of outcomes. Although the "clinical" (patient education) and "implementation" (academic detailing) interventions tested were certainly not as complex as some of the other examples, perhaps it was this simplicity that allowed for the large factorial design.

Hybrid Type 3: Testing an implementation intervention/strategy while observing/gathering information on the clinical intervention and related outcomes.

Rationale: A "pure," nonhybrid implementation study is conducted after an adequate body of evidence has accumulated that clearly establishes the effectiveness of a clinical intervention, and thus clearly supports the appropriateness of costly efforts to try to facilitate better implementation. Sometimes, however, we can and do proceed with implementation studies without completion of the full, or at times even a modest, portfolio of effectiveness studies beforehand. In such cases, it is common that a prevailing health policy dictates/encourages implementation of a clinical intervention that is, to varying degrees, still in question from an effectiveness perspective. Similar to the cases discussed above with reference to "momentum for implementation," the situations where health systems actually encourage or attempt implementation of a clinical intervention without the desired clinical effectiveness database could include the presence of respected consensus guidelines; strong "indirect" efficacy or effectiveness data; advocacy from patient groups, providers, or lawmakers (often in the case of the current state of practice and severe clinical consequences/risks from nonaction); and administrators seeking to reduce costs by implementing a cheaper clinical alternative. In these cases, it is, therefore, important to proactively include resources to collect evidence of clinical effectiveness.

In addition, Hybrid Type 3 designs are indicated if it is suspected that the clinical intervention effects might be susceptible to change during implementation in a new setting or under conditions less controlled than in effectiveness trials. Such changes in clinical intervention effectiveness could represent either a vulnerability or an enhancement under implementation conditions compared with effects seen during clinical trials.

Recommended conditions for use: The following conditions are recommended to consider a Hybrid Type 3: (1) there should be strong face validity for the clinical and implementation interventions/strategies that would support generalizability to the new setting, population, or delivery/implementation methods in question; (2) there should be a strong base of indirect evidence (defined above) for the clinical and implementation interventions/strategies that would support generalizability to the new setting, population, or delivery/implementation method in question; (3) there should be minimal risk associated with the clinical and implementation interventions/strategies, including both the direct risk of the interventions and any indirect risk through replacement of known adequate interventions; (4) there should be strong "implementation momentum" in the form of actual mandates or strong encouragement within the clinical system and/or the literature toward routine adoption of the clinical intervention in question; (5) there should be evidence that the implementation intervention/strategy being tested is feasible and supportable in the clinical and organizational context under study.
Examples: As summarized in Table 4, the Hybrid Type 3 "Blended Facilitation" study (Kirchner et al32) is a 16-site implementation trial in the VA, with 8 sites receiving an implementation facilitation strategy consisting of internal and external facilitation (plus numerous implementation tools and aids) to support implementation of integrated primary care and mental health. Internal (to the program sites) and external facilitators use a variety of strategies to facilitate implementation, including academic detailing, provider education, local change agent participation, stakeholder engagement at all levels of the organization, performance monitoring and feedback, formative evaluation, and marketing. The comparison sites receive a dissemination/implementation strategy provided by the VA nationally, as integrated primary care mental health is an officially mandated "best practice." Some components of what is being "rolled out" nationally do not have a strong clinical effectiveness research base, for example, use of "generic" mental health care managers (the data support only care managers providing service for individual disorders like depression or specific clusters of disorders like anxiety disorders). Therefore, while the main outcomes for the study are implementation focused, the study is also collecting clinical data from the VA's automated medical record where possible (eg, scores on routine depression screeners). In another example, the "Implementation of the Hospital to Home (H2H) Heart Failure Initiative" study (Heidenreich et al37) is testing an implementation strategy to support uptake of the H2H Initiative (a package of clinical interventions) in multiple facilities while also collecting clinical outcome data. The study randomized 122 VA facilities to intervention and comparison conditions, with intervention facilities receiving a range of implementation interventions, including web-based "kick-off" meetings (education and marketing), toolkit dissemination, and roles for local opinion leaders. Comparison programs are converting to intervention programs after 6 months. Although key outcome variables related to uptake of H2H are defined for the implementation component (eg, enrollment rates and other performance measures), the study is also collecting and comparing key patient outcome data (mortality and rehospitalization) across intervention and comparison sites.

DISCUSSION

As we have explored relevant work in the VA and the implementation field in general, we have seen nonsystematic efforts at blending effectiveness and implementation trial elements. Through this hybrid framework we offer guidance to the field and hopefully provide assistance to investigators searching for identifiable design solutions. In addition, we hope to stimulate further thinking and to encourage new design combinations. The hybrid definition and typology offered here must be considered constructs still in evolution. It is important to note that the "boundaries" we have drawn around these hybrid types are not intended to be rigid, and that future work should refine and extend what has been presented here. In addition, we recognize that some of the "recommended conditions for use" of the hybrids are subjective (eg, current definitions of "strong face validity" and "indirect evidence") and that they will need to be reasoned and argued by investigators on a case-by-case basis, at least until additional work refines the definitions and conditions.

Although traditional clinical effectiveness and implementation trials are likely to remain the most common approach to moving a clinical intervention from efficacy research to public health impact, judicious use of the proposed hybrid designs could speed the translation of research findings into routine practice. However, despite their potential benefits, we recommend that certain conditions first be met; and, even when desirable, we recognize that hybrids might not always be feasible or affordable within traditional research budget limits. We recommend that future hybrid research seek to document, in both quantitative and qualitative ways, the extent and manner in which translation has been sped. As of now, we can only say that these hybrids have the potential to speed and improve translation. Further, the relative speed of translation is not usually included in traditional cost-effectiveness analyses, and it would be interesting to explore the potential benefits of hybrid designs from this perspective.

In considering use of a hybrid, it is important to acknowledge the potential "ecological" challenges associated with pursuing such designs. First, researchers from clinical and implementation research backgrounds often do not share concepts, constructs, and vocabulary; more difficult, sometimes the vocabulary is the same but the meanings are different. This makes it somewhat difficult for researchers from different traditions to communicate efficiently and effectively, which could serve as a barrier to collaboration, and perhaps also impede comprehension during research proposal and manuscript reviews. More specifically, lack of reviewer expertise on grant review panels and among journal reviewers and editorial boards relative to emerging concepts and innovations in the field of implementation science can have an inhibitory effect on the development, implementation, and reporting of hybrid studies. Review of hybrid designs requires at least an appreciation of the complexities of balancing internal and external validity considerations in such trials, as well as the design trade-offs inherent in structuring such complex protocols and related budgetary needs. Review panels must also, of course, have sufficient technical expertise across members so that, in aggregate, both the clinical intervention and the implementation aspects of the study can be effectively evaluated.

Finally, the same appreciation and expertise required of journal and grant review bodies is required on promotion and tenure committees, although as implementation research and hybrid designs become more widely appreciated, this lack of expertise will diminish, as it has for effectiveness-oriented clinical trials.4,38-40 Hybrid studies are typically more complex to execute and thus may be relatively "high risk"; however, the successfully implemented hybrid study will likely pay dividends across both the (a priori) clinical and implementation foci, thus yielding rich data that will be of use both scientifically and in terms of public health impact.

The impetus to blend or explicitly link research traditions in the service of accelerating scientific progress and enhancing public health impact is not at all new,4,38-40
and the idea of blending clinical effectiveness and implementation research elements is also not new. As the examples above indicate, "informal" hybrid design trials are already being conducted and reported. The function of this study is both to help the field better organize its thinking and design deliberations concerning these concepts, which we felt were not yet clearly articulated, and to stimulate further development. The "ecological" challenges noted above will not endure. They can be overcome, like many diffusion of innovation challenges, with education and committed leadership over time.

REFERENCES
1. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
2. Grol R, Wensing M, Eccles M. Improving Patient Care: The Implementation of Change in Clinical Practice. Toronto: Elsevier; 2005.
3. Stetler CB, McQueen L, Demakis J, et al. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series. Implement Sci. 2008;3:30.
4. Glasgow RE, Lichtenstein E, Marcus AC. Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93:1261-1267.
5. Shojania KG, Ranji SR, Shaw LK, et al. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies (Vol 2: Diabetes Care). Rockville, MD: Agency for Healthcare Research and Quality; 2004.
6. Bravata DM, Sundaram V, Lewis R, et al. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies (Vol 5: Asthma Care). Rockville, MD: Agency for Healthcare Research and Quality; 2007.
7. Walsh J, McDonald KM, Shojania KG, et al. Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies (Vol 3: Hypertension Care). Rockville, MD: Agency for Healthcare Research and Quality; 2005.
8. Carroll KM, Rounsaville BJ. Bridging the gap: a hybrid model to link efficacy and effectiveness research in substance abuse treatment.
17. Tunis SR, Stryer DB, Clancy CM. Increasing the value of clinical research for decision making in clinical and health policy. JAMA. 2003;290:1624-1632.
18. March JS, Silva SG, Compton S, et al. The case for practical clinical trials in psychiatry. Am J Psychiatry. 2005;162:836-846.
19. Wells KB. Treatment research at the crossroads: the scientific interface of clinical trials and effectiveness research. Am J Psychiatry. 1999;156:5-10.
20. Proctor EK, Landsverk J, Aarons G, et al. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36:24-34.
21. Stetler CB, Legro MW, Wallace CM, et al. The role of formative evaluation in implementation research and the QUERI experience. J Gen Intern Med. 2006;21:S1-S8.
22. Stetler CB, Mittman BS, Francis J. Overview of the VA Quality Enhancement Research Initiative (QUERI) and QUERI theme articles: QUERI Series. Implement Sci. 2008;3:8.
23. Demakis JG, McQueen L, Kizer KW, et al. Quality Enhancement Research Initiative (QUERI): a collaboration between research and clinical practice. Med Care. 2000;38:17-25.
24. Bauer MS, Williford WO, Dawson EE, et al. Principles of effectiveness trials and their implementation in VA Cooperative Study #430: reducing the efficacy-effectiveness gap in bipolar disorder. J Affect Disord. 2001;67:61-78.
25. Sox HC. Comparative effectiveness research: a progress report. Ann Intern Med. 2010;153:469-472.
26. Atkins D. QUERI and implementation research: emerging from adolescence into adulthood: QUERI Series. Implement Sci. 2009;4:12.
27. Bauer MS, McBride L, Williford WO, et al, for the CSP #430 Study Team. Collaborative care for bipolar disorder, part I: intervention and implementation in a randomized effectiveness trial. Psychiatr Serv. 2006;57:927-936.
28. Bauer MS, McBride L, Williford WO, et al, for the CSP #430 Study Team. Collaborative care for bipolar disorder, part II: impact on clinical outcome, function, and costs. Psychiatr Serv. 2006;57:937-945.
29. Bauer MS, Biswas K, Kilbourne AM. Enhancing multiyear guideline concordance for bipolar disorder through collaborative care. Am J Psychiatry. 2009;166:1244-1250.
30. Hagedorn H, Noorbaloochi S, Rimmele C, et al. The Rewarding Early Abstinence and Treatment Participation (REAP) Study. Presented at Enhancing Implementation Science in VA; Denver, CO; 2010.
Psychiatr Serv. 2003;54:333–339. 31. Brown AH, Cohen AN, Chinman MJ, et al. EQUIP: implementing
9. Ranji SR, Steinman MA, Sundaram V, et al. Closing the Quality Gap: A chronic care principles and applying formative evaluation methods
Critical Analysis of Quality Improvement Strategies (Vol 4: Antibiotic to improve care for schizophrenia: QUERI Series. Implement Sci.
Prescribing Behavior). Rockville, MD: Agency for Healthcare Research 2008;3:9.
and Quality; 2006. 32. Kirchner JE, Ritchie M, Curran G, et al. “Facilitating: Design, Using,
10. Freemantle N, Eccles M, Wood J, et al. A randomized trial of evidence- and Evaluating a Facilitation Strategy” Enhancing Implementation
based outreach (EBOR) rationale and design. Control Clin Trials. Science meeting sponsored by Department of Veterans Affairs Quality
1999;20:479–492. Enhancement Research Initiative, Phoenix: AZ; 2011.
11. Cochrane LJ, Olson CA, Murray S, et al. Gaps between knowing and 33. Lukas CV, Engle RL, Holmes SK, et al. Strengthening organizations to
doing: understanding and assessing the barriers to optimal health care. implement evidence-based clinical practices. Healthcare Manag Rev.
J Contin Educ Health Prof. 2007;27:94–102. 2010;35:235–245.
12. Fruth SJ, Veld RD, Despos CA, et al. The influence of a topic-specific, 34. Roy-Byrne P, Craske MG, Sullivan G, et al. Delivery of evidence-based
research-based presentation on physical therapists’ beliefs and practices treatment for multiple anxiety disorders in primary care. JAMA.
regarding evidence-based practice. Physiother Theory Pract. 2010;26: 2010;303:1921–1928.
537–557. 35. Pyne JM, Fortney JC, Curran GC, et al. Effectiveness of collaborative
13. Caban MD, Rand CS, Powe NR, et al. Why don’t physicians follow care for depression in human immunodeficiency virus clinics. Arch
clinical practice guidelines? A framework for improvement. JAMA. Intern Med. 2011;171:23–31.
1999;282:1458–1465. 36. Solomon DH, Brookhart MA, Polinski J, et al. Osteoporosis action:
14. Reschovsky JD, Hadley J, Landon BE. Effects of compensation methods design of the Healthy Bones project trial. Contemp Clin Trials.
and physician group structure on physicians’ perceived incentives to 2005;26:78–94.
alter services to patients. Health Serv Res. 2006;41:1200–1220. 37. Heidenreich et al. http://www.queri.research.va.gov/chf/products/h2h.
15. Racine DP. Reliable effectiveness: a theory on sustaining and 38. Zerhouni E. The NIH Roadmap. Science. 2003;302:63–72.
replicating worthwhile innovations. Adm Policy Ment Health. 2006; 39. Wells K, Miranda J, Bruce ML, et al. Bridging community intervention
33:356–387. and mental health services research. Am J Psychiatry. 2004;161:
16. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of 955–963.
health services research findings into practice: a consolidated frame- 40. Brown CH, Ten Have TR, Jo B, et al. Adaptive designs for randomized
work for advancing implementation science. Implement Sci. 2009;4:50. trials in public health. Annu Rev Public Health. 2009;30:1–25.