Booth 2006
Clear and present questions: formulating questions for evidence based practice
Andrew Booth
Article information:
To cite this document:
Andrew Booth, (2006),"Clear and present questions: formulating questions for evidence based practice",
Library Hi Tech, Vol. 24 Iss 3 pp. 355 - 368
Permanent link to this document:
http://dx.doi.org/10.1108/07378830610692127
References: this document contains references to 47 other documents.
Abstract
Purpose – The paper seeks to provide an overview and update of thinking in relation to the theory
and practice of formulation of answerable research questions within evidence based information
practice.
Introduction
To be able to ask a question clearly is two-thirds of the way to getting it answered (John
Ruskin).
How many times in a working day do you question the value of established library
practice? What questions do you and your colleagues ask when contemplating the
introduction of some new technology or some innovative service? It is true that, on
occasions, lack of time or other logistic constraints will conspire to cause what we
might label an “evaluation bypass”. On such occasions we may continue to perform
some task or action even though we are not truly convinced of its efficacy or we may
move to uncritical adoption of a new technology or procedure. Hopefully such
instances are the exception rather than the rule. Nevertheless we must acknowledge
that it does take both time and effort to “start stopping” or to “stop starting” (Gray,
Thus it becomes critical that all information professionals become efficient at
identifying, formulating and addressing relevant questions from their own practice.
Within other domains of evidence-based practice, most notably evidence-based
healthcare, a considerable knowledge base has been built up around the formulation of
questions. We know, for example, that on average physicians ask two questions for
every three patients seen (Booth, 2005). We know too that a large percentage of these
questions (30-60 per cent) will go unanswered – in some cases because the one asking
does not believe that the answer is to be found (Booth, 2005). Frequently questions will
be answered with reference to colleagues or from outdated textbooks (Booth, 2005). It
would be reassuring to believe that, as a profession, our knowledge of the research
literature in librarianship is such that we can correctly anticipate whether the answers
to our questions from day-to-day practice do exist. The truth is that very few of us have
regular and intensive contact with our own evidence base. This situation is further
aggravated by the fact that many of our activities lie within domains populated by
other research literatures, for example management, education, marketing and
computer science (Crumley and Koufogiannakis, 2002).
also on high volume, high impact, high risk, etc. (which, of course, may ultimately have
large cost implications). It has also been noted that identifying the “most important
research questions” risks conflating those that have yet to be answered (a research
agenda issue) with those that have been answered satisfactorily but which are yet to
impact upon practitioners (a dissemination issue) (Booth, 2001a, b).
The goal of this primary stage, variously called "focusing or formulating your
question" (Richardson et al., 1995), is to convert an imprecise, possibly vaguely
expressed, information need into an "answerable question". This is mirrored within our
own profession where Crumley and Koufogiannakis state:
The first and most important step in enabling librarians to practice their profession in an
evidence-based manner, is to ensure they know how to ask well-structured questions; a much
harder task than one might initially suppose – having a well-built question focuses your
search for information (Crumley and Koufogiannakis, 2002).
As implied by this format, the level of prior knowledge required to pose such a
question is much greater than that required for a background question. Our choice
between such alternatives may be determined (assuming the existence of suitable
studies) from research studies published in the journal literature. Those who are more
experienced in a profession are most likely to ask foreground questions unless, as
previously mentioned, they face a situation or service that they have not previously
encountered. The existence of two or more alternatives suggests that some form of
comparative study will be most useful in addressing this question. Of course, where a
straight head-to-head comparison does not exist we may have to look at studies where
choice A and choice B are compared separately to a third alternative or arrive at some
crude “balance sheet” of likely advantages and disadvantages for each alternative.
Types of questions
Efforts to classify clinical questions into question types (Gorman and Helfand, 1995;
Barrie and Ward, 1997) initially predated and, subsequently, ran parallel to work on
question formulation within evidence based healthcare. In contrast, the early days of
evidence based information practice have seen attempts by Eldredge (2002a) to
disaggregate a wealth of practitioner questions into three main question types:
(1) prediction questions;
(2) intervention questions; and
(3) exploration questions.
presence of this “level playing field” at the beginning of a study makes it easier to
attribute any changes taking place to the relative effects of the intervention and not to any
pre-existing factors. However this is achieved at the cost of having to prescribe certain
user behaviours in the interests of conducting an experimental, rather than observational,
study. So, for example, users in the "control group" may be asked to give prior consent
to receiving training six months later than users in the corresponding experimental group. Again
Eldredge (2002a) has identified such intervention questions as:
. Does weeding some classification ranges in a monographs collection result in higher usage than the unweeded but otherwise similar ranges?
. Which methods of teaching search skills result in clinicians searching for their own evidence in patient care?
. Do students learn searching skills more effectively from librarians or teaching faculty?
Exploration questions typically seek to answer the question “why?”. As such they
frequently employ qualitative research designs. Factors mediating the intended effects
of some new service or training course are often grounded in variations in human
attitudes, opinions, feelings, thoughts or behaviours. As such they cannot be easily
explored in a large quantitative study where there is a predefined assumption, or
hypothesis, of how an intervention might work. Single studies for answering
exploration questions may be used to generate a hypothesis for subsequent exploration
in a cohort or randomized controlled study. Alternatively they may be used to explore
an effect that has been demonstrated in a quantitative study but which has not been
satisfactorily explained. For example, trainers frequently report that, while initial
training usually reduces anxiety, perversely further training may increase anxiety.
Only a qualitative study would be able to explore why this might occur – perhaps
revealing that a certain level of training may make readily apparent how complex it is
to obtain a full understanding of the subject being taught. Thus, for a student with no
knowledge of database searching it may be reassuring to learn how to search the ERIC
database. However, once they have obtained this valuable insight if we continue to
cover all the other databases available within education their initial anxiety may be
replaced by a different yet related concern – a qualitative study would reveal if this is
the case. Of course qualitative research is not, in itself, a single research design but
comprises a toolbox of such methods as "focus groups, ethnographic studies,
naturalistic observations, in-depth interviewing, Delphi techniques, nominal group
processes, and historical analyses". Eldredge (2002a) has again identified such
exploration questions as:
. Why do potential users, who are presently non-users, not use their library?
. Why do some users prefer certain information resources over equally relevant information resources?
. Do librarians improve or worsen users’ perceptions of information overload?
Of course, reducing library practitioner questions into these three discrete categories
may have the unintentional effect of blurring the wealth of information practice that
may be subject to a questioning approach. Exponents of evidence-based healthcare (the
Evidence Based Medicine Working Group, 1992) devised a mixed typology including
both question types (for example, diagnosis, (a)etiology, prognosis and therapy) and
study types (e.g. economic analysis, systematic review, practice guideline). A more
purist typology for evidence based information practice (Booth, 2004b) might include:
. information needs;
. information behaviour;
. causation;
. information delivery;
. use studies;
. interventions to promote uptake and utilisation of resources;
. information retrieval;
. information presentation;
. information impact;
. cost-effectiveness/cost-benefit; and
. service organisation and management.
Booth and Brice (2004b) use this taxonomy as a strategy for developing the CRISTAL
series of user guides, based on question types as opposed to study types, to enable
librarians to ask meaningful questions of published research. Currently such guides
exist for category 1 (Information needs) and category 5 (Use studies).
Crumley and Koufogiannakis (2002) have explored in much detail six domains of
library practice and their corresponding evidence base:
(1) Reference/enquiries – providing service and access to information that meets
the needs of library users.
(2) Education – finding teaching methods and strategies to educate users about
library resources and how to improve their research skills.
(3) Collections – building a high-quality collection of print and electronic materials
that is useful, cost-effective and meets the users' needs.
(4) Management – managing people and resources within an organization.
(5) Information access and retrieval – creating better systems and methods for
information retrieval and access.
(6) Marketing/promotion – promoting the profession, the library and its services to
both users and non-users.
They suggest that matching librarianship questions to one of the above domains, or a
subsequently added domain of professional issues (Koufogiannakis et al., 2004), can:
. help librarians decide the appropriate search terms to answer that type of question;
. determine the sources to be used to answer these questions; and
. allow librarians to focus upon what they are really asking, rather than permitting the question to snowball in many different directions.
Precise questions have been linked to more efficient searching for the needed evidence
(Snowball, 1997; Eldredge, 2000b): “Fuzzy questions tend to lead to fuzzy answers”
(Oxman and Guyatt, 1988). Additionally, as many initial questions lead to other
questions, the question formulation process is an iterative activity (Eldredge, 2000a).
Practitioners' questions versus researchers' questions
There is a frequently reported mismatch between questions generated by practitioners
and those addressed by researchers. For example, Farmer and Williams (1999) asked:
“Why are practitioners’ research priorities so much more concrete than those of the
funding bodies?” – a theme echoed by Dwyer (1999) when she described the
practitioner’s “focus on answering practical questions”. While the initiative of Jonathan
Eldredge and his colleagues on the Medical Library Association's Evidence-Based
Librarianship Implementation Committee (EBLIC) in asking practitioners to identify
the “most important research questions facing the profession” (Eldredge, 2001) is to be
commended, there is a danger that such lists are dominated by questions around new
technologies and interventions ("what we know we don't know") rather than central
practices and procedures (“what we don’t know we don’t know”).
Such a complaint is by no means unique to the information sector with frequent
tensions between demand-led and more strategic approaches to research priorities.
Within a health service context it has been bemoaned that question answering focuses
on the margins of the health services where new technologies make a peripheral
contribution rather than on the less glamorous core business. This situation is
compounded by the fact that researchers, and indeed research funders, are more likely
to be beguiled by keyhole surgery and neurological scans than by bandaging and
injection administration. What are our equivalents of bandaging and giving injections?
– things that we do every day without questioning our procedures and practice. These
should be the focus for our clear and present questions.
Bexon (2005), in a recent conference presentation, concludes that while
evidence-based librarianship is feasible “there needs to be a greater emphasis on
identifying relevant questions and applying the appraised evidence to real life”. In this
connection Crumley and Koufogiannakis (2002) provide the actionable
recommendation that “librarians should consider keeping an online list of questions
that have already been studied and those that need to be explored, similar to the trial
registries in medicine”.
Conclusion
Formulating the question is fundamental to evidence-based practice, irrespective of the
discipline involved. Question formulation, and indeed question answering, is a key
competency for our profession. Our practice may be informed both by research within
information science and by wider developments in evidence-based practice. Much
remains to be done in constructing a comprehensive typology of question types and
identifying priorities for primary research and for secondary literature review. Once we
have established the extent to which questions generated by information practitioners
have already been addressed the way will be clear for tackling the outstanding
questions that currently present themselves for our attention.
References
Barrie, A.R. and Ward, A.M. (1997), “Questioning behaviour in general practice: a pragmatic
study”, British Medical Journal, Vol. 315, pp. 1512-5.
Bexon, N. (2005), “Evidence-based librarianship can be adopted by practising librarians, but
there needs to be a greater emphasis on identifying relevant questions and applying the
appraised evidence to real life", paper presented at the Implementation of Quality Systems
and Certification of Biomedical Libraries, EAHIL Workshop, Palermo, 23-25 June.
Booth, A. (2000), “Formulating the question”, in Booth, A. and Walton, G. (Eds), Managing
Knowledge in Health Services, Library Association, London, pp. 197-206.
Booth, A. (2001a), “Research column: turning research priorities into answerable questions”,
Health Information & Libraries Journal, Vol. 18 No. 2, pp. 130-2.
Booth, A. (2001b), "Research column: asking questions, knowing answers", Health Information &
Libraries Journal, Vol. 18 No. 2, pp. 238-40.
Booth, A. (2004a), “Evaluating your performance”, in Booth, A. and Brice, A. (Eds), Evidence
Based Practice for Information Professionals, Facet Publishing, London, pp. 127-37.
Booth, A. (2004b), “Formulating answerable questions”, in Booth, A. and Brice, A. (Eds),
Evidence Based Practice for Information Professionals, Facet Publishing, London,
pp. 61-70.
Booth, A. and Brice, A. (2004a), “Why evidence-based information practice?”, in Booth, A. and
Brice, A. (Eds), Evidence Based Practice for Information Professionals, Facet Publishing,
London, pp. 1-12.
Booth, A. and Brice, A. (2004b), “Appraising the evidence”, in Booth, A. and Brice, A. (Eds),
Evidence Based Practice for Information Professionals, Facet Publishing, London,
pp. 104-18.
Booth, A. (2005), “The body in questions”, Health Information & Libraries Journal, Vol. 22 No. 2,
pp. 150-5.
Booth, A., O’Rourke, A.J. and Ford, N.J. (2000), “Structuring the pre-search reference interview: a
useful technique for handling clinical questions”, Bulletin of the Medical Library
Association, Vol. 88 No. 3, pp. 239-46.
Cabell, C.H., Schardt, C., Sanders, L., Corey, G.R. and Keitz, S.A. (2001), “Resident utilization of
information technology”, Journal of General Internal Medicine, Vol. 16 No. 12, pp. 838-44.
Crumley, E. and Koufogiannakis, D. (2002), “Developing evidence-based librarianship: practical
steps for implementation”, Health Information & Libraries Journal, Vol. 19 No. 2, pp. 61-70.
Dwyer, M.A. (1999), “Delphi survey of research priorities and identified areas for collaborative
research in health sector library and information services UK”, Health Libraries Review,
Vol. 16 No. 3, pp. 174-91.
Eldredge, J.D. (2000a), “Evidence-based librarianship: an overview”, Bulletin of the Medical
Library Association, Vol. 88 No. 4, pp. 289-302.
Eldredge, J.D. (2000b), “Evidence-based librarianship: formulating EBL questions”, Bibliotheca
Medica Canadiana, Vol. 22 No. 2, pp. 74-7.
Eldredge, J.D. (2001), “The most relevant and answerable research questions facing the practice
of health sciences librarianship", Hypothesis, Vol. 15 No. 1, pp. 3-5, available at:
http://168.17.205.219/mla/research/Hypo2001v.15%20no.1.pdf
Eldredge, J.D. (2002a), “Evidence-based librarianship levels of evidence”, Hypothesis, Vol. 16
No. 3, pp. 10-13.
Eldredge, J.D. (2002b), “Cohort studies in health sciences librarianship”, Journal of the Medical
Library Association, Vol. 90 No. 4, pp. 380-92.
Eldredge, J.D. (2003), “The randomised controlled trial design: unrecognized opportunities for
health sciences librarianship”, Health Information & Libraries Journal, Vol. 20,
Supplement 1, pp. 34-44.
Evidence Based Medicine Working Group (1992), "Evidence based medicine: a new approach to
teaching the practice of medicine", Journal of the American Medical Association, Vol. 268
No. 17, pp. 2420-5.
Farmer, J. and Williams, D. (1999), “Are research priorities a priority for research?”, Health
Libraries Review, Vol. 16 No. 1, pp. 56-60.
Flemming, K. (1998), “EBN notebook. Asking answerable questions”, Evidence-Based Nursing,
Vol. 1 No. 2, pp. 36-7.
Geddes, J. (1999), “Asking structured and focused clinical questions: essential first steps of
evidence-based practice”, Evidence Based Mental Health, Vol. 2 No. 2, pp. 35-6.
Gorman, P.N. and Helfand, M. (1995), “Information seeking in primary care: how physicians
choose which clinical questions to pursue and which to leave unanswered”, Medical
Decision Making, Vol. 15 No. 2, pp. 113-9.
Gray, J.A.M. (1997), Evidence-Based Healthcare: How to Make Health Policy and Management
Decisions, Churchill Livingstone, London, p. 23.
Gerould, J.T. (1906), “A plan for the compilation of comparative university and college library
statistics”, Library Journal, Vol. 31, pp. 761-3.
Hiller, S. and Self, J. (2004), “From measurement to management: using data wisely for planning
and decision-making”, Library Trends, Vol. 53 No. 1, pp. 129-55.
Horton, R. (2002), “Teacher librarians: what should we be? Professional development from the
inside”, Access, Vol. 16 No. 2, pp. 31-3, available at: http://alia.org.au/, rhorton/education/
role.html
Koufogiannakis, D., Slater, L. and Crumley, E. (2004), “A content analysis of librarianship
research”, Journal of Information Science, Vol. 30 No. 3, pp. 227-39.
Oxman, A.D. and Guyatt, G.H. (1988), “Guidelines for reading literature reviews”, Canadian
Medical Association Journal, Vol. 138 No. 8, pp. 697-703.
Plutchak, T.S. (2005), “Building a body of evidence”, Journal of the Medical Library Association,
Vol. 93 No. 2, pp. 193-5.
Richardson, W.S. (1998), “Ask and ye shall retrieve”, Evidence Based Medicine, Vol. 3, pp. 100-1.
Richardson, W.S. and Wilson, M.C. (1997), “On questions, background and foreground”, Evidence
Based Healthcare Newsletter, Vol. 17, pp. 8-9.
Richardson, W.S., Wilson, M.C., Nishikawa, J. and Hayward, R.S.A. (1995), “The well-built
clinical question: a key to evidence based decisions”, ACP Journal Club, Vol. 123 No. 3,
pp. A12-A13.
Rosenberg, W. and Donald, A. (1995), “Evidence based medicine: an approach to clinical problem
solving”, British Medical Journal, Vol. 310 No. 6987, pp. 1122-6.
Sackett, D.L. and Rosenberg, W.M.C. (1995), “On the need for evidence based medicine”, Journal
of Public Health Medicine, Vol. 17 No. 3, pp. 330-4.
Sackett, D.L. and Wennberg, J.E. (1997), “Choosing the best research design for each question”,
British Medical Journal, Vol. 315 No. 7123, p. 1636.
Schön, D. (1983), The Reflective Practitioner. How Professionals Think in Action, Temple Smith,
London.
Rosenberg, W.M., Deeks, J., Lusher, A., Snowball, R., Dooley, G. and Sackett, D. (1998),
“Improving searching skills and evidence retrieval”, Journal of the Royal College of
Physicians of London, Vol. 32 No. 6, pp. 557-63.
Snowball, R. (1997), "Using the clinical question to teach search strategy: fostering transferable
conceptual skills in user education by active learning", Health Libraries Review, Vol. 14
No. 3, pp. 167-72.
Villanueva, E.V., Burrows, E.A., Fennessy, P.A., Rajendran, M. and Anderson, J.N. (2001),
“Improving question formulation for use in evidence appraisal in a tertiary care setting:
a randomised controlled trial [ISRCTN66375463]”, BMC Medical Informatics & Decision
Making, Vol. 1, p. 4.
Walsh, M. and Ford, P. (1989), Nursing Rituals: Research and Rational Actions,
Butterworth-Heinemann, Oxford.
White, M.D. (1998), “Questions in reference interviews”, Journal of Documentation, Vol. 54 No. 4,
pp. 443-65.
Wildemuth, B.M. (2003), “Why conduct user studies? The role of empirical evidence in improving
the practice of librarianship”, Keynote address, INFORUM 2003, Prague, 27 May.
Wildridge, V. and Bell, L. (2002), “Brief communication. How CLIP became ECLIPSE: a mnemonic
Further reading
Richardson, W.S. (2005), “Focus on questions”, posting to evidence-based-health discussion list,
8 August, No. 16, available at: www.jiscmail.ac.uk/evidence-based-health
Corresponding author
Andrew Booth can be contacted at: a.booth@sheffield.ac.uk