|Volume 58 Number 1
Evidence Based Library & Information Practice
Assistant Director for Public Services
Clinical Reference Librarian
East Tennessee State University
Quillen College of Medicine Library
Program Abstract: Evidence Based Library & Information Practice (EBLIP) is a way of using the best available research to solve practical problems in the library. This session will cover the fundamentals of EBLIP, along with possible applications. “Evidence-Based Librarianship is an approach to information science that promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgements” (Booth & Brice, 2004). Evidence-Based Library and Information Practice (EBLIP) evolved from the evidence-based medicine (EBM) movement, a systematic way to review the medical literature and apply it to medical practice. EBM spread into other health disciplines and was eventually applied to health sciences librarianship and then to librarianship as a whole.
Much of what is produced in the library literature consists of “how we did it” case reports of low methodological quality, which does little to persuade non-librarians of the reliability of the “science” in library science. Evidence Based Library & Information Practice (EBLIP) is a movement to shift library practice toward a stronger research base.
Booth & Brice (2004) defined EBLIP as a 5 step process:
1.) Define the problem or formulate your question;
2.) Find the evidence;
3.) Critically appraise the evidence;
4.) Apply the appraised evidence to the problem;
5.) Quality assurance – evaluate the plan.
The first part of EBLIP is to formulate a question arising from an issue or problem in a library or in library practice more broadly. A helpful process for question formulation, called the SPICE model, was developed by Booth and Brice (2004). It consists of:
1.) Setting – context of the question.
2.) Perspective – users/potential users of service.
3.) Intervention – what is being done to them/for them.
4.) Comparison – alternatives to the intervention.
5.) Evaluation – how you will measure whether the intervention succeeded.
For example: In adolescent users [P] at a public library [S], does a book club [I] or an author signing [C] increase adolescent usage of the library as measured by a count of adolescent users [E]?
Another device which is used to develop clear, concise, well-formulated, answerable questions is the PICO structure which was developed by Evidence-Based Medicine (EBM) practitioners. PICO represents:
1.) Population: Recipients of a service/intervention (or) a situation being examined.
2.) Intervention: A service/action to be delivered to the population.
3.) Comparison: An alternative service or action.
4.) Outcomes: Ways in which the service/action can be measured to determine whether it has had a desired effect.
For example: In a reference department at an academic library [P], is a combined circulation/reference desk [I] more effective than a separate reference desk [C] in terms of staff time and cost [O]?
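The elements of a well-formed question can be modeled as a simple data structure. As an illustrative sketch (the class and field names below are my own, not part of the PICO framework), the academic-library example might be assembled like this:

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """Holds the four PICO elements of a well-formed, answerable question."""
    population: str    # recipients of the service, or the situation examined
    intervention: str  # service/action to be delivered to the population
    comparison: str    # an alternative service or action
    outcome: str       # how the effect will be measured

    def render(self) -> str:
        """Assemble the elements into a single answerable question."""
        return (f"In {self.population} [P], is {self.intervention} [I] "
                f"more effective than {self.comparison} [C] "
                f"as measured by {self.outcome} [O]?")

# The academic-library example from the text:
q = PICOQuestion(
    population="a reference department at an academic library",
    intervention="a combined circulation/reference desk",
    comparison="a separate reference desk",
    outcome="staff time and cost",
)
print(q.render())
```

The same structure works for SPICE by adding a setting field; the point is simply that each bracketed element must be filled in before the question is answerable.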
Koufogiannakis, Slater and Crumley (2004) developed six domains into which library questions may be categorized:
1.) Reference/Enquiries – service and access to information to meet the needs of users;
2.) Education – discovering new methods and strategies to educate users about library resources and improve their research skills;
3.) Collections – how to build a quality collection of print and electronic materials;
4.) Management – supervising and administering people and resources within an organization;
5.) Information access and retrieval – how to create systems and methods for information retrieval and access;
6.) Marketing/Promotion – how to promote services, libraries, and librarianship to both users and non-users.
A three-part typology of questions, cutting across the six domains, was also developed: prediction, intervention and exploration (Eldredge, 2002). Prediction questions attempt to predict a specific outcome of an action. Intervention questions compare two different potential actions. Exploration questions are generally open-ended and tend to look at the why of librarianship. Each type of question is best answered by a different kind of research study. All three are best answered by systematic reviews or other systematizing studies; after those, the next best design for prediction questions is the cohort study; for intervention questions, the randomized controlled trial; and for exploration questions, qualitative studies (Eldredge, 2002). One of the distinctions of EBLIP is a stronger focus on knowing research methodology and on prioritizing search results by the methodological strength of an article, including the type of study used.
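Eldredge's pairing of question types with study designs can be sketched as a small lookup table. The names and structure below are my own illustration of the mapping described in the text, not code from any source:

```python
# Pairing of question types with the study designs best suited to answer
# them, per the discussion of Eldredge (2002) above. For every question
# type, a systematic review ranks first; the second entry is the next
# best single-study design.
BEST_DESIGNS = {
    "prediction":   ["systematic review", "cohort study"],
    "intervention": ["systematic review", "randomized controlled trial"],
    "exploration":  ["systematic review", "qualitative study"],
}

def preferred_design(question_type: str) -> str:
    """Return the strongest single-study design after a systematic review."""
    return BEST_DESIGNS[question_type][1]

print(preferred_design("intervention"))  # randomized controlled trial
```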
The list below is the hierarchy of evidence used in EBLIP, with qualitative research treated as a separate entity. Quantitative studies at the bottom of the hierarchy are not “bad,” but those at the top are methodologically stronger, and a librarian should place more confidence in their findings.
- Systematic reviews or Meta-analyses - reviews that employ a systematic method to find and evaluate relevant research, and to analyze data from the studies that are included in the review. Meta-analyses additionally employ statistical methods to combine the results of the chosen studies.
- Randomized controlled trials (RCTs) - research that employs random assignment and tests a control and a treatment group. These trials can involve individuals or groups.
- Cohort studies - describe possible causal links and pose probabilities of risk.
- Comparative studies - a study to find similarities and differences between two or more observed outcomes.
- Descriptive articles (including narrative reviews) - an overview of a subject that has developed from a literature search.
- Surveys - a set of questions administered to respondents and evaluated in a predefined manner.
- Case studies - descriptive studies of an intervention or a series of interventions.
- Qualitative research – involves the utilization of study designs such as interviews, focus groups, bibliometrics, and others. Systematic reviews can be done on qualitative study designs.
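One practical use of the hierarchy is to order a set of retrieved quantitative studies from strongest to weakest. This is a hypothetical sketch (the titles, variable names, and function are my own illustration, with qualitative research kept separate as the text describes):

```python
# The EBLIP evidence hierarchy from the list above, strongest first.
HIERARCHY = [
    "systematic review / meta-analysis",
    "randomized controlled trial",
    "cohort study",
    "comparative study",
    "descriptive article / narrative review",
    "survey",
    "case study",
]
# Map each design to its rank: 1 is strongest evidence.
RANK = {design: i + 1 for i, design in enumerate(HIERARCHY)}

def sort_by_strength(results):
    """Order (title, design) pairs from strongest to weakest evidence."""
    return sorted(results, key=lambda r: RANK[r[1]])

# Hypothetical search results for the combined-desk question:
retrieved = [
    ("Desk staffing survey", "survey"),
    ("RCT of combined desks", "randomized controlled trial"),
    ("How we merged our desks", "case study"),
]
for title, design in sort_by_strength(retrieved):
    print(f"{RANK[design]}. {title} ({design})")
```

Sorting by methodological strength in this way mirrors the EBLIP habit of reading the strongest available evidence first rather than whatever turns up at the top of a search.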
Unfortunately, the literature of library and information science is poorly indexed. Gaps exist in almost all of the citation databases in the field; in some cases major journals are left out or only partially indexed. Because the indexing lacks robustness, it is hard to refine searches by study type (such as systematic review). The problem is ironic, since this is the literature of the information profession. Another focus of EBLIP, therefore, is to apply to our own professional literature the rigors of quality indexing that librarians insist other professions apply to theirs. Library citation databases include LISA, Library Literature, ISTA and others. Since librarianship has such a broad base, it is important to search the databases of other disciplines to ensure that a subject is thoroughly covered. This might involve looking at ERIC (education) for instructional interventions; ABI-Inform (business) for matters of marketing; psychology and social science databases for issues of customer relations; and even the grey literature of conference proceedings.
Asking well-focused questions as described above may be a new concept to most librarians, but it is not unlike conducting a reference interview. Searching the literature is indeed our forte as professionals. Another important part of EBLIP is developing the skills to critically appraise the studies retrieved once the search process has been completed; this is the part of EBLIP that creates the greatest discomfort in librarians. Good critical appraisal skills are developed over a lifetime. The resources section at the end of this article contains articles and links regarding critical appraisal. The first question to ask is whether the article is useful to the question at hand. The library worker should then consider whether the author is reputable and the journal peer-reviewed. Some important concepts in critical appraisal are 1.) determining the type of research methodology used in an article; 2.) knowing where that methodology fits in the evidence hierarchy; and 3.) knowing the basics of statistical measures.
After the first four steps of EBLIP have been completed – 1.) start with a problem in the library; 2.) ask a well-formed research question; 3.) thoroughly search the literature; 4.) critically appraise the search results – it becomes time to 5.) apply the findings to the problem situation. The process of application is an art. Every situation is different. Value judgments must be made as to the trustworthiness of the research findings and their fit with your own political situation, local environment and culture.
However well these steps are conducted, any effort to apply EBLIP will help the profession. Findings from similar small-population studies can be combined into systematic reviews or meta-analyses, the results can be shared in journals and at conferences, and other librarians can benefit. The proportion of “how we did it” studies will decline. Librarians can become “practitioner-researchers” and gain esteem among other professions, and the “science” can properly be said to be a part of library and information science.
Booth, Andrew and Anne Brice, eds. (2004). Evidence-based practice for information professionals: a handbook. London: Facet Publishing.
Koufogiannakis, Denise, Linda Slater and Ellen Crumley. (2004). "A Content Analysis of Librarianship Research." Journal of Information Science 30.3: 227-39.
Eldredge, J. (2002). "Evidence-based librarianship levels of evidence." Hypothesis, 16, 10-13.
“Appraising the Evidence.” EBLIP Toolkit. May 10, 2008. University of Newcastle, Australia. http://www.newcastle.edu.au/service/library/gosford/ebl/toolkit/
Booth, Andrew. (2004). "Formulating Answerable Questions." Evidence-based Practice for Information Professionals: A Handbook. Eds. Andrew Booth and Anne Brice. London: Facet, 2004. 62-3.
Booth, Andrew. "The Unteachable in Pursuit of the Unreadable." (2006). Evidence Based Library and Information Practice 1.2: 51-56. http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/48/118
Booth, Andrew. (2006). "Using Research in Practice." Health Information and Libraries Journal. 23: 69-72.
Booth, Andrew and Anne Brice,eds. (2004). Evidence-based practice for information professionals: a handbook. London: Facet Publishing.
"Critical appraisal." Evidence-Based Medicine. May 10, 2008. http://www.sahealthinfo.org/evidence/c.htm
Crumley, Ellen and Denise Koufogiannakis. (2002). "Developing Evidence-Based Librarianship: Practical Steps for Implementation." Health Information and Libraries Journal 19: 61-70.
Davies, Eric J. (2002). "What Gets Measured, Gets Managed: Statistics and Performance Indicators for Evidence Based Management." Journal of Librarianship and Information Science 34.3: 129-133.
EBLIP Toolkit. May 10, 2008. University of Newcastle, Australia. http://www.newcastle.edu.au/service/library/gosford/ebl/toolkit/
Eldredge, J. (2000). "Evidence-based librarianship: an overview." Bulletin of the Medical Library Association, 88: 289-302.
Eldredge, J. (2002). "Evidence-based librarianship levels of evidence." Hypothesis, 16: 10-13.
Eldredge, Jonathan. (2006). "Evidence-based librarianship: the EBL process". Library Hi Tech. 24.3: 341-354.
Genoni, Paul, Gaby Haddow, and Ann Ritchie. (2004). "Why don't librarians use research?" Evidence-based Practice for Information Professionals: A Handbook. Eds. Andrew Booth and Anne Brice. London: Facet. pp. 49-60.
Glynn, Lindsay. (2006). "A Critical Appraisal Tool for Library and Information Research." Library Hi Tech 24.3: 387-399.
Koufogiannakis, Denise, Linda Slater and Ellen Crumley. (2004). "A Content Analysis of Librarianship Research." Journal of Information Science 30.3: 227-39
Koufogiannakis, Denise and Ellen Crumley. (2004). "Applying Evidence to your Everyday Practice." Evidence-based Practice for Information Professionals: A Handbook. Eds. Andrew Booth and Anne Brice. London: Facet. p. 120.
McKenna, Julie. (2006). "Evidence Based Research Activities, Interests and Opportunities exist for Practitioners in all Library Sectors in the British Isles.” Evidence Based Library and Information Practice. 1.1: 107-9. http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/23/75
Morrison, Heather. (2006). "Evidence Based Librarianship and Open Access." Evidence Based Library and Information Practice. 1.2: 46-50. http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/49/117