Research Synthesis of the State of the Science on Clinical Evaluation in Nursing Education

Hdl Handle:
http://hdl.handle.net/10755/603905
Category:
Full-text
Type:
Presentation
Title:
Research Synthesis of the State of the Science on Clinical Evaluation in Nursing Education
Other Titles:
Nursing Education: What Evaluations Mean to Practice [Session]
Author(s):
Lewallen, Lynne Porter; Van Horn, Elizabeth
Lead Author STTI Affiliation:
Gamma Zeta
Author Details:
Lynne Porter Lewallen, RN, CNE, ANEF, lplewall@uncg.edu; Elizabeth Van Horn, RN, CNE
Abstract:
Session presented on Saturday, April 9, 2016: The purpose of this NLN-funded study was to conduct a research synthesis to determine the state of the science related to clinical evaluation in nursing education programs. There were two major rationales for this study. First, the seminal work on transforming nursing education through clinical teaching (Benner, Sutphen, Leonard, & Day, 2010) said very little about the clinical evaluation process. Second, in a preliminary examination of the literature on clinical evaluation, we found that the few summary articles available were mostly limited in scope, such as Cant, McKenna, and Cooper's (2013) summary of the use of the Objective Structured Clinical Examination (OSCE). The research synthesis method described by Cooper (2010) was used to guide the study. Included in our synthesis were research studies that focused on clinical evaluation of any level of nursing student. Exclusion criteria included articles that did not report results of a study, studies that focused on practicing nurses rather than nursing students, studies focusing on human patient simulation, studies focusing only on student perceptions of or satisfaction with clinical evaluation, and articles not available in English. A comprehensive literature search of twelve computerized databases, the tables of contents of seven leading nursing education journals, the reference lists of five review articles, and the abstracts of conference proceedings available at the Virginia Henderson library was conducted with the assistance of a Health Sciences librarian. These searches yielded a total of 226 articles, of which 77 met study criteria and were analyzed. Of the 77, 59 used quantitative methods, 8 used qualitative methods, and 10 used mixed methods. Our analysis method was narrative synthesis.
No groups of studies were found that were amenable to quantitative meta-analysis or qualitative meta-synthesis. The quantitative studies used the following designs: descriptive (n=11), correlational (n=7), comparative (n=15), quasi-experimental (n=11), experimental (n=6), and psychometric testing (n=9). We then examined the studies (excluding the articles that focused on psychometric testing) to determine the level of evidence represented by this body of work. Classified according to Melnyk and Fineout-Overholt's (2011) levels of evidence, the studies fell as follows: Level 2 (one or more randomized controlled trials), 6 studies; Level 3 (controlled trial without randomization), 11 studies; Level 4 (case-control or cohort study), 15 studies; Level 6 (single descriptive or qualitative study), 18 studies. The studies were categorized into topics. The topics were exhaustive but not mutually exclusive, because some studies had multiple aims. The topics included: teaching methods; OSCE; congruence; faculty/preceptor issues with clinical evaluation; essential clinical behaviors; competence; topic-based evaluation; clinical reasoning; instrumentation; and decision making about the clinical grade. Two areas for future research stand out most from this study: the need to measure competence in the clinical area accurately and efficiently, and the need for reliable and valid instrumentation. The largest number of studies located was on the topic of competence (n=31); all but two were conducted with undergraduate nursing students. The majority aimed to measure global competence at the end of a nursing program; most used researcher-developed instruments, and many used student self-report measures. A more standardized approach to measuring clinical competence is needed so that results can be compared across programs, nationally and internationally. Nursing education science is in its infancy in many areas.
The majority of the research designs used in the studies were non-experimental, such as descriptive or correlational, with small convenience samples, which limits the strength of the evidence base of our science. Nurse educators frequently conduct small studies with limited budgets that address areas of local concern but often do not contribute to the larger body of knowledge. An important finding from this study is that nursing education research is being conducted globally. Clinical evaluation is a concern worldwide, and research findings can potentially be applied in diverse settings. By synthesizing research in this area, we can bridge the gap in evaluation of students from diverse cultures within each country and apply research findings in diverse settings, broadening the reach of nursing education research and strengthening the foundation of nursing education science. This information can help nurse educators use evidence-based methods of clinical evaluation as a foundation for their practice. This study was funded by a National League for Nursing (NLN) Ruth Donnelly Corcoran Research Award.
Keywords:
research synthesis; clinical evaluation; nursing education
Repository Posting Date:
29-Mar-2016
Date of Publication:
29-Mar-2016
Other Identifiers:
NERC16D03
Conference Date:
2016
Conference Name:
Nursing Education Research Conference 2016
Conference Host:
Sigma Theta Tau International, the Honor Society of Nursing, and National League for Nursing
Conference Location:
Washington, DC
Description:
Nursing Education Research Conference Theme: Research as a Catalyst for Transformative Practice

All Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.