Critical thinking in healthcare and education

  • Jonathan M Sharples , professor 1 ,
  • Andrew D Oxman , research director 2 ,
  • Kamal R Mahtani , clinical lecturer 3 ,
  • Iain Chalmers , coordinator 4 ,
  • Sandy Oliver , professor 1 ,
  • Kevan Collins , chief executive 5 ,
  • Astrid Austvoll-Dahlgren , senior researcher 2 ,
  • Tammy Hoffmann , professor 6
  • 1 EPPI-Centre, UCL Department of Social Science, London, UK
  • 2 Global Health Unit, Norwegian Institute of Public Health, Oslo, Norway
  • 3 Centre for Evidence-Based Medicine, Oxford University, Oxford, UK
  • 4 James Lind Initiative, Oxford, UK
  • 5 Education Endowment Foundation, London, UK
  • 6 Centre for Research in Evidence-Based Practice, Bond University, Gold Coast, Australia
  • Correspondence to: J M Sharples Jonathan.Sharples{at}eefoundation.org.uk

Critical thinking is just one skill crucial to evidence based practice in healthcare and education, write Jonathan Sharples and colleagues, who see exciting opportunities for cross sector collaboration

Imagine you are a primary care doctor. A patient comes into your office with acute, atypical chest pain. Immediately you consider the patient’s sex and age, and you begin to think about what questions to ask and what diagnoses and diagnostic tests to consider. You will also need to think about what treatments to consider and how to communicate with the patient and potentially with the patient’s family and other healthcare providers. Some of what you do will be done reflexively, with little explicit thought, but caring for most patients also requires you to think critically about what you are going to do.

Critical thinking, the ability to think clearly and rationally about what to do or what to believe, is essential for the practice of medicine. Few doctors are likely to argue with this. Yet, until recently, the UK regulator the General Medical Council and similar bodies in North America did not mention “critical thinking” anywhere in their standards for licensing and accreditation,1 and critical thinking is not explicitly taught or assessed in most education programmes for health professionals.2

Moreover, although more than 2800 articles indexed by PubMed have “critical thinking” in the title or abstract, most are about nursing. We argue that it is important for clinicians and patients to learn to think critically and that the teaching and learning of these skills should be considered explicitly. Given the shared interest in critical thinking with broader education, we also highlight why healthcare and education professionals and researchers need to work together to enable people to think critically about the health choices they make throughout life.

Essential skills for doctors and patients

Critical thinking …



What is Evidence-Based Practice in Nursing?

5 min read • June 1, 2023

Evidence-based practice in nursing involves providing holistic, quality care based on the most up-to-date research and knowledge rather than traditional methods, advice from colleagues, or personal beliefs. 

Nurses can expand their knowledge and improve their clinical practice experience by collecting, processing, and implementing research findings. Evidence-based practice focuses on what's at the heart of nursing — your patient. Learn what evidence-based practice in nursing is, why it's essential, and how to incorporate it into your daily patient care.

How to Use Evidence-Based Practice in Nursing

Evidence-based practice requires you to review and assess the latest research. The knowledge gained from evidence-based research in nursing may indicate a need to change a standard nursing care policy in your practice. Discuss your findings with your nurse manager and team before implementation. Once you've gained their support and ensured compliance with your facility's policies and procedures, merge nursing interventions based on this information with your patient's values to provide the most effective care.

You may already be using evidence-based nursing practices without knowing it. Research findings support a significant percentage of nursing practices, and ongoing studies suggest this proportion will continue to increase.

Evidence-Based Practice in Nursing Examples

There are various examples of evidence-based practice in nursing, such as:

  • Use of oxygen to help with hypoxia and organ failure in patients with COPD 
  • Management of angina
  • Protocols regarding alarm fatigue
  • Recognition of a family member's influence on a patient's presentation of symptoms
  • Noninvasive measurement of blood pressure in children 

Improving patient care begins by asking how you can make it a safer, more compassionate, and more personal experience.

Learn about pertinent evidence-based practice information on our Clinical Practice Material page.

Five Steps to Implement Evidence-Based Practice in Nursing


Evidence-based nursing draws upon critical reasoning and judgment skills developed through experience and training. You can practice evidence-based nursing interventions by following five crucial steps that serve as guidelines for making patient care decisions. This process includes incorporating the best external evidence, your clinical expertise, and the patient's values and expectations.

  • Ask a clear question about the patient's issue and determine an ultimate goal, such as improving a procedure to help their specific condition. 
  • Acquire the best evidence by searching relevant clinical articles from legitimate sources.
  • Appraise the resources gathered to determine if the information is valid, of optimal quality compared to the evidence levels, and relevant for the patient.
  • Apply the evidence to clinical practice by making decisions based on your nursing expertise and the new information.
  • Assess outcomes to determine if the treatment was effective and should be considered for other patients.
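Because the five steps form a strict sequence, they can be captured in a small data structure. The sketch below is a toy illustration only, not part of any nursing standard; the class, its fields, and the example strings are all invented for this example:

```python
from dataclasses import dataclass, field

# Toy model of the five EBP steps described above. The order is enforced:
# you cannot record "Appraise" before "Acquire", mirroring the workflow.
EBP_STEPS = ["Ask", "Acquire", "Appraise", "Apply", "Assess"]

@dataclass
class EbpCycle:
    clinical_question: str              # the "Ask" step's framing question
    completed: list = field(default_factory=list)
    notes: dict = field(default_factory=dict)

    def complete_step(self, step: str, note: str) -> None:
        """Record the next step in order, with a short note on what was done."""
        if len(self.completed) == len(EBP_STEPS):
            raise ValueError("all five steps are already complete")
        expected = EBP_STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected step '{expected}', got '{step}'")
        self.completed.append(step)
        self.notes[step] = note

# Hypothetical usage:
cycle = EbpCycle("Does early mobilisation reduce ICU length of stay?")
cycle.complete_step("Ask", "framed as a PICO question with the care team")
cycle.complete_step("Acquire", "searched CINAHL and Cochrane for recent trials")
print(cycle.completed)  # ['Ask', 'Acquire']
```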

Analyzing Evidence-Based Research Levels

You can compare current professional and clinical practices with new research outcomes when evaluating evidence-based research. But how do you know what's considered the best information?

Use critical thinking skills and consider levels of evidence to establish the reliability of the information when you analyze evidence-based research. These levels can help you determine how much emphasis to place on a study, report, or clinical practice guideline when making decisions about patient care.

The Levels of Evidence-Based Practice

Four primary levels of evidence come into play when you're making clinical decisions.

  • Level A evidence is acquired from randomized, controlled trials and is considered the most reliable.
  • Level B evidence is obtained from quality-designed controlled trials without randomization.
  • Level C evidence comes from a consensus viewpoint or expert opinion; it is typically relied on when there is limited information about a condition.
  • Level ML (multi-level) is usually applied to complex cases and draws its evidence from more than one of the other levels.
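If these level codes were ever represented in software, say in a hypothetical tool that tags studies for a journal club, a simple lookup table would suffice. This is an illustrative sketch only; the dictionary and function names are invented:

```python
# Illustrative lookup table for the four evidence levels described above.
EVIDENCE_LEVELS = {
    "A": "Randomized, controlled trials (considered most reliable)",
    "B": "Quality-designed controlled trials without randomization",
    "C": "Consensus viewpoint or expert opinion (limited information)",
    "ML": "Multi-level: evidence drawn from more than one of the other levels",
}

def describe_level(code: str) -> str:
    """Return a human-readable description for a level code such as 'A' or 'ml'."""
    return EVIDENCE_LEVELS.get(code.upper(), "Unknown evidence level")

print(describe_level("a"))  # Randomized, controlled trials (considered most reliable)
```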

Why Is Evidence-Based Practice in Nursing Essential?


Implementing evidence-based practice in nursing bridges the theory-to-practice gap and delivers innovative patient care using the most current health care findings. The topic of evidence-based practice will likely come up throughout your nursing career. Its origins trace back to Florence Nightingale, the founder of modern nursing, who gathered data and drew conclusions about the relationship between unsanitary conditions and failing health. Its application remains just as essential today.

Other Benefits of Evidence-Based Practice in Nursing

Besides keeping health care practices relevant and current, evidence-based practice in nursing offers a range of other benefits to you and your patients:

  • Promotes positive patient outcomes
  • Reduces health care costs by preventing complications 
  • Contributes to the growth of the science of nursing
  • Allows for incorporation of new technologies into health care practice
  • Increases nurse autonomy and confidence in decision-making
  • Ensures relevancy of nursing practice with new interventions and care protocols 
  • Provides scientifically supported research to help make well-informed decisions
  • Fosters shared decision-making with patients in care planning
  • Enhances critical thinking 
  • Encourages lifelong learning

When you use the principles of evidence-based practice in nursing to make decisions about your patient's care, it results in better outcomes, higher satisfaction, and reduced costs. Implementing this method promotes lifelong learning and lets you strive for continuous quality improvement in your clinical care and nursing practice to achieve  nursing excellence .



What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions about critical thinking

Why is critical thinking important?

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.

Critical thinking examples

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

Example: Good critical thinking in an academic context
You're researching a paper and read a study that reports positive results for a new treatment. However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You're researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

Example: Good critical thinking in a nonacademic context
You read an online review article that rates a home alarm system highly. However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words "sponsored content" appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

How to think critically

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?
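For anyone who keeps a structured reading log, these questions can double as a checkable checklist. The following sketch is purely illustrative: the criteria names echo the CRAAP categories above, while the function and its pass/fail logic are invented for the example:

```python
# Illustrative CRAAP-style screen: each criterion is marked True once the
# corresponding questions above have been answered satisfactorily.
CRITERIA = ["currency", "relevance", "authority", "accuracy", "purpose"]

def screen_source(answers: dict[str, bool]) -> str:
    """Flag a source for closer scrutiny if any criterion is unmet."""
    missing = [c for c in CRITERIA if not answers.get(c, False)]
    if not missing:
        return "Likely credible: all criteria satisfied"
    return "Needs closer scrutiny: " + ", ".join(missing)

# Hypothetical example: a sponsored review that is current and relevant
# but fails on authority and purpose.
print(screen_source({"currency": True, "relevance": True,
                     "authority": False, "accuracy": True, "purpose": False}))
```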

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recollect information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article


Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved August 12, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/


Evidence-based practice education for healthcare professions: an expert view

Volume 24, Issue 3

  • Elaine Lehane 1 ,
  • Patricia Leahy-Warren 1 ,
  • Cliona O’Riordan 1 ,
  • Eileen Savage 1 ,
  • Jonathan Drennan 1 ,
  • Colm O’Tuathaigh 2 ,
  • Michael O’Connor 3 ,
  • Mark Corrigan 4 ,
  • Francis Burke 5 ,
  • Martina Hayes 5 ,
  • Helen Lynch 6 ,
  • Laura Sahm 7 ,
  • Elizabeth Heffernan 8 ,
  • Elizabeth O’Keeffe 9 ,
  • Catherine Blake 10 ,
  • Frances Horgan 11 ,
  • Josephine Hegarty 1
  • 1 Catherine McAuley School of Nursing and Midwifery , University College Cork , Cork , Ireland
  • 2 School of Medicine , University College Cork , Cork , Ireland
  • 3 Postgraduate Medical Training , Cork University Hospital/Royal College of Physicians , Cork , Ireland
  • 4 Postgraduate Surgical Training, Breast Cancer Centre , Cork University Hospital/Royal College of Surgeons , Cork , Ireland
  • 5 School of Dentistry , University College Cork , Cork , Ireland
  • 6 School of Clinical Therapies , University College Cork , Cork , Ireland
  • 7 School of Pharmacy , University College Cork , Cork , Ireland
  • 8 Nursing and Midwifery Planning and Development Unit , Kerry Centre for Nurse and Midwifery Education , Cork , Ireland
  • 9 Symptomatic Breast Imaging Unit , Cork University Hospital , Cork , Ireland
  • 10 School of Public Health, Physiotherapy and Sports Science , University College Dublin , Dublin , Ireland
  • 11 School of Physiotherapy , Royal College of Surgeons in Ireland , Dublin , Ireland
  • Correspondence to Dr Elaine Lehane, Catherine McAuley School of Nursing and Midwifery, University College Cork, Cork T12 K8AF, Ireland; e.lehane{at}ucc.ie

https://doi.org/10.1136/bmjebm-2018-111019


Keywords: qualitative research

Introduction

To highlight and advance clinical effectiveness and evidence-based practice (EBP) agendas, the Institute of Medicine set a goal that, by 2020, 90% of clinical decisions would be supported by accurate, timely and up-to-date clinical information and would reflect the best available evidence to achieve the best patient outcomes.1 To ensure that future healthcare users can be assured of receiving such care, healthcare professions must effectively incorporate the necessary knowledge, skills and attitudes required for EBP into education programmes.

The application of EBP continues to be observed irregularly at the point of patient contact.2 5 7 The effective development and implementation of professional education to facilitate EBP remains a major and immediate challenge.2 3 6 8 Continued momentum for improvement in EBP education, in the form of investigations that can provide direction and structure to developments in this field, is recommended.6

As part of a larger national project looking at current practice and provision of EBP education across healthcare professions at undergraduate, postgraduate and continuing professional development programme levels, we sought key perspectives from international EBP education experts on the provision of EBP education for healthcare professionals. The two other components of this study, namely a rapid review synthesis of EBP literature and a descriptive, cross-sectional, national, online survey relating to the current provision and practice of EBP education to healthcare professionals at third-level institutions and professional training/regulatory bodies in Ireland, will be described in later publications.

EBP expert interviews were conducted to ascertain current and nuanced information on EBP education from an international perspective. Experts from the UK, Canada, New Zealand and Australia were invited by email to participate based on their contribution to peer-reviewed literature on the subject area and recognised innovation in EBP education. Over a 2-month period, individual Skype interviews were conducted and recorded. The interview guide (online supplementary appendix A) focused on current practice and provision of EBP education, with specific attention given to EBP curricula, core EBP competencies, assessment methods, teaching initiatives and key challenges to EBP education within respective countries. Qualitative content analysis techniques, as advised by Bogner et al9 for the examination of expert interviews, were used. Specifically, a six-step process was applied: transcription, reading through/paraphrasing, coding, thematic comparison, sociological conceptualisation and theoretical generalisation. To ensure trustworthiness, a number of practices were undertaken, including explicit description of the methods used, a participant profile, extensive use of interview transcripts by way of representative quotations, peer review (PL-W) of the data analysis process, and inviting interviewees to give feedback on the overall findings.


Five EBP experts participated in the interviews (table 1). All experts waived their right to anonymity.

Table 1. EBP education expert profile.

Three main categories emerged, namely (1) ‘EBP curriculum considerations’, (2) ‘Teaching EBP’ and (3) ‘Stakeholder engagement in EBP education’. These categories informed the overarching theme of ‘Improving healthcare through enhanced teaching and application of EBP’ (figure 1).

Figure 1. Summary of data analysis findings from evidence-based practice (EBP) expert interviews: theme, categories and subcategories.

EBP curriculum considerations

Definitive advice in relation to curriculum considerations was provided, with a clear emphasis on the need for EBP principles to be integrated throughout all elements of healthcare professions curricula. Educators, regardless of teaching setting, need to be able to ‘draw out evidence-based components’ from any and all aspects of curriculum content, including its incorporation into assessments and examinations. Integration of EBP into clinical curricula in particular was considered essential to successful learning and practice outcomes. If students perceive a dichotomy between EBP and actual clinical care, then “never the twain shall meet” (GG); EBP must instead be integrated in such a way that it is “seen as part of the basics of optimal clinical care” (GG). Situating EBP as a core element within the professional curriculum and linking it to professional accreditation processes places further emphasis on the necessity of teaching EBP:

…it is also core in residency programmes. So every residency programme has a curriculum on evidence-based practice where again, the residency programmes are accredited…They have to show that they’re teaching evidence-based practice. (GG)

In terms of the focus of curriculum content, all experts emphasised the oft-cited steps of asking questions, acquiring, appraising and applying evidence to patient care decisions. With regard to identifying and retrieving information, the following in particular was noted:

…the key competencies would be to identify evidence-based sources of information, and one of the key things is there should be no expectation that clinicians are going to go to primary research and evaluate primary research. That is simply not a realistic expectation. In teaching it…they have to be able to identify the pre-processed sources and they have to be able to understand the evidence and they have to be able to use it… (GG)

In addition to attaining proficiency in the fundamental EBP steps, developing competence in communicating evidence to others, including the patient, and facilitating shared decision-making were also highlighted:

…So our ability to communicate risks, benefits, understand uncertainty is so poor…that’s a key area we could improve… (CH)
…and a big emphasis [is needed] on the applicability of that information on patient care, how do you use and share the decision making, which is becoming a bigger and bigger deal. (GG)

It was suggested that these EBP ‘basics’ can be taught “from the start in very similar ways” (GG), regardless of whether the student is at an undergraduate or postgraduate level. The concept of ‘developmental milestones’ was raised by one expert. This related to different levels of expectations in learning and assessing EBP skills and knowledge throughout a programme of study, with an incremental approach to teaching and learning advocated over a course of study:

…in terms of developmental milestones. So for the novice…it’s really trying to get them aware of what the structure of evidence-based practice is and knowing what the process of asking a question and the PICO process and learning about that…in their final year…they’re asked to do critically appraised topics and relate it to clinical cases…It’s a developmental process… (LT)

Teaching EBP

Adoption of effective strategies and practical methods to realise successful student learning and understanding was emphasised. Of particular note was the grounding of teaching strategy and associated methods from a clinically relevant perspective with student exposure to EBP facilitated in a dynamic and interesting manner. The use of patient examples and clinical scenarios was repeatedly expressed as one of the most effective instructional practices:

…ultimately trying to get people to teach in a way where they go, “Look, this is really relevant, dynamic and interesting”…so we teach them in loads of different ways…you’re teaching and feeding the ideas as opposed to “Here’s a definitive course in this way”. (CH)
…It’s pretty obscure stuff, but then I get them to do three examples…when they have done that they have pretty well got their heads around it…I build them lots of practical examples…clinical examples otherwise they think it’s all didactic garbage… (BA)

EBP role models were emphasised as being integral to demonstrating the application of EBP in clinical decision-making and facilitating the contextualisation of EBP within a specific setting/organisation.

…where we’ve seen success is where organisations have said, “There’s going to be two or three people who are going to be the champions and lead where we’re going”…the issue about evidence, it’s complex, it needs to be contextualised and it’s different for each setting… (CH)

It was further suggested that these healthcare professionals have the ‘X-factor’ required of EBP. The acquisition of such expertise which enables a practitioner to integrate individual EBP components culminating in evidence-based decisions was proposed as a definitive target for all healthcare professionals.

And we call it the X factor…the idea is that the clinician who has the X factor is the good clinician. It’s actually integrating the evidence, the patient values, the patient’s pathophysiology, etc. It could be behavioural issues, systems issues…Those are the four quadrants and the clinical expertise is about integrating those together…You’re not actually adding clinical expertise. It seems to me that the clinical expertise is the ability to integrate those four quadrants. (RJ)

The provision of training for educators to aid the further development of skills and use of resources necessary for effective EBP teaching was recommended:

…so we choose the option to train people as really good teachers and give them really high level skills so that they can then seed it across their organisation… (CH)

Attaining a critical mass of people who are ‘trained’ was also deemed important in making a sustained change:

…and it requires getting the teachers trained and getting enough of them. You don’t need everybody to be doing it to make an impression, but you need enough of them really doing it. (GG)

Stakeholder engagement in EBP education

Engagement of national policy makers, healthcare professionals and patients with EBP was considered to have significant potential to advance its teaching and application in clinical care. The lack of a coherent government and national policy on EBP teaching was cited as a barrier to the implementation of the EBP agenda, resulting in a somewhat ‘ad hoc’ approach dependent on individual educational or research institutions:

…there’s no cohesive or coherent policy that exists…It’s not been a consistent approach. What we’ve tended to see is that people have started going around particular initiatives…but there’s never been any coordinated approach even from a college perspective, to say we are about improving the uptake and use of evidence in practice and/or generating evidence in practice. And so largely, it’s been left to research institutions… (CH)

To further ingrain EBP within healthcare professional practice, it was suggested that EBP processes, whether related to developing, disseminating or implementing evidence, be embedded in a more structured way into everyday clinical care to promote active and consistent engagement with EBP on a continuous basis:

…we think it should be embedded into care…we’ve got to have people being active in developing, disseminating and implementing evidence…developing can come in a number of formats. It can be an audit. It can be about a practice improvement. It can be about doing some aspect like a systematic review, but it’s very clearly close to healthcare. (CH)

Enabling patients to engage with evidence with a view to informing healthcare professional/patient interactions and care decisions was also advocated:

…I think we really need to put some energy into…this whole idea of patient-driven care, patient-led care and putting some of these tools in the hands of the consumers so that they’re enabled to be able to ask the right questions and to go into an interaction with some background knowledge about what treatments they should be expecting. (LT)

If patients are considered as recipients of EBP rather than key stakeholders, the premise of shared decision-making for care cannot be achieved.

Successful EBP education is necessary not only so that learners understand the importance of EBP and become competent in its fundamental steps, but ultimately so that it influences behaviour and decision-making through the application of EBP in professional practice. In essence, it serves the function of developing practitioners who value EBP and have the knowledge and skills to implement such practice. The ultimate goal of this agenda is to enhance the delivery of healthcare for improved patient outcomes. The overarching theme of ‘Improving healthcare through enhanced teaching and application of EBP’ represents the focus and purpose of the effort required to optimally structure healthcare professional (HCP) curricula, promote effective EBP teaching and learning strategies, and engage with key stakeholders for the overall advancement of EBP education, as noted:

…we think that everyone in training should be in the game of improving healthcare…It’s not just saying I want to do some evidence-based practice…it’s ultimately about…improving healthcare. (CH)

Discussion and recommendations

Education programmes and associated curricula act as a key medium for shaping healthcare professional knowledge, skills and attitudes, and therefore play an essential role in determining the quality of care provided.10 Unequivocal recommendations were made in relation to the pervasive integration of EBP throughout the academic and clinical curricula. Such integration is facilitated by the explicit inclusion of EBP as a core competency within professional standards and requirements, in addition to accreditation processes.11

Further emphasis on communication skills was also noted as being key to enhancing EBP competency, particularly in relation to realising shared decision-making between patients and healthcare practitioners in making evidence-based decisions. A systematic review by Galbraith et al,12 which examined a ‘real-world’ approach to evidence-based medicine in general practice, corroborates this recommendation by calling for further attention to be given to the communication skills of healthcare practitioners within the context of being an evidence-based practitioner. This resonates with recommendations by Gorgon et al13 on the need to expose students to the intricacies of ‘real world’ contexts in which EBP is applied.

Experts in EBP, together with consistent trends in empirical research and recognised educational theory, repeatedly make a number of recommendations for enhancing EBP teaching and learning strategies. These include (1) clinical integration of EBP teaching and learning, (2) a conscious effort on behalf of educators to embed EBP throughout all elements of healthcare professional programmes, (3) the use of multifaceted, dynamic teaching and assessment strategies which are context-specific and relevant to the individual learner/professional cohort, and (4) ‘scaffolding’ of learning.

At a practical level this requires a more concerted effort to move away from a predominant reliance on stand-alone didactic teaching towards clinically integrative and interactive teaching.10 14–17 An example provided by one of the EBP experts represents such integrated teaching and experiential learning through the performance of GATE/CATs (Graphic Appraisal Tool for Epidemiological studies/Critically Appraised Topics) while on clinical rotation, with assessment conducted by a clinician in practice. Such an activity fulfils the criteria of being reflective of practice, facilitating the identification of gaps between current and desired levels of competence, identifying solutions for clinical issues and allowing re-evaluation and opportunity for reflection on decisions made with a practitioner. This level of interactivity facilitates ‘deeper’ learning, which is essential for knowledge transfer.8 Such practices are also essential to bridge the gap between academic and clinical worlds, enabling students to experience ‘real’ translation of EBP in the clinical context.6 ‘Scaffolding’ of learning, whereby EBP concepts and their application increase in complexity and are reinforced throughout a programme, was also highlighted as an essential instructional approach which is in keeping with recent literature specific both to EBP education and from a broader curriculum development perspective.3 6 18 19

In addition to addressing challenges such as curriculum organisation and programme content/structure, identifying salient barriers to implementing optimal EBP education is recommended as an expedient approach to effecting positive change.20 Highlighted strategies to overcome such barriers included (1) ‘training the trainers’, (2) development of and investment in a national coherent approach to EBP education, and (3) structural incorporation of EBP learning into workplace settings.

National surveys of EBP education delivery21 22 found that a lack of academic and clinical staff knowledgeable in teaching EBP was a barrier to effective and efficient student learning. This was echoed by findings from the EBP expert interviews, which correspond with assertions by Hitch and Nicola-Richmond6 that while recommended educational practices and resources are available, their uptake is somewhat limited. Effective teacher/leader education is required to improve EBP teaching quality.10 16 23 24 Such formal training should extend to academic and clinical educators. Supporting staff to have confidence and competence in teaching EBP, and providing opportunities for learning throughout education programmes, is necessary to facilitate tangible change in this area.

A national and coherent plan with associated investment in healthcare education specific to the integration of EBP was highlighted as having an important impact on educational outcomes. The lack of a coordinated and cohesive approach and perceived value of EBP in the midst of competing interests, particularly within the context of the healthcare agenda, was suggested to lead to an ‘ad-hoc’ approach to the implementation of and investment in EBP education and related core EBP resources. Findings from a systematic scoping review of recommendations for the implementation of EBP16 draw attention to a number of interventions at a national level that have potential to further promote and facilitate EBP education. Such interventions include government-level policy direction in relation to EBP education requirements across health profession programmes and the establishment and financing of a national institute for the development of evidence-based guidelines.

Incorporating EBP activities into routine clinical practice has potential to promote the consistent participation in and implementation of EBP. Such incorporation can be facilitated at various levels and in different settings. At a health service level, the provision of computer and internet facilities at the point of care with associated content management/decision support systems allowing access to guidelines, protocols, critically appraised topics and condensed recommendations was endorsed. At a local workplace level, access to EBP mentors, implementation of consistent and regular journal clubs, grand rounds, audit and regular research meetings are important to embed EBP within the healthcare and education environments. This in turn can nurture a culture which practically supports the observation and actualisation of EBP in day-to-day practice16 and could in theory allow the coherent development of cohorts of EBP leaders.

There are study limitations which must be acknowledged. Four of the five interviewees were medical professionals. Further inclusion of allied healthcare professionals may have increased the representativeness of the findings. However, the primary selection criterion for participants was extensive and recognised expertise in relation to EBP education, the fundamental premises of which traverse specific professional boundaries.

Despite positive attitudes towards EBP and a predominant recognition of its necessity for the delivery of quality and safe healthcare, its consistent translation at the point of care remains elusive. To this end, continued investigations which seek to provide further direction and structure to developments in EBP education are recommended.6 Although the quality of evidence has remained variable regarding the efficacy of individual EBP teaching interventions, consistent trends in relation to valuable andragogically sound educational approaches, fundamental curricular content and preferential instructional practices are evident within the literature in the past decade. The adoption of such trends is far from prevalent, which brings into question the extent of awareness that exists in relation to such recommendations and accompanying resources. There is a need to translate EBP into an active clinical resolution, which will have a positive impact on the delivery of patient care. In particular, an examination of current discourse between academic and clinical educators across healthcare professions is required to progress a ‘real world’ pragmatic approach to the integration of EBP education which has meaningful relevance to students and engenders active engagement from educators, clinicians and policy makers alike. Further attention is needed on strategies that not only focus on issues such as curricula structure, content and programme delivery but which support educators, education institutions, health services and clinicians to have the capacity and competence to meet the challenge of providing such EBP education.

Summary Box

What is already known?

Evidence-based practice (EBP) is established as a fundamental element and key indicator of high-quality patient care.

Both achieving competency and delivering instruction in EBP are complex processes requiring a multimodal approach.

Currently there exists only a modest utilisation of existing resources available to further develop EBP education.

What are the new findings?

In addition to developing competence in the fundamental EBP steps of ‘Ask’, ‘Acquire’, ‘Appraise’, ‘Apply’ and ‘Assess’, developing competence in effectively communicating evidence to others, in particular patients/service users, is an area newly emphasised as requiring additional attention by healthcare educators.

The successful expansion of the assessment and evaluation of EBP requires a pragmatic amplification of the discourse between academic and clinical educators.

How might it impact on clinical practice in the foreseeable future?

Quality of care is improved through the integration of the best available evidence into decision-making as routine practice and not in the extemporised manner often currently practised.

Acknowledgments

Special thanks to Professor Leanne Togher, Professor Carl Heneghan, Professor Bruce Arroll, Professor Rodney Jackson and Professor Gordon Guyatt, who provided key insights on EBP education from an international perspective. Thank you to Dr Niamh O’Rourke, Dr Eve O’Toole, Dr Sarah Condell and Professor Dermot Malone for their helpful direction throughout the project.

References

1. Institute of Medicine (IOM) (US) Roundtable on Evidence-Based Medicine. Leadership Commitments to Improve Value in Healthcare: Finding Common Ground: Workshop Summary. Washington, DC: National Academies Press (US), 2009.

Contributors This project formed part of a national project on EBP education in Ireland of which all named authors are members. The authors named on this paper made substantial contributions to both the acquisition and analysis of data, in addition to reviewing the report and paper for submission.

Funding This research was funded by the Clinical Effectiveness Unit of the National Patient Safety Office (NPSO), Department of Health, Ireland.

Competing interests None declared.

Patient consent Not required.

Ethics approval All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional ethical committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Ethical approval was granted by the Social Research Ethics Committee, University College Cork (Log 2016–140).

Provenance and peer review Not commissioned; externally peer reviewed.

Data sharing statement The full report entitled ‘Research on Teaching EBP in Ireland to healthcare professionals and healthcare students’ is available on the National Clinical Effectiveness, Department of Health website.



Critical thinking: knowledge and skills for evidence-based practice

Affiliation.

  • 1 Communication Sciences and Special Education, 514 Aderhold Hall, University of Georgia, Athens, GA 30602, USA. [email protected]
  • PMID: 20601532
  • DOI: 10.1044/0161-1461(2010/09-0037)

Purpose: I respond to Kamhi's (2011) conclusion in his article "Balancing Certainty and Uncertainty in Clinical Practice" that rational or critical thinking is an essential complement to evidence-based practice (EBP).

Method: I expand on Kamhi's conclusion and briefly describe what clinicians might need to know to think critically within an EBP profession. Specifically, I suggest how critical thinking is relevant to EBP, broadly summarize the relevant skills, indicate the importance of thinking dispositions, and outline the various ways our thinking can go wrong.

Conclusion: I finish the commentary by suggesting that critical thinking skills should be considered a required outcome of our professional training programs.


Comment on: Kamhi AG. Balancing certainty and uncertainty in clinical practice. Lang Speech Hear Serv Sch. 2011 Jan;42(1):59-64. doi: 10.1044/0161-1461(2009/09-0034). Epub 2009 Oct 15. PMID: 19833828.



What is Evidence-Based Practice in Nursing? (With Examples, Benefits, & Challenges)


Are you a nurse looking for ways to increase patient satisfaction, improve patient outcomes, and impact the profession? Have you found yourself caught between traditional nursing approaches and new patient care practices? Although evidence-based practices have been used for years, this concept is the focus of patient care today more than ever. Perhaps you are wondering, “What is evidence-based practice in nursing?” In this article, I will share information to help you begin understanding evidence-based practice in nursing, plus 10 examples of how to implement EBP.

The article covers the following topics:

  • What Is Evidence-Based Practice in Nursing?
  • When Was Evidence-Based Practice First Introduced in Nursing?
  • Who Introduced Evidence-Based Practice in Nursing?
  • What Is the Difference Between Evidence-Based Practice in Nursing and Research in Nursing?
  • What Are the Benefits of Evidence-Based Practice in Nursing? (top 5 benefits to the patient, to the nurse, and to the healthcare organization)
  • 10 Strategies Nursing Schools Employ to Teach Evidence-Based Practices: (1) assigning case studies; (2) journal clubs; (3) clinical presentations; (4) quizzes; (5) on-campus laboratory intensives; (6) creating small work groups; (7) interactive lectures; (8) teaching research methods; (9) requiring collaboration with a clinical preceptor; (10) research papers
  • What Are the 5 Main Skills Required for Evidence-Based Practice in Nursing? (1) critical thinking; (2) a scientific mindset; (3) effective written and verbal communication; (4) the ability to identify knowledge gaps; (5) the ability to integrate findings into practice relevant to the patient's problem
  • What Are the 5 Main Components of Evidence-Based Practice in Nursing? (1) clinical expertise; (2) management of patient values, circumstances, and wants when deciding to utilize evidence for patient care; (3) practice management; (4) decision-making; (5) integration of the best available evidence
  • What Are Some Examples of Evidence-Based Practice in Nursing? (1) elevating the head of a patient's bed between 30 and 45 degrees; (2) implementing measures to reduce impaired skin integrity; (3) implementing techniques to improve infection control practices; (4) administering oxygen to a client with chronic obstructive pulmonary disease (COPD); (5) avoiding frequently scheduled ventilator circuit changes; (6) updating methods for bathing inpatient bedbound clients; (7) performing appropriate patient assessments before and after administering medication; (8) restricting the use of urinary catheterizations, when possible; (9) encouraging well-balanced diets as soon as possible for children with gastrointestinal symptoms; (10) implementing and educating patients about safety measures at home and in healthcare facilities
  • How to Use Evidence-Based Knowledge in Nursing Practice: step 1, assessing the patient and developing clinical questions; step 2, finding relevant evidence to answer the clinical question; step 3, acquiring evidence and validating its relevance to the patient's specific situation; step 4, appraising the quality of evidence and deciding whether to apply it; step 5, applying the evidence to patient care; step 6, evaluating the effectiveness of the plan
  • 10 Major Challenges Nurses Face in the Implementation of Evidence-Based Practice: (1) not understanding the importance of the impact of evidence-based practice in nursing; (2) fear of not being accepted; (3) negative attitudes about research and evidence-based practice in nursing and its impact on patient outcomes; (4) lack of knowledge on how to carry out research; (5) resource constraints within a healthcare organization; (6) work overload; (7) inaccurate or incomplete research findings; (8) patient demands that do not align with evidence-based practices in nursing; (9) lack of internet access while in the clinical setting; (10) nursing supervisors/managers who may not support the concept of evidence-based nursing practices
  • 12 Ways Nurse Leaders Can Promote Evidence-Based Practice in Nursing: (1) be open-minded when nurses on your teams make suggestions; (2) mentor other nurses; (3) support and promote opportunities for educational growth; (4) ask for increased resources; (5) be research-oriented; (6) think of ways to make your work environment research-friendly; (7) promote EBP competency by offering strategy sessions with staff; (8) stay up-to-date about healthcare issues and research; (9) actively use information to demonstrate EBP within your team; (10) create opportunities to reinforce skills; (11) develop templates or other written tools that support evidence-based decision-making; (12) review evidence for its relevance to your organization
  • Bonus: 8 Top Suggestions From a Nurse to Improve Your Evidence-Based Practices in Nursing: (1) subscribe to nursing journals; (2) offer to be involved with research studies; (3) be intentional about learning; (4) find a mentor; (5) ask questions; (6) attend nursing workshops and conferences; (7) join professional nursing organizations; (8) be honest with yourself about your ability to independently implement evidence-based practice in nursing
  • Useful Resources to Stay Up to Date With Evidence-Based Practices in Nursing: professional organizations and associations; blogs/websites; YouTube videos
  • My Final Thoughts
  • Frequently Asked Questions Answered by Our Expert: What did nurses do before evidence-based practice? How did Florence Nightingale use evidence-based practice? What is the main limitation of evidence-based practice in nursing? What are the common misconceptions about evidence-based practice in nursing? Are all types of nurses required to use evidence-based knowledge in their nursing practice? Will lack of evidence-based knowledge impact my nursing career? I do not have access to research databases; how do I improve my evidence-based practice in nursing? Are there different levels of evidence-based practices in nursing? How can I assess my evidence-based knowledge in nursing practice?
• Level One: Meta-analyses of randomized clinical trials and experimental studies
• Level Two: Quasi-experimental studies, i.e., focused studies used to evaluate interventions
• Level Three: Non-experimental or qualitative studies
• Level Four: Opinions of nationally recognized experts based on research
• Level Five: Opinions of individual experts based on non-research evidence such as literature reviews, case studies, organizational experiences, and personal experiences
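For readers who like to see the hierarchy made concrete, here is a minimal Python sketch that maps a study design to the five levels listed above. The design labels and the EVIDENCE_LEVELS/evidence_level names are illustrative assumptions, not a standard taxonomy or appraisal tool.

# Illustrative lookup of the five evidence levels described above.
# The design labels are examples, not an official classification.
EVIDENCE_LEVELS = {
    "meta-analysis of rcts": 1,
    "randomized clinical trial": 1,
    "quasi-experimental study": 2,
    "non-experimental study": 3,
    "qualitative study": 3,
    "expert opinion based on research": 4,
    "expert opinion based on non-research evidence": 5,
}

def evidence_level(design: str) -> int:
    """Return the evidence level (1 = strongest) for a study design."""
    try:
        return EVIDENCE_LEVELS[design.lower()]
    except KeyError:
        raise ValueError(f"Unknown study design: {design!r}")

if __name__ == "__main__":
    print(evidence_level("Quasi-experimental study"))  # prints 2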


Promoting critical thinking through an evidence-based skills fair intervention

Journal of Research in Innovative Teaching & Learning

ISSN : 2397-7604

Article publication date: 23 November 2020

Issue publication date: 1 April 2022

Purpose

The lack of critical thinking in new graduates has been a concern to the nursing profession. The purpose of this study was to investigate the effects of an innovative, evidence-based skills fair intervention on nursing students' achievements and perceptions of critical thinking skills development.

Design/methodology/approach

The explanatory sequential mixed-methods design was employed for this study.

Findings

The findings indicated participants perceived the intervention as a strategy for developing critical thinking.

Originality/value

The study provides educators helpful information in planning their own teaching practice in educating students.

Keywords: Critical thinking; Evidence-based practice; Skills fair intervention

Gonzalez, H.C. , Hsiao, E.-L. , Dees, D.C. , Noviello, S.R. and Gerber, B.L. (2022), "Promoting critical thinking through an evidence-based skills fair intervention", Journal of Research in Innovative Teaching & Learning , Vol. 15 No. 1, pp. 41-54. https://doi.org/10.1108/JRIT-08-2020-0041

Emerald Publishing Limited

Copyright © 2020, Heidi C. Gonzalez, E-Ling Hsiao, Dianne C. Dees, Sherri R. Noviello and Brian L. Gerber

Published in Journal of Research in Innovative Teaching & Learning . Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

Introduction

Critical thinking (CT) was defined as “cognitive skills of analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge” ( Scheffer and Rubenfeld, 2000 , p. 357). Critical thinking is the basis for all professional decision-making ( Moore, 2007 ). The lack of critical thinking in student nurses and new graduates has been a concern to the nursing profession. This deficit negatively affects the quality of care and relates directly to the high error rates of novice nurses, which threaten patient safety ( Arli et al. , 2017 ; Saintsing et al. , 2011 ). As many as 88% of novice nurses reportedly commit medication errors, with 30% of these errors due to a lack of critical thinking ( Ebright et al. , 2004 ). Failure to rescue is another type of error common among novice nurses, reported at rates as high as 37% ( Saintsing et al. , 2011 ). Failure to recognize trends or complications promptly, or to take action to stabilize the patient, occurs when health-care providers do not recognize the early warning signs of distress ( Garvey and CNE series, 2015 ). Internationally, this lack of preparedness and critical thinking contributes to the reported 35–60% attrition rate of new graduate nurses in their first two years of practice ( Goodare, 2015 ). The high attrition rate of new nurses carries professional and economic costs of $82,000 or more per nurse and negatively affects patient care ( Twibell et al. , 2012 ). Facione and Facione (2013) reported that the failure to use critical thinking skills not only interferes with learning but also results in poor decision-making and unclear communication between health-care professionals, which can ultimately lead to patient deaths.

Due to the importance of critical thinking, many nursing programs strive to infuse critical thinking into their curriculum to better prepare graduates for the realities of clinical practice that involves ever-changing, complex clinical situations and bridge the gap between education and practice in nursing ( Benner et al. , 2010 ; Kim et al. , 2019 ; Park et al. , 2016 ; Newton and Moore, 2013 ; Nibert, 2011 ). To help develop students' critical thinking skills, nurse educators must change the way they teach nursing, so they can prepare future nurses to be effective communicators, critical thinkers and creative problem solvers ( Rieger et al. , 2015 ). Nursing leaders also need to redefine teaching practice and educational guidelines that drive innovation in undergraduate nursing programs.

Evidence-based practice has been advocated to promote critical thinking and help reduce the research-practice gap ( Profetto-McGrath, 2005 ; Stanley and Dougherty, 2010 ). Evidence-based practice was defined as “the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of the individual patient” ( Sackett et al. , 1996 , p. 71). A skills fair intervention, one type of evidence-based practice, can be used to engage students, promote active learning and develop critical thinking ( McCausland and Meyers, 2013 ; Roberts et al. , 2009 ). A skills fair promotes consistent teaching of psychomotor skills to the novice nurse, which has been shown to decrease anxiety, clarify expectations for students in the clinical setting and increase students' critical thinking skills ( Roberts et al. , 2009 ). The researchers of this study had an opportunity to create an active, innovative skills fair intervention for a baccalaureate nursing program in one southeastern state. This intervention combined evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos linked to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice, and blended learning, in an attempt to promote and develop critical thinking in nursing students ( Hsu and Hsieh, 2013 ; Oermann et al. , 2011 ; Roberts et al. , 2009 ). The study examined the effects of this innovative skills fair intervention on senior baccalaureate nursing students' achievements and their perceptions of critical thinking development.

Literature review

Critical thinking is the ability to form reasoned judgments, focusing equally on processes and outcomes rather than on emotions ( Paul and Elder, 2008 ). Critical thinking skills are desired in almost every discipline and play a major role in decision-making and daily judgments. The roots of critical thinking date back to Socrates 2,500 years ago and can be traced to the ancient philosopher Aristotle ( Paul and Elder, 2012 ). Socrates challenged others by asking inquisitive questions in an attempt to test their knowledge. In the 1980s, critical thinking gained nationwide recognition as a behavioral science concept in the educational system ( Robert and Petersen, 2013 ). Many researchers in both education and nursing have attempted to define, measure and teach critical thinking for decades. However, a theoretical definition has yet to be accepted and established by the nursing profession ( Romeo, 2010 ). The terms critical literacy, CT, reflective thinking, systems thinking, clinical judgment and clinical reasoning are used synonymously in the reviewed literature ( Clarke and Whitney, 2009 ; Dykstra, 2008 ; Jones, 2010 ; Swing, 2014 ; Turner, 2005 ).

Watson and Glaser (1980) viewed critical thinking not only as cognitive skills but also as a combination of skills, knowledge and attitudes. Paul (1993) , the founder of the Foundation for Critical Thinking, offered several definitions of critical thinking and identified three essential components of critical thinking: elements of thought, intellectual standards and affective traits. Brunt (2005) stated critical thinking is a process of being practical and considered it to be “the process of purposeful thinking and reflective reasoning where practitioners examine ideas, assumptions, principles, conclusions, beliefs, and actions in the contexts of nursing practice” (p. 61). In an updated definition, Ennis (2011) described critical thinking as, “reasonable reflective thinking focused on deciding what to believe or do” (para. 1).

The most comprehensive attempt to define critical thinking was under the direction of Facione and sponsored by the American Philosophical Association ( Scheffer and Rubenfeld, 2000 ). Facione (1990) surveyed 53 experts from the arts and sciences using the Delphi method to define critical thinking as a “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as an explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which judgment, is based” (p. 2).

To come to a consensus definition for critical thinking, Scheffer and Rubenfeld (2000) also conducted a Delphi study. Their study consisted of an international panel of nurses who completed five rounds of sequenced questions to arrive at a consensus definition. Critical thinking was defined as “habits of mind” and “cognitive skills.” The elements of habits of mind included “confidence, contextual perspective, creativity, flexibility, inquisitiveness, intellectual integrity, intuition, open-mindedness, perseverance, and reflection” ( Scheffer and Rubenfeld, 2000 , p. 352). The elements of cognitive skills were recognized as “analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge” ( Scheffer and Rubenfeld, 2000 , p. 352). In addition, Ignatavicius (2001) defined the development of critical thinking as a long-term process that must be practiced, nurtured and reinforced over time. Ignatavicius believed that a critical thinker required six cognitive skills: interpretation, analysis, evaluation, inference, explanation and self-regulation ( Chun-Chih et al. , 2015 ). According to Ignatavicius (2001) , the development of critical thinking is difficult to measure or describe because it is a formative rather than summative process.

Fero et al. (2009) noted that patient safety might be compromised if a nurse cannot provide clinically competent care due to a lack of critical thinking. The Institute of Medicine (2001) recommended five health care competencies: patient-centered care, interdisciplinary team care, evidence-based practice, informatics and quality improvement. Understanding the development and attainment of critical thinking is the key to gaining these competencies ( Scheffer and Rubenfeld, 2000 ). The development of a strong scientific foundation for nursing practice depends on habits such as contextual perspective, inquisitiveness, creativity, analysis and reasoning skills. Therefore, how these critical thinking habits develop in nursing students needs to be explored through additional research ( Fero et al. , 2009 ). Despite critical thinking being listed since the 1980s as an accreditation outcome criterion for baccalaureate programs by the National League for Nursing, very little improvement has been observed in practice ( McMullen and McMullen, 2009 ). James (2013) reported that the number of patient harm incidents associated with hospital care is much higher than previously thought, estimating that between 210,000 and 440,000 patients each year go to the hospital for care and end up suffering some preventable harm that contributes to their death. These preventable errors are attributable to sources beyond nursing care, but having a nurse in place who can advocate and think critically for patients will make a positive impact on patient safety ( James, 2013 ; Robert and Petersen, 2013 ).

Adopting teaching practice to promote CT is a crucial component of nursing education. Research by Nadelson and Nadelson (2014) suggested evidence-based practice is best learned when integrated into multiple areas of the curriculum. Evidence-based practice developed its roots through evidence-based medicine, and its philosophical origins extend back to the mid-19th century ( Longton, 2014 ). Florence Nightingale, the pioneer of modern nursing, used evidence-based practice during the Crimean War when she recognized a connection between poor sanitary conditions and rising mortality rates of wounded soldiers ( Rahman and Applebaum, 2011 ). In professional nursing practice today, a commonly used definition of evidence-based practice is derived from Dr. David Sackett: the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient ( Sackett et al. , 1996 , p. 71). As professional nurses, it is imperative for patient safety to remain inquisitive and ask whether the care provided is based on the available evidence. One of the core beliefs in the American Nephrology Nurses Association's (2019) 2019–2020 Strategic Plan is that “ANNA must support research to develop evidence-based practice, as well as to advance nursing science, and that as individual members, we must support, participate in, and apply evidence-based research that advances our own skills, as well as nursing science” (p. 1). Longton (2014) reported that the lack of evidence-based practice in nursing resulted in negative outcomes for patients; in fact, when evidence-based practice was implemented, changes in policies and procedures occurred that resulted in decreased reports of patient harm and associated health-care costs. The Institute of Medicine (2011) recommended that nurses be leaders in the transformation of the health-care system and achieve higher levels of education that provide the ability to critically analyze data to improve the quality of care for patients. Student nurses must be taught to connect and integrate CT and evidence-based practice throughout their program of study and to continue that practice throughout their careers.

One type of evidence-based practice that can be used to engage students, promote active learning and develop critical thinking is the skills fair intervention ( McCausland and Meyers, 2013 ; Roberts et al. , 2009 ). A skills fair promotes a consistent teaching approach to psychomotor skills for the novice nurse, which has been shown to decrease anxiety, clarify expectations for students in the clinical setting and increase students' critical thinking skills ( Roberts et al. , 2009 ). The skills fair intervention used in this study was a teaching strategy that incorporated CT prompts, Socratic questioning, group work, guided discussions, return demonstrations and blended learning in an attempt to develop CT in nursing students ( Hsu and Hsieh, 2013 ; Roberts et al. , 2009 ). It melded evidence-based practice with simulated CT opportunities while students practiced essential psychomotor skills.

Research methodology

Context – skills fair intervention.

According to Roberts et al. (2009) , psychomotor skills decline in as little as two weeks, even among licensed, experienced professionals, and may need to be relearned if a skill has not been performed for two months. Applying this concept to student nurses, for whom each skill is new, it is no wonder their competency diminishes after a summer break from nursing school. The skills fair intervention was a one-day event to assist baccalaureate students who had taken the summer off from their studies in nursing, and all faculty participated in operating the stations. It incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos linked to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning, in an attempt to promote and develop critical thinking in baccalaureate students.

Students were scheduled and placed randomly into eight teams named for attributes of critical thinking as described by Wittmann-Price (2013) : Team A – Perseverance, Team B – Flexibility, Team C – Confidence, Team D – Creativity, Team E – Inquisitiveness, Team F – Reflection, Team G – Analyzing and Team H – Intuition. The students rotated every 20 minutes through eight stations: Medication Administration (Intramuscular and Subcutaneous Injections), Initiating Intravenous Therapy, Ten-Minute Focused Physical Assessment, Foley Catheter Insertion, Nasogastric Intubation, Skin Assessment/Braden Score and Restraints, Vital Signs and a Safety Station. When the students completed all eight stations, they went to the “Check-Out” booth to complete a simple evaluation of their perceptions of the effectiveness of the intervention. When the evaluations were complete, each of the eight critical thinking attribute teams placed their index cards into a hat, and a student won a small prize. All Junior 2, Senior 1 and Senior 2 students were required to attend the Skills Fair. The Skills Fair Team strove to make the event as festive as possible, engaging nursing students with balloons, candy, tri-boards, signs and fun pre- and post-activities. The Skills Fair rubrics, scheduling and instructions were shared electronically with students and faculty before the intervention to ensure adequate preparation and continued resource availability as students moved forward into their clinical settings.
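As an aside, the rotation logistics described above amount to a round-robin schedule: eight teams cycling through eight stations in 20-minute blocks. The minimal Python sketch below is purely illustrative; the rotation_schedule function and the exact team-to-station offsets are assumptions, not the program's actual scheduling method, although the team and station names come from the text.

# Illustrative round-robin schedule for the skills fair described above.
TEAMS = ["Perseverance", "Flexibility", "Confidence", "Creativity",
         "Inquisitiveness", "Reflection", "Analyzing", "Intuition"]

STATIONS = ["Medication Administration", "Initiating Intravenous Therapy",
            "Focused Physical Assessment", "Foley Catheter Insertion",
            "Nasogastric Intubation", "Skin Assessment/Braden Score and Restraints",
            "Vital Signs", "Safety Station"]

def rotation_schedule(teams, stations, minutes_per_station=20):
    """Yield (start_minute, team, station) so every team visits every station once."""
    for round_index in range(len(stations)):
        start = round_index * minutes_per_station
        for team_index, team in enumerate(teams):
            # Offsetting by the round number guarantees one team per station per round.
            station = stations[(team_index + round_index) % len(stations)]
            yield start, team, station

for start, team, station in rotation_schedule(TEAMS, STATIONS):
    print(f"t+{start:3d} min  Team {team:<15} -> {station}")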

Research design

Institutional review board (IRB) approval was obtained from XXX University to conduct this study and protect human subject rights. The explanatory sequential mixed-methods design was employed for this study. The design was chosen to identify what effects the skills fair intervention had on senior baccalaureate nursing students' achievements on the Kaplan Critical Thinking Integrated Test (KCTIT), and then to follow up with individual interviews to explore those test results in more depth. In total, 52 senior nursing students completed the KCTIT; 30 of them participated in the skills fair intervention and 22 did not. The KCTIT is a computerized 85-item exam scored out of 85 points, with each question worth one point, so a raw score of 85 equates to 100%. It has high reliability and validity, with reported reliability values ranging from 0.72 to 0.89 ( Kaplan Nursing, 2012 ; Swing, 2014 ). An independent-samples t-test was used to analyze the test results.

A total of 11 participants, six high achievers and five low achievers on the KCTIT, were purposefully selected for open-ended one-on-one interviews. Each interview was conducted individually and lasted about 60 minutes. An open-ended interview protocol was used to guide the flow of data collection. The interviewees' ages ranged from 21 to 30 years, with an average of 24 years. One of the 11 interviewees was male. Among them, seven were White, three were Black and one was Indian American. The data collected were used to answer the following research questions: (1) What was the difference in achievements on the KCTIT between senior baccalaureate nursing students who participated in the skills fair intervention and students who did not? (2) What were the senior baccalaureate nursing students' perceptions of internal and external factors impacting the development of critical thinking skills during the skills fair intervention? (3) What were the senior baccalaureate nursing students' perceptions of the skills fair intervention as a critical thinking developmental strategy?

Inductive content analysis was used to analyze the interview data, starting with a close reading of the transcripts and memo writing for initial coding, followed by an analysis of patterns and relationships among the data for focused coding. Intercoder reliability was established for the qualitative data analysis with a nursing expert. The lead researcher and the expert read the transcript several times and assigned a code to significant units of text that corresponded with answering the research questions. The codes were compared based on differences and similarities and sorted into subcategories and categories. Headings and subheadings were then used, based on similar comments, to develop central themes and patterns. The process of establishing intercoder reliability helped to increase the dependability, confirmability and credibility of the findings ( Graneheim and Lundman, 2004 ). In addition, methods of credibility, confirmability, dependability and transferability were applied to increase the trustworthiness of this study ( Graneheim and Lundman, 2004 ). First, reflexivity was observed by keeping journals and memos; this practice allowed the lead researcher to reflect on personal views to minimize bias. Data saturation was reached by following the recommended number of participants as well as through repeated immersion in the data during analysis until no new data surfaced. Member checking was accomplished by returning the transcript and the interpretation to the participants to check the accuracy and truthfulness of the findings. Finally, proper documentation was kept to allow accurate cross-referencing throughout the study.
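The authors do not report which statistic, if any, was used to quantify intercoder agreement. For illustration only, a common choice is Cohen's kappa; the sketch below uses hypothetical code assignments, not the study's data.

# Illustrative computation of Cohen's kappa for two coders' labels.
# The code assignments below are hypothetical, not study data.
from sklearn.metrics import cohen_kappa_score

coder_1 = ["confidence", "attitude", "confidence", "age", "attitude", "confidence"]
coder_2 = ["confidence", "attitude", "confidence", "attitude", "attitude", "confidence"]

# kappa = 1.0 indicates perfect agreement; 0 indicates chance-level agreement.
kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa: {kappa:.2f}")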

Quantitative results

Results for the quantitative portion showed no difference in scores on the KCTIT between senior nursing students who participated in the skills fair intervention and those who did not, t(50) = −0.174, p = 0.86. The test scores of the nonparticipant group (M = 67.59, SD = 5.81) and the participant group (M = 67.88, SD = 5.99) were almost equal.
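For readers who want to check the arithmetic, the reported statistic can be reproduced from the summary statistics above. A minimal sketch, assuming the authors ran a pooled-variance, independent-samples t-test:

# Reproducing the reported result from the published summary statistics.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(
    mean1=67.59, std1=5.81, nobs1=22,   # nonparticipant group
    mean2=67.88, std2=5.99, nobs2=30,   # participant group
    equal_var=True,                      # pooled variance, df = 22 + 30 - 2 = 50
)
print(f"t = {t:.3f}, p = {p:.3f}")
# Prints t = -0.175, p = 0.862, matching the reported t(50) = -0.174, p = 0.86
# up to rounding of the summary statistics.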

Qualitative results

Initial coding.

The results from the initial coding and generated themes are listed in Table 1 . First, the participants perceived the skills fair intervention as “promoting experience” and “confidence” by practicing previously learned knowledge and reinforcing it with active learning strategies. Second, the participants perceived the skills fair intervention as a relaxed, nonthreatening learning environment due to the festive atmosphere, especially in comparison to other learning experiences in the nursing program. The nonthreatening environment of the skills fair intervention allowed students to learn without fear. Third, the majority of participants believed their critical thinking was strengthened after participating. Several participants believed their perception of critical thinking was “enhanced” or “reinforced” rather than significantly changed.

Focused coding results

The final themes were derived from the analysis of patterns and relationships among the content of the data using inductive content analysis ( Saldana, 2009 ). Two areas were examined across the focused coding process: (1) factors impacting critical thinking skills development during the skills fair intervention and (2) the skills fair intervention as a critical thinking skills developmental strategy.

Factors impacting critical thinking skills development . The factors impacting the development of critical thinking during the skills fair intervention were divided into two themes: internal factors and external factors. The internal factors were characteristics innate to the students. The identified internal factors were (1) confidence and anxiety levels, (2) attitude and (3) age. The external factors were the outside influences that affected the students. The external factors were (1) experience and practice, (2) faculty involvement, (3) positive learning environment and (4) faculty prompts.

I think that confidence and anxiety definitely both have a huge impact on your ability to be able to really critically think. If you start getting anxious and panicking you cannot think through the process like you need too. I do not really think gender or age necessarily would have anything to do with critical thinking.
Definitely the confidence level, I think, the more advanced you get in the program, your confidence just keeps on growing. Level of anxiety, definitely… I think the people who were in the Skills Fair for the first time, had more anxiety because they did not really know to think, they did not know how strict it was going to be, or if they really had to know everything by the book. I think the Skills Fair helped everyone's confidence levels, but especially the Jr. 2's.

Attitude was an important factor in the development of critical thinking skills during the skills fair intervention as participants believed possessing a pleasant and positive attitude meant a student was eager to learn, participate, accept responsibility for completing duties and think seriously. Participant 6 believed attitude contributed to performance in the Skills Fair.

I feel like, certain things bring critical thinking out in you. And since I'm a little bit older than some of the other students, I have had more life experiences and am able to figure stuff out better. Older students have had more time to learn by trial and error, and this and that.
Like when I had clinical with you, you'd always tell us to know our patients' medications. To always know and be prepared to answer questions – because at first as a Junior 1 we did not do that in the clinical setting… and as a Junior 2, I did not really have to know my medications, but with you as a Senior 1, I started to realize that the patients do ask about their meds, so I was making sure that I knew everything before they asked it. And just having more practice with IVs – at first, I was really nervous, but when I got to my preceptorship – I had done so many IVs and with all of the practice, it just built up my confidence with that skill so when I performed that skill during the Fair, I was confident due to my clinical experiences and able to think and perform better.
I think teachers will always affect the ability to critically think just because you want [to] get the right answer because they are there and you want to seem smart to them [Laugh]. Also, if you are leading in the wrong direction of your thinking – they help steer you back to [in] the right direction so I think that was very helpful.
You could tell the faculty really tried to make it more laid back and fun, so everybody would have a good experience. The faculty had a good attitude. I think making it fun and active helped keep people positive. You know if people are negative and not motivated, nothing gets accomplished. The faculty did an amazing job at making the Skills Fair a positive atmosphere.

However, for some of the participants, a positive learning environment depended on their fellow students. The students were randomly assigned alphabetically to groups, and the groups were assigned to starting stations at the Skills Fair. The participants claimed some students did not want to participate and displayed cynicism toward the intervention. The participants believed this cynicism detracted from the positive learning environment, making critical thinking more difficult during the Skills Fair.

Okay, when [instructor name] was demonstrating the Chevron technique right after we inserted the IV catheter and we were trying to secure the catheter, put on the extension set, and flush the line at what seemed to be all at the same time. I forgot about how you do not want to put the tape right over the hub of the catheter because when you go back in and try to assess the IV site – you're trying to assess whether or not it is patent or infiltrated – you have to visualize the insertion site. That was one of the things that I had been doing wrong because I was just so excited that I got the IV in the vein in the first place – that I did not think much about the tape or the tegaderm for sterility. So I think an important part of critical thinking is to be able to recognize when you've made a mistake and stop, stop yourself from doing it in the future (see Table 2 ).

Skills fair intervention as a developmental strategy for critical thinking . The participants identified the skills fair intervention as effective as a developmental strategy for critical thinking, as revealed in two themes: (1) develops alternative thinking and (2) thinking before doing (see Table 3 ).

Develops alternative thinking . The participants perceived the skills fair intervention as enhancing critical thinking and confidence by developing alternative thinking. Alternative thinking was described as quickly generating alternative solutions to problems based on the latest evidence and using that information to determine what actions were warranted to prevent complications and injury. Learning the rationale behind skills helped participants make better connections between knowledge and practice, and then apply that knowledge to prevent complications and errors and ensure patient safety. The participants stated that the rationale for certain procedures provided during the skills fair intervention, such as the evidence and critical thinking prompts included in the rubrics, helped reinforce this connection. The participants also shared that they developed alternative thinking after participating in the skills fair intervention by noticing trends in data to prevent potential complications, in response to the faculty prompts. Participant 1 stated her instructor prompted her alternative thinking through questioning about noticing trends to prevent potential complications. She said the following:

Another way critical thinking occurred during the skills fair was when [instructor name] was teaching and prompted us about what it would be like to care for a patient with a fractured hip – I think this was at the 10-minute focused assessment station, but I could be wrong. I remember her asking, “What do you need to be on the look-out for? What can go wrong?” I automatically did not think critically very well and was only thinking circulation in the leg, dah, dah, dah. But she was prompting us to think about mobility alterations and its effect on perfusion and oxygenation. She was trying to help us build those connections. And I think that's a lot of the aspects of critical thinking that gets overlooked with the nursing student – trouble making connections between our knowledge and applying it in practice.

Thinking before doing . The participants perceived that thinking before doing, which included considering how and why certain procedures are performed, required self-examination prior to taking action. The hands-on situational learning allowed the participants in the skills fair intervention to better notice assessment data and think at a higher level, as their previous learning of the skills was perceived as memorization of steps. This higher level of learning allowed participants to consider different future outcomes and analyze pertinent data before taking action.

I think what helped me the most is considering outcomes of my actions before I do anything. For instance, if you're thinking, “Okay. Well, I need to check their blood pressure before I administer this blood pressure medication – or the blood pressure could potentially bottom out.” I really do not want my patient to bottom out and get hypotensive because I administered a medication that was ordered, but not safe to give. I could prevent problems from happening if I know what to be on alert for and act accordingly. So ultimately knowing that in the clinical setting, I can prevent complications from happening and I save myself, my license, and promote patient safety. I think knowing that I've seen the importance of critical thinking already in practice has helped me value and understand why I should be critically thinking. Yes, we use the 5-rights of medication safety – but we also have to think. For instance, if I am going to administer insulin – what do I need to know or do to give this safely? What is the current blood sugar? Has the patient been eating? When is the next meal scheduled? Is the patient NPO for a procedure? Those are examples of questions to consider and the level of thinking that needs to take place prior to taking actions in the clinical setting.

Although the results of quantitative data showed no significant difference in scores on the KCTIT between the participant and nonparticipant groups, during the interviews some participants attributed this result to the test not being part of a course grade and believed students “did not try very hard to score well.” However, the participants who attended interviews did identify the skills fair intervention as a developmental strategy for critical thinking by helping them develop alternative thinking and thinking before doing. The findings are supported in the literature as (1) nurses must recognize signs of clinical deterioration and take action promptly to prevent potential complications ( Garvey and CNE series 2015 ) and (2) nurses must analyze pertinent data and consider all possible solutions before deciding on the most appropriate action for each patient ( Papathanasiou et al. , 2014 ).

The skills fair intervention also enhanced the development of self-confidence by participants practicing previously learned skills in a controlled, safe environment. The nonthreatening environment of the skills fair intervention allowed students to learn without fear and the majority of participants believed their critical thinking was strengthened after participating. The interview data also revealed a combination of internal and external factors that influenced the development of critical thinking during the skills fair intervention including confidence and anxiety levels, attitude, age, experience and practice, faculty involvement, positive learning environment and faculty prompts. These factors should be considered when addressing the promotion and development of critical thinking.

Conclusions, limitations and recommendations

A major concern in the nursing profession is the lack of critical thinking in student nurses and new graduates, which influences the decision-making of novice nurses and directly affects patient care and safety ( Saintsing et al. , 2011 ). Nurse educators must use evidence-based practice to prepare students to think critically within the complicated and constantly evolving environment of health care today ( Goodare, 2015 ; Newton and Moore, 2013 ). Evidence-based practice has been advocated to promote critical thinking ( Profetto-McGrath, 2005 ; Stanley and Dougherty, 2010 ), and the skills fair intervention is one type of evidence-based practice that can be used to promote it ( McCausland and Meyers, 2013 ; Roberts et al. , 2009 ). The intervention used in this study incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos linked to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning, in an attempt to promote and develop critical thinking in nursing students.

The explanatory sequential mixed-methods design was employed to investigate the effects of the innovative skills fair intervention on senior baccalaureate nursing students' achievements and their perceptions of critical thinking skills development. Although the quantitative results showed no significant difference in KCTIT scores between students who participated in the skills fair intervention and those who did not, those who attended the interviews perceived that their critical thinking was reinforced by the intervention and believed it was an effective developmental strategy for critical thinking, as it developed alternative thinking and thinking before doing. This information is useful for nurse educators planning their own teaching practice to promote critical thinking and improve patient outcomes. The findings also give schools and educators information that can help them review their current approaches to educating nursing students. As evidenced in the findings, developing critical thinking skills is crucial for becoming a safe, professional nurse. Internal and external factors impacting the development of critical thinking during the skills fair intervention were identified, including confidence and anxiety levels, attitude, age, experience and practice, faculty involvement, positive learning environment and faculty prompts. These factors should be considered when addressing the promotion and development of critical thinking.

There were several limitations to this study. One major limitation was students' limited exposure to the skills fair intervention, as it was a one-day learning event. Another limitation was the sample selection and size: the skills fair intervention was limited to one baccalaureate nursing program in one southeastern state. As such, the findings cannot be generalized and may not be representative of baccalaureate nursing programs in general. In addition, this study did not measure students' critical thinking achievements prior to the skills fair intervention, so no baseline was available for a before-and-after comparison. Other factors in the nursing program, such as anxiety or motivation, could have affected the students' scores on the KCTIT but were not taken into account in this study.

The recommendations for future research are to expand the topic by including other regions, larger samples and other baccalaureate nursing programs. In addition, future research should consider other participant perceptions, such as nurse educators, to better understand the development and growth of critical thinking skills among nursing students. Finally, based on participant perceptions, future research should include a more rigorous skills fair intervention to develop critical thinking and explore the link between confidence and critical thinking in nursing students.

Table 1. Initial coding results

Theme (frequency of mentions)
• Experience and confidence contributed to critical thinking skills (76)
• Skills fair intervention had a relaxed atmosphere (23)
• Skills fair intervention reinforced critical thinking skills (21)

Table 2. Factors impacting critical thinking skill development during skills fair intervention

Theme and subthemes (frequency of mentions)
• Internal factors (33): confidence and anxiety levels (17), attitude (10), age (6)
• External factors (62): experience and practice (21), faculty involvement (24), positive learning environment (11), faculty prompts (6)

Table 3. Skills fair intervention as a developmental strategy for critical thinking

Theme and subthemes (frequency of mentions)
• Develops alternative thinking (13): application of knowledge and skills (9), noticing trends to prevent complications (4)
• Thinking before doing (10): considering future outcomes (5), analyzing relevant data (5)

References

American Nephrology Nurses Association (ANNA) (2019), “Learning, leading, connecting, and playing at the intersection of nephrology and nursing: 2019–2020 strategic plan”, viewed 3 Aug 2019, available at: https://www.annanurse.org/download/reference/association/strategicPlan.pdf.

Arli, S.D., Bakan, A.B., Ozturk, S., Erisik, E. and Yildirim, Z. (2017), “Critical thinking and caring in nursing students”, International Journal of Caring Sciences, Vol. 10 No. 1, pp. 471-478.

Benner, P., Sutphen, M., Leonard, V. and Day, L. (2010), Educating Nurses: A Call for Radical Transformation, Jossey-Bass, San Francisco.

Brunt, B. (2005), “Critical thinking in nursing: an integrated review”, The Journal of Continuing Education in Nursing, Vol. 36 No. 2, pp. 60-67.

Chun-Chih, L., Chin-Yen, H., I-Ju, P. and Li-Chin, C. (2015), “The teaching-learning approach and critical thinking development: a qualitative exploration of Taiwanese nursing students”, Journal of Professional Nursing, Vol. 31 No. 2, pp. 149-157, doi: 10.1016/j.profnurs.2014.07.001.

Clarke, L.W. and Whitney, E. (2009), “Walking in their shoes: using multiple-perspectives texts as a bridge to critical literacy”, The Reading Teacher, Vol. 62 No. 6, pp. 530-534, doi: 10.1598/RT.62.6.7.

Dykstra, D. (2008), “Integrating critical thinking and memorandum writing into course curriculum using the internet as a research tool”, College Student Journal, Vol. 42 No. 3, pp. 920-929, doi: 10.1007/s10551-010-0477-2.

Ebright, P., Urden, L., Patterson, E. and Chalko, B. (2004), “Themes surrounding novice nurse near-miss and adverse-event situations”, The Journal of Nursing Administration, Vol. 34, pp. 531-538, doi: 10.1097/00005110-200411000-00010.

Ennis, R. (2011), “The nature of critical thinking: an outline of critical thinking dispositions and abilities”, viewed 3 May 2017, available at: https://education.illinois.edu/docs/default-source/faculty-documents/robert-ennis/thenatureofcriticalthinking_51711_000.pdf.

Facione, P.A. (1990), Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, The California Academic Press, Millbrae.

Facione, N.C. and Facione, P.A. (2013), The Health Sciences Reasoning Test: Test Manual, The California Academic Press, Millbrae.

Fero, L.J., Witsberger, C.M., Wesmiller, S.W., Zullo, T.G. and Hoffman, L.A. (2009), “Critical thinking ability of new graduate and experienced nurses”, Journal of Advanced Nursing, Vol. 65 No. 1, pp. 139-148, doi: 10.1111/j.1365-2648.2008.04834.x.

Garvey, P.K. and CNE series (2015), “Failure to rescue: the nurse's impact”, Medsurg Nursing, Vol. 24 No. 3, pp. 145-149.

Goodare, P. (2015), “Literature review: ‘Are you ok there?’ The socialization of student and graduate nurses: do we have it right?”, Australian Journal of Advanced Nursing, Vol. 33 No. 1, pp. 38-43.

Graneheim, U.H. and Lundman, B. (2004), “Qualitative content analysis in nursing research: concepts, procedures, and measures to achieve trustworthiness”, Nurse Education Today, Vol. 24 No. 2, pp. 105-112, doi: 10.1016/j.nedt.2003.10.001.

Hsu, L. and Hsieh, S. (2013), “Factors affecting metacognition of undergraduate nursing students in a blended learning environment”, International Journal of Nursing Practice, Vol. 20 No. 3, pp. 233-241, doi: 10.1111/ijn.12131.

Ignatavicius, D. (2001), “Six critical thinking skills for at-the-bedside success”, Dimensions of Critical Care Nursing, Vol. 20 No. 2, pp. 30-33.

Institute of Medicine (2001), Crossing the Quality Chasm: A New Health System for the 21st Century, National Academy Press, Washington.

James, J. (2013), “A new, evidence-based estimate of patient harms associated with hospital care”, Journal of Patient Safety, Vol. 9 No. 3, pp. 122-128, doi: 10.1097/PTS.0b013e3182948a69.

Jones, J.H. (2010), “Developing critical thinking in the perioperative environment”, AORN Journal, Vol. 91 No. 2, pp. 248-256, doi: 10.1016/j.aorn.2009.09.025.

Kaplan Nursing (2012), Kaplan Nursing Integrated Testing Program Faculty Manual, Kaplan Nursing, New York, NY.

Kim, J.S., Gu, M.O. and Chang, H.K. (2019), “Effects of an evidence-based practice education program using multifaceted interventions: a quasi-experimental study with undergraduate nursing students”, BMC Medical Education, Vol. 19, doi: 10.1186/s12909-019-1501-6.

Longton, S. (2014), “Utilizing evidence-based practice for patient safety”, Nephrology Nursing Journal, Vol. 41 No. 4, pp. 343-344.

McCausland, L.L. and Meyers, C.C. (2013), “An interactive skills fair to prepare undergraduate nursing students for clinical experience”, Nursing Education Perspectives, Vol. 34 No. 6, pp. 419-420, doi: 10.5480/1536-5026-34.6.419.

McMullen, M.A. and McMullen, W.F. (2009), “Examining patterns of change in the critical thinking skills of graduate nursing students”, Journal of Nursing Education, Vol. 48 No. 6, pp. 310-318, doi: 10.3928/01484834-20090515-03.

Moore, Z.E. (2007), “Critical thinking and the evidence-based practice of sport psychology”, Journal of Clinical Sport Psychology, Vol. 1, pp. 9-22, doi: 10.1123/jcsp.1.1.9.

Nadelson, S. and Nadelson, L.S. (2014), “Evidence-based practice article reviews using CASP tools: a method for teaching EBP”, Worldviews on Evidence-Based Nursing, Vol. 11 No. 5, pp. 344-346, doi: 10.1111/wvn.12059.

Newton, S.E. and Moore, G. (2013), “Critical thinking skills of basic baccalaureate and accelerated second-degree nursing students”, Nursing Education Perspectives, Vol. 34 No. 3, pp. 154-158, doi: 10.5480/1536-5026-34.3.154.

Nibert, A. (2011), “Nursing education and practice: bridging the gap”, Advance Healthcare Network, viewed 3 May 2017, available at: https://www.elitecme.com/resource-center/nursing/nursing-education-practice-bridging-the-gap/.

Oermann, M.H., Kardong-Edgren, S., Odom-Maryon, T., Hallmark, B.F., Hurd, D., Rogers, N. and Smart, D.A. (2011), “Deliberate practice of motor skills in nursing education: CPR as exemplar”, Nursing Education Perspectives, Vol. 32 No. 5, pp. 311-315, doi: 10.5480/1536-5026-32.5.311.

Papathanasiou, I.V., Kleisiaris, C.F., Fradelos, E.C., Kakou, K. and Kourkouta, L. (2014), “Critical thinking: the development of an essential skill for nursing students”, Acta Informatica Medica, Vol. 22 No. 4, pp. 283-286, doi: 10.5455/aim.2014.22.283-286.

Park, M.Y., Conway, J. and McMillan, M. (2016), “Enhancing critical thinking through simulation”, Journal of Problem-Based Learning, Vol. 3 No. 1, pp. 31-40, doi: 10.24313/jpbl.2016.3.1.31.

Paul, R. (1993), Critical Thinking: How to Prepare Students for a Rapidly Changing World, The Foundation for Critical Thinking, Santa Rosa.

Paul, R. and Elder, L. (2008), “Critical thinking: the art of Socratic questioning, part III”, Journal of Developmental Education, Vol. 31 No. 3, pp. 34-35.

Paul, R. and Elder, L. (2012), Critical Thinking: Tools for Taking Charge of Your Learning and Your Life, 3rd ed., Pearson/Prentice Hall, Boston.

Profetto-McGrath, J. (2005), “Critical thinking and evidence-based practice”, Journal of Professional Nursing, Vol. 21 No. 6, pp. 364-371, doi: 10.1016/j.profnurs.2005.10.002.

Rahman, A. and Applebaum, R. (2011), “What's all this about evidence-based practice? The roots, the controversies, and why it matters”, American Society on Aging, viewed 3 May 2017, available at: https://www.asaging.org/blog/whats-all-about-evidence-based-practice-roots-controversies-and-why-it-matters.

Rieger, K., Chernomas, W., McMillan, D., Morin, F. and Demczuk, L. (2015), “The effectiveness and experience of arts-based pedagogy among undergraduate nursing students: a comprehensive systematic review protocol”, JBI Database of Systematic Reviews and Implementation Reports, Vol. 13 No. 2, pp. 101-124, doi: 10.11124/jbisrir-2015-1891.

Robert, R.R. and Petersen, S. (2013), “Critical thinking at the bedside: providing safe passage to patients”, Medsurg Nursing, Vol. 22 No. 2, pp. 85-118.

Roberts, S.T., Vignato, J.A., Moore, J.L. and Madden, C.A. (2009), “Promoting skill building and confidence in freshman nursing students with a skills-a-thon”, Journal of Nursing Education, Vol. 48 No. 8, pp. 460-464, doi: 10.3928/01484834-20090518-05.

Romeo, E. (2010), “Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance”, Journal of Nursing Education, Vol. 49 No. 7, pp. 378-386, doi: 10.3928/01484834-20100331-05.

Sackett, D., Rosenberg, W., Gray, J., Haynes, R. and Richardson, W. (1996), “Evidence-based medicine: what it is and what it isn't”, British Medical Journal, Vol. 312 No. 7023, pp. 71-72, doi: 10.1136/bmj.312.7023.71.

Saintsing, D., Gibson, L.M. and Pennington, A.W. (2011), “The novice nurse and clinical decision-making: how to avoid errors”, Journal of Nursing Management, Vol. 19 No. 3, pp. 354-359.

Saldana, J. (2009), The Coding Manual for Qualitative Researchers, Sage, Los Angeles.

Scheffer, B. and Rubenfeld, M. (2000), “A consensus statement on critical thinking in nursing”, Journal of Nursing Education, Vol. 39 No. 8, pp. 352-359.

Stanley, M.C. and Dougherty, J.P. (2010), “Nursing education model. A paradigm shift in nursing education: a new model”, Nursing Education Perspectives, Vol. 31 No. 6, pp. 378-380, doi: 10.1043/1536-5026-31.6.378.

Swing, V.K. (2014), “Early identification of transformation in the proficiency level of critical thinking skills (CTS) for the first-semester associate degree nursing (ADN) student”, doctoral thesis, Capella University, Minneapolis, viewed 3 May 2017, ProQuest Dissertations & Theses database.

Turner, P. (2005), “Critical thinking in nursing education and practice as defined in the literature”, Nursing Education Perspectives, Vol. 26 No. 5, pp. 272-277.

Twibell, R., St Pierre, J., Johnson, D., Barton, D., Davis, C. and Kidd, M. (2012), “Tripping over the welcome mat: why new nurses don't stay and what the evidence says we can do about it”, American Nurse Today, Vol. 7 No. 6, pp. 1-10.

Watson, G. and Glaser, E.M. (1980), Watson Glaser Critical Thinking Appraisal, Psychological Corporation, San Antonio.

Wittmann-Price, R.A. (2013), “Facilitating learning in the classroom setting”, in Wittmann-Price, R.A., Godshall, M. and Wilson, L. (Eds), Certified Nurse Educator (CNE) Review Manual, Springer Publishing, New York, NY, pp. 19-70.


Center for Nursing Inquiry

Evidence-based practice

What is EBP?

As nurses, we often hear the term evidence-based practice (EBP). But, what does it actually mean? EBP is a process used to review, analyze, and translate the latest scientific evidence. The goal is to quickly incorporate the best available research, along with clinical experience and patient preference, into clinical practice, so nurses can make informed patient-care decisions ( Dang et al., 2022 ). EBP is the cornerstone of clinical practice. Integrating EBP into your nursing practice improves quality of care and patient outcomes.

How do I get involved in EBP?

As a nurse, you will have plenty of opportunities to get involved in EBP. Take that “AHA” moment. Do you think there’s a better way to do something? Let’s turn to the evidence and find out!

EBP Model

When conducting an EBP project, it is important to use a model to help guide your work. In the Johns Hopkins Health System, we use the Johns Hopkins Evidence-Based Practice (JHEBP) model. It is a three-phase approach referred to as the PET process: practice question, evidence, and translation. In the first phase, the team develops a practice question by identifying the patient population, intervention, comparison, and outcome (PICO). In the second phase, a literature search is performed, and the evidence is appraised for strength and quality. In the third phase, the findings are synthesized to develop recommendations for practice.
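As a minimal illustration of the first phase (not one of the official JHEBP tools), the elements of a PICO practice question can be captured in a small data structure. The PicoQuestion class and the example question below are assumptions made for illustration.

# Illustrative sketch of a PICO practice question; not an official JHEBP tool.
from dataclasses import dataclass

@dataclass
class PicoQuestion:
    population: str    # P: the patient population of interest
    intervention: str  # I: the intervention or practice change
    comparison: str    # C: the comparison or current practice
    outcome: str       # O: the outcome to be measured

    def as_question(self) -> str:
        return (f"In {self.population}, does {self.intervention}, "
                f"compared with {self.comparison}, affect {self.outcome}?")

q = PicoQuestion(
    population="adult inpatients on a medical unit",
    intervention="hourly rounding",
    comparison="rounding every two hours",
    outcome="fall rates",
)
print(q.as_question())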

The JHEBP model is accompanied by user-friendly tools. The tools walk you through each phase of the project. Johns Hopkins nurses can access the tools via our Inquiry Toolkit . The tools are available to individuals from other institutions via the Institute for Johns Hopkins Nursing (IJHN) .

If you’re interested in learning more about the JHEBP model and tools, Johns Hopkins nurses have access to a free online course entitled JHH Nursing | Central | Evidence-Based Practice Series in MyLearning. The course follows the JHEBP process from beginning to end and provides guidance to the learner on how to use the JHEBP tools. The course is available to individuals from other institutions for a fee via the Institute for Johns Hopkins Nursing (IJHN) .

Where should I start?

All EBP projects need to be submitted to the Center for Nursing Inquiry (CNI) for review. The CNI ensures all nurse-led EBP projects are high quality and add value. We also offer expert guidance and support, if needed.

Who can help me?

The Center for Nursing Inquiry  can answer any questions you may have about the JHEBP tools. All 10 JHEBP tools can be found in our Inquiry Toolkit : project management guide, question development tool, stakeholder analysis tool, evidence level and quality guide, research evidence appraisal tool, non-research evidence appraisal tool, individual evidence summary tool, synthesis process and recommendations tool, action planning tool, and dissemination tool. The tools walk you through each phase of an EBP project.

The Welch Medical Library  serves the information needs of the faculty, staff, and students of Johns Hopkins Medicine, Nursing and Public Health. Often, one of the toughest parts of conducting an EBP project is finding the evidence. The informationist  assigned to your department can assist you with your literature search and citation management.

When do I share my work?

Your project is complete. Now what? It’s time to share your project with the scholarly community.

To prepare your EBP project for publication, use the JHEBP Dissemination Tool . The JHEBP Dissemination Tool (Appendix J) details what to include in each section of your manuscript, from the introduction to the discussion, and shows you which EBP appendices correspond to each part of a scientific paper. You can find the JHEBP Dissemination Tool in our Inquiry Toolkit . 

You can also present your project at a local, regional, or national conference. Poster and podium presentation templates are available in our Inquiry Toolkit .

To learn more about sharing your project, check out our Abstract & Manuscript Writing webinar and our Poster & Podium Presentations webinar !

Submit Your Project

Do you have an idea for an EBP project?


Evidence-based HR: Make better decisions and step up your influence

A step-by-step approach to using evidence-based practice in your decision-making

People professionals are often involved in solving complex organisational problems and need to understand ‘what works’ in order to influence key organisational outcomes. The challenge is to pick reliable, trustworthy solutions and not be distracted by unreliable fads, outdated received wisdom or superficial quick fixes.

This challenge has led to evidence-based practice . The goal is to make better, more effective decisions to help organisations achieve their goals.

At the CIPD, we believe this is an important step for the people profession to take: our Profession Map describes a vision of a profession that is principles-led, evidence-based and outcomes-driven.

This guide sets out what evidence-based practice is, why it’s important, what evidence we should use and how the step-by-step approach works. It builds on previous CIPD publications 1 and the work of the Center for Evidence-Based Management ( CEBMa ), as well as our experience of applying an evidence-based approach to the people profession.

What is evidence-based practice?


The basic idea of evidence-based practice is that high-quality decisions and effective practices are based on critically appraised evidence from multiple sources. When we say ‘evidence’, we mean information, facts or data supporting (or contradicting) a claim, assumption or hypothesis. This evidence may come from scientific research, the local organisation, experienced professionals or relevant stakeholders. We use the following definition from CEBMa :

“Evidence-based practice is about making decisions through the conscientious, explicit and judicious use of the best available evidence from multiple sources… to increase the likelihood of a favourable outcome.”

This technical definition is worth unpacking.

Conscientious means that you make a real effort to gather and use evidence from multiple sources – not just professional opinion. Good decisions will draw on evidence from other sources as well: the scientific literature, the organisation itself, and the judgement of experienced professionals.

Explicit means you take a systematic, step-by-step approach that is transparent and reproducible – describing in detail how you acquired the evidence and how you evaluated its quality. In addition, in order to prevent cherry-picking, you make explicit the criteria you used to select the evidence.

Judicious means critically appraised. Evidence-based practice is not about using all the evidence you can find, but focuses only on the most reliable and trustworthy evidence.

Increased likelihood means that taking an evidence-based approach does not guarantee a certain outcome. Evidence-based people professionals typically make decisions not based on conclusive, solid evidence, but on probabilities, indications and tentative conclusions. As such, an evidence-based approach does not tell you what to decide, but it does help you to make a better-informed decision.

The importance of evidence-based practice and the problems it sets out to solve are explained in more detail in our factsheet and thought leadership article but, in essence, it has three main benefits:

  • It ensures that decision-making is based on fact, rather than outdated insights, short-term fads and natural bias.
  • It creates a stronger body of knowledge and as a result, a more trusted profession.
  • It gives more gravitas to professionals, leads to increased influence on other business leaders and has a more positive impact at work.

Before making an important decision or introducing a new practice, an evidence-based people professional should start by asking: "What is the available evidence?" As a minimum, people professionals should consider four sources of evidence.

[Figure: the four sources of evidence in evidence-based practice]

Evidence from people professionals

The expertise and professional judgement of practitioners, such as colleagues, managers, staff members, employees and leaders, is vital for determining whether a people management issue requires attention, whether the data from the organisation are reliable, whether research findings are applicable, and whether a proposed solution or practice is likely to work in the given organisational context.

Evidence from scientific literature

Over the past decades, a large number of scientific studies have been published on topics relevant to people professionals, such as the characteristics of effective teams, the drivers of knowledge worker performance, the recruitment and selection of personnel, the effect of feedback on employee performance, the antecedents of absenteeism, and the predictors of staff turnover. Empirical studies published in peer-reviewed journals are especially relevant, as they provide the strongest evidence on cause-and-effect relationships, and thus on what works in practice.

Evidence from the organisation

This can be financial data or performance indicators (for example, number of sales, costs, return on investment, market share), but it can also come from customers (for example, customer satisfaction, brand recognition), or employees (for example, task performance, job satisfaction). It can be ‘hard’ numbers such as staff turnover rates, medical errors or productivity levels, but it can also include ‘soft’ elements such as perceptions of the organisation’s risk climate or attitudes towards senior management. This source of evidence typically helps leaders identify the existence or scale of a need or problem, possible causes and potential solutions.

Evidence from stakeholders

Stakeholders are people (individuals or groups inside or outside the organisation) whose interests affect or are affected by a decision and its outcomes. For example, internal stakeholders such as employees can be affected by the decision to reduce the number of staff, or external stakeholders such as suppliers may be affected by the decision to apply higher-quality standards. However, stakeholders can also influence the outcome of a decision; for example, employees can go on strike, regulators can block a merger, and the general public can stop buying a company’s products. For this reason, this evidence is often an important guide to what needs or issues an organisation investigates, what improvements it considers and whether any trade-offs or unintended consequences of proposed interventions are acceptable.

The importance of combining all sources

Finally, none of the four sources is enough on its own – every source of evidence has its limitations and weaknesses. It may not always be practical to draw on all four sources of evidence or to cover each of them thoroughly, but the more we can do, the better our decisions will be.

Since the 1990s, evidence-based practice has become an established standard in many professions. The principles and practices were first developed in the field of medicine and following this, have been applied in a range of professions – including architecture, agriculture, crime and justice, education, international development, nutrition and social welfare, as well as management. Despite differing contexts, the approach broadly remains the same.

The approach is typically broken down into six steps, shown in the figure below; each step is then discussed and illustrated with examples. It is important to note that following all six steps will not always be feasible. However, the more that professionals can do – especially when making major or strategic decisions – the better.

[Figure: the six steps of evidence-based practice – 1. Ask, 2. Acquire, 3. Appraise, 4. Aggregate, 5. Apply, 6. Assess]

Asking questions to clarify the problem and potential solution and to check whether there is evidence in support of that problem and solution is an essential first step. Without these questions the search for evidence will be haphazard, the appraisal of the evidence arbitrary, and its applicability uncertain.

Asking critical questions should be constructive and informative. It is not about tearing apart or dismissing other people's ideas and suggestions. By the same token, evidence-based practice is not an exercise in myth busting, but rather seeking to establish whether claims are likely to be true and potential solutions are likely to be effective.

Example 1 shows the type of questions you can ask.

Example 1: Autonomous teams – an example of asking critical questions

Consider a typical starting point: a senior manager asks you to develop and implement autonomous teams in the organisation. Rather than jumping into action and implementing the proposed solution, an evidence-based approach first asks questions to clarify the (assumed) problem:

  • What is the problem we are trying to solve with autonomous teams?
  • How do we know we have this problem? What is the evidence?
  • What are the organisational consequences of this problem?
  • How serious and how urgent is this problem? What happens if we do nothing?

The next step would be to ask more specific questions on whether there is sufficient evidence confirming the existence and seriousness of the problem. The senior manager explains that the organisation has a serious problem with absenteeism, and that a lack of autonomy – employees’ discretion and independence to schedule their work and determine how it is to be done – is assumed to be its major cause. Important questions to ask are:

  • Do experienced practitioners (for example, supervisors, managers) agree we have a serious problem with absenteeism? Do they agree lack of autonomy is a major cause?
  • Do the organisational data confirm we have a problem with absenteeism? How does our rate of absenteeism compare to the average in the sector? Is there a trend? Do the data suggest the problem will increase when nothing is done?
  • Does the scientific literature confirm that lack of autonomy is an important driver of absenteeism? What are other common causes?
  • How do stakeholders (for example, employees, supervisors) feel about the problem? Do they agree lack of autonomy is a major cause?

Based on the answers, you should be able to conclude whether there is sufficient evidence to support the senior manager’s claim that the organisation has a problem with absenteeism, and that this problem is most likely caused by a lack of autonomy. If one or more questions can’t be answered, this may be an indication that more evidence is needed. The next step would be to ask the senior manager critical questions about the proposed solution:

  • Do we have a clear idea of what autonomous teams are? How are they different from ‘traditional’ teams?
  • How exactly are autonomous teams supposed to have a positive effect on absenteeism? How does this work? What is the causal mechanism/logic model?

The final step would be to ask questions to check whether there is sufficient evidence from multiple sources indicating the proposed solution will indeed solve the problem:

  • Do experienced practitioners (for example, supervisors, managers) agree that the introduction of autonomous teams is the ’best’ solution to lower the organisation’s absenteeism rate? Do they see downsides or unintended negative consequences? Do they see alternative solutions that may work better?
  • Can organisational data be used to monitor the impact of autonomous teams on absenteeism?
  • Does the scientific literature confirm that autonomous teams have a positive effect on absenteeism? Does the literature suggest other solutions that may work better?
  • How do stakeholders (for example, employees, supervisors) feel about the introduction of autonomous teams? Do they think it will have a positive impact on absenteeism?

Based on the answers to the questions in Step 1 , we should have a good understanding of whether there is sufficient evidence from multiple sources to support the assumed problem and preferred solution. In most cases, however, the available evidence is too limited or important sources are missing. In that case, we proceed with the second step of evidence-based practice: acquiring evidence.

Acquiring evidence from practitioners

This could be through:

  • face-to-face conversations: this is the easiest way. While it can be prone to bias, sometimes simply asking people about their experience can give good insight.
  • more structured interactive group meetings, workshops or other ways of collecting views, such as surveys.

Practitioner expertise is a useful starting point in evidence-based practice to understand the assumed problem and preferred solutions. It is also helpful in interpreting other sources of evidence – for example, in assessing whether insights from scientific literature are relevant to the current context.

Acquiring evidence from scientific literature

This could be through the following types of publication:

  • Peer-reviewed academic journals, which can be found in research databases. These are usually behind a paywall or are accessible only through a university, but CIPD members have access to one such database in EBSCO's Discovery Service. It is also worth noting that different databases focus on different specialisms – for example, like the Discovery Service, EBSCO's more expansive Business Source Elite and ProQuest's ABI/INFORM cover business and management in general, whereas the APA's PsycINFO focuses on psychology (these are all available via CEBMa). However, even once you access them, peer-reviewed articles often contain theoretical and technical information that is hard for non-researchers to understand.
  • ‘Evidence reviews’ such as systematic reviews (see below) and shorter rapid evidence assessments (REAs). These are easier to use as they aim to identify and summarise the most relevant studies on a specific topic. They also do the work of identifying the best research, and selecting and critically appraising  studies on the basis of explicit criteria. The CIPD produces evidence reviews on a range of HR and L&D topics – you can access these via our Evidence review hub .

Acquiring evidence from the organisation

This could be through the following sources:

  • Internal management information: Often the finance department and the HR/personnel department are the key custodians of people data and analytics.
  • Internal research and evaluation: This could be conducted via trials of interventions, bespoke surveys or focus groups.
  • External sources such as census bureaus, industry bodies, professional associations and regulators. However, sometimes relevant organisational data is not available, either because collecting it is too time-consuming and costly, because of data sensitivities and a lack of disclosure (for example on  employee diversity ), or simply due to a lack of  analytical capability in processing and interpreting data.

Acquiring evidence from stakeholders

Organisational decisions often have lots of stakeholders both inside and outside the organisation. A stakeholder map is therefore a useful tool to identify which stakeholders are the most relevant. A stakeholder’s relevance is determined by two variables:

  • The extent to which the stakeholder’s interests are affected by the decision (harms and benefits).
  • The extent to which the stakeholder can affect the decision (power to influence).
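
As a rough illustration of how these two variables can be combined, here is a minimal sketch in Python. The stakeholder names, the 1–5 scales and the multiplicative relevance score are all illustrative assumptions, not a prescribed CIPD method:

```python
# A minimal sketch of a stakeholder map: each (hypothetical) stakeholder is
# scored on the two variables above - how strongly their interests are
# affected, and how much power they have to influence the decision.

stakeholders = {
    # name: (interests_affected 1-5, power_to_influence 1-5) - invented scores
    "Employees":     (5, 3),
    "Line managers": (4, 4),
    "Works council": (4, 5),
    "Suppliers":     (2, 2),
    "Regulator":     (3, 5),
}

def relevance(interests: int, power: int) -> int:
    """Simple relevance score: stakeholders who are both highly affected
    and highly influential matter most."""
    return interests * power

ranked = sorted(stakeholders.items(),
                key=lambda kv: relevance(*kv[1]), reverse=True)

for name, (interests, power) in ranked:
    print(f"{name:14} interests={interests} power={power} "
          f"relevance={relevance(interests, power)}")
```

Plotting the same two scores as a 2×2 grid gives the familiar stakeholder matrix; the scoring simply makes the ranking explicit.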

When the most important stakeholders are identified, often qualitative methods such as focus groups and in-depth interviews are used to discuss their concerns.

Unfortunately, evidence is never perfect and can be misleading in many different ways. Sometimes the evidence is so weak that it is hardly convincing at all, while at other times the evidence is so strong that no one doubts its correctness. After we have acquired the evidence we therefore need to critically appraise what evidence is ‘best’ – that is, the most trustworthy.

Appraising evidence from practitioners

When appraising evidence from practitioners, the first step is to determine whether their insights are grounded in relevant experience rather than mere personal opinion. Next, we need to determine how valid and reliable that experience is. We can assess this by considering:

  • whether the experience concerns repeated experience
  • whether the situation allowed for direct, objective feedback
  • whether the experience was gained within a regular, predictable work environment. 

For example, based on these three criteria, it can be determined that the expertise of a sales agent is more likely to be trustworthy than the expertise of a business consultant specialised in mergers. In general, sales agents work within a relatively steady and predictable work environment, they give their sales pitch several times a week, and they receive frequent, direct and objective feedback. Consultants, however, are involved in a merger only a few times a year (often less), so there are not many opportunities to learn from experience. In addition, the outcome of a merger is often hard to determine – what is regarded as a success by one person may be seen as a failure by another. Finally, consultants accompanying a merger do not typically operate in a regular and predictable environment: contextual factors such as organisational differences, power struggles and economic developments often affect the outcome.

Professional expertise that is not based on valid and reliable experience is especially prone to bias, but any evidence from practitioners is likely to reflect personal views and be open to bias (see Building an evidence-based people profession ). For this reason, we should always ask ourselves how a seemingly experienced professional’s judgement could be biased and always look at it alongside other evidence.

Appraising evidence from scientific literature

Appraising scientific evidence requires a certain amount of research understanding. To critically appraise findings from scientific research, we need to understand a study’s ‘design’ (the methods and procedures used to collect and analyse data). Examples of common study designs are cross-sectional studies (surveys), experiments (such as randomised controlled trials, otherwise known as ‘RCTs’), qualitative case studies, and meta-analyses. The first step is to determine whether a study’s design is the best way to answer the research question. This is referred to as ‘methodological appropriateness’.

Different types of research questions occur in the domain of people management. Very often we are concerned with ‘cause-and-effect’ or ‘impact’ questions, for example:

  • What works in managing effective virtual teams?
  • Does digital work affect mental wellbeing?
  • How can managers help employees be more resilient?
  • What makes goal setting and feedback more effective in improving performance?

The most appropriate study designs to answer cause-and-effect questions are RCTs and controlled before-after studies, along with systematic reviews and meta-analyses that gather together these types of study.

Other research questions that are relevant to people professionals are questions about prevalence (for example, “How common is burnout among nurses in hospitals?”), attitudes (for example, “How do employees feel about working in autonomous teams?”), prediction (for example, “What are drivers/predictors of absenteeism?”) or differences (for example, “Is there a difference in task performance between virtual teams and traditional teams?”).

Each of these questions is ideally matched to a specific study design that can give a valid and reliable (unbiased) answer.

An overview of common study designs and what they involve can be found in the Appendix . Following this (also in the Appendix) is an overview of types of questions and the appropriateness of each design. This gives a useful guide for both designing and appraising studies. For example, if you want information on how prevalent a problem is, a survey will work best; a qualitative study will give the greatest insight into people’s experiences of, or feelings about, the problem; and an RCT or before-after study will give the best information on whether a solution to the problem has the desired impact.

Of course, which design a study uses to answer a research question is not the only important aspect to consider. The quality of the study design – how well it was conducted – is equally important. For example, key considerations in quantitative studies include how participants are selected and whether measures are reliable and valid. 2 For systematic reviews and meta-analyses, a key question is how included studies are selected.

Finally, if an impact study seems to be trustworthy, we want to understand the size of the impact of the intervention or factor being studied. Statistical measures of ‘effect size’ give us this information, both for an intervention or influencing factor itself and for how it compares with others. Being able to compare effect sizes is very important for practice. For example, the critical question is not simply, “Does a new management practice have a small or large effect on performance?” but rather, “Is this the best approach or are other practices more impactful?” For more information on effect sizes, see Effect sizes and interpreting research findings in the Appendix.

Appraising evidence from the organisation

When critically appraising evidence from the organisation, the first thing to determine is whether the data are accurate. Nowadays, many organisations have advanced management information systems that present metrics and KPIs in the form of graphs, charts and appealing visualisations, giving the data a sense of objectivity. However, the data in such systems are often collected by people, which is in fact a social and political endeavour. An important appraisal question therefore is: “Were the data collected, processed and reported in a reliable way?”

In addition to the accuracy of data, several other factors can affect its trustworthiness, such as measurement error, missing contextual information and the absence of a logic model. Some organisations use advanced data-analytic techniques that involve big data, artificial intelligence or machine learning. Big data and AI technology often raise serious social, ethical and political concerns as these techniques are based on complex mathematical algorithms that can have hidden biases and, as a result, may introduce gender or racial biases into the decision-making process.

Appraising evidence from stakeholders

Unlike the scientific literature and organisational data, which aim to provide objective and trustworthy insights, stakeholder evidence concerns subjective feelings and perceptions that can’t be treated as facts. Nonetheless, we can make sure that stakeholder evidence comes from a representative sample, so that it accurately reflects all relevant stakeholders.

Evidence-based practitioners should present stakeholders with a clear view of the other sources of evidence. That is, they should summarise what the body of published research and organisational data tell us, as viewed with the benefit of professional knowledge. This can serve as the basis for a well-informed and meaningful two-way exchange. For example, the scientific literature may point to a certain solution being most effective, but stakeholders may advise on other important aspects that should be weighed up against this evidence – for example, whether the intervention is difficult to implement, ethically questionable or too expensive.

Use the best available evidence

The purpose of critical appraisal is to determine which evidence is the best available – that is, the most trustworthy. Sometimes, the quality of the evidence available is less than ideal; for example, there may not be any randomised controlled trials on your intervention of interest. But this does not leave us empty-handed. We can look at other studies that are less trustworthy but still go some way to showing cause-and-effect. Indeed, it’s possible that the best available evidence on an important question is the professional experience of a single colleague. However, even this limited evidence can still lead to a better decision than not using it, as long as we are aware of and open about its limitations. A useful maxim is that “the perfect is the enemy of the good”: if you don’t have the ideal evidence, you can still be evidence-based in how you make decisions.

After we have acquired and critically appraised the different types of evidence, how should we bring them all together? The broad process of knitting together evidence from multiple sources is more craft than science. It should be based on the question you wish to answer and the resources available.

A potential approach is illustrated in Figure 3 below. The steps illustrated here are as follows:

  • The starting point is evidence from the organisation in the form of people data; for example, let’s say employee survey results and key performance indicators have identified a likely problem.
  • This evidence informs the next phase: workshop or roundtable discussions with practitioner experts and stakeholders on issues the organisation faces.
  • Once there is agreement on the priority issues, the project managers scope researchable questions, which are examined in an evidence review of the published scientific literature.
  • The review finds little research on a practice of particular interest, so to fill the evidence gap, researchers run an in-house trial.
  • The findings of this pilot are presented to practitioner experts and stakeholders, discussing with them the implications for practice.
  • All the sources of evidence, including expert and stakeholder views, are then brought together into a final report with recommendations.
  • These are presented and discussed with stakeholders in a final workshop.

[Figure 3: an example process for bringing together evidence from multiple sources]

In most cases, the answer as to whether to implement a new practice is not a simple yes or no. Questions to consider include the following:

  • Does the evidence apply to our organisational context?
  • Is the intervention in question the most effective or are others more effective?
  • Are the anticipated benefits likely to outweigh any risks?
  • Are there ethical issues to consider; for example, if the benefits aren't evenly distributed among stakeholders?
  • Do the costs, necessary resources and timescale fit the organisation’s needs?

The final part of this step concerns how, and in what form, the evidence should be applied. This includes the following possible approaches:

  • The 'push' approach: Actively distributing the evidence to the organisation’s relevant stakeholders, often in the form of a protocol, guideline, checklist or standard operating procedure. The push approach is typically used for operational, routine practices (for example, hiring and selection procedures, or how to deal with customer complaints).
  • The 'pull' approach: Evidence from multiple sources is actively obtained and succinctly summarised. When it concerns summarising findings from the scientific literature, rapid evidence assessments (REAs) are often used – see Step 2 and the CIPD Evidence review hub . The pull approach is typically used for non-routine decisions that involve making changes to the way an organisation operates (for example, the implementation of an autonomous team, or the introduction of performance feedback).
  • The 'learn by doing' approach: Pilot testing and systematically assessing outcomes of the decisions we take in order to identify what works (see also Step 6 ). This is especially appropriate if there is no other option, for novel or hyper-complex decisions when evidence is often not (yet) available (for example, starting a new business in an emerging market).

The final step of evidence-based practice is assessing the outcome of the decision taken: did the decision (or the implementation of the new practice) deliver the desired results?

Unfortunately, organisations seldom evaluate the outcome of decisions, projects or new practices. Nevertheless, assessing the outcome of our decisions is something we can and always should do. Before we assess the outcome, however, we first need to determine:

  • whether the decision or new practice was executed/implemented
  • whether it was executed/implemented as planned.

After all, if we don’t know with certainty if a practice was implemented as planned, we don’t know whether a lack of impact is due to poor implementation or the practice itself.

When we assess the outcome of a decision, we are asking whether the decision had an effect on a particular outcome. As discussed in Step 3 , for a reliable answer to a cause-and-effect question, we need a control group (preferably randomised), a baseline measurement, and a post-measurement – also referred to as a randomised controlled trial. However, often there is no control group available, which leaves us no other option than to assess the impact of the decision or practice by comparing the baseline with the outcome. This type of assessment is referred to as a before-after measurement.
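
To make the before-after idea concrete, here is a minimal sketch using hypothetical monthly absenteeism rates and SciPy's paired t-test; the figures, and the choice of test, are illustrative only:

```python
# A minimal sketch of a before-after assessment: hypothetical monthly
# absenteeism rates (%) for the same ten teams, before and after a new
# practice was introduced. Without a control group, a drop is consistent
# with the practice working but cannot rule out other explanations.

from scipy import stats

before = [6.1, 5.8, 7.2, 6.5, 5.9, 6.8, 7.0, 6.3, 6.6, 5.7]
after  = [5.4, 5.6, 6.1, 6.0, 5.2, 6.2, 6.5, 5.8, 5.9, 5.5]

mean_change = sum(a - b for a, b in zip(after, before)) / len(before)
t_stat, p_value = stats.ttest_rel(before, after)

print(f"mean change: {mean_change:+.2f} percentage points")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```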

When we do not (or cannot) obtain a baseline, it is harder to reliably assess the outcome of a new practice or intervention. This is often the case in large-scale interventions or change projects that have multiple objectives. But even in those cases, assessing the outcome retrospectively is still beneficial. For example, a meta-analysis found that retrospective evaluations can increase performance by 25%.

People professionals may be thinking: “How do I follow the six steps if I’m not a qualified researcher?” But evidence-based practice is not about trying to turn practitioners into researchers. Rather, it’s about bringing together complementary evidence from different sources, including research and practice. Some aspects of evidence-based practice are technical, so people professionals may find it useful to work with academics or other research specialists.

For example, it’s unlikely all people professionals in an organisation will need to be highly trained in statistics, but an HR team may benefit from bringing in one or two data specialists, or hiring them for ad hoc projects. Similarly, conducting evidence reviews and running trials requires well-developed research skills – practitioners could either develop these capabilities in-house or bring them in from external researchers. Academics are also usually keen to publish research, so it may not be necessary for practitioners to do much additional work to support this. Events like the CIPD’s annual  Applied Research Conference can be a good way for people professionals to develop networks with academic researchers.

A good aim for practitioners themselves is to become a ‘savvy consumer’ of research, understanding enough that one can ask probing questions; for example, about the strength of evidence and the size of impacts. This is underpinned by skills in critical thinking, in particular being clear about what questions are really of interest and what evidence will do the best job of answering those questions.

To develop your knowledge and capability, visit the  CIPD’s online course  on evidence-based practice or, for more advanced skills, look at CEBMa’s online course . 

While it is fair to say that evidence-based practice in HR is still in its infancy compared with some other professions, people professionals can begin with these practical steps to pump-prime their decision-making:

  • Read research.
  • Collect and analyse organisational data.
  • Review published evidence.
  • Trial new practices.
  • Share knowledge.
  • Above all, think critically.

Our thought leadership article gives further insight into how the people profession can become more evidence-based.

Evidence-based practice is about using the best available evidence from multiple sources to optimise decisions. Being evidence-based is not a question of looking for ‘proof’, as this is far too elusive. However, we can – and should – prioritise the most trustworthy evidence available. The gains in making better decisions on the ground, strengthening the body of knowledge and becoming a more influential profession are surely worthwhile.

To realise the vision of a people profession that’s genuinely evidence-based, we need to move forward on two fronts:

  • We need to make sure that the body of professional knowledge is evidence-based – the CIPD’s Evidence review hub is one way in which we are doing this.
  • People professionals need to develop knowledge and capability in evidence-based practice. Resources such as the CIPD Profession map and courses from the CIPD and CEBMa can help. Our case studies demonstrate how people professionals are already using an evidence-based approach to successfully address issues in their organisations.

In applying evidence-based thinking in practice, there are certain tenets to hold onto. For substantial decisions, people professionals should always consider drawing on four sources of evidence: professional expertise, scientific literature, organisational data, and stakeholder views and concerns. It can be tempting to rely on professional judgement, received wisdom and ‘best practice’ examples, and bow to senior stakeholder views. But injecting evidence from the other sources will greatly reduce the chance of bias and maximise your chances of effective solutions.

Published management research is a valuable source of evidence for practitioners that seems to be the most neglected. When drawing on the scientific literature, the two principles of critical appraisal (‘not all evidence is equal’) and looking at the broad body of research on a topic (‘one study is not enough’) stand us in excellent stead. This has clear implications for how we look for, prioritise and assess evidence. A systematic approach to reviewing published evidence goes a long way to reducing bias and giving confidence that we’ve captured the most important research insight.

Becoming a profession worthy of the label ‘evidence-based’ is a long road, and we need to chip away over time to see real progress. HR, learning and development, and organisational development are newer to evidence-based practice than other professions, but we can take inspiration from those that have travelled the same long road, and be ambitious.

Appendix

This appendix explains some important technical aspects of appraising scientific research, which is inevitably the trickiest aspect of evidence-based practice for non-researchers. As we note in this guide, most people professionals won’t need to become researchers themselves, but a sensible aim is to become ‘savvy consumers’ of research.

To support this, below we explain four aspects of appraising scientific research:

  • The three conditions that show causal relationships.
  • Common study designs.
  • Assessing methodological appropriateness.
  • Interpreting research findings (in particular effect sizes).

We hope that this assists you in developing enough understanding to be able to ask probing questions and apply research insights.

Three conditions to show causal relationships

In HR, people management and related fields, we are often concerned with questions about ‘what works’ or what’s effective in practice. To answer these questions, we need to get as close as possible to establishing cause-and-effect relationships.

Many will have heard the phrase ‘correlation is not causality’ or ‘correlation does not imply causation’. It means that a statistical association between two measures or observed events is not enough to show that one characteristic or action leads to (or affects, or increases the chances of) a particular outcome. One reason is that statistical relationships can be spurious, meaning two things appear to be directly related, but are not.

For example, there is a statistically solid correlation between the amount of ice-cream consumed and the number of people who drown on a given day. But it does not follow that eating ice-cream makes you more likely to drown. The better explanation is that you’re more likely to both eat ice-cream and go swimming (raising your chances of drowning) on sunny days.
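
This kind of confounding is easy to reproduce in a small simulation. In the sketch below (all numbers are invented), sunshine drives both ice-cream sales and the number of swimmers, so the two correlate strongly even though neither causes the other:

```python
# A minimal simulation of a spurious correlation: a confounder (sunshine)
# drives two variables that have no causal link to each other.

import random

random.seed(1)
days = 365
sunshine  = [random.uniform(0, 10) for _ in range(days)]        # hours of sun
ice_cream = [3 * s + random.gauss(0, 2) for s in sunshine]      # sales
swimmers  = [5 * s + random.gauss(0, 4) for s in sunshine]      # swimmer count

def corr(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / n) ** 0.5
    return cov / (sx * sy)

# Strong correlation, driven entirely by the confounder:
print(f"corr(ice_cream, swimmers) = {corr(ice_cream, swimmers):.2f}")
```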

So what evidence is enough to show causality? Three key criteria are needed: 3

  • Association: A statistical relationship (such as a correlation) between reliable measures of an intervention or characteristic and an important outcome.
  • Temporality or prediction: That one of these comes before the other, rather than the other way round. We obtain this from before-and-after measures to show changes over time.
  • Other factors (apart from the intervention or influence of interest) don’t explain the relationship: We address this in several ways: studying a control group alongside the treatment group to see what would have happened without the intervention (the counterfactual); randomising the allocation of people to the intervention and control groups to avoid selection bias; and controlling for other relevant factors in the statistical analysis (for example, age, gender or occupation).

Common study designs

Different study designs do better or worse jobs at explaining causal relationships.

Single studies

  • Randomised controlled trials (RCTs): Conducted well, these are the ideal method that meets all three criteria for causality. They are often referred to as the ‘gold standard’ of impact studies.
  • Quasi-experimental designs: These are a broad group of studies that go some way towards meeting the criteria. While weaker than RCTs, they are often much more practical or ethical to conduct, and can provide good evidence for cause and effect. One example is single-group before-and-after studies. Because these don’t include control groups, we don’t know whether any improvement observed would have happened anyway, but by virtue of being longitudinal they at least show that one thing happens following another.
  • Parallel cohort studies: These compare changes in outcomes over time for two groups who are similar in many ways but treated differently in a way that is of interest. Because people are not randomly allocated to the two groups, there is a risk of ‘confounders’ – that is, factors that explain both the treatment and outcomes, and interfere with the analysis. But these studies are still useful as they show change over time for intervention and control groups.

These research designs go much further to show cause-and-effect or prediction than cross-sectional surveys, which only observe variables at one point in time. In survey analysis, statistical relationships could be spurious or the direction of causality could even be the opposite to what you might suppose. For example, a simple correlation between ‘employee engagement’ and performance could exist because engagement contributes to performance, or because being rated as high-performing makes people feel better.  

Other single study designs include controlled before-after studies (also called 'non-randomized controlled trials' or 'controlled longitudinal studies'), controlled studies with post-test only, and case studies. Case studies often use qualitative methods, such as interviews, focus groups, documentary analysis, narrative analysis, and ethnography or participant observation. Qualitative research is often exploratory, in that it is used to gain an understanding of underlying reasons or opinions and generate new theories. These can then be tested as hypotheses in appropriate quantitative studies.

Systematic reviews and meta-analyses

Systematic reviews and meta-analyses are central to evidence-based practice. Their strength is that they look across the body of research, allowing us to understand the best available evidence on a topic overall. In contrast, even well-conducted single studies can give different results on the same topic, due to differences in context or the research approaches used.

Characteristics of these are as follows:

  • Systematic reviews: These are studies that summarise the body of studies on the same topic. They use consistent search terms in different scientific databases, ideally appraise the quality of studies and are explicit about the methods used. The CIPD conducts evidence reviews based on rapid evidence assessments (REAs), a shortened form of the systematic review that follows the same principles.
  • Meta-analysis: This is often based on a systematic review. It is a study that uses statistical analysis to combine the results of individual studies to get a more accurate estimate of an effect. It can also be used to analyse what conditions make an effect larger or smaller.
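
To illustrate the pooling idea, here is a minimal sketch of fixed-effect, inverse-variance weighting with hypothetical effect sizes; real meta-analyses involve much more (random-effects models, heterogeneity checks, publication-bias diagnostics):

```python
# A minimal sketch of fixed-effect meta-analysis: each study's effect size
# is weighted by the precision of its estimate (1 / standard error squared),
# so larger, more precise studies contribute more to the pooled effect.

studies = [
    # (effect size d, standard error) - hypothetical values
    (0.30, 0.10),
    (0.45, 0.20),
    (0.15, 0.08),
]

weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"pooled effect d = {pooled:.2f} (SE {pooled_se:.2f})")
```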

More information on research designs can be found in CEBMa resources .

Assessing methodological appropriateness

When conducting an evidence review, we need to determine which research evidence is ‘best’ (that is, most trustworthy) for the question in hand, so we can prioritise it in our recommendations. At the same time, we assess the quality of research evidence to establish how certain we can be of our recommendations: well-established topics often have a strong body of research, but the evidence on new or emerging topics is often far less than ideal.

This involves appraising the study designs or research methods used. For questions about intervention effectiveness or cause-and-effect, we use tables such as that below to inform a rating of evidence quality. Based on established scientific standards, we can also estimate the trustworthiness of the study. Hypothetically, if you were deciding whether to use a particular intervention based on evidence that was only 50% trustworthy, you would have the same 50/50 chance of success as tossing a coin, so the evidence would be useless. On the other hand, using evidence that was 100% trustworthy would give you certain success. Of course, in reality nothing is 100% certain, but highly trustworthy research can conclusively demonstrate that, in a given context, an intervention has a positive or negative impact on the outcomes that were measured.

Table 1: Methodological appropriateness of effect studies and impact evaluations

  • AA: Very high (95%) – systematic review or meta-analysis of randomized controlled studies.
  • A: High (90%) – systematic review or meta-analysis of non-randomized controlled and/or before-after studies; randomized controlled study.
  • B: Moderate (80%) – systematic review or meta-analysis of controlled studies without a pre-test or uncontrolled studies with a pre-test; non-randomized controlled before-after study; interrupted time series.
  • C: Limited (70%) – systematic review or meta-analysis of cross-sectional studies; controlled study without a pre-test or uncontrolled study with a pre-test.
  • D: Low (60%) – cross-sectional survey.
  • E: Very low (55%) – case studies, case reports, traditional literature reviews, theoretical papers.

Notes: Trustworthiness takes into consideration not only which study design was used but also how well it was applied. Table reproduced from CEBMa (2017), based on the classification system of Shadish, Cook and Campbell (2002) 4 and Petticrew and Roberts (2006) 5 .  

There are two important points to note about using such hierarchies of evidence. First, as we discuss in this guide, evidence-based practice involves prioritising the best available evidence. A good mantra here is ‘the perfect is the enemy of the good’: if studies with very robust (highly methodologically appropriate) designs are not available on your topic of interest, look at others. For example, if systematic reviews or randomized controlled studies are not available on your question, you will do well to look at other types of studies, such as those with quasi-experimental designs.

Second, although many questions for managers and people professionals relate to effectiveness or causality, this is by no means always the case. Broadly, types of research questions include the following:

Table 2: Types of research question

  • Effect: Does A have an effect/impact on B? What are the critical success factors for A? What are the factors that affect B?
  • Prediction: Does A precede B? Does A predict B over time?
  • Association: Is A related to B? Does A often occur with B? Do A and B co-vary?
  • Difference: Is there a difference between A and B?
  • Frequency or prevalence: How often does A occur?
  • Attitude: What is people's attitude toward A? Are people satisfied with A? How many people prefer A over B? Do people agree with A?
  • Experience or needs: What are people's experiences, feelings or perceptions regarding A? What do people need in order to do/use A?
  • Exploration or explanation: Why does A occur? How does A impact/affect B? Why is A different from B?

Different methods are suited to different types of questions. For example, a cross-sectional survey is a highly appropriate or trustworthy design for questions about association, difference, prevalence, frequency and attitudes. And qualitative research is highly appropriate for questions about experience, perceptions, feelings, needs and exploration and theory building. For more discussion of this, see Petticrew and Roberts (2003).

Effect sizes and interpreting research findings

Even if practitioners wanting to be evidence-based can search for and find relevant research, they are left with another challenge: how to interpret it. Unfortunately, academic research in human resource management is often highly technical, written in inaccessible language and not closely linked to practice. A recent analysis found that, in a sample of 324 peer-reviewed articles, half dedicated less than 2% of the text to practical implications, and where implications were discussed, they were often obscure and implicit.

Even where published research does include good discussion of practical implications, practitioners wishing to draw on it still need to understand the findings themselves. This can be tricky, as findings often contain fairly technical statistical information.

Statistical significance

There’s an obvious need to simplify the technical findings of quantitative studies. The typical way to try to simplify research findings is to focus on statistical significance, or p-values. Reading through a research paper, this may seem intuitive, as the level of significance is identified with asterisks: typically, * means sufficiently significant and ** or *** means highly significant. However, there is a lot of confusion about what the p-value is – even quantitative scientists struggle to translate it into something meaningful and easy to understand – and a growing number of scientists are arguing that it should be abandoned. What’s more, statistical significance does nothing to help a practitioner who wants to know if a technique or approach is likely to have a meaningful impact – that is, it does not answer the most important practical question of how much difference  an intervention makes.

Effect sizes

The good news is that effect sizes do give this information. It is still technical and can be hard to interpret, as studies often use different statistics to express effect sizes. Fortunately, however, we can translate effect sizes into everyday language. A useful tool is ‘Cohen’s Rule of Thumb’, which matches different statistical measures to small/medium/large categories. 6

According to Cohen:

  • a ‘small’ effect is one that is visible only through careful examination – so may not be practically relevant
  • a ‘medium’ effect is one that is ‘visible to the naked eye of the careful observer’
  • a ‘large’ effect is one that anybody can easily see because it is substantial. An example of a large effect size is the relationship between sex and height: if you walked into a large room full of people in which all the men were on one side and all the women on the other side, you would instantly see a general difference in height.

The rule of thumb has since been extended to account for very small, very large and huge results. 7
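
As a worked example, the sketch below computes Cohen's d for two hypothetical groups and labels the result using Cohen's conventional cut-offs of roughly 0.2 (small), 0.5 (medium) and 0.8 (large); the data are invented:

```python
# A minimal sketch of computing and labelling Cohen's d: the difference in
# group means divided by the pooled standard deviation.

from statistics import mean, stdev

control   = [3.1, 3.4, 2.9, 3.6, 3.2, 3.0, 3.5, 3.3]   # hypothetical scores
treatment = [3.8, 3.6, 4.0, 3.9, 3.5, 4.1, 3.7, 3.8]

def cohens_d(a, b):
    """Cohen's d using the pooled (sample) standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(b) - mean(a)) / pooled_sd

d = cohens_d(control, treatment)
label = ("large" if d >= 0.8 else
         "medium" if d >= 0.5 else
         "small" if d >= 0.2 else "very small")
print(f"Cohen's d = {d:.2f} ({label})")
```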

Effect sizes need to be contextualised. For example, a small effect is of huge importance if the outcome is the number of fatalities, or indeed, sales revenue. Compared to this, if the outcome is work motivation (which is likely to affect sales revenue but is certainly not the same thing) even a large effect will be less important. This shows the limits of scientific studies and brings us back to evidence from practitioners and stakeholders, who are well placed to say what outcomes are most important.

1 Gifford, J. (2016) In search of the best available evidence: Positioning paper. London: Chartered Institute of Personnel and Development.

2 For a discussion of reliability and validity in performance measures, see People performance: an evidence review.

3,4 Shadish, W. R., Cook, T. D. and Campbell, D. T. (2002) Experimental and quasi-experimental designs for generalized causal inference. Belmont, CA: Wadsworth Cengage Learning.

5 Petticrew, M. and Roberts, H. (2006) How to appraise the studies: an introduction to assessing study quality. In: Systematic reviews in the social sciences: a practical guide, pp 125–63. Oxford: Blackwell.

6 For a table showing different measures of effect sizes according to Cohen’s Rule of Thumb, see CEBMa Guideline for Rapid Evidence Assessments in Management and Organizations,  p 20.

7 Sawilowsky, S. S. (2009) New Effect Size Rules of Thumb. Journal of Modern Applied Statistical Methods. Vol 8(2), pp 597–599.

CIPD evidence reviews are available on a range of HR and L&D topics.

Barends, E. and Rousseau, D. M. (2018)  Evidence-based management: how to use evidence to make better organizational decisions . London: Kogan Page.

Barends, E., Rousseau, D. and Briner, R. B. (2014) Evidence-Based Management: the basic principles . Amsterdam, Center for Evidence-Based Management.

Guadalupe, M. (2020) Turn the Office Into a Lab . INSEAD Economics & Finance – Blog.

Petticrew, M. and Roberts, H. (2003)  Evidence, hierarchies, and typologies: horses for courses . Journal Of Epidemiology And Community Health. Vol 57(7): 527.

Pfeffer, J. and Sutton, R. I. (2006) Hard facts, dangerous half-truths, and total nonsense: profiting from evidence-based management. Boston, MA: Harvard Business School Press.

Pindek, S., Kessler, S. R. and Spector, P. E. (2017) A quantitative and qualitative review of what meta-analyses have contributed to our understanding of human resource management. Human Resource Management Review. Vol 27(1), pp 26–38.

Rousseau, D. M. (2006) Is there such a thing as “evidence-based management”? Academy of Management Review. Vol 31(2), pp 256–269.

Rousseau, D. M. (2020) Making Evidence-Based Organizational Decisions in an Uncertain World . Organizational Dynamics. Vol 49(1): 100756.

This report was written by Jonny Gifford and Jake Young of the CIPD and Eric Barends of the Center for Evidence-Based Management (CEBMa).

Please cite this guide as: Gifford, J., Barends, E. and Young, J. (2023) Evidence-based HR: Make better decisions and step up your influence . Guide. London: Chartered Institute of Personnel and Development.


Read our case studies which demonstrate how an evidence-based practice approach helped:

  • Surrey and Sussex Police shape new fair selection processes
  • the BBC reinvigorate its performance management practice
  • the International SOS Foundation improve the support needed for the wellbeing of hybrid workers.


Medicine (Baltimore), 98(39), September 2019

Evidence-based practice

Background:

This study summarizes the status of knowledge, attitudes, implementation, facilitators, and barriers of evidence-based practice (EBP) among community nurses (CNs). EBP has been widely adopted, but the knowledge, attitudes, and implementation of EBP among CNs, and the facilitators and barriers they perceive, have not been clearly established.

Methods:

A literature search was conducted using combined keywords in 3 English databases and 3 Chinese databases of peer-reviewed publications, covering publication dates from 1996 to July 2018. Twenty articles were included. Information on the knowledge, attitudes, implementation, and perceived facilitators and barriers of EBP among CNs was extracted and summarized.

Results:

CNs had positive attitudes toward EBP but insufficient knowledge and were poorly prepared to implement it. The most frequently cited facilitators were academic training, management functions, and younger age. Inadequate time and resources were identified as the main barriers hindering the translation of knowledge and attitudes into implementation. Existing interventions mainly focused on facilitating knowledge rather than eliminating objective barriers.

Conclusions:

The findings demonstrate a compelling need to improve CNs' knowledge and implementation of EBP, which lag behind their comparatively positive attitudes. Beyond education, translating knowledge into implementation requires more coordination with authorities to strengthen the facilitators and overcome the barriers. Further studies should concentrate on the deficient knowledge and implementation of EBP among CNs. Policy makers can use the facilitators and barriers identified in this review to improve nursing education, the supply of current scientific resources, and practical support for better care.

1. Introduction

Nurses provide personal care and treatment, work with families and communities, and play a central part in public health and in controlling disease and infection. These roles have been recognized by the World Health Organization. [1] Community nurses (CNs) combine the skills of nursing, public health, and some phases of social assistance, and they function as part of wider public health programs. [2] CNs provide health care services that contribute to disease and injury prevention, disability alleviation, and health promotion. [1,2] CNs generally work more independently in varied and dynamic community settings, often without a medical diagnosis or physician-directed treatment for the patient or family. They therefore have to think critically, analyze complex situations, perform health assessments, and make decisions. [3] However, CNs do not always base their decisions on up-to-date, high-quality evidence; they often rely on experience instead. [4,5]

The WHO has suggested that improving community health depends on nursing services underpinned by evidence-based practice (EBP). [2] EBP refers to using the best available evidence for decision-making and providing efficient and effective care for patients on a scientific basis. [6] Systematic implementation of EBP can enhance healthcare safety and improve patient outcomes. [7,8] Although EBP is equally important for CNs as for clinical nurses, EBP in community nursing is still at an initial stage. [5]

Researchers have reviewed the importance of nursing leadership in EBP, [9] the state of readiness for EBP, [10] barriers and facilitators in guidelines, [11] and strategies for EBP implementation, [12] but all of these reviews concerned hospital nurses rather than CNs. One study [7] summarized the practical content of EBP in community nursing without analyzing the level of CNs' EBP. Another study [13] reviewed the attitudes, knowledge, and perceptions of CNs regarding EBP, but it was limited to European community settings.

In this review, the knowledge, attitudes, and implementation of EBP among CNs were analyzed globally, along with the facilitators and barriers to CNs' implementation of EBP.

1.1. Aim of study

This review aims to answer the following questions: What is the status of knowledge, attitudes, and implementation of EBP among CNs worldwide? And what facilitators and barriers influence CNs' implementation of EBP?

2. Materials and methods

2.1. Literature search

Literature was retrieved through the library access of the authors' university. A literature search was conducted using combined keywords in 3 English databases (PubMed/MEDLINE, Mag Online Library, Science Direct) and 3 Chinese databases (Chinese Journal Full-Text Database, Wan fang Database, VIP Database for Chinese Technical Periodicals) of peer-reviewed publications, covering publication dates from January 1996 (the earliest year when EBP in primary care was introduced in detail [14]) to July 2018. The following keywords were used: [((primary care nursing) OR (community health nursing) OR (public health nursing)) OR ((primary care) AND (nurse)) OR ((community health) AND (nurse)) OR ((public health) AND (nurse))] AND [(evidence based) OR (evidence-based practice) OR (evidence-based nursing)] AND [(knowledge) OR (skill) OR (attitude) OR (belief) OR (facilitators) OR (barriers)]. The field was limited to “title/abstract” and the publication type was limited to “journal article.” Reference tracking was carried out to identify additional potentially relevant references. The bibliographic citation manager NoteExpress (version V3.2.0, Aegean Corporation, Beijing, China) was used to manage the retrieved studies. No published or in-progress systematic review on this topic was found in the Cochrane Library or the Joanna Briggs Institute Library before this review.
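
For illustration only (this script is not part of the original study), a PubMed search of this kind can be automated against NCBI's E-utilities. The sketch below uses Biopython's Entrez module with a simplified version of the query; the email address is a placeholder that NCBI requires:

```python
# A minimal sketch of scripting a PubMed search via Biopython's Entrez
# wrapper around NCBI's E-utilities. The query is a simplified version of
# the search string above; the date window mirrors the review (1996-2018/07).

from Bio import Entrez

Entrez.email = "you@example.org"  # placeholder; NCBI requires a contact address

query = (
    '("community health nursing"[Title/Abstract] '
    'OR "public health nursing"[Title/Abstract] '
    'OR "primary care nursing"[Title/Abstract]) '
    'AND ("evidence-based practice"[Title/Abstract] '
    'OR "evidence based"[Title/Abstract]) '
    'AND (knowledge[Title/Abstract] OR attitude[Title/Abstract] '
    'OR facilitators[Title/Abstract] OR barriers[Title/Abstract])'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=50,
                        mindate="1996", maxdate="2018/07", datetype="pdat")
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records; first PMIDs: {record['IdList'][:5]}")
```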

2.2. Inclusion and exclusion criteria

Two researchers independently screened titles and abstracts to identify studies of CNs, based on the authors' descriptions and the WHO definition. [2] Inclusion criteria: reports on CNs' EBP knowledge/skills, attitudes/beliefs, or implementation; reports on CNs' perceived barriers to or facilitators of EBP; original scientific studies; written in English or Chinese. Exclusion criteria: reports on EBP theory or frameworks and narrative descriptions of the writer's personal opinion; studies without a clearly defined population or a sub-analysis of CNs, or with a mixed population (hospital and care-organization nurses, or other health professionals); reports on private nursing homes and rural hospitals; systematic reviews; and non-research literature such as conference notices.
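To show how criteria like these can be applied consistently during dual screening, here is a minimal, hypothetical sketch that encodes them as checks over a simple record dictionary; every field name (population, design, themes, and so on) is an illustrative assumption, not something taken from the review:

```python
# Hypothetical screening helper encoding the review's inclusion/exclusion
# criteria. All record field names are illustrative assumptions.
def passes_screening(record: dict) -> bool:
    """Return True if a record meets the inclusion criteria
    and triggers none of the exclusion criteria."""
    included = (
        record.get("population") == "community nurses"
        and record.get("is_original_study", False)
        and record.get("language") in {"English", "Chinese"}
        and any(theme in record.get("themes", ())
                for theme in ("knowledge", "attitude", "implementation",
                              "facilitator", "barrier"))
    )
    excluded = (
        record.get("design") in {"systematic review", "theory/framework",
                                 "personal opinion", "conference notice"}
        or record.get("mixed_population", False)   # e.g., hospital nurses mixed in
        or record.get("setting") in {"private nursing home", "rural hospital"}
    )
    return included and not excluded

# Example: a cross-sectional survey of community nurses' EBP attitudes
example = {"population": "community nurses", "is_original_study": True,
           "language": "English", "themes": ("attitude", "barrier"),
           "design": "cross-sectional survey"}
print(passes_screening(example))  # True
```

In practice each researcher would screen independently and disagreements would be resolved by discussion; a deterministic checklist like this only makes the criteria explicit.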

2.3. Data extraction and analysis

Two researchers extracted the data, including author, year, study design, sampling, outcome measures, and main results. The following aspects of CNs' EBP were extracted: knowledge, attitude, implementation, facilitators, and barriers. Each included study covered at least one of these themes (Table 1).

Table 1. Summary of the included studies. [Table reproduced as an image in the original publication; not shown here.]

2.4. Ethical approval

Ethical approval was not necessary because no human subjects or patient information were collected or studied.

3. Results

3.1. Literature screening

A systematic integrated literature review was conducted following the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) guidance. [46] A total of 2873 articles were retrieved from the databases, and 4 additional articles were identified by reference tracking (Fig. 1). After screening of titles, abstracts, and full texts, 19 English articles and 1 Chinese article were included in this study.

Figure 1. Flowchart of the systematic search and review process. [Figure reproduced as an image in the original publication; not shown here.]

3.2. Study characteristics

As shown in Table 1, the included articles were published between 2004 and 2018 and were conducted in 8 developed countries and 1 developing country. The predominant study design was the cross-sectional survey (n = 11), and sample sizes ranged from 19 to 719. Four studies (20%) used probability sampling. Thirteen studies used previously reported questionnaires [16,18–20,25,26,31,34,35,38,40,44] and 6 used self-developed questionnaires to explore the status of CNs' EBP. Of the studies that used questionnaires (n = 16), 12 reported the reliability or validity of the instrument. Only 2 studies had a response rate under 50%, so the results are likely to reflect the actual situation. Face-to-face interviews were used in 5 studies, allowing participants' feelings to be presented directly. All studies addressed at least 1 of the 5 themes: knowledge (n = 8, 40%), attitude (n = 12, 60%), implementation (n = 14, 70%), facilitators (n = 8, 40%), and barriers (n = 7, 35%).
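As a quick arithmetic check on the theme percentages just cited (assuming, as stated, a denominator of 20 included studies), a few lines of Python reproduce the figures:

```python
# Reproduce the reported theme percentages, assuming the stated
# denominator of 20 included studies.
included_total = 20
theme_counts = {"knowledge": 8, "attitude": 12, "implementation": 14,
                "facilitator": 8, "barrier": 7}
for theme, n in theme_counts.items():
    print(f"{theme}: n = {n} ({n / included_total:.0%})")
# knowledge: 40%, attitude: 60%, implementation: 70%,
# facilitator: 40%, barrier: 35%
```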

3.3. Study quality

No study was excluded at the quality assessment stage, because even lower-quality research may offer valuable insights. [47] Potential researcher bias should be noted: no cross-sectional study described standardized training for data collectors in the use of the measurement tools, nor how confounding factors were handled in the study design or data analysis. In the qualitative studies, no author stated the influence of the researcher on the study or specified a clear philosophical perspective. The quasi-experimental studies did not report whether participants were concurrently involved in other, similar studies, a potential confounding variable. The randomization and allocation concealment of the randomized controlled trial (RCT) were insufficient, and its blinding needed improvement. [48] The mixed-methods studies applied their methods comprehensively, but descriptions of the interview context and the control of confounding variables were deficient. [49]

3.4. Outcome measurements

3.4.1. CNs' knowledge of EBP

CNs' knowledge of EBP was generally unsatisfactory. Overall, 93% (n = 139) of CNs knew little or nothing about EBP and made decisions based on individual experience and consultation. [4,47] Pericas-Beltran et al [23] suggested that a possible reason is that most CNs lack EBP training or education, even though some search for information via Google, e-journals, clinical enquiry websites, institutional websites, and databases. CNs had difficulty with critical thinking, identifying research articles and journals, and evaluating research quality. [41,29,15] Gerrish and Cooke [29] found that only 26% of participants were competent in finding research evidence and 29% in using it. Although one UK study [39] reported that 64% of CNs were able to find research evidence and review it critically, all of its participants were nursing leaders. In addition, participants with higher professional titles have been shown to have greater EBP competency. [50]

3.4.2. CNs' attitudes toward EBP

In contrast to their lack of knowledge, CNs consistently expressed positive attitudes and beliefs about EBP. They agreed that primary care needs to keep pace with the current scientific base and best evidence. [23,27,28] This positive attitude may be associated with experience of evidence-based interventions, years of nursing experience, [37] a leadership role (with more authority and influence), [15] EBP experience, [24] job satisfaction and group cohesion, organizational culture, and readiness for system-wide integration of EBP. [33] However, positive attitudes toward EBP do not imply its implementation. [51] Rutledge and Skelton [15] found that after a 1-year training program focused on EBP facilitation skills, almost all participants were more confident in EBP, yet they had not implemented it in their daily work. Other researchers [51] likewise reported that despite familiarity with EBP, nurses seldom practiced it.

3.4.3. CNs’ implementation of EBP

CNs' positive attitudes toward EBP do not guarantee its implementation. [10] One study [39] found that 97% of respondents agreed or strongly agreed that EBP improves patient care, yet only 5% had learned EBP skills and only 2% had established evidence-based practice by seeking and applying evidence-based summaries. Pereira et al [45] found that CNs scored highly on the belief that EBP will improve care (4.02) but relatively low on sufficiently implementing EBP (2.71). Similar findings of poor implementation were reported in Canada, [35] where 7% of respondents formulated questions and searched for knowledge, 20% used databases, and 22% used research in practice. Some CNs did read nursing journals, which gave them access to relevant information in their fields; nonetheless, this did not lead to EBP implementation. [4]

In a pilot study in the United States, [43] EBP mentors delivered an intervention program to 24 CNs that included EBP teaching content; an EBP toolkit; environmental prompts; mentors who encouraged and supported participants' use of EBP; and EBP training (e.g., how to build searchable questions, find evidence, conduct systematic review and meta-analysis, appraise evidence, and use EBP in clinical decision making). The intervention comprised 2 periods integrating learning and practice: a 16-week educational phase followed by a 12-week project phase. Data were collected at baseline (Time 1), after the 16-week educational phase (Time 2), after the mentored project phase (Time 3), and 9 months after Time 3 (Time 4). The results suggested that the training may be a promising strategy for short-term enhancement of EBP implementation, but the long-term effect remains undetermined. Another study [28] examined a project providing EBP training handbooks and online courses to CNs to enhance evidence-based sexual healthcare: 86% of participants found the training handbooks useful, and all agreed that the online courses helped the implementation of EBP. However, nearly one-third of participants reported not implementing the EBP skills they had learned.

3.4.4. Facilitators and barriers of EBP

Fourteen facilitators and 21 barriers to CNs' application of EBP were identified and divided into 3 themes: characteristics of the evidence (e.g., the presentation, quantity, and quality of the studies); characteristics of the environment, that is, facilitators and barriers perceived in the work settings; and characteristics of the nurses, that is, the nurses' values, skills, and awareness of EBP (Table 2).

Table 2. Facilitators and barriers of CNs' EBP. [Table reproduced as an image in the original publication; not shown here.]

4. Discussion

This review found that CNs were interested in EBP and believed that applying it improves the quality of care. [23] However, lack of knowledge and skills, together with other barriers, limited the implementation of EBP. [52] Numerous teaching approaches, such as small-group exercises, article review and critique, case studies, literature searching, and scenario simulation training, have shown promise in improving EBP knowledge and beliefs. [53] EBP as a scientific approach is easy to accept but difficult to acquire and apply. [54] This review found that researchers tended to focus on cultivating EBP knowledge and interest rather than on its implementation; yet ensuring implementation is the ultimate aim.

Understanding and identifying the facilitators of and barriers to EBP may be the cornerstone of successful knowledge transfer. [55] This review found that most facilitators were related to individual knowledge and beliefs. Administrators tend to practice more EBP because they have more authority and coordinating power than general nurses. [56] Younger age and shorter working experience were also recognized as promoting factors, which may reflect academic training: [22] these CNs received modern nursing curricula that include EBP, and they were inclined to apply it with the awareness and ability they had been taught. [17] Effective implementation is also associated with an organizational culture that values EBP, that is, an "evidence-based culture" perceived and created by community healthcare providers and managers. [41,33,22] CNs working in an evidence-based group with stronger EBP beliefs and culture are more likely to support new findings and scientific practice. [57]

Barriers to EBP among CNs mostly clustered at the environmental level, with time and resources cited most often. [58] When the workload is too heavy, nurses are less likely to search for and apply evidence. [55,58] Nurses with access to more resources, such as electronic databases, libraries, and professional guidelines, tend to rely more on scientific evidence. [59,30] Four articles cited inadequate knowledge as a barrier and indicated that current academic education programs do not adequately prepare nurses for EBP implementation. [23,15,28,59,60] Barriers related to the evidence itself were relatively few, and these can be overcome by providing more evidence resources, peer support, and literature screening skills. Ultimately, all of the cited barriers can be ascribed to 3 factors: inadequate time and resources, inadequate knowledge and training, and inadequate encouragement and assistance from organizations. Researchers have recognized these barriers; however, workable, comprehensive approaches to overcoming them are lacking. The measures reported in the included studies were generally educational programs, which are effective for improving knowledge; more attention should be paid to EBP implementation itself. Barriers must be addressed not only through education and training but also through the comprehensive elimination of objective obstacles. Policy support and institutional protection are not a choice but a necessity. More investment in resources, the nursing workforce, nursing guidance, and EBP approaches is needed for sound EBP implementation, so that CNs can have adequate research time and resources, a better operating environment for EBP, and additional support from staff and management.

4.1. Limitations

One limitation is that, although the electronic databases were searched systematically, some relevant literature may have been missed, as in any review. [61] To mitigate this, 2 researchers searched the literature independently and saved all eligible articles as completely as possible. Language and publication bias remain possible even though the scope of the review was worldwide.

4.1.1. Implications for nursing policy

This review points to the need for relevant training in EBP knowledge. Nursing policy reform should provide a systematic EBP curriculum for nursing students and manageable continuing education for practicing nurses. Removing barriers must then be a priority for authorities, in order to close the gap between CNs' EBP knowledge and their implementation of it. More policies supporting EBP among CNs are expected, such as protected research time, resource provision for community health institutions, and accountable EBP coordination.

5. Conclusions

The findings suggest that the EBP of most CNs is unsatisfactory. Although they express positive attitudes and believe in the value of EBP for improving nursing practice and patient outcomes, they lack the matching knowledge and skills, such as finding appropriate evidence. Implementation is weaker still: several interventions do improve knowledge, but how to ensure that the abilities CNs acquire are carried into professional action remains to be explored.

The facilitators of CNs' EBP mostly belong to the "nurses" theme and relate to improving EBP ability and values at both the individual and organizational levels. Barriers mostly belong to the environment theme, and all can be attributed to the following factors: lack of time and resources, lack of knowledge and training, and inadequate encouragement and assistance. Organizations must ensure that the required resources and support are available to CNs. Strong experimental designs are needed to assess the long-term effects of EBP training strategies accurately, and more research should be devoted to removing objective barriers to EBP implementation.

Author contributions

Conceptualization: Shu Li, Meijuan Cao, Xuejiao Zhu.

Data curation: Shu Li, Meijuan Cao.

Formal analysis: Shu Li, Meijuan Cao.

Project administration: Xuejiao Zhu.

Writing – original draft: Shu Li, Meijuan Cao, Xuejiao Zhu.

Abbreviations: CNs = community nurses, EBP = evidence-based practice.

How to cite this article: Li S, Cao M, Zhu X. Evidence-based practice. Medicine . 2019;98:39(e17209).

This article was funded by the 2018 Zhejiang Medical and Health Science and Technology Plan Project (Zhejiang Health and Family Planning Commission, File No. 2017/66) and is part of the Construction of Community Chronic Disease Self-Management Scheme Integrating Patient-Volunteer and Medical Staff (Ref. 2018KY146).

The authors have no conflicts of interest to disclose.

Integrate Evidence With Clinical Expertise and Patient Preferences and Values


Nursing is often referred to as both an art and a science. Evidence-based practitioners must combine understanding the science of health, illness, and disease with the art of adapting care to individual patients and situations, all while thinking critically to improve patient outcomes (see Figure 1).


Clinical Expertise

Applying the best evidence to our clinical decision making involves examining, critiquing, and synthesizing the available research evidence. However, we must consider the science along with our clinical experience and patients’ values, beliefs, and preferences. In this article we’ll discuss how to incorporate patient preferences and clinical judgment into evidence-based decision making.

Good clinical judgment integrates our accumulated wealth of knowledge from patient care experiences as well as our educational background. We learn quickly as healthcare professionals that one size does not fit all. What works for one patient with fatigue may not work for another. What we can do is draw from our clinical expertise and past experiences to inform our decisions going forward. Our clinical expertise, combined with the best available scientific evidence, allows us to provide patients with the options they need. Patients can’t have a preference if they aren’t given a choice, and they can’t make that choice if they aren’t presented with all options.

Patient Preferences and Values

When we talk about including patient preferences in treatment and care decisions, what does this mean? Patient preferences can be religious or spiritual values, social and cultural values, thoughts about what constitutes quality of life, personal priorities, and beliefs about health. Even though healthcare providers know and understand that they should seek patient input into decisions about patient care, this does not always happen because of barriers such as time constraints, literacy, previous knowledge, and gender, race, and sociocultural influences. 

However, we can learn tools and techniques to help elicit patient preferences. First and foremost, we must listen to our patients. Developing our own interpersonal skills is important in enabling us to have a conversation with patients rather than just deliver information. We should also involve family members if patients so desire. The ASK (AskShareKnow) Patient–Clinician Communication Model is a tool that teaches patients and families three questions to ask their healthcare providers to get the information they need to make healthcare decisions:

  • What are my options?
  • What are the possible benefits and harms of those options?
  • How likely are each of those benefits and harms to happen to me, and what will happen if I do nothing?

The ASK model was tested by introducing the questions to patients before they met with their healthcare providers. Patients went on to ask one or more of the questions during the visit, and they could still recall the questions weeks later.

Using the best available scientific evidence by itself is not enough to care for our patients in an evidence-based environment. We must also incorporate our clinical expertise and patients' preferences and values, joining the art with the science, to see patient outcomes improve.


About the College of Osteopathic Medicine

New York Institute of Technology College of Osteopathic Medicine (NYITCOM) is committed to training osteopathic physicians for a lifetime of learning and practice, based upon the integration of evidence-based knowledge, critical thinking, and the tenets of osteopathic principles and practice.


We are also committed to preparing osteopathic physicians for careers in healthcare, including in inner-city and rural communities, as well as to the scholarly pursuit of new knowledge concerning health and disease. We provide a continuum of educational experiences to NYITCOM students, extending through the clinical and postgraduate years of training. This continuum gives the future osteopathic physician the foundation necessary to maintain competence and compassion, as well as the ability to better serve society through research, teaching, and leadership.

To advance patient-centered, population-based osteopathic healthcare through transformative education and illuminating research.

Educational Outcomes

Our graduates will be able to:

  • Apply osteopathic principles and outcomes;
  • Demonstrate medical knowledge by describing and applying concepts and techniques;
  • Provide patient care that is compassionate, appropriate, and effective;
  • Communicate effectively with patients as well as other healthcare professionals;
  • Conduct himself/herself professionally and ethically;
  • Apply practice-based learning and improvement through evaluation and a commitment to continuous improvement; and
  • Understand healthcare delivery systems.

The College of Osteopathic Medicine traces its roots to W. Kenneth Riland, D.O., and a group of visionary osteopathic physicians practicing in the State of New York. Dr. Riland was the personal physician to former President Nixon, Vice President Nelson Rockefeller, and Secretary of State Henry Kissinger. Dr. Riland and his colleagues saw the establishment of the medical school as a way to promote and strengthen the credibility of osteopathic medicine and leveraged the support of Rockefeller and other political leaders to establish the New York College of Osteopathic Medicine in 1977. The college changed its name to the NYIT College of Osteopathic Medicine in 2012.

Accreditation


The  AOA Commission on Osteopathic College Accreditation (COCA)  serves the public by establishing, maintaining, and applying accreditation standards and procedures to ensure that academic quality and continuous quality improvement delivered by the colleges of osteopathic medicine reflect the evolving practice of osteopathic medicine. The scope of the COCA encompasses the accreditation of the colleges of osteopathic medicine.

NYITCOM most recently received a seven-year accreditation from the COCA. Any student who has a complaint related to the COCA accreditation standards and procedures should file the complaint, confidentially, with the following:

The American Osteopathic Association, Department of Accreditation, 142 East Ontario, Chicago, IL 60611-2864; 312.202.8124; [email protected]

