Key point from our efforts: EBP/research learning should be fun, even when the content is serious!
The related publication that records some of our fun efforts and the full collaborative picture: Highfield, M.E.F., Collier, A., Collins, M., & Crowley, M. (2016). Partnering to promote evidence-based practice in a community hospital: Implications for nursing professional development specialists. Journal of Nursing Staff Development, 32(3), 130-136. doi: 10.1097/NND.0000000000000227
Yes. It is easier to do things the way we’ve always done them (and been seemingly successful).
Yet, most of us want to work more efficiently or improve our own or patients’ health.
So, there you have the problem: a tension between status quo and change. Perhaps taking the easy status quo is why ‘everyday nurses’ don’t read research.
Ralph (2017) writes of encountering three common mindsets that keep nurses stuck in the rut of refusing to examine new research:
I’m not a researcher.
I don’t value research.
I don’t have time to read research.
But, he argues, you have a choice: you can go with the status quo or challenge it (Ralph, 2017). And (admit it), haven’t we all found that the status quo sometimes doesn’t work well, so that we end up
choosing a “work around,” or
ignoring/avoiding the problem or
leaving the problem for someone else or
…[well, you pick an action].
How to begin solving the problem of not reading research? Think of a topic that is super-interesting to you and make a quick trip to PubMed (https://pubmed.ncbi.nlm.nih.gov). Check out a few relevant abstracts and ask your librarian to get the articles for you. Read them in the nurses’ lounge so others can, too.
Let me know how your challenge to the status quo works out.
Bibliography: Ralph, N. (2017). Editorial: Engaging with research & evidence is a nursing priority, so why are ‘everyday’ nurses not reading the literature? ACORN, 30(3), 3-5. doi: 10.26550/303/3.5. Full text is available for download through https://www.researchgate.net/
Reliability & validity are terms that refer to the consistency and accuracy of a quantitative measure, whether it is a questionnaire, a technical device, a ruler, or any other measuring instrument. Together they mean that the outcome measure can be trusted and is relatively error free.
Reliability – This means that the instrument measures CONSISTENTLY.
Validity – This means that the instrument measures ACCURATELY. In other words, it measures what it is supposed to measure and not something else.
For example: if your bathroom scale measures weight (and not BP or stress), then it is a valid measure of weight. You might say it has high validity. If your bathroom scale gives the same reading when you step on and off it several times, then it is measuring weight reliably, or consistently, and you might say it has high reliability.
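To make the scale example concrete, here is a minimal sketch of how one might check repeated readings for consistency (reliability) using the standard deviation. The readings and the tolerance threshold are made-up illustrative assumptions, not a published standard. Note that validity cannot be checked this way: a scale could give the same wrong number every time.

```python
import statistics

def is_reliable(readings, tolerance=0.5):
    """Rough reliability check: repeated measurements of the same
    quantity should cluster tightly. Here a standard deviation at or
    below `tolerance` (same units as the readings) counts as
    'consistent'. The threshold is an illustrative assumption."""
    return statistics.stdev(readings) <= tolerance

# Stepping on and off the bathroom scale five times (kg, made-up data):
consistent_scale = [70.1, 70.0, 70.2, 70.1, 70.0]
erratic_scale = [70.1, 68.4, 72.9, 69.2, 71.8]

print(is_reliable(consistent_scale))  # tightly clustered -> True
print(is_reliable(erratic_scale))     # widely scattered  -> False
```

A real reliability analysis would use established statistics (e.g., test-retest correlation or Cronbach's alpha for questionnaires), but the idea is the same: consistent readings on repetition.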
Evidence-based practice (EBP) means that you must critique/synthesize evidence and then apply it to particular settings and populations using your best judgment. This means that you must be discriminating about when (and when NOT) to apply the research. Be sure to use your best professional judgment to particularize your actions to the situation!
Actually, when it comes to quantitative data, there are four levels, but who’s counting? (Besides Goldilocks.)
Nominal (categorical) data are names or categories: (gender, religious affiliation, days of the week, yes or no, and so on)
Ordinal data are like the pain scale. Each number is higher (or lower) than the next, but the distances between numbers are not equal. In other words, 4 is not necessarily twice as much as 2, and 5 is not half of 10.
Interval data are like degrees on a thermometer. There is equal distance between values, but no true “0”: 0 degrees doesn’t mean “no temperature”; it’s just really, really cold.
Ratio data have a true 0 and equal intervals (e.g., weight, annual salary, mg).
(Of course if you want to collect QUALitative word data, that’s closest to categorical/nominal, but you don’t count ANYTHING. More on that another time.)
The difference between research and evidence-based practice (EBP) can sometimes be confusing, but the contrast between them is sharp. I think most of the confusion arises because those implementing both processes measure outcomes. Here are the differences:
RESEARCH: The process of research (formulating an answerable question, designing project methods, collecting and analyzing the data, and interpreting the meaning of results) is creating knowledge (AKA creating research evidence). A research project that has been written up IS evidence that can be used in practice. The process of research is guided by the scientific method.
EVIDENCE-BASED PRACTICE: EBP is using existing knowledge (AKA using research evidence) in practice. While researchers create new knowledge, EBP clinicians apply it.
The creation of evidence necessarily precedes its application to practice: something must be made before it can be used. When research findings are applied to practice, we say the practice is evidence-based.
A good analogy for how research & EBP differ & work together can be seen in autos.
Designers & factory workers create new cars.
Drivers use existing cars that they choose according to preferences and best judgments about safety.
CRITICAL THINKING: 1) Why is the common phrase “evidence-based research” unclear? Should you use it? Why or why not? 2) What is a clinical question you now face (e.g., C. diff spread, nurse morale on your unit, managing neuropathic pain)? Think about how the Stetler EBP model at http://www.nccmt.ca/registry/resource/pdf/83.pdf might help. 3) Given that you will be measuring outcomes, why is this still considered EBP?