Category Archives: evidence based practice

Unexpected Evidence in the Science of Lockdowns

Headlines are blaring: “New study shows that lockdowns had minimal effect on COVID-19 mortality.”

The January 2022 systematic review and meta-analysis that underlies that news is Herby, Jonung, and Hanke's "A Literature Review and Meta-Analysis of the Effects of Lockdowns on COVID-19 Mortality" in Applied Economics.

Scientists label systematic reviews and meta-analyses as the strongest type of scientific evidence (pyramid of evidence). Of course the strength of the systematic review/meta-analysis depends on whether it is well or poorly done, so never put your research-critique brain in neutral. This one seems well done.

In systematic reviews, researchers follow a methodical, focused process that describes their selection and analysis of all studies on a topic. Meta-analyses then pool the data from those selected studies and analyze them statistically as if they were a single large study. Researchers specify their process and parameters for selecting studies, and they typically publish a table of evidence that summarizes key information about each study. Herby et al. did so. (Note: systematic reviews should not be confused with integrative reviews, in which authors are less systematic and mainly provide background information.)
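For readers curious what "pooling" actually looks like, the core arithmetic of one common approach (fixed-effect, inverse-variance weighting) can be sketched in a few lines. The numbers below are purely hypothetical, chosen for illustration; they are not taken from Herby et al. or any other study discussed here.

```python
import math

# Hypothetical (effect estimate, standard error) pairs from three studies.
# These values are made up for illustration only.
studies = [(-0.10, 0.05), (0.02, 0.08), (-0.04, 0.06)]

# Fixed-effect inverse-variance pooling: each study's weight is 1/SE^2,
# so more precise studies contribute more to the combined estimate.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```

Real meta-analyses add considerably more (random-effects models, heterogeneity statistics, publication-bias checks), but the sketch shows why a meta-analysis is treated as a single, larger study: precise studies are weighted up and the pooled standard error shrinks as studies accumulate.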

For example, from Herby et al’s study cited above: “This study employed a systematic search and screening procedure in which 18,590 studies are identified… After three levels of screening, 34 studies ultimately qualified. Of those 34 eligible studies, 24 qualified for inclusion in the meta-analysis. They were separated into three groups: lockdown stringency index studies, shelter-in-place-order (SIPO) studies, and specific [non-pharmaceutical intervention] NPI studies. An analysis of each of these three groups support the conclusion that lockdowns have had little to no effect on COVID-19 mortality.”

See the full publication below. Rather than reading it beginning to end, first 1) read the abstract; 2) identify the parameters used to select the 34 eligible studies and the 24 meta-analysis studies; 3) scan the table of evidence; and 4) read the discussion beginning on page 40. Then read the complete article, and cut yourself some slack: just try to understand what you can, depending on your research expertise.

What do you think? Are the studies that support their conclusions strong? What are the SCIENTIFIC objections to their conclusions? What do they identify as policy implications, and do you agree or disagree?

[NOTE THAT THIS ARTICLE LINK MAY BE GOOD FOR ONLY 30 DAYS, but a librarian can help you get it after that.] Happy evidence hunting.

EBP: OpEd – What it is and what it isn’t

Evidence-based nursing. I have heard and seen the terms evidence-based nursing and evidence-based practice sometimes misused by well-educated RNs. Want to know what it is? Here's the secret (or at least some things you should think about, says Dr. Ingersoll): https://www.nursingoutlook.org/article/S0029-6554(00)76732-7/pdf

First, she rightly differentiates 2 processes: research as discovery and evidence-based practice as application.

Ingersoll also argues that best evidence may include more than the much-vaunted systematic reviews or randomized controlled trials. Relying only on systematic, scientific research findings, she argues, is not enough to guide evidence-based practice. Her arguments provide a basis for discussion with those who might disagree.

Positivist pyramid

[Note: Ingersoll uses the term “positivist thinking” at one point. For those uncertain about the term, I would define positivists as those who assume that reality and truth are objective, measurable, and discoverable by a detached, impartial researcher. Positivism underlies the empirical scientific process that most readers think of when they hear the word research.]

Do you agree with her that anecdotal and traditional knowledge make valuable contributions to evidence-based practice? Your thoughts about her thoughts?

“How many articles are enough?” Is that even the right question?

How do you know when you have found enough research evidence on a topic to be able to use the findings in clinical practice? How many articles are enough? 5? 50? 100? 1000? Good question!

You have probably heard general rules like these for finding enough applicable evidence: stick close to your key search terms derived from your PICOT statement of the problem; use only research published in the last 5-7 years unless it is a "classic"; and find randomized controlled trials (RCTs), meta-analyses, and systematic reviews of RCTs that document cause-and-effect relationships. Yes, those are good strategies. The only problem is that sometimes they don't work!

Unfortunately, some clinical issues are “orphan topics.” No one has adequately researched them. And while there may be a few, well-done, valuable published studies on the topic, those studies may simply describe bits of the phenomenon or focus on how to measure the phenomenon (i.e., instrument development). They may give us little to no information on correlation and causation. There may be no RCTs. This situation may tempt us just to discard our clinical issue and to wait for more research (or of course to do research), but either could take years.

In her classic 1998 one-page article, "When is enough, enough?" Dr. Carol Deets argues that asking how many research reports we need before applying the evidence may be the wrong question! Instead, she proposes, we should ask, "What should be done to evaluate the implementation of research findings in the clinical setting?"

When research evidence is minimal, then careful process and outcome evaluation of its use in clinical practice can: 1) Keep patient safety as the top priority, 2) Document cost-effectiveness and efficacy of new interventions, and 3) Facilitate swift, ethical use of findings that contributes to nursing knowledge. At the same time, Deets recognizes that for many this idea may be revolutionary, requiring us to change the way we think.

So back to the original question…How many articles are enough? Deets’ answer? “One study is enough” if we build in strong evaluation as we translate it into practice.

Reference: Deets, C. (1998). When is enough, enough? Journal of Professional Nursing, 14(4), 196. https://doi.org/10.1016/S8755-7223(98)80058-6

Reposting: Dispelling the Nice or Naughty Myth – Retrospective Observational Study of Santa Claus

Check out this re-post of my Christmas-y blog:

https://discoveringyourinnerscientist.com/2018/01/04/dispelling-the-nice-or-naughty-myth-retrospective-observational-study-of-santa-claus/

Evidence-based Practice Institute

I recommend this event. I have no conflict of interest.

New virtual EBP Institute – the Advanced Practice Institute: Promoting Adoption of Evidence-Based Practice is going virtual this October.

This Institute is a unique advanced program designed to build skills in the most challenging steps of the evidence-based practice process and in creating an organizational infrastructure to support evidence-based health care. Participants will learn how to implement, evaluate, and sustain EBP changes in complex health care systems. 

Each participant also receives Evidence-Based Practice in Action: Comprehensive Strategies, Tools, and Tips From the University of Iowa Hospitals and Clinics. This book is an application-oriented EBP resource organized based on the latest Iowa Model and can be used with any practice change. The Institute will include tools and strategies directly from the book.

3-Day Virtual Institute

Wednesday, October 7

Wednesday, October 14

Wednesday, October 21

(participation is required for all 3 days)

Special pricing for this virtual institute: 5 participants from the same institution for the price of 4

Learn more and register for the October 2020 Advanced Practice Institute: Promoting Adoption of Evidence-Based Practice. 

Kristen Rempel

Administrative Services Specialist Nursing Research & Evidence-Based Practice

University of Iowa Health Care | Department of Nursing Services and Patient Care

200 Hawkins Dr, T155 GH, Iowa City, IA 52242 | 319-384-6737

uihc.org/nursing-research-and-evidence-based-practice-and quality

Homemade cloth masks: The Good, the Bad, & the Ugly

So I've been pretty skeptical about people sewing protective face masks at home. And, as with a lot of things, we don't have all the data we wish we had. So I'm putting this scientific evidence out there and encouraging you to contribute to this blog by adding other scientific data.


Here are data from a 2015 randomized controlled trial (RCT): penetration of cloth masks by particles was almost 97%, and of medical masks 44% (N = 1,607 HCWs > 18 years).

Nevertheless, the expert opinion at CDC is that they are in the "better than nothing" category, and it gives this additional advice: "In settings where N95 respirators are so limited that routinely practiced standards of care for wearing N95 respirators and equivalent or higher level of protection respirators are no longer possible, and surgical masks are not available, as a last resort, it may be necessary for HCP to use masks that have never been evaluated or approved by NIOSH or homemade masks. It may be considered to use these masks for care of patients with COVID-19, tuberculosis, measles, and varicella. However, caution should be exercised when considering this option."1,2

Anecdotally, providers are using them to extend the life of other masks or N95s. Some sewists are also making masks with small pockets for additional filters, and a material called HANIBON, which can be purchased online, is often used on the outer layer of disposable masks to block dust and fluids.

References

  1. Dato VM, Hostler D, Hahn ME. Simple respiratory mask. Emerg Infect Dis. 2006;12(6):1033–1034.
  2. Rengasamy S, Eimer B, Shaffer R. Simple respiratory protection—evaluation of the filtration performance of cloth masks and common fabric materials against 20–1000 nm size particles. Ann Occup Hyg. 2010;54(7):789–798.

“Sew” there you have it. Expert opinion is that as a last resort you may use inadequately tested cloth masks if it is all you have. I am grateful for all those sewists out there responding to medical center calls to supply them with cotton and elastic homemade masks, and sending out the patterns to do so. Field medicine.

CDC also says, "The filters used in modern surgical masks and respirators are considered 'fibrous' in nature—constructed from flat, nonwoven mats of fine fibers." If this is true, would nonwoven interfacing improve homemade masks?

A practical place to start

Enrolled in an MSN… and wondering what to do for an evidence-based clinical project?

Recently a former student contacted me about that very question. Part of my response to her is below:

"One good place to start, if you are flexible on your topic, is to look through Cochrane Reviews, Joanna Briggs Institute, AHRQ Clinical Practice Guidelines, or similar for very strong evidence on a particular topic and then work to move that into practice in some way. (E.g., right now I'm involved in a project on using evidence from a Cochrane review on the benefits of music listening–not therapy–in improving patient outcomes like pain, mood, & opioid use.)

Once you narrow the topic, it will get easier. Also, you can apply only the best evidence you have, so if there isn't much research or other evidence about the topic, you might have to tackle the problem from a different angle or pick an area where there IS enough evidence to apply."

Blessings! -Dr.H

Challenges to “Medical dogma” – Practice your EBP skills

Medscape just came out with Eric J. Topol's article, 15 Studies That Challenged Medical Dogma in 2019. Critically check it out to practice your skills in applying evidence to practice. What are the implications for your practice? Are more or stronger studies needed before this overturning of dogma becomes simply more dogma? Are the resources and people's readiness there for any warranted change? If not, what needs to happen? What are the risks of adopting these findings into practice?

Your thoughts? https://www.medscape.com/viewarticle/923150?src=soc_fb_share&fbclid=IwAR1SBNNVGW6BBWuKw7zBjhWIoQoMGtXZCy-BwpTTyavHSxmLleJuliKKG4A

"Two roads diverged in a yellow wood…" –R. Frost

TIME TO REPUBLISH THIS ONE:

Below is my adaptation of one of the clearest representations that I have ever seen of when the roads diverge into quality improvement, evidence-based practice, & research. Well done, Dr. E. Schenk, PhD, MHI, RN-BC!

[qi-ebp-research-flow-chart]