Happy 2015! I hope Christmas 2014 was a good one. It was my daughter's second Christmas and I loved watching her run around excitedly with all her new wrapping paper, ignoring the presents it had been covering, of course!
I also love the classic films that were rolled out over the festive period. This year I found myself watching arguably the best – certainly the most famous – spaghetti Western of them all, starring Clint Eastwood: ‘The Good, the Bad and the Ugly’ (why are they called ‘spaghetti’, by the way?).
Not that I am a huge fan of Westerns, but it's the best link that I could come up with for the topic of this blog: discussing the Good, the Bad and the Unsure around the evidence for teaching evidence-based medicine (EBM) – or evidence-based practice (EBP) or health care (EBHC), should you prefer. I'll refer to EBP to cover all these from here on in.
EBP as a recognised discipline within the field of medicine is relatively young (there’s a nice podcast with Richard Smith discussing the history of EBP with its founding fathers here). Dave Sackett, a founding father, defines EBP as ‘the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients.’
He also describes what EBP is not: ‘Evidence based medicine is neither old hat nor impossible to practice. The argument that “everyone already is doing it” falls before evidence of striking variations in both the integration of patient values into our clinical behaviour and in the rates with which clinicians provide interventions to their patients.’
In terms of teaching EBP there has been growing recognition that healthcare professionals need to be trained in the methods of EBP, with the General Medical Council advocating that graduates must ‘…be able to integrate and critically evaluate from all these sources to provide a firm foundation in medical practice.’
Medical schools – at least in the UK – are attempting to address the challenge, although much work is needed.
So the long and short of it is that EBP is important for the care of patients, health care workers might not be practicing it and therefore they should be taught about it at medical school. But what’s the evidence (good, bad and unsure) that teaching EBP actually makes a difference to the practice of EBP, and more importantly to health care outcomes?
In a 2002 edition of JAMA, Rose Hatala and Gordon Guyatt discussed the evidence for teaching EBP and the difficulties in generating it. They point out that a 2001 Cochrane systematic review looking at the evidence for teaching EBP found one eligible study, which showed a 19% improvement in critical appraisal knowledge of health professionals randomised to critical appraisal teaching compared to a control.
The 2011 updated version of the same Cochrane review found two additional eligible studies that confirmed an improvement in knowledge and critical appraisal skills of interns and practising physicians.
A recent overview of reviews (basically a systematic review of systematic reviews) by Young and colleagues published in PLoS ONE included systematic reviews assessing at least one element of teaching EBHC, to provide an ‘overall’ assessment of its effectiveness. The overview included 16 reviews published between 1993 and 2011 with a total of 81 separate studies overlapping across the reviews (37 studies were included in more than one review). Four reviews were classed as high quality.
That's a considerable increase in the number of primary studies (and systematic reviews of these) since Hatala and Guyatt's 2002 paper, and reflects a marked response to the need for evidence in this area.
The overview found, perhaps predictably, that multifaceted interventions (i.e. delivered in multiple formats: lectures, workshops, small groups, tutorials, online modules) that were clinically integrated (real-life scenarios) and included assessment led to improvements in knowledge, skills and attitudes.
Another good thing is that EBM enthusiasts are aware of the issues with teaching and practicing EBM and are happy to discuss these openly and honestly. A nice example is evident in the June 2014 issue of the BMJ. Trish Greenhalgh and colleagues discuss whether EBM is a movement in crisis, highlighting that there is a need to return to the founding principles of EBM – “to individualise evidence and shared decisions in the context of a humanistic and professional clinician-patient relationship”.
The authors highlight the good work in teaching the basics of EBM – asking relevant, focused questions, finding the best evidence and critical appraisal of it – but call for a move towards the use of real case examples, encouraging intuitive reasoning in the clinic and at the bedside, and teaching how to share both evidence and uncertainty with patients using decision aids, adapting to individual needs, circumstances, and preferences.
The Hatala paper highlighted two shortcomings of the evidence for teaching EBP available at the time. First was a lack of validated outcome measures, with too much reliance on self-reports and learner satisfaction questionnaires. Second was a struggle to assess learners' behaviours and patient outcomes.
Twelve years on, both the Cochrane review and the recent overview indicate that these problems remain, particularly as Young et al identify that none of the 81 primary studies across the 16 systematic reviews reported practice-related outcomes. The difficulty with assessing these outcomes lies in the size and the duration of study required to observe any meaningful effect.
The quality of 12 of the 16 systematic reviews was reported as poor, with a high risk of missed studies and reporting and publication bias. However, it is important to note that using patient health measures may not be feasible – unless the scale of programmes is very large and sustained with follow-up over many years – hence the lack of rigorous data on patient outcomes.
Jonathan Mant and Nicholas Hicks suggest we concentrate on immediate measures (knowledge, skills, attitudes, behaviour) and then seek to establish a relationship with health outcomes through observational data. Sounds good to me.
Notwithstanding the issues around practitioner behaviour and patient-related outcomes already mentioned, there are a number of areas of uncertainty that I think exist within EBP, both in terms of the practice and teaching thereof.
There are likely to be areas within medicine where performing the 5A's of EBP is not feasible. An example here might help. Let's say an interested paramedic has recently completed an EBM course and as a result has a new confidence in applying the 5A's of EBP. She sees on the BBC website that a new trial is being conducted to assess whether giving adrenaline after cardiac arrest actually does more harm than good. She agrees this is a good clinical question (1st A of EBP) and makes a note to look out for the ensuing publication.
She is one of the first to find the related paper when it's published (2nd A of EBP) and uses her critical appraisal skills (3rd A of EBP) to determine that the statistically and clinically significant finding of a 24% absolute reduction in risk of death is a real, unbiased finding. Armed with this knowledge, she now wishes to apply it to her practice. So for the next cardiac arrest patient she sees, she decides to withhold adrenaline. What do you think will be the consequences of her actions?
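As a side note on the appraisal step: the headline numbers in the scenario are invented, but the arithmetic a reader would apply to them is standard. A minimal sketch (with hypothetical event counts chosen to reproduce the 24% figure) of computing absolute risk reduction (ARR), relative risk reduction (RRR) and number needed to treat (NNT):

```python
def risk_measures(events_control, n_control, events_treatment, n_treatment):
    """Return (ARR, RRR, NNT) for treatment vs control."""
    risk_control = events_control / n_control
    risk_treatment = events_treatment / n_treatment
    arr = risk_control - risk_treatment  # absolute risk reduction
    rrr = arr / risk_control             # relative risk reduction
    nnt = 1 / arr                        # number needed to treat
    return arr, rrr, nnt

# Hypothetical counts: 60/100 deaths in the control arm vs 36/100 with treatment.
arr, rrr, nnt = risk_measures(60, 100, 36, 100)
print(f"ARR = {arr:.0%}, RRR = {rrr:.0%}, NNT = {nnt:.1f}")
# prints: ARR = 24%, RRR = 40%, NNT = 4.2
```

Note how the same trial could be reported as a 24% absolute reduction or a 40% relative reduction – part of why critical appraisal skills matter.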
I know the example above is extreme (and doesn't take into account the other aspect of EBP – patient values or preferences [note the difficulty of this element for this example]), but it is not unimaginable. There are perhaps a number of areas where similar restrictions would make applying the 5A's equally difficult (e.g. blacklisted pharmacotherapies). Equally, there are likely areas of medicine where the EBP rules can be fully applied. It would be useful to identify and address these when teaching EBP.
So where do we go from here? Unfortunately it's a case of ‘more and better research needed’, I'm afraid. To borrow a phrase from another Eastwood classic: “Go ahead, make my day.”
Also posted on David Nunan's Blog & www.cebm.net