Evidence Based Practice in Allied Health: A Time for Reflection and Refocus?
I was recently at an international conference where I was privileged to present on a range of research initiatives focussed on evidence implementation specifically, and Evidence Based Practice (EBP) generally. Conferences provide a wonderful opportunity to meet and network with peers and garner their feedback and perspectives on your research. After one of my presentations, I was approached by an audience member. For the purpose of this exercise, let’s call her Sally.
Sally was a manager of an allied health discipline at a hospital. She had a small but enthusiastic group of health professionals who were increasingly keen to undertake EBP and embed evidence into their everyday clinical practice. However, they had very quickly encountered several barriers that are all too commonly reported in the EBP literature. Sally mentioned to me that while her staff were keen to embrace EBP, they had significant difficulties in accessing, evaluating and synthesising research evidence, let alone implementing it in clinical practice.
This got me thinking about the basic concepts underpinning EBP. If we go back to the dawn of EBP, it is widely acknowledged that EBP has five distinct steps. As the first step, a health care practitioner is required to formulate a searchable, answerable clinical question using a determined format (such as PICO or PECOT). As the second step, the question is operationalised using a search strategy to identify and access the best available research evidence. As the third step, the research evidence is appraised and synthesised into a form that is ready to be implemented in clinical practice. As the fourth step, the appraised and synthesised evidence is applied in clinical practice (taking into account the clinician’s expertise and patient values). As the fifth and final step, practice is evaluated to identify and monitor changes attained as a result of evidence implementation. While this is a well-known and widely accepted theoretical model of EBP, it is built on several assumptions about resource availability and human behaviours, and practical reality suggests otherwise (as witnessed by Sally’s story).
Now, don’t worry - I am not going to list every single barrier to EBP ever reported. What I do want to highlight is that this is a message we hear from practitioners consistently. There is an expectation that a health care practitioner has the competence to adequately convert a clinical problem into an answerable clinical question using the PICO or PECOT format. Our experience at CAHE tells us otherwise. The expectation that a health care practitioner can operationalise the clinical question by developing an appropriate search strategy, selecting relevant databases, conducting the search, trawling through volumes of literature and identifying the best available evidence seems far-fetched. Similarly, appraising and synthesising research evidence is a highly skilled task; several CAHE staff, working on numerous projects over a number of years, continue to fine-tune these skills. The current reality is that health care practitioners may not have the requisite knowledge, skills, time and resources to adequately undertake all these tasks.
So, what does all this mean for EBP in allied health? Is it all doom and gloom? Well, not quite. Given that EBP as a philosophy originated almost two decades ago, maybe the time is ripe for reflection and refocus. By expecting the allied health practitioner to be a “jack of all trades,” it is possible they are drowning in a tsunami of expectations. As clinicians, allied health practitioners should be masters at implementing evidence in clinical practice and evaluating change (the fourth and fifth steps of EBP). Focussing on evidence implementation and monitoring change builds on their strengths. Accessing, evaluating and synthesising research evidence should be the domain of highly specialised bodies tasked for this very purpose. Several such agencies already exist (such as SIGN, NHMRC and NICE), but few allied health practitioners are aware of, and actively use resources from, these agencies.
So, what did I tell Sally? Firstly, I highlighted that the problem she faces in her local setting is neither unique nor uncommon. Many allied health professionals face this very issue on a day-to-day basis; Sally was not alone. There is ongoing debate on how best to ensure allied health clinicians are informed about EBP concepts. With EBP being increasingly embedded in undergraduate curricula, the issues of lack of skills and knowledge could be addressed in the future. Also, many educational institutions offer programs targeted at upskilling allied health practitioners in EBP concepts.
Secondly, I suggested to Sally that, to overcome issues of resource availability and access to evidence, she focus on freely available, rigorously synthesised, high-quality research evidence (as discussed previously). Thirdly, I encouraged Sally to refocus efforts on putting research evidence into practice, an activity which may be recognised by her staff as clinically relevant, builds on their strengths and may have immediate and tangible results. Is this the panacea to Sally’s predicament? Maybe not, but it is certainly worth reflecting on!
Kumar S. Evidence Based Practice in Allied Health: A Time for Reflection and Refocus? The Internet Journal of Allied Health Sciences and Practice. 2009 Oct;7(4):Article 4.