Everything Old is New Again

There is increasing international availability of good-quality systematic reviews and clinical guidelines for a large number of conditions treated by allied health practitioners. This is providing a growing volume of "best practice" information that should assist in improving practice, minimizing undesirable practice variability, and optimizing patient outcomes. However, there is also an increasing amount of research that explores barriers to implementing evidence-based practice. Barriers include lack of understanding of current practice outcomes, lack of time to reflect on and change clinical practices, limited capacity to access library databases to find research evidence, lack of ability to understand the research, and a mismatch between research findings and daily practice. Even when research evidence is reported in ways that are readily implemented into daily practice, clinicians often fail to embrace this for every patient.[1]

Over the past 30 years, a number of conceptual approaches have been promoted to identify, understand, and deal with the often complex issues constraining health care quality, such as those of Donabedian, Ishikawa, and the Institute of Medicine (IOM).[2,3,4] The Plan-Do-Study-Act cycle is integral to most quality improvement initiatives, as it promotes the need for continuous reflection, action, and reaction. Using this cycle, allied health practitioners have regularly been exhorted to reflect on their practice using outcome measures, clinical audits, and clinical indicators. Until recently, there has been little research that identifies "best practices," and thus quality improvement activities have not been well supported by "gold standards" of practice to which to aspire. In addition, many quality assurance activities failed because of a lack of clear purpose and because of structural barriers such as the inability to locate patient notes relevant to the conditions under review, difficulties in reading and interpreting clinicians' handwriting, and a lack of recorded measurable outcomes. Even when quality improvement activities were successfully completed, the suggested changes to improve practice were often not implemented because of organizational barriers to embracing change.

There have been disincentives to pursue quality assurance and improvement activities as they have traditionally not been considered to be mainstream or publishable research. Thus, those clinicians and researchers interested in this area have found it difficult to obtain recognition, except in a small number of journals, for the important findings of many of their quality improvement activities. The real-life environment in which quality activities need to be conducted brings inevitable biases which cannot be easily controlled, but perhaps should be embraced and recognized, so that implementation barriers may be more readily identified and addressed.

Recently, questions have been asked around the world about why research evidence is not consistently adopted in practice. These questions have led to the recognition of a new research science of evidence implementation. This research is aimed at understanding and improving health professionals' behaviours using techniques and lessons drawn from a range of sciences, including health research, psychology, education, organizational structure and change, marketing, and production management. Implementation science promises to produce outcomes that will allow practitioners and organizations to use structured approaches to understand what they do and why they do it, whether they could improve practices, how they could implement change consistently and effectively, and how they could best measure improvements in practice.[1]

Implementation science research is picking up on the lessons of the quality assurance and quality improvement activities of 30 years ago, giving the Plan-Do-Study-Act cycle more life and meaning. Don Berwick, in a plenary address to the First Annual European Forum on Quality Improvement in Health Care, London (March 9th, 1996), noted that any activity to improve health care quality should be patient focused, and that it should seek to integrate "improvement, change, and learning".[5] Implementation science is exploring these issues in terms of individual and organizational learning, within the context of understanding and changing organizational culture.

Implementation science research will require champions, as it promotes a new type of research inquiry that will not fit comfortably into the accepted research hierarchies and quality appraisal methodologies. Standards will need to be set to support rigorous implementation science research that is publishable, believable, and will influence thinking in the long term. The Internet Journal of Allied Health Sciences and Practice could take the lead in improving the quality of allied health practice by considering and publishing papers that report on implementation science research. This will assist allied health clinicians and researchers around the world to consider the appropriate and informed uptake of new research findings to improve the quality of clinical care.


  1. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. The Lancet 2003; 362(9391): 1225-1230.
  2. Donabedian A. The Definition of Quality and Approaches to Its Assessment: His Explorations in Quality Assessment. Health Administration Press; 1980.
  3. Ballantyne D. Coming to grips with service intangibles using quality management techniques. Cranfield School of Management, Cranfield Institute of Technology, Cranfield, Bedford MK43 OAL; 1991. https://dspace.lib.cranfield.ac.uk/bitstream/1826/348/2/SWP1991.pdf
  4. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err is Human: Building a Safer Health System. Institute of Medicine (IOM); 2000. http://books.nap.edu/openbook.php?isbn=030906837
  5. Berwick DM. A primer on leading the improvement of systems. BMJ 1996; 312: 619-622. http://www.bmj.com/archive/7031ed.htm
