The Evidence Base

The Pilot Evaluation Report

The pilot study of the professional development framework for supervisors was led by Tim Swanwick and Judy McKimm. Sixteen Trusts took part in the study, and the framework was introduced to them over a period of four to six months.

The Pilot Evaluation Report, available for download from the panel on the right, was produced at the end of the study and draws on feedback collected from focus group sessions held with all the pilot Trusts.


The Evidence Supporting Supervision

Introduction

This report aims to provide a succinct summary of the evidence for the following five statements:

  • High quality supervision is associated with improved patient safety
  • High quality supervision is associated with improved quality of patient care
  • High quality supervision results in enhanced acquisition of trainee knowledge, skills and professional attitudes
  • Trainer development in supervisory knowledge and skills is associated with improved trainee outcomes
  • Portfolio-based approaches are an effective vehicle for professional development

Searching for evidence within medical education poses particular challenges because most bibliographic databases are dedicated to medicine or education rather than “medical education”, and the limitations of subject headings make accurate and efficient searching problematic (Haig & Dozier 2003). We have utilised the relevant references from the Kilminster and Jolly (2000) review and combined these with searches for more recent evidence to provide a “journalistic” (Greenhalgh 2001), rather than systematic, review.

Whilst randomised controlled trials (RCTs) are foremost in the hierarchy of primary research evidence, because effective randomisation should ensure that the only difference between study and control groups is the intervention itself (Guyatt et al. 1995), ethical considerations and feasibility frequently make randomisation impossible. Measurement of supervision processes and outcomes presents particular difficulties and, as most instruments are self-report measures, it is unclear to what extent their findings can be extrapolated to improvements in patient outcomes (Kilminster & Jolly 2000).

Whilst qualitative research (e.g. Campbell 1984) may be advantageous in this context, it is an approach that also has difficulties. Pawson (2006) points out that frameworks for the evaluation of qualitative studies (e.g. Spencer 2003), by restricting evidence to methodologically strong studies, may actually miss the point of a qualitative approach to analysis and lose potentially important contextual findings within weaker studies. “Realist synthesis” (Pawson et al. 2004) emphasises context and takes account of complexity and diversity. Conventional systematic reviews may produce an overall view of the extent to which an intervention works, but do not necessarily answer the key questions of “what factors make an intervention work” and “in what circumstances”. By making the underlying “programme theory” of why an intervention might work explicit at the outset, realist synthesis searches for explanations of what works, where, for whom and in what context, rather than looking for a generalisable answer.

Difficulties begin with the definition of supervision, and with the distinction between “clinical supervision” and “educational supervision”. The essential features of supervision can be defined as including “ensuring patient safety” and “promoting professional development”, and supervision comprises three functions: education, support and administration (Kilminster & Jolly 2000: 829). We believe that care should be taken to avoid extrapolating evidence relating to clinical supervision, a concept which is itself described in the literature by a number of context-dependent typologies, for example in surgery (Fallon, Wears, & Tepas III 1993), the emergency department (Sox et al. 1998) and outpatients (Gennis & Gennis 1993), to the newer concept of educational supervision, a process similar to a series of regular appraisal interviews in which “a more senior doctor helps a trainee to maximise the benefits that a trainee gets from a training position in order to fulfil his/her long term career aims” (Lloyd & Becker 2007: 375).

Kirkpatrick’s (1998) four levels provide a useful framework for measuring educational outcomes, comprising learners’ reactions to teaching, learners’ changes in knowledge, skills and attitudes, behavioural changes in practice, and results for the organisation and its clients. Whilst its conceptual basis has been contested (Alliger & Janak 1989), the hierarchy has been used within the Best Evidence Medical Education reviews (Harden 1999) and we have therefore made reference to this framework.

December 2009
Dr David Mendel, Programme Lead FME, Editor

1. High quality supervision is associated with improved patient safety (Jo Loveridge)

2. High quality supervision is associated with improved quality of patient care (Simon Lambden)

3. High quality supervision results in enhanced acquisition of trainee knowledge, skills and professional attitudes (Anna Heald)

4. Trainer development in supervisory knowledge and skills is associated with improved trainee outcomes (Renton L’Heureux)

5. Portfolio-based approaches are an effective vehicle for professional development (Kitty Seed)


References

Alliger, G. M. & Janak, E. A. 1989, "Kirkpatrick's Levels of training criteria: Thirty years later", Personnel Psychology, vol. 42, no. 2, pp. 331-342.
Campbell, D. T. 1984, "Can we be scientific in applied social science", Evaluation studies review annual, vol. 9, pp. 26-48.
Fallon, W. F., Wears, R. L., & Tepas III, J. J. 1993, "Resident supervision in the operating room: does this impact on outcome?", The Journal of trauma, vol. 35, no. 4, p. 556.
Gennis, V. M. & Gennis, M. A. 1993, "Supervision in the outpatient clinic", Journal of General Internal Medicine, vol. 8, no. 7, pp. 378-380.
Greenhalgh, T. 2001, "Papers that summarise other papers (systematic reviews and meta-analyses)," in How to read a paper - The basics of evidence based medicine, BMJ Books, London, pp. 120-138.
Guyatt, G. H., Sackett, D. L., Sinclair, J. C., Hayward, R., Cook, D. J., Cook, R. J., Evidence-Based Medicine Working Group, Bass, E., Gerstein, H., Haynes, B., Holbrook, A., Jaeschke, R., Laupacis, A., Moyer, V., & Wilson, M. 1995, "Users' Guides to the Medical Literature: IX. A Method for Grading Health Care Recommendations", JAMA: The Journal of the American Medical Association, vol. 274, no. 22, pp. 1800-1804.
Haig, A. & Dozier, M. 2003, "BEME Guide No 3: Systematic searching for evidence in medical education--Part 1: Sources of information", Medical Teacher, vol. 25, no. 4, pp. 352-363.
Harden, R. M. 1999, "BEME Guide No. 1: Best evidence medical education", Medical Teacher, vol. 21, no. 6, pp. 553-562.
Kilminster, S. M. & Jolly, B. C. 2000, "Effective supervision in clinical practice settings: a literature review", Medical Education, vol. 34, no. 10, pp. 827-840.
Kirkpatrick, D. L. 1998, Evaluating training programs: The four levels, Berrett-Koehler, San Francisco.
Lloyd, B. W. & Becker, D. 2007, "Paediatric specialist registrars' views of educational supervision and how it can be improved: a questionnaire study", JRSM, vol. 100, no. 8, p. 375.
Pawson, R. 2006, "Digging for Nuggets: How Bad Research Can Yield Good Evidence", International Journal of Social Research Methodology, vol. 9, no. 2, pp. 127-142.
Pawson, R., Greenhalgh, T., Harvey, G., & Walshe, K. 2004, "Realist synthesis: an introduction", ESRC Research Methods Programme (RMP Methods Paper 2/2004), Manchester.
Sox, C. M., Burstin, H. R., Orav, E. J., Conn, A., Setnik, G., Rucker, D. W., Dasse, P., & Brennan, T. A. 1998, "The effect of supervision of residents on quality of care in five university-affiliated emergency departments", Academic Medicine, vol. 73, no. 7, p. 776.
Spencer, L. 2003, Quality in qualitative evaluation: A framework for assessing research evidence, Strategy Unit, Cabinet Office, London.