
Faculty Development


Objective structured clinical examinations

The OSCE is an assessment format in which candidates rotate sequentially around a series of structured cases located in ‘stations’, at each of which specific tasks have to be performed, usually involving a clinical skill such as history taking, examination of a patient or a practical skill. The marking scheme for each station is structured and determined in advance, and there is a different examiner and a time limit for each station. The basic structure of an OSCE may be varied in the timing of each station, the use of a checklist or rating scale for scoring, the use of a clinician or standardised patient as examiner, and the use of real patients or manikins, but the fundamental principle is that every candidate has to complete the same assignments in the same amount of time and is marked according to a structured marking schedule.

The use of OSCEs in the quantitative assessment of competence has become widespread in the field of undergraduate and postgraduate medical education since they were originally described (Harden and Gleeson, 1979). The main reasons are the high reliability of this assessment format and the equity that results from all candidates being presented with the same test. Some characteristics of a good OSCE are listed below.

What makes a good OSCE?
  • Blueprinting: ensure the test content maps across the learning objectives of the course
  • Station development and piloting: writing stations that function well
  • Examiner training: engage the examiners; consistent marking contributes to reliability
  • Simulated patient training: consistent performance ensures each candidate is presented with the same challenge
  • Organisation: make detailed plans well in advance

Thinking point:
Taking each of the elements of ‘what makes a good OSCE’, which do you think will have the most impact on (a) the learners’ experience and (b) the teachers’ experience?



Essentially the OSCE was developed to address the inherent unreliability of classical long and short cases. OSCEs are more reliable than unstructured observations in three main ways:

  1. Structured marking schedules allow for more consistent scoring by examiners according to pre-determined criteria.
  2. Candidates have to perform a number of different tasks across clinical, practical and communication skill domains – this wider sampling across different cases and skills results in a more reliable picture of a candidate’s overall competence.
  3. As the candidates move through all the stations, each is examined by a number of different examiners, so multiple independent observations are collated. Individual examiner bias is thus attenuated.

To enhance reliability, it is better to have more stations with one examiner per station than fewer stations with two examiners per station (van der Vleuten and Swanson, 1990).
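The intuition behind this finding can be illustrated with a toy Monte Carlo sketch (this is not the analysis from the cited study; the model and all numbers are illustrative assumptions). Because much of the unwanted variation comes from case specificity, i.e. candidates performing unpredictably better or worse on particular cases, a design that spreads the same number of observations over more stations tracks true ability more closely than one that doubles up examiners on fewer stations:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def simulate(n_stations, raters_per_station, n_candidates=500, seed=0):
    """Correlate each candidate's mean observed score with their true ability.

    Assumed toy model: score = ability + case-specific effect + rater noise.
    """
    rng = random.Random(seed)
    abilities = [rng.gauss(0, 1) for _ in range(n_candidates)]
    observed = []
    for ability in abilities:
        # Each station presents a different case; the case-specific effect
        # models candidates doing unpredictably better or worse on some cases.
        case_effects = [rng.gauss(0, 1) for _ in range(n_stations)]
        scores = [
            ability + case_effects[s] + rng.gauss(0, 0.5)  # per-rater noise
            for s in range(n_stations)
            for _ in range(raters_per_station)
        ]
        observed.append(sum(scores) / len(scores))
    return pearson(abilities, observed)

# Both designs use the same total of 10 observations per candidate.
r_wide = simulate(n_stations=10, raters_per_station=1)
r_narrow = simulate(n_stations=5, raters_per_station=2)
print(f"10 stations x 1 examiner:  r = {r_wide:.2f}")
print(f" 5 stations x 2 examiners: r = {r_narrow:.2f}")
```

Under these assumptions the ten-station design correlates more strongly with true ability, because doubling examiners at a station averages out rater noise but does nothing about case specificity, whereas adding stations averages out both.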


‘Content’ validity is determined by how well the sampling of skills matches the learning objectives of the course or degree for which that OSCE is designed (Downing, 2003). The best way to ensure an adequate spread of sampling is to use a blueprint.
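A blueprint is, in essence, a grid of skill domains against content areas, with each proposed station placed in one cell so that gaps in sampling become visible. A minimal sketch of such a coverage check (all station names, skills and content areas here are hypothetical examples, not a prescribed taxonomy):

```python
# Illustrative blueprint check: skill domains x content areas.
skills = ["history taking", "examination", "practical skill", "communication"]
areas = ["cardiovascular", "respiratory", "gastrointestinal"]

# Proposed stations, each mapped to one (skill domain, content area) cell.
stations = {
    "Chest pain history": ("history taking", "cardiovascular"),
    "Peak flow technique": ("practical skill", "respiratory"),
    "Abdominal examination": ("examination", "gastrointestinal"),
    "Explaining inhaler use": ("communication", "respiratory"),
    "Blood pressure measurement": ("practical skill", "cardiovascular"),
}

covered = set(stations.values())
gaps = [(s, a) for s in skills for a in areas if (s, a) not in covered]
print(f"{len(covered)} cells covered, {len(gaps)} gaps")
for skill, area in gaps:
    print(f"  missing: {skill} / {area}")
```

Filling every cell is rarely feasible in a single examination; the point of the grid is to make the sampling decisions deliberate rather than accidental.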

Educational impact

The impact on students’ learning resulting from a testing process is sometimes referred to as ‘consequential’ validity. It is well recognised that students focus on their assessments rather than on the learning objectives of the course. Explicit, clear learning objectives, aligned with the content and format of clinical skills assessments, can be very effective in encouraging students to learn the desired clinical competencies.

There is a danger in the use of detailed checklists, as they may encourage students to memorise the steps in a checklist rather than learn and practise the skill. Rating-scale marking schedules encourage students to learn and practise skills more holistically.
