
June 2007: Workplace-based assessment explained

What methods will be used in workplace-based assessment?

What competency areas will be tested?

How will candidates be marked?

The workplace-based assessment (WPBA) is part of the integrated assessment package that is nMRCGP. WPBA is defined as ‘the evaluation of a doctor's progress in their performance over time in those areas of professional practice best tested in the workplace'.

The sound underlying aim is to link teaching, learning and assessment. In order to build as well-informed and accurate a ‘picture' of an individual trainee as possible, it is necessary to use several tools rather than rely on a single method. A good assessment drives learning in important areas of competency, gives feedback on strengths and development needs, identifies trainees in difficulty and determines trainees' fitness to progress to independent practice. This is what the Postgraduate Medical Education and Training Board (PMETB) expects of the RCGP and all other specialty medical colleges.

WPBA is necessary in order to gather evidence at the top of Miller's pyramid of competence, that is, on actual performance in the workplace. Miller's pyramid provides a framework for assessing clinical competence (see figure 1, attached).

At the bottom of the pyramid is knowledge (knows), followed by competence (knows how), performance (shows how), and action (does).

‘Action' refers to what actually happens in practice rather than what happens in an artificial testing situation. Work-based methods of assessment fit into the top level of the pyramid and collect information about doctors' performance in their normal practice.

The system is based on the assumption that assessments of actual practice are a much better reflection of routine performance than assessments carried out under artificial conditions. It also allows assessment of aspects of a doctor's behaviour that are traditionally difficult to assess in examination situations.

Although candidates will receive some limited feedback after sitting the clinical skills assessment (CSA) and applied knowledge test (AKT), it cannot compare with the potential wealth of relevant, regular, structured, informed feedback that trainees will receive about their strengths and weaknesses throughout the three years of WPBA.

What is being assessed?

The PMETB has approved the RCGP GP curriculum, and WPBA is deemed suitable for testing 12 professional competency areas, all derived from the curriculum (see table 1, attached).

It is felt that these 12 areas are best tested in the workplace and this will be a learner-led process. Each competency should be demonstrated when the learner is ready to do so and it is anticipated that there will be development of skills and progression in each competency over the three years' duration of WPBA.

For each competency, the trainee will be assessed and graded on a developmental scale: ‘insufficient evidence', ‘needs further development', ‘competent' and, beyond that, ‘excellent'. The competent level reflects the standard for independent practice.

Failure to reach a required standard by the end of ST3 will trigger a review by an expert deanery panel. If there are concerns raised earlier over the training years then these should be discussed with the educational supervisor, who in turn can contact the deanery for advice.

The assessors will be locally based: trainees will have individual clinical supervisors (hospital consultant or GP trainer) and a six-monthly review with an educational supervisor (perhaps GP trainer or course organiser). At each six-monthly meeting, all the assessment information gathered will be reviewed, with a judgement on progress in each competency area and two-way feedback about this judgement.

What tools will be used to assess trainees?

Evidence will be gathered about the trainees' development using a variety of methods. These will include locally assessed tools in WPBA and some external tools. Evidence may also include naturally occurring information that does not fit into a predefined system, such as a very positive audit presentation to the team or a negative attitudinal problem.

The locally assessed tools might differ slightly between the hospital-based training years and the GP registrar year, but will include:

• Case-based discussion (CBD) – a structured oral interview designed to explore professional judgement in clinical cases. GP trainers are already accustomed to this traditional method of assessment, but will need to record formally when it has taken place and what feedback was given to the learner.

• Consultation observation tool (COT) – this tool may be used either in a consultation directly observed by the trainer or with the more familiar video-recorded consultation. Most of the current MRCGP consulting skills criteria will feature in the structured assessment video tool and it is expected that this will be used in a formative way on at least 12 occasions spread over the GP registrar year.

• Mini-clinical evaluation exercise (Mini-CEX) – a 15-minute snapshot of a doctor/patient interaction that is formally assessed with immediate five-minute feedback. This is effectively the secondary care-based equivalent of a COT. Six of these evaluation exercises are expected in both ST1 and ST2.

• Direct observation of procedural skills (DOPS) – another 20-minute tool (15 minutes for assessment, 5 minutes for feedback) that will be familiar to F2 doctors. It is designed for formal assessment of procedural skills that are essential to the provision of good clinical care.

Assessors may be appropriate nursing staff, staff grades, SpRs, consultants or GPs. A total of six mandatory procedures must be included over the three-year training period – breast, prostate, rectal, male genital and female pelvic examinations as well as cervical cytology.

There are up to nine optional minor surgery skills which can be formally observed too. Trainees may have demonstrated most of these skills early on in the specialist training years, but if they have not, they will need to demonstrate them in the final registrar year.

Apart from the above local assessment tools, there will be external tools too:

• Multi-source feedback (MSF) – a simple web-based tool that assesses not only clinical ability but also professional behaviour. Feedback will be generated from five clinical staff on two separate occasions in specialist training year 1 (ST1) and from five clinical and five non-clinical staff on two occasions in ST3.

• Patient satisfaction questionnaire (PSQ) – in ST3, patient feedback will need to be collected from 30 consecutive consultations. The questionnaires will be sent off for scanning and the GP trainer will then discuss the outcomes with the trainee, particularly with regard to consultation skills and empathy scores.

There is little new in this: MSF + PSQ = 360-degree appraisal. All GP trainers will have had experience of this from their NHS appraisals.

How will this information be recorded?

The GP trainee will have a competency-based training record that is kept in an electronic portfolio. This e-portfolio will have parts that can be accessed by the educational supervisor, some areas that are accessible to national authorities (Deanery, RCGP and Quality Assurance) as well as a ‘private' space for the learner only.

How can candidates best prepare for WPBA?

A candidate cannot and should not specifically prepare in advance. WPBA is part of lifelong learning in the workplace and close supervision is offered. Educational and clinical supervisors will receive deanery guidance, and much of what is familiar from the structured trainer's report is found in WPBA. There is also close correlation with what already happens in F1 and F2 training posts, so future doctors will find the assessment processes familiar from their medical student training onwards. Advice will be available from many sources, at both local deanery and national college level.

In essence it is about personal and professional development, learning from the experiences in your training career and recording this learning electronically.

Useful information

The nMRCGP is still in development and could evolve further before 1 August 2007. Candidates are advised to check the RCGP website.
www.rcgp.org.uk

The RCGP curriculum website contains information on WPBA
and support for trainers.
www.rcgp-curriculum.org.uk

Author

Dr Chris Elfes
BM DRCOG DCH MRCGP
GP, Swanage, Dorset; GP trainer; MRCGP examiner; nMRCGP assessor

Figure 1: Miller's pyramid

Table 1: Competency areas assessed in WPBA
