By Dr Jason Long.
– Dean, Royal College of Emergency Medicine (RCEM); Emergency Medicine and Retrieval Consultant, Queen Elizabeth University Hospital, Glasgow; previously Chair of the RCEM Scottish Board.
Reviewer – Dafydd Hammond-Jones Speciality Trainee Year 6 Emergency Medicine (EM) Leeds Teaching Hospitals Trust (LTHT)
To define what Success, Competence and Excellence are. What are the best ways of defining these? How is RCEM working to improve how we assess them?
Defining Success, Competence and Excellence
Dr Long started his talk by defining success: a noun, the accomplishment of an aim or purpose. He asked what this means for trainees and concluded that it is different for different people. He then defined competence, again a noun, as the ability to do something successfully or effectively, and excellence as the quality of being outstanding or extremely good. He explained that excellence is extremely difficult to measure.
Current Assessment Process
Dr Long went on to explain how we currently assess trainees. We make trainees sit exams in order to progress, but these look for the minimally competent standard and do not measure excellence. We also have Workplace-Based Assessments (WPBAs), which exist to assess trainees and feed back to drive learning. The problem with WPBAs is that they are not valued: 95% of people pass them and 85% are rated as excellent. If excellence is defined as being outstanding, then 85% of trainees cannot all be excellent; this suggests the WPBAs are not being used as intended. WPBAs were not designed to be used this way. We have both formative and summative assessments: you can fail a summative assessment, whereas the formative assessment is there to help you learn how to improve. Unfortunately, they are not used this way.
A new tool that RCEM is using is the Extended Supervised Learning Event (ESLE), in which a senior clinician observes a trainee for a period of time during a shift and marks not only how they deal with clinical cases but also their non-technical skills and human factors, such as leadership. Each trainee has to complete a number of these, which can improve how we assess trainees. Grading by collective opinion is better for development, as you come to understand that you are not perfect and develop strategies to address that. One way we already do this is through anonymous Multi-Source Feedback. The problem is that trainees tend to pick people they feel are likely to give good feedback, so the result is skewed. A novel approach explored in other specialities, such as anaesthetics, is collective ranking of trainees by consultants. This is more likely to work in a department with a large group of consultants rather than just two or three. It allows trainees to see how they fare among colleagues, rather than everyone believing they are excellent.
Further changes to exams are happening within RCEM to improve how we test management skills and the use of evidence-based medicine to make change. The aim of the College, and of its Dean, is to constantly review how we assess trainees and ask what we are trying to assess and whether we are actually doing so. There is a lot of evidence that traditional assessment methods only focus on minimally competent levels of knowledge. Being a good EM Consultant requires more than this, which is why we need to develop how we assess trainees.
My take-home message from Dr Long's talk is that there are different ways to assess trainees, and for different reasons. Are we assessing success, competence or excellence? Are we assessing core knowledge, management knowledge or leadership? There are different ways to do each of these things, and we need to ask whether what we think we are measuring is what we are actually measuring.
I particularly enjoyed how Dr Long discussed the evidence on how assessments work and on how they can work ineffectively. It made me think about how I use these tools when assessing staff junior to me, and also about how the tools are used when I am assessed myself. I like the move away from relying solely on traditional methods, and the scrutiny of how we use current assessment approaches, so that we are truly developing trainees rather than just getting them past the post.