• Vol. 52 No. 10, 553–555
  • 30 October 2023

How do current paediatrics residency selection criteria correlate with residency performance?


Dear Editor,

The selection process for potential residents needs to be reviewed regularly to assess whether it is effective in selecting best-fit residents who can achieve academic and professional excellence. Objective measures must take precedence over subjective criteria to reduce selection bias while ensuring transparency and accountability. However, identifying predictors of an ideal resident, and of performance during residency training, as part of the selection process has been a great challenge. The use of results from medical school examinations, licensing examinations such as the United States Medical Licensing Examination,1,2,3 and structured interviews4 has been reported to correlate positively with doctors' performance. A Canadian study also reported that the presence of scholarly activity did not affect match outcome, though this varies across programmes.5 Competitive programmes like paediatrics have a vested interest in selecting the most suitable applicants who will excel as paediatric residents and emerge as holistic, high-performing paediatricians in their field.6

Our paediatric residency programme was accredited by the Accreditation Council for Graduate Medical Education International (ACGME-I) in 2009. There were 271 applicants for 37 places in the past 5 years (2016 to 2021), with an overall admission rate of 100%. The programme's selection matrix was initially adopted from other Singapore programmes and evolved to include more objective components such as structured interviews. We reviewed our selection matrix to determine which components could be used to predict future resident performance. Data from the application records of paediatric residents from 2016 to 2021 were reviewed.

Applicants in 2016 had an overall selection score that considered components from their academic results, extracurricular activities, and work-based assessment scores.
From 2017 onwards, the programme used a selection matrix comprising several components. Each resident had an overall selection score out of 100% computed from the following components: interview score, curriculum vitae score, work-based assessment score and 360 feedback score. The curriculum vitae score is an aggregate of: (1) the number of completed paediatric postgraduate examinations organised by the Royal College of Paediatrics and Child Health (RCPCH); (2) the number of paediatric rotations completed (Neonatology, Paediatric Medicine, Children's Emergency and Paediatric Surgery postings); (3) scholarly activities: local or international publications, oral presentations, poster presentations and awards at conferences; (4) leadership roles over the past 5 years; (5) volunteer work over the past 5 years; and (6) awards received over the past 5 years. These component scores were correlated with residency performance scores, obtained from the yearly in-training examination (ITE) and workplace-based assessments (WBA). WBAs assess the core ACGME-I competencies: professionalism, interpersonal and communication skills, medical knowledge, practice-based learning and improvement, patient care, and systems-based practice. The average score across all components is collated every 6 months. The numbers of publications and presentations, both local and international, were also collated. Statistical significance between groups was assessed using linear regression. Pearson's correlation coefficient was calculated to assess the relationship between each selection matrix component and residency performance. Data were analysed using Stata version 17.0 (StataCorp, College Station, TX, US).
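As an illustrative sketch only (not the authors' Stata analysis), Pearson's correlation between a selection matrix component and a performance measure can be computed as below. The interview and ITE score arrays are hypothetical values for demonstration, not study data.

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Covariance term and the two standard-deviation terms (unnormalised)
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical illustration: interview scores (%) vs ITE scores (%)
interview = [72, 80, 85, 78, 90, 66, 88]
ite = [60, 67, 72, 65, 75, 58, 70]
r = pearson_r(interview, ite)
```

In practice the P value for each r would also be computed (as in the authors' Stata workflow) to judge whether the correlation is statistically significant at the chosen threshold.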

A total of 36 residents' records were reviewed. There were 13 male residents (36.1%). The mean number of postgraduate years was 3.4 ± 0.3 years. The mean overall selection score was 67.9 ± 0.6%. Applicants who were successful in entering residency training tended to have higher 360 feedback scores (mean 4.3 ± 0.1; scored out of 5), had rotated through more paediatric-related postings (mean 72.7 ± 4.7%), performed well during their interviews (mean 80.4 ± 2.9%) and were at least postgraduate year 3.

The mean ITE score for all paediatric residents was 68.3 ± 1.1%. Paediatric residents also had at least average WBA scores (mean 7.3 ± 0.0; scored out of 10), and this score did not change across the years of training. Fifty-eight percent of residents had contributed to scholarly activities. Table 1 shows the correlation between components of the selection criteria and resident performance during training. Resident performance was defined as academic performance (ITE score) and WBA scores. A resident's academic performance correlated with his/her overall selection score and interview scores. Residents who had done more paediatric-related postings had better WBA scores during training. Overall selection score, interview scores and 360 feedback scores did not correlate with better WBA scores during residency.

Table 1. Correlation between selection score component and residency performance (work-based assessment score and in-training examination).

Selection score component | Mean WBA scores during training, r (P value) | Mean ITE scores during training, r (P value)
Overall selection score | -0.02 (0.912) | 0.43 (0.009)
360 feedback scores | 0.31 (0.104) | 0.29 (0.124)
Interview scores | 0.04 (0.826) | 0.49 (0.008)
PGY | 0.15 (0.370) | -0.07 (0.701)
C1 scores pre-residency^a | -0.06 (0.775) | 0.08 (0.691)
Total CV score | -0.13 (0.514) | 0.28 (0.142)
CV: exams | 0.12 (0.518) | 0.08 (0.691)
CV: postings | 0.46 (0.020) | 0.08 (0.662)
CV: scholarly activity | -0.08 (0.689) | 0.20 (0.304)
CV: leadership | -0.26 (0.165) | -0.01 (0.944)
CV: volunteerism | 0.05 (0.796) | 0.04 (0.842)
CV: awards | -0.21 (0.264) | 0.03 (0.883)

CV: curriculum vitae; ITE: in-training examination; PGY: postgraduate years; r: Pearson’s correlation coefficient; WBA: work-based assessment
^a C1 score is a work-based assessment score for doctors who applied in PGY 2 and above.

Our review showed that no single objective selection criterion best selects an all-rounded paediatric resident, emphasising the importance of including diverse data sources in the resident selection process. While residents with better interview scores had better ITE scores during residency, this did not translate to better WBA scores. This is similar to the 2013 meta-analysis by Kenny et al, which reported the lowest positive associations with in-training evaluation for selection strategies based on interviews, reference letters and deans' letters.7 Our study also showed that scholarly activity itself was not a major component for residents who succeeded in entering paediatric training. We also demonstrated that residents with better WBA scores had rotated through more paediatric-related postings; hence, it is important to consider applicants' rotations prior to entry into residency.

Although this review was limited by its small sample size and single-institution setting in a small country, findings from the residents studied within this short timeframe can still inform future iterations of the selection matrix. Methods to improve the interview format can also be explored, such as engaging a commercial occupational analysis and professional development consultant to create behavioural-based interview questions specific to residency selection,8 or using situational judgement tests to assess non-cognitive attributes in applicants.9

In conclusion, our paediatric residency programme reviews its selection process regularly and may place more emphasis on interview scores, with consideration of candidates' prior rotations during selection. Regular review of any residency selection process is still required. This would ensure accountability and transparency, and cultivate successful future specialists who can contribute to the programme post-training as medical faculty: leaders, educators and researchers.

Acknowledgements

We would like to acknowledge the contribution by Kendrix Lim, programme executive, for providing the selection and performance scores collated over the years.


REFERENCES

  1. Egol KA, Collins J, Zuckerman JD. Success in orthopaedic training: resident selection and predictors of quality performance. J Am Acad Orthop Surg 2011;19:72-80.
  2. Raman T, Alrabaa RG, Sood A, et al. Does Residency Selection Criteria Predict Performance in Orthopaedic Surgery Residency? Clin Orthop Relat Res 2016;474:908-14.
  3. Agarwal V, Bump GM, Heller MT, et al. Do Residency Selection Factors Predict Radiology Resident Performance? Acad Radiol 2018;25:397-402.
  4. Olawaiye A, Yeh J, Withiam-Leitch M. Resident selection process and prediction of clinical performance in an obstetrics and gynecology program. Teach Learn Med 2006;18:310-5.
  5. Gupta R, Norris ML, Barrowman N, et al. Pre-residency publication and its association with paediatric residency match outcome-a retrospective analysis of a national database. Perspect Med Educ 2017;6:388-95.
  6. Rosenbluth G, O’Brien B, Asher EM, et al. The “zing factor”-how do faculty describe the best pediatrics residents? J Grad Med Educ 2014;6:106-11.
  7. Kenny S, McInnes M, Singh V. Associations between residency selection strategies and doctor performance: a meta-analysis. Med Educ 2013;47:790-800.
  8. Prager JD, Myer CM 4th, Hayes KM, et al. Improving methods of resident selection. Laryngoscope 2010;120:2391-8.
  9. Cullen MJ, Zhang C, Marcus-Blank B, et al. Improving Our Ability to Predict Resident Applicant Performance: Validity Evidence for a Situational Judgment Test. Teach Learn Med 2020;32:508-21.