
Correlation of the in-training examination in ophthalmology with the written certifying examination of the Philippine Board of Ophthalmology

T R Castillo, MD; M N Valbuena, MD

There has been increasing concern in recent years over the performance of ophthalmology residency graduates in the written certifying examinations given by the Philippine Board of Ophthalmology (PBO). Data obtained from the PBO showed that the passing rate for the written certifying examinations over the past 9 years ranged from 38% to 67%, with an average of 54%. Over the period 2001 to 2009, examination scores ranged from 30% to 85%, with an average of 62%. Since 2004, the Department of Ophthalmology and Visual Sciences of the University of the Philippines–Philippine General Hospital (UP-PGH) has administered the Ophthalmology In-Training Proficiency Examinations (OPEX) to ophthalmology residents throughout the Philippines. This in-training examination consists of multiple-choice questions covering the various specialty areas of ophthalmology.

The primary goal of the OPEX is to provide residents and institutions with formative feedback on knowledge competency in the various ophthalmology content areas. Secondarily, the examination familiarizes prospective examinees with the format and structure of the PBO Written Qualifying Examinations. Several studies conducted in other countries have shown a positive correlation between scores on a previous written multiple-choice examination and those on a subsequent one. In-training examinations (ITE) administered by various specialty boards in the United States generally consist of written multiple-choice questions, and their relationship to subsequent performance in certifying or qualifying written examinations has been extensively analyzed. ITE scores correlated significantly with performance in the written examinations of various fields of specialization, such as anesthesiology,1-2 pediatrics,3 internal medicine,4 radiology,5 oral and maxillofacial surgery,6 family practice,7-8 surgery,9 and psychiatry and neurology.10 McClintock and Gravlee1 further examined the relationship of ITE performance to the two-step certification process of the American Board of Anesthesiology, the second step of which was a structured oral examination. Their study found the ITE to be a significant predictor of completion of the two-step process. To date, there has been no attempt to assess whether there is any correlation between examinee performance in the OPEX and in the PBO written qualifying examinations. This study was conducted to determine whether there is any correlation between the performance of graduates taking the PBO written qualifying examinations and their performance in the OPEX during residency. Specifically, this study compared PBO written certifying examination scores with OPEX scores at the different levels of training, as well as with average and terminal OPEX scores, and determined whether the number of times a resident took the OPEX was related to performance in the PBO written certifying examination.

This study also sought to determine whether the OPEX is a valid assessment tool for monitoring the growth in knowledge base of residents training in ophthalmology, as would be reflected in a progressive increase in OPEX scores from the first to the third year of training.

METHODOLOGY

Included in this analytic observational study were all residents who took the PBO written examination from 2005 to 2010 and at least one OPEX during their residency. Data were collected from the PBO written examination results covering 2005 to 2010 and the OPEX results covering 2004 to 2009. The OPEX scores and PBO written examination scores of all subjects were retrieved and tabulated. For subjects who took the PBO written examination more than once, only the results of the first attempt were considered. Data were subjected to descriptive statistical analysis using Excel data-analysis tools. OPEX scores across the years of residency (Y1, Y2, Y3, and Y4) were compared using ANOVA to determine whether scores improved significantly as residents progressed through training. OPEX scores for each year of residency, as well as the average of each subject's OPEX scores, were compared with the PBO written examination scores using the Pearson test for correlation. OPEX scores for the last year in which the OPEX was taken, referred to as terminal OPEX scores, were likewise correlated with the PBO written examination scores using the same test. The OPEX scores of those who passed the PBO written examination were compared with the scores of those who failed, using the unpaired t-test, to determine whether there was any statistically significant difference between the two groups. Subjects were also grouped according to the number of times the OPEX was taken during residency, and ANOVA was used to determine whether the number of attempts was reflected in significant differences in the PBO scores of the different groups.
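The study itself used Excel's data-analysis tools; purely as a minimal sketch of the equivalent battery of tests, the Python/SciPy code below runs an ANOVA across training years, a Pearson correlation of terminal OPEX against PBO scores, and an unpaired t-test between passers and failers. All data and variable names here are synthetic assumptions for illustration, not the study data.

```python
# Illustrative sketch only: synthetic data, not the study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical OPEX scores by residency year (Y1-Y3)
opex_y1 = rng.normal(46, 8, 60)
opex_y2 = rng.normal(54, 9, 60)
opex_y3 = rng.normal(61, 10, 60)

# One-way ANOVA: do mean OPEX scores differ across training years?
f_stat, p_anova = stats.f_oneway(opex_y1, opex_y2, opex_y3)

# Pearson correlation: terminal (Y3) OPEX vs. hypothetical PBO scores
pbo = 0.7 * opex_y3 + rng.normal(0, 6, 60)
r, p_corr = stats.pearsonr(opex_y3, pbo)

# Unpaired t-test: OPEX scores of PBO "passers" vs. "failers"
# (a median split on the hypothetical PBO scores stands in for pass/fail)
passers = opex_y3[pbo >= np.median(pbo)]
failers = opex_y3[pbo < np.median(pbo)]
t_stat, p_t = stats.ttest_ind(passers, failers, equal_var=True)

print(f"ANOVA:   F={f_stat:.2f}, p={p_anova:.4g}")
print(f"Pearson: r={r:.2f}, p={p_corr:.4g}")
print(f"t-test:  t={t_stat:.2f}, p={p_t:.4g}")
```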

RESULTS

A total of 379 graduates of residency training programs in ophthalmology took the PBO written examination from 2005 to 2010, some more than once. Of these, 165 took the OPEX during their residency training and were included in this study. Subjects completed their residency training in 29 different institutions. While the standard duration of training in ophthalmology was 3 years, a few institutions were allowed to extend their residency training program to 4 years, primarily because of their small patient census. Of the 165 subjects, 5 took the OPEX four times during their residency, 87 thrice, 39 twice, and 34 once. Table 1 presents the distribution of these subjects according to whether or not they subsequently passed the PBO written qualifying examinations. The difference in the number of successful PBO examinees across the groups was statistically significant (χ² = 20.62, df = 3, p < 0.001). Mean OPEX scores for the different levels of residency training were statistically different (F = 31.33, p < 0.001) (Table 2). Although mean scores improved from the first to the third year of residency, a drop in the mean score for the fourth year was noted. Analysis of the OPEX scores of subjects who took the OPEX throughout the standard three-year residency training program revealed statistically significant improvement (p < 0.001) from the first year (mean = 46.40 ± 8.08, CI 44.48 to 48.31), to the second year (mean = 54.44 ± 9.06, CI 52.62 to 56.45), to the third year (mean = 61.35 ± 10.0, CI 59.43 to 63.26). Analysis of variance of the PBO scores of subjects grouped by the number of times the OPEX was taken revealed a significant difference in PBO scores among the groups (Table 3). It was further observed that, while mean PBO scores improved with the number of times the OPEX was taken (subjects who took the OPEX three times had higher PBO scores than those who took it twice, who in turn had higher PBO scores than those who took it only once), there was a significant decline in the mean PBO score of those who took the OPEX four times.
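As a quick arithmetic check on the reported chi-square result (a sketch, assuming only the values quoted above), the upper-tail probability of χ² = 20.62 with 3 degrees of freedom can be computed directly:

```python
from scipy.stats import chi2

# Survival function = upper-tail probability of the reported statistic
p = chi2.sf(20.62, df=3)
print(p)  # ~1.3e-4, consistent with the reported p < 0.001
```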









The mean of the average OPEX scores of those who passed the PBO written examination was significantly higher than that of those who failed, 61.01 ± 9.78 versus 49.31 ± 5.85 (Table 4). There was a positive correlation between OPEX and PBO scores at all year levels of residency (Figure 1), with the coefficients increasing from the first to the third year. The PBO scores had a stronger positive correlation with the terminal OPEX scores (r = 0.73) than with the average OPEX scores taken during residency (r = 0.68) (Figures 2 and 3). The PBO scores also improved with the number of times subjects took the OPEX (Figure 4).
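As an interpretive aside not computed in the study, squaring these correlation coefficients gives the approximate share of variance in PBO scores that each OPEX measure accounts for:

$$ r^2_{\text{terminal}} = 0.73^2 \approx 0.53, \qquad r^2_{\text{average}} = 0.68^2 \approx 0.46 $$

That is, terminal OPEX scores account for roughly half of the variation in PBO written examination scores.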

DISCUSSION

Examinations play a major role in competency assessment in medical education. In most institutions, performance in entrance examinations plays an important part in determining acceptance into a residency training program. Examinations also serve as formative or summative evaluation in specified areas. For trainees, examinations can be used as a tool for self-assessment, identifying strengths and weaknesses to direct future learning. Examinations also provide feedback to program coordinators by identifying trainees who require closer supervision, and they give institutions information on the need for remedial measures for their trainees. Correspondingly, examinations may provide accrediting bodies with a tool for evaluating the adequacy of training programs. Written examinations are generally used to measure the cognitive area of learning, while the skills and attitudes developed during training are measured through other evaluation tools. For this reason, the investigators chose to compare OPEX scores with performance in the PBO written qualifying examinations rather than with the outcome of the entire certifying process, which includes practical and oral examinations. The OPEX has been given annually since 2004 by the Department of Ophthalmology and Visual Sciences of UP-PGH to measure the cognitive competency of ophthalmology residents. While not a requirement of the PBO, it serves as a formative evaluation of the cognitive area for residents at all levels of training. Several studies have reported correlations varying from moderate to very strong (0.48 to 0.86),1-10 supporting the view that ITE performance is indeed related to performance in specialty board examinations. The current study showed similar results for all comparisons between the various OPEX scores and performance in the PBO examinations (r = 0.57 to 0.80, p < 0.001). Of the various comparisons made between OPEX and PBO scores, the highest correlation coefficient was between the terminal scores of those who took the OPEX four times during their residency and their corresponding PBO scores (r = 0.80). This was, however, not statistically significant because of the very small number of subjects in this category. A statistically significant increase in correlation was also observed as residents progressed from the first to the third year of training, findings comparable to those reported by Baverstock et al.11

There was a high correlation between the score on the last OPEX taken by a resident (terminal OPEX) and the score in the PBO written examination (r = 0.73). This may imply that a resident's knowledge base at the time of the final OPEX was already at a level similar to that at the time of the PBO examination. Improvement in the cognitive domain was observed in the progressive increase in mean OPEX scores from the first to the third year of residency. Aside from growth in the residents' knowledge base with longer training, the observed improvement in scores may also be attributed to better test-taking skills and to familiarity with the examination content and the types of questions asked. This observation is similar to that of Brill-Edwards and colleagues, who noted that mean scores in the internal medicine ITE rose consistently from the first to the third year of residency training,12 and to that of Althouse and associates, who reported an increase in the average ITE scores of pediatric residents as time in training increased.3 A drop in the mean PBO scores of subjects who took the OPEX during their fourth year of residency was also noted. Since the standard residency training program in ophthalmology lasts only three years, having taken the examination four times implies that these residents required an additional year to attain the competencies needed to graduate from the program. Only composite scores from the OPEX and PBO written examinations were compared in this study; performance in the different content areas was not. The distribution of examination items among the content areas, as well as the quality of the questions in either examination, was likewise not evaluated. Since no established passing level was provided for the OPEX, quantitative measures such as predictive values and likelihood ratios could not be computed.
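Had a passing level been defined for the OPEX, predictive values and likelihood ratios could have been derived from a standard 2×2 classification against PBO pass/fail outcomes. The sketch below illustrates this with synthetic data; the cutoff of 55 and all values are assumptions for illustration, not figures from the study.

```python
# Hypothetical illustration: no OPEX passing level actually exists.
import numpy as np

rng = np.random.default_rng(1)
opex = rng.normal(56, 10, 165)  # hypothetical terminal OPEX scores
# Hypothetical PBO outcomes, loosely tied to OPEX performance
pbo_pass = rng.random(165) < 1 / (1 + np.exp(-(opex - 56) / 5))

cutoff = 55.0                   # assumed OPEX passing level
pred_pass = opex >= cutoff

tp = np.sum(pred_pass & pbo_pass)    # above cutoff, passed PBO
fn = np.sum(~pred_pass & pbo_pass)   # below cutoff, passed PBO
fp = np.sum(pred_pass & ~pbo_pass)   # above cutoff, failed PBO
tn = np.sum(~pred_pass & ~pbo_pass)  # below cutoff, failed PBO

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)                      # positive predictive value
npv = tn / (tn + fn)                      # negative predictive value
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio
```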

A combination of factors has been reported to influence the performance of residents in written examinations. Aside from performance in previous examinations, attendance at conferences, probationary status, and the amount of sleep and study prior to an examination have been found to correlate significantly with ITE performance.13 The country of the medical school attended and gender were also noted to correlate significantly with ITE performance in the study by McClintock and Gravlee.1 Other trainee-related factors that may affect performance in both the OPEX and the PBO written qualifying examinations, such as age, gender, medical education background, attendance at conferences or other learning activities, and the amount of time spent preparing for examinations, were not taken into consideration in this study; for its purposes, all residents were assumed to be comparable in all other respects. Variables that may affect the quality of training, such as the amount of patient exposure and faculty composition, were likewise assumed to be comparable across the different residency training programs. This study provides evidence of a positive correlation between resident performance in the OPEX and future performance in the written certifying examinations of the PBO.

The correlation increased as residents progressed from Y1 to Y3 of residency. A strong correlation was also noted between the PBO written examination scores and the terminal OPEX scores, and the strength of the correlation between the two examinations increased with the number of times a resident took the OPEX during residency training. These results should, however, be interpreted with caution: they do not suggest that the in-training examination should be used as the sole indicator of competency, nor do they imply that training across the various institutions is of the same quality. Additional research is recommended to determine how training-related variables (learning activities, patient exposure, faculty composition, etc.) and trainee-related variables (age, gender, medical education, preparation for the examination, etc.) influence trainee performance in certifying examinations.

References

1. McClintock JC, Gravlee GP. Predicting success on the certification examinations of the American Board of Anesthesiology. Anesthesiology 2010; 112: 212-219.

2. Kearney RA, Sullivan P, Skakun E. Performance on ABA-ASA In-training Examination predicts success for RCPSC certification. Can J Anaesth 2000; 47: 914-918.

3. Althouse LA, McGuinness GA. The in-training examination: an analysis of its predictive value on performance on the general pediatrics certification examination. J Pediatr 2008; 153: 425-428.

4. Babbott SF, Beasley BW, Hinchey KT, et al. The predictive validity of the internal medicine in-training examination. Am J Med 2007; 120: 735-740.

5. Baumgartner BR, Peterman SB. Relationship between American College of Radiology in-training examination scores and American Board of Radiology written examination scores. Acad Radiol 1996; 3: 873-878.

6. Ellis E, Haug RH. A comparison of performance on the OMSITE and ABOMS written qualifying examinations. J Oral Maxillofac Surg 2000; 58: 1401-1406.

7. Replogle WH, Johnson WD. Assessing the predictive value of the American Board of Family Practice in-training examination. Fam Med 2004; 36: 185-188.

8. Leigh TM, Johnson TP, Pisacano NJ. Predictive validity of the American Board of Family Practice In-Training Examination. Acad Med 1990; 65: 454-457.

9. Itani KMF, Miller C, Church H, McCollum C. American Board of Surgery In-Training Examination (ABSITE) performance: effects of residents' perception, preparation, and past performance. Curr Surg 1999; 56: 45-49.

10. Webb LC, Juul D, Reynolds CF III, et al. How well does the psychiatry residency in-training examination predict performance on the American Board of Psychiatry and Neurology part-I examinations? Am J Psychiatry 1996; 153: 831-832.

11. Baverstock PJ, MacNeily AE, Cole G. The American Urological Association in-service examination: performance correlates with Canadian and American specialty examinations. J Urol 2003; 170: 527-529.

12. Brill-Edwards P, Couture L, Evans G, et al. Predicting performance in the Royal College of Physicians and Surgeons of Canada internal medicine written examination. CMAJ 2001; 165: 1305-1307.

13. Godellas CV, Huang R. Factors affecting performance on the American Board of Surgery in-training examination. Am J Surg 2001; 181: 294-296.