Assessing professional competence in optometry – a review of the development and validity of the written component of the competency in optometry examination (COE)
Version 3 2024-06-19, 01:03
Version 2 2024-06-03, 22:43
Version 1 2021-01-18, 11:56
journal contribution
posted on 2024-06-19, 01:03, authored by Simon Backhouse, NG Chiavaroli, KL Schmid, T McKenzie, AL Cochrane, G Phillips, I Jalbert
Abstract
Background
Credentialing assessment for overseas-educated optometrists seeking registration in Australia and New Zealand is administered by the Optometry Council of Australia and New Zealand. The aim was to review the validation and outcomes of the written components of this exam to demonstrate that the credentialing process meets entry-level competency standards.
Methods
The Competency in Optometry Examination consists of two written and two clinical parts. Part 1 of the written exam comprises multiple choice questions (MCQ) covering basic and clinical science, while Part 2 has 18 short answer questions (SAQ) examining diagnosis and management. Candidates must pass both written components to progress to the clinical exam. Validity was evaluated using Kane’s framework for scoring (marking criteria, item analysis), generalization (blueprint), extrapolation (standard setting), and implications (outcome, including pass rates). A competency-based blueprint, the Optometry Australia Entry-level Competency Standards for Optometry 2014, guided question selection with the number of items weighted towards key competencies. A standard setting exercise, last conducted in 2017, was used to determine the minimum standard for both written exams. Item response theory (Rasch) was used to analyse exams, produce reliability metrics, apply consistent standards to the results, calibrate difficulty across exams, and score candidates.
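The dichotomous Rasch model referred to above expresses the probability of a correct response as a logistic function of the difference between a candidate's ability and an item's difficulty, both on a common logit scale; this is what allows difficulty to be calibrated across exam administrations. The following is a minimal illustrative sketch of that model, not the Council's actual scoring implementation; the function names and the Newton-Raphson ability estimator are assumptions for illustration only.

```python
import math

def rasch_prob(theta, b):
    """P(correct) under the dichotomous Rasch model for a person of
    ability theta answering an item of difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iters=50):
    """Maximum-likelihood estimate of person ability via Newton-Raphson,
    given known (pre-calibrated) item difficulties.

    responses: list of 0/1 item scores for one candidate.
    difficulties: matching list of item difficulties in logits.
    Note: perfect (all-1) or zero (all-0) score vectors have no finite
    MLE and would diverge; real scoring software handles these cases.
    """
    theta = 0.0
    for _ in range(iters):
        probs = [rasch_prob(theta, b) for b in difficulties]
        grad = sum(x - p for x, p in zip(responses, probs))   # score residual
        info = sum(p * (1.0 - p) for p in probs)              # Fisher information
        theta += grad / info
    return theta
```

Because ability and difficulty share one scale, a fixed cutscore in logits can be carried across administrations of differing raw difficulty, in contrast to the fixed 50% raw cutscore used for the SAQ before September 2017.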
Results
Data are reported on 12 administrations of the written examination since 2014. Of the 193 candidates who sat the exam over the study period, 133 (68.9%) passed and moved on to the practical component. Ninety-one (47.2%) passed both the MCQ and SAQ exams on their first attempt. The MCQ exam has displayed consistently high reliability (reliability index range 0.71 to 0.93, average 0.88) across all 12 administrations. Prior to September 2017 the SAQ had a fixed cutscore of 50%, and the difficulty of the exam was variable. Since the introduction of Rasch analysis to calibrate difficulty across exams, the reliability and power of the SAQ exam have been consistently high (separation index range 0.82 to 0.93, average 0.86).
Conclusions
Collectively, the evidence supports the validity of the written components (MCQ and SAQ) used in credentialing the competency of overseas-educated optometrists in Australia and New Zealand.