How is the methodology used to evaluate competency validated and how often is the validation done?
Most regulatory bodies are not experts in the specialized science of examination analysis and blueprinting. This is why RCDSO and many regulators use external examining bodies, such as the National Dental Examining Board, that specialize in this field.
The National Dental Examining Board (NDEB) regularly reports to RCDSO and to the Canadian Dental Regulatory Authorities Federation. These reports cover its work as an international expert and consultant in the field of examinations and competencies, as well as the results of its examinations and psychometric evaluations. In addition, NDEB publishes a technical manual that provides detailed validity and reliability analyses for the NDEB examinations.
Any test involves variables. A variable can be measured, altered, and controlled to verify that outcomes are consistent and reliable. A dichotomous variable takes exactly two values, e.g. “yes/no” or “male/female”; a nondichotomous variable has more than two categories or levels.
NDEB uses two instruments to measure and confirm that its examinations are fair, valid, consistent, and reliable: Cronbach’s alpha and the Kuder-Richardson Formula 20 (KR-20). KR-20 measures internal consistency and reliability for items with dichotomous responses, whereas Cronbach’s alpha generalizes it to nondichotomous measures. Using both formulas provides a cross-check on the accuracy of the outcomes.
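To make the two coefficients concrete, here is a minimal illustrative sketch (not NDEB's actual procedure) of both formulas on hypothetical candidate response data. On 0/1 item scores the two coefficients coincide, since KR-20 is the dichotomous special case of Cronbach's alpha:

```python
import numpy as np

def kr20(scores):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) item scores.
    scores: candidates x items matrix of 0s and 1s."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                    # number of items
    p = scores.mean(axis=0)                # proportion correct per item
    q = 1.0 - p
    var_total = scores.sum(axis=1).var()   # variance of candidates' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / var_total)

def cronbach_alpha(scores):
    """Cronbach's alpha: works for item scores on any scale."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0).sum()   # sum of per-item variances
    var_total = scores.sum(axis=1).var()
    return (k / (k - 1)) * (1.0 - item_vars / var_total)

# Hypothetical data: 6 candidates x 4 dichotomous (right/wrong) items.
data = [[1, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 1, 1],
        [0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 0]]

# For 0/1 data the item variance p*(1-p) equals p*q, so the two agree exactly.
assert abs(kr20(data) - cronbach_alpha(data)) < 1e-12
```

Values closer to 1.0 indicate that the items hang together as a consistent measure of the same underlying competency; low values flag items for review.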
Examination processes are never static and experts in the field clearly stipulate that goal and standard setting, analyzing test results, performing psychometric validity testing, and periodic revalidation of competency statements are absolutely necessary.
NDEB conducts these validation exercises on an ongoing basis, in addition to periodic major reviews. These exercises demonstrate the necessity and value of a national competency document, which serves as a reference for curriculum management, program accreditation, and the development of certification examinations.
Every year, certification and examination procedures and candidate responses are assessed using these formulas. Scores are adjusted to ensure fairness through a process called test equating. Further modifications are made where needed to improve the validity and reliability of the examinations. In addition to these internal reviews, NDEB invites several external evaluations.
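Test equating maps scores from one form of an examination onto the scale of another, so that candidates are not advantaged or disadvantaged by receiving an easier or harder form. The sketch below shows one simple, standard approach, linear (mean-sigma) equating; it is illustrative only, with hypothetical score data, and is not necessarily the method NDEB applies:

```python
import statistics

def linear_equate(x_scores, y_scores):
    """Linear (mean-sigma) equating: map a Form X raw score onto the
    Form Y scale by matching the means and standard deviations of the
    two score distributions.  Returns a conversion function."""
    mx, sx = statistics.mean(x_scores), statistics.pstdev(x_scores)
    my, sy = statistics.mean(y_scores), statistics.pstdev(y_scores)
    return lambda x: my + (sy / sx) * (x - mx)

# Hypothetical raw scores from two forms of the same examination.
form_x = [52, 60, 61, 67, 70, 75, 80]   # a slightly harder form
form_y = [58, 64, 66, 71, 74, 79, 85]

to_y_scale = linear_equate(form_x, form_y)
# A candidate scoring at Form X's mean is equated to Form Y's mean,
# so no one is penalized for having drawn the harder form.
```

Operational programs typically use more sophisticated designs (e.g. equating through common anchor items), but the goal is the same: comparable scores across forms.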
For a detailed account of the test construction process, validity, scoring, and statistical analysis, see the technical manuals for the OSCE, the Assessment of Fundamental Knowledge, the Written Examination, and the Assessment of Clinical Judgement on the NDEB website at www.ndeb-bned.ca.