2024 Exam Report
Orthopedic Specialty Exam
May 2025

BOC 2024 Annual OSE Report

Introduction

The Board of Certification, Inc. (BOC) is a non-profit credentialing agency that provides certification for the athletic training profession. Although it had already operated for 20 years as a committee of the National Athletic Trainers’ Association, the BOC was incorporated in 1989 to govern the entry-level certification program for Athletic Trainers (ATs) and the standards for recertification. The BOC serves the public interest by developing, administering, and continually reviewing a certification process that reflects current standards of practice in athletic training.

In 2021, the BOC Orthopedic Specialty Certification (orthopedic specialty) for ATs was launched. The Board-Certified Specialist in Orthopedics (BCS-O®) credential is available to ATs who have acquired specialized education and focused experience in orthopedics beyond the requirements for the ATC® credential. It is the first and only board-certified specialty for ATs who specialize in orthopedics, and it is backed by the BOC’s rigorous process and standards.

Standard Setting and Equating of Exam Forms

The modified Angoff process was used to establish the performance standard for the Orthopedic Specialty Exam (OSE). In October 2021, a panel of eight certified ATs with a specialty practice in orthopedics participated in the study. After the first exam window in which they are used, all new forms are equated to the passing standard established for Form PA101 using a method appropriate for small programs.

Score Reporting

Raw scores for the OSE are reported as scaled scores. Scaled scores are particularly useful because they provide the basis for meaningful long-term comparisons of results across different forms of an exam. Scaled scores are used because, over the life of every exam program, situations arise in which the length of the exam changes, a decision is made to assess more or fewer areas, the number of scored versus unscored (field-test) items changes, or forms of the exam of different difficulty must be compared. The equated scores are then converted via linear transformation to a scale of 1000 to 1450, with the passing standard reported as 1200.

The BOC provides scaled scores and pass/fail standing to applicants approximately two to four weeks after the close of an exam window. Applicants pass or fail based on how their exam performance compares to the criterion-referenced passing standard. Because the content areas specified for the exam were validated as critical for specialty practice, the items and the forms are intended to assess essential knowledge and/or skills for newly certified specialists. Overall scores from the exam can be used to make inferences about the relevant knowledge and skills that applicants have acquired. Scores on the OSE are not, however, intended as predictors of future success in the specialty.
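To make the scaled-score conversion concrete, the sketch below shows one way a linear transformation of equated scores onto the 1000 to 1450 reporting scale could be implemented. Only the reporting scale and the scaled passing standard of 1200 are taken from this report; the raw passing standard, the form length, and the choice of anchor points are hypothetical placeholders, not the BOC’s documented procedure.

```python
def make_scale_transform(raw_cut, raw_max,
                         scaled_cut=1200.0, scaled_max=1450.0,
                         scaled_min=1000.0):
    """Return a function mapping equated raw scores to reported scaled scores.

    Illustrative assumption: the line is anchored at two points -- the raw
    passing standard (mapped to the scaled passing standard) and the raw
    maximum (mapped to the scale ceiling) -- with results clipped to the
    reporting range. The anchoring choice is a sketch, not the BOC's rule.
    """
    slope = (scaled_max - scaled_cut) / (raw_max - raw_cut)
    intercept = scaled_cut - slope * raw_cut

    def to_scaled(raw):
        scaled = slope * raw + intercept
        return max(scaled_min, min(scaled_max, scaled))

    return to_scaled


# Hypothetical example: a 100-item form with a raw passing standard of 68.
to_scaled = make_scale_transform(raw_cut=68, raw_max=100)
print(to_scaled(68))   # 1200.0 -> exactly at the passing standard
print(to_scaled(75))   # above the standard -> scaled score above 1200
print(to_scaled(50))   # below the standard -> scaled score below 1200
```

Under this sketch, an equated score at the raw passing standard maps to exactly 1200, and all reported scores remain within the 1000 to 1450 reporting range.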
Analysis of the OSE Applicant Performance

Applicants may qualify to take the OSE by completing a residency in orthopedics or by satisfying alternative eligibility criteria. Applicants taking the OSE in the period addressed in this report were also classified as either first-time exams or retake exams:

▪ First-time exams – exams taken by applicants who had never previously sat for any form of the OSE
▪ Retake exams – exams taken by applicants who had previously sat one or more times for the OSE

This report includes 22 exam events of the OSE across two administration windows, the same number as in the 2023 testing year (22). Of the 22 exam events, three (13.6%) were taken by graduates of an athletic training residency in orthopedics and 19 (86.4%) by applicants who qualified by meeting other requirements. Additionally, 14 (63.6%) of the exam events involved applicants taking the exam for the first time, while 8 (36.4%) were retake attempts, each involving a unique individual (see Table 2).

Pass Rates

Table 1 provides total pass rates for the OSE since 2021 by eligibility pathway, and Table 2 provides annual pass rates by retake status.

Table 1. Historical Count and Pass Rate for the OSE Overall and by Eligibility Pathway

         Path 1: Residency            Path 2: Other               All
         N     Pass   % Pass          N     Pass   % Pass         N     Pass   % Pass
Total    15    12     80.0%           61    29     47.5%          76    41     53.9%

Table 2. Historical Counts and Pass Rates for the OSE Overall and by Retake Status

         First-time                   Retake                      All
Year     N     Pass   % Pass          N     Pass   % Pass         N     Pass   % Pass
2021     15    9      60.0%           ---   ---    ---            15    9      60.0%
2022     12    5      41.7%           5     2      40.0%          17    7      41.2%
2023     15    9      60.00%          7     2      28.57%         22    11     50.00%
2024     14    11     78.57%          8     3      37.50%         22    14     63.64%

Table 3 presents pass/fail statistics for each form administered, by modality, during the 2024 exam year. The overall pass rate for 2024 was 63.64%.

Table 3. BOC OSE Exam Pass Rates by Form and Modality

                        Frequency                    Percent
Form     Modality       Pass    Fail    Total        Pass       Fail
PA102    Onsite         4       3       7            57.14%     42.86%
         LRP            3       1       4            75.00%     25.00%
         Total          7       4       11           63.64%     36.36%
PA103    Onsite         2       4       6            33.33%     66.67%
         LRP            5       0       5            100.00%    0.00%
         Total          7       4       11           63.64%     36.36%
All                     14      8       22           63.64%     36.36%

Distribution of Candidate Scores

Table 4 presents scaled score summary statistics for the 2024 annual testing cohort, alongside the previous three years of testing. Appendix A contains definitions and interpretational information for the statistics contained in this report.

Table 4. Historical OSE Scaled Score Summary Statistics Grouped by Retake Status

Year    Cohort        N     Mean      Median    Std Dev   Min     Max
2024    All           22    1225.1    1212      52.5      1158    1344
        First-time    14    1242.9    1245      57.8      1158    1344
        Retake        8     1194.0    1194      18.1      1170    1224
2023    All           22    1196.2    1203      42.9      1122    1272
        First-time    15    1205.6    1218      43.7      1134    1272
        Retake        7     1176.0    1182      35.7      1122    1212
2022    All           17    1196.1    1188      65.7      1110    1368
        First-time    12    1201.5    1188      74.6      1110    1368
        Retake        5     1183.2    1176      41.0      1134    1236
2021    All           15    1233.2    1200      97.3      1104    1410
        First-time    15    1233.2    1200      97.3      1104    1410
        Retake        --    --        --        --        --      --

Exam Form Reliabilities and Other Summary Data

The performance of the OSE forms used during the year is reported in a manner consistent with the reporting requirements for National Commission for Certifying Agencies accreditation. Reliability is assessed using Cronbach’s alpha (Cronbach, 1951), the most widely used measure of overall exam form reliability; the Livingston-Lewis index (Livingston & Lewis, 1995), an estimate of decision consistency (i.e., the reliability of pass/fail decisions based on the test); and the standard error of measurement (SEM), presented in raw score units, a measure of the precision of the exam form.
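For readers unfamiliar with these statistics, the sketch below computes Cronbach’s alpha and the SEM from a matrix of dichotomously scored item responses; the Livingston-Lewis decision consistency index requires a more involved estimation procedure and is not shown. The response data, form length, and item model used here are simulated placeholders, not BOC exam data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (applicants x items) matrix of 0/1 item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of scored items
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total raw scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def sem_raw(scores, reliability):
    """Standard error of measurement in raw score units: SD * sqrt(1 - reliability)."""
    total_sd = np.asarray(scores, dtype=float).sum(axis=1).std(ddof=1)
    return total_sd * np.sqrt(1.0 - reliability)

# Simulated data: 22 applicants answering a hypothetical 10-item form
# (1 = correct, 0 = incorrect), generated from a simple Rasch-like model
# so that items are positively correlated, as real exam data would be.
rng = np.random.default_rng(0)
ability = rng.normal(0.0, 1.0, size=(22, 1))
difficulty = rng.normal(0.0, 1.0, size=(1, 10))
prob_correct = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
responses = (rng.random((22, 10)) < prob_correct).astype(int)

alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha: {alpha:.3f}")
print(f"SEM (raw score units): {sem_raw(responses, alpha):.2f}")
```

A smaller SEM indicates more precise raw scores, and a higher alpha indicates greater internal consistency of the form; both feed into the interpretation of the pass/fail decisions summarized above.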
Reliability and decision consistency estimates for the 2024 exam administrations were all in the acceptable range.

Summary and Conclusions

Statistics concerning the quality of the OSE as a measurement instrument indicate that the exam complies with psychometric requirements that pertain to certification and licensure exams. Notably, estimates of reliability across forms of the exam are acceptable. Likewise, applicant performance on all parts of the exam is consistent with the public protection mission of the BOC.

References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for Educational and Psychological Testing. Washington, DC: AERA.

Brennan, R. L., & Kane, M. T. (1977). An index of dependability for mastery tests. Journal of Educational Measurement, 14, 277–289.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16, 297–334.

Equal Employment Opportunity Commission (EEOC), U.S. Civil Service Commission, U.S. Department of Labor, & U.S. Department of Justice. (1978). Uniform Guidelines on Employee Selection Procedures. Federal Register, 43(166), 38290–38315.

Impara, J. C., & Plake, B. S. (1997). Standard setting: An alternative approach. Journal of Educational Measurement, 34, 353–366.

Kolen, M. J., & Brennan, R. L. (2004). Test Equating, Scaling, and Linking: Methods and Practices (2nd ed.). New York, NY: Springer-Verlag.

Kuder, G. F., & Richardson, M. W. (1937). The theory of the estimation of test reliability. Psychometrika, 2, 151–160.

Livingston, S. A., & Lewis, C. (1995). Estimating the consistency and accuracy of classifications based on test scores. Journal of Educational Measurement, 32, 179–197.

National Commission for Certifying Agencies. (2021). Standards for the Accreditation of Certification Programs. Washington, DC: Institute for Credentialing Excellence.