BOC Annual Report 2011-12 (Public Version)


Examination Report for 2011‐2012 Testing Year
Board of Certification (BOC) Certification Examination for Athletic Trainers
Stephen B. Johnson, Ph.D.
Castle Worldwide, Inc.
Prepared March 2012
Updated May 2012

TABLE OF CONTENTS

INTRODUCTION
  Description of the Certification Examination
  Format of Items on the Certification Examination
  Delivery of the Certification Examination
  Number of Test Forms
  Standard Setting and Equating of Test Forms
  Use of Scaled Scores
  Score Reporting
  Certification Examination Development
    Stand‐Alone Committee
    Focused Testlet Committee
    Test Form Assembly
    I‐Dev
    Item Bank
ANALYSIS OF THE CERTIFICATION EXAMINATION
  Candidate Performance
  Candidates Excluded from this Report
  Pass Rates
  Distribution of Candidate Scores
  Test Form Summary Statistics
  Difficulty and Discrimination
  Domain Performance
  Test Form Reliabilities & Other Summary Data
SUMMARY
REFERENCES
APPENDICES
  Appendix A: Definitions of Form Statistics
    Mean Score
    Standard Deviation
    Standard Error of Measurement
    Min and Max (Low and High Score)
    Avg. Diff
    Avg. Discrim.
    Reliability Measures
  Appendix B: Correlations of Candidate Performance on the Five Domains

BOC 2011‐12 Testing Year Report
Confidential materials

FIGURES

Figure 1: Cumulative Percentage of First‐time New Graduates and Retake Candidates by Scaled Score, BOC 2011‐2012
Figure 2: Scale Score Distribution of First‐time New Graduates and Retake Candidates, BOC 2011‐2012

TABLES

Table 1: Number of Candidates in Three Cohorts and Pass Rates for BOC Certification Examination, 2005‐2006 to 2011‐2012
Table 2: Passing Rates for Each Test Form for All Candidates for BOC Certification Examination, 2011‐2012
Table 3: Number of Candidates in Three Cohorts, Minimum, Maximum and Average Scaled Score, Median and Mode Scaled Score, and Standard Deviation (Scaled Score) for BOC Certification Examination, 2010‐2011
Table 4: Summary Test Form Statistics in Scaled Scores for Candidates for BOC Certification Examination, 2011‐2012
Table 5: Univariate Between Subject Effects Assessing Interaction Between Exam Form (ExamForm), Test Window (Month), and Retake Status (Retake) for BOC Certification Examination, 2011‐2012
Table 6: Summary of Item Discrimination and Difficulty for First Forms Tested, BOC 2011‐2012
Table 7: Domain Level Statistics for Each Test Form for All Candidates for BOC Certification Examination, 2010‐2011 (Raw Scores)
Table 8: Summary Statistics for the 2011‐2012 Administrations of BOC Athletic Trainer Test Forms

INTRODUCTION

The Board of Certification (BOC) is a non‐profit credentialing agency that provides certification for the athletic training profession. The BOC was incorporated in 1989 to govern the certification program for entry‐level athletic trainers, which had by then existed for nearly 20 years, and to set recertification standards for certified athletic trainers. The entry‐level certification program is designed to establish a common benchmark for entry into the athletic training profession. The BOC serves the public interest by developing, administering, and continually reviewing a certification process that reflects current standards of practice in athletic training.

In order to develop a credible and valid examination, the BOC contracts with Castle Worldwide, Inc. (Castle), for the design, development, and delivery of the BOC's athletic trainer certification examinations. Castle follows and recommends widely accepted standards and regulations (e.g., Standards for Educational and Psychological Testing, American Educational Research Association, 1999; Uniform Guidelines on Employee Selection Procedures, EEOC, 1978; Standards for the Accreditation of Certification Programs, National Commission for Certifying Agencies, 2005) for the development and analysis of the BOC's athletic trainer certification examinations.

The major objective of the BOC's athletic trainer certification program is to establish that individuals have the knowledge and skills necessary to create and provide safe and effective athletic training services.
It provides assurance that a certified athletic trainer has met eligibility criteria addressing training, experience, and the knowledge and skills necessary for competent performance of his or her work.

In order to attain certification, an individual must complete an entry‐level athletic training education program accredited by the Commission on Accreditation of Athletic Training Education (CAATE) and pass the BOC certification examination. In order to qualify as a candidate for the BOC certification examination, an individual must meet the following requirements:

- Endorsement of the certification examination application by the recognized program director (PD) of the CAATE accredited education program.
- Proof of current certification in emergency cardiac care (ECC). (Note: ECC certification must be current at the time of initial application and any subsequent exam retake registration.)

Description of the Certification Examination

The BOC certification examination is designed to test an individual's knowledge across the practice of athletic training based on a defined test blueprint. The certification examination is based on test content specifications established in the role delineation/practice analysis study (RD/PA6) introduced in April 2011. From the study, five performance domains (i.e., major areas of responsibilities or duties) were established:

1. Injury/Illness Prevention and Wellness Protection;
2. Clinical Evaluation and Diagnosis;
3. Immediate and Emergency Care;
4. Treatment and Rehabilitation; and
5. Organization and Professional Health and Well‐being.

All items and test forms are written to meet these specifications and subsequent performance standards for the certification examination. The certification examination blueprint is contained in Appendix A.

Format of Items on the Certification Examination

Items on the BOC certification examination consist of multiple‐choice, multi‐select, hotspot, and drag‐and‐drop items.

The multiple‐choice items contain a stem and four to five possible response options. The stem is typically a direct question. Of the response options, there is one correct or clearly best answer, referred to as the key. The incorrect response options are called distractors. Points for an item are awarded for correctly answering the item.

The multi‐select items contain a stem and four to eight possible response options. Of these options, more than one can be correct, and candidates can select more than one option. Points are awarded for correctly selecting an option.

The hotspot items contain a stem and an image. Candidates place a "hotspot" on the correct portion of the image. Points are awarded for correctly placing the hotspot.

The drag‐and‐drop items contain a stem, a list of N options, and a series of N "buckets." Candidates can select from one to N options from the list and "drop" each into one of the "buckets." Options can be used once, multiple times, or not at all depending on the item, and candidates are informed of how the options may be used. These items are scored one point for each correctly placed option. All item scores are converted to a scale of 0 to 1.

Items on the BOC certification examination also are provided in a focused testlet format. These testlets are designed to assess the more complex decision‐making skills required for the role of an entry‐level certified athletic trainer through cases that are rich in relevant, realistic, and specific information. Each focused testlet consists of a scenario followed by five questions utilizing the range of item types.
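The 0‑to‑1 conversion described above can be sketched in code. The function names, data shapes, and the per‑option multi‑select rule below are illustrative assumptions; the report does not publish the exact scoring algorithm.

```python
def score_multi_select(selected, keys, n_options):
    # Sketch: credit for each option judged correctly (selected when it is
    # a key, or left unselected when it is not), normalized to 0-1.
    # The exact multi-select rule is an assumption, not the BOC's published one.
    correct = sum(1 for opt in range(n_options)
                  if (opt in selected) == (opt in keys))
    return correct / n_options

def score_drag_and_drop(placements, key_placements):
    # One point per correctly placed option, normalized to 0-1, following
    # the report's "one point for each correctly selected option."
    correct = sum(1 for bucket, opt in placements.items()
                  if key_placements.get(bucket) == opt)
    return correct / len(key_placements)
```

For example, a drag‑and‑drop item with two buckets where the candidate places one option correctly would score 0.5 on the 0‑to‑1 scale.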
The testlets focus on questions that ask candidates to:

- Identify important facts specifically stated in the material;
- Understand the meaning of key words and phrases in the material;
- Draw conclusions and infer meanings from the material;
- Consider and evaluate evidence to support or reject different ideas; and/or
- Apply information presented in the material to a new or different situation.

This testlet format is used by many organizations and is best known for its use in reading comprehension examinations (e.g., LSAT and GMAT). The concept of a focused testlet is exemplified by the Medical College Admission Test (www.aamc.org/students/mcat/) and the Royal Australian College of General Practitioners (RACGP, http://www.racgp.org.au/exam).

Items are constructed using guidelines established by the BOC for the development and review of items.

Delivery of the Certification Examination

The BOC certification examination test forms include a combination of scored and experimental (unscored) items totaling 175 items. Each test form includes five focused testlets.

Certification examinations are completed in one session, and candidates are allotted a period of four hours. Short tutorials are available prior to the start of the examination, and a short satisfaction survey appears following its end. The BOC uses Castle's Internet‐based test delivery system (PASS) for test administration.

For the 2011‐2012 testing year, the certification examination was administered in five 14‐day test windows: March/April 2011, May/June 2011, July/August 2011, November 2011, and January/February 2012. The BOC certification examination forms consist of scored and experimental items, with scored items in common with an anchor form. Candidates who fail are not restricted in their retakes during the testing year.

Number of Test Forms

Multiple sets of scored items were developed for 2011‐2012. Each scored set was assigned different experimental sets for the year, creating six different test forms. Forms 362(1) and 362(2) were administered in April 2011, Forms 362(3) and 362(4) in June 2011, Form 362(5) in August 2011, and Form 362(6) in November 2011 and February 2012.

Standard Setting and Equating of Test Forms

In February 2011, a panel of 10 currently certified athletic trainers was convened to establish the performance standard to be implemented for the revised test blueprint (RD/PA6). The panel reviewed the scored questions for Forms 362(1) and 362(2) scheduled for introduction in April 2011. The panel participated in three rounds of data collection using a modified‐Angoff model, the Yes/No technique (Impara & Plake, 1997).

Forms 362(1) and 362(2) contained the same 125 scored items presented to the standard setting panel, with a different set of experimental items. Form 362(1) was administered to 1,031 candidates, and Form 362(2) was administered to 977 candidates. Candidates were randomly assigned to each form.

Following administration, a review of the standard setting panel's data and the performance of candidates on both forms was presented to the BOC. The BOC confirmed the use of the panel's median recommendation from the third round of review.

Forms 362(3) and 362(4), administered in June 2011, had items in common with the scored set of items used for the standard setting form. BOC equating follows the protocols for a common‐items nonequivalent groups design using the Levine True Score Method Applied to Observed Scores with internal anchors (Kolen & Brennan, 2004). This design compares the performance of one group of test takers on one examination form to that of another group of test takers on an earlier examination form with a known cut score.
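The Yes/No variant of the modified‑Angoff model mentioned above can be sketched as follows. The per‑judge "yes" count is standard for the technique (Impara & Plake, 1997); taking the panel median as the recommended cut score mirrors the report's statement that the BOC adopted the panel's median recommendation. The ratings in the example are hypothetical.

```python
from statistics import median

def yes_no_cut_score(ratings):
    # ratings: one list per judge; each entry is 1 if the judge believes a
    # minimally competent candidate would answer that item correctly, else 0.
    # A judge's recommended cut score is the count of "yes" items;
    # the panel recommendation here is the median across judges.
    per_judge = [sum(judge) for judge in ratings]
    return median(per_judge)
```

With a hypothetical three‑judge, five‑item panel, yes_no_cut_score([[1, 1, 0, 1, 0], [1, 1, 1, 1, 0], [1, 0, 0, 1, 0]]) yields per‑judge recommendations of 3, 4, and 2 and a panel cut score of 3.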
Ultimately, all equating is compared to the performance standard established for the standard setting form, 362(1).

The protocol for equating is to equate the current test forms to a form used within the last two years. This avoids item overexposure through repeated selection of the standard setting examination versions, accommodates the removal of outdated or inappropriate items, and guards against a potential shift over time in candidate demographics and experiences that could affect performance.

Use of Scaled Scores

Since examination forms may differ in difficulty, providing raw scores can be misleading. As a result, many programs, including the ACT and SAT examinations, use scaled scores. Scaled scores are particularly useful in providing the basis for long‐term, meaningful comparisons of results across different administrations of an examination. Scaled scores are used because, over the life of every testing program, there are situations when changes in test length occur: a decision is made to assess more or fewer areas, the number of items that are scored versus unscored (experimental) changes, or different examination forms of different difficulty are being compared.

With scaled scores, the passing standard (number of items answered correctly) on any examination form is always reported as the same scaled score. The equated scores for the BOC certification examination are converted via linear transformation so that the passing standard for all test forms is reported to candidates as 500 on a scale of 200 to 800.

Score Reporting

The BOC provides scaled scores and pass/fail decisions to candidates approximately two weeks after closure of a test window. Candidates pass or fail based on their scaled score performance compared to a criterion‐referenced performance standard.
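The linear raw‑to‑scaled conversion described above can be sketched as follows. The report states only that equated scores are linearly transformed so that the passing standard reports as 500 on a 200‑800 scale; the second anchor (mapping the maximum raw score to 800), the clipping at the scale floor, and the cut score in the example are illustrative assumptions, not BOC parameters.

```python
def to_scaled(raw, cut_raw, max_raw, floor=200, ceiling=800, pass_scaled=500):
    # Linear transformation anchored so the cut score maps to 500.
    # Anchoring max_raw at the scale ceiling is an assumption for
    # illustration; the BOC's actual slope is not published in this report.
    slope = (ceiling - pass_scaled) / (max_raw - cut_raw)
    scaled = pass_scaled + slope * (raw - cut_raw)
    return max(floor, min(ceiling, round(scaled)))
```

With a hypothetical cut of 87 out of 125 scored items, to_scaled(87, 87, 125) returns 500 and to_scaled(125, 87, 125) returns 800, so every form reports its passing standard at the same scaled value regardless of its raw cut score.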

Certification Examination Development

During 2010‐2011, new test specifications and the associated passing standard were introduced. All later forms of the BOC certification examination are equated back to this standard.

Since 2006, the BOC has provided a computerized certification examination. Prior to 2007‐2008, the certification examination consisted of three separate components. Since then, the certification examination has consisted of one assessment experience for candidates. During the 2008‐2009 testing year, focused testlets were introduced to the testing model.

Stand‐Alone Committee

During the February 2011 meeting, the committee developed and reviewed 66 stand‐alone items. During the July 2011 meeting, the committee reviewed 107 items that were field tested in April and June 2011. In addition, the committee developed and reviewed 26 stand‐alone items. During the November 2011 meeting, the committee reviewed and finalized 225 items in order to assemble two forms of the BOC certification examination.

Focused Testlet Committee

During the February 2011 meeting, the committee reviewed and finalized two focused testlets and began development of 11 focused testlets. In addition, one focused testlet was broken into five stand‐alone items. During the July 2011 meeting, the committee reviewed eight focused testlets that were field tested in April and June 2011. In addition, five focused testlets were reviewed and finalized, and six focused testlets were developed. During the November 2011 meeting, the committee reviewed and finalized 20 stand‐alone items and eight focused testlets.

Test Form Assembly

As part of RD/PA6, a new test blueprint was developed and approved. During the November 2010 meeting, two sets of scored items aligned to the new test blueprint were assembled for administration during the 2011‐2012 testing year.
A second review of the standard setting form was conducted in January 2011 in preparation for the standard setting meeting in February 2011.

I‐Dev

The BOC also uses Castle's I‐Dev system for online development of multiple‐choice items. In 2011, BOC subject matter experts completed development of 608 items. The review and validation of these items will continue into 2012.

Item Bank

Currently, BOC's item bank includes multiple‐choice, multi‐select, drag‐and‐drop, and hotspot items. Castle staff continually review and edit items and the resulting certification examination forms for psychometric and publication purposes. Items for the certification examination are stored in I‐Bank, Castle's proprietary item‐banking system.

ANALYSIS OF THE CERTIFICATION EXAMINATION

Candidate Performance

Statistics reported refer to the performance of "analyzed" candidates for the BOC certification examination. Statistical reports are generated for a particular time (e.g., a test window). Some candidates are excluded from the pool of analyzed data, specifically those candidates who completed less than 25% of their examinations. It is likely that these candidates experienced problems, such as being late to the site or other issues, and therefore their data are problematic.

As of 2007, the three cohorts of candidates reported for the BOC certification examinations are:

1. First‐time candidates – candidates reported as first‐time test takers on the certification examination from athletic training education programs accredited by the CAATE.
2. Retakes – candidates who re‐sat the certification examination one or more times.
3. All – candidates who tested.

Candidates Excluded from this Report

The report does not include, except where noted, those candidates who were administered the BOC certification examination via paper‐and‐pencil or those candidates with incomplete data. As a result, the number of candidates analyzed for this report may not match the number of candidates who sat for the BOC certification examination. Data from previous years may only include two of the three cohorts.

Data for individual tables also may differ due to exclusion of some candidates from the analysis for that table. Data prior to April 2007 are excluded from the remainder of this report, except where noted, as the program used to assess candidates is not equivalent to the current BOC certification examination protocol.

There were 4,886 reported administrations of the BOC certification examination during the 2011‐2012 testing year, a decline of 14% from 2010‐2011 (5,711), and a 20% decline from 2009‐2010 (6,171) and 2008‐2009 (6,135).
Continuing an upward trend since 2008‐2009, of the 4,886 administrations, 3,222 (66%) examinations were administered to first‐time candidates, compared with 52% in 2010‐2011 and 46% in 2008‐2009 and 2009‐2010.

Pass Rates

Table 1 provides annual pass rates for the BOC certification examination since 2005‐2006. Data for 2005‐2006 and 2006‐2007 are for the multiple‐choice component of the three‐part assessment used by the BOC at the time. Forms prior to 2011‐2012 were administered under a different blueprint and standard, and their information is provided for historical purposes only.

Table 1: Number of Candidates in Three Cohorts and Pass Rates for BOC Certification Examination, 2005‐2006 to 2011‐2012.¹

Year      First‐time   Pass    % Pass   Retake   Pass   % Pass   All      Pass    % Pass
2011‐12   3,222        2,653   82.3%    1,664    696    41.8%    4,886    3,269   66.9%

Table 2 details the pass rates for each form by test window for the administrative year.

Table 2: Passing Rates for Each Test Form for All Candidates for BOC Certification Examination, 2011‐2012.

                                Frequency                   Percent
Test Window     Form            Fail     Pass     Total     Fail     Pass
April 2011
June 2011
August 2011

¹ 2005‐2006 and 2006‐2007 data are for the multiple‐choice component only.

Distribution of Candidate Scores

Table 3 details the overall scaled score performance for the BOC certification examination for 2011‐2012 with a comparison of the performance of candidates since 2008‐2009.

Table 3: Number of Candidates in Three Cohorts, Minimum, Maximum and Average Scaled Score, Median and Mode Scaled Score, and Standard Deviation (Scaled Score) for BOC Certification Examination, 2011‐2012.

Cohort         N       Avg.   Median   Std. Dev.   Min   Max
All 2011‐12    4,886
First‐time     3,222
Retake         1,664
All 2010‐11    5,711
All 2009‐10    6,171   476    482      58          200   638
All 2008‐09    6,135   473    476      79          200   686

A Univariate General Linear Model (GLM) test determined that there was a statistically significant difference in the scaled scores of retake and first‐time candidates (F(1, 4885) = 1195.59, p < .001, η² = .197).

Figure 1 presents a cumulative frequency distribution for 2011‐2012 retake and first‐time candidates. The figure represents the proportion of candidates who scored at a given scale score or lower.

Figure 1: Cumulative Percentage of First‐time and Retake Candidates by Scaled Score, BOC 2011‐2012.
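For a single two‑level factor such as retake status, the Univariate GLM comparison above reduces to a one‑way ANOVA. The sketch below shows how the reported F statistic and eta‑squared are computed; the miniature data set in the example is invented for illustration and is not the report's data.

```python
def one_way_anova(groups):
    # groups: list of score lists, one per cohort (e.g., first-time, retake).
    # Returns (F, eta-squared): F = MS_between / MS_within,
    # eta-squared = SS_between / SS_total.
    all_scores = [x for g in groups for x in g]
    n = len(all_scores)
    grand = sum(all_scores) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_total = sum((x - grand) ** 2 for x in all_scores)
    ss_within = ss_total - ss_between
    df_between = len(groups) - 1
    df_within = n - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, ss_between / ss_total
```

For example, one_way_anova([[510, 520, 530], [450, 460, 470]]) returns F = 54.0 with eta‑squared of about .93 on these toy scores; with two groups, the test is equivalent to a t test with F = t².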

Ideally there should be a sharp increase in the cumulative proportion of candidates around the passing standard; that is, the slope of the curve should be steepest (most vertical) near the passing standard. Test forms where the steepest part of the curve falls well before or after the passing standard would not be functioning optimally. If the candidates were generally well prepared for the certification examination, we also would expect to see a relatively constrained set of scores, with no long tails at the upper or lower end of the scale. The data in Figure 1 show that the majority of candidates are performing consistently with this ideal. The figure also shows that first‐time candidates are more successful in their performance than retake candidates.

Figure 2 provides information on the distribution of scale scores for the two cohorts of candidates.

Figure 2: Scale Score Distribution of First‐time and Retake Candidates, BOC 2011‐2012.
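The cumulative percentages plotted in a figure like Figure 1 can be computed as in the sketch below; the candidate scores in the example are hypothetical.

```python
def cumulative_percent(scores, scale_points):
    # For each scale point, the percentage of candidates scoring at that
    # point or lower - the quantity on the vertical axis of a cumulative
    # distribution. Scores here are illustrative, not the report's data.
    n = len(scores)
    return [100.0 * sum(1 for s in scores if s <= p) / n
            for p in scale_points]
```

For five hypothetical candidates, cumulative_percent([450, 480, 500, 520, 560], [500, 800]) returns [60.0, 100.0]: 60% of these candidates scored at or below the passing standard of 500.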

Test Form Summary Statistics

Table 4 provides test form descriptive statistics for each test window by form and retake status (see Appendix A for information on the statistics reported).

Table 4: Summary Test Form Statistics in Scaled Scores for Candidates for BOC Certification Examination, 2011‐2012.

Test Window      Exam Form   Retake Status   N   Mean   Std. Dev.
April 2011       362(1)
                 362(2)
                 Total
June 2011        362(3)
                 362(4)
                 Total
August 2011      362(1)
                 362(5)
                 Total
November 2011    362(1)
                 362(6)
                 Total
February 2012    362(6)
                 Total

Table 4 (continued): Form Totals for 2011‐2012, by Exam Form and Retake Status (First‐time, Retake, Total), reporting N, Mean, and Std. Dev.

As shown in Table 4, consistent with previous test administration years, there were differences in the scaled scores for each test window and by retake status. A Univariate General Linear Model (GLM) was conducted to assess the interaction between test form, test window, and retake status. The results indicated that there was no statistically significant performance difference by candidates on each test form; that there was no statistical interaction between retake status and test form (i.e., retake candidates performed the same across all test forms, as did first‐time candidates); and that there were statistically significant differences in the performance of candidates across test windows (month) and by retake status. Table 5 provides the between‐subjects results for the Univariate GLM.

Table 5: Univariate Between Subject Effects Assessing Interaction Between Exam Form (ExamForm), Test Window (Month), and Retake Status (Retake) for BOC Certification Examination, 2011‐2012.

Source            Type III SS    df   Mean Square   F           Sig.   Partial Eta²   Noncent.     Obs. Power
Corrected Model                                                 .000   .218           1356.720     1.000
Intercept         33049897.0     1    33049897.0    14297.676   .000   .746           14297.676    1.000
ExamForm          6831.114       4    1707.779      .739        .565   .001           2.955        .241
Month             33809.673      3    11269.9       4.875       .002   .003
