Basic Item Analysis for Multiple-Choice Tests. ERIC-AE Digest.
This digest presents recommendations for writing multiple-choice test items based on psychometric statistics, which are typically provided by a measurement (test scoring) service where tests are machine-scored, or by testing software packages. Test makers can capitalize on the fact that "bad" items can be differentiated from "good" items psychometrically, and tests can be improved by maintaining a pool of "good" items from which future tests are drawn. Once test items are identified as appropriately written, the extent to which they discriminate among students must be determined; the degree to which they discriminate is the basic measure of item quality for almost all multiple-choice items. The statistics usually provided by a test scoring service supply the information needed to identify problems such as overlapping distracters. Specific suggestions are given for revising items to develop an item pool, organized by content area, that can be used to construct homogeneous tests for a unified content area. (Contains 4 references.) (SLD)
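The discrimination statistic the digest treats as the basic measure of item quality is commonly computed as a point-biserial correlation between each item's 0/1 score and the rest-of-test score. A minimal sketch of that computation (the function name and the score-matrix layout are illustrative assumptions, not taken from the digest):

```python
# Item analysis sketch: difficulty (proportion correct) and discrimination
# (point-biserial correlation of item score with rest-of-test score).
# Assumed data layout: rows = students, columns = items, entries 0 or 1.

from statistics import mean, pstdev

def item_stats(scores):
    """Return a list of (difficulty, discrimination) pairs, one per item."""
    n_items = len(scores[0])
    totals = [sum(row) for row in scores]
    results = []
    for j in range(n_items):
        item = [row[j] for row in scores]
        p = mean(item)  # difficulty: proportion answering correctly
        # Correlate against the "rest" score (total minus this item) so the
        # item's own score does not inflate the correlation.
        rest = [t - i for t, i in zip(totals, item)]
        sd_rest = pstdev(rest)
        if sd_rest == 0 or p in (0.0, 1.0):
            r = 0.0  # constant item or constant rest scores: no signal
        else:
            m1 = mean(v for v, i in zip(rest, item) if i == 1)
            m0 = mean(v for v, i in zip(rest, item) if i == 0)
            # Point-biserial formula: (M1 - M0) * sqrt(p * q) / sd
            r = (m1 - m0) * (p * (1 - p)) ** 0.5 / sd_rest
        results.append((p, r))
    return results
```

A "good" item in the digest's sense is one where this correlation is clearly positive, meaning stronger students tend to answer it correctly; items near zero or negative would be candidates for revision or removal from the pool.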

Descriptors: Item Analysis, Item Banks, Measurement Techniques, Multiple Choice Tests, Psychometrics, Scoring, Test Construction, Test Items

ERIC Clearinghouse on Assessment and Evaluation, The Catholic University of America, Department of Education, O'Boyle Hall, Washington, DC 20064 (free).


Author: Kehoe, Jerard

Source: https://eric.ed.gov/?q=a&ft=on&ff1=dtySince_1992&pg=9537&id=ED398237
