Background: Despite the crucial importance of the notion of parallel forms within Classical Test Theory, the degree of parallelism between two forms of a test cannot be directly verified due to the unobservable nature of true scores. We aim to overcome some of the limitations of traditional approaches to analyzing parallelism by using the Differential Item Functioning (DIF) framework.

Method: We shift the focus of comparison from total test scores to the individual items developed during test construction. We analyze the performance of a single group of individuals on parallel items designed to measure the same behavioral criterion, using several DIF techniques. The proposed approach is illustrated with a dataset of 527 participants who responded to the two parallel forms of the Attention Deficit-Hyperactivity Disorder Scale (Caterino, Gómez-Benito, Balluerka, Amador-Campos, & Stock, 2009).

Results: 12 of the 18 items (66.6%) show probability values associated with the Mantel χ² statistic of less than .01. The standardization procedure shows that half of the DIF items favoured Form A and the other half Form B.
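The two statistics reported above can be sketched in Python. This is an illustrative implementation, not the authors' actual analysis: it makes the simplifying assumption that the Form A and Form B items can be treated as independent groups in per-stratum 2×2 tables (ignoring that the same respondents answer both forms), and all function and variable names are hypothetical.

```python
import numpy as np

def mantel_chi2(correct_a, correct_b, strata):
    """Mantel chi-square (no continuity correction) comparing the
    proportion of keyed responses on a Form A item with its Form B
    counterpart, stratified by a matching variable such as total score.

    Simplifying assumption: the two forms are treated as independent
    groups, ignoring the matched (same-respondent) design.
    """
    correct_a = np.asarray(correct_a, dtype=float)
    correct_b = np.asarray(correct_b, dtype=float)
    strata = np.asarray(strata)
    obs = exp = var = 0.0
    for k in np.unique(strata):
        m = strata == k
        a = correct_a[m].sum()      # Form A keyed responses
        b = m.sum() - a             # Form A non-keyed responses
        c = correct_b[m].sum()      # Form B keyed responses
        d = m.sum() - c             # Form B non-keyed responses
        n = a + b + c + d
        if n < 2:
            continue                # degenerate stratum, no information
        obs += a
        exp += (a + b) * (a + c) / n
        var += (a + b) * (c + d) * (a + c) * (b + d) / (n**2 * (n - 1))
    return (obs - exp) ** 2 / var if var > 0 else 0.0

def std_p_dif(correct_a, correct_b, strata):
    """Standardization index: the stratum-size-weighted mean difference
    in proportion of keyed responses (Form A minus Form B). A positive
    value favours Form A; a negative value favours Form B."""
    correct_a = np.asarray(correct_a, dtype=float)
    correct_b = np.asarray(correct_b, dtype=float)
    strata = np.asarray(strata)
    num = den = 0.0
    for k in np.unique(strata):
        m = strata == k
        num += m.sum() * (correct_a[m].mean() - correct_b[m].mean())
        den += m.sum()
    return num / den
```

A pair of parallel items would be flagged as showing DIF when the probability associated with `mantel_chi2` falls below the chosen criterion (here, .01), with the sign of `std_p_dif` indicating which form the item pair favours.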

Conclusions: The “differential functioning of behavioral indicators” (DFBI) can provide unique information on parallelism between pairs of items to complement traditional analysis of equivalence between parallel test forms based on total scores.

Source
http://dx.doi.org/10.7334/psicothema2015.112

