Kuram ve Uygulamada Egitim Bilimleri, vol.16, no.1, pp.319-330, 2016 (SSCI)
The study aims to examine whether differential item functioning (DIF) is displayed in three test forms whose items are ordered randomly or sequentially (easy-to-hard and hard-to-easy), using methods based on Classical Test Theory (CTT) and Item Response Theory (IRT) and taking item difficulty levels into account. In this correlational study, data were collected from a total of 578 seventh graders using an Atomic Structures Achievement Test. The R programming language and the “difR” package were employed for all analyses. The analyses showed that the IRT-based methods, compared with the CTT-based methods, flagged a greater number of items with significant differential item functioning. Different item orderings led students at the same ability level to perform differently on the same items; that is, item order changes the probability of a correct response to an item for examinees of equal ability. A test form with items ordered from easy to hard is more advantageous than one ordered from hard to easy or one with a random order. The findings show that tests used to make decisions about people must be assembled in accordance with psychometric principles.
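As a minimal sketch of the kind of analysis the abstract describes (not the authors' actual script), the lines below compare a CTT-based and an IRT-based DIF test with the “difR” package. The response matrix resp, the grouping vector form, the focal form label, and the specific detection methods (Mantel-Haenszel and Lord's chi-square) are illustrative assumptions, since the abstract does not list them; with three forms, such pairwise comparisons would be repeated for each pair of forms.

library(difR)

# CTT-based detection: Mantel-Haenszel test, treating the easy-to-hard form
# as the focal group and the other form in the pair as the reference group.
# resp is a 0/1 item response matrix; form codes which test form each student took.
mh_res <- difMH(Data = resp, group = form, focal.name = "easy_to_hard")

# IRT-based detection: Lord's chi-square test under a 2PL model.
lord_res <- difLord(Data = resp, group = form, focal.name = "easy_to_hard",
                    model = "2PL")

# Items flagged as displaying significant DIF by each approach.
mh_res$DIFitems
lord_res$DIFitems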