
Psychological Test and Assessment Modeling


Published under Creative Commons: CC-BY-NC Licence


2020-2

Editorial
Lale Khorramdel, Artur Pokropek & Peter van Rijn

Rapid guessing rates across administration mode and test setting
Ulf Kroehne, Tobias Deribo & Frank Goldhammer

Examining gender DIF and gender differences in the PISA 2018 reading literacy scale: A partial invariance approach
Lale Khorramdel, Artur Pokropek, Seang-Hwane Joo, Irwin Kirsch & Laura Halderman

A review of different scaling approaches under full invariance, partial invariance, and noninvariance for cross-sectional country comparisons in large-scale assessments
Alexander Robitzsch & Oliver Lüdtke

Assessing group comparisons or change over time under measurement non-invariance: The cluster approach for nonuniform DIF
Steffi Pohl & Daniel Schulze

An extension of the invariance alignment method for scale linking
Artur Pokropek, Oliver Lüdtke & Alexander Robitzsch
 


Rapid guessing rates across administration mode and test setting
Ulf Kroehne, Tobias Deribo & Frank Goldhammer

Abstract

Rapid guessing can threaten measurement invariance and the validity of large-scale assessments, which are often conducted under low-stakes conditions. Comparing measures collected under different administration modes or in different test settings requires that rapid guessing rates also be comparable. Response time thresholds can be used to identify rapid guessing behavior. Using data from an experiment embedded in an assessment of university students as part of the National Educational Panel Study (NEPS), we show that rapid guessing rates can differ across modes. Specifically, rapid guessing rates are found to be higher for unproctored individual online assessment. It is also shown that rapid guessing rates differ across groups of students and are related to properties of the test design. No relationship between dropout behavior and rapid guessing rates was found.
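
To make the threshold idea concrete, a minimal sketch (not the procedure used in the article): each response whose response time falls below an item-specific threshold is flagged as a rapid guess, and the rate of flagged responses is compared across administration modes. All column names, threshold values, and data below are hypothetical.

    import pandas as pd

    # Hypothetical response-level data: one row per administered item,
    # with the response time in seconds and the administration mode.
    responses = pd.DataFrame({
        "person":     [1, 1, 2, 2, 3, 3],
        "item":       ["i1", "i2", "i1", "i2", "i1", "i2"],
        "rt_seconds": [2.1, 45.0, 1.4, 38.2, 60.3, 52.7],
        "mode":       ["online", "online", "online", "online",
                       "proctored", "proctored"],
    })

    # Illustrative item-specific response time thresholds (in practice these
    # would be derived from each item's response time distribution).
    thresholds = {"i1": 3.0, "i2": 5.0}

    # A response is flagged as a rapid guess if its response time falls
    # below the threshold of the item it belongs to.
    responses["rapid_guess"] = (
        responses["rt_seconds"] < responses["item"].map(thresholds)
    )

    # Rapid guessing rate per administration mode: share of flagged responses.
    print(responses.groupby("mode")["rapid_guess"].mean())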

Keywords: rapid guessing, validity, assessment innovations, technology & assessment, test design & test construction, mode effects, test-taking behavior, log data


Ulf Kroehne, PhD
DIPF | Leibniz Institute for Research 
and Information in Education
Rostocker Strasse 6
60323 Frankfurt am Main, Germany


 


Examining gender DIF and gender differences in the PISA 2018 reading literacy scale: A partial invariance approach 
Lale Khorramdel, Artur Pokropek, Seang-Hwane Joo, Irwin Kirsch & Laura Halderman

Abstract

Gender differences in reading performance in international large-scale assessments (ILSAs) are regularly observed across countries and assessments and over time. This paper aims to evaluate different sources of gender differences in PISA 2018. First, we evaluate whether gender differences might be related to gender-specific differential item functioning (DIF). To analyze DIF in the complex setting of ILSAs, a multiple-group concurrent calibration based on the two-parameter logistic model (2PLM) and the generalized partial credit model (GPCM) with a partial invariance assumption is used. Second, we examine the diagnostic value of the reading literacy subscales (text sources, text formats, cognitive processes) as well as students' attitudes towards reading, using multidimensional item response theory (MIRT) models, linear regression, and other exploratory analyses. Results show no strong gender DIF effects in PISA 2018 and no additional diagnostic value of the reading literacy subscales. We show that gender differences might, in part, be related to reading attitudes, at least in some country-by-language groups.
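
As a generic sketch of the scaling model referred to above (not the operational PISA 2018 model), the 2PLM gives the probability of a correct response to item j by person i in country-by-language group g as

    P(X_{ij} = 1 \mid \theta_i, g) = \frac{\exp\bigl(a_{jg}(\theta_i - b_{jg})\bigr)}{1 + \exp\bigl(a_{jg}(\theta_i - b_{jg})\bigr)}.

Under the partial invariance assumption, a_{jg} = a_j and b_{jg} = b_j are constrained to be equal across groups for items showing no (or negligible) DIF, and group-specific parameters are freed only for items flagged as non-invariant; polytomous items are treated analogously with the GPCM.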

Keywords: differential item functioning, gender differences, item response theory, measurement invariance, PISA


Lale Khorramdel, PhD
Center for Advanced Assessments (CAA)
National Board of Medical Examiners
3750 Market Street
Philadelphia, PA 19104, USA
 

 


A review of different scaling approaches under full invariance, partial invariance, and noninvariance for cross-sectional country comparisons in large-scale assessments
Alexander Robitzsch & Oliver Lüdtke

Abstract

One of the primary goals of international large-scale assessments (ILSAs) in education is the comparison of country means in student achievement. The present article introduces a framework for discussing differential item functioning (DIF) for country comparisons in ILSAs. Three different linking methods are compared: concurrent calibration based on full invariance, concurrent calibration based on partial invariance using the MD or RMSD statistics, and separate calibration with subsequent nonrobust and robust linking approaches. Furthermore, we analytically derive the bias in country means for the different linking methods in the presence of DIF. In a simulation study, we show that partial invariance and robust linking approaches provide less biased country mean estimates than the full invariance approach in the case of biased items. Some guidelines are derived for the selection of cutoff values for the MD and RMSD statistics in the partial invariance approach.
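
For orientation, the MD and RMSD item-fit statistics mentioned in the abstract are commonly defined as the signed mean deviation and the root mean square deviation between the pseudo-observed item characteristic curve of item j in country g and the model-implied curve, weighted by the country's ability distribution f_g; this is the standard form used in ILSAs and may differ in detail from the article's notation:

    \mathrm{MD}_{jg}   = \int \bigl( P^{\mathrm{obs}}_{jg}(\theta) - P_j(\theta) \bigr) \, f_g(\theta) \, d\theta
    \mathrm{RMSD}_{jg} = \sqrt{ \int \bigl( P^{\mathrm{obs}}_{jg}(\theta) - P_j(\theta) \bigr)^2 \, f_g(\theta) \, d\theta }

Items whose |MD| or RMSD exceeds a chosen cutoff are treated as non-invariant and receive country-specific parameters in the partial invariance calibration.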

Keywords: international large-scale assessments, linking, differential item functioning, multiple groups, RMSD statistic


Alexander Robitzsch, PhD
Leibniz Institute for Science 
and Mathematics Education (IPN)
Olshausenstr. 62
24118 Kiel, Germany

 


Assessing group comparisons or change over time under measurement non-invariance: The cluster approach for nonuniform DIF
Steffi Pohl & Daniel Schulze

Abstract

Comparing scale scores across groups or time requires the assumption of measurement invariance. To still allow comparisons when full measurement invariance does not hold, researchers can strive for partial measurement invariance by identifying suitable anchor items. Recently, the cluster approach (Bechger & Maris, 2015; Pohl & Schulze, 2020) has been suggested for identifying such anchor items, but only for intercept parameters within the 1PL model. We extend the cluster approach to intercepts and slopes in the 2PL model. The cluster approach acknowledges the scale indeterminacy problem and, in contrast to previous approaches, identifies multiple possible item clusters that may function as anchor items. This allows researchers to weigh the various solutions on substantive grounds and to depict the uncertainty in the results that is due to anchor item selection. Here, we evaluate the performance of the approach in a simulation study and illustrate its application in an empirical example.
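
To illustrate the underlying idea in a deliberately simplified form (uniform DIF in item difficulties only; the article's extension to intercepts and slopes in the 2PL is not reproduced here): because of scale indeterminacy, only differences between the per-item shifts across groups are interpretable, so items are clustered by how similar their shifts are, and a homogeneous cluster is a candidate anchor set. All values below are made up.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical item difficulties estimated separately in two groups.
    b_group1 = np.array([-1.2, 0.3, 0.8, -0.5, 1.1, 0.0])
    b_group2 = np.array([-0.9, 0.6, 1.1, -0.2, 2.0, 0.3])

    # Per-item shift between groups. Its absolute size is not identified
    # (scale indeterminacy); only differences between shifts are meaningful,
    # which is why items are grouped by similarity of their shifts.
    shift = b_group2 - b_group1

    # Hierarchical clustering of the items on their shift values.
    Z = linkage(shift.reshape(-1, 1), method="average")
    clusters = fcluster(Z, t=0.2, criterion="distance")

    # Each cluster is a set of items that function alike across groups; a
    # sufficiently large, homogeneous cluster can serve as the anchor set.
    for c in np.unique(clusters):
        print("cluster", c, "items:", np.where(clusters == c)[0])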

Keywords: measurement invariance, differential item functioning, cluster analysis, nonuniform DIF, scale indeterminacy


Steffi Pohl, PhD
Freie Universität Berlin
Arbeitsbereich Methoden und 
Evaluation/Qualitätssicherung
Habelschwerdter Allee 45
14195 Berlin, Germany

 


An extension of the invariance alignment method for scale linking
Artur Pokropek, Oliver Lüdtke & Alexander Robitzsch 

Abstract

We examine an extension of the invariance alignment (IA) method originally proposed by Asparouhov and Muthén (2014). The generalized form of a loss function for IA is discussed, and different forms of the loss function are evaluated using Monte Carlo studies and an empirical example based on European Social Survey data. We compare results obtained with the Mplus software (Muthén & Muthén, 1998-2017) and with the R package sirt (Robitzsch, 2019). It is shown that the different loss functions implemented in the sirt package differ in their performance with respect to the recovery of group means. This suggests that the performance of IA depends heavily on the form of the loss function, the type of data (mostly the sample size), and the type of invariance encountered. The results show that the loss function proposed by Asparouhov and Muthén (2014) might not be optimal in all situations.
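
For orientation, the alignment loss has, in the commonly cited formulation, the following form; the exact parametrization in Mplus and sirt may differ in detail:

    F = \sum_{j} \sum_{g_1 < g_2} w_{g_1 g_2} \, f\bigl(\lambda_{j g_1} - \lambda_{j g_2}\bigr) + \sum_{j} \sum_{g_1 < g_2} w_{g_1 g_2} \, f\bigl(\nu_{j g_1} - \nu_{j g_2}\bigr),

where \lambda_{jg} and \nu_{jg} are the aligned loadings and intercepts, w_{g_1 g_2} are group-size weights (e.g., \sqrt{N_{g_1} N_{g_2}}), and the component loss function proposed by Asparouhov and Muthén (2014) is f(x) = (x^2 + \epsilon)^{1/4} with a small \epsilon > 0. Replacing the exponent 1/4 by a general power p is one way to obtain a generalized family of loss functions of the kind evaluated in the article.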

Keywords: invariance alignment, CFA, multiple-group model, linking, simulation


Artur Pokropek, PhD
Institute of Philosophy and Sociology
Polish Academy of Sciences
Warsaw, Poland

 



Psychological Test and Assessment Modeling
Volume 62 · 2020 · Issue 2

Pabst, 2020
ISSN 2190-0493 (Print)
ISSN 2190-0507 (Internet)
