
Psychological Test and Assessment Modeling



2018-2

Contents

Guest Editorial
Advances in Educational Measurement

Andreas Frey, Christoph König & Christian Spoden
Abstract | PDF of the full article

Development and implementation of a machine-supported coding system
for constructed-response items in PISA

Kentaro Yamamoto, Qiwei He, Hyo Jeong Shin & Matthias von Davier
Abstract | PDF of the full article

 

Item position effects in a reading comprehension test:
An IRT study of individual differences and individual correlates

Gabriel Nagy, Benjamin Nagengast, Michael Becker, Norman Rose & Andreas Frey
Abstract | PDF of the full article

 

Modeling and predicting non-linear changes in educational trajectories:
The multilevel latent growth components approach

Christoph Kiefer, Yves Rosseel, Bettina S. Wiese & Axel Mayer
Abstract | PDF of the full article 

 

Measurement of teachers’ reactive, preventive and proactive classroom management skills by student ratings – results from a two-level confirmatory factor analysis
Christian Spoden & Katharina Fricke
Abstract | PDF of the full article

 

Modeling item position effects with a Bayesian item response model applied to PISA 2009–2015 data
Matthias Trendtel & Alexander Robitzsch
Abstract | PDF of the full article

 

How many classes are needed to assess effects of instructional quality? A Monte Carlo simulation of the performance of frequentist and Bayesian multilevel latent contextual models
Christoph Helm
Abstract | PDF of the full article


Advances in Educational Measurement
Andreas Frey, Christoph König & Christian Spoden

 

Andreas Frey
Institute of Educational Science
Department of Research Methods in Education
Friedrich Schiller University Jena
Am Planetarium 4
D-07743 Jena, Germany

andreas.frey@uni-jena.de



Development and implementation of a machine-supported coding system for constructed-response items in PISA
Kentaro Yamamoto, Qiwei He, Hyo Jeong Shin & Matthias von Davier
Abstract

Approximately a third of the Programme for International Student Assessment (PISA) items in the core domains (mathematics, reading, and science) are constructed-response items and require human coding. This process is time consuming, expensive, and prone to error. The shift in PISA 2015 from paper- to computer-based assessment digitized all responses and associated coding, providing opportunities to introduce technology and analytical methods to improve data processing and analyses in future cycles. The current study explains the framework and approach for improving the accuracy and efficiency of the coding process for constructed-response items in future PISA cycles. Using frequency distributions, consistencies of responses across coding categories, analyses of coder agreement, and graphic representations, we investigated the efficiency of the proposed machine-supported coding system for all human-coded items across multiple countries using PISA 2015 data and demonstrated how the proposed system was implemented in the PISA 2018 field trial.

Keywords: machine-supported coding, constructed-response items, human coding, large-scale assessments, PISA


Kentaro Yamamoto
Educational Testing Service
660 Rosedale Road 13-E
Princeton
NJ 08541, USA

kyamamoto@ets.org



Item position effects in a reading comprehension test: An IRT study of individual differences and individual correlates
Gabriel Nagy, Benjamin Nagengast, Michael Becker, Norman Rose & Andreas Frey
 

Abstract

Item position (IP) effects typically indicate that items become more difficult towards the end of a test. Such effects are thought to reflect the persistence with which test takers invest effort and work precisely on the test. As such, IP effects may be related to cognitive and motivational variables that are relevant for maintaining a high level of effort and precision. In this article, we analyzed IP effects in a reading comprehension test. We propose an IRT model that includes random IP effects affecting item difficulties and fixed IP effects affecting item discriminations. We found evidence for gradually increasing item difficulties and decreasing discriminations. Variation in IP effects on the item difficulties was systematically related to students’ decoding speed and reading enjoyment. The results demonstrate that the relationship between the overall scores and other variables is affected by respondents’ test-taking behavior, which is reflected in the random IP effect.

Keywords: reading comprehension, item position effects, correlates of item position effects, item response theory


Gabriel Nagy
Leibniz Institute for Science and Mathematics Education

Olshausenstraße 62
24118 Kiel

nagy@ipn.uni-kiel.de



Modeling and predicting non-linear changes in educational trajectories: The multilevel latent growth components approach
Christoph Kiefer, Yves Rosseel, Bettina S. Wiese & Axel Mayer

Abstract

The investigation of developmental trajectories is a central goal of educational science. However, modeling and predicting complex trajectories in the context of large-scale panel studies poses multiple challenges. Statistical models oftentimes need to take into account (a) potentially nonlinear shapes of trajectories, (b) multiple levels of analysis (e.g., individual level, university level), and (c) measurement models for the typically unobservable latent constructs. In this paper, we develop a new approach, termed the multilevel latent growth components model (ML-LGCoM), which can adequately address all three challenges simultaneously. A key feature of this new approach is that it allows researchers to test contrasts of interest among latent variables in a multilevel study. In our illustrative example, we used data from the National Educational Panel Study to model the (non-linear) development of students’ satisfaction with their academic success over four years while taking into account cluster- and individual-level trajectories and measurement error.
 

Keywords: multilevel structural equation modeling, latent growth components, longitudinal models, latent state trait, educational trajectories


Christoph Kiefer
RWTH Aachen University
Institute of Psychology
Jägerstraße 17/19
52066 Aachen, Germany

christoph.kiefer@psych.rwth-aachen.de



Measurement of teachers’ reactive, preventive and proactive classroom management skills by student ratings – results from a two-level confirmatory factor analysis
Christian Spoden & Katharina Fricke

Abstract

Student ratings can be an effective method to obtain measures of instructional quality, although their usage involves methodological challenges, raising the need for an adequate psychometric approach. Classroom management skills are an important aspect of instructional quality and a key competence of a teacher. In the present paper, a two-level confirmatory factor analysis based on a shared cluster construct model according to Stapleton, Yang and Hancock (2016) is utilized for the measurement of classroom management skills by student ratings. The approach is applied to investigate the dimensional structure of classroom management skills, assessed by means of a newly developed questionnaire. The results of the analysis support a three-dimensional structure with reliable measures of reactive, proactive, and preventive components of the construct of interest. These results are discussed in terms of implications for the assessment of classroom management skills using student ratings.

Keywords: two-level confirmatory factor analysis, classroom management, student ratings, instructional quality


Christian Spoden
German Institute for Adult Education
Leibniz Centre for Lifelong Learning
Heinemannstraße 12-14
53175 Bonn, Germany

spoden@die-bonn.de

 

 



Modeling item position effects with a Bayesian item response model applied to PISA 2009–2015 data
Matthias Trendtel & Alexander Robitzsch

Abstract

Previous studies have repeatedly demonstrated the existence of item position effects in large-scale assessments. Usually, items are answered correctly more often when administered at the beginning of a test than at the end. In this article, we investigate three aspects of item position effects: their pattern, whether they remain stable over time, and whether they are affected by changes in the test administration mode. For this purpose, a Bayesian item response model for item position effects is proposed. This model allows for nonlinear position effects on the item side and linear individual differences on the person side. A full Bayesian estimation procedure is proposed, as well as its extension to data collected from stratified clustered samples. The model was applied to the reading data collected in the 2009, 2012, and 2015 cycles of the Programme for International Student Assessment (PISA) for six countries.

Keywords: Item position effects, Bayesian IRT model, MCMC estimation, stratified clustered sample, PISA


Matthias Trendtel
Center for Research on Education and School Development
Vogelpothsweg 78
D-44227 Dortmund, Germany

matthias.trendtel@tu-dortmund.de

 

 

 

 


 


How many classes are needed to assess effects of instructional quality? A Monte Carlo simulation of the performance of frequentist and Bayesian multilevel latent contextual models
Christoph Helm

Abstract

This study addresses the sample size question for multilevel latent contextual models (MLCM), which are commonly used in educational science to assess the effects of instructional quality. For MLCMs, only a few studies have investigated whether the Bayesian toolbox helps to overcome small-sample issues. The main goal was to investigate the performance of maximum likelihood versus non-, weakly, and highly informative Bayesian estimation techniques under small-sample conditions. We assumed that incorporating prior information derived from TIMSS data would help to produce reasonable results with small samples. As expected, our results showed that the Bayesian approaches outperformed ML estimation under all conditions when informative priors were used, as these yield almost unbiased and highly accurate estimates even under unfavourable conditions (a small number of level-2 groups and small group sizes). The study results are discussed in the light of published findings. Implications for applied educational research are derived.

Keywords: Bayesian statistics, multilevel modeling, small sample, TIMSS, informative prior distributions


Christoph Helm
Linz School of Education
Johannes Kepler University of Linz
Altenberger Straße 69
4040 Linz-Auhof, Austria

christoph.helm@jku.at

 

 
