
Psychological Test and Assessment Modeling


Published under Creative Commons: CC-BY-NC Licence


2020-3

Editorial
Method Effects in Psychological Assessment
Karl Schweizer
PDF of the full article

The development of inductive reasoning under consideration of the effect due to test speededness
Natalie Borter, Annik E. Völke & Stefan J. Troche
PDF of the full article

Examination of Method Effects with a Social-Emotional Screening Instrument Across Parent and Teacher Raters
Christine DiStefano, Jungsun Go, Fred Greer, Dexin Shi & Erin Dowdy
PDF of the full article

Does Impulsivity Contribute to the Item-Position Effect?
Dorothea Krampen, Andreas Gold & Karl Schweizer
PDF of the full article

The factorial structure and construct validity of a German translation of Dweck's Implicit Theories of Intelligence Scale under consideration of the wording effect
Stefan J. Troche & Alexandra Kunz
PDF of the full article 

Investigating the Item-position Effect in a Longitudinal Data with Special Emphasis on the Information provided by the Variance Parameter
Tengfei Wang, Qiong Zhang & Karl Schweizer
PDF of the full article



Method Effects in Psychological Assessment
Karl Schweizer

Abstract

Method effects are described as systematic variation observed in measurement that originates from the method of measurement instead of from the attribute which the scale or measurement procedure is expected to capture. Method effects are a major source of impairment of the quality of measurement. Because of a method effect, a scale or measurement procedure does not measure, or only partly measures, what it is expected to measure. Method effects and statistical methods for the identification and control of method effects are discussed. Special emphasis is given to the item-position, speed and wording effects.

Keywords: Method effect, measurement, item-position effect, speed effect, wording effect



Karl Schweizer
1 Institute of Psychology, 
Goethe University Frankfurt,
Frankfurt, Germany
2 Department of Psychology and Behavioral
Sciences, Zhejiang University,
Hangzhou, China


ORCID iD: 0000-0002-3143-2100



The development of inductive reasoning under consideration of the effect due to test speededness
Natalie Borter, Annik E. Völke & Stefan J. Troche

Abstract

Measures of inductive reasoning are frequently used as a proxy for a child’s cognitive development. Unfortunately, a reasoning scale might be affected by speededness introduced by limited testing time. As a result, the scale might be heterogeneous and its correlation with age hard to interpret. Here we investigated the development of inductive reasoning when a possible bias by the effect of speededness is controlled for. In 250 children, ranging in age from 8;0 to 12;8 years, inductive reasoning assessed with the Culture Fair Test 20-R (CFT 20-R) increased with age. The effect of speededness was identified in all four CFT 20-R subtests and was also related to age, indicating increasing processing speed with higher age. After controlling for the effect of speededness, the relation between age and inductive reasoning was still observed but substantially decreased. Consequences of these results for the description of inductive reasoning data obtained with time-limited tests and for developmental studies on the interplay between age, inductive reasoning and speed of information processing are discussed.

Keywords: inductive reasoning, cognitive development, speededness, culture-fair test, confirmatory factor analysis


Natalie Borter
Institute of Psychology
University of Bern
Fabrikstrasse 8
CH-3012 Bern
Switzerland


 



Examination of Method Effects with a Social-Emotional Screening Instrument Across Parent and Teacher Raters
Christine DiStefano, Jungsun Go, Fred Greer, Dexin Shi & Erin Dowdy

Abstract

In psychological studies, method effects are often encountered when items that differ in the direction of the latent construct (e.g., positively and negatively worded items) are included on the same survey. The presence of method effects has been studied; however, investigations have largely used self-report data. As parents and teachers often provide ratings in school-based settings, wording effects may impact scales used to assess children. This study investigated responses to the Behavioral and Emotional Screening System (BESS) Parent and Teacher Rating Scales-preschool level to determine the strength of method effects related to wording when the same child was rated by teachers and parents. Results showed that method effects due to oppositely worded items were apparent in the BESS and that effects were more pronounced for teacher than for parent raters. Accounting for method effects due to wording allowed higher concordance between the scales to be observed.

Keywords: Method effect, screening, parent rating, teacher rating, wording effect


Christine DiStefano
138 Wardlaw Hall
College of Education
Columbia, SC 29208



Does Impulsivity Contribute to the Item-Position Effect?
Dorothea Krampen, Andreas Gold & Karl Schweizer

Abstract

This paper reports an investigation of whether the personality trait impulsivity contributes to the item-position effect observable in reasoning data. The item-position effect refers to the dependence of item statistics on item positions. Lozano (2015) showed that impulsivity predicts the item-position effect. Nevertheless, an attempt to replicate this finding by Ren, Gong, Chu, and Wang (2017) failed. To further investigate the proposed relationship, a sample of 284 undergraduate students completed a set of Advanced Progressive Matrices (APM) as well as the Barratt Impulsiveness Scale (BIS). Confirmatory factor models were used for the analyses in our study. Results did not provide evidence for a relationship between impulsivity and the item-position effect. Sample characteristics and cultural differences are discussed as possible reasons for these results.

Keywords: impulsivity, item-position effect, APM, BIS, confirmatory factor analysis


Dorothea Krampen
Institute of Psychology
Goethe University Frankfurt
Frankfurt, Germany


ORCID iD: 0000-0002-5725-0065



The factorial structure and construct validity of a German translation of Dweck's Implicit Theories of Intelligence Scale under consideration of the wording effect
Stefan J. Troche & Alexandra Kunz

Abstract

Dweck’s Implicit Theories of Intelligence Scale (ITIS) assesses laypersons' belief that their own intelligence is a fixed (entity theory) or a malleable trait (incremental theory). The ITIS implies a unidimensional construct but studies using confirmatory factor analyses identified entity and incremental theories as two distinct constructs. Negative wording of half the ITIS items might artificially cause this finding. In two studies, the factorial structure of a German translation of the ITIS was examined in 292 and 195 participants, respectively. Despite high internal consistency (Cronbach's α > .90), a one-factor measurement model did
not describe the data well. A two-factor model described the data better, but a wording-effect model provided the best data description, indicating a unidimensional construct with a biasing method effect due to negatively worded items. Implicit theories of intelligence were related to goal choice orientation, general self-esteem and lack of confidence in test situations but unrelated to the Big Five personality traits and aspects of procrastination. Thus, considering the wording effect in the ITIS substantially improved data description and interpretation but did not challenge previous results on the nomological network of implicit theories of intelligence.
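
To make the wording-effect idea more concrete: a single trait plus a method factor shared only by the negatively worded items is enough to produce the appearance of two factors. The following Python sketch is a minimal simulation under assumed loadings and hypothetical items; it is not the authors' analysis and does not use the actual ITIS data, it merely demonstrates the mechanism described in the abstract.

import numpy as np

# Minimal simulation of a wording (method) effect: one trait, one method factor
# shared only by the negatively worded items. Item count, loadings and sample
# size are hypothetical and chosen for illustration only.
rng = np.random.default_rng(seed=1)
n = 300                      # respondents, comparable in size to the reported samples
trait = rng.normal(size=n)   # single implicit-theory trait (entity vs. incremental)
method = rng.normal(size=n)  # wording factor, uncorrelated with the trait

loading_trait, loading_method, noise_sd = 0.7, 0.5, 0.6
items = np.empty((n, 8))
for j in range(8):
    y = loading_trait * trait + rng.normal(scale=noise_sd, size=n)
    if j >= 4:  # the second half of the items is treated as negatively worded
        y = -y + loading_method * method
    items[:, j] = y

r = np.corrcoef(items, rowvar=False)
within_pos = r[:4, :4][np.triu_indices(4, 1)].mean()
within_neg = r[4:, 4:][np.triu_indices(4, 1)].mean()
between = np.abs(r[:4, 4:]).mean()
print(f"mean r among positively worded items: {within_pos:.2f}")
print(f"mean r among negatively worded items: {within_neg:.2f}")
print(f"mean |r| between the two item groups:  {between:.2f}")
# The within-group correlations exceed the between-group ones although only one
# trait was simulated, which is why a one-factor model misfits and a two-factor
# model seems to fit; a trait-plus-method (wording-effect) model resolves this.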

Keywords: implicit theories of intelligence, wording effect, confirmatory factor analysis


Stefan J. Troche
Institute of Psychology
University of Bern
Fabrikstrasse 8
CH-3012 Bern
Switzerland



Investigating the Item-position Effect in a Longitudinal Data with Special Emphasis on the Information provided by the Variance Parameter
Tengfei Wang, Qiong Zhang & Karl Schweizer

Abstract

Although the item-position effect has frequently been observed in intelligence testing among adults, it remains unclear whether there is such an effect in children and how it develops over time. Data on Raven’s Standard Progressive Matrices (SPM) were collected from 189 primary school-aged children (10-11 years old) twice, with an interval of one and a half years. The item-position effect of SPM was represented separately from the ability-specific component by means of fixed-links modeling. The variance parameters of the latent variables in the model were thoroughly analyzed. The results indicated that including the
item-position factor yielded better model fit to the SPM data collected at both time points. The variances of both the ability-specific and item-position factors were significant, confirming the presence of the item-position effect in addition to ability. The comparison of the scaled estimates of the variance parameters showed that the item-position effect accounted for more variance in SPM scores at Time 2 (72.08%) than at Time 1 (48.10%). Furthermore, the correlation between the item-position factors across the two time points (r = .34) was much smaller than that between the ability-specific factors (r = .88). Taken together, these results demonstrate a substantial change of the item-position effect underlying SPM as children grow up.
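
Because all factor loadings in a fixed-links model are fixed in advance, the information about each effect is carried by the estimated variance of the corresponding latent variable, which is what the title refers to. The sketch below is an illustration only, using assumed loading patterns (constant loadings for the ability factor, loadings increasing with item position for the item-position factor) and made-up latent variances rather than the estimates obtained for the SPM; it simply shows how scaled variance shares of the kind reported above can be computed.

import numpy as np

# Fixed-links sketch: loadings are fixed a priori, only latent variances are free.
n_items = 36                                           # hypothetical test length
lambda_ability = np.ones(n_items)                      # constant loadings on ability
lambda_position = np.arange(1, n_items + 1) / n_items  # loadings growing with item position

def scaled_position_share(var_ability, var_position):
    """Share of common variance attributed to the item-position factor after
    weighting each latent variance by its fixed squared loadings."""
    ability_part = var_ability * np.sum(lambda_ability ** 2)
    position_part = var_position * np.sum(lambda_position ** 2)
    return position_part / (ability_part + position_part)

# Made-up latent variances for two measurement occasions (illustration only;
# the abstract reports shares of 48.10% and 72.08%, not these raw variances).
print(f"Time 1 item-position share: {scaled_position_share(0.40, 0.90):.1%}")
print(f"Time 2 item-position share: {scaled_position_share(0.20, 1.20):.1%}")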

Keywords: item-position effect, fluid intelligence, Raven’s Standard Progressive Matrices, fixed-links modeling

Qiong Zhang
Department of Psychology and Behavioral
Sciences, Zhejiang University
Tianmushan Road 148
310028 Hangzhou, China



Psychological Test and Assessment Modeling
Volume 62 · 2020 · Issue 3

Pabst, 2020
ISSN 2190-0493 (Print)
ISSN 2190-0507 (Internet)





