
Psychological Test and Assessment Modeling



2010-3

Psychological Test and Assessment Modeling, Volume 52, 2010 (3)

Carmen Hagemeister & Karl Westhoff
On objectivity and validity of oral examinations in psychology - A replication
Abstract | PDF of the full article

Indiwar Misra, Damodar Suar & Manas K. Mandal
Revisiting the relationship between hand preference and lateral eye movement
Abstract | PDF of the full article

Janke C. ten Holt, Marijtje A. J. van Duijn & Anne Boomsma
Scale construction and evaluation in practice: A review of factor analysis versus item response theory applications
Abstract | PDF of the full article

Karl Schweizer, Michael Altmeyer, Siegbert Reiß & Michael Schreiner
The c-bifactor model as a tool for the construction of semi-homogeneous upper-level measures
Abstract | PDF of the full article

Purya Baghaei
A comparison of three polychotomous Rasch models for super-item analysis
Abstract | PDF of the full article

Jana Höhler, Johannes Hartig & Frank Goldhammer
Modeling the multidimensional structure of students’ foreign language competence within and between classrooms
Abstract | PDF of the full article

 


On objectivity and validity of oral examinations in psychology - A replication
Carmen Hagemeister & Karl Westhoff

Abstract

A hierarchy of requirements applying to practising psychologists is the basis of a concept for oral examinations in psychological assessment. A study on the objectivity of oral examinations was replicated. We found very high correlations between the evaluations of examiner and assessor, and high correlations between these evaluations and the examinees’ self-evaluations after the exam. Examinees’ self-evaluations before the examination correlated at about 0.48 with the marks in the oral examination. The results concerning preparation do not uniformly show that preparing in a group and examining each other lead to better marks than preparing alone.

Key words: oral examination, examination, objectivity


Carmen Hagemeister, PhD
Institute of Psychology II
Technische Universität Dresden
01062 Dresden, Germany 
E-Mail: carmen.hagemeister@tu-dresden.de



Revisiting the relationship between hand preference and lateral eye movement
Indiwar Misra, Damodar Suar & Manas K. Mandal

Abstract

The study examines the relationship between hand preference and conjugate lateral eye movements. The sample comprised 224 persons. Hand preference was assessed using a handedness inventory. Conjugate lateral eye movements were elicited in response to verbal and spatial questions among left-, mixed-, and right-handers. Left- and mixed-handers exhibited a significantly greater number of conjugate lateral eye movements than right-handers. On the verbal task, right-handers exhibited rightward conjugate lateral eye movements, whereas left- and mixed-handers exhibited leftward ones. On the spatial questions, right-handers as well as left- and mixed-handers exhibited leftward conjugate lateral eye movements. While right-handers show a normal lateralization pattern, left- and mixed-handers show right-brain dominance irrespective of the nature of the questions.

Key words: handedness, conjugate lateral eye movement, non-verbal questions, verbal questions


Indiwar Misra, PhD
Department of Psychology
Bhimrao Ambedakar College
University of Delhi, Yamuna Vihar
Main Wazirabad Road
Delhi-110094 INDIA
E-Mail: indiwarmishra@gmail.com



Scale construction and evaluation in practice: A review of factor analysis versus item response theory applications
Janke C. ten Holt, Marijtje A. J. van Duijn & Anne Boomsma

Abstract

In scale construction and evaluation, factor analysis (FA) and item response theory (IRT) are two methods frequently used to determine whether a set of items reliably measures a latent variable. In a review of 41 published studies we examined which methodology - FA or IRT - was used, and what researchers’ motivations were for applying either method. Characteristics of the studies were compared to gain more insight into the practice of scale analysis. Findings indicate that FA is applied far more often than IRT. It is often unclear whether the data justify the chosen method, because model assumptions are neglected. We recommend that researchers (a) use substantive knowledge about the items to their advantage by employing confirmatory techniques more frequently, and by adding item content and interpretability of factors to the criteria used in model evaluation; and (b) investigate model assumptions and report the corresponding findings. To this end, we recommend closer collaboration between substantive researchers and statisticians/psychometricians.

Key words: factor analysis, item response theory, test construction, scale evaluation


Janke C. ten Holt, PhD
Grote Rozenstraat 31
9712 TG Groningen, The Netherlands
E-Mail: j.c.ten.holt@rug.nl



The c-bifactor model as a tool for the construction of semi-homogeneous upper-level measures
Karl Schweizer, Michael Altmeyer, Siegbert Reiß & Michael Schreiner

Abstract

The paper addresses problems resulting from the application of methods that emphasize homogeneity in the construction of measures that are expected to represent general upper-level constructs. It distinguishes between homogeneous and semi-homogeneous measures. Whereas homogeneous measures allow for one underlying dimension only, semi-homogeneous measures are characterized by one general, dominating dimension in combination with restricted subordinate dimensions. The congeneric model of measurement tends to create homogeneous measures, whereas the c-bifactor model (the prefix "c" indicating a confirmatory bifactor model) enables the construction of semi-homogeneous measures. It is shown that the successful construction of measures representing upper-level constructs requires the c-bifactor model. The congeneric and c-bifactor models were applied to the social optimism scale, since social optimism is known to show a hierarchical structure. As expected, only the c-bifactor model indicated a good model fit.

Key words: congeneric model, bifactor model, test construction, homogeneity, social optimism


Karl Schweizer, PhD
Department of Psychology
Goethe University Frankfurt
Mertonstraße 16
60325 Frankfurt, Germany
E-Mail: k.schweizer@psych.uni-frankfurt.de



A comparison of three polychotomous Rasch models for super-item analysis
Purya Baghaei

Abstract

Local dependency is a prevalent phenomenon in educational tests in which several dichotomous items are based on a single prompt. This violates one of the major assumptions of Rasch and other IRT models and restricts the analysis of such tests with these models. To solve the problem, it has been suggested that the items belonging to a single prompt be bundled together and analysed as independent polychotomous super-items. In the last few decades, however, an array of polychotomous models with different properties and assumptions has emerged, which makes the choice of the right model rather difficult. The purpose of the present study is two-fold: 1) to compare the performance of three polychotomous Rasch models for super-item analysis, and 2) to examine the consequences of using ‘inappropriate’ models when the assumption of equal distances between steps within and across items is violated. To this end, a reading comprehension test comprising six independent passages, each containing six dichotomous items, was analysed with three Rasch models, namely, Andrich’s (1978) rating scale model (RSM), Andrich’s (1982) equidistant model, and Masters’ (1982) partial credit model (PCM). Results show little difference among the three models as far as model-data fit, standard errors of parameter estimates, and discrimination are concerned. Nevertheless, noticeable differences were observed in the estimates of the difficulty parameters across the three models.

Key words: Rasch model, partial credit model, rating scale model, equidistant model, item-bundle approach


Purya Baghaei, PhD
English Department
Islamic Azad University
Ostad Yusofi St.
91886-Mashad, Iran
E-Mail: pbaghaei@mshdiau.ac.ir



Modeling the multidimensional structure of students’ foreign language competence within and between classrooms
Jana Höhler, Johannes Hartig & Frank Goldhammer

Abstract
Combining multilevel (ML) analysis and multidimensional item response theory (MIRT) provides a valuable method for analyzing data from educational assessments, where clustered data (e.g., students in classes) and multidimensional constructs frequently occur. It makes it possible to model multiple ability dimensions while simultaneously taking the hierarchical structure into account. The dimensional structure of students’ foreign language competence within and between classrooms was investigated by applying an ML-MIRT measurement model to data from N = 9,410 students in 427 classes who had answered three different subtests of English as a foreign language. Results were compared to a MIRT model that does not take the multilevel structure into account. A markedly more differentiated correlation structure was found within classrooms compared with the between-classroom level and with the model without a multilevel structure. The results show that by modeling the latent multilevel structure, ability profiles can be estimated and interpreted even with highly correlated ability dimensions.

Key words: item response theory, multidimensional item response theory, multilevel analysis, models of competencies, English as a foreign language


Jana Höhler, PhD
Center for Research on Educational Quality and Evaluation
German Institute for International Educational Research (DIPF)
Schloßstraße 29
60486 Frankfurt, Germany
E-Mail: hoehler@dipf.de


