Open Research Newcastle
A comparison of single and multi-test working memory assessments in predicting academic achievement in children

journal contribution
posted on 2025-05-11, 16:38, authored by Kerry Chalmers, Emily Freeman
Children assessed as having low working memory capacity have been shown to perform more poorly than their same-aged peers on measures of academic achievement. Early detection of working memory problems is, therefore, an important first step in reducing the impact of a working memory deficit on the development of academic skills. In this study, we compared a single-test assessment, the Working Memory Power Test for Children (WMPT), and a multi-test assessment, the Automated Working Memory Assessment (AWMA), in their ability to predict academic achievement in reading, numeracy, and spelling. A total of 132 Australian school children (mean age 9 years, 9 months) participated in the research. Strong positive correlations between the WMPT and AWMA total scores were found, indicating good convergent validity of the single and multi-test measures. WMPT scores correlated with each of the four AWMA subtests designed to assess verbal and visuospatial short-term and working memory. WMPT and AWMA scores separately predicted performance on Word Reading, Numerical Operations, and Spelling. Compared with either measure alone, the WMPT and the AWMA in combination predicted more of the variance in Word Reading and Numerical Operations, but not in Spelling. Theoretical and practical implications of these findings are discussed.

History

Journal title

Journal of Psychology: Interdisciplinary and Applied

Volume

152

Issue

8

Pagination

613-629

Publisher

Routledge

Language

English

College/Research Centre

Faculty of Science

School

School of Psychology

Rights statement

This is an Accepted Manuscript of an article published by Taylor & Francis Group in the Journal of Psychology on 30/10/2018, available online: https://www.tandfonline.com/doi/full/10.1080/00223980.2018.1491469
