Open Research Newcastle
Benchmarking introductory programming exams: some preliminary results

conference contribution
posted on 2025-05-08, 19:38 authored by Simon, Judy Sheard, Daryl D'Souza, Peter Klemperer, Leo Porter, Juha Sorva, Martijn Stegeman, Daniel Zingaro
The programming education literature includes many observations that pass rates are low in introductory programming courses, but few or no comparisons of student performance across courses. This paper addresses that shortcoming. Having included a small set of identical questions in the final examinations of a number of introductory programming courses, we illustrate the use of these questions to examine the relative performance of the students both across multiple institutions and within some institutions. We also use the questions to quantify the size and overall difficulty of each exam. We find substantial differences across the courses, and venture some possible explanations of the differences. We conclude by explaining the potential benefits to instructors of using the same questions in their own exams.

History

Source title

ICER '16 Proceedings of the 2016 ACM Conference on International Computing Education Research

Name of conference

2016 ACM Conference on International Computing Education Research (ICER 2016)

Location

Melbourne

Start date

2016-09-08

End date

2016-09-12

Pagination

103-111

Publisher

Association for Computing Machinery (ACM)

Place published

New York, NY

Language

English

College/Research Centre

Faculty of Engineering and Built Environment

School

School of Electrical Engineering and Computer Science
