CONTEXT: Viva Voce assessment – where students demonstrate understanding through a presentation with an assessor – is favourable for a number of reasons, including inherent academic integrity, student engagement, and industry authenticity. However, these assessments scale poorly and are neither efficient nor sustainable for large cohorts or micro assessment. In MECH1750, an undergraduate materials design course at The University of Newcastle, scalable Viva Voce micro assessment was implemented through weekly debate-club-style tutorials. These tutorials utilised moderated peer marking to aid scalability and provide continual feedback. A cohort of 300+ students necessitated the design and deployment of a purpose-built software system for the collection, moderation, and distribution of this feedback. PURPOSE OR GOAL: In this paper we discuss the design considerations for, implementation of, and outcomes delivered by our purpose-built software solution to this problem. We outline how this solution facilitates viva-based micro assessment at scale and reduces administrative workload. We also discuss and evaluate the moderated peer marking functionality of this system. Finally, we contribute an open-source, anonymised minimal working example of the system for broader use. APPROACH OR METHODOLOGY/METHODS: We demonstrate the viability of the system and evaluate its performance through usage analytics and student and instructor feedback. ACTUAL OR ANTICIPATED OUTCOMES: The system was successfully implemented in MECH1750 in 2021 and enabled weekly Viva Voce assessments through peer marking in a cohort of 300+ students. Over 16,000 individual feedback forms were collected and distributed over the semester, with students able to give and receive feedback in real time through a phone- and computer-compatible web interface. Both student and instructor perceptions of the system were positive.
Instructors appreciated the reduced administrative workload that came with automatic collection and distribution, and students appreciated the timeliness and accessibility of the feedback afforded by the system. CONCLUSIONS/RECOMMENDATIONS/SUMMARY: In this paper, we present a purpose-built software system that facilitates scalable Viva Voce micro assessment through moderated peer marking. Given our positive experiences, we recommend the broader use of such a system in these classes and contribute a minimal working example at github.com/Cornelius2121/SVMS.
Source title: Proceedings of the 33rd Annual Conference of the Australasian Association for Engineering Education (AAEE 2022)
Name of conference: The 33rd Annual Conference of the Australasian Association for Engineering Education (AAEE 2022)
Location: Sydney, NSW
Start date: 2022-12-04
End date: 2022-12-07
Publisher: Australasian Association for Engineering Education