Towards Next Generation Rubrics: An Automated Assignment Feedback System

Nilupulee Nathawitharana, Qing Huang, Kok-Leong Ong, Peter Vitartas, Madhura Jayaratne, Damminda Alahakoon, Sarah Midford, Aleks Michalewicz, Gillian Sullivan Mort, Tanvir Ahmed

Abstract


As blended learning environments and digital technologies become integrated into the higher education sector, technologies such as analytics have shown promise in facilitating teaching and learning. Automated Writing Evaluation (AWE) systems are one popular application of analytics. Such systems can be used formatively, for example by providing students with feedback on digitally submitted assignments. This paper presents the development of an AWE software tool for an Australian university using advanced text analytics techniques. The tool was designed to provide students with timely feedback on their initial assignment drafts for revision and further improvement. It could also assist academics in better understanding students' assignment performance so as to inform future teaching activities. The paper details the methodology used to develop the software and presents results from the analysis of text-based assignments submitted in two subjects. The results are discussed, highlighting how the tool can provide practical value, followed by insights into existing challenges and possible future directions.

Keywords


Learning analytics; Automated Writing Evaluation; Text analysis; Assignment feedback



DOI: http://dx.doi.org/10.3127/ajis.v21i0.1553

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

ISSN: 1326-2238 (online); 1449-8618 (print)