SAGE Journals
Automated scoring of junior and senior high essays using Coh-Metrix features: Implications for large-scale language testing

Posted on 2020-06-24 - 12:06

An automated essay scoring (AES) program is a software system that uses techniques from corpus linguistics, computational linguistics, and machine learning to grade essays. In this study, we aimed to describe and evaluate particular language features of Coh-Metrix for a novel AES program that would score junior and senior high school students’ essays from their large-scale assessments. Specifically, we studied nine categories of Coh-Metrix features for developing prompt-specific AES scoring models for our sample. We developed the models by capitalizing on the nine features’ informativeness as a function of dimensionality reduction, using a three-stage scoring framework. The machine scores were validated against a “gold standard” of ratings, namely those assigned by two human raters. The nine language features reliably captured the construct of the students’ writing quality. In a secondary analysis comparing the scoring models with other, already established AES systems, we found no systematic pattern of scoring discrepancy. However, for essays with widely divergent human ratings, the scoring models were disadvantaged owing to the inherent unreliability of the human scores.
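The pipeline the abstract describes — extract language features per essay, reduce their dimensionality, fit a prompt-specific scoring model, then validate machine scores against human ratings — can be sketched as below. Everything here is an illustrative assumption rather than the paper's actual method: the feature values are synthetic stand-ins for the nine Coh-Metrix categories, and the component count, ridge penalty, and PCA-plus-ridge-regression choice are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 essays, each with 9 feature values standing in
# for the nine Coh-Metrix categories (synthetic, for illustration only).
n_essays, n_features = 200, 9
X = rng.normal(size=(n_essays, n_features))

# Simulated "gold standard": human ratings linearly related to the
# features plus rater noise (purely to make the sketch runnable).
true_w = rng.normal(size=n_features)
human_scores = X @ true_w + rng.normal(scale=0.5, size=n_essays)

# Stage 1: dimensionality reduction via PCA (SVD of centered features);
# k is an illustrative choice of retained components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
Z = Xc @ Vt[:k].T  # essays projected onto the top-k components

# Stage 2: fit a ridge-regression scoring model on the reduced features
# (closed-form solution; lam is an assumed regularization strength).
lam = 1.0
A = Z.T @ Z + lam * np.eye(k)
w = np.linalg.solve(A, Z.T @ (human_scores - human_scores.mean()))

# Stage 3: validate machine scores against the human "gold standard"
# via their correlation.
machine_scores = Z @ w + human_scores.mean()
r = np.corrcoef(machine_scores, human_scores)[0, 1]
print(f"Pearson r between machine and human scores: {r:.3f}")
```

In practice AES validation typically also reports exact/adjacent agreement or quadratic weighted kappa against the human raters, but a simple correlation suffices to show the shape of the three-stage framework.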

