Standards-aligned annotations reveal organizational patterns in argumentative essays at scale

Bibliographic Details
Main Authors: Amy Burkhardt, Suhwa Han, Sherri Woolf, Allison Boykin, Frank Rijmen, Susan Lottridge
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-06-01
Series: Frontiers in Education
Online Access: https://www.frontiersin.org/articles/10.3389/feduc.2025.1569529/full
Description
Summary: While scoring rubrics are widely used to evaluate student writing, they often fail to provide actionable feedback. Delivering such feedback—especially in an automated, scalable manner—requires the standardized detection of finer-grained information within a student’s essay. Achieving this level of detail demands the same rigor in development and training as creating a high-quality rubric. To this end, we describe the development of annotation guidelines aligned with state standards for detecting these elements, outline the annotator training process, and report strong inter-rater agreement results from a large-scale annotation effort involving nearly 20,000 essays. To further validate this approach, we connect annotations to broader patterns in student writing using Latent Class Analysis (LCA). Through this analysis, we identify distinct writing patterns from these fine-grained annotations and demonstrate their meaningful associations with overall rubric scores. Our findings show promise for how fine-grained analysis of argumentative essays can support students, at scale, in becoming more effective argumentative essay writers.
ISSN: 2504-284X