A crowding free digital interface to help French-speaking children learn to read.

Bibliographic Details
Main Authors: Viet Chau Linh Nguyen, Guillaume Lio, Thomas Perret, Alice Gomez, Angela Sirigu
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0323623
Description
Summary: Learning to read is a challenging task for first-graders. Letter crowding in the peripheral visual field has been identified as a key interference process during reading acquisition. To reduce crowding and enhance selective attention, we designed a new way to read (Digit-tracking) in which words and sentences appear blurred. By sliding the index finger along the blurred text, the letters just above the finger position appear unblurred and are seen in foveal vision. We hypothesized that this approach might facilitate orthographic decoding and promote reading skills. Using a tablet device, two groups of first-grade children (N = 54) were trained on digit-tracking exercises and paper exercises using a crossover design. Results showed that performance in letter, syllable and meaningless text-reading was significantly higher after digit-tracking training compared to paper-based training. Using the recorded finger trajectories as a proxy for eye movements, we found that text scanning patterns (saccade length, landing position, regressive saccades) predicted children's decoding and fluency. We conclude that training with the digit-tracking procedure accelerates decoding and reading fluency in school beginners and may provide a sensitive metric of reading competence.
ISSN:1932-6203