Cross-modal matching of monosyllabic and bisyllabic items varying in phonotactic probability and lexicality


Bibliographic Details
Main Author: Kauyumari Sanchez
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-02-01
Series: Frontiers in Language Sciences
Subjects:
Online Access: https://www.frontiersin.org/articles/10.3389/flang.2025.1488399/full
Description
Summary: In two experiments, English words and non-words varying in phonotactic probability were cross-modally compared in an AB matching task. Participants were presented with either visual-only (V) speech (a talker's speaking face) or auditory-only (A) speech (a talker's voice) in the A position. Stimuli in the B position were of the opposing modality (counterbalanced). Experiment 1 employed monosyllabic items, while Experiment 2 employed bisyllabic items. Accuracy measures for Experiment 1 revealed main effects of phonotactic probability and presentation order (A-V vs. V-A), while Experiment 2 revealed main effects of lexicality and presentation order. Reaction time measures for Experiment 1 revealed an interaction between probability and lexicality, along with a main effect of presentation order. Reaction time measures for Experiment 2 revealed two two-way interactions (probability × lexicality, and probability × presentation order), along with significant main effects. Overall, the data suggest that (1) cross-modal research can be conducted with various presentation orders, (2) perception is guided by the most predictive components of a stimulus, and (3) more complex stimuli can support the results from experiments using simpler stimuli but can also uncover new information.
ISSN:2813-4605