The ECOLANG Multimodal Corpus of adult-child and adult-adult Language

Abstract: Communication comprises a wealth of multimodal signals (e.g., gestures, eye gaze, intonation) in addition to speech, and there is growing interest in the study of multimodal language among psychologists, linguists, neuroscientists, and computer scientists. The ECOLANG corpus provides audiovisual recordings and ELAN annotations of multimodal behaviours (speech transcription, gesture, object manipulation, and eye gaze) by British and American English-speaking adults engaged in semi-naturalistic conversation with their child (N = 38, children 3-4 years old, faces blurred) or with a familiar adult (N = 31). Speakers were asked to talk about objects to their interlocutors. We further manipulated whether the objects were familiar or novel to the interlocutor and whether the objects could be seen and manipulated (present or absent) during the conversation. These conditions reflect common interaction scenarios in real-world communication. Thus, ECOLANG provides ecologically valid data about the distribution and co-occurrence of multimodal signals across these conditions, for cognitive scientists and neuroscientists interested in addressing questions concerning real-world language acquisition, production, and comprehension, and for computer scientists developing multimodal language models and more human-like artificial agents.

Bibliographic Details
Main Authors: Yan Gu, Ed Donnellan, Beata Grzyb, Gwen Brekelmans, Margherita Murgiano, Ricarda Brieke, Pamela Perniss, Gabriella Vigliocco
Affiliations: Experimental Psychology, University College London (Gu, Donnellan, Grzyb, Murgiano, Brieke, Vigliocco); Department of Biological and Experimental Psychology, Queen Mary University of London (Brekelmans); Department of Rehabilitation and Special Education, University of Cologne (Perniss)
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Scientific Data
ISSN: 2052-4463
Online Access: https://doi.org/10.1038/s41597-025-04405-1