Robot System Assistant (RoSA): evaluation of touch and speech input modalities for on-site HRI and telerobotics

Future work scenarios envision increased collaboration between humans and robots, emphasizing the need for versatile interaction modalities. Robotic systems can support various use cases, including on-site operations and telerobotics. This study investigates a hybrid interaction model in which a single user engages with the same robot both on-site and remotely. Specifically, the Robot System Assistant (RoSA) framework is evaluated to assess the effectiveness of touch and speech input modalities in these contexts. The participants interact with two robots, Rosa and Ari, utilizing both input modalities. The results reveal that touch input excels in precision and task efficiency, while speech input is preferred for its intuitive and natural interaction flow. These findings contribute to understanding the complementary roles of touch and speech in hybrid systems and their potential for future telerobotic applications.


Bibliographic Details
Main Authors: Dominykas Strazdas, Matthias Busch, Rijin Shaji, Ingo Siegert, Ayoub Al-Hamadi
Format: Article
Language:English
Published: Frontiers Media S.A. 2025-07-01
Series:Frontiers in Robotics and AI
Subjects: human-robot interaction; telerobotics; touch; speech; multimodal; user study
Online Access:https://www.frontiersin.org/articles/10.3389/frobt.2025.1561188/full
ISSN: 2296-9144
DOI: 10.3389/frobt.2025.1561188
Author affiliations:
Dominykas Strazdas: Neuro-Information Technology Group, Faculty of Electrical Engineering and Information Technology, Otto-von-Guericke-University, Magdeburg, Germany
Matthias Busch: Mobile Dialog Systems, Faculty of Electrical Engineering and Information Technology, Otto-von-Guericke-University, Magdeburg, Germany
Rijin Shaji: Neuro-Information Technology Group, Faculty of Electrical Engineering and Information Technology, Otto-von-Guericke-University, Magdeburg, Germany
Ingo Siegert: Mobile Dialog Systems, Faculty of Electrical Engineering and Information Technology, Otto-von-Guericke-University, Magdeburg, Germany
Ayoub Al-Hamadi: Neuro-Information Technology Group, Faculty of Electrical Engineering and Information Technology, Otto-von-Guericke-University, Magdeburg, Germany