Sensory Integration with Articulated Motion on a Humanoid Robot
This paper describes the integration of articulated motion with auditory and visual sensory information that enables a humanoid robot to achieve certain reflex actions that mimic those of people. Reflexes such as reach-and-grasp behavior enable the robot to learn, through experience, its own state...
| Main Authors: | J. Rojas, R. A. Peters II |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2005-01-01 |
| Series: | Applied Bionics and Biomechanics |
| Online Access: | http://dx.doi.org/10.1533/abbi.2004.0057 |
Similar Items
- Motion Planning and Control with Environmental Uncertainties for Humanoid Robot
  by: Zhiyong Jiang, et al.
  Published: (2024-11-01)
- From text to motion: grounding GPT-4 in a humanoid robot “Alter3”
  by: Takahide Yoshida, et al.
  Published: (2025-05-01)
- Coordinating the Redundant DOFs of Humanoid Robots
  by: Pietro Morasso
  Published: (2025-07-01)
- Humanoid Robot Technology and Industry Development
  by: Chenghao Xu, et al.
  Published: (2025-02-01)
- Learning aerodynamics for the control of flying humanoid robots
  by: Antonello Paolino, et al.
  Published: (2025-06-01)