Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory
Learning skills autonomously through interaction with the environment is a crucial ability for an intelligent robot. Perception-action integration, or the sensorimotor cycle, is an important issue in imitation learning because it provides a natural learning mechanism that avoids complex manual programming. Recently, neurocomputing models...
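The full description below outlines the pipeline: sequential sensor readings are encoded as Sparse Distributed Representation (SDR) vectors, and an HTM sequence memory learns perception-to-action transitions and predicts the next action. The following Python/NumPy sketch is only a toy illustration of that idea under assumptions of our own: a contiguous-bucket scalar encoder and a first-order, overlap-based transition memory stand in for the paper's full HTM spatial pooler and temporal memory, and the encoder parameters and demonstration values are invented for the example.

```python
# Toy sketch of the perception-action idea described in the abstract:
# sensor readings -> Sparse Distributed Representations (SDRs) -> a memory
# that stores perception->action transitions and predicts the next action.
# This is a simplified stand-in, NOT the authors' HTM implementation; the
# encoder sizes and the demonstration data below are assumptions.
import numpy as np

N_BITS = 256      # SDR width
ACTIVE_BITS = 8   # active bits per encoded value (~3% sparsity)

def encode_scalar(value, vmin=0.0, vmax=1.0):
    """Scalar encoder: map a value in [vmin, vmax] to a contiguous run of
    active bits, so that nearby values share bits (overlapping SDRs)."""
    sdr = np.zeros(N_BITS, dtype=np.uint8)
    frac = np.clip((value - vmin) / (vmax - vmin), 0.0, 1.0)
    start = int(frac * (N_BITS - ACTIVE_BITS))
    sdr[start:start + ACTIVE_BITS] = 1
    return sdr

def overlap(a, b):
    """Similarity between two SDRs = number of shared active bits."""
    return int(np.dot(a, b))

class TransitionMemory:
    """Stores (perceived SDR -> next action SDR) transitions and predicts the
    action whose stored perception best overlaps the current input."""
    def __init__(self):
        self.transitions = []  # list of (perception_sdr, action_sdr)

    def learn(self, perception, action):
        self.transitions.append((perception.copy(), action.copy()))

    def predict(self, perception):
        if not self.transitions:
            return None
        scores = [overlap(perception, p) for p, _ in self.transitions]
        return self.transitions[int(np.argmax(scores))][1]

if __name__ == "__main__":
    memory = TransitionMemory()
    # Toy "demonstration": a hand approaching the robot (distance shrinking)
    # paired with an arm-extension command that grows accordingly.
    demo_distances = [0.9, 0.7, 0.5, 0.3, 0.1]
    demo_commands = [0.1, 0.3, 0.5, 0.7, 0.9]
    for dist, cmd in zip(demo_distances, demo_commands):
        memory.learn(encode_scalar(dist), encode_scalar(cmd))

    # At run time, a new (noisy) perception recalls the closest learned action.
    predicted = memory.predict(encode_scalar(0.32))
    print("predicted action SDR active bits:", np.flatnonzero(predicted))
```

Because the encoder gives nearby sensor values overlapping SDRs, a noisy perception still recalls the nearest learned transition; this overlap property is what the full HTM temporal memory exploits at a much larger scale and over longer sequences.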
| Main Authors: | Xinzheng Zhang, Jianfen Zhang, Junpei Zhong |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2017-01-01 |
| Series: | Complexity |
| Online Access: | http://dx.doi.org/10.1155/2017/7948684 |
| _version_ | 1849402714490929152 |
|---|---|
| author | Xinzheng Zhang; Jianfen Zhang; Junpei Zhong |
| author_facet | Xinzheng Zhang; Jianfen Zhang; Junpei Zhong |
| author_sort | Xinzheng Zhang |
| collection | DOAJ |
| description | Learning skills autonomously through interaction with the environment is a crucial ability for an intelligent robot. Perception-action integration, or the sensorimotor cycle, is an important issue in imitation learning because it provides a natural learning mechanism that avoids complex manual programming. Recently, neurocomputing models and developmental intelligence methods have been regarded as a new trend for implementing robot skill learning. In this paper, based on research into models of the human neocortex, we present a skill learning method that uses a perception-action integration strategy from the perspective of hierarchical temporal memory (HTM) theory. Sequential sensor data representing a certain skill are received from an RGB-D camera and encoded as a sequence of Sparse Distributed Representation (SDR) vectors. These sequential SDR vectors are treated as the inputs of the perception-action HTM, which learns the sequences of SDRs, predicts what the next input SDR will be, and stores the transitions between the currently perceived sensor data and the next predicted actions. We evaluated the proposed framework by learning a handshaking skill on a humanoid NAO robot. The experimental results show that the proposed skill learning method is promising. |
| format | Article |
| id | doaj-art-96519cb5cb5443e2842c4dd6ea90fdce |
| institution | Kabale University |
| issn | 1076-2787; 1099-0526 |
| language | English |
| publishDate | 2017-01-01 |
| publisher | Wiley |
| record_format | Article |
| series | Complexity |
| spelling | Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory. Xinzheng Zhang (School of Electrical and Information Engineering, Jinan University, Zhuhai, China); Jianfen Zhang (School of Electrical and Information Engineering, Jinan University, Zhuhai, China); Junpei Zhong (National Institute of Advanced Industrial Science and Technology (AIST), Tokyo, Japan). Complexity, Wiley, 2017-01-01. doi:10.1155/2017/7948684. http://dx.doi.org/10.1155/2017/7948684 |
| spellingShingle | Xinzheng Zhang Jianfen Zhang Junpei Zhong Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory Complexity |
| title | Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory |
| title_full | Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory |
| title_fullStr | Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory |
| title_full_unstemmed | Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory |
| title_short | Skill Learning for Intelligent Robot by Perception-Action Integration: A View from Hierarchical Temporal Memory |
| title_sort | skill learning for intelligent robot by perception action integration a view from hierarchical temporal memory |
| url | http://dx.doi.org/10.1155/2017/7948684 |
| work_keys_str_mv | AT xinzhengzhang skilllearningforintelligentrobotbyperceptionactionintegrationaviewfromhierarchicaltemporalmemory AT jianfenzhang skilllearningforintelligentrobotbyperceptionactionintegrationaviewfromhierarchicaltemporalmemory AT junpeizhong skilllearningforintelligentrobotbyperceptionactionintegrationaviewfromhierarchicaltemporalmemory |