Appearance-Based Gaze Estimator for Natural Interaction Control of Surgical Robots


Bibliographic Details
Main Authors: Peng Li, Xuebin Hou, Xingguang Duan, Hiuman Yip, Guoli Song, Yunhui Liu
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/8648437/
Description
Summary: Robots are playing an increasingly important role in modern surgery. However, conventional human–computer interaction methods, such as joystick control and voice control, have shortcomings, and medical personnel must specifically practice operating the robot. We propose a human–computer interaction model based on eye movement, with which medical staff can conveniently control the robot using their eye movements. Our algorithm requires only an RGB camera, without expensive eye-tracking devices. Two eye-control modes are designed in this paper. The first is pick-and-place movement, in which the user's gaze specifies the point to which the robotic arm should move. The second is user-command movement, in which the user's gaze selects the direction in which the robot should move. The experimental results demonstrate the feasibility and convenience of both modes.
ISSN:2169-3536