A New Contact Force Estimation Method for Heavy Robots Without Force Sensors by Combining CNN-GRU and Force Transformation


Bibliographic Details
Main Authors: Peizhang Wu, Hui Dong, Pengfei Li, Yifei Bao, Wei Dong, Lining Sun
Format: Article
Language: English
Published: MDPI AG 2025-05-01
Series: Technologies
Subjects:
Online Access: https://www.mdpi.com/2227-7080/13/5/192
Description
Summary: To meet the safety-control requirements of heavy-robot operations, and to overcome the cumbersome, time-consuming, inaccurate, and poorly real-time end contact force estimation that traditional manual modeling and identification methods deliver when no force sensor is available, this paper proposes a new contact force estimation method for heavy robots without force sensors that combines CNN-GRU and force transformation. First, the CNN-GRU machine learning method is used to construct the robot's Joint Motor Current–Joint External Force Model; then the Joint External Force–End Contact Force Model is constructed via a Kalman filter and the Jacobian force transformation method, and the two models are finally combined to estimate the robot's end contact force. The method estimates end contact force without a force sensor, avoiding the cumbersome manual modeling and identification process. Experiments show that, compared with traditional manual modeling and identification methods, the proposed method approximately doubles the contact-force estimation accuracy for heavy robots and roughly halves the time consumption, offering convenience, efficiency, strong real-time performance, and high accuracy.
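The second stage described in the summary, mapping estimated joint external torques to an end contact force via the Jacobian, rests on the static relation τ = JᵀF. A minimal sketch of that force transformation is shown below for a hypothetical planar 2-link arm (link lengths, joint angles, and forces are illustrative assumptions, not values from the paper; the paper's heavy robot, CNN-GRU model, and Kalman filter are not reproduced here):

```python
import numpy as np

def planar_2link_jacobian(q, l1=1.0, l2=0.8):
    """Geometric Jacobian of a hypothetical planar 2-link arm
    (illustrative stand-in for the robot's kinematic model)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

def end_force_from_joint_torques(q, tau):
    """Static force transformation: tau = J^T F, so F = pinv(J^T) @ tau.
    In the paper's pipeline, tau would come from the CNN-GRU
    current-to-external-force model after Kalman filtering."""
    J = planar_2link_jacobian(q)
    return np.linalg.pinv(J.T) @ tau

# Round-trip check: project a known tip force to joint torques, then recover it.
q = np.array([0.3, 0.7])
F_true = np.array([5.0, -2.0])
tau = planar_2link_jacobian(q).T @ F_true
F_est = end_force_from_joint_torques(q, tau)
```

Away from singular configurations the pseudo-inverse recovers the tip force exactly, which is why the transformation stage adds little error or latency on top of the learned torque estimate.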
ISSN:2227-7080