Integral reinforcement learning-based event-triggered H∞ control algorithm for affine nonlinear systems with asymmetric input saturation and external disturbances

Bibliographic Details
Main Authors: Luy Nguyen Tan, Dien Nguyen Duc
Format: Article
Language: English
Published: Elsevier 2024-09-01
Series: Franklin Open
Subjects:
Online Access: http://www.sciencedirect.com/science/article/pii/S2773186324000628
Description
Summary: This paper presents a novel integral reinforcement learning (IRL)-based event-triggered (ET) H∞ control algorithm for affine continuous-time nonlinear systems with completely unknown drift dynamics, asymmetric input saturation, and external disturbances. The algorithm uses zero-sum game theory to reject external disturbances and an ET mechanism to reduce communication costs and computational bandwidth. Compared with existing ET control schemes, the algorithm is the first to address ET H∞ control for systems with asymmetric input saturation while relaxing the need to identify the unknown part of the dynamics. ET control laws and worst-case disturbance strategies are approximated synchronously under a designed triggering threshold. Stability is guaranteed by Lyapunov analysis, and Zeno behavior is avoided because the inter-event time is strictly greater than zero. Comparative simulation results confirm the effectiveness of the proposed algorithm.
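The ET mechanism described in the abstract updates the controller only when a triggering condition fires, rather than at every sampling instant. The following is a minimal sketch of the generic sample-and-hold event-triggering principle on a scalar linear plant; the paper's designed triggering threshold, H∞ game formulation, and IRL learning laws are not reproduced here, and the plant parameters, gain, and threshold below are assumed values chosen only for illustration.

```python
def simulate_event_triggered(threshold=0.05, dt=0.01, T=5.0):
    """Sample-and-hold event-triggered feedback on dx/dt = a*x + u.

    The controller holds the last transmitted state x_hat and only
    receives a new sample when |x - x_hat| exceeds the threshold.
    Returns (integration steps, controller updates, final state).
    """
    a, k = 0.5, 2.0          # unstable plant pole and stabilizing gain (assumed)
    x, x_hat = 1.0, 1.0      # true state and last-transmitted state
    updates = 1              # initial transmission
    n_steps = round(T / dt)
    for _ in range(n_steps):
        u = -k * x_hat                    # control uses the held state
        x += (a * x + u) * dt             # forward-Euler integration
        if abs(x - x_hat) > threshold:    # event condition fires
            x_hat = x                     # transmit / update controller state
            updates += 1
    return n_steps, updates, x

steps, updates, x_final = simulate_event_triggered()
print(steps, updates, round(x_final, 4))
```

Because the controller is updated far fewer times than the integrator steps, communication is reduced, at the cost of the state settling into a small neighborhood of the origin whose size scales with the threshold; the paper's contribution lies in designing that threshold so H∞ performance and a positive inter-event time are preserved while learning the unknown dynamics.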
ISSN:2773-1863