Stochastic Multiple Chaotic Local Search-Incorporated Gradient-Based Optimizer

Bibliographic Details
Main Authors: Hang Yu, Yu Zhang, Pengxing Cai, Junyan Yi, Sheng Li, Shi Wang
Format: Article
Language: English
Published: Wiley 2021-01-01
Series: Discrete Dynamics in Nature and Society
Online Access: http://dx.doi.org/10.1155/2021/3353926
Description
Summary: In this study, a hybrid metaheuristic algorithm, the chaotic gradient-based optimizer (CGBO), is proposed. The gradient-based optimizer (GBO) is a novel metaheuristic inspired by Newton's method that relies on two search strategies to ensure strong performance: the gradient search rule (GSR) and the local escaping operation (LEO). The GSR exploits gradient information to enhance exploitation ability and convergence rate, while the LEO employs random operators to escape local optima. However, gradient-based metaheuristic algorithms have been shown to exhibit clear shortcomings in exploration. Chaotic local search (CLS), by contrast, is an efficient search strategy whose randomness and ergodicity are often used to improve global optimization algorithms. Accordingly, GBO is incorporated with CLS to strengthen the exploration ability and maintain high population diversity of the original GBO. In this study, CGBO is tested on 30 CEC2017 benchmark functions and on a parameter optimization problem of the dendritic neuron model (DNM). Experimental results indicate that CGBO outperforms other state-of-the-art algorithms in terms of effectiveness and robustness.
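The abstract does not reproduce the CGBO update equations, but the chaotic local search it describes can be illustrated in a few lines. The Python sketch below assumes a single logistic-map CLS applied greedily around the best-so-far solution; the function name chaotic_local_search and the parameters n_iter, shrink, and z0, as well as the 10% initial radius, are illustrative assumptions, not the paper's stochastic multiple-CLS scheme, which draws on several chaotic maps.

```python
import numpy as np

def chaotic_local_search(f, x_best, lb, ub, n_iter=50, shrink=0.9, z0=0.7):
    """Greedy chaotic local search around the incumbent best solution.

    A logistic map generates an ergodic sequence in (0, 1) that perturbs
    x_best inside a neighborhood that contracts each iteration. This is a
    generic single-map sketch, not the paper's multiple-CLS scheme.
    """
    z = z0                        # chaotic variable; avoid fixed points 0, 0.25, 0.5, 0.75, 1
    radius = (ub - lb) * 0.1      # initial per-dimension radius (assumed 10% of the range)
    x, fx = x_best.copy(), f(x_best)
    for _ in range(n_iter):
        z = 4.0 * z * (1.0 - z)  # logistic map at mu = 4 (fully chaotic regime)
        candidate = np.clip(x + (2.0 * z - 1.0) * radius, lb, ub)  # map z to [-radius, radius]
        fc = f(candidate)
        if fc < fx:              # accept only improving moves
            x, fx = candidate, fc
        radius *= shrink         # contract the neighborhood to refine the search
    return x, fx

# Toy usage on the sphere function (illustrative, not a CEC2017 benchmark run)
if __name__ == "__main__":
    sphere = lambda v: float(np.sum(v * v))
    x, fx = chaotic_local_search(sphere,
                                 x_best=np.array([0.6, -0.4]),
                                 lb=np.array([-1.0, -1.0]),
                                 ub=np.array([1.0, 1.0]))
    print(x, fx)
```

Because the logistic map's ergodicity eventually visits the whole neighborhood while never repeating, such a search can jump out of a local basin that a pure gradient step would stay trapped in, which is the complementarity between CLS and the GSR that the abstract points to.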
ISSN: 1607-887X