Title: Integrated optimization algorithm: a metaheuristic approach for complicated optimization
Authors: Li, Chen
Chen, Guo
Liang, Gaoqi
Luo, Fengji
Zhao, Junhua
Dong, Zhao Yang
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Source: Li, C., Chen, G., Liang, G., Luo, F., Zhao, J. & Dong, Z. Y. (2022). Integrated optimization algorithm: a metaheuristic approach for complicated optimization. Information Sciences, 586, 424-449.
Journal: Information Sciences
Abstract: This paper proposes an integrated optimization algorithm (IOA) designed for solving complicated optimization problems that are non-convex, non-differentiable, non-continuous, or computationally intensive. IOA is synthesized from five sub-algorithms: follower search, leader search, wanderer search, crossover search, and role learning. The follower search finds better solutions by tracing the leaders. The leader search refines the current optimal solutions by approaching or deviating from the central point of the population and then executes a single round of coordinate descent. The wanderer search carries out a comprehensive expansion of the search space. The crossover search generates offspring from the solutions of superior parents. Role learning automates the process by which a search agent decides whether to become a follower or a wanderer. A global optima estimation framework (GOEF) is proposed to offer guidelines for designing an efficient optimization algorithm, and IOA is proved to attain global optima. A differentiable integrated optimization algorithm (DIOA), which extends gradient descent, is put forward to train deep learning models. Empirical case studies on 27 benchmark functions show that IOA converges much faster and finds better solutions than eight comparative algorithms. IOA has also been applied to unit commitment problems in power systems, with satisfactory results. A power line sub-image classification model based on a convolutional neural network (CNN) is optimized by DIOA. Compared with the pure gradient descent approach, DIOA converges significantly faster and reaches high test-set accuracy in far fewer training epochs.
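The leader/follower/wanderer role structure described in the abstract can be illustrated with a deliberately simplified sketch. This is not the authors' algorithm: all names, parameter values, and update rules below are assumptions made for illustration only, and the crossover-search, coordinate-descent, and role-learning components are omitted.

```python
import random

def sphere(x):
    """Toy objective: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def toy_population_search(f, dim=2, pop_size=20, iters=200,
                          bounds=(-5.0, 5.0), seed=0):
    """Minimal population loop in the spirit of IOA's role structure:
    the best agents act as leaders, most agents follow a leader,
    and occasional 'wanderers' re-sample the search space at random."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(iters):
        pop.sort(key=f)               # best solutions first
        leaders = pop[:2]             # top agents lead this iteration
        for i in range(2, pop_size):
            if rng.random() < 0.8:    # follower: move toward a leader
                lead = rng.choice(leaders)
                pop[i] = [x + rng.uniform(0.0, 1.0) * (l - x)
                          for x, l in zip(pop[i], lead)]
            else:                     # wanderer: explore globally
                pop[i] = [rng.uniform(lo, hi) for _ in range(dim)]
    return min(pop, key=f)

best = toy_population_search(sphere)
print(sphere(best))  # a small value near the optimum 0
```

The 80/20 follower/wanderer split here is an arbitrary placeholder for the role-learning mechanism, which in the paper adapts each agent's role automatically.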
ISSN: 0020-0255
DOI: 10.1016/j.ins.2021.11.043
Rights: © 2021 Elsevier Inc. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Journal Articles

Citations: 20 (updated Jan 30, 2023)
Web of Science citations: 20 (updated Jan 28, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.