Keywords: Combinatorial optimization problem, Implicit bias, Premature convergence, Pareto principle, Mathematical optimization, Optimization problem, Genetic algorithm, Mathematics, Multi-objective optimization, Entropy (information theory)
Abstract: This paper discusses a structure of multi-objective optimization problems which causes deception for conventional Multi-Objective Genetic Algorithms (MOGAs). Further, we propose a Distributed MOGA (DMOGA), which employs a multiple-subpopulation implementation and a replacement scheme based on information-theoretic entropy, to improve the performance of MOGAs on such deceptive problems. Several studies have reported MOGAs' difficulties in generating the marginal segments of the Pareto front in combinatorial problems, though the structural causes of these behaviors have not yet been thoroughly studied. Our analysis of two test problems suggests that the use of local density in selection acts as an implicit bias and results in premature convergence. DMOGA is a distributed MOGA that emphasizes the diversity of subpopulations by means of the entropy of the objective functions. This approach alleviates premature convergence and enables the algorithm to effectively generate Pareto fronts in complex problems. In a set of simulated experiments, the proposed method generated more comprehensive Pareto fronts than conventional MOGAs, i.e., NSGA-II and SPEA2, on the deceptive functions, and also achieved comparable results on standard benchmarks.
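The abstract's core idea is using information-theoretic entropy as a diversity measure over a subpopulation's objective values. The paper's exact replacement scheme is not reproduced here; the following is a minimal illustrative sketch of one plausible reading, where the Shannon entropy of a binned objective distribution is computed so that a replacement step could favor keeping it high. The function name `objective_entropy` and the equal-width binning are assumptions for illustration, not the authors' formulation.

```python
import math
from collections import Counter

def objective_entropy(objective_values, num_bins=10):
    """Shannon entropy (in bits) of a population's objective values,
    binned into num_bins equal-width intervals.

    Higher entropy indicates a more evenly spread (diverse) population;
    an entropy-aware replacement scheme would prefer candidates that
    keep this value high, counteracting premature convergence.
    """
    lo, hi = min(objective_values), max(objective_values)
    if hi == lo:
        return 0.0  # all individuals identical: zero diversity
    width = (hi - lo) / num_bins
    # Histogram of bin occupancy; the top edge is clamped into the last bin.
    bins = Counter(min(int((v - lo) / width), num_bins - 1)
                   for v in objective_values)
    n = len(objective_values)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

# A spread-out population scores higher than a clustered one.
spread = [0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.85, 0.95]
clustered = [0.50, 0.51, 0.50, 0.52, 0.49, 0.51, 0.50, 0.52, 0.49, 0.51]
```

Under this sketch, `objective_entropy(spread)` exceeds `objective_entropy(clustered)`, so a migration/replacement rule maximizing it would retain individuals from the marginal segments of the front rather than letting the subpopulation collapse onto a dense central cluster.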