TY - JOUR
T1 - Evolutionary Dynamic Multiobjective Optimization
T2 - Benchmarks and Algorithm Comparisons
AU - Jiang, Shouyong
AU - Yang, Shengxiang
N1 - Funding Information:
This work was supported in part by the Engineering and Physical Sciences Research Council of U.K. under Grant EP/K001310/1, and in part by the National Natural Science Foundation of China under Grant 61273031.
Publisher Copyright:
© 2016 IEEE.
PY - 2017/1
Y1 - 2017/1
N2 - Dynamic multiobjective optimization (DMO) has received growing research interest in recent years since many real-world optimization problems not only have multiple conflicting objectives but also change over time. The time-varying characteristics of these DMO problems (DMOPs) pose new challenges to evolutionary algorithms. Considering the importance of a representative and diverse set of benchmark functions for DMO, in this paper, we propose a new benchmark generator that is able to tune a number of challenging characteristics, including a mixed (convex-concave) Pareto-optimal front, nonmonotonic and time-varying variable linkages, mixed types of changes, and randomness in the type of change, which have rarely, if ever, been considered or tested in the literature. A test suite of ten instances with different dynamic features is produced from this generator. Additionally, a few new performance measures are proposed to evaluate algorithms for DMOPs with different characteristics. Six representative multiobjective evolutionary algorithms from the literature are investigated based on the proposed DMO test suite and performance measures. The experimental results facilitate a better understanding of the strengths and weaknesses of the compared algorithms on DMOPs.
AB - Dynamic multiobjective optimization (DMO) has received growing research interest in recent years since many real-world optimization problems not only have multiple conflicting objectives but also change over time. The time-varying characteristics of these DMO problems (DMOPs) pose new challenges to evolutionary algorithms. Considering the importance of a representative and diverse set of benchmark functions for DMO, in this paper, we propose a new benchmark generator that is able to tune a number of challenging characteristics, including a mixed (convex-concave) Pareto-optimal front, nonmonotonic and time-varying variable linkages, mixed types of changes, and randomness in the type of change, which have rarely, if ever, been considered or tested in the literature. A test suite of ten instances with different dynamic features is produced from this generator. Additionally, a few new performance measures are proposed to evaluate algorithms for DMOPs with different characteristics. Six representative multiobjective evolutionary algorithms from the literature are investigated based on the proposed DMO test suite and performance measures. The experimental results facilitate a better understanding of the strengths and weaknesses of the compared algorithms on DMOPs.
KW - Benchmark
KW - dynamic multiobjective optimization (DMO)
KW - evolutionary algorithm
KW - performance metric
UR - http://www.scopus.com/inward/record.url?scp=85027705941&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2015.2510698
DO - 10.1109/TCYB.2015.2510698
M3 - Article
C2 - 26766387
AN - SCOPUS:85027705941
VL - 47
SP - 198
EP - 211
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
SN - 2168-2267
IS - 1
ER -