Abstract:
This dissertation studies the roles of negative correlation learning of building blocks in evolutionary algorithms. It is based on a hypothesis called the Negative Building Block Hypothesis (NBBH), which states that "an algorithm can seek near-optimal performance by avoiding the juxtaposition of short, low-order, low-performance schemata, called negative building blocks". The hypothesis is tested by developing a new edge-based estimation of distribution algorithm named the Coincidence algorithm (COIN). The algorithm utilizes the negative building blocks contained in below-average solutions in order to avoid composing bad substructures. COIN is tested on several multimodal combinatorial problems. The results show that the negative correlation capability of COIN contributes to both the quantity and the quality of the solutions. In summary, the roles of negative knowledge in combinatorial optimization extracted from the experiments are as follows: (i) Negative knowledge forces the algorithm to explore outside the regions of the search space marked as forbidden. (ii) Negative knowledge helps produce more diverse solutions that are dissimilar to solutions considered to be of bad quality. (iii) In cooperation with positive knowledge, negative knowledge contributes to discriminating between good and bad building blocks. (iv) Negative knowledge enables a constructive algorithm to recognize better substructures and to compose better solutions. Finally, COIN is tested on several real-world multiobjective applications and shows competitive results compared with the other algorithms in the experiments.
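The edge-based learning described above can be pictured with a toy update rule: an edge-probability matrix is rewarded with edges from an above-average solution and punished with edges from a below-average one. This is a minimal illustrative sketch, assuming a simple additive reward/punishment with step size `k` and row renormalization; the matrix `H`, the function name, and the update rule here are placeholders for illustration, not the dissertation's actual COIN equations.

```python
def update_edge_matrix(H, good_tour, bad_tour, k=0.1):
    """Illustrative edge-probability update.

    H          -- n x n matrix; H[a][b] is the probability of moving a -> b
    good_tour  -- an above-average permutation (source of positive knowledge)
    bad_tour   -- a below-average permutation (source of negative knowledge)
    k          -- placeholder step size for reward/punishment
    """
    n = len(H)

    def edges(tour):
        # Consecutive city pairs, including the closing edge of the cycle.
        return [(tour[i], tour[(i + 1) % n]) for i in range(n)]

    # Reward: make edges seen in the good solution more likely.
    for a, b in edges(good_tour):
        H[a][b] += k

    # Punish: suppress edges seen in the bad solution (negative building blocks),
    # clamping so probabilities stay positive.
    for a, b in edges(bad_tour):
        H[a][b] = max(H[a][b] - k, 1e-9)

    # Renormalize each row so it remains a probability distribution.
    for row in H:
        s = sum(row)
        for j in range(n):
            row[j] /= s
    return H
```

Sampling new tours from the renormalized rows would then steer construction toward rewarded edges and away from punished ones, mirroring how negative knowledge marks substructures to be avoided.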