Paper Title


Multi-layer local optima networks for the analysis of advanced local search-based algorithms

Paper Authors

Martins, Marcella Scoczynski Ribeiro; Yafrani, Mohamed El; Delgado, Myriam R. B. S.; Luders, Ricardo

Paper Abstract


A Local Optima Network (LON) is a graph model that compresses the fitness landscape of a particular combinatorial optimization problem based on a specific neighborhood operator and a local search algorithm. Determining which landscape features affect the effectiveness of search algorithms, and how, is relevant both for predicting their performance and for improving the design process. This paper proposes the concept of multi-layer LONs, as well as a methodology to explore these models with the aim of extracting metrics for fitness landscape analysis. Constructing such models and extracting and analyzing their metrics are preliminary steps toward extending the study of single-neighborhood-operator heuristics to more sophisticated heuristics that use multiple operators. Therefore, in the present paper we investigate a two-layer LON obtained from instances of a combinatorial problem using bit-flip and swap operators. First, we enumerate instances of the NK-landscape model and use a hill-climbing heuristic to build the corresponding LONs. Then, using LON metrics, we analyze how efficient the search might be when combining both strategies. The experiments show promising results and demonstrate the ability of multi-layer LONs to provide useful information that could be used in metaheuristics based on multiple operators, such as Variable Neighborhood Search.
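
To make the construction concrete, the sketch below illustrates how a single bit-flip layer of a LON could be built by exhaustively enumerating a small NK-landscape and running best-improvement hill climbing from every solution. This is a minimal illustrative sketch, not the authors' implementation: the helper names (make_nk_landscape, hill_climb) and the basin-adjacency edge definition are simplifying assumptions, and a second layer for the swap operator would be built analogously over the same solution set.

```python
# Minimal sketch (assumed, not the paper's code): one bit-flip LON layer
# for a small NK-landscape, built by exhaustive enumeration.
import itertools
import random
from collections import defaultdict

def make_nk_landscape(n, k, seed=0):
    """Random NK-landscape: bit i's contribution depends on bit i and its
    k circular successors; table values are drawn lazily on first access."""
    rng = random.Random(seed)
    tables = [defaultdict(rng.random) for _ in range(n)]
    def fitness(x):
        total = 0.0
        for i in range(n):
            key = tuple(x[(i + j) % n] for j in range(k + 1))
            total += tables[i][key]
        return total / n
    return fitness

def bitflip_neighbors(x):
    """All solutions at Hamming distance 1 from x."""
    for i in range(len(x)):
        yield x[:i] + (1 - x[i],) + x[i + 1:]

def hill_climb(x, fitness):
    """Best-improvement hill climbing under the bit-flip operator."""
    while True:
        best = max(bitflip_neighbors(x), key=fitness)
        if fitness(best) <= fitness(x):
            return x  # x is a bit-flip local optimum
        x = best

n, k = 10, 2
fitness = make_nk_landscape(n, k)

# Map every solution to the local optimum of its basin of attraction.
basin_of = {}
for x in itertools.product((0, 1), repeat=n):
    basin_of[x] = hill_climb(x, fitness)

# LON nodes are local optima; here an edge links two optima whose basins
# contain adjacent solutions (one simple variant of LON edge definitions).
nodes = set(basin_of.values())
edges = set()
for x, opt in basin_of.items():
    for y in bitflip_neighbors(x):
        if basin_of[y] != opt:
            edges.add(frozenset((opt, basin_of[y])))

print(f"{len(nodes)} local optima, {len(edges)} LON edges")
```

With both operator layers built this way, layer-specific and cross-layer metrics (e.g., number of optima per layer, or how optima of one layer are connected through the other) can be computed over the resulting graphs.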
