This paper introduces a new metaheuristic technique known as the Greater Cane Rat Algorithm (GCRA) for addressing optimization problems. The optimization process of GCRA is inspired by the intelligent foraging behaviors of greater cane rats during and outside the mating season. Being highly nocturnal, they are intelligent enough to leave trails as they forage through reeds and grass.
The feature selection problem represents a field of study that requires approximate algorithms to identify discriminative and optimally combined features. The evaluation and suitability of the selected features are often analyzed using classifiers. These features are embedded in data increasingly being generated from different sources such as social media, surveillance systems, network applications, and medical records.
This paper proposes a modified version of the Dwarf Mongoose Optimization Algorithm (IDMO) for constrained engineering design problems. This optimization technique modifies the base algorithm (DMO) in three simple but effective ways. First, alpha selection in IDMO differs from that in DMO, where evaluating a probability value for each member's fitness is merely computational overhead that contributes nothing to the quality of the alpha or the other group members.
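The contrast between the two alpha-selection schemes can be sketched as follows. This is a minimal illustration assuming a maximization fitness; the function names are ours, not the paper's, and the exact DMO/IDMO update rules are not reproduced here:

```python
import numpy as np

def alpha_roulette(fitness, rng=None):
    """DMO-style alpha selection: fitness-proportional (roulette) draw.

    Computes a selection probability for every group member before
    drawing the alpha, which is the overhead the IDMO abstract refers to.
    """
    rng = rng or np.random.default_rng()
    p = fitness / fitness.sum()
    return int(rng.choice(len(fitness), p=p))

def alpha_direct(fitness):
    """IDMO-style alpha selection (as described above): take the best
    member directly, skipping the per-member probability computation."""
    return int(np.argmax(fitness))
```

Dropping the probability evaluation removes one pass over the population per iteration without changing which members are eligible to lead.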
Differential evolution (DE) is one of the most widely acknowledged population-based optimization algorithms, owing to its simplicity, ease of use, resilience, and problem-solving capacity. DE has grown steadily since its inception thanks to its ability to solve a wide range of problems in academia and industry. Different mutation strategies and parameter choices influence DE's exploration and exploitation capabilities, motivating researchers to continue working on DE.
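The interplay between mutation strategy and parameters mentioned above can be made concrete with the classic DE/rand/1 mutation rule, v = x_r1 + F * (x_r2 - x_r3), where the scale factor F trades exploration against exploitation. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def de_rand_1(pop, f=0.5, rng=None):
    """Classic DE/rand/1 mutation: v_i = x_r1 + F * (x_r2 - x_r3).

    pop : (NP, D) array of candidate solutions (NP >= 4).
    f   : scale factor F; larger values push exploration, smaller
          values favor exploitation around existing solutions.
    """
    rng = rng or np.random.default_rng()
    n, _ = pop.shape
    donors = np.empty_like(pop)
    for i in range(n):
        # pick three distinct indices, all different from i
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        donors[i] = pop[r1] + f * (pop[r2] - pop[r3])
    return donors
```

In full DE, each donor vector would then be crossed over with its parent and kept only if it improves the objective; the mutation step alone already shows how F and the choice of base vector shape the search.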
This paper proposes an improvement to the dwarf mongoose optimization (DMO) algorithm called the advanced dwarf mongoose optimization (ADMO) algorithm. The goal of the improvement is to address the low convergence rate of the DMO. This limitation arises when the initial solutions are close to the global optimum; the subsequent value of the alpha must then be small for the DMO to converge towards a better solution.
Selecting appropriate feature subsets is a vital task in machine learning. Its main goal is to remove noisy, irrelevant, and redundant features that could negatively impact the learning model's accuracy, and to improve classification performance without information loss. Therefore, more advanced optimization methods have been employed to locate the optimal subset of features.
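Metaheuristic feature selection of this kind is typically framed as a wrapper: a binary mask encodes a candidate subset, and its fitness combines a classifier's accuracy with a penalty on subset size. The sketch below assumes a simple nearest-centroid classifier and resubstitution accuracy for brevity; the weighting, names, and classifier choice are illustrative, not taken from any specific paper:

```python
import numpy as np

def subset_fitness(mask, X, y, alpha=0.99):
    """Wrapper-style fitness for a binary feature mask (illustrative).

    Combines the accuracy of a nearest-centroid classifier on the
    selected columns with a reward for smaller subsets, a common
    formulation in metaheuristic feature selection.
    """
    if not mask.any():
        return 0.0  # empty subsets carry no information
    Xs = X[:, mask]
    classes = np.unique(y)
    centroids = np.stack([Xs[y == c].mean(axis=0) for c in classes])
    # predict each sample by its nearest class centroid
    d = np.linalg.norm(Xs[:, None, :] - centroids[None], axis=2)
    pred = classes[d.argmin(axis=1)]
    acc = (pred == y).mean()
    ratio = mask.sum() / mask.size  # fraction of features kept
    return alpha * acc + (1 - alpha) * (1 - ratio)
```

Any population-based optimizer can then search over masks by maximizing this fitness, with alpha controlling the accuracy-versus-compactness trade-off.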
Selecting relevant feature subsets is vital in machine learning, and multiclass feature selection is harder to perform since most classification methods are designed for binary problems. The feature selection problem aims to reduce the dimensionality of the feature set while maintaining the model's accuracy. Datasets can be classified using various methods.
View Article and Find Full Text PDFArch Comput Methods Eng
August 2022
The moth flame optimization (MFO) algorithm belongs to the swarm intelligence family and is applied to solve complex real-world optimization problems in numerous domains. MFO and its variants are easy to understand and simple to operate. Moreover, these algorithms have successfully solved optimization problems in different areas such as power and energy systems, engineering design, economic dispatch, image processing, and medical applications.
The distributive power of the arithmetic operators (multiplication, division, addition, and subtraction) gives the arithmetic optimization algorithm (AOA) its unique ability to find the global optimum of the optimization problems used to test its performance. Several other mathematical operators with the same or better distributive properties exist and can be exploited to enhance the performance of the newly proposed AOA. In this paper, we propose an improved version of the AOA, called the nAOA algorithm, which uses the high-density values that the natural logarithm and exponential operators can generate to enhance the exploratory ability of the AOA.
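The idea of swapping logarithm/exponential operators into the exploration step can be sketched as follows. This is a hedged illustration of the general mechanism only: the published nAOA update rule is not reproduced here, and the branch structure, names, and constants below are assumptions modeled on the standard AOA exploration phase (which alternates multiplication- and division-driven steps scaled by the math-optimizer probability, MOP):

```python
import math
import random

def naoa_explore(best, lb, ub, mop, mu=0.499, eps=1e-12, rng=random):
    """Illustrative exploration step in the spirit of nAOA (assumed form).

    best     : current best solution component.
    lb, ub   : lower and upper bounds of the search range.
    mop      : math-optimizer probability, shrinking over iterations.
    """
    span = (ub - lb) * mu + lb
    if rng.random() < 0.5:
        # exponential branch: steep growth, analogous to multiplication
        step = math.exp(mop) * span
    else:
        # logarithmic branch: compressed values, analogous to division
        step = math.log(abs(mop) + 1.0 + eps) * span
    return best + step if rng.random() < 0.5 else best - step
```

The exp branch stretches steps to reach distant regions while the log branch compresses them, which is one way the "high-density values" of these operators can diversify exploration relative to plain multiplication and division.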