Mine Blast Algorithm (MBA) is a newly developed metaheuristic technique. It has outperformed the Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and their variants on various engineering optimization problems. MBA was later improved as IMBA, which is modified in this paper to further accelerate its convergence. The proposed variant, called Accelerated MBA (AMBA), replaces the previous best solution with the currently available candidate solution in IMBA. ANFIS accuracy depends on the parameters it is trained with. Given the drawbacks of gradient-based learning of ANFIS, which uses gradient descent and least-squares estimation in a two-pass learning algorithm, many researchers have trained ANFIS with metaheuristic algorithms instead. In this paper, the parameters of ANFIS are trained by the proposed AMBA to achieve high performance. The experimental results on real-world benchmark problems reveal that AMBA can be used as an efficient optimization technique. Moreover, the results indicate that AMBA converges earlier than its counterparts MBA and IMBA.
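As a rough illustration of the greedy replacement described above, the following minimal Python sketch (not the authors' implementation; the MBA/IMBA explosion equations and the ANFIS training target are omitted) shows how each newly generated candidate can immediately replace the reference point when it improves the objective. The `sphere` function is only a stand-in objective.

```python
import numpy as np

def sphere(x):
    """Toy objective; the paper trains ANFIS parameters instead."""
    return float(np.sum(x ** 2))

def amba_sketch(fitness, dim=10, n_shrapnel=20, iters=100, seed=0):
    """Illustrative sketch of the greedy-replacement idea attributed to AMBA:
    each newly generated candidate immediately replaces the reference (best)
    point if it improves, instead of waiting for the iteration to finish.
    The exact MBA/IMBA explosion equations are not reproduced here."""
    rng = np.random.default_rng(seed)
    best = rng.uniform(-5, 5, dim)            # initial "first shot point"
    best_f = fitness(best)
    radius = 1.0                              # explosion radius (decays)
    for _ in range(iters):
        for _ in range(n_shrapnel):
            candidate = best + rng.normal(0.0, radius, dim)   # shrapnel piece
            f = fitness(candidate)
            if f < best_f:                    # greedy, immediate replacement
                best, best_f = candidate, f
        radius *= 0.95                        # shrink the search neighbourhood
    return best, best_f

if __name__ == "__main__":
    x, fx = amba_sketch(sphere)
    print(f"best fitness: {fx:.6f}")
```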
Harris' hawk optimization (HHO) is a recent addition to the population-based metaheuristic paradigm, inspired by the hunting behavior of Harris' hawks. It has demonstrated promising search behavior on various optimization problems; however, the diversity of its search agents can be further enhanced. This paper presents a novel modified variant with a long-term memory concept, hence called long-term memory HHO (LMHHO), which provides information about multiple promising regions in the problem landscape for improved search results. With this information, LMHHO maintains exploration up to a certain level even until search termination and thus produces better results than the original method. Moreover, the study shows that appropriate tools for in-depth performance analysis can help improve the search efficiency of existing metaheuristic algorithms through simple yet effective modifications of the search strategy. The diversity measurement and exploration-exploitation investigations show that the proposed LMHHO maintains a balanced trade-off between exploration and exploitation. The proposed approach is evaluated on high-dimensional numerical optimization problems, including classic benchmark and CEC'17 functions, as well as on the optimal power flow problem in power generation systems. The experimental study suggests that LMHHO outperforms not only the original HHO but also various other established and recently introduced metaheuristic algorithms. The research can be extended by implementing more efficient memory archiving and retrieval approaches for enhanced results.
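The long-term memory idea can be pictured with the short sketch below. It is an assumption-laden stand-in: the HHO besiege/escape equations are replaced by a simple move-towards-best rule, and the archive size and re-seeding schedule are illustrative choices, intended only to show how an archive of promising regions can keep exploration alive late in the run.

```python
import numpy as np

def lmhho_memory_sketch(fitness, dim=10, n_hawks=30, iters=200,
                        archive_size=5, seed=1):
    """Sketch of the long-term-memory idea described for LMHHO: keep an
    archive of several good solutions and occasionally re-seed hawks near
    archived points so exploration persists late in the run. The actual
    HHO position-update equations are omitted for brevity."""
    rng = np.random.default_rng(seed)
    hawks = rng.uniform(-5, 5, (n_hawks, dim))
    archive = []                                   # list of (fitness, position)
    for t in range(iters):
        fits = np.array([fitness(h) for h in hawks])
        best = hawks[int(np.argmin(fits))].copy()
        # update the long-term memory with the current best
        archive.append((fits.min(), best))
        archive = sorted(archive, key=lambda p: p[0])[:archive_size]
        # stand-in for the HHO update: drift towards the current best, plus noise
        hawks += 0.5 * rng.random((n_hawks, 1)) * (best - hawks)
        hawks += rng.normal(0.0, 0.1, hawks.shape)
        # every few iterations, re-seed a few hawks from random memory entries
        if t % 10 == 0:
            for i in rng.choice(n_hawks, size=3, replace=False):
                _, mem = archive[rng.integers(len(archive))]
                hawks[i] = mem + rng.normal(0.0, 0.5, dim)
    fits = np.array([fitness(h) for h in hawks])
    return hawks[int(np.argmin(fits))], float(fits.min())

if __name__ == "__main__":
    rastrigin = lambda x: 10 * len(x) + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))
    pos, val = lmhho_memory_sketch(rastrigin)
    print(f"best value found: {val:.4f}")
```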
In this study, we investigate the classification problem of heart disease with incomplete datasets. Our pragmatic approach is to exploit the potential of complete data for selecting relevant features in incomplete datasets. We implement fuzzy-based particle swarm optimization to impute missing values, tuning the existing structure with the data, which leads to a better solution. FCM clustering is applied to identify similar records in the complete dataset. We compare the Root Mean Square Error (RMSE) results of three different datasets with seven different missing-data ratios ranging from 1% to 20%. The experimental results provide evidence that our approach achieves better accuracy than the compared approaches. The proposed method makes it possible to select relevant features by offering a good combination of settings for the heart disease classification problem.
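The evaluation protocol implied above can be sketched as follows. Column-mean imputation stands in for the paper's FCM plus fuzzy-PSO imputation, and the synthetic matrix `X` merely mimics a heart-disease feature table; only the hide-impute-score loop is the point of the example.

```python
import numpy as np

def rmse_of_imputation(data, missing_ratio=0.05, seed=0):
    """Sketch of the evaluation protocol implied by the abstract: hide a given
    fraction of known values, impute them, and report RMSE on the hidden
    entries. Column-mean imputation is a placeholder for the FCM + fuzzy-PSO
    imputation used in the paper."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    mask = rng.random(data.shape) < missing_ratio          # entries to hide
    corrupted = data.copy()
    corrupted[mask] = np.nan
    col_means = np.nanmean(corrupted, axis=0)              # stand-in imputer
    imputed = np.where(np.isnan(corrupted), col_means, corrupted)
    return float(np.sqrt(np.mean((imputed[mask] - data[mask]) ** 2)))

if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(300, 13))    # mock feature table
    for ratio in (0.01, 0.05, 0.10, 0.20):
        print(f"missing {ratio:.0%}: RMSE = {rmse_of_imputation(X, ratio):.4f}")
```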
Adaptive Neuro-Fuzzy Inference System (ANFIS) has been widely applied to industrial as well as scientific problems. This is due to its ability to approximate any plant with a proper number of rules. However, the surge in auto-generated rules as the inputs increase adds to the complexity and computational cost of the network. Therefore, optimization is required to prune the weak rules while, at the same time, achieving maximum accuracy. Moreover, it is important to note that over-reducing the rules may result in a loss of accuracy. Artificial Bee Colony (ABC) is a widely applied swarm-based technique for searching for optimum solutions, as it uses few setting parameters. This research explores the applicability of the ABC algorithm to ANFIS optimization. For the practical implementation, classification of Malaysian SMEs is performed. For validation, the performance of ABC is compared with one of the popular optimization techniques, Particle Swarm Optimization (PSO), and the recently developed Mine Blast Algorithm (MBA). The evaluation metrics include the number of rules in the optimized rule-base, accuracy, and the number of iterations to converge. Results indicate that ABC needs an improved exploration strategy in order to avoid being trapped in local minima. However, the application of any efficient metaheuristic with the modified two-pass ANFIS learning algorithm provides researchers with an approach to effectively optimize ANFIS when the number of inputs increases significantly.
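A hedged sketch of the kind of rule-base pruning objective such a metaheuristic might optimize is given below. The `rule_outputs` matrix, the penalty weight `alpha`, and the bit-flip search loop are illustrative assumptions, not the paper's ABC implementation; the point is that fitness trades classification error against the number of active rules.

```python
import numpy as np

def rulebase_fitness(mask, rule_outputs, y_true, alpha=0.01):
    """Sketch of an objective a metaheuristic (ABC, PSO, MBA, ...) could use to
    prune an ANFIS rule-base: reward accuracy of the reduced rule-base and
    penalise the number of active rules. `rule_outputs` is a hypothetical
    (n_samples, n_rules) matrix of per-rule contributions; a real ANFIS would
    recompute normalised firing strengths for the surviving rules."""
    if mask.sum() == 0:
        return np.inf
    pred = rule_outputs[:, mask.astype(bool)].mean(axis=1) > 0.5
    error = np.mean(pred != y_true)
    return error + alpha * mask.sum()

def prune_rules(rule_outputs, y_true, n_iter=500, seed=0):
    """Stand-in search loop (random bit-flip hill climbing) in place of ABC."""
    rng = np.random.default_rng(seed)
    n_rules = rule_outputs.shape[1]
    mask = np.ones(n_rules, dtype=int)
    best_f = rulebase_fitness(mask, rule_outputs, y_true)
    for _ in range(n_iter):
        trial = mask.copy()
        trial[rng.integers(n_rules)] ^= 1           # toggle one rule on/off
        f = rulebase_fitness(trial, rule_outputs, y_true)
        if f <= best_f:
            mask, best_f = trial, f
    return mask, best_f

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    R = rng.random((200, 16))                       # hypothetical per-rule outputs
    y = (R[:, :4].mean(axis=1) > 0.5).astype(int)   # labels driven by 4 "true" rules
    kept, f = prune_rules(R, y)
    print(f"rules kept: {int(kept.sum())} / 16, fitness: {f:.4f}")
```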
Fuzzy Neural Networks (FNNs), integrating fuzzy logic, neural networks and optimization techniques, have not only addressed the "black box" issue of Artificial Neural Networks (ANNs) but have also proved effective in a wide variety of real-world applications. Despite attracting researchers in recent years and outperforming other fuzzy inference systems, the Adaptive Neuro-Fuzzy Inference System (ANFIS) still needs effective parameter training and rule-base optimization methods to perform efficiently when the number of inputs increases. Moreover, the standard gradient-based learning via the two-pass learning algorithm is slow and prone to getting stuck in local minima. Therefore, many researchers have trained ANFIS parameters using metaheuristic algorithms; however, very few have considered optimizing the ANFIS rule-base. Particle Swarm Optimization (PSO) and its variants have mostly been applied among the training approaches used. Apart from that, Genetic Algorithm (GA), Firefly Algorithm (FA) and Artificial Bee Colony (ABC) optimization methods have been employed for effective training of ANFIS networks while solving various problems in the field of business and finance.
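To make the metaheuristic-training idea concrete, the following sketch flattens the parameters of a toy zero-order Sugeno ANFIS (single input, Gaussian membership functions) into one vector and exposes an RMSE fitness that any of the listed metaheuristics could minimize. The network, the target function, and the random-search trainer are all illustrative assumptions, not a specific method from the surveyed works.

```python
import numpy as np

def anfis_predict(params, x, n_rules=3):
    """Minimal zero-order Sugeno ANFIS with Gaussian membership functions and
    a single input, used only to show how ANFIS parameters can be flattened
    into one vector for a metaheuristic trainer. Each rule contributes
    (centre, sigma, consequent), so params has 3 * n_rules entries."""
    p = np.asarray(params).reshape(n_rules, 3)
    c, s, w = p[:, 0], np.abs(p[:, 1]) + 1e-6, p[:, 2]
    x = np.asarray(x).reshape(-1, 1)
    firing = np.exp(-((x - c) ** 2) / (2 * s ** 2))        # rule firing strengths
    return (firing * w).sum(axis=1) / (firing.sum(axis=1) + 1e-12)

def anfis_rmse(params, x, y):
    """Fitness a metaheuristic (PSO, GA, FA, ABC, MBA, ...) would minimise."""
    return float(np.sqrt(np.mean((anfis_predict(params, x) - y) ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 200)
    y = np.sin(x)                                          # toy target function
    # random-search placeholder for the metaheuristic trainer
    best, best_f = None, np.inf
    for _ in range(2000):
        cand = rng.uniform(-3, 3, 9)                       # 3 rules * 3 params
        f = anfis_rmse(cand, x, y)
        if f < best_f:
            best, best_f = cand, f
    print(f"RMSE after random search: {best_f:.4f}")
```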