Emmanuel Gbenga Dada
  • Department of Computer Engineering, Faculty of Engineering, P.M.B 1069, University of Maiduguri, Nigeria.
  • Emmanuel Gbenga DADA is an Associate Professor of Computer Science. He received his Ph.D. in Computer Science from Un...
The simplicity, transparency, reliability, high efficiency and robust nature of PID controllers are some of the reasons for their popularity and acceptance for control in process industries around the world today. Tuning of PID control parameters has been, and remains, a field of active research. The primary objectives of PID parameter tuning are minimal overshoot in the steady-state response and a short settling time. Apart from the two popular conventional tuning strategies (Ziegler-Nichols closed-loop oscillation and the Cohen-Coon process reaction curve), several other methods have been employed for tuning. This work provides a thorough review of state-of-the-art and classical strategies for tuning PID controller parameters using metaheuristic algorithms. The methods appraised are categorized into classical and metaheuristic optimization methods for PID parameter tuning. Details of some metaheuristic algorithms, methods of application, equations and implementation flowcharts/algorithms are presented. Some open problems for future research are also presented. The major goal of this work is to offer a comprehensive reference source for researchers and scholars working on PID controllers.
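As a toy illustration of metaheuristic-style PID tuning, the sketch below simulates a hypothetical first-order plant under discrete PID control and searches the gain space by seeded random sampling (a minimal stand-in for the metaheuristics the review surveys). The plant time constant, gain ranges, and overshoot penalty are all illustrative assumptions, not values from the review.

```python
import random

def simulate(kp, ki, kd, setpoint=1.0, dt=0.01, steps=1000):
    """Closed-loop step response of a toy first-order plant dy/dt = (-y + u)/tau,
    scored by integral squared error plus an overshoot penalty."""
    tau, y, integ, prev_err = 0.5, 0.0, 0.0, setpoint
    cost = 0.0
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        prev_err = err
        y += dt * (-y + u) / tau                 # Euler step of the plant
        if abs(y) > 1e6:
            return 1e9                           # unstable gains: reject
        cost += err * err * dt                   # ISE term
        if y > setpoint:
            cost += 10.0 * (y - setpoint) * dt   # overshoot penalty
    return cost

random.seed(0)
candidates = [(random.uniform(0, 10), random.uniform(0, 5), random.uniform(0, 1))
              for _ in range(300)]
best = min(candidates, key=lambda g: simulate(*g))
print("best (Kp, Ki, Kd):", best)
```

A real metaheuristic (PSO, GA, etc.) would replace the blind sampling with guided updates, but the cost function and simulation loop stay the same.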
Public health is now in danger because of the current monkeypox outbreak, which has spread rapidly to more than 40 countries outside of Africa. The growing monkeypox epidemic has been classified as a “public health emergency of international concern” (PHEIC) by the World Health Organization (WHO). Infection outcomes, risk factors, clinical presentation, and transmission are all poorly understood. Computer- and machine-learning-assisted prediction and forecasting will be useful for controlling its spread. The objective of this research is to use the historical data of all reported human monkeypox cases to predict the transmission rate of the disease. This paper proposes stacking ensemble learning and machine learning techniques to forecast the rate of transmission of monkeypox. In this work, adaptive boosting regression (Adaboost), gradient boosting regression (GBOOST), random forest regression (RFR), ordinary least square regression (OLS), least absolute shrinkage selection operato...
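A minimal sketch of the stacking idea described above, using scikit-learn's `StackingRegressor` with Adaboost, gradient boosting, and random forest base learners combined by an OLS meta-learner. The synthetic regression data is a stand-in for the monkeypox case history, which is not reproduced here.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import (AdaBoostRegressor, GradientBoostingRegressor,
                              RandomForestRegressor, StackingRegressor)
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for engineered case-count features
X, y = make_regression(n_samples=400, n_features=6, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("ada", AdaBoostRegressor(random_state=0)),
        ("gboost", GradientBoostingRegressor(random_state=0)),
        ("rfr", RandomForestRegressor(n_estimators=100, random_state=0)),
    ],
    final_estimator=LinearRegression(),  # OLS meta-learner
)
stack.fit(X_tr, y_tr)
print("stacked R^2 on held-out data:", round(stack.score(X_te, y_te), 3))
```

The meta-learner is trained on out-of-fold base predictions (scikit-learn handles the internal cross-validation), which is what distinguishes stacking from simple averaging.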
The movie industry has grown into a several-billion-dollar enterprise, and there is now a ton of information online about it. Numerous machine learning techniques have been created by academics and can produce effective classification models. In this study, different machine learning classification techniques are applied to our own movie dataset for multiclass classification. This paper's main objective is to compare the effectiveness of various machine learning techniques. This study examined five methods: Multinomial Logistic Regression (MLR), Support Vector Machine (SVM), Bagging (BAG), Naive Bayes (NBS) and K-Nearest Neighbor (KNN), while noise was removed using All K-Edited Nearest Neighbors (AENN). These techniques all utilize a previous IMDb dataset to predict a movie's net profit value. The algorithms predict the profit at the box office for each of these five techniques. Based on the dataset used in this paper, which consists of 5043 rows and 14 columns of movies, thi...
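Of the five methods compared, K-Nearest Neighbor is simple enough to sketch from scratch. The toy data below is not the paper's IMDb dataset; three well-separated clusters merely stand in for engineered movie features mapped to profit classes.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Plain k-nearest-neighbour multiclass prediction (Euclidean distance)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)      # distance to every sample
        nearest = y_train[np.argsort(d)[:k]]         # labels of the k closest
        preds.append(np.bincount(nearest).argmax())  # majority vote
    return np.array(preds)

rng = np.random.default_rng(0)
# Toy stand-in: 3 "profit classes" as Gaussian blobs in 2-D feature space
centers = np.array([[0, 0], [4, 4], [8, 0]])
X = np.vstack([c + rng.normal(0, 0.8, (40, 2)) for c in centers])
y = np.repeat([0, 1, 2], 40)
acc = (knn_predict(X, y, X) == y).mean()
print("training accuracy:", acc)
```

In practice one would use `sklearn.neighbors.KNeighborsClassifier`, which adds tree-based neighbour search and distance weighting.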
From the development and sale of a product through its delivery to the end customer, the supply chain encompasses a network of suppliers, transporters, warehouses, distribution centers, shipping lines, and logistics service providers all working together. Lead times, bottlenecks, cash flow, data management, risk exposure, traceability, conformity, quality assurance, flaws, and language barriers are some of the difficulties that supply chain management faces. In this paper, deep learning techniques such as Long Short-Term Memory (LSTM) and the One-Dimensional Convolutional Neural Network (1D-CNN) were adopted and applied to classify supply chain pricing datasets of health medications. Then, Bayesian optimization using the Tree-structured Parzen Estimator and All K Nearest Neighbor (AllkNN) was used to establish suitable model hyperparameters for both LSTM and 1D-CNN to enhance the classification models. Repeated five-fold cross-validation is applied to the developed models to predict the accurac...
Machine Learning has found application in solving complex problems in different fields of human endeavor such as intelligent gaming, automated transportation, cyborg technology, environmental protection, enhanced health care, innovation in banking and home security, and smart homes. This research is motivated by the need to explore the global structure of machine learning to ascertain the level of bibliographic coupling, collaboration among research institutions, the co-authorship network of countries, and source coupling in publications on machine learning techniques. The Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN) algorithm was applied to the clustering-based prediction of authors' dominance ranking in this paper. Publications related to machine learning were retrieved and extracted from the Dimensions database with no language restrictions. Bibliometrix was employed in computation and visualization to extract bibliographic information and perform a descriptive ana...
Introduction Vaccines are the most important instrument for bringing the pandemic to a close, saving lives and helping to reduce the risk of infection. It is important that everyone has equal access to immunizations that are both safe and effective. No one is safe until everyone is vaccinated. COVID-19 vaccinations are a game-changer in the fight against diseases. In addition to examining attitudes toward these vaccines in Africa, Asia, Oceania, Europe, North America, and South America, the purpose of this paper is to predict the acceptability of COVID-19 vaccines and study their predictors. Materials and methods Kaggle datasets are used to estimate the prediction outcomes of the daily COVID-19 vaccination to prevent a pandemic. The Kaggle datasets are classified into training and testing datasets. The training dataset comprises COVID-19 daily data from the 13th of December 2020 to the 13th of June 2021, while the testing dataset comprises COVID-19 da...
A molecule is the smallest particle in a chemical element or compound that possesses the element or compound’s chemical characteristics. There are numerous challenges associated with the development of molecular simulations of fluid characteristics for industrial purposes. Fluid characteristics for industrial purposes find applications in the development of various liquid household products, such as liquid detergents, drinks, beverages, and liquid health medications, amongst others. Predicting the molecular properties of liquid pharmaceuticals or therapies to address health concerns is one of the greatest difficulties in drug development. Computational tools for precise prediction can help speed up and lower the cost of identifying new medications. A one-dimensional deep convolutional gated recurrent neural network (1D-CNN-GRU) was used in this study to offer a novel forecasting model for molecular property prediction of liquids or fluids. The signal data from molecular properties w...
Since the declaration of COVID-19 as a global pandemic, it has been transmitted to more than 200 nations of the world. The harmful impact of the pandemic on the economies of nations is far greater than anything suffered in almost a century. The main objective of this paper is to apply Structural Equation Modeling (SEM) and Machine Learning (ML) to determine the relationships among COVID-19 risk factors, epidemiology factors and economic factors. Structural equation modeling is a statistical technique for calculating and evaluating the relationships of manifest and latent variables. It explores the causal relationships between variables while at the same time taking measurement error into account. The Bagging (BAG), Boosting (BST), Support Vector Machine (SVM), Decision Tree (DT) and Random Forest (RF) machine learning techniques were applied to predict the impact of COVID-19 risk factors. Data from patients who came into contact with coronavirus disease were collected from the Kaggle database bet...
Face recognition, a sub-discipline of computer vision, is gaining a lot of attention from a large audience around the world. Some application areas include forensics, cyber security and intelligent monitoring. A face recognition attendance system serves as a perfect substitute for the conventional attendance system in organizations and classrooms. The challenge associated with most face recognition techniques is the inability to detect faces under conditions such as noise, pose, facial expression, illumination and obstruction, together with low performance accuracy. This necessitated the development of more robust and efficient face recognition systems that overcome the drawbacks associated with conventional techniques. This paper proposes a parallel face recognition attendance system based on Convolutional Neural Networks (a branch of artificial intelligence) and OpenCV. Experimental results proved the effectiveness of the proposed technique, which showed good performance with a recognition accuracy of about 98%, a precision of 96% and a recall of 96%. This demonstrates that the proposed method is a promising facial recognition technology.
COVID-19 is a strain of coronavirus that first broke out in Wuhan, China, in December 2019 and has since become a global pandemic. In this chapter, we apply four machine learning techniques, i.e., logistic regression (LR), support vector machine (SVM), recurrent neural network (RNN), and long short-term memory (LSTM), to predicting the transmission of coronavirus (COVID-19). Data were collected from patients who contracted coronavirus disease from 22 January 2020 to 14 March 2020, obtained from the Kaggle database. The data consisted of the confirmed, death, and recovered cases of all the countries infected with coronavirus (COVID-19). The performance of each machine learning technique was compared using mean absolute error (MAE), root mean square error (RMSE), and mean absolute scaled error (MASE). The results indicate that logistic regression (LR) is effective in accurately predicting transmission for different continents such as Africa, Asia, Australia/Oceania, Europe, North America, and South America and cruise ships, with errors of 0.4590, 57005.25, 0.6829, 44.35, 2.2764, 0.5401, and 4.7508, respectively. This is an indication that it is a promising technique for predicting the spread of coronavirus. The study reveals that there are upward trends of COVID-19 in Africa, Europe, Australia/Oceania, North America, and South America, while trends of transmission of the pandemic have been stable in Asia and on the Diamond Princess cruise ship. As COVID-19 cases continue to rise in Africa, Europe, Australia/Oceania, North America, and South America, there is an urgent need to curtail transmission of this disease. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
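The three error measures used for the comparison are straightforward to compute. The sketch below implements MAE, RMSE, and MASE (scaled by the in-sample MAE of the naive one-step forecast) on made-up case counts, not the Kaggle data.

```python
import numpy as np

def mae(y, yhat):
    return np.mean(np.abs(y - yhat))

def rmse(y, yhat):
    return np.sqrt(np.mean((y - yhat) ** 2))

def mase(y, yhat):
    # Scale by the in-sample MAE of the naive forecast y_t = y_{t-1};
    # MASE < 1 means the model beats naive persistence.
    scale = np.mean(np.abs(np.diff(y)))
    return mae(y, yhat) / scale

y    = np.array([100., 120., 150., 200., 260.])   # toy case counts
yhat = np.array([ 95., 125., 140., 210., 250.])   # toy predictions
print("MAE:", mae(y, yhat), "RMSE:", round(rmse(y, yhat), 3),
      "MASE:", mase(y, yhat))
```

MASE is the scale-free measure here, which is why it allows comparisons across continents whose absolute case counts differ by orders of magnitude.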
Rainfall prediction is an important meteorological problem that can greatly affect humanity in areas such as agricultural production, flooding, drought, and sustainable management of water resources. The dynamic and nonlinear nature of climatic conditions has made it impossible for traditional techniques to yield satisfactory accuracy for rainfall prediction. As a result of the sophistication of the climatic processes that produce rainfall, using quantitative techniques to predict rainfall is a very cumbersome task. This paper proposes four non-linear Artificial Neural Network (ANN) techniques for rainfall prediction. ANNs have the capacity to map different input and output patterns. The Feed Forward Neural Network (FFNN), Cascade Forward Neural Network (CFNN), Recurrent Neural Network (RNN), and Elman Neural Network (ENN) were used to predict rainfall. The dataset used for this work contains meteorological variables such as temperature, wind speed, humidity, rainfall, v...
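A small sketch of the feed-forward variant (FFNN) using scikit-learn's `MLPRegressor` on synthetic weather-like inputs. The feature ranges and the near-linear "rainfall" relationship are illustrative assumptions, not the paper's dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Toy stand-ins for meteorological inputs: temperature, wind speed, humidity
X = rng.uniform([15, 0, 20], [40, 20, 100], size=(500, 3))
# Synthetic "rainfall" that rises with humidity and falls with temperature
y = 0.5 * X[:, 2] - 0.8 * X[:, 0] + rng.normal(0, 2, 500)

Xs = StandardScaler().fit_transform(X)   # scaling matters for neural nets
ffnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ffnn.fit(Xs[:400], y[:400])
print("held-out R^2:", round(ffnn.score(Xs[400:], y[400:]), 3))
```

The cascade-forward, recurrent, and Elman variants differ in topology (skip connections, feedback loops) rather than in this basic fit/predict workflow.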
Retinal vessel tortuosity is an early indicator of different retinopathies. Although various automated methods for determining retinal vessel tortuosity have been proposed in the literature, there is a need for further study. This study extracted three different features, namely a distance metric, a normalized hybrid metric and a non-normalized hybrid metric, from the thinned vessels. The weights of the vessel data samples were dynamically updated using Adaboost with linear discriminant analysis (LDA), and feature correlation was used to facilitate the selection of the best feature combination at each boosting iteration rather than a single feature that minimizes the weighted error at each iteration. The Adaboost with LDA method is then used for the classification of the retinal vessels as either tortuous or normal using a majority voting method. The proposed method achieves an accuracy rate of 100% for training sample sizes of 70%, 80% and 90%.
Usability assessment of interactive systems has been a hot topic in human-computer interaction. People at different times and places have attempted to evaluate software, websites, and other tools to ascertain their levels of usability. The essence is to indicate the extent to which such interactive systems are easy to learn, easy to use, easy to remember to use, effective, efficient, error tolerant, aesthetically pleasing and satisfying to users. This work adopts a user-centered approach to usability evaluation of the University of Maiduguri website (www.unimaid.edu.ng) by applying a systematic methodology of involving users in performing set tasks (user testing) and using task completion time as the metric. The perspective of users, based on their ability to assess quick task performance, fast downloads, effective navigation, error tolerance, consistency, and minimal background coloring, was used in the evaluation. Data collected from the task completion times were statistically analyzed fo...
As the volume of medical information stored electronically increases, so does the need to enhance how it is secured. Inaccessibility of patient records at the right time can lead to loss of life and can also degrade the quality of healthcare services rendered by medical professionals. Criminal attacks on health care have increased by 125% since 2010 and are now the leading cause of medical data breaches. This study therefore presents the combination of 3DES and LSB to improve the security measures applied to medical data. The Java programming language was used to develop a simulation program for the experiment. The results show that medical data can be stored, shared, and managed in a reliable and secure manner using the combined model.
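The LSB (least-significant-bit) half of the scheme can be sketched in pure Python. The payload below merely stands in for 3DES ciphertext (3DES itself requires a crypto library such as pycryptodome), and the pixel list stands in for an 8-bit grayscale cover image.

```python
def lsb_embed(pixels, payload):
    """Hide payload bytes in the least-significant bit of successive pixels."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = pixels[:]
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b       # overwrite only the LSB
    return stego

def lsb_extract(pixels, n_bytes):
    """Read the LSBs back out and reassemble the payload bytes (MSB first)."""
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[i:i + 8]))
        for i in range(0, len(bits), 8)
    )

ciphertext = b"\x8a\x13\xf0"                 # stand-in for 3DES output
cover = list(range(64))                      # toy 8-bit grayscale pixels
stego = lsb_embed(cover, ciphertext)
print("recovered:", lsb_extract(stego, len(ciphertext)))
```

Because only the lowest bit of each pixel changes, every pixel value moves by at most 1, which is what keeps the embedding visually imperceptible.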
Particle swarm optimization (PSO) is a metaheuristic optimization algorithm that has been used to solve complex optimization problems. Interior Point Methods (IPMs) are now believed to be the most robust numerical optimization algorithms for solving large-scale nonlinear optimization problems. To overcome the shortcomings of PSO, we proposed the Primal-Dual Asynchronous Particle Swarm Optimization (pdAPSO) algorithm. The primal-dual approach provides a better balance between exploration and exploitation, preventing the particles from experiencing premature convergence and being easily trapped in local minima, and so produces better results. We compared the performance of pdAPSO with 9 state-of-the-art PSO algorithms using 13 benchmark functions. Our proposed algorithm has very high mean dependability. Also, pdAPSO has a better convergence speed compared to the other 9 algorithms. For instance, on the Rosenbrock function, the mean FEs of 8938, 6786, 10,080, 9607, 11,680, 9287, 23,940, 6269 a...
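For readers unfamiliar with the baseline, a standard global-best PSO on the 2-D Rosenbrock benchmark looks like the sketch below. This is plain PSO, not pdAPSO itself (whose primal-dual update the abstract only summarizes); the inertia and acceleration constants are common textbook values.

```python
import random

def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2   # global minimum 0 at (1, 1)

def pso(f, dim=2, swarm=30, iters=400, w=0.72, c1=1.49, c2=1.49, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                    # personal bests
    pcost = [f(p) for p in pos]
    gbest = min(pbest, key=f)                      # global best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = f(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < f(gbest):
                    gbest = pos[i][:]
    return gbest, f(gbest)

best, cost = pso(rosenbrock)
print("best:", best, "cost:", cost)
```

Variants like pdAPSO modify the velocity update to improve the exploration/exploitation balance, but the swarm loop itself is this simple.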
Attacks on computer systems are becoming progressively frequent. Many machine learning techniques have been developed in a bid to increase the effectiveness of intrusion detection systems (IDS). However, the sophistication of intrusion attacks on computer networks and the large size of datasets pose a serious challenge, as they drastically reduce the effectiveness of the IDS. We do not propose any new algorithm in this paper. Rather, experiments were conducted to investigate the performance of six (6) machine learning techniques found in the literature and how effectively they can detect intrusion activities on a network. This work examines how effectively each algorithm under investigation handles intrusion events. In our experiment, the NSL-KDDTrain+ dataset was partitioned into training subgroups according to the type of network protocol. Subsequently, extraneous and unneeded attributes were removed from each training subgroup. The effectiveness of the algorithms was evaluated. The...
The declining rate of student graduation in today’s higher institutions of learning has become a major source of concern to educational authorities, school administrations and parents. While school administrators are trying to increase the rate of graduation, students are dropping out at an alarming rate. The ability to correctly predict a student’s graduation time after admission into a graduate program is critical for educational institutions because it allows for developing strategic programs that will aid or improve students’ performance towards graduating on time (GOT). This paper explores the predictive nature of artificial neural networks (ANN) to design a model based on cognitive and non-cognitive measures of students, together with background information, in order to predict students’ graduation time. Synthetic data were used to test and verify the effectiveness of the proposed model. The results show that the artificial neural network is a promising tool for prediction.
A fault in electrical equipment can be defined as a defect in its electrical circuit due to which the current is diverted from the intended path. Faults are generally caused by mechanical failure, accidents, excessive internal or external stresses, and other factors. When a cable is faulty, its resistance is affected. If left unrectified, the fault can completely prevent current from flowing through the cable. The challenge with the existing methods used for locating faults in underground cables is the inaccuracy in calculating the distance to the fault and the low durability of such equipment. To overcome these challenges, this paper presents a novel underground cable fault detector that has the capacity to measure the resistance of the cable, detect the type of fault in a cable, and also accurately compute the location of the fault using cheap materials. Several tests were conducted using the proposed device, and the results indicated that the proposed method produced sati...
This study presents a report of our research work on the design, construction and testing of a cell phone detector. It has become obvious that blocking or jamming of cell phone signals is difficult, expensive, and/or illegal in many situations. A more practical means of controlling cell phones involves detecting their RF signals, followed by confiscation or other intervention. In this context, a cell phone detector is a device designed to detect the presence of a cell phone within a certain vicinity (from a distance of about one and a half metres). Our aim is to design a cell phone detector that can be used to prevent the use of mobile phones in examination halls, confidential rooms, banks, petrol filling stations, military intelligence gathering, etc. We made use of two signal detectors, each with a dipole antenna, choke, and diode. Each dipole antenna is tuned to 900 MHz. When the antennas resonate at 900 MHz, a charge is induced in the inductor. A diode then demodulates the signal, which...
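The 900 MHz tuning fixes the antenna geometry: a half-wave dipole is cut to about half the free-space wavelength, shortened by an end-effect velocity factor. A quick calculation (the 0.95 factor is a typical assumption for thin wire elements, not a value from the paper):

```python
C = 299_792_458  # speed of light, m/s

def half_wave_dipole_length_m(freq_hz, velocity_factor=0.95):
    """Physical length of a half-wave dipole, shortened by the end-effect
    velocity factor typical of thin wire elements."""
    wavelength = C / freq_hz
    return velocity_factor * wavelength / 2

L = half_wave_dipole_length_m(900e6)
print(f"dipole length at 900 MHz: {L * 100:.1f} cm")
```

A roughly 16 cm element is consistent with a compact handheld detector for the GSM-900 band.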
The prediction of protein structure has a major role in drug design and network pharmacology. However, the complexity of protein structure, the time consumed and the expensive cost incurred by the state-of-the-art methods of predicting protein structure have motivated researchers to propose deep learning solutions. Despite the growing applications of deep learning in protein structure, there is no dedicated comprehensive survey on the exploration of deep learning in protein structure. In this paper, a comprehensive survey on the exploration of deep learning in protein structure is presented. This survey starts with a quantitative and synthesis analysis to provide insight into deep learning in protein structure prediction. Based on the synthesis analysis provided, the survey creates a taxonomy of the literature related to exploring deep learning in protein structure. Research challenges and new perspectives on future research directions for developing deep learning solutions for protein structur...
Conventional traffic monitoring methods are becoming less efficient as numerous applications rapidly adapt to counteract attempts to identify them, which creates new challenges for traffic monitoring. The Autonomous Distributed Network Monitoring (ADNM) scheme is a promising approach to address these challenges; nonetheless, each ADNM node has its own limitations, such as adapting to concept drift and self-learning, and hence needs to collaborate with other autonomic nodes to monitor the network efficiently. This paper presents a collaborative learning and sharing structure among self-managed network monitoring nodes, ensuring interaction for efficient information exchange for distributed autonomic monitoring towards the achievement of global network management objectives. A machine learning algorithm for collaborative learning among distributed autonomic monitoring nodes is proposed. This algorithm is based on the concept of an online incremental k-means traffic classification model. Ex...
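The online incremental k-means idea can be sketched compactly: each arriving flow record nudges its nearest centroid by a shrinking per-centroid learning rate, so the model updates without revisiting old traffic. The two synthetic traffic clusters below are illustrative, not the paper's data.

```python
import numpy as np

class OnlineKMeans:
    """Sequential (online) k-means: each sample moves its nearest centroid
    by a per-centroid learning rate of 1/count."""
    def __init__(self, centroids):
        self.centroids = np.asarray(centroids, dtype=float)
        # treat each initial centroid as one prior observation
        self.counts = np.ones(len(self.centroids))

    def partial_fit(self, x):
        x = np.asarray(x, dtype=float)
        j = np.argmin(np.linalg.norm(self.centroids - x, axis=1))
        self.counts[j] += 1
        self.centroids[j] += (x - self.centroids[j]) / self.counts[j]
        return j  # cluster label assigned to this flow record

rng = np.random.default_rng(0)
km = OnlineKMeans([[0, 0], [10, 10]])
for _ in range(500):
    km.partial_fit(rng.normal([1, 1], 0.5))   # one synthetic traffic class
    km.partial_fit(rng.normal([9, 9], 0.5))   # another synthetic traffic class
print(km.centroids.round(2))
```

The 1/count rate makes each centroid the running mean of its assigned samples, which is what lets distributed nodes exchange just centroids and counts when they collaborate.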
Acute damage to the retinal vessels has been identified as the main reason for blindness and impaired vision all over the world. Timely detection and control of these illnesses can greatly decrease the number of loss-of-sight cases. Developing a high-performance unsupervised retinal vessel segmentation technique poses an uphill task. This paper presents a study on the Primal-Dual Asynchronous Particle Swarm Optimisation (pdAPSO) method for the segmentation of retinal vessels. A maximum average accuracy rate of 0.9243, with an average specificity rate of 0.9834 and an average sensitivity rate of 0.5721, was achieved on the DRIVE database. The proposed method produces higher mean sensitivity and accuracy rates in the same range of very good specificity.
The mobile agent approach is a relatively new concept in the distributed systems environment. Agents migrate from client to server in a network, where the state of the running program is saved, transported to the new host, and restored, allowing the program to continue from the point where it stopped. In this paper, we evaluate the performance of the JADE and Aglet mobile agents. We developed a simulation program to evaluate the performance of the two mobile agents using encryption time, decryption time and file transfer time. Our findings revealed that there is no significant difference between the performances of these two mobile agents on the parameters mentioned above.
Retinal vessel segmentation is a practice that has the potential of enhancing accuracy in the diagnosis and timely prevention of illnesses that are related to blood vessels. Acute damage to the retinal vessel has been identified to be the main cause of blindness and impaired vision. A timely detection and control of these illnesses can greatly decrease the number of loss of sight cases. However, the manual protocol for such detection is laborious, and although autonomous methods have been recommended, the accuracy of these methods is often unreliable. We propose the utilization of the Primal-Dual Asynchronous Particle Swarm Optimisation (pdAPSO) and differential image methods in addressing the drawbacks associated with segmentation of retinal vessels in this study. The fusion of pdAPSO and the differential image (which focuses on the median filter) produced a significant enhancement in the segmentation of huge and minuscule retinal vessels. In addition, the method also decreased errone...
The Joint Admissions and Matriculation Board (JAMB) has over the years been in the news over the use of the Computer Based Test (CBT) mode over the Paper Pencil Test (PPT) mode for its Unified Tertiary Matriculation Examination (UTME). This study examines the two test modes, and also tries to ascertain which particular mode makes the unified tertiary matriculation examination more comfortable for students in privileged environments and those in rural areas. Predicting student performance can be useful to managements in many contexts. The purpose of this research work is to perform a performance evaluation of the computer-based and paper-based versions of the Joint Admissions and Matriculation Board (JAMB) test data conducted in the previous year using a robust Support Vector Machine model. This work attempts to determine whether there is any difference in the performance of students when comparing the CBT to the identical PPT test mode and also investigates the levels of malpractice involved in bo...
This paper presents the application of the Moth Flame Optimization (MFO) algorithm to determine the best impulse response coefficients of FIR low pass, high pass, band pass and band stop filters. MFO was inspired by the navigation strategy of moths in nature called transverse orientation, and is composed of three mathematical submodels. The performance of the proposed technique was compared to those of other well-known high-performing optimization techniques such as Particle Swarm Optimization (PSO), Novel Particle Swarm Optimization (NPSO), Improved Novel Particle Swarm Optimization (INPSO), Genetic Algorithm (GA), and the Parks and McClellan (PM) Algorithm. The performances of the MFO-designed optimized FIR filters proved to be superior to those obtained by PSO, NPSO, INPSO, GA, and the PM Algorithm. Simulation results indicated that the maximum stop band ripple of 0.057326, transition width of 0.079 and fitness value of 1.3682 obtained by MFO are better than those of ...
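The design objectives that MFO optimizes against (stop-band ripple, transition width) are easy to evaluate for any candidate coefficient set. The sketch below measures stop-band ripple for a windowed-sinc low-pass filter standing in for an optimizer's output; the tap count and band edges are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def lowpass_fir(num_taps=41, cutoff=0.25):
    """Windowed-sinc low-pass FIR; cutoff is a fraction of the sample rate."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = 2 * cutoff * np.sinc(2 * cutoff * n)   # truncated ideal impulse response
    h *= np.hamming(num_taps)                  # window to suppress ripple
    return h / h.sum()                         # unity gain at DC

def stopband_ripple(h, stop_edge=0.3, n_fft=4096):
    """Worst-case stop-band magnitude of the filter's frequency response."""
    H = np.abs(np.fft.rfft(h, n_fft))
    freqs = np.fft.rfftfreq(n_fft)             # 0 .. 0.5 of the sample rate
    return H[freqs >= stop_edge].max()

h = lowpass_fir()
print("max stop-band ripple:", round(float(stopband_ripple(h)), 5))
```

An optimizer like MFO searches the coefficient space directly for a better trade-off between this ripple figure and the transition width than a fixed window can give.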
For people in developing countries, cassava is a major source of calories and carbohydrates. However, Cassava Mosaic Disease (CMD) has become a major cause of concern among farmers in sub-Saharan African countries, which rely on cassava for both business and local consumption. The article proposes a novel deep residual convolutional neural network (DRNN) for CMD detection in cassava leaf images. With the aid of distinct block processing, we counterbalance the imbalanced image dataset of cassava diseases and increase the number of images available for training and testing. Moreover, we adjust low contrast using gamma correction and apply decorrelation stretching to enhance the color separation of an image with significant band-to-band correlation. Experimental results demonstrate that using a balanced dataset of images increases the accuracy of classification. The proposed DRNN model outperforms the plain convolutional neural network (PCNN) by a significant margin of 9.25% on the Cass...
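Gamma correction, one of the preprocessing steps mentioned, remaps pixel intensities non-linearly so low-contrast dark regions become distinguishable. A minimal sketch on 8-bit grayscale values; the exponent 0.6 is illustrative, not the study's actual setting.

```python
def gamma_correct(pixels, gamma=0.6):
    """Apply gamma correction to 8-bit intensities.

    out = 255 * (in / 255) ** gamma.  A gamma below 1 brightens and
    spreads out dark regions; a gamma above 1 darkens them.
    """
    return [round(255 * (p / 255) ** gamma) for p in pixels]

dark = [10, 40, 80, 120]             # a murky, low-contrast patch
bright = gamma_correct(dark, gamma=0.6)
```

Note how the gaps between the low values widen after correction, which is exactly the contrast boost wanted before classification.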
Introduction Sleep scoring is an important step in the treatment of sleep disorders. Manual annotation of sleep stages is time-consuming and experience-dependent and, therefore, needs to be done using machine learning techniques. Methods Sleep-EDF polysomnography was used in this study as a dataset. Support vector machine and artificial neural network performance were compared in sleep scoring using wavelet tree features and neighborhood component analysis. Results Neighborhood component analysis, as a combination of linear and non-linear feature selection methods, played a substantial role in feature dimension reduction. The artificial neural network and support vector machine achieved 90.30% and 89.93% accuracy, respectively. Discussion and Conclusion Comparable to state-of-the-art performance, the method introduced in the present study achieved an acceptable performance in sleep scoring. Furthermore, its performance can be enhanced by combining it with other techniques in featur...
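The wavelet-tree features mentioned above can be illustrated with the simplest wavelet, the Haar transform; this is a stand-in sketch (the study's exact wavelet family and depth are not specified here), showing how one EEG epoch decomposes into approximation and detail coefficients at successive levels.

```python
def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise half-differences (detail).  Repeating
    the step on the approximation builds the multi-level 'wavelet tree'
    whose coefficients serve as classifier features."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

epoch = [4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 2.0, 2.0]   # toy 8-sample epoch
a1, d1 = haar_step(epoch)      # level-1: coarse shape + fast fluctuations
a2, d2 = haar_step(a1)         # level-2: even coarser shape
```

Feature selection (here, neighborhood component analysis) would then rank these coefficients and keep only the most discriminative ones.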
ABSTRACT This paper presents a hybrid algorithm, called the Primal-Dual-PSO algorithm, to address the problem of swarm robotics flocking motion. This algorithm combines the explorative ability of PSO with the exploitative capacity of the Primal-Dual Interior Point Method. We hypothesize that the fusion of the two algorithms provides a strong probability of avoiding premature convergence, and also ensures that the robots are not trapped in local minima. Our simulation results provide a clear indication of the effectiveness of the algorithm. The hybrid algorithm performs better in terms of precision, rate of convergence, steadiness, robustness and flocking capability for a homogeneous set of swarm robots.

Keywords: Particle Swarm Optimization (PSO), Interior Point Method, Primal-Dual, gbest, and lbest.

I. INTRODUCTION Typically, the problem synonymous with the field of swarm robotics is managing and directing the movement of a considerable number of robots to carry out a mission together. Such a task is normally impracticable, demanding and laborious for a single robot to accomplish. The inspiration for swarm robots is fundamentally drawn from the study of the behaviour of animals such as flocks of birds, herds of cattle, and shoals of fish. According to [1] and [2], the performance of the swarm at the global level is largely influenced by the performance of the individual agents at the local level. Some of the traits of swarm robotics that have been extensively investigated are convergence, foraging [1], pattern formation [4], flocking, aggregation and segregation [3], box-pushing [2], cooperative mapping [5], soccer tournaments [6], site preparation [7], and sorting [8].
Of the various swarm attributes listed above, flocking is the most attractive: it has potential practical applications in areas such as search and rescue, systems for monitoring behaviour or changing information, data-acquisition systems for measuring physical phenomena, and networks of small low-cost sensors [8]. In swarm robotics, the flocking problem entails directing a set of robots to move in a specific direction and converge on a target in an unfamiliar location. The robots that make up the swarm are required to accomplish this purpose while adjusting to their environment. Researchers have developed a number of
Lack of success in disaster recovery occurs for many reasons, with one predominant catalyst for catastrophic failure being flawed and inefficient communication systems. Devastating environmental hazards and human-caused disasters will continue to proliferate throughout the United States and around the globe as continuous, intensive urbanization forces the human population into more concentrated and interconnected societies. With rapid evolutions in technology and the advent of information and communication technology (ICT) interfaces such as Facebook, Twitter, Flickr, Myspace, and smartphone technology, communication is no longer a unidirectional flow of information from the newsroom to the public. In the event of a disaster, time-critical information can be exchanged to and from any person or organization simultaneously, with the capability to receive feedback. A literature review of current information regarding the use of ICT as ...

As the number of users opting for credit card payment increases daily worldwide, the threats posed by internet fraudsters to this type of payment are also on the increase. Banks, merchants and consumers globally have lost billions of dollars to this type of fraud. The shortcomings of many existing credit card fraud detection techniques include their inability to effectively detect fraudulent transactions, high false alarm rates, and high computational cost. These shortcomings necessitate the development of more efficient credit card fraud prevention measures. Many models have been developed in the literature; however, the accuracy of the model is critical. In this paper, a fraud detection model using the K-Star machine learning algorithm is presented, and its performance is evaluated on the German Credit and Australian Credit datasets. The algorithm proposed in this paper proved to be highly effective and efficient, with a resultant classification accuracy of 100%, a very low false positive rate (0.00) and a very high true positive rate of 1.00. All experiments were conducted in the WEKA data mining and machine learning environment.
Diabetes is a disease whose prevalence is increasing daily worldwide and across different age groups. Diabetes causes damage to the nerves, blood vessels, kidneys and retina. Machine learning techniques have proved to be very effective in detecting diabetes. In this study, we applied the Non-Nested Generalisation exemplar classifier to the Pima Indians diabetes dataset to classify effectively and efficiently whether or not patients have diabetes. Our proposed algorithm proved to be highly effective and efficient, with a resultant classification accuracy of 100%, a very low false positive rate (0.00) and a very high true positive rate of 1.00. All experiments were conducted in the WEKA data mining and machine learning environment.
Email spam is one of the major challenges faced daily by every email user in the world. Every day, email users receive hundreds of spam messages with new content, from anonymous addresses that are automatically generated by robot software agents. Traditional methods of spam filtering, such as blacklists and whitelists (of domains, IP addresses and mailing addresses), have proven grossly ineffective in curtailing the menace of spam messages. This has brought to the fore the need for highly reliable email spam filters. Recently, machine learning approaches have been successfully applied to detecting and filtering spam emails. This paper proposes the use of the random forest machine learning algorithm for efficient classification of email spam messages. The main purpose is to develop a spam email filter with better prediction accuracy and fewer features. From the Enron public dataset, consisting of 5180 ham and spam emails, a set of prominent spam email features (from the literature) was extracted and applied by the random forest algorithm, with a resultant classification accuracy of 99.92%, a very low false positive rate (0.01) and a very high true positive rate of 0.999. All experiments were conducted in the WEKA data mining and machine learning environment.
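The bagging-and-majority-vote idea behind random forests can be sketched in miniature: train many weak trees (here depth-1 stumps, each on a bootstrap sample and a random feature subset) and let them vote. The three toy features and all thresholds below are invented for illustration, not the paper's actual feature set.

```python
import random

def train_stump(rows, labels, n_feats):
    """Pick the (feature, threshold, orientation) split with lowest
    training error among a random feature subset -- one depth-1 tree."""
    best = (0, 0.0, 0, 1, 1.1)   # feat, thresh, left_label, right_label, error
    for f in random.sample(range(n_feats), max(1, n_feats // 2)):
        for thresh in {r[f] for r in rows}:
            for left, right in ((0, 1), (1, 0)):
                preds = [left if r[f] <= thresh else right for r in rows]
                err = sum(p != y for p, y in zip(preds, labels)) / len(labels)
                if err < best[4]:
                    best = (f, thresh, left, right, err)
    return best[:4]

def forest_predict(forest, row):
    votes = [l if row[f] <= t else r for f, t, l, r in forest]
    return int(sum(votes) * 2 >= len(votes))    # majority vote

# Toy per-email features: (num_links, ALL-CAPS ratio, has_unsubscribe_link)
X = [(0, 0.0, 0), (1, 0.1, 0), (9, 0.8, 1), (7, 0.9, 1),
     (0, 0.05, 0), (8, 0.7, 1)]
y = [0, 0, 1, 1, 0, 1]          # 0 = ham, 1 = spam

random.seed(1)
forest = []
for _ in range(15):             # bootstrap-sample rows for each stump
    idx = [random.randrange(len(X)) for _ in range(len(X))]
    forest.append(train_stump([X[i] for i in idx], [y[i] for i in idx], 3))

spam_pred = forest_predict(forest, (6, 0.75, 1))   # link-heavy, shouty email
ham_pred = forest_predict(forest, (0, 0.0, 0))     # quiet, plain email
```

A real forest grows full decision trees over hundreds of features; the bootstrap sampling and per-tree feature subsetting shown here are what de-correlate the trees and make the vote robust.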
The recent growth and application of Peer-to-Peer (P2P) networks in file sharing, streaming media, instant messaging and other fields has attracted wide attention. At the same time, P2P network traffic contributes significantly to network congestion. The masquerading nature of P2P traffic has rendered conventional identification strategies futile. In order to better manage and control P2P traffic, it is paramount to identify it online and accurately. This paper proposes a strategy for online P2P identification based on the host and flow behaviour characteristics of P2P traffic, using the complex event processing (CEP) technique. Experiments on real-world network traffic data demonstrate that this technique can efficiently identify P2P traffic at online rates. The classification accuracy reaches 97.7% in terms of flows, with a false detection rate of about 0.2%.
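The host-behaviour idea can be sketched with a single heuristic: a P2P node typically contacts many distinct peers on many distinct ports within a window, while a client of a conventional service does not. This is a toy stand-in for the paper's CEP rules; the thresholds and addresses are invented.

```python
from collections import defaultdict

def flag_p2p_hosts(flows, min_peers=5, min_ports=5):
    """Flag hosts whose flows look P2P-like: many distinct peers AND
    many distinct destination ports within the observation window."""
    peers = defaultdict(set)
    ports = defaultdict(set)
    for src, dst, dport in flows:
        peers[src].add(dst)
        ports[src].add(dport)
    return {h for h in peers
            if len(peers[h]) >= min_peers and len(ports[h]) >= min_ports}

# One host fanning out to 8 peers on 8 ports; one host talking HTTPS
# to a single server repeatedly.
flows = [("10.0.0.2", f"198.51.100.{i}", 40000 + i) for i in range(8)]
flows += [("10.0.0.3", "203.0.113.7", 443)] * 8
suspects = flag_p2p_hosts(flows)
```

A CEP engine expresses the same logic as standing queries over the live flow stream, so detection happens as events arrive rather than in batch.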
Cloud computing is defined as the delivery of on-demand computing resources, ranging from infrastructure and applications to datacenters, over the internet on a pay-per-use basis. Most cloud computing applications do not guarantee high levels of security, such as the privacy, confidentiality and integrity of data, because of third-party transition. This motivated the development of a Blowfish cloud encryption system that enables users to encrypt their data before storing it in the cloud. The Blowfish encryption scheme is a symmetric block cipher used to encrypt and decrypt data. A Microsoft Azure cloud server was used to test the proposed encryption system. Users are able to encrypt their data and obtain a unique identification that helps them retrieve the encrypted data from the cloud storage facility as and when needed.
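The encrypt-before-upload workflow can be sketched as follows. To keep the sketch dependency-free, a SHA-256-based keystream stands in for Blowfish; this is emphatically not the paper's cipher, and a real system would use Blowfish (or a modern cipher) in a standard mode via a vetted crypto library.

```python
import hashlib
import os

def keystream(key, nonce, length):
    """Counter-mode keystream from SHA-256 -- a toy stand-in for the
    symmetric cipher, chosen only so this sketch runs on the stdlib."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    nonce = os.urandom(16)     # fresh per message; never reuse with same key
    ks = keystream(key, nonce, len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, ks))
    return nonce + ct          # this blob is what gets uploaded to the cloud

def decrypt(key, blob):
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

key = hashlib.sha256(b"user passphrase").digest()   # illustrative key derivation
blob = encrypt(key, b"quarterly report")            # stored in cloud storage
restored = decrypt(key, blob)                       # retrieved on demand
```

The point of the design is that the cloud provider only ever sees `blob`; the key never leaves the client.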
With the increasing on-chip area available for cache, most contemporary microprocessors have moved the L2 cache onto the processor chip and added an L3 cache as a way of providing faster access to system memory, to meet the performance needs of advanced processors such as the Pentium and PowerPC. We want to determine whether the inclusion of an additional level of cache memory is a solution to the problem of slow memory access. In this paper, we analyse the effectiveness of an L3 cache in a uniprocessor system. Simulation studies with the SPECint2006 and SPECfp2006 benchmarks show that a 4MB L3 with 8-way set associativity is capable of running in a shared mode on a uniprocessor with over a 96% hit rate, decreasing the average memory access time (AMAT) by 6.5% and reducing bus traffic by 95% compared with a system without an L3. There is a dramatic reduction in bus traffic and thus an improvement in AMAT.
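The AMAT calculation behind such a comparison unrolls the hierarchy from memory back up to L1: AMAT = t1 + m1*(t2 + m2*(t3 + m3*t_mem)). The latencies and miss rates below are illustrative round numbers, not the paper's simulation parameters.

```python
def amat(levels, mem_latency):
    """Average memory access time for a cache hierarchy.

    levels is a list of (hit_time_cycles, miss_rate) from L1 outward;
    a miss at each level pays that level's hit time plus the average
    cost of going one level further out."""
    t = mem_latency
    for hit_time, miss_rate in reversed(levels):
        t = hit_time + miss_rate * t
    return t

# Hypothetical parameters: L1 1 cycle / 5% miss, L2 10 cycles / 20% miss,
# L3 30 cycles / 4% miss, memory 100 cycles.
two_level = amat([(1, 0.05), (10, 0.20)], mem_latency=100)
three_level = amat([(1, 0.05), (10, 0.20), (30, 0.04)], mem_latency=100)
```

Even with these made-up numbers, adding the L3 cuts AMAT because the 20% of L2 misses now usually stop at a 30-cycle L3 instead of paying the full 100-cycle trip to memory.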
Particle Swarm Optimization (PSO) is a metaheuristic optimization algorithm that has been used to solve complex optimization problems that traditional techniques find very difficult to solve. Interior-Point Methods (IPMs) are efficient tools for solving nonlinear optimization problems, and are now believed to be the most robust algorithms for solving large-scale nonlinear optimization problems with constraints that are active at the current point. Though very efficient, they are still plagued by several challenges, such as how to handle non-convexity, the cumbersome procedure for updating the barrier parameter in the presence of nonlinearities, and the need to ensure progress toward the solution. In order to overcome some of the shortcomings of standard PSO, such as premature convergence and particles being trapped in local minima, we propose the Primal-Dual Interior Point Particle Swarm Optimization (pdipmPSO) algorithm. We apply the primal-dual method to each particle for a finite number of iterations, and feed the PSO with the output of the primal-dual step. We compared the performance of our new algorithm (pdipmPSO) with IPM and PSO using 13 different benchmark functions. Optimization results reveal that pdipmPSO performs better than PSO and IPM. Our proposed algorithm is shown to have a great capacity to prevent premature convergence and the curse of particles being trapped in local minima, which has characterised many variants of PSO.
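The PSO half of the hybrid can be sketched as the standard gbest update: each particle's velocity blends its inertia, a pull toward its personal best, and a pull toward the global best. This shows plain PSO only; the per-particle primal-dual refinement the paper adds is omitted, and the coefficients below are the usual textbook defaults, not the paper's settings.

```python
import random

def pso(objective, dim, n=30, iters=200, lb=-5.0, ub=5.0,
        w=0.7, c1=1.5, c2=1.5):
    """Plain gbest Particle Swarm Optimization on a box-bounded domain."""
    random.seed(0)
    X = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                       # personal bests
    g = min(P, key=objective)[:]                # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])
                           + c2 * random.random() * (g[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lb), ub)
            if objective(X[i]) < objective(P[i]):
                P[i] = X[i][:]
                if objective(P[i]) < objective(g):
                    g = P[i][:]
    return g, objective(g)

best, fit = pso(lambda x: sum(v * v for v in x), dim=5)
```

In the hybrid, a primal-dual interior-point step would periodically refine each particle's position before the swarm update, trading some of PSO's exploration for the IPM's fast local convergence.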
This paper presents a hybrid algorithm, called the Primal-Dual-PSO algorithm, to address the problem of swarm robotics flocking motion. The algorithm combines the explorative ability of PSO with the exploitative capacity of the Primal-Dual Interior Point Method. We hypothesize that the fusion of the two algorithms provides a strong probability of avoiding premature convergence, and also ensures that the robots are not trapped in local minima. Our simulation results provide a clear indication of the effectiveness of the algorithm. The hybrid algorithm performs better in terms of precision, rate of convergence, steadiness, robustness and flocking capability for a homogeneous set of swarm robots.
Attacks on computer systems are becoming progressively more frequent. Many machine learning techniques have been developed in a bid to increase the effectiveness of intrusion detection systems (IDS). However, the sophistication of intrusion attacks on computer networks and the large size of the datasets pose a serious challenge, as they drastically reduce the effectiveness of an IDS. We do not propose any new algorithm in this paper; rather, experiments were conducted to investigate the performance of six machine learning techniques found in the literature and how effectively they can detect intrusion activities on a network. This work examines how effectively each algorithm under investigation handles intrusion events. In our experiment, the NSL-KDDTrain+ dataset was partitioned into training subgroups according to the type of network protocol. Subsequently, extraneous and unneeded attributes were removed from each training subgroup, and the effectiveness of the algorithms was evaluated. The experimental results show that the Logistic Model Tree induction method is the most effective (classification accuracy: 99.40%, F-measure: 0.991, false positive rate: 0.32%, precision: 98.90%, Receiver Operating Characteristic: 98.6%) compared with the other five machine learning techniques we investigated.
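The evaluation measures quoted above all derive from a binary confusion matrix with the attack class as positive. A small helper makes the definitions explicit; the counts in the example are illustrative, not the NSL-KDD results.

```python
def ids_metrics(tp, fp, tn, fn):
    """Standard binary classification measures from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                 # true positive rate
    fpr = fp / (fp + tn)                    # false positive rate
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, fpr, f_measure

# Hypothetical counts for a detector tested on 2000 balanced records.
acc, prec, rec, fpr, f1 = ids_metrics(tp=980, fp=11, tn=989, fn=20)
```

Reporting the false positive rate alongside accuracy matters for an IDS: even a small FPR translates into a flood of false alarms on a busy network.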