Abstract Drilling through reactive shale formations is normally associated with borehole instability, a problem the petroleum industry continues to face every day, with enormous losses incurred as a result. This problem has typically been tackled through the application of diesel and synthetic oil-based muds during drilling. These muds, however, have adverse impacts on the ecosystem, and although improved water-based mud formulations have been deployed in the field, the problem persists. In the quest to ensure better drilling practices, mathematical modelling as a predictive tool before drilling has proven to reduce wellbore problems and improve efficiency. These modelling procedures are applied to forecast suitable mud formulations, safe mud weights and optimum borehole trajectories that would negate wellbore instabilities. This manuscript highlights the mathematical modelling approaches to wellbore stability analysis hitherto in use and their limitations, with the aim of providing a more profound and concise understanding of wellbore stability modelling. A comparison of the various types of mathematical models indicated a narrowing of the predicted mud weight window with an increase in the number of transport variables considered: the mud weight window narrows in the order of the elastic, poroelastic, chemo-poroelastic, thermo-poroelastic and chemo-thermo-poroelastic models. This review introduces the concept of wellbore stability, its types, and its causes. Furthermore, a comparison of the conventional modelling approaches is provided and recommendations for future research areas are highlighted.
The advancement and rapid development of mobile money has created extraordinary opportunities for poor people in developing countries to contribute to the development of the economy. Mobile money has become an effective and efficient mechanism for the electronic transfer of money. Despite remarkable improvements in recent decades, the economy of Afghanistan has still not improved, and its citizens are among the poorest in the world. Because the system is undigitized and corruption is widespread, there is distrust of banks, bribery, graft of employee salaries by supervisors, and high costs of currency transport. Accessibility to basic financial services is very limited due to the lack of e-transaction and banking services. The introduction of mobile money as an electronic means of transferring money can be a partial solution to the problems faced in Afghanistan. However, the lack of system quality, information quality and service quality assurances is an essential barrier to the success and usage of mobile money and needs to be addressed. This paper aims to determine the impact and influence of system, information, and service qualities on the use of mobile money. The study can help unbanked citizens access basic financial services through mobile money. It can also help owners assess mobile money in order to improve system, information, and service qualities. The DeLone and McLean IS Success Model was adapted as the theoretical framework to evaluate the success of mobile money services based on four influencing factors, which form the main objectives of the paper. Based on a review of the literature, a quantitative research method was applied for data collection through an online survey questionnaire. The data were collected randomly from users of mobile money in Afghanistan and analyzed with SPSS and PLS-SEM.
Based on the analysis of the participants' responses, system quality, information quality, service quality and customer value were seen to have a significant and positive impact on the success of mobile money services in Afghanistan.
This paper presents a user study of the perception of cryptocurrency-based transactions from the Islamic viewpoint. The motivation lies in the fact that some users of cryptocurrency-based transactions have raised concerns about the nature of transactions with Bitcoin. Specifically, some argued that Bitcoin can easily be used for illegal purposes. Therefore, the Technology Acceptance Model was adopted and a quantitative research methodology was utilized to formulate and test hypotheses leading to the establishment of a model. A sample of 306 participants was used in the study. The results of the hypothesis testing indicate that Behavioral Intention to Use Cryptocurrency from the Islamic perspective is influenced directly by Shari'ah Compliance, Perceived Ease of Use, Emotionality, Perceived Usefulness, and Financial Concern. As evident from the analysis, Emotionality is influenced directly by Financial Concern and Shari'ah Compliance, whereas Behavioral Intention is influenced indir...
Multi-agent systems and consensus problems represent the theoretical aspect of Quadratic Stochastic Operators (QSOs). The doubly stochastic quadratic operators (DSQOs) on the two-dimensional simplex (2DS) expose a complex problem within QSO and majorization theories in non-linear models. In such models, the DSQOs form a very large class, which is more convenient to study from the perspective of its sub-classes. Consequently, DSQOs are further classified into the sub-class of extreme doubly stochastic quadratic operators (EDSQOs). In this work, the limit behavior of the trajectories of EDSQOs on the two-dimensional simplex (2DS) is studied. In turn, we present further results on related generalizations of DSQOs based on their EDSQO sub-classes. This work demonstrates that an EDSQO has convergent, fixed, or periodic points. If each positive point of the EDSQO has an undirected interaction (its update is shared among other points), the operator is convergent. ...
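The trajectory of a quadratic stochastic operator can be iterated directly on the simplex. The sketch below is illustrative only: the coefficient tensor is the uniform choice, which satisfies non-negativity, symmetry in (i, j) and column-stochasticity, but it is not the EDSQO studied in the paper.

```python
# Iterate a quadratic stochastic operator V(x)_k = sum_{i,j} P[i][j][k] * x[i] * x[j]
# on the two-dimensional simplex. P here is the uniform (illustrative) tensor
# P[i][j][k] = 1/3, NOT the paper's EDSQO.

def apply_qso(P, x):
    n = len(x)
    return [sum(P[i][j][k] * x[i] * x[j] for i in range(n) for j in range(n))
            for k in range(n)]

def trajectory(P, x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(apply_qso(P, xs[-1]))
    return xs

n = 3
P = [[[1.0 / n] * n for _ in range(n)] for _ in range(n)]
xs = trajectory(P, [0.7, 0.2, 0.1], 10)
final = xs[-1]
```

For this uniform operator every iterate stays on the simplex and the trajectory converges to the centre (1/3, 1/3, 1/3), a simple example of the convergent behaviour discussed in the abstract.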
The Internet of Things (IoT) smart city initiative has transformed the technology spectrum into a new era of development. The increasing amount of data generated by millions of IoT devices, and the rapid flow of data across distributed IoT devices, is transmitted to remotely located cloud infrastructure over the Internet. Unfortunately, handling these large amounts of data and their flow over traditional energy-intensive network infrastructure is neither efficient nor substantially scalable. It is essential to design a comprehensive network infrastructure to handle large amounts of high-speed data processing in an IoT spectrum. Blockchain and Software-Defined Networking (SDN) approaches can leverage the scalability of the environment for the IoT spectrum. In addition, the emergence of distributed cloud technology and the Li-Fi spectrum can transform the data-processing capability of IoT devices. The challenge lies in efficiently blending the integration of Li-Fi, Blockchain, SDN and...
A 3D map for mobile devices provides a more realistic view of an environment and serves as a better navigation aid. Previous research studies show differences in the effect of 3D maps on the acquisition of spatial knowledge. This is attributed to differences in mobile device computational capabilities. Crucial to this is the time it takes for a 3D map dataset to be rendered for a complete navigation task. Different findings suggest different approaches to solving the problem of the time required for both in-core (on the mobile device) and out-core (remote) rendering of 3D datasets. Unfortunately, analytical techniques required to show the impact of the computational resources needed to use 3D maps on mobile devices have been neglected by the research community. This paper uses a Support Vector Machine (SVM) to analytically classify the mobile device computational capabilities required for a 3D map suitable for use as a navigation aid. Fifty different smartphones were categorized on the basis of their Graphical Processing Unit (GPU), display resolution, memory and size. The result of the proposed classification shows high accuracy.
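As a rough sketch of this kind of capability classification, a minimal linear classifier trained with the perceptron rule stands in for the paper's SVM (the fifty-phone dataset, feature scaling, and kernel choice are not reproduced here; the features and labels below are invented for illustration).

```python
# Hypothetical normalised phone features (GPU score, display resolution, memory);
# label +1 = suitable for 3D-map navigation, -1 = not suitable. Invented data.
data = [([0.9, 0.8, 0.9], 1), ([0.8, 0.9, 0.7], 1), ([0.7, 0.7, 0.8], 1),
        ([0.2, 0.3, 0.1], -1), ([0.1, 0.2, 0.2], -1), ([0.3, 0.1, 0.3], -1)]

def train_perceptron(data, epochs=20):
    """Perceptron rule: on each misclassified sample, nudge weights toward it."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, y in data:
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                mistakes += 1
        if mistakes == 0:  # converged: all samples correctly classified
            break
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

w, b = train_perceptron(data)
```

On this linearly separable toy set the rule converges within a few epochs; a margin-maximising SVM, as used in the paper, would choose a similar separating plane but with a maximum-margin guarantee.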
Prior research studies have shown that the Peak Signal to Noise Ratio (PSNR) is the most frequently used watermarked image quality metric for determining the strengths and weaknesses of watermarking algorithms. Conversely, Normalised Cross Correlation (NCC) is the most common metric used after attacks are applied to a watermarked image to verify the strength of the algorithm. Many researchers have used these approaches to evaluate their algorithms. These strategies have been used for a long time, which unfortunately limits the value of PSNR and NCC in reflecting the strengths and weaknesses of watermarking algorithms. This paper considers this issue in order to determine threshold values for these two parameters that reflect the strength or weakness of a watermarking algorithm. We used our novel watermarking technique to embed four watermarks in the intermediate significant bits (ISB) of six image files one by one, replacing the image pixels with new pixels while, at the same time, keeping the new pixels very close to the original pixels. This approach gains improved robustness based on the PSNR and NCC values gathered. A neural network model was built that uses the image quality metric (PSNR and NCC) values obtained from the watermarking of six grey-scale images using ISB as the desired output, trained on each watermarked image's PSNR and NCC. The neural network predicts the watermarked image's PSNR together with its NCC after attacks when a portion of the output of the same or different types of image quality metrics (PSNR and NCC) is obtained. The results indicate that the NCC metric fluctuates before the PSNR values deteriorate.
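For reference, the two metrics named above have standard definitions; a sketch for 8-bit grey-scale images flattened to pixel lists (this NCC variant omits mean subtraction, a common choice in watermarking papers, though the paper's exact formulation is not stated in the abstract; the pixel values are illustrative).

```python
import math

def psnr(original, distorted, peak=255.0):
    """Peak Signal to Noise Ratio in dB between two equal-length pixel lists."""
    mse = sum((o - d) ** 2 for o, d in zip(original, distorted)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)

def ncc(original, distorted):
    """Normalised cross correlation (no mean subtraction); 1.0 = identical shape."""
    num = sum(o * d for o, d in zip(original, distorted))
    den = math.sqrt(sum(o * o for o in original) * sum(d * d for d in distorted))
    return num / den

orig = [52, 55, 61, 59, 79, 61, 76, 61]   # illustrative original pixels
wmkd = [53, 55, 60, 59, 78, 61, 77, 61]   # illustrative watermarked pixels
```

Here the one-level pixel changes give a PSNR of roughly 51 dB and an NCC very close to 1, matching the intuition that both metrics stay high for imperceptible embedding.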
Mobile Government, or mGovernment, has evolved recently as another platform for eGovernment systems on mobile devices or smart mobile devices. This paper investigates the factors that influence the use of Kingdom of Saudi Arabia (KSA) mGovernment services. The motivation of this study dwells on the inability of the Saudi National e-Government portal to elucidate the facet of mobile apps and related services under the Government services link in the portal. Furthermore, the use of mobile devices for many government services in KSA remains tacit: it is unclear whether the services suit mobile devices, and there are no specifications on the mobile platform for the use of mobile applications for many government services. In regard to these drawbacks, this research formulated hypotheses in order to test our claim that the factors that influence the use of KSA government services on mobile devices are yet to be in the public domain. A quantitative survey research method was used; a random sample of 240 respondents who use some KSA government services on mobile devices participated voluntarily in this study. Statistical analyses were used for the hypothesis testing. The results reveal the factors necessary for using government services on mobile devices in KSA. Copyright, IJAR, 2016. All rights reserved.
When crude oil prices began to escalate in the 1970s, conventional methods were the predominant methods used in forecasting oil prices. These methods can no longer be used to tackle the nonlinear, chaotic, non-stationary, volatile, and complex nature of crude oil prices because of the methods' linearity. To address these methodological limitations, computational intelligence techniques and, more recently, hybrid intelligent systems have been deployed. In this paper, we present an extensive review of the existing research on applications of computational intelligence algorithms to crude oil price forecasting. Analysis and synthesis of published research in this domain, as well as the limitations and strengths of existing studies, are provided. This paper finds that conventional methods are still relevant in the domain of crude oil price forecasting and that the integration of wavelet analysis and computational intelligence techniques is attracting unprecedented interest from scholars in the domain. We intend for researchers to use this review as a starting point for further advancement, as well as for an exploration of other techniques that have received little or no attention. Energy demand and supply projection can be tackled effectively with accurate forecasting of crude oil prices, which can create stability in the oil market.
This paper utilizes a modular neural network for the prediction of possible emergency locations during the Hajj pilgrimage. Available location, localization and positioning determination systems have become increasingly important for use in day-to-day activities. These systems dwell on various scientific tools which ensure that they provide an accurate response to the needed service at the right time. Unfortunately, some tools face drawbacks: either their use is not appropriate, they do not give reliable results, or the results obtained in one scenario may not apply to other scenarios. For these reasons, we utilize a modular neural network to analyse the determination of possible emergency locations within points of interest of the Hajj pilgrimage in Mecca, Saudi Arabia. The prediction results are generated using longitude, latitude and distances as the dataset: the modular neural network takes longitude and latitude as inputs and predicts distances within pilgrims' possible points of interest. The learning systems were trained on the collected data. Experimental investigation demonstrated that the modular network produces higher prediction accuracy compared to other tools. This finding can contribute to the design of add-on applications which provide location-based services for possible emergency locations.
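The distance targets for such a longitude/latitude dataset are typically computed with the haversine great-circle formula; the abstract does not state the paper's exact distance measure, so this is an assumption, and the two points of interest below carry only approximate, illustrative coordinates.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two (latitude, longitude) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Approximate coordinates of two Hajj points of interest (illustrative values):
masjid_al_haram = (21.4225, 39.8262)
mina = (21.4133, 39.8933)
d = haversine_km(*masjid_al_haram, *mina)
```

A network trained on (longitude, latitude) inputs would learn to reproduce such distances; here the two points come out roughly 7 km apart.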
This paper proposes a support vector machine (SVM) model to advance the prediction accuracy of the global land-ocean temperature (GLOT), which is globally significant for understanding the future pattern of climate change. The GLOT dataset was collected from NASA's GLOT index (°C, anomaly with base period 1951–1980) for the period 1880 to 2013. We categorise the dataset by decade to describe the behaviour of the GLOT within each decade. The dataset was used to build an SVM model to predict future values of the GLOT. The performance of the model was compared with a multilayer perceptron neural network (MLPNN) and validated statistically. The SVM was found to perform significantly better than the MLPNN in terms of mean square error and root mean square error, although the computational times of the two models are statistically equal. The SVM model was used to project the GLOT from the pre-existing NASA GLOT index for the next 20 years (2013–2033). The projection results of our study can be of value to policy makers, such as intergovernmental organisations related to environmental studies, e.g., the Intergovernmental Panel on Climate Change (IPCC).
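Time-series prediction with an SVM (or an MLPNN) is usually set up by converting the series into lagged input/target pairs; a sketch of that windowing step follows. The lag of 3 and the anomaly values are illustrative assumptions, not the paper's actual window length or NASA's data.

```python
def make_windows(series, lag):
    """Turn a 1-D series into (lagged inputs, next-value target) pairs."""
    inputs, targets = [], []
    for i in range(len(series) - lag):
        inputs.append(series[i:i + lag])   # lag consecutive values as features
        targets.append(series[i + lag])    # the value that follows them
    return inputs, targets

# Illustrative temperature-anomaly values, not NASA's actual GLOT index:
glot = [-0.16, -0.08, -0.10, -0.16, -0.28, -0.12]
X, y = make_windows(glot, lag=3)
```

The resulting (X, y) pairs can then be fed to any regressor; projecting 20 years ahead, as the paper does, means feeding each prediction back in as the newest lag value.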
Data transformation (normalization) is a method used in data preprocessing to scale the range of values in the data to a uniform scale, improving the quality of the data and, as a result, the prediction accuracy. However, some scholars have questioned the efficacy of data normalization, arguing that it can destroy the structure of the original (raw) data. To address these arguments, we compared the prediction performance of the two approaches in the domain of crude oil prices, chosen for its global significance. It was found that the multilayer perceptron neural network model built with normalized data significantly outperformed the multilayer perceptron neural network built with raw data. The number of iterations and the computation time for the two approaches were statistically equal, as was the regression. In view of the arguments in the literature about data standardization, the results of this research can help researchers in the domain of crude oil price prediction to choose the better option.
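The transformation at issue is typically min-max scaling to [0, 1]; a sketch of the forward and inverse maps (the abstract does not state which scaling scheme the paper used, so this is an assumption, and the prices are invented).

```python
def minmax_fit(values):
    """Record the range of the training data."""
    return min(values), max(values)

def minmax_transform(values, lo, hi):
    """Scale values into [0, 1] using the fitted range."""
    return [(v - lo) / (hi - lo) for v in values]

def minmax_inverse(scaled, lo, hi):
    """Map model outputs back to the original units."""
    return [s * (hi - lo) + lo for s in scaled]

prices = [10.0, 20.0, 30.0, 25.0]  # illustrative crude oil prices, USD/barrel
lo, hi = minmax_fit(prices)
scaled = minmax_transform(prices, lo, hi)
restored = minmax_inverse(scaled, lo, hi)
```

The map is invertible, which is why predictions made on normalized data can be reported in the original price units; the debate the paper addresses is whether the rescaling itself helps or harms the learner.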
Global warming is attracting attention from policy makers due to its impacts, such as floods, extreme weather, a 0.7°C increase in temperature, heat waves, storms, etc. These disasters result in loss of human life and billions of dollars in property damage. Global warming is believed to be caused by the emissions of greenhouse gases due to human activities, including the emission of carbon dioxide (CO2) from petroleum consumption. The limitations of previous methods of predicting CO2 emissions, and the lack of work on the prediction of the Organization of the Petroleum Exporting Countries (OPEC) CO2 emissions from petroleum consumption, motivated this research. The OPEC CO2 emissions data were collected from the Energy Information Administration. The adaptability and performance of the Artificial Neural Network (ANN) motivated its choice for this study. To improve the effectiveness of the ANN, the cuckoo search algorithm was hybridised with accelerated particle swarm optimisation to train the ANN and build a model for the prediction of OPEC CO2 emissions. The proposed model predicts OPEC CO2 emissions for 3, 6, 9, 12 and 16 years ahead with improved accuracy and speed over the state-of-the-art methods. An accurate prediction of OPEC CO2 emissions can serve as a reference point for propagating the reorganisation of economic development in OPEC member countries with a view to reducing CO2 emissions to Kyoto benchmarks, hence reducing global warming. The policy implications are discussed in the paper.
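The accelerated-PSO half of that hybrid can be sketched in isolation: in Yang's velocity-free variant, each particle is pulled toward the current global best with a decaying Gaussian perturbation. The cuckoo-search component and the ANN training objective are omitted here; a simple sphere function stands in for the loss, and all parameter values are illustrative.

```python
import random

def apso_minimise(f, dim, n_particles=20, iters=200, beta=0.5, alpha0=0.5,
                  gamma=0.95, seed=1):
    """Accelerated PSO: x_i <- (1-beta)*x_i + beta*best + alpha_t * N(0, 1),
    with alpha_t = alpha0 * gamma**t; no per-particle velocity or memory."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    best = min(xs, key=f)
    for t in range(iters):
        alpha = alpha0 * gamma ** t  # exploration shrinks over time
        for i, x in enumerate(xs):
            xs[i] = [(1 - beta) * xj + beta * bj + alpha * rng.gauss(0, 1)
                     for xj, bj in zip(x, best)]
        cand = min(xs, key=f)
        if f(cand) < f(best):
            best = cand
    return best

sphere = lambda x: sum(v * v for v in x)  # stand-in for the ANN training loss
best = apso_minimise(sphere, dim=2)
```

In the paper's setting, `f` would be the ANN's training error over the OPEC emissions data and each particle a candidate weight vector, with cuckoo search supplying additional exploration moves.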
This study was conducted to validate the use of a modified algometer device to measure mechanical nociceptive thresholds in six dogs. Dogs were administered morphine intravenously (IV) at 1 mg/kg or an equal volume of saline in a crossover design with a one-week washout period. Mechanical nociceptive thresholds were determined before treatment, at 5 minutes after the administration of treatments, and hourly for 8 hours. Thresholds were recorded at the carpal pad, metacarpal foot pad, tibia, femur, and abdomen. Heart rate, body temperature, and respiration were recorded at the same time points. Thresholds increased significantly (P < 0.05) from baseline values for up to 3 hours at the tibia and abdomen, 4 hours at the metacarpal pad, and 5 hours at the carpal pad and femur. Hypothermia, bradycardia, and changes in respiration were observed in all dogs after morphine injection. Saline did not alter any threshold levels during the eight-hour study period, indicating no evidence of tolerance, learned avoidance, or local hyperaesthesia. The device and methods of testing were well tolerated by all the dogs. The results suggest that the modified algometer and method of application are useful for measuring mechanical nociceptive thresholds in awake dogs.
A Neural Network is said to emulate the brain, although its processing is not quite how the biological brain really works. The Neural Network has witnessed significant improvement from 1943 to date. However, modifications of the Neural Network mainly focus on the structure itself, not the activation function, despite the critical role of the activation function in the performance of the Neural Network. In this paper, we present a modification of the Neural Network activation function to improve the performance of the Neural Network. The theoretical background of the modification, including a mathematical proof, is fully described in the paper. The modified activation function is code-named SigHyper. The performance of SigHyper was evaluated against a state-of-the-art activation function on a crude oil price dataset. Results suggest that the proposed SigHyper improved the accuracy of the Neural Network, and analysis of variance showed that the improvement is significant. It was established that SigHyper requires further improvement. The activation function proposed in this research adds to the activation functions already discussed in the literature. The study may motivate researchers to further modify activation functions and hence improve the performance of the Neural Network.
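The abstract does not reproduce the SigHyper formula, so the sketch below shows only the mechanics of the idea: a hypothetical sigmoid variant (illustrative, not the paper's actual function) swapped into a one-neuron forward pass in place of the standard logistic activation.

```python
import math

def sigmoid(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

def sighyper_like(x, k=1.5):
    """Hypothetical steeper sigmoid variant; NOT the paper's actual SigHyper."""
    return 1.0 / (1.0 + math.exp(-k * x))

def neuron(weights, bias, inputs, activation):
    """Single neuron: weighted sum followed by a pluggable activation."""
    z = sum(w * v for w, v in zip(weights, inputs)) + bias
    return activation(z)

out_std = neuron([0.4, -0.2], 0.1, [1.0, 2.0], sigmoid)
out_mod = neuron([0.4, -0.2], 0.1, [1.0, 2.0], sighyper_like)
```

Because the activation is just a parameter of the forward pass, evaluating a modified function against the standard one, as the paper does on crude oil price data, requires no change to the network structure itself.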