Mehran University Research Journal of Engineering and Technology
Heading and position control of ships has remained a challenging control problem. A ship is a nonlinear, multiple-input multiple-output system, and its dynamics vary with operating and environmental conditions. Conventionally, a simple Proportional Integral Derivative controller is used, which has well-known limitations. Other conventional control techniques have also been investigated, but they require an accurate mathematical model of a ship, and such accuracy is very difficult to achieve. During the past few decades, computational intelligence techniques such as artificial neural networks have been very successful where an accurate mathematical model is not available. Therefore, in this paper, an artificial neural network controller is proposed for the heading and position control system. For simulation purposes, a mathematical model with four effective thrusters has been chosen to test the performance of the proposed controller. T...
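The abstract does not specify the controller's architecture, so the following is only a minimal sketch of the general idea: a small feed-forward network mapping pose errors and their rates to commands for four thrusters. All layer sizes, the input layout, and the weights are assumptions for illustration, not the paper's design.

```python
import numpy as np

# Hypothetical sketch: a tiny feed-forward network that maps the ship's
# state error (surge, sway, heading error and their rates) to commands
# for four thrusters. Sizes and weights are illustrative only.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(16, 6))   # hidden layer: 16 units, 6 inputs
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(4, 16))   # output layer: 4 thruster commands
b2 = np.zeros(4)

def nn_controller(error_state):
    """error_state: [ex, ey, e_psi, ex_dot, ey_dot, e_psi_dot]."""
    h = np.tanh(W1 @ error_state + b1)      # bounded hidden activations
    u = np.tanh(W2 @ h + b2)                # thruster commands in [-1, 1]
    return u

# One control step: desired pose minus measured pose, plus rate errors.
u = nn_controller(np.array([2.0, -1.5, 0.3, 0.0, 0.0, 0.0]))
print(u)  # four normalized thruster commands
```

In practice such a network would be trained (e.g., against a reference controller or by reinforcement) rather than used with random weights; the sketch only shows the input-to-thruster mapping the abstract describes.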
Hundreds of people die from heart disease almost every day; that is how devastating a delayed diagnosis can be. Living in an advanced era full of intelligent systems, this growing number of deaths can be reduced. This paper focuses on the development of a cardiovascular disease prediction system, particularly for heart disease, by developing machine learning classifiers, for instance Support Vector Machine (SVM), Decision Tree, and XGBoost classifiers. We also scaled the features to standardize the unconstrained features in the data into a fixed range for better optimization of the models. For efficiency, the features were also classified into two categories: independent features and dependent features. Furthermore, the performance measures supported best practices for model assessment and classifier comparison. Eventually, after tuning hyper-parameters, the results exhibit the highest accuracy for XGBoost among the trained classifiers. After a comparative analysis, the be...
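As a rough illustration of the pipeline the abstract describes (feature scaling, separation of independent and dependent features, hyper-parameter tuning, and an XGBoost classifier), here is a minimal sketch; the dataset file, column name, and parameter grid are assumptions, not the paper's.

```python
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import StandardScaler
from xgboost import XGBClassifier

# Hypothetical heart-disease dataset: 'target' is the dependent feature,
# everything else is treated as an independent feature.
df = pd.read_csv("heart.csv")
X, y = df.drop(columns=["target"]), df["target"]

# Standardize unconstrained features (zero mean, unit variance).
X = StandardScaler().fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Small, illustrative hyper-parameter grid; the paper's grid is not given.
grid = GridSearchCV(
    XGBClassifier(eval_metric="logloss"),
    param_grid={"max_depth": [3, 5],
                "n_estimators": [100, 300],
                "learning_rate": [0.05, 0.1]},
    cv=5,
)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```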
2020 3rd International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), 2020
In today’s era, data is stored for later analysis, but many significant applications, such as wireless sensor networks (WSNs) and medical science, produce uncertain data. The recorded uncertain data can still be analyzed to produce probabilistic answers, but conventional DBMSs are designed on first-order logic, so they are unable to store and process data with uncertainty or missing values. At the same time, it is not beneficial to delete uncertain data, as this may affect the results. To deal with uncertain data, different research groups at the world's renowned institutes have developed probabilistic DBMSs; for example, a research group at Oxford University developed MayBMS, a probabilistic database management system for analyzing uncertain data. Before using a probabilistic DBMS to manage uncertain data, however, the uncertainty in the data should be quantified using probability theory, so that the correctness of each record is known. The purpose of this paper is to find a way to measure the uncertainty present in the data before managing it: managing uncertain data is the second phase, and the first is to establish the correctness or falseness of each record in the dataset.
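The paper's actual measure is not given in the abstract, so the following is only one plausible sketch of the idea of attaching a probability of correctness to each record before loading it into a probabilistic DBMS such as MayBMS: here each attribute value's probability is estimated from its relative frequency, and a record's confidence is the product of its attribute probabilities (a naive independence assumption).

```python
import pandas as pd

# Hypothetical sensor readings with a possible outlier.
df = pd.DataFrame({
    "sensor": ["s1", "s1", "s2", "s2", "s2"],
    "temp":   [21,   21,   21,   45,   21],
})

def record_confidence(df):
    """Estimate P(record is correct) as the product of per-attribute
    relative frequencies -- a naive independence assumption."""
    conf = pd.Series(1.0, index=df.index)
    for col in df.columns:
        freq = df[col].value_counts(normalize=True)
        conf *= df[col].map(freq)
    return conf

df["p_correct"] = record_confidence(df)
print(df)  # the outlier reading (temp=45) receives the lowest probability
```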
The edge computing model offers an ultimate platform to support scientific and real-time workflow-based applications at the edge of the network. However, scientific workflow scheduling and execution still face challenges such as response time management and latency. This requires dealing with the acquisition delay of servers deployed at the edge of the network while reducing the overall completion time of the workflow. Previous studies show that existing scheduling methods consider the static performance of the server and ignore the impact of resource acquisition delay when scheduling workflow tasks. Our proposed method presents a meta-heuristic algorithm that schedules the scientific workflow and minimizes the overall completion time by properly managing the acquisition and transmission delays. We carried out extensive experiments and evaluations based on commercial clouds and various scientific workflow templates. The proposed method has approximately 7.7% better performance than the bas...
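The abstract names a meta-heuristic but not which one, so the sketch below uses simulated annealing purely as a stand-in; the delay model (a one-time acquisition delay per server, plus compute and transmission time per task) and all numbers are assumptions.

```python
import math
import random

random.seed(0)

# Hypothetical model: 8 independent tasks, 3 edge servers. Each server has a
# one-time acquisition delay; each task has (compute cost, transfer cost).
ACQ = [5.0, 1.0, 3.0]                 # acquisition delay per server
SPEED = [2.0, 1.0, 1.5]               # relative speed per server
TASKS = [(4, 1), (6, 2), (2, 1), (8, 3), (3, 1), (5, 2), (7, 2), (4, 1)]

def makespan(assign):
    """Completion time: each server pays its acquisition delay once, then
    runs its tasks (compute/speed + transmission) sequentially."""
    load = [0.0] * len(ACQ)
    used = [False] * len(ACQ)
    for (compute, transfer), s in zip(TASKS, assign):
        if not used[s]:
            load[s] += ACQ[s]
            used[s] = True
        load[s] += compute / SPEED[s] + transfer
    return max(load)

# Simulated annealing over task -> server assignments.
cur = [random.randrange(len(ACQ)) for _ in TASKS]
best, T = cur[:], 10.0
while T > 0.01:
    cand = cur[:]
    cand[random.randrange(len(TASKS))] = random.randrange(len(ACQ))
    delta = makespan(cand) - makespan(cur)
    if delta < 0 or random.random() < math.exp(-delta / T):
        cur = cand
    if makespan(cur) < makespan(best):
        best = cur[:]
    T *= 0.95                          # geometric cooling schedule
print(best, makespan(best))
```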
During recent decades, several studies have been conducted in the field of weather forecasting, providing various promising forecasting models. Nevertheless, the accuracy of the predictions remains a challenge. In this paper a new forecasting approach is proposed: it implements a deep neural network based on powerful feature extraction. The model is capable of deducing the irregular structure, non-linear trends, and significant representations as features learnt from the data. It is a six-layered deep architecture with four hidden Restricted Boltzmann Machine (RBM) layers. The extracts from the last hidden layer are pre-processed to support the accuracy achieved by the forecaster. The forecaster is a two-layer ANN model with 35 hidden units for predicting the future intervals. It captures the correlations and regression patterns of the current sample relative to the previous terms by using the learnt deep hierarchical representations of the data as input to the forecaster.
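A minimal sketch of that two-stage idea (stacked RBM feature extraction feeding a small ANN forecaster), using scikit-learn's BernoulliRBM and MLPRegressor as stand-ins. The synthetic data, the lag window, and the RBM sizes are assumptions, and the abstract's four RBM layers are reduced to two here for brevity; only the forecaster's 35 hidden units come from the abstract.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM, MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

# Hypothetical series: a temperature-like signal, windowed into lag vectors.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.normal(size=2000)
LAGS = 24
X = np.array([series[i:i + LAGS] for i in range(len(series) - LAGS)])
y = series[LAGS:]

# Stage 1: stacked RBMs as an unsupervised deep feature extractor
# (BernoulliRBM expects inputs in [0, 1], hence the scaler).
features = make_pipeline(
    MinMaxScaler(),
    BernoulliRBM(n_components=64, n_iter=10, random_state=0),
    BernoulliRBM(n_components=32, n_iter=10, random_state=0),
)
Z = features.fit_transform(X)

# Stage 2: a small ANN forecaster with 35 hidden units, as in the abstract.
forecaster = MLPRegressor(hidden_layer_sizes=(35,), max_iter=1000, random_state=0)
forecaster.fit(Z[:-200], y[:-200])
print("test R^2:", forecaster.score(Z[-200:], y[-200:]))
```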
The edge computing paradigm has experienced rapid development in recent years. This paradigm is characterized by pushing storage and computational resources closer to the end-user at the network edge. For this purpose, service providers are allowed to add resources to enriched servers at access points (APs) in networks for hosting end-users' tasks. However, the deployment of edge servers is still a technological challenge with respect to the end-user pricing model, the resource capacity of the server, the choice of a worthwhile server, the management of latency between users and servers, and so on. A careful investigation shows that most existing approaches are limited in many ways: (1) they tend to consider a single service-provider configuration with a single pricing model only, and (2) they tend to ignore real-time performance variations of edge resources. In this work, we present a meta-heuristic-based method for resource allocation. It overcomes the above...
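The abstract does not spell out the allocation objective, so the following is only a toy sketch of the trade-off it names: choosing an edge server per task across multiple providers' pricing models, using currently observed (not static) performance, latency, and remaining capacity. The catalogue, weights, and greedy strategy are all invented for illustration.

```python
# Hypothetical multi-provider edge catalogue: price per unit of work,
# currently observed speed (not the static rating), latency to the user,
# and remaining capacity in task slots.
servers = [
    {"id": "A1", "provider": "A", "price": 0.10, "speed": 1.8, "latency": 5,  "slots": 2},
    {"id": "A2", "provider": "A", "price": 0.04, "speed": 0.9, "latency": 20, "slots": 3},
    {"id": "B1", "provider": "B", "price": 0.07, "speed": 1.2, "latency": 8,  "slots": 2},
]

def cost(task_size, s, w_time=1.0, w_price=10.0):
    """Weighted completion-time plus price objective; weights are illustrative."""
    time = task_size / s["speed"] + s["latency"]
    return w_time * time + w_price * s["price"] * task_size

def allocate(tasks):
    plan = []
    for t in sorted(tasks, reverse=True):            # place largest tasks first
        s = min((s for s in servers if s["slots"] > 0), key=lambda s: cost(t, s))
        s["slots"] -= 1
        plan.append((t, s["id"]))
    return plan

print(allocate([12, 3, 7, 5]))
```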
In the recent era, ubiquitous, high Quality of Service (QoS), guaranteed mobile data is in high demand. With the growth of research activity, QoS guarantee mechanisms are becoming significant. IEEE 802.16 promises to provide wireless access over long distances at a variety of speeds and is becoming competitive in wireless environments. We have analyzed different QoS parameters across the various scheduling services observed in our simulated WiMAX network. The simulation study considers throughput, packet delivery ratio, number of packets dropped, delay, and jitter, with three core services for handling essential traffic: ERTPS, RTPS, and UGS. We analyzed various packet sizes with each of the services separately. The study also considered the effect of data-rate changes on these service parameters. On the basis of our simulations we propose using a packet size of 200 to ensure the delivery of the maximum number of packets to the destination. We suggest...
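For reference, the QoS metrics the study measures reduce to simple computations over a packet trace. Below is a small sketch (the trace format is invented) of how packet delivery ratio, mean delay, and jitter could be derived from send/receive timestamps.

```python
# Hypothetical trace: (packet_id, send_time_s, recv_time_s or None if dropped).
trace = [(1, 0.00, 0.04), (2, 0.01, 0.06), (3, 0.02, None), (4, 0.03, 0.09)]

delivered = [(s, r) for _, s, r in trace if r is not None]
pdr = len(delivered) / len(trace)                          # packet delivery ratio
delays = [r - s for s, r in delivered]
mean_delay = sum(delays) / len(delays)
# Jitter as the mean absolute difference between consecutive packet delays.
jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)

print(f"PDR={pdr:.2f}, delay={mean_delay*1e3:.1f} ms, jitter={jitter*1e3:.1f} ms")
```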
Weather forecasting is a challenging time series forecasting problem because of its dynamic, continuous, data-intensive, chaotic, and irregular behavior. At present, numerous time series forecasting techniques exist and are widely adopted. However, competitive research is still going on to improve the methods and techniques for accurate forecasting. This article presents time series forecasting of a meteorological parameter, temperature, with a NARX (Nonlinear AutoRegressive with eXogenous input) based ANN (Artificial Neural Network). In this work, several time-series-dependent recurrent NARX-ANN models are developed and trained with dynamic parameter settings to find the optimum network model for the desired forecasting task. Network performance is analyzed on the basis of its Mean Square Error (MSE) over the training, validation, and test data sets. To perform forecasting for the next 4-, 8-, and 12-step horizons, the model with the lowest MSE is c...
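A minimal sketch of the NARX setup: the target is predicted from lagged values of itself plus lagged values of an exogenous input. The synthetic data, the choice of humidity as the exogenous series, the lag order, and the network size are all assumptions; the paper's recurrent NARX models are approximated here by a feed-forward regressor over lag vectors.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical hourly data: temperature to forecast, humidity as the
# exogenous input u(t).
rng = np.random.default_rng(0)
n, P = 1500, 6  # series length and lag order (assumed)
humidity = 50 + 10 * np.sin(np.arange(n) / 24)
temp = 20 + 5 * np.sin(np.arange(n) / 24 + 1) + 0.2 * rng.normal(size=n)

# NARX regression matrix: y(t) predicted from P lags of y and P lags of u.
X = np.column_stack([temp[i:n - P + i] for i in range(P)] +
                    [humidity[i:n - P + i] for i in range(P)])
y = temp[P:]

narx = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
narx.fit(X[:-200], y[:-200])
print("test MSE:", np.mean((narx.predict(X[-200:]) - y[-200:]) ** 2))

# Multi-step (e.g. 4-, 8-, 12-step) forecasts are then produced recursively,
# feeding each prediction back in place of the oldest temperature lag.
```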
The alcoholic and aqueous extracts of Caesalpinia crista Linn (Caesalpiniaceae) were evaluated for protection against isoproterenol (85 mg/kg b.w.) induced myocardial infarction in albino rats. The heart damage induced by isoproterenol was indicated by elevated serum levels of marker enzymes such as Creatine Kinase-isoenzyme (CK-MB), Lactate Dehydrogenase (LDH), Serum Glutamate Oxaloacetic Transaminase (SGOT), and Serum Glutamate Pyruvate Transaminase (SGPT), with increased lipid peroxide and reduced glutathione content in heart homogenates. Microscopic examination (histopathology) was also performed on the myocardial tissue. Pretreatment with the ethanolic and aqueous extracts of Caesalpinia crista Linn at a dose of 400 mg/kg body wt., administered orally for 30 days, significantly (p < 0.01) reduced the elevated marker enzyme levels in serum and heart homogenates in isoproterenol-induced myocardial infarction. Histopathological observation revealed a marked protection by the...
Nature brings time series data every day and everywhere: for example, weather data, physiological and biomedical signals, and financial and business recordings. Predicting future observations from a collected sequence of historical observations is called time series forecasting. Forecasts are essential, considering that they guide decisions in many areas of scientific, industrial, and economic activity, such as meteorology, telecommunications, finance, sales, and stock exchange rates. A massive amount of research has been carried out over many years to develop models that improve time series forecasting accuracy. The major aim of time series modelling is to scrupulously examine the past observations of a time series and to develop an appropriate model which elucidates the inherent behaviour and patterns existing in the series. The behaviour and patterns of various time series may follow different conventions and in fact require s...
Digital data must be compressed and encrypted to maintain confidentiality and efficient bandwidth usage; these two requirements are essential for information processing in most communication systems. Combining image compression and encryption may, however, reduce restoration quality and degrade performance. This study provides a compression-and-encryption scheme for digital data named Secure-JPEG. The scheme is built on the JPEG compression format, the most widely used lossy compression scheme, and extends the standard JPEG compression algorithm to encrypt data during compression. Secure-JPEG provides encryption along with the process of compression, and it can easily be altered to provide lossless compression; on the other hand, lossless compression gives a lower compression ratio and is suitable only in specific scenarios. The paper addresses the security weakness caused by the use of a simple random number generator, which cannot be cryptographically secure. T...
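The abstract's point about the random number generator can be illustrated with a sketch: driving an encryption-during-compression step (here, a toy permutation of the 64 quantized DCT coefficients of one 8x8 JPEG block) from a keyed, cryptographically strong keystream instead of a plain PRNG. The block permutation itself is a simplification for illustration, not the actual Secure-JPEG algorithm.

```python
import hmac
import hashlib

def keyed_permutation(key: bytes, n: int):
    """Fisher-Yates shuffle driven by an HMAC-SHA256 keystream, so the
    permutation is reproducible from the key but unpredictable without it."""
    stream, counter, idx = b"", 0, list(range(n))
    def rand_below(m):
        nonlocal stream, counter
        while True:
            if len(stream) < 4:
                stream += hmac.new(key, counter.to_bytes(8, "big"),
                                   hashlib.sha256).digest()
                counter += 1
            v, stream = int.from_bytes(stream[:4], "big"), stream[4:]
            if v < (2**32 // m) * m:    # rejection sampling avoids modulo bias
                return v % m
    for i in range(n - 1, 0, -1):
        j = rand_below(i + 1)
        idx[i], idx[j] = idx[j], idx[i]
    return idx

# Toy "encryption during compression": permute one block's coefficients.
key = b"16-byte-demo-key"                 # assumed shared secret
perm = keyed_permutation(key, 64)
block = list(range(64))                   # stand-in for quantized coefficients
cipher = [block[p] for p in perm]

# Decryption inverts the permutation using the same key.
recovered = [0] * 64
for i, p in enumerate(perm):
    recovered[p] = cipher[i]
assert recovered == block
```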
2020 3rd International Conference on Information and Computer Technologies (ICICT), 2020
Forecasting exchange rates is a serious problem that is receiving increasing attention, particularly because of its difficulty and practical applications. Artificial neural networks (ANNs) have been widely used as a promising alternative approach for forecasting tasks because of several distinguishing features, and research on ANNs for forecasting exchange rates is extensive. In this paper we attempt to provide a survey of research in this area. Several design factors significantly affect the accuracy of neural network forecasts, including the selection of input variables, data preparation, and network architecture. There is no consensus about these factors; in different cases, different choices have their own effectiveness. We also describe the combination of ANNs with other methods, report comparisons between the performance of ANNs and that of other forecasting techniques, and find mixed results. Finally, future research directions in this area are discussed. This paper presents forecasts of the top traded currencies using different machine learning models, covering the top foreign exchange (Forex) currencies through a hybrid comparison of a Support Vector Regressor (SVR) and an Artificial Neural Network (ANN), Short-Term Memory (STM), and a neural network with hidden layers. These models predict the exchange rate between the world's top traded currencies, for example USD/PKR, from daily data covering 30-39 years up to December 2018.
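A minimal sketch of the SVR-versus-ANN comparison on daily exchange-rate data, with lagged rates as features; the synthetic series, the lag count, and the model settings are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

# Hypothetical daily USD/PKR series; a real run would load ~30+ years
# of historical rates from a data source.
rng = np.random.default_rng(0)
rate = 100 + np.cumsum(0.1 * rng.normal(size=3000))

P = 10  # number of daily lags used as features (assumed)
X = np.array([rate[i:i + P] for i in range(len(rate) - P)])
y = rate[P:]
split = -250  # hold out roughly the most recent trading year

models = {
    "SVR": SVR(C=10.0, epsilon=0.01),
    "ANN": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                        random_state=0),
}
for name, m in models.items():
    m.fit(X[:split], y[:split])
    print(name, "MAE:", mean_absolute_error(y[split:], m.predict(X[split:])))
```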
2019 13th International Conference on Mathematics, Actuarial Science, Computer Science and Statistics (MACS), 2019
The world has more than 5000 digital currencies; Bitcoin is one of them, with more than 5.8 million active users and more than 111 exchanges throughout the world. The aim of this paper is near-term prediction of the price of Bitcoin in USD. Price details are taken from the Bitcoin price index. A Bayesian recurrent neural network (RNN) and a long short-term memory (LSTM) network can accomplish this task; the LSTM obtains a total accuracy of 52% with an 8% RMSE. These deep learning systems are contrasted with the common ARIMA method for time series prediction: the ARIMA model cannot perform as efficiently as the deep learning models, and the deep learning methods were expected to outperform the poorly performing ARIMA prediction. We therefore also used a Gated Recurrent Unit (GRU) model to forecast the Bitcoin price. Finally, all deep learning models were run on both GPU and CPU, with the GPU implementation improving training time by 94.70 percent.
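A minimal GRU forecasting sketch in Keras, assuming daily closing prices windowed into fixed-length sequences; the synthetic series, window length, layer sizes, and training settings are illustrative, not the paper's.

```python
import numpy as np
import tensorflow as tf

# Hypothetical daily BTC/USD closes; a real run would load the price index.
rng = np.random.default_rng(0)
price = 8000 + np.cumsum(50 * rng.normal(size=1200))
scaled = (price - price.min()) / (price.max() - price.min())

W = 30  # window length (assumed): predict day t+1 from the previous 30 days
X = np.array([scaled[i:i + W] for i in range(len(scaled) - W)])[..., None]
y = scaled[W:]

model = tf.keras.Sequential([
    tf.keras.layers.GRU(32, input_shape=(W, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:-100], y[:-100], epochs=10, batch_size=32, verbose=0)
print("test MSE:", model.evaluate(X[-100:], y[-100:], verbose=0))
```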
Every year, a large number of people suffer gun-related violence all over the world. In this work, we develop a fully automated computer-based system to identify basic armaments, particularly handguns and rifles. Recent work in deep learning and transfer learning has demonstrated significant progress in object detection and recognition. We implemented the YOLO V3 ("You Only Look Once") object detection model by training it on our customized dataset. The training results confirm that YOLO V3 outperforms YOLO V2 and a traditional convolutional neural network (CNN). Additionally, intensive GPUs or high computational resources were not required in our approach, as we used transfer learning to train our model. Applying this model in our surveillance system, we can attempt to save human lives and reduce the rate of manslaughter and mass killings. Additionally, our proposed system can also be implemented in high-end surveillance and security ro...
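For reference, running a trained YOLOv3 detector over surveillance frames can be done with OpenCV's DNN module as sketched below. The config/weights/class-file names, the input image, and the confidence threshold are assumptions; the weapon-trained weights would come from the kind of transfer-learning run the abstract describes.

```python
import cv2
import numpy as np

# Assumed file names: a Darknet YOLOv3 config, weights fine-tuned on a
# weapons dataset, and the class list (e.g. "handgun", "rifle").
net = cv2.dnn.readNetFromDarknet("yolov3-weapons.cfg", "yolov3-weapons.weights")
classes = open("weapons.names").read().splitlines()
out_layers = net.getUnconnectedOutLayersNames()

img = cv2.imread("frame.jpg")
h, w = img.shape[:2]
blob = cv2.dnn.blobFromImage(img, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)

for output in net.forward(out_layers):
    for det in output:                  # det = [cx, cy, bw, bh, obj, class scores...]
        scores = det[5:]
        cls = int(np.argmax(scores))
        conf = float(scores[cls]) * float(det[4])
        if conf > 0.5:                  # assumed confidence threshold
            cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
            print(classes[cls], conf, (cx - bw / 2, cy - bh / 2, bw, bh))
```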