2017 International Conference on Intelligent Computing and Control Systems (ICICCS), 2017
kNN is frequently used as a classification method in machine learning and data mining. The main application of a kNN join is k-nearest-neighbour classification: some data points are given for training and new unlabelled points are given for testing, and the goal is to find the class label for each new point. For each unlabelled point, a kNN query on the training set is performed to estimate its class membership; this process can be viewed as a kNN join of the test data with the training set. The existing approaches for computing kNN on MapReduce are compared, first theoretically and then through an extensive experimental evaluation. To compare these solutions, kNN computation on MapReduce is divided into three generic steps: data pre-processing, data partitioning and computation. Each step is analysed theoretically for load balancing, accuracy and complexity.
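The classification step described above can be sketched in a few lines. This is a minimal single-machine illustration, not the paper's MapReduce implementation; the toy data layout and point coordinates are assumptions.

```python
from collections import Counter
import math

def knn_classify(train, test_point, k=3):
    """Classify one unlabelled point by a kNN query on the training set.
    train is a list of ((x, y), label) pairs — a hypothetical toy layout."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], test_point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "A"), ((0, 1), "A"), ((1, 0), "A"),
         ((5, 5), "B"), ((5, 6), "B"), ((6, 5), "B")]

# The kNN join: run a kNN query for every test point against the training set.
labels = [knn_classify(train, q) for q in [(0.5, 0.5), (5.5, 5.5)]]
print(labels)  # ['A', 'B']
```

On MapReduce, the partitioning step would distribute `train` across mappers so that each query only scans a relevant partition.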
Nowadays, door-operating systems face security issues such as the unwanted entry of unauthorized people. To resolve these issues, a high-security system is implemented. Rather than guarding the door with passwords or PINs, unique faces are used as a biometric trait: they cannot easily be modified or stolen, so the level of security is raised by using face detection. The proposed system uses face recognition to provide higher security, with a Haar cascade classifier for face detection and the Local Binary Pattern Histogram (LBPH) algorithm for face recognition. Whenever a person comes in front of the door, the system captures the face; if the captured face is registered, the door is unlocked, and if not, the door remains locked. The proposed system is trained and tested on 150 images and achieves an accuracy rate of 95%. Keywords— Internet of Things, ...
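The descriptor underlying LBPH can be illustrated with the basic Local Binary Pattern code: each pixel's 8 neighbours are thresholded against the centre and read as an 8-bit number. This is a minimal sketch of the per-pixel code only; the full LBPH pipeline (grid cells, per-cell histograms, histogram matching) is omitted, and the neighbour ordering chosen here is one common convention, not necessarily the paper's.

```python
def lbp_code(patch):
    """LBP code for a 3x3 patch: set bit i when neighbour i >= centre."""
    c = patch[1][1]
    # neighbours clockwise from the top-left corner (an assumed ordering)
    nbrs = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
            patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    return sum(1 << i for i, v in enumerate(nbrs) if v >= c)

patch = [[10, 20, 30],
         [40, 50, 60],
         [70, 80, 90]]
print(lbp_code(patch))  # 120: bits 3-6 set (neighbours 60, 90, 80, 70 >= 50)
```

An LBPH recognizer builds histograms of these codes over image regions and compares them between the captured face and the registered faces.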
2017 International Conference on Intelligent Computing and Control Systems (ICICCS), 2017
Agent-based discovery and monitoring is used mainly for asset discovery and monitoring. The aim is to monitor the systems in a network and to ensure adherence to IT compliance within the organization. The discovery and monitoring agent is designed to run as a system service. This paper proposes an agent service capable of discovering systems, monitoring files and restricting access, alerting when an external device is plugged into a system, and software metering. The agent has to be installed on every machine, and the service returns the required data about the machine on which it is installed. The goal of this project is to enable the administrator to discover and monitor the systems in a network.
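One polling cycle of the agent's file-monitoring duty can be sketched as a snapshot-and-diff of modification times. This is a minimal illustration under assumed semantics (polling rather than OS change notifications); the real agent also handles discovery, device alerts and software metering, which are not shown.

```python
import os

def snapshot(paths):
    """Record modification times for the files the agent monitors."""
    return {p: os.stat(p).st_mtime for p in paths if os.path.exists(p)}

def diff(old, new):
    """Compare two snapshots and report changed or removed files —
    one polling cycle of the file monitor."""
    events = []
    for path, mtime in old.items():
        if path not in new:
            events.append((path, "removed"))
        elif new[path] != mtime:
            events.append((path, "changed"))
    return events

# demo with hand-written snapshots (file names and times are hypothetical)
before = {"a.txt": 1.0, "b.txt": 2.0}
after = {"a.txt": 1.5}
print(diff(before, after))  # [('a.txt', 'changed'), ('b.txt', 'removed')]
```

A service loop would call `snapshot` on a schedule, `diff` against the previous state, and forward events to the administrator's console.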
In this paper, we propose a novel method for two-dimensional shape recognition and retrieval based on Ridgelet Principal Component Analysis (Ridgelet PCA). We first use the ridgelet transform to extract line-singularity and point-singularity features, by applying the Radon and wavelet transforms respectively, and then apply PCA to extract the effective features. For recognition and retrieval, we conducted a study using seventeen different distance-measure techniques. Training and testing use the leave-one-out strategy, and retrieval is evaluated with the standard 'bullseye' score. The proposed method is tested on the standard MPEG-7 dataset. Experimental results of Ridgelet PCA are compared with the existing PCA method and are favourable relative to the reference methods in terms of recognition and retrieval rate.
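The final PCA stage of the pipeline can be sketched as a projection onto the top principal components. This is a generic PCA sketch, not the paper's implementation; the random matrix below stands in for the ridgelet feature vectors that would normally feed this step.

```python
import numpy as np

def pca_project(X, k):
    """Project feature vectors (rows of X) onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                       # centre the features
    # rows of Vt are principal directions, ordered by explained variance
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
feats = rng.normal(size=(20, 8))   # 20 shapes x 8 hypothetical ridgelet features
reduced = pca_project(feats, 3)
print(reduced.shape)  # (20, 3)
```

Distance measures (the paper compares seventeen) are then applied in this reduced space for recognition and retrieval.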
2017 International Conference on Intelligent Computing and Control Systems (ICICCS), 2017
Cloud computing and Big Data both leverage the Hadoop framework to process Big Data in parallel. Hadoop is used to build both high-performance and large-scale clusters. The limitation the framework imposes on efficient execution due to data locality leaves scope to improve performance. This paper proposes an enhanced architecture that reduces the time taken to read data and increases the number of tasks executed. The architecture adds a layer to the NameNode which directs each task to the DataNode holding the required data, rather than searching for the data on every execution.
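The locality-aware dispatch described above amounts to a lookup from data block to the node storing it. The sketch below reduces that idea to a plain dictionary; the node and block names are hypothetical, and the real NameNode tracks replicas and rack topology that this ignores.

```python
def assign_tasks(block_locations, tasks):
    """Direct each task to a DataNode that already stores its input block;
    fall back to any node when the block's location is unknown."""
    return {task: block_locations.get(block, "any-node")
            for task, block in tasks.items()}

# block -> DataNode holding it (state the proposed NameNode layer maintains)
block_locations = {"blk_1": "datanode-a", "blk_2": "datanode-b"}
tasks = {"map-0": "blk_1", "map-1": "blk_2", "map-2": "blk_9"}
print(assign_tasks(block_locations, tasks))
# {'map-0': 'datanode-a', 'map-1': 'datanode-b', 'map-2': 'any-node'}
```

Tasks scheduled onto the node that holds their data read from local disk instead of the network, which is the source of the time savings the paper targets.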
International Journal of Advance Research, Ideas and Innovations in Technology, 2019
The performance of financial institutions depends on various aspects that can affect their growth. Many of these are standardized measures such as Portfolio at Risk, Provision Expense Ratio, Risk Coverage Ratio and Write-Off Ratio. Alongside these, we introduce a scientific cost-distribution value that we name the Expense Accrual Ratio (EAR). Together with the other measurement parameters, EAR can be used to assess the health of a financial institution. In this paper, we explain the role and functioning of EAR along with ways to calculate it, giving formulas for its derivation. The effect of dependent parameters on EAR is explored and the resulting graphs are explained. Further, its impact on performance is analysed with data references. A comparison with the institution's current interest rates is made to arrive at a conclusion based on EAR. Along with the other parameters, this can be used as a powerful tool to adjust the functioning of a financial institution to avoid f...
Advances in Intelligent Systems and Computing, 2017
The key functionality of the proposed system is its ability to collect, handle and analyse huge volumes of different kinds of log data. When deployed in a network, it facilitates the collection of logs from different nodes across the network. This paper explains the proposed system, which collects the logs using Logstash; Logstash can handle many types of log data, which helps identify malicious activity in the network.
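The analysis step — turning raw log lines into structured records and flagging suspicious ones — can be sketched as below. This is a simplified stand-in for what Logstash filters would do in the actual system; the line format, field names and keyword list are assumptions for illustration.

```python
import re

# hypothetical keywords marking a log message as worth investigating
SUSPICIOUS = ("failed password", "denied", "segfault")

def parse_line(line):
    """Split a 'host message' log line into fields and flag suspicious events."""
    m = re.match(r"(?P<host>\S+)\s+(?P<msg>.+)", line)
    if not m:
        return None
    rec = m.groupdict()
    rec["suspicious"] = any(k in rec["msg"].lower() for k in SUSPICIOUS)
    return rec

logs = ["web01 Failed password for root from 10.0.0.5",
        "db01 backup completed"]
flags = [parse_line(line)["suspicious"] for line in logs]
print(flags)  # [True, False]
```

In the deployed system, each node ships its logs to a central Logstash pipeline, which applies equivalent parsing and tagging before the data is searched for malicious activity.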
Phishing is a common attack on gullible individuals that tricks them into revealing their unique information through fake websites. The goal of phishing URLs is to steal personal data such as user names, passwords and online banking transactions. Phishers use websites that are visually and semantically similar to genuine sites. As technology continues to develop, phishing techniques have started to advance rapidly, and this must be countered by using anti-phishing systems to detect phishing. Machine learning is a powerful tool for defending against phishing attacks. A novel approach is proposed to detect malicious and non-malicious URL links using the Extreme Learning Machine algorithm. Keywords— Phishing, Extreme Learning Machine, malicious URL, non-malicious URL
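The Extreme Learning Machine itself is simple to sketch: the hidden-layer weights are drawn at random and only the output weights are solved in closed form by least squares. The two URL features below (normalized length, presence of an IP literal) and the toy labels are assumptions, not the paper's feature set.

```python
import numpy as np

def elm_train(X, y, hidden=10, seed=0):
    """ELM training: random input weights, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))     # random, never updated
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# hypothetical features per URL: [normalized length, has IP literal]
X = np.array([[0.9, 1], [0.8, 1], [0.2, 0], [0.1, 0]], dtype=float)
y = np.array([1, 1, 0, 0], dtype=float)           # 1 = malicious
W, b, beta = elm_train(X, y)
pred = elm_predict(X, W, b, beta) > 0.5
print(pred.tolist())  # [True, True, False, False]
```

Because no iterative backpropagation is needed, training reduces to one matrix solve, which is what makes ELM attractive for fast URL screening.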
Content retrieval from large databases needs an efficient approach due to the increasing growth in digital images, and content-based image retrieval in particular is an extensive research area. It mainly involves retrieving similar images from a large dataset based on extracted features, which can be texture, colour, shape and so on. An efficient method for image retrieval based on shape features is proposed in this paper. The shape features used include the boundary and mode, computed using morphological operations, the Harris corner detector and the Voronoi diagram. Matching decisions can be made by different classification models; an SVM classifier is used in this work to obtain the best-matched images during retrieval. The proposed algorithm is evaluated on JPEG images and achieves an accuracy of about 90%.
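The Harris corner detector mentioned above scores each pixel by the response R = det(M) - k·trace(M)², where M is the locally aggregated structure tensor of image gradients. The sketch below uses a simple 3x3 box window to aggregate the tensor; a Gaussian window and non-maximum suppression, which a production detector would add, are omitted.

```python
import numpy as np

def box3(A):
    """3x3 box filter with zero padding, used to aggregate the structure tensor."""
    P = np.pad(A, 1)
    h, w = A.shape
    return sum(P[i:i + h, j:j + w] for i in range(3) for j in range(3))

def harris_response(img, k=0.04):
    """Harris response map: positive at corners, negative along edges."""
    Iy, Ix = np.gradient(img.astype(float))       # axis 0 = rows = y
    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

# a white square on black: its four corners should yield positive responses
img = np.zeros((10, 10))
img[3:7, 3:7] = 1.0
R = harris_response(img)
print(R.shape)  # (10, 10)
```

Corner locations extracted from such a response map feed the shape features on which the SVM makes its matching decisions.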
The algorithm for shape classification described in this paper is based on several steps. It analyses the contours of pairs of shapes: the contours are recovered and each represented by N points. Given two points pi and qj from the two shapes, the cost of matching them is evaluated using the shape context, and dynamic programming obtains the best matching between the point sets. Dynamic programming not only recovers the best matching but also identifies occlusions, i.e. points in the two shapes that cannot be properly matched, and it yields the minimum cost for matching a pair of shapes. After computing the pairwise minimum cost between the input shape and all reference shapes in the database, we sort the costs in ascending order and select the first two shapes to check whether they belong to the input class. If they do, the shape is classified as a perfect match; otherwise it is a mismatch.
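The dynamic-programming alignment can be sketched as follows. The cost matrix here holds made-up values standing in for shape-context costs, and the occlusion penalty is a hypothetical constant; the recurrence (match, or skip a point on either contour) is the standard sequence-alignment form the paper's method builds on.

```python
def dp_match_cost(costs, occlusion_penalty=1.0):
    """Minimum-cost alignment of two point sequences.
    costs[i][j] is the shape-context cost of matching point i of shape 1
    with point j of shape 2; skipping a point models an occlusion."""
    n, m = len(costs), len(costs[0])
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * occlusion_penalty            # all of shape 1 occluded
    for j in range(1, m + 1):
        D[0][j] = j * occlusion_penalty            # all of shape 2 occluded
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(D[i - 1][j - 1] + costs[i - 1][j - 1],  # match i, j
                          D[i - 1][j] + occlusion_penalty,         # occlude i
                          D[i][j - 1] + occlusion_penalty)         # occlude j
    return D[n][m]

costs = [[0.1, 0.9],
         [0.8, 0.2]]
print(dp_match_cost(costs))  # best path matches (0,0) and (1,1): 0.1 + 0.2
```

Running this for the input shape against every reference shape gives the pairwise minimum costs that are then sorted for classification.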
A technique for 2D shape recognition and retrieval is proposed, based on the 8-neighborhood pattern representing each point, or pixel, on the contour of the shape. These patterns are used as a framework for matching object shapes. Recognition and retrieval are conducted by traversing the contour of the shape and analysing each point on it through its 8-neighborhood pattern. The 8-neighborhood patterns are assigned unique labels, which are counted on every occurrence during contour traversal. The cost of the best match between shapes is evaluated by comparing the hit values obtained from the contour traversals of the shapes to be matched. Recognition and retrieval are carried out using the leave-one-out strategy and the standard bullseye score, respectively. The proposed method is tested on the MPEG-7 dataset and the chicken-pieces dataset. The results both for recognition and retrieval outperform mo...
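Labelling contour points by their 8-neighborhood can be sketched with a chain-code-style traversal: each step is labelled by which of the 8 neighbours the next contour point occupies, and occurrences are counted. The offset ordering below is one common convention and may differ from the paper's labelling scheme.

```python
from collections import Counter

# the 8 neighbour offsets, clockwise starting from east (an assumed ordering)
OFFSETS = [(0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def contour_labels(contour):
    """Label each step of a closed contour by the 8-neighborhood direction
    of the next point, and count label occurrences ('hit values')."""
    labels = []
    for (r, c), (nr, nc) in zip(contour, contour[1:] + contour[:1]):
        labels.append(OFFSETS.index((nr - r, nc - c)))
    return labels, Counter(labels)

# closed contour of a 2x2 pixel square, traversed clockwise
square = [(0, 0), (0, 1), (1, 1), (1, 0)]
labels, hits = contour_labels(square)
print(labels)  # [0, 2, 4, 6] — east, south, west, north
```

Comparing the hit counts of two contours gives the match cost used for recognition and retrieval.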
International Journal of Rough Sets and Data Analysis
In the area of computer vision and machine intelligence, image recognition is a prominent field, and several approaches are in use for 2D shape recognition based on shape-feature extraction. This paper suggests a subspace-method approach. In most earlier methods, the entire image is considered in the training and matching operations; with the sub-pattern approach, a given image is partitioned into many sub-images. Recognition is carried out in two steps: first, the ridgelet transform is used for feature extraction; second, PCA is used for dimensionality reduction. To measure recognition efficiency, a study was conducted using seventeen different distance-measure techniques. Training and testing use the leave-one-out strategy. The proposed method is tested on the standard MPEG-7 dataset, and the Ridgelet PCA results are compared with PCA results.
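The sub-pattern partitioning step can be sketched as slicing an image into a grid of sub-images, each of which would then get its own ridgelet-plus-PCA feature extraction. The block grid size is a free parameter; this sketch assumes the image dimensions divide evenly, and a list-of-lists stands in for a grey-level image.

```python
def sub_patterns(img, rows, cols):
    """Partition an image into a rows x cols grid of sub-images."""
    h, w = len(img), len(img[0])
    bh, bw = h // rows, w // cols          # block height and width
    return [[row[j * bw:(j + 1) * bw] for row in img[i * bh:(i + 1) * bh]]
            for i in range(rows) for j in range(cols)]

# a 4x4 toy image with distinct pixel values 0..15
img = [[r * 4 + c for c in range(4)] for r in range(4)]
blocks = sub_patterns(img, 2, 2)
print(len(blocks), blocks[0])  # 4 [[0, 1], [4, 5]]
```

Extracting features per block rather than from the whole image is what distinguishes the sub-pattern approach from the earlier whole-image methods the paper contrasts with.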
Papers by MUZAMEEL AHMED