Ph.D. in Computer Science, with expertise in network security and cryptography. More than 25 years of teaching experience and 40 research publications. Working with Noida International University.
Organizations in the modern era generate huge amounts of data by digitalizing the way they promote their services and products. Knowing what customers are saying about those products through reviews makes social media analytics a prime factor in the success of the big data era. However, social media analytics is a very complex discipline, because textual reviews are subjective and highly varied. A two-stage framework is proposed to tackle this problem: the first stage, discussed in this paper, covers sentiment analysis of social media through different machine learning approaches; the second stage discusses the challenges faced in processing social media sentiment. An overview of a case study is then presented for the two-stage sentiment analysis, applying the different approaches to customer reviews. Machine learning approaches are followed to analyze social media sentiment when processing big data.
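As a concrete illustration of the kind of machine learning pipeline the paper surveys, the following is a minimal sketch of review sentiment classification, assuming scikit-learn; the sample reviews, labels, and test sentences are hypothetical and not taken from the paper.

```python
# Minimal sentiment-classification sketch using scikit-learn.
# The reviews and labels below are illustrative, not data from the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = [
    "Great product, fast delivery and excellent quality",
    "Terrible experience, the item broke after one day",
    "Absolutely love it, will buy again",
    "Waste of money, very disappointed",
]
labels = ["positive", "negative", "positive", "negative"]

# TF-IDF turns each review into a weighted term vector; multinomial
# Naive Bayes is a common baseline classifier for short texts.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(reviews, labels)

# Most likely ['negative' 'positive'] given this toy training data.
print(model.predict(["the item is a waste of money",
                     "excellent quality and fast delivery"]))
```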
International Journal of Innovative Technology and Exploring Engineering (Blue Eyes Intelligence Engineering & Sciences Publication), 2019
An advanced incremental processing technique is proposed for data analysis so that clustering results can be kept up to date. Data arrives continuously from different data-generating sources such as social networks, online shopping, sensors, and e-commerce [1]. Because of this, the results of big data mining applications become stale and obsolete over time. Cloud intelligence applications regularly perform iterative computations (e.g., PageRank) on continuously changing datasets. Although prior studies extend MapReduce for efficient iterative computations, it is still expensive to run an entirely new large-scale MapReduce iterative job just to incorporate small changes to the underlying datasets in a timely manner. Our usage of MapReduce runs [4] on a large cluster of commodity machines and is highly scalable: a typical MapReduce computation processes many terabytes of data on thousands of machines. Programmers find the system easy to use: across many MapReduce applications we observe that, in many cases, the changes affect only a very small fraction of the dataset, and the newly converged state is very close to the previously converged state. I2MapReduce clustering exploits this observation to save recomputation by starting from the previously converged state [2] and by performing incremental updates on the changing data. The approach helps improve successive processing time and reduces the running time of refreshing big data mining results.
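The core idea, resuming an iterative computation from the previously converged state instead of recomputing from scratch, can be illustrated without a cluster. The sketch below is a single-machine analogue assuming a PageRank-style iteration; it is not the I2MapReduce implementation itself, and the example graph is hypothetical.

```python
# Single-machine sketch of the incremental-refresh idea: when the input
# graph changes slightly, restart the iteration from the previously
# converged ranks rather than from a cold start.
def pagerank(graph, ranks=None, damping=0.85, tol=1e-6, max_iter=200):
    nodes = list(graph)
    if ranks is None:                      # cold start: uniform ranks
        ranks = {n: 1.0 / len(nodes) for n in nodes}
    for iteration in range(max_iter):
        new_ranks = {}
        for n in nodes:
            incoming = sum(ranks[m] / len(graph[m])
                           for m in nodes if n in graph[m])
            new_ranks[n] = (1 - damping) / len(nodes) + damping * incoming
        delta = sum(abs(new_ranks[n] - ranks[n]) for n in nodes)
        ranks = new_ranks
        if delta < tol:                    # converged
            break
    return ranks, iteration + 1

graph = {"a": ["b"], "b": ["c"], "c": ["a"]}
ranks, _ = pagerank(graph)                    # initial computation

graph["a"].append("c")                        # small change to the dataset
_, warm_iters = pagerank(graph, ranks=dict(ranks))  # resume from old state
_, cold_iters = pagerank(graph)               # recompute from scratch
print(f"warm start: {warm_iters} iterations, cold start: {cold_iters}")
```

On such small deltas the warm start typically converges in noticeably fewer iterations, which is the saving the incremental approach targets.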
The increasing use of information systems has led to dramatic improvements in functionality with respect to safety, cost, and reliability. However, with this growth of information systems, the likelihood of vulnerabilities also increases. Security problems involving computers and software are frequent, widespread, and serious. The number and variety of attacks from outside organizations, particularly via the Internet, and the amount and consequences of insider attacks are increasing rapidly. We routinely hear customers claim that their system is insecure. However, without knowing what assumptions they make, it is hard to justify such a claim, and it is important to identify the security requirements of the system. Enumerating the security requirements of a system helps system architects develop realistic and meaningful secure software. In this paper, we propose a checklist for security requirements and assess security with the help of a metric based on a checklist threshold value.
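The abstract does not specify the checklist items or the metric's exact form, so the following is only a hedged sketch of one plausible shape: score weighted checklist items, aggregate to a ratio, and compare against a threshold. The items, weights, and threshold value are hypothetical.

```python
# Hypothetical checklist-based security metric: each requirement is
# marked satisfied or not, the weighted ratio gives a score, and a
# threshold decides acceptance. Items, weights, and the threshold are
# illustrative assumptions, not values from the paper.
checklist = {
    "authentication enforced":    (True,  3),   # (satisfied?, weight)
    "input validation in place":  (True,  3),
    "data encrypted in transit":  (False, 2),
    "audit logging enabled":      (True,  1),
}

THRESHOLD = 0.75  # assumed acceptance threshold

achieved = sum(w for ok, w in checklist.values() if ok)
total = sum(w for _, w in checklist.values())
score = achieved / total

print(f"security score = {score:.2f}")
print("meets threshold" if score >= THRESHOLD else "below threshold")
```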
The main objective of this work is to use the given dataset (information on different variables of the different kinds of banknotes that were analyzed) with an algorithm that groups the values in the data, as efficiently as possible, into two clusters; those clusters can then be used to build a model that can identify a forged note from a real one in the future. Forged banknotes are a major problem that banks must tackle, and it is very important to find a solution to stop this criminal activity. One of the most important steps is being able to identify which banknotes are forged. There are many solutions for identifying forged banknotes; in this manuscript, we propose one that uses the machine learning method known as "K-Means Clustering".
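A minimal sketch of this two-cluster approach follows. It assumes features in the style of the public banknote-authentication dataset (variance, skewness, kurtosis, and entropy of wavelet-transformed note images); the abstract does not name the variables, and the measurement rows below are made up for illustration.

```python
# K-Means sketch: group banknote measurements into two clusters
# (candidate "genuine" vs "forged"), then assign new notes to the
# nearest cluster centroid. Feature layout and values are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

X = np.array([
    [ 3.62,  8.67, -2.81, -0.45],   # illustrative measurements
    [ 4.55,  8.17, -2.46, -1.46],
    [-3.84, -8.15,  6.82,  0.34],
    [-2.41, -6.87,  5.59,  0.96],
])

# Standardize so no single feature dominates the distance computation.
scaler = StandardScaler().fit(X)
X_scaled = scaler.transform(X)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_scaled)
print(kmeans.labels_)                 # cluster assignment for each note

# An unseen note is scaled the same way and assigned to a cluster.
new_note = scaler.transform([[3.9, 8.3, -2.7, -0.9]])
print(kmeans.predict(new_note))
```

Which cluster corresponds to "forged" would be decided afterwards, e.g. by inspecting a few known-genuine notes.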
This paper examines the use of enterprise resource planning (ERP) solutions by institutes of higher education. In particular, it examines the infrastructure required to manage ERP software. ERP is a general term for the integrated systems used in data-processing organizations. Data mining provides a number of tools and techniques that enable analysis of such data sets, incorporating techniques from a number of fields including statistics, machine learning, database management, artificial intelligence, pattern recognition, and data visualisation. A service-oriented architecture (S.O.A.) is essentially a collection of services that communicate with each other; the communication can involve either simple data passing, or two or more services coordinating some activity [1]. However, it is important to note that S.O.A. is not an off-the-shelf technology; rather, it is a way of architecting and organizing I.T. infrastructure and business functions.
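To make the S.O.A. idea concrete, here is a toy sketch of two independent services that interact only through simple data passing, in the higher-education setting the paper discusses. The service names and payloads are hypothetical.

```python
# Toy S.O.A. illustration: loosely coupled services that exchange plain
# messages (dicts) rather than sharing a monolithic code base.
def enrollment_service(request):
    # Simple data passing: returns an enrollment record.
    return {"student_id": request["student_id"], "status": "enrolled"}

def billing_service(enrollment):
    # A second service coordinates with the first via its output.
    fee = 500 if enrollment["status"] == "enrolled" else 0
    return {"student_id": enrollment["student_id"], "fee_due": fee}

# An ERP workflow is then a composition of such services.
enrollment = enrollment_service({"student_id": "S-101"})
invoice = billing_service(enrollment)
print(invoice)   # {'student_id': 'S-101', 'fee_due': 500}
```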
International Journal of Artificial Intelligence and Knowledge Discovery, 2017
The business of any organization depends on the data it has stored and on being able to fetch that data quickly to take decisions in the current competitive era. Data compression is the art of reducing the number of bits needed to store or transmit data. The objective of this paper is to reduce the size of data through compression techniques using different algorithms. A new compression algorithm based on the J-bit encoding technique is proposed. The algorithm is based on designing, optimizing, and splitting the input data into two parts: 1 for "true" and 0 for "false". A data warehouse has a large volume of storage that J-bit encoding can exploit for large data blocks, where it is hard to reduce database size while minimizing retrieval time. The proposed data compression algorithm is named Enhanced J-Bit Encoding (EJBE). It is well suited to enumerated data and produces output for any format of data.
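The EJBE enhancements themselves are not detailed in the abstract, so the sketch below shows only the base J-bit encoding scheme it builds on: every input byte maps to one flag bit (1 for a non-zero byte, 0 for a zero byte), non-zero bytes go into one output stream and the flag bitmap into another, so long runs of zero bytes shrink to single bits.

```python
# Sketch of basic J-bit encoding (JBE). The split into a non-zero byte
# stream plus a 1/0 flag bitmap is the "true"/"false" division the
# abstract describes; the EJBE refinements are not shown.
def jbe_encode(data: bytes):
    nonzero = bytes(b for b in data if b != 0)
    bitmap = bytearray((len(data) + 7) // 8)
    for i, b in enumerate(data):
        if b != 0:
            bitmap[i // 8] |= 0x80 >> (i % 8)   # flag: byte was non-zero
    return len(data), nonzero, bytes(bitmap)

def jbe_decode(length, nonzero, bitmap):
    out = bytearray(length)
    it = iter(nonzero)
    for i in range(length):
        if bitmap[i // 8] & (0x80 >> (i % 8)):
            out[i] = next(it)                    # restore non-zero byte
    return bytes(out)

raw = b"\x05\x00\x00\x00\x09\x00\x07\x00\x00\x00\x00\x01"
encoded = jbe_encode(raw)
assert jbe_decode(*encoded) == raw               # lossless round trip
n, nz, bm = encoded
print(f"{len(raw)} bytes -> {len(nz)} data bytes + {len(bm)} bitmap bytes")
```

On this 12-byte example the payload shrinks to 4 data bytes plus a 2-byte bitmap; the gain grows with the proportion of zero bytes in the input.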
In recent years, improvement of software quality has been gaining more attention in business-driven organizations through the use of Goal Question Metric (GQM) methods. Software products often struggle with quality problems due to their size and complexity, and software engineers are often not able to handle such situations; many high-tech software projects turn out to be disastrous because of these problems. As the use of internet technology for obtaining information and services increases, the risks of potential liability, cost, and other negative consequences increase as well, because large numbers of security attacks are reported almost every day. One of the most important security problems is failing to address the security requirements of the whole project.
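For readers unfamiliar with GQM, the method refines a measurement goal into questions and each question into concrete metrics. The sketch below shows one way to represent such a model; the example goal, questions, and metrics are hypothetical, not taken from the paper.

```python
# Minimal representation of a Goal-Question-Metric (GQM) model.
# The goal, questions, and metrics here are illustrative assumptions.
gqm = {
    "goal": "Improve the security quality of the software product",
    "questions": [
        {
            "question": "Are security requirements defined early?",
            "metrics": ["% of requirements with security criteria",
                        "number of security reviews per phase"],
        },
        {
            "question": "How many security defects escape to release?",
            "metrics": ["post-release security defects per KLOC"],
        },
    ],
}

# Walk the model top-down, as GQM analysis does.
print("GOAL:", gqm["goal"])
for q in gqm["questions"]:
    print("  Q:", q["question"])
    for m in q["metrics"]:
        print("    metric:", m)
```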