  • At present working as Assistant Professor in the Department of Computer Science & Engineering (CSE), Babu Banarasi Das No...
In today's scenario, technological intelligence is a highly sought-after commodity, even in traffic-based systems. These intelligent systems help not only in traffic monitoring but also in commuter safety, law enforcement, and commercial applications. The proposed Saudi Arabian vehicle license plate recognition system is split into three major stages: first, extraction of the license plate region; second, segmentation of the plate characters; and finally, recognition of each character. The task is quite challenging due to the diversity of plate formats and the non-uniform outdoor illumination conditions during image capture. In this paper, recognition of the license plate characters is achieved by implementing a Learning Vector Quantization (LVQ) artificial neural network. The results, evaluated on Saudi Arabian vehicle license plate character recognition, show that the proposed technique is encouraging.
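As a rough illustration of the final recognition stage, the following is a minimal LVQ1 sketch in pure Python. The 2-D "character features", class labels, and training parameters below are invented for illustration and do not reflect the paper's actual network, features, or data.

```python
import random

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train_lvq(samples, labels, n_classes, epochs=30, lr=0.1, seed=0):
    """Learn one prototype per class with the LVQ1 update rule."""
    rng = random.Random(seed)
    # Initialize each class prototype from a random sample of that class.
    protos = [list(rng.choice([s for s, l in zip(samples, labels) if l == c]))
              for c in range(n_classes)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)  # decaying learning rate
        for x, y in zip(samples, labels):
            # Find the winning (best-matching) prototype.
            w = min(range(n_classes), key=lambda c: euclidean(x, protos[c]))
            sign = 1 if w == y else -1    # attract if correct, repel if not
            protos[w] = [p + sign * rate * (xi - p)
                         for p, xi in zip(protos[w], x)]
    return protos

def predict(protos, x):
    return min(range(len(protos)), key=lambda c: euclidean(x, protos[c]))

# Toy feature vectors: two well-separated clusters of "characters".
data = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = [0, 0, 1, 1]
protos = train_lvq(data, labels, n_classes=2)
print(predict(protos, (0.15, 0.15)), predict(protos, (0.85, 0.85)))  # 0 1
```

In a real plate-recognition pipeline, the inputs would be feature vectors extracted from segmented character images rather than hand-made 2-D points.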
Only a few years ago, people carried their documents around on CDs; more recently, many turned to memory sticks. Cloud computing, by contrast, refers to the capacity to access and edit data stored on remote servers from any Internet-connected platform. Cloud computing is a self-service Internet infrastructure that allows people to access computing resources from any location worldwide, and it can be thought of as a new computing typology that provides on-demand services at low cost. By increasing the capacity and flexibility of data storage and providing scalable compute and processing power that fits dynamic data requirements, cloud computing has lifted IT to greater heights. Privacy and data security have long been serious concerns in information technology, and they become more severe in the cloud computing environment because data is stored in multiple locations, often across the globe. Users' primary challenges regarding cloud technology accordingly revolve around data security and privacy. In this study, we conduct a thorough assessment of the literature on data security and privacy issues, data encryption technologies, and related countermeasures in cloud storage systems. Ubiquitous network connectivity, location-independent resource pooling, rapid resource elasticity, usage-based pricing, and risk transference are all features of cloud computing.
As the Internet continues to expand, immense numbers of people around the globe join it. The Internet of Things (IoT) can be defined as the interconnection of uniquely identifiable embedded computing devices within the existing Internet infrastructure. This paradigm encompasses an infrastructure of software, hardware, and services that links tangible objects, called things, to the Internet. With the rapid rise of multimedia gadgets and devices, IoT deployments will generate multimedia big data, that is, the huge volume of data produced by multimedia devices. Multimedia devices need greater processing and memory resources to handle the information they capture, and IoT systems will fail to realize multimedia-device connectivity unless they can process these devices' streams in a timely manner. In this paper, we introduce the concept of the Internet of Multimedia Things (IoMT) for multimedia communications in the IoT. IoMT communications play a vital role in IoT applications such as traffic control and handling, environmental monitoring, healthcare, observation and surveillance, event recognition, and home monitoring and automation. We present a comprehensive survey of IoMT and future research directions. IoMT applications such as real-time multimedia-based security and monitoring in smart homes, smart agriculture, multispecialty hospitals, metropolitan areas, and smart transportation systems are among the most difficult systems to deploy.
The rise of intelligent environments implies the interconnectivity of applications and the use of the Internet; from this arose what is known as the Internet of Things (IoT). The expansion of the IoT concept in turn gives rise to the Internet of Nano Things (IoNT): a new communication-network paradigm based on nanotechnology and the IoT, that is, a paradigm with the capacity to interconnect nano-scale devices through existing networks. The IoNT is a system of connected nano-scale devices, objects, or organisms that carry unique identifiers and transfer data wirelessly over a computer or cellular network to the cloud. Data delivery, caching, and energy consumption are among the most significant topics in the IoNT today. The nano-network paradigm can empower consumers to improve their well-being by connecting data to personalized analysis and timely insights, and real-time data can be used in a diversity of nano-applications, from preventive treatment to diagnostics and rehabilitation. This paper clearly explains the IoNT, its architecture and challenges, its role in the global market, and its applications in various domains; the IoT itself has provided countless new opportunities to create powerful industrial structures. Key applications for IoNT communication include healthcare, transportation and logistics, defense and aerospace, media and entertainment, manufacturing, oil and gas, high-speed data transfer and cellular, multimedia, immune-system support, and other services. Finally, since security is considered one of the main issues of the IoNT, we provide an in-depth discussion of security, communication networks, and IoNT market trends.
Big data is a name used ubiquitously nowadays for the distributed paradigm on the web. As the name indicates, it covers collections of very large data sets, in petabytes, exabytes, and beyond, together with the systems and algorithms used to analyze this enormous data. Hadoop has proven to be the go-to big data technology for processing such enormous data sets. MapReduce is a prominent solution for computations that need only one pass to complete, but it is not efficient for use cases requiring multi-pass computations and algorithms: the job output of every stage has to be stored in the file system before the next stage can begin, so this method is slow due to disk input/output operations and replication. Additionally, the Hadoop ecosystem does not have every component needed to complete a big data use case end to end. To run an iterative job, you must stitch together a sequence of MapReduce jobs and execute them in order; each job has high latency, and each depends on the completion of the previous stage. Apache Spark is one of the most widely used open-source processing engines for big data, with rich language-integrated APIs and an extensive range of libraries. It is a general framework for distributed computing that offers high performance for both batch and interactive processing. In this paper, we aim to give a close-up view of Apache Spark, its features, and working with Spark on Hadoop. We briefly discuss Resilient Distributed Datasets (RDDs), RDD operations, features, and limitations. Spark can be used alongside MapReduce in the same Hadoop cluster or on its own as a processing framework. We conclude with a comparative analysis of Spark versus Hadoop MapReduce.
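The lazy, in-memory pipeline that distinguishes Spark's RDDs from stage-by-stage MapReduce can be caricatured in a few lines of plain Python. This toy class only mimics the execution model (transformations are recorded lazily; an action runs the whole pipeline in memory); it is not the PySpark API and is purely illustrative.

```python
from functools import reduce

class ToyRDD:
    """Toy sketch of an RDD: transformations are deferred, actions execute."""

    def __init__(self, data, ops=None):
        self._data = data
        self._ops = ops or []  # deferred transformation pipeline

    def map(self, f):          # transformation: recorded, not executed
        return ToyRDD(self._data, self._ops + [("map", f)])

    def filter(self, f):       # transformation: recorded, not executed
        return ToyRDD(self._data, self._ops + [("filter", f)])

    def collect(self):         # action: run the whole pipeline in memory
        items = iter(self._data)
        for kind, f in self._ops:
            items = map(f, items) if kind == "map" else filter(f, items)
        return list(items)

    def reduce(self, f):       # action
        return reduce(f, self.collect())

rdd = ToyRDD(range(1, 6))
squares_of_evens = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
print(squares_of_evens.collect())              # [4, 16]
print(rdd.reduce(lambda a, b: a + b))          # 15
```

Because no intermediate result is written to disk between `filter` and `map`, an iterative job can chain many transformations cheaply; real Spark adds partitioning, fault tolerance via lineage, and cluster scheduling on top of this idea.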
In 2020, the world generated 52 times the amount of data it did in 2010, drawn from 76 times the number of information sources. Being able to use this data provides huge opportunities, and to turn these opportunities into reality, people need to use data to solve problems, especially in the midst of a global pandemic, when people throughout the world are looking for reliable, trustworthy information about COVID-19 (coronavirus). In this scenario, Tableau plays a vital role, because it is an extremely powerful tool for visualizing massive sets of data very easily. It has an easy-to-use drag-and-drop interface, lets you build beautiful visualizations in a short amount of time, and supports a wide array of data sources. Analyzing COVID-19 data with Tableau, you can create dashboards that help identify the story within the data and better understand the impact of COVID-19. This paper gives a comprehensive review of Tableau, a tool that handles big data analytics and generates output as visualizations, a form that is more understandable and presentable; its features include data blending, real-time reporting, and data collaboration. In the end, this paper gives a clear picture of the growing COVID-19 data and the tools that can help analyze it more effectively, accurately, and efficiently.
At present, huge numbers of research articles are available on the World Wide Web in every domain. A research scholar must explore many research papers to get the appropriate information, which costs the researcher time and effort; in this scenario, a researcher needs a way to search for related work starting from a given research article. In the present paper, a method of knowledge extraction from a collection of research articles is presented to build a research paper recommendation system (RPRS), which generates recommendations for research articles based on the researcher's choice. The RPRS accumulates the knowledge extracted from the pertinent research articles in the form of a semantic tree, which stores all the textual sub-parts with their counts in its nodes. These parts are arranged by type so that the leaf nodes store words with their probabilities, the layer above describes sentences with their counts, and the layer above that an abstract. A Bayesian network is applied to construct a probabilistic model that extracts the pertinent information from the knowledge tree to build the recommendation, and words are scored by their TF-IDF values.
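The word-scoring step can be sketched with a minimal TF-IDF computation. This is only an illustration of how leaf-node words could be weighted; the example documents are invented, and the paper's semantic-tree structure and Bayesian model are not reproduced here.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Per-document TF-IDF weights for a list of tokenized documents."""
    n = len(docs)
    df = Counter()                      # document frequency of each word
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({w: (tf[w] / len(doc)) * math.log(n / df[w])
                        for w in tf})
    return weights

docs = [["bayesian", "network", "model"],
        ["semantic", "tree", "model"],
        ["bayesian", "recommendation"]]
w = tf_idf(docs)
# "model" appears in two of the three documents, so it scores lower than
# "network", which is unique to the first document.
print(w[0]["network"] > w[0]["model"])  # True
```

A word that is frequent in one article but rare across the collection thus receives a high score, which is exactly the property a recommender needs when matching articles to a researcher's interests.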
As there is an enormous amount of online research material available, finding pertinent information for a specific purpose has become a tedious chore. So there is a need for research paper recommendation systems that help research scholars find relevant papers of interest. Many paper recommendation systems exist, but most depend on paper collections, references, user profiles, or mind maps, information that is generally not easily available. The majority of prevailing recommender systems are based on collaborative filtering, which relies on other users' preferences; content-based methods, on the other hand, use information about an item itself to make a recommendation. In this paper, we present a research paper recommendation method that is based on a single paper. Our method uses a content-based recommendation approach that employs information extraction and text categorization: it performs profile learning using a naive Bayesian text classifier and generates recommendations on the basis of an individual's preferences.
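The profile-learning step can be sketched as a multinomial naive Bayes text classifier with Laplace smoothing. The class labels and training snippets below are invented for illustration; the paper's actual feature extraction and training corpus are not shown here.

```python
import math
from collections import Counter

def train_nb(docs, labels):
    """Count words per class and build the vocabulary."""
    classes = sorted(set(labels))
    word_counts = {c: Counter() for c in classes}
    class_counts = Counter(labels)
    vocab = set()
    for doc, c in zip(docs, labels):
        word_counts[c].update(doc)
        vocab.update(doc)
    return classes, word_counts, class_counts, vocab

def classify(model, doc):
    classes, word_counts, class_counts, vocab = model
    n_docs = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for c in classes:
        total = sum(word_counts[c].values())
        lp = math.log(class_counts[c] / n_docs)       # class prior
        for w in doc:
            # Laplace smoothing so unseen words do not zero the probability.
            lp += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

docs = [["neural", "network", "training"],
        ["deep", "network", "layers"],
        ["court", "ruling", "appeal"],
        ["legal", "court", "case"]]
labels = ["cs", "cs", "law", "law"]
model = train_nb(docs, labels)
print(classify(model, ["network", "training"]))  # cs
print(classify(model, ["court", "case"]))        # law
```

In a single-paper recommender, the "documents" would be candidate papers' text, and the learned class model would represent the user's positive and negative preferences.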
The IoT has primarily focused on establishing connectivity in a diversity of constrained networking environments, and the next logical aim is to build on top of network connectivity by focusing on the application layer. In the Web of Things (WoT), we think of smart things as first-class citizens of the Web. We position the Web of Things as a refinement of the Internet of Things: smart things are integrated not only into the Internet (the network layer) but into the Web (the application layer). The Web of Things is a computing concept that describes a future where day-to-day objects are fully integrated with the Web. The WoT is very similar to the IoT in some ways and drastically different in others. The requirement for the WoT is that the "things" have embedded computer systems that enable communication with the Web; such smart devices can then communicate with each other using current Web standards. For instance, well-known Web languages such as PHP, HTML, Python, and JavaScript can be used to easily build applications involving smart things, and users can leverage familiar Web mechanisms such as caching, browsing, searching, and bookmarking to interact with and share these devices. In this paper, we aim to give a close-up view of the Web of Things, including its architecture, open WoT platforms, Web-enabling devices, WoT security, and WoT use cases. Under the WoT concept, smart things and their services are fully integrated into the Web by reusing and adapting technologies and patterns commonly used for conventional Web content.
Recent years have seen the swift development and deployment of Internet of Things (IoT) applications in a variety of domains, and people worldwide are now ready to enjoy the benefits of the IoT. The IoT is emerging as the third wave in the evolution of the Internet: the 1990s' Internet wave connected 1.2 billion subscribers, the 2000s' mobile wave connected another 2.4 billion, and according to IDC, the IoT is expected to comprise more than 84 billion connected devices generating 186 zettabytes of data by 2025. It spans the major network types, such as distributed, ubiquitous, grid, and vehicular networks, that have dominated the world of information technology over the past decade. The IoT is growing fast across several industry verticals, with increases in both the number of interconnected devices and the diversity of IoT applications. In spite of this, IoT technologies have not yet reached maturity, and there are many challenges to overcome. The Internet of Things combines the actual and the virtual anywhere and anytime, attracting the attention of both builders and hackers, and inevitably, leaving devices without human intervention for long periods invites theft, a risk the IoT multiplies. In this paper, we briefly discuss the technological perspective of IoT security, which is the most significant of these challenges. Protection was a major concern when just two devices were coupled; today there are millions of connected devices and billions of sensors, their numbers are growing, and all of them expect secure and reliable connectivity. Consequently, companies and organizations adopting IoT technologies require well-designed IoT security architectures.
The gigantic growth of information on the Internet makes discovering information challenging and time-consuming. We are surrounded by a plethora of data in the form of blogs, papers, reviews, and comments on different websites. Recommender systems offer a solution to this situation by automatically capturing user interests and recommending related information the user may also find relevant. The purpose of developing recommender systems is to reduce information overload by retrieving the most pertinent knowledge and services from an enormous amount of data, thereby providing personalized services. The most vital feature of a recommender system is its ability to infer a user's preferences and interests by examining the behavior of this user and/or the behavior of other users to generate personalized recommendations. Although considerable research has been done in this area, little of it has been consolidated and appraised. In this paper, we give a brief summary of the shortcomings of available recommender systems, and we analyze these shortcomings to motivate a new method that addresses them.
Data mining techniques have the ability to discover hidden patterns and correlations among the objects in medical data. Many areas adopt data mining techniques, notably marketing, stock trading, and the healthcare sector. The healthcare industry produces gigantic quantities of data that hold complex information about patients and their medical conditions, and data mining has great potential to use healthcare data more effectively and efficiently to predict various kinds of disease. In present-day healthcare, heart ailment is a term that refers to an enormous number of medical conditions related to the heart, conditions involving unexpected health events that directly affect the cardiac system. In this paper, we use the ROCK algorithm because it applies the Jaccard coefficient, rather than distance measures, to find the similarity between data items or documents when forming clusters; this mechanism for grouping clusters based on a similarity measure is applied over a given data set. Afterwards, the C4.5 algorithm is used as the training algorithm to show the grade of a cardiac ailment with a decision tree. C4.5 can be regarded as a statistical classifier; it uses the gain ratio for feature selection and for building the decision tree, and it is widely used because of its fast classification and high accuracy. Lastly, the cardiac ailment database is clustered using K-means clustering, which separates the data relevant to cardiac sickness from the rest of the database.
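The Jaccard-based similarity that ROCK substitutes for distance measures can be shown in a few lines, together with the neighbor test from which ROCK builds its links. The categorical patient records below are invented for illustration; the full ROCK link and goodness computations are not reproduced.

```python
def jaccard(a, b):
    """|A ∩ B| / |A ∪ B| for two sets of categorical attributes."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def neighbours(records, theta):
    """Pairs of records whose Jaccard similarity is at least theta."""
    return [(i, j)
            for i in range(len(records))
            for j in range(i + 1, len(records))
            if jaccard(records[i], records[j]) >= theta]

patients = [
    {"chest_pain", "high_bp", "smoker"},
    {"chest_pain", "high_bp"},
    {"no_symptoms", "non_smoker"},
]
print(jaccard(patients[0], patients[1]))  # 0.666...
print(neighbours(patients, theta=0.5))    # [(0, 1)]
```

For categorical medical attributes like these, set overlap is more meaningful than Euclidean distance, which is exactly why ROCK prefers the Jaccard coefficient for clustering such records.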
Blockchain has swiftly become one of the most dominant and promising technologies of the past few years. Information security is key to the development of contemporary Internet technology, and the distributed, scripted, cryptographic, and decentralized mechanisms of the blockchain present an entirely new perspective for Internet information-security technology. A blockchain is a distributed database that maintains a continuously growing list of records, called blocks, that are secured against any kind of tampering and revision. A word that often comes up when talking about blockchain is Bitcoin, and many people still confuse the two, although they are not the same: Bitcoin is just one of several applications that use blockchain technology. In a blockchain, every block contains a timestamp and a link to the previous block. Blockchain raises the level of security of a system, from both the system and data perspectives, for both private and public ledgers: rather than uploading data to a cloud server or storing it in a single location, everything is broken into small chunks and distributed across the whole network of computers. In this paper, we conduct a comprehensive survey of blockchain security and of the challenges and opportunities for the security and privacy of data in blockchains. In its present state, several leading companies and governments are running demonstrations of blockchain integrated into identity management, credential validation, finance, supply chains, property-exchange recording, and other territories.
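The "secured against tampering and revision" property follows directly from hash chaining, which a minimal sketch can demonstrate: each block commits to the previous block's hash, so editing one record breaks every later link. The record contents below are invented, and real blockchains add consensus, signatures, and distribution on top of this core idea.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's serialized contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain):
    """Every block's prev_hash must match the recomputed previous hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for record in ["genesis", "alice pays bob 5", "bob pays carol 2"]:
    add_block(chain, record)
print(is_valid(chain))                     # True
chain[1]["data"] = "alice pays bob 500"    # tamper with a middle block
print(is_valid(chain))                     # False
```

Because the middle block's hash changed, the next block's stored `prev_hash` no longer matches, and validation fails; an attacker would have to rewrite every subsequent block, which consensus mechanisms are designed to make infeasible.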
Soft computing approaches have varied capabilities for error optimization when controlling complex system parameters; they provide learning and decision-making support drawn from relevant datasets or from experts' review experiences, and they can handle a variety of environmental and stability-related uncertainties. This paper explains different soft computing approaches, namely genetic algorithms and fuzzy logic, together with the results of several error-optimization control case studies. Conventional error-optimization control relies on mathematical models that define the dynamic control, and conventional controllers are often inferior to intelligent controllers due to their lack of comprehensibility. The results show that intelligent controllers provide better control of errors than conventional controllers. Hybridization of techniques, such as fuzzy logic with genetic algorithms, provides better optimization control for designing and developing intelligent systems.
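The kind of error-optimization loop a genetic algorithm performs can be sketched minimally: a population of candidate parameters is evolved by selection, crossover, and mutation to minimize an error function. The target function, operators, and parameters below are illustrative only and do not come from the paper's case studies.

```python
import random

def evolve(fitness, n_genes=1, pop_size=30, generations=60, seed=1):
    """Minimize `fitness` with truncation selection, averaging crossover,
    and Gaussian mutation. Parents survive each generation (elitism)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                  # lower error is better
        parents = pop[: pop_size // 2]         # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover
            if rng.random() < 0.3:                        # mutation
                i = rng.randrange(n_genes)
                child[i] += rng.gauss(0, 0.5)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# Error to minimize: squared distance of the single gene from 3.
best = evolve(lambda g: (g[0] - 3) ** 2)
print(abs(best[0] - 3) < 0.5)  # True: the population converges near 3
```

In a fuzzy-genetic hybrid of the kind the paper discusses, the genes would instead encode fuzzy membership-function parameters, and the fitness would be the controller's error on the plant.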
Nowadays, well-accepted soft computing methods such as neural and fuzzy sets, adaptive evolutionary computing, neuro-fuzzy systems, and computational intelligence are used for the simulation and analysis of numerically varying data, and these approaches have been applied independently to different engineering problems. In this paper, we review the soft computing techniques that researchers use for solving real-domain problems and for simulating aspects of intelligence, such as adaptation and learning, in real-time engineering problems. Computational intelligence is a universal approximator with great capability for non-linear mapping and optimization. We also give a brief overview of intelligent systems based on neural networks, discussing their features and constituents for real problems, and we concentrate on solving real-domain problems with neural networks as a component of intelligent systems within computational intelligence and soft computing.
Recommender systems are widely seen as an effective means to combat information overload, as they help us narrow down the number of items to choose from and assist us in making better decisions at a lower transaction cost. Hence, recommender systems have become omnipresent in e-commerce and are also increasingly used in other domains, both online and offline, wherever the number of items exceeds our ability to consider them all individually. Research paper recommender systems are software applications that help individual users discover the research papers most relevant to their needs. These systems use filtering techniques to create recommendations, categorized mainly into collaborative filtering, content-based techniques, and hybrid algorithms. In addition, they assist in decision-making by providing both personalized and non-personalized product information, summarizing community opinion, searching research papers, and providing community critiques. As a result, recommender systems have been shown to improve decision-making.
In the last twelve years, the number of web users has increased so intensely that web services have advanced rapidly, enlarging usage data at ever higher rates. The purpose of a recommender system is to generate meaningful recommendations to a collection of users for items or products that might interest them; recommender systems differ in the way they analyze their data sources to develop notions of affinity between users and items, which can then be used to identify well-matched pairs. Recommender system technology aims to help users find items that match their personal interests, and it has been used successfully in e-commerce applications to deal proficiently with information overload. In this paper, we present an extensive survey of six existing kinds of recommendation system. Collaborative filtering systems analyze historical interactions alone, while content-based filtering systems are based on profile attributes; hybrid techniques attempt to combine both of these designs; demographic-based recommender systems categorize users by personal attributes and make recommendations based on demographic classes; knowledge-based recommendation attempts to suggest objects based on inferences about a user's needs and preferences; and utility-based recommender systems make recommendations based on computing the utility of each item for the user. We identified 60 research papers on recommender systems published between 1971 and 2014. Few of these papers have influenced research paper recommender systems in practice, and we also found a lack of sustained, long-term research interest in the field: 78% of the authors published no more than one paper on research paper recommender systems, and there was little cooperation among different co-author groups.
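The first of the six designs, collaborative filtering, can be sketched as a similarity-weighted average of other users' ratings. The rating matrix below is invented for illustration, and real systems add mean-centering, neighborhood selection, and sparsity handling on top of this core computation.

```python
import math

def cosine(u, v):
    """Cosine similarity between two users' rating dicts."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

def predict(ratings, user, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else None   # None: no one has rated the item

ratings = {
    "ann": {"paper_a": 5, "paper_b": 4},
    "bob": {"paper_a": 5, "paper_b": 5, "paper_c": 4},
    "eve": {"paper_a": 1, "paper_c": 1},
}
# ann agrees strongly with bob and weakly with eve, so the prediction for
# paper_c is pulled toward bob's rating of 4.
print(round(predict(ratings, "ann", "paper_c"), 2))  # 2.83
```

A content-based system would instead compare paper_c's own attributes against ann's profile, and a hybrid design would combine both scores.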
Big data is a term used ubiquitously nowadays in the distributed paradigm on the web. As the name suggests, it refers to collections of very large data sets, in petabytes, exabytes, and beyond, along with the systems and algorithms used to analyze this enormous data. Hadoop has proven to be the go-to technology for processing such data sets. MapReduce is a prominent solution for computations that require a single pass over the data, but it is not efficient for use cases that need multi-pass computations: the output of each stage must be written to the file system before the next stage can begin, so the approach is slowed by disk I/O and replication. Additionally, the Hadoop ecosystem lacks components for every part of a big data use case; to run an iterative job, one must stitch together a sequence of MapReduce jobs and execute them one after another, where each job has high latency and depends on the completion of the previous stage. Apache Spark is one of the most widely used open-source processing engines for big data, with rich language-integrated APIs and an extensive range of libraries. It is a general framework for distributed computing that offers high performance for both batch and interactive processing. In this paper, we aim to provide a close-up view of Apache Spark, its features, and how it works with Hadoop. We briefly discuss Resilient Distributed Datasets (RDDs), RDD operations, their features, and their limitations. Spark can be used alongside MapReduce in the same Hadoop cluster or on its own as a processing framework. The paper concludes with a comparative analysis of Spark and Hadoop MapReduce.
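The contrast drawn above between one-pass MapReduce and Spark's in-memory multi-pass model can be illustrated with a toy, lazily evaluated dataset. This is not the actual Spark API, only a sketch of the idea behind RDDs: transformations such as map and filter are lazy, collect() triggers evaluation, and cache() keeps an intermediate result in memory so later passes avoid recomputation.

```python
class ToyRDD:
    """A toy lazy dataset mimicking the spirit of Spark RDDs (not Spark's API)."""

    def __init__(self, source):
        self._source = source        # zero-argument callable yielding items
        self._cached = None

    @classmethod
    def from_list(cls, data):
        return cls(lambda: iter(data))

    def map(self, f):
        # Lazy: builds a new dataset, computes nothing yet.
        return ToyRDD(lambda: (f(x) for x in self._iter()))

    def filter(self, pred):
        return ToyRDD(lambda: (x for x in self._iter() if pred(x)))

    def cache(self):
        # Materialize once; later passes read from memory instead of
        # recomputing the whole lineage (Spark persists this on the cluster).
        self._cached = list(self._source())
        return self

    def _iter(self):
        return iter(self._cached) if self._cached is not None else self._source()

    def collect(self):
        # Action: triggers evaluation.
        return list(self._iter())

nums = ToyRDD.from_list(range(10)).filter(lambda x: x % 2 == 0).cache()
# Two passes over the same cached intermediate result:
print(nums.map(lambda x: x * x).collect())   # [0, 4, 16, 36, 64]
print(nums.map(lambda x: x + 1).collect())   # [1, 3, 5, 7, 9]
```

In Hadoop MapReduce, the equivalent of the cached intermediate result would be written to HDFS between jobs, which is exactly the disk I/O and replication overhead the abstract describes.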
Big data introduces new technology, skills, and processes to your information architecture and to the people who design, operate, and use it. Big data denotes a holistic information management approach that incorporates and integrates many new types of data and data management alongside conventional data. Hadoop is an open-source software framework licensed under the Apache Software Foundation that supports data-intensive applications running on large grids and clusters, offering scalable, reliable, distributed computing. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. In this paper, we discuss a taxonomy for big data and the Hadoop technology. Ultimately, big data technologies are essential for more effective analysis, which can lead to more concrete decision-making and, in consequence, greater operational efficiency, cost reduction, and reduced risk for the business. We conclude by presenting the taxonomy of big data and the components of Hadoop.
In today's scenario, technological intelligence is a highly sought-after commodity, even in traffic-based systems. These intelligent systems help not only in traffic monitoring but also in commuter safety, law enforcement, and commercial applications. The proposed Saudi Arabian vehicle license plate recognition system splits into three major parts: first, extraction of the license plate region; second, segmentation of the plate characters; and last, recognition of each character. The task is quite challenging due to the diversity of plate formats and the non-uniform outdoor illumination conditions during image collection. In this paper, recognition of the license plates is achieved by implementing a Learning Vector Quantization (LVQ) artificial neural network. The results cover the complete Saudi Arabian vehicle license plate character set, and the proposed technique has produced encouraging results.
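The abstract does not give implementation details, but the core LVQ1 update rule, which moves the winning prototype toward a correctly classified sample and away from a misclassified one, can be sketched as follows. The 2-D toy data, class labels, and learning rate here are hypothetical stand-ins for character feature vectors.

```python
import math

def nearest(prototypes, x):
    """Index of the prototype closest to sample x (Euclidean distance)."""
    return min(range(len(prototypes)),
               key=lambda i: math.dist(prototypes[i][0], x))

def lvq1_step(prototypes, x, label, lr=0.1):
    """One LVQ1 update: attract the winner if its label matches, else repel."""
    i = nearest(prototypes, x)
    w, w_label = prototypes[i]
    sign = 1.0 if w_label == label else -1.0
    prototypes[i] = ([wj + sign * lr * (xj - wj) for wj, xj in zip(w, x)],
                     w_label)

# Two labelled prototypes (vector, class) and a few toy labelled samples.
prototypes = [([0.0, 0.0], "A"), ([5.0, 5.0], "B")]
samples = [([1.0, 1.0], "A"), ([4.0, 4.0], "B"), ([0.5, 0.0], "A")]
for _ in range(20):
    for x, y in samples:
        lvq1_step(prototypes, x, y)

# After training, each sample is classified by the label of its winner.
print([prototypes[nearest(prototypes, x)][1] for x, y in samples])
```

In a plate recognizer, each class would be a character of the Saudi plate alphabet and x a feature vector extracted from a segmented character image; classification is simply nearest-prototype lookup.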
Forecasting is the process of estimating unknown situations from historical data. Financial forecasting and planning is usually an essential part of a business plan and is typically done when setting up an organization and raising funds. It is also essential to good management, since keeping the organization financially sound is a key objective. Stock market prediction has long been a tempting topic for researchers from different fields. Stock analysts use various forecasting methods to determine how a stock's price will move in the following days. The purpose of this paper is to explore the radial basis function (RBF) network and the functional link artificial neural network (FLANN) for forecasting financial data. We built and compared our models using historical data from the Bombay Stock Exchange (BSE). The RBF and FLANN parameters are updated by particle swarm optimization (PSO). We examine the algorithms on a number of criteria, including error convergence and the Mean Absolute Percentage Error (MAPE), and carry out a comparative assessment of RBF and FLANN. The proposed method can indeed help investors consistently realize gains. Finally, a simple trading model is established to study the performance of the proposed prediction algorithm against other benchmarks.
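The paper does not spell out its PSO configuration, so as a minimal sketch under assumed settings, the following shows the standard global-best PSO velocity/position update used to tune model parameters, here minimizing a toy sphere objective standing in for a forecasting error, together with the MAPE measure named above.

```python
import random

random.seed(42)

def sphere(x):
    """Toy objective standing in for a forecasting error to be minimized."""
    return sum(v * v for v in x)

def mape(actual, forecast):
    """Mean Absolute Percentage Error, the accuracy measure used in the paper."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)

def pso(obj, dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO: each velocity mixes inertia, attraction to
    the particle's own best position, and attraction to the swarm's best."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=obj)
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if obj(pos[i]) < obj(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=obj)
    return gbest

best = pso(sphere)
print(sphere(best))                          # converges toward 0
print(mape([100.0, 200.0], [110.0, 190.0])) # ≈ 7.5 (%)
```

In the paper's setting, `obj` would evaluate the RBF or FLANN forecast error on BSE training data, and each particle would encode one candidate weight vector.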
In order to survive and stay ahead in today's competitive world, companies are stretched to their limits in the search for organizational skills and technologies. Among these, Supply Chain Management (SCM) and Enterprise Resource Planning (ERP) are the two most frequently used terms. The most important factors here are improving the speed of production, at least cost and with greater efficiency, in order to stay in and survive the competition in the present globalized economic scenario. There is a serious need to integrate information across supply chains and to plan enterprise resources properly. On one hand, supply chains enhance the efficiency of the movement of various inputs; on the other hand, ERP improves the overall efficiency of resources to bring down the cost of production and operations. Once a firm is able to achieve the best-quality output at least cost, it can attract more consumers to its products. This research paper throws light on how the integration of SCM and ERP would benefit a company in achieving greater competitive advantage, with the case examples of Cadbury and Nokia.
The breakthrough in wireless networking has prompted a new concept of computing, called mobile computing, in which users carrying portable devices have access to a shared infrastructure independent of their physical location. Mobile computing is becoming increasingly vital due to the growing number of portable computers and the desire for continuous network connectivity to the Internet irrespective of the physical location of the node. Mobile computing systems are computing systems that can readily be moved physically and whose computing capability can be used while they are being moved. Mobile computing has rapidly become a vital new paradigm in today's world of networked computing systems. It encompasses software, hardware, and mobile communication. Ranging from wireless laptops to cellular phones, and from WiFi/Bluetooth-enabled PDAs to wireless sensor networks, mobile computing has become ubiquitous in its influence on our daily lives. In this paper, various types of mobile devices are discussed and examined in detail, along with the operating systems most prominent on those devices. Another aim of this paper is to point out some of the characteristics, applications, limitations, and issues of mobile computing.
In this paper, we present recognition of handwritten characters of Arabic script. Arabic is now the sixth most spoken language in the world, spoken by more than 200 million people worldwide. In the 7th century A.D., Arabic started to spread to the Middle East as many people converted to Islam. During this time of religious conversion, Arabic replaced many South Arabian languages, most of which are no longer commonly spoken or understood. The challenges in Arabic handwritten character recognition lie mainly in the variation and distortion of handwritten characters, since different writers may use different handwriting styles and stroke directions to draw the same character shapes. Although many new trends and technologies have emerged, handwriting still plays an important role. Different strategies exist for recognizing Arabic handwritten data, such as Simplified Fuzzy ARTMAP and Hidden Markov Models (HMM). In this paper, we use Simplified Fuzzy ARTMAP, an updated version of predictive Adaptive Resonance Theory. It can also adjust its clusters to the requirements of Arabic script, which helps mitigate noise. We have tested our method on Arabic scripts and obtained encouraging results with the proposed technique.
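The abstract does not reproduce the network itself, but the core Fuzzy ART step inside (Simplified) Fuzzy ARTMAP, the category choice function, vigilance test, and fast learning that let it "adjust clusters", can be sketched as below. The vigilance, choice parameter, and 2-D inputs are hypothetical; real inputs would be complement-coded character features.

```python
# Core Fuzzy ART step used inside (Simplified) Fuzzy ARTMAP, sketched with
# hypothetical parameters; inputs are complement-coded feature vectors.

def fuzzy_and(a, b):
    return [min(x, y) for x, y in zip(a, b)]

def norm(v):
    return sum(v)

def complement_code(v):
    return v + [1.0 - x for x in v]

def fuzzy_art_step(weights, inp, rho=0.6, alpha=0.001, beta=1.0):
    """Try categories in order of choice value T_j = |I ^ w_j| / (alpha + |w_j|);
    the first one passing the vigilance test |I ^ w_j| / |I| >= rho learns the
    input (fast learning when beta=1). If none passes, a new category is made."""
    order = sorted(range(len(weights)),
                   key=lambda j: -norm(fuzzy_and(inp, weights[j]))
                                 / (alpha + norm(weights[j])))
    for j in order:
        match = norm(fuzzy_and(inp, weights[j])) / norm(inp)
        if match >= rho:                       # vigilance test passed
            w = weights[j]
            weights[j] = [beta * m + (1 - beta) * wi
                          for m, wi in zip(fuzzy_and(inp, w), w)]
            return j
    weights.append(list(inp))                  # no resonance: new category
    return len(weights) - 1

weights = []
a = complement_code([0.9, 0.1])
b = complement_code([0.1, 0.9])
print(fuzzy_art_step(weights, a))  # 0: first category created
print(fuzzy_art_step(weights, b))  # 1: too different, new category
print(fuzzy_art_step(weights, a))  # 0: resonates with the first category
```

Raising the vigilance rho produces more, tighter categories; this is the knob that lets the network tolerate handwriting variation while still separating distinct character shapes.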
In the present globalized marketing environment, every marketer has a cherished dream of capturing a sizeable market share relative to competitors. In the Indian context in particular, there are innumerable opportunities for the FMCG (Fast Moving Consumer Goods) industry, which has high potential to deliver rich returns to various companies. The researchers of this paper focus on how to harness the latest IT technologies in retail supply chains, with reference to C-class towns in Tamil Nadu. The main emphasis of this paper is to identify the strategies followed by various FMCG companies to build close customer relationships. To this end, a study was conducted on the role of IT usage in improving the performance of retail supply chains. This paper is exploratory and descriptive in nature; much of the information about the existing supply chains was gathered through interviews with retailers and wholesalers of major FMCG brands. Originality/value: we have studied how these latest developments in IT are being used in the FMCG sector in India, particularly in supply chain management. We selected some C-class towns in Tamil Nadu for the purpose of this study, relying on discussions and observation, and the companies Dabur and HLL were taken as the main cases.
In the digital and networked age, every high-efficiency, high-profit activity has to harmonize with the Internet. Business behaviors and activities are always the precursors to high efficiency and high profit; consequently, each business activity must be adjusted to integrate with the Internet. Business extension and promotion conducted over the Internet is generally called Electronic Commerce (e-commerce). The quality of web-based customer service is the capability of a firm's website to provide individual care and attention. In today's scenario, personalization has become a vital business problem in various e-commerce applications, ranging across dynamic web content presentation. In this paper, an iterative technique partitions customers by directly combining the transactional data of various consumers, forming a distinct behavior profile for each group, and the best customers are identified by applying the IE (Iterative Evolution), ID (Iterative Diminution), and II (Iterative Intermingle) algorithms. The quality of clustering is improved via Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO). These two algorithms are compared, and it is found that the iterative technique combined with PSO outperforms the ACO-based variants. The results further show that clustering quality is superior, response time improves, and cost-wise performance, accuracy, and efficiency all increase.
The proposed work is inspired by experiments that use expert judgment to estimate cost on the basis of previous project results. In this paper, the estimator can use analogical strategies as well as algorithmic strategies as desired. The proposed method is divided into two phases. The first phase computes the probability of each selected factor using an ant colony system; the second phase combines the values of these factors to calculate the cost overhead for the project using a Bayesian belief network. Once this overhead is computed, productivity is calculated directly and can be converted into effort and cost. Our computation gives a cost overhead that depends on various factors. To date, the Ant Colony Optimization algorithm has provided solutions for problems that admit multiple solutions where the user is interested in the best one. The algorithm provides a suitable heuristic for the problem and computes the best possible solution, expressed in terms of probability, i.e., the most likely solution and the best solution. It was first introduced for the Traveling Salesman Problem to find the minimum-cost path. We have mapped our problem onto a simple graph by using a questionnaire; this yields the minimum-length path, i.e., the path with the minimum deviation from the nominal project for each factor, and the proposed technique has produced encouraging results.
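The questionnaire-derived factor graph from the paper is not reproduced here, so as a sketch under assumed data, the following shows the ant-colony mechanism described above, probabilistic edge choice weighted by pheromone, evaporation, and cost-inverse deposits, finding the minimum-cost path on a small hypothetical graph.

```python
import random

random.seed(7)

# Hypothetical weighted graph standing in for the questionnaire-derived
# factor graph: find the minimum-cost path from 'S' to 'T'.
graph = {
    "S": {"A": 2.0, "B": 5.0},
    "A": {"T": 6.0, "B": 1.0},
    "B": {"T": 2.0},
    "T": {},
}

def ant_walk(pheromone, start="S", goal="T", beta=2.0):
    """One ant builds a path, choosing each edge with probability
    proportional to pheromone * (1/cost)^beta."""
    path, node = [start], start
    while node != goal:
        edges = list(graph[node].items())
        weights = [pheromone[(node, n)] * (1.0 / c) ** beta for n, c in edges]
        node = random.choices([n for n, _ in edges], weights=weights)[0]
        path.append(node)
    return path

def path_cost(path):
    return sum(graph[a][b] for a, b in zip(path, path[1:]))

def aco(n_ants=10, iters=30, evaporation=0.5, q=1.0):
    pheromone = {(a, b): 1.0 for a in graph for b in graph[a]}
    best = None
    for _ in range(iters):
        paths = [ant_walk(pheromone) for _ in range(n_ants)]
        for k in pheromone:                    # evaporate all trails
            pheromone[k] *= (1.0 - evaporation)
        for p in paths:                        # deposit inversely to cost
            for edge in zip(p, p[1:]):
                pheromone[edge] += q / path_cost(p)
        cand = min(paths, key=path_cost)
        if best is None or path_cost(cand) < path_cost(best):
            best = cand
    return best

best = aco()
print(best, path_cost(best))   # S -> A -> B -> T, cost 5.0
```

In the paper's setting, each edge would correspond to a level of a cost factor, and the converged pheromone levels give the per-factor probabilities that the Bayesian belief network then combines into a cost overhead.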
In 2020, the world generated 52 times the amount of data it did in 2010, and 76 times the number of information sources. Being able to use this data provides huge opportunities, and to turn these opportunities into reality, people need to use data to solve problems. In the midst of a global pandemic, people throughout the world are looking for reliable, trustworthy information about COVID-19 (coronavirus). In this scenario, Tableau plays a vital role, because it is an extremely powerful tool for visualizing massive data sets very easily. It has an easy-to-use drag-and-drop interface, lets you build attractive visualizations in a short amount of time, and supports a wide array of data sources. With COVID-19 analytics in Tableau, you can create dashboards that help identify the story within the data and better understand the impact of COVID-19. This paper presents a comprehensive review of Tableau: a tool that handles big data analytics and generates output as visualizations, which are more understandable and presentable. Its features include data blending, real-time reporting, and collaboration on data. In the end, this paper gives a clear picture of growing COVID-19 data and the tools that can help analyze it more effectively, accurately, and efficiently.
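Tableau itself is a drag-and-drop GUI, but the kind of table calculations a COVID-19 trend dashboard typically displays, a running total of cases and a moving average that smooths daily noise, can be sketched in plain Python. The daily counts below are hypothetical, for illustration only.

```python
# Hypothetical daily new-case counts, illustrating the running-total and
# moving-average calculations behind a typical case-trend dashboard.
daily = [5, 8, 12, 20, 18, 25, 30]

def running_total(xs):
    """Cumulative cases, day by day (Tableau's RUNNING_SUM table calc)."""
    total, out = 0, []
    for x in xs:
        total += x
        out.append(total)
    return out

def moving_average(xs, window=3):
    """Trailing moving average; early days use however many points exist."""
    out = []
    for i in range(len(xs)):
        chunk = xs[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(running_total(daily))                  # [5, 13, 25, 45, 63, 88, 118]
print([round(m, 1) for m in moving_average(daily)])
```

A dashboard would then plot both series against the date axis; the same transformations apply whether the source is a CSV extract or a live (blended) connection.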