  • International Journal Of Engineering And Computer Science is a leading international journal for the publication of new research ...
  • Mr. Pawan Lohar
In human-computer interaction (HCI), the design and use of computing technology focuses on the interfaces between people and computers. Research in HCI is ongoing, particularly on systems for visually impaired people. This work builds on the thematic study "Blind and Visually Impaired People: Human-Computer Interaction and Access to Graphics," which surveys current research toward solutions for impaired users and brings together researchers and practitioners. Here we present one such method: recognizing clothing patterns, since choosing clothing is a challenging task for visually impaired people. We developed a camera-based model to detect clothing patterns, categorized into five types (plaid, striped, patternless, horizontal-vertical, and irregular), and to identify 11 clothing colors. The system integrates a microphone, camera, Bluetooth, and earpiece; its output is an audio signal. To recognize clothing patterns, we apply the Hough line transform for pattern detection and the Canny detector for edge detection. We evaluate our method on the CCNY Clothing Pattern dataset and other pattern datasets, and comparison against other approaches is under study. The project uses the OpenCV library to capture images. Such a system could support greater independence in a blind person's daily life.
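As a rough sketch of the edge-plus-Hough idea this abstract describes, the following pure-NumPy code runs a gradient-based edge detector (a simplified stand-in for Canny) and a Hough accumulator over a toy "striped" patch; the function names, thresholds, and synthetic image are illustrative assumptions, not the paper's actual OpenCV implementation. A horizontal-stripe pattern should produce a vote peak near theta = pi/2.

```python
import numpy as np

def sobel_edges(img, thresh=1.0):
    """Gradient-magnitude edge map (a simplified stand-in for Canny)."""
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # horizontal central difference
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # vertical central difference
    return np.hypot(gx, gy) > thresh

def hough_vote(edges, n_theta=180):
    """Accumulate votes in (rho, theta) space; peaks indicate lines."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for y, x in zip(*np.nonzero(edges)):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    return acc, thetas

# A toy "striped" patch: a bright row every 4 rows.
img = np.zeros((32, 32))
img[::4, :] = 10.0
edges = sobel_edges(img)
acc, thetas = hough_vote(edges)
peak_theta = thetas[np.unravel_index(acc.argmax(), acc.shape)[1]]
```

For horizontal stripes, every edge point on a row votes for the same (rho, theta) bin at theta = pi/2, so the dominant orientation falls out of the accumulator peak; the real system would use cv2.Canny and cv2.HoughLinesP instead.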
In the cloud, to achieve access control and keep data confidential, data owners can adopt attribute-based encryption (ABE) to encrypt stored data. Users with limited computing power, however, are likely to delegate most of the decryption task to cloud servers to reduce computing cost; as a result, attribute-based encryption with delegation has emerged. Still, caveats and open questions remain in previous work. For instance, during delegation the cloud servers could tamper with or replace the delegated ciphertext and maliciously respond with a forged computing result. They may also cheat eligible users by telling them they are ineligible, for the purpose of cost saving. Furthermore, the access policies used during encryption may not be flexible enough. Since policies over general circuits achieve the strongest form of access control, we consider a construction realizing circuit ciphertext-policy attribute-based hybrid encryption with verifiable delegation. In such a system, combining verifiable computation with an encrypt-then-MAC mechanism guarantees data confidentiality, fine-grained access control, and the correctness of the delegated computing results at the same time. Our scheme achieves security against chosen-plaintext attacks under the k-multilinear Decisional Diffie-Hellman assumption, and an extensive simulation campaign confirms the feasibility and efficiency of the proposed solution.
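The encrypt-then-MAC pattern mentioned above can be sketched in standard-library Python. This is only an illustration of the pattern (MAC the ciphertext, verify before decrypting): the toy SHA-256 counter-mode keystream below stands in for a real cipher and is an assumption of this sketch, not the paper's hybrid ABE construction.

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy counter-mode keystream from SHA-256 (illustration only, not AES)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, plaintext: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    # MAC covers nonce + ciphertext, so any server-side tampering is detectable.
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def verify_and_decrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, ct: bytes, tag: bytes):
    # Verify the MAC first: a tampered ciphertext is rejected before decryption.
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("ciphertext or tag was tampered with")
    return bytes(a ^ b for a, b in zip(ct, _keystream(enc_key, nonce, len(ct))))

ek, mk = secrets.token_bytes(32), secrets.token_bytes(32)
nonce, ct, tag = encrypt_then_mac(ek, mk, b"delegated result")
recovered = verify_and_decrypt(ek, mk, nonce, ct, tag)
```

This mirrors the property the abstract relies on: a cloud server that forges or replaces the delegated ciphertext cannot produce a valid tag, so the user detects the cheat.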
Quiz competitions are increasingly common in daily life, and a fastest-finger-first (FFF) unit is used to determine which player responded first, both at the institute level and commercially. Determining who pressed the buzzer first used to be tedious work; our solution uses the PIC16F877A microcontroller. The project is designed as a product, and it also has a broad range of industrial applications for security purposes.
The explosive growth of data and Internet devices has driven the rapid rise of Big Data in the recent past. The service industry, a major consumer of Big Data applications, will see real change to its delivery processes and gain new insights into usage patterns and workflows, which in turn will support new global delivery models incorporating new technologies and the worldwide distribution of work. The service industry will use Big Data in decision-making information systems and to make workflows more optimal. Just as mass production led to the Industrial Revolution, Big Data is expected to drive new forms of economic activity in the service industry with connected human capital, reaching new levels of economic activity, innovation, and growth.
Using cloud storage, users store their data in the cloud without the burden of local storage and maintenance, and obtain services and high-quality applications from a shared pool of configurable computing resources. Cryptography is probably the most important aspect of communications security and is becoming increasingly important as a basic building block for computer security. Since data sharing is an important functionality in cloud storage, this paper shows how to share data securely, efficiently, and flexibly with others: a cloud-validation-based flexible distributed migration scheme using ciphertext with aggregate-key encryption for data stored in the cloud. The scheme provides secure data storage and retrieval, and the access policy is also hidden to conceal the user's identity. The scheme is powerful because it combines aggregate encryption and string-matching algorithms in a single design; it detects any change made to the original file and, if one is found, clears the error. The algorithms used are simple, so large amounts of data can be stored in the cloud without problems, while security, authentication, and confidentiality remain comparable to centralized approaches. A set of constant-size ciphertexts is produced such that efficient delegation of decryption rights over any set of ciphertexts is possible.
Visual Cryptography (VC) is a cryptographic technique that allows visual information to be encrypted in such a way that decryption can be performed by the human visual system (HVS), without the help of computers. A Visual Cryptography Scheme (VCS) eliminates complex computation in the decryption process: the secret image is restored by a stacking operation. This property makes VC especially useful where low computational load is required. During encryption, the image is encrypted and divided into two shares; in decryption, the two shares are superimposed to reveal the secret image. The objective of our project is to obtain a better-quality decrypted image of the same size as the original. OR-based VCS degrades contrast because of its monotone property. In the XOR-based scheme, the share images are superimposed with an XOR operation, which yields perfect reconstruction. Hence, the XOR operation is proposed in the decoding process to enhance contrast and quality and to reduce noise.
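The core of the XOR-based scheme can be sketched in a few lines: split each secret bit into one random bit and one bit chosen so their XOR recovers the secret. The tiny 8-bit "image" below is an illustrative assumption; a real system would operate on pixel arrays.

```python
import secrets

def make_shares(secret_bits):
    """Split a bit-image into two random shares whose XOR equals the secret."""
    share1 = [secrets.randbelow(2) for _ in secret_bits]       # uniformly random
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]      # forces XOR = secret
    return share1, share2

def reconstruct(share1, share2):
    # XOR-stacking recovers the secret exactly (no contrast loss, unlike OR).
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]   # a tiny 1-bit "image"
s1, s2 = make_shares(secret)
recovered = reconstruct(s1, s2)
```

Each share on its own is uniformly random and reveals nothing about the secret, which is the security property the abstract relies on; the OR-based variant would instead compute `a | b` and lose contrast wherever a share bit is 1.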
Data mining plays an essential role in discovering useful patterns hidden in large databases. The Apriori algorithm finds frequent itemsets in large databases through bottom-up generation of frequent itemset combinations. Frequent itemset mining, however, may discover a large number of frequent but low-revenue itemsets and lose information about valuable itemsets with low selling frequencies. High utility itemset mining instead identifies itemsets whose utility meets a given threshold, allowing users to quantify the usefulness of or preference for items using different values. A high utility itemset that is not included in another itemset with the same support is called a closed high utility itemset. Our system mines high utility itemsets with an Apriori-style algorithm that takes frequent itemsets from the transactional database, together with profit and price, as input and produces the high utility itemsets as output. To mine the closed high utility itemsets, the system uses an efficient depth-first search algorithm named CHUD.
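The utility computation at the heart of high utility itemset mining can be sketched as follows: the utility of an itemset is the sum, over transactions containing the whole itemset, of quantity times unit profit. The toy transactions, profits, and threshold below are hypothetical; this is the thresholding step only, not the full Apriori candidate generation or the CHUD closure check.

```python
from itertools import combinations

# Hypothetical toy data: each transaction maps item -> purchased quantity.
transactions = [
    {"a": 2, "b": 1},
    {"a": 1, "c": 3},
    {"a": 2, "b": 2, "c": 1},
]
profit = {"a": 5, "b": 2, "c": 1}   # unit profit per item

def utility(itemset):
    """Sum of quantity * unit profit over transactions containing the whole itemset."""
    total = 0
    for t in transactions:
        if all(i in t for i in itemset):
            total += sum(t[i] * profit[i] for i in itemset)
    return total

items = sorted(profit)
threshold = 15
high_utility = [set(c)
                for r in (1, 2, 3)
                for c in combinations(items, r)
                if utility(c) >= threshold]
```

Note how the frequent-but-cheap item "b" is filtered out while combinations containing the profitable item "a" survive, which is exactly the behavior the abstract contrasts with plain frequent itemset mining.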
Wireless sensor networks consist of a distributed set of sensor nodes that cooperatively monitor physical or environmental conditions and send their information to a "sink" node over multi-hop wireless communication links. Depletion of the sensor nodes' battery energy significantly affects the network lifetime of a WSN, so most researchers have aimed to design energy-aware routing protocols that minimize battery usage to prolong network lifetime. This paper proposes a sink relocation approach for efficient utilization of sensor battery energy, called the Energy-Efficient Sink Relocation Scheme (EE-SRS), which regulates the transmission coverage range depending on the residual energy of a sensor node. EE-SRS includes an algorithm to find the optimal place for relocating the sink, answering when and where to relocate it. The scheme is developed and simulated using a network simulator, and its performance is analyzed in terms of network lifetime, throughput, and packet delay.
Moving object detection and tracking are among the more important and challenging tasks in video surveillance and computer vision applications. Object detection is the procedure of finding non-stationary entities in an image sequence, and it is the first step toward tracking a moving object in video; object representation is the next important step. Tracking is the method of identifying the position of the moving object over time, which is a far more challenging task than detecting the object in the first place. Object tracking is applied in numerous applications such as robot vision, traffic monitoring, video surveillance, video in-painting, and simulation. Here we present a brief review of the available object detection, object classification, and object tracking algorithms.
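The simplest detection technique in this family, frame differencing, can be sketched in a few lines of NumPy: pixels whose intensity changes between consecutive frames are marked as moving. The synthetic frames and the threshold value are illustrative assumptions, not drawn from the survey itself.

```python
import numpy as np

def detect_motion(prev_frame, frame, thresh=20):
    """Mark pixels whose intensity changed by more than `thresh` between frames."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return diff > thresh

# Synthetic frames: a bright 4x4 "object" moves two pixels to the right.
f0 = np.zeros((16, 16), dtype=np.uint8)
f1 = np.zeros((16, 16), dtype=np.uint8)
f0[6:10, 4:8] = 200
f1[6:10, 6:10] = 200

mask = detect_motion(f0, f1)
ys, xs = np.nonzero(mask)
centroid_x = xs.mean()   # a crude position estimate for the moving object
```

The changed pixels form two strips (where the object left and where it arrived), and their centroid gives a rough object position; real trackers refine this with background modeling, morphology, and data association.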
In this paper, various data mining techniques used for customer relationship management (CRM) are defined and compared with each other. Data mining is a useful and powerful tool for any organization, especially for marketing people, and is used in managing relationships with customers. The data mining process can also be extremely useful for medical practitioners in extracting hidden medical knowledge; without data mining techniques, it would be impossible for traditional pattern matching and mapping strategies to be as effective and precise in prognosis or diagnosis.
With the increase in the amount of data and information one has to deal with nowadays, going through all of one's documents is a time-consuming process. We are implementing an Android application that helps organizations such as law firms manage hundreds of documents and obtain summaries of them. We also use the concept of ontology, which is essentially the set of relationships between entities. The application allows users to search for files in the database, upload files, and summarize multiple documents.
An orifice plate is a mechanical element used for measuring flow rate by restricting flow; hence it is often called a restriction plate. The flow exerts a force on the plate due to the impact of the jet, and the orifice plate acts as an obstacle to the flow. In this work we performed static analysis for three different orifice geometries, keeping the net impact area and orifice area the same in all three cases. We then calculated the maximum stress and maximum deformation for each geometry under the assumed working conditions and identified the best geometry, the one with minimum stress and minimum deformation.
Gate Diffusion Input (GDI) provides an effective alternative for low-power VLSI applications. With GDI, circuits that would otherwise require a large number of transistors can be realized with considerably fewer, an approach that tends to optimize performance parameters such as area, delay, and power dissipation. In this paper, the GDI cell is applied to realize various combinational circuits, and a novel design (an XOR circuit using only nMOS) is proposed for low-power digital circuits. Waveforms are analyzed from simulation results, various performance parameters are calculated, and the results are compared against standard CMOS logic. The schematics and layouts are drawn on a 120 nm technology file in the Dsch tool, and analysis is done with the Microwind 3.1 tool and the BSIM simulator.
Numerical integration is an important ingredient in many techniques of applied mathematics, engineering, and scientific applications, owing to the need for accurate and efficient integration schemes over complex integration domains with arbitrary functions as integrands. In this paper, we propose a method to discretize a physical domain in the shape of a linear polyhedron into an assemblage of all-hexahedral finite elements. The idea is to generate a coarse mesh of tetrahedra for the given domain, divide each tetrahedron further into a refined all-tetrahedral mesh if necessary, and finally divide each tetrahedron into four hexahedra. We further demonstrate that each of these hexahedra can be subdivided into hexahedra again, generating an all-hexahedral finite element mesh usable for various applications. To achieve this, we first establish a relation between an arbitrary linear tetrahedron and the standard tetrahedron, and then decompose the standard tetrahedron into four hexahedra. We transform each of these hexahedra into a 2-cube and observe an interesting fact: the Jacobians of these transformations are the same, and the transformations themselves are the same up to the order of coordinates for all four hexahedra. This fact can be used to great advantage to generate a numerical integration scheme for the standard tetrahedron, and hence for an arbitrary linear tetrahedron. We propose three numerical schemes that decompose an arbitrary linear tetrahedron into hexahedra, and apply them to typical integrals over a unit cube and an irregular heptahedron using Gauss-Legendre quadrature rules. Matlab codes are developed and appended to this paper.
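The building block used on each hexahedron, tensor-product Gauss-Legendre quadrature over a cube, can be sketched in Python rather than the paper's Matlab. This illustrates only the quadrature rule on a single unit cube, with an assumed test integrand, not the tetrahedron-to-hexahedra decomposition itself.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss

def integrate_unit_cube(f, n=4):
    """Tensor-product Gauss-Legendre quadrature of f(x, y, z) over [0, 1]^3."""
    pts, wts = leggauss(n)          # nodes and weights on [-1, 1]
    pts = 0.5 * (pts + 1.0)         # affine map [-1, 1] -> [0, 1]
    wts = 0.5 * wts                 # scale weights by the Jacobian 1/2 per axis
    total = 0.0
    for xi, wx in zip(pts, wts):
        for yj, wy in zip(pts, wts):
            for zk, wz in zip(pts, wts):
                total += wx * wy * wz * f(xi, yj, zk)
    return total

# Test integrand x*y*z: the exact integral over the unit cube is 1/8.
val = integrate_unit_cube(lambda x, y, z: x * y * z)
```

An n-point Gauss-Legendre rule is exact for polynomials of degree up to 2n-1 in each variable, so the n=4 rule reproduces this integral to machine precision; integrating over an arbitrary hexahedron additionally multiplies each weight by the Jacobian of the mapping from the reference cube.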
For a PV array, system monitoring is important for analyzing stability and performance. A simple monitoring system involves a data logger with wired cables for transmitting data. Removing the drawbacks of such existing systems, this work designs a high-precision wireless monitoring system for photovoltaic arrays. It measures basic PV array characteristics such as module temperature (T), open-circuit voltage (Voc), and short-circuit current (Isc), and wirelessly transmits the data to a real-time GUI on a computer, developed with the PROCESSING software. The commercially available WPAN hardware module ZigBee is used, with the API protocol for exchanging information. A sensor node with an XBee and a set of analog sensors (eliminating the need for a controller at the sensor node) measures the current and voltage generated by the PV array, while a coordinator node with an Atmel microcontroller and an XBee is connected to a PC to analyze the parameters.
In the era of globalization, the internet plays a vital role in all spheres of life and industry. The internet is popular nowadays for providing people with services in many different fields; it is a versatile facility that helps complete many tasks easily and conveniently with a few clicks, whether everyday chores or specific services that would otherwise require considerable research and formalities. Marketing is no exception. Online marketing, also called internet marketing, involves the use of interactive virtual spaces to promote and sell goods and services. New synchronous, internet-based communication technologies have contributed to the restructuring of major economic sectors, including marketing. Being cost-effective, flexible, and fast, and enjoying an unprecedented global reach, internet marketing has brought different businesses incredible gains, though issues such as security and privacy have also emerged in marketing's use of virtual spaces.
In this work, a design procedure for PID and a modified form of PID (I-PD) controllers is proposed for highly non-linear benchmark electromechanical systems discussed in the literature, namely the Vehicle Active Suspension System (VASS) and the Magnetic Suspension System (MSS). The controller design is implemented using some of the most successful heuristic algorithms: Particle Swarm Optimization (PSO), Bacterial Foraging Optimization (BFO), and the Firefly Algorithm (FA). A weighted-sum objective function comprising overshoot (Mp), settling time (Ts), integral square error (ISE), and integral absolute error (IAE) guides the heuristic search for the controller parameters Kp, Ki, and Kd. The major aim is to compare the performance of the considered heuristic algorithms on this controller design problem. The simulation work is implemented in Matlab, and performance is validated using the Mp, Ts, ISE, and IAE values for reference-tracking and disturbance-rejection operations. This study confirms that FA offers faster convergence than the PSO and BFO algorithms.
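The heuristic search loop can be sketched with a minimal PSO in Python (the paper uses Matlab). The convex surrogate cost below merely stands in for the weighted Mp/Ts/ISE/IAE objective, and its minimizer (Kp=2, Ki=1, Kd=0.5) is purely illustrative; evaluating the real objective would require simulating the VASS or MSS closed loop.

```python
import random

def pso(cost, dim, n_particles=20, iters=60, lo=-5.0, hi=5.0):
    """Minimal particle swarm: track personal and global bests, update velocities."""
    random.seed(1)
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Surrogate cost standing in for the weighted Mp/Ts/ISE/IAE objective.
surrogate = lambda k: (k[0] - 2) ** 2 + (k[1] - 1) ** 2 + (k[2] - 0.5) ** 2
best, best_cost = pso(surrogate, dim=3)
```

BFO and FA would slot into the same structure by replacing the velocity update with chemotaxis steps or brightness-based attraction, which is what makes the three algorithms directly comparable on one objective.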
This paper proposes a new architecture for an overlay network based on a distributed hash table (DHT). We introduce the MultiChord protocol, a variant of the Chord protocol defined over a DHT overlay network. MultiChord inherits the basic properties of Chord while adding some new features.
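The basic Chord property that MultiChord inherits, keys stored at their successor on a hashed identifier ring, can be sketched as follows. The node names, the 16-bit identifier space, and the centralized ring view are assumptions for illustration; real Chord resolves lookups hop by hop with per-node finger tables.

```python
import hashlib
from bisect import bisect_left

M = 16  # identifier space: 2**M positions on the ring

def ring_id(name: str) -> int:
    """Hash a node or key name onto the identifier ring."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (2 ** M)

class ChordRing:
    """Keys live at their successor: the first node clockwise from the key id."""
    def __init__(self, nodes):
        self.ids = sorted(ring_id(n) for n in nodes)
        self.by_id = {ring_id(n): n for n in nodes}

    def successor(self, key: str) -> str:
        k = ring_id(key)
        i = bisect_left(self.ids, k) % len(self.ids)   # wrap around the ring
        return self.by_id[self.ids[i]]

ring = ChordRing(["node-a", "node-b", "node-c", "node-d"])
owner = ring.successor("some-file.txt")
```

Because key placement depends only on the hash ring, nodes can join or leave while disturbing only the keys in their immediate arc, the property a MultiChord-style variant would build on.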
We consider a sub-optimal joint transmit-receive antenna selection (JTRAS) scheme in multiple-input multiple-output (MIMO) systems equipped with N transmit and three receive antennas. At the transmitter, we keep one antenna fixed and select the best among the remaining N-1 antennas; after selecting the two transmit antennas, we select the receive antenna for which the signal-to-noise ratio (SNR) is maximum. We assume spatially independent flat-fading channels with perfect channel state information (CSI) at the receiver and an ideal feedback link. We use Alamouti transmit diversity and derive the exact closed-form expression for the PDF of the received SNR, from which we obtain the bit error rate (BER) for a BPSK constellation. We present simulation results and compare them with the derived analytical expressions, and we discuss some special cases, comparing the performance of the considered scheme with other available schemes in terms of the number of feedback bits and BER. We conclude that the considered JTRAS scheme reduces the number of feedback bits compared with the golden code. Keywords: Alamouti transmit diversity (ATD), bit error rate (BER), Rayleigh fading channel, joint transmit-receive antenna selection (JTRAS).
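The selection step itself is a small exhaustive search, which can be sketched as follows. The real-valued gain matrix and the use of the sum of two branch power gains as the Alamouti post-combining SNR metric are simplifying assumptions for illustration (actual channels are complex Rayleigh fades, and SNR includes noise scaling).

```python
import random

def select_antennas(H, fixed_tx=0):
    """Sub-optimal JTRAS sketch: keep transmit antenna 0, pick the best second
    transmit antenna, and the receive antenna with the largest combined gain."""
    n_rx, n_tx = len(H), len(H[0])
    best = None
    for tx2 in range(1, n_tx):
        for rx in range(n_rx):
            # Alamouti over two tx antennas: effective gain is the sum of the
            # two branch power gains at the chosen receive antenna.
            gain = H[rx][fixed_tx] ** 2 + H[rx][tx2] ** 2
            if best is None or gain > best[0]:
                best = (gain, tx2, rx)
    return best

random.seed(7)
H = [[random.random() for _ in range(4)] for _ in range(3)]  # 3 rx, 4 tx gains
gain, tx2, rx = select_antennas(H)
```

Only the chosen (tx2, rx) pair needs to be fed back, so the feedback cost grows like log2((N-1) * 3) bits rather than with full CSI, which is the saving the abstract compares against the golden code.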
An artificial intelligence system is capable of elucidating and representing knowledge along with storing and manipulating data. Knowledge is a collection of facts and principles built up by humans; it is the refined form of information. Knowledge representation aims to represent knowledge in a manner that facilitates drawing conclusions from it, a worthwhile approach because conventional procedural code is not the best way to solve complex problems. Frames, semantic nets, system architectures, rules, and ontologies are techniques for representing knowledge. Forward and backward chaining are the two main methods of reasoning used in an inference engine and are a very common approach in expert systems and business rule systems. This paper focuses on the concept of knowledge representation in artificial intelligence and elaborates a comparison of forward and backward chaining.
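The two reasoning directions being compared can be sketched side by side: forward chaining fires rules from known facts to a fixpoint, while backward chaining works from a goal back toward known facts. The medical-style rules below are a hypothetical toy knowledge base, and the backward chainer is naive (it assumes an acyclic rule set).

```python
def forward_chain(facts, rules):
    """Data-driven: repeatedly fire any rule whose premises are all known."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts, rules):
    """Goal-driven: prove the goal by recursively proving some rule's premises."""
    if goal in facts:
        return True
    return any(all(backward_chain(p, facts, rules) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

rules = [({"has_fever", "has_cough"}, "has_flu"),
         ({"has_flu"}, "needs_rest")]
derived = forward_chain({"has_fever", "has_cough"}, rules)
```

The contrast the paper draws is visible here: forward chaining derives everything entailed by the facts (useful for monitoring), whereas backward chaining touches only the rules relevant to one query (useful for diagnosis).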
Big data applications are a great benefit to organizations, businesses, companies, and many large- and small-scale industries. Cloud computing security is developing at a rapid pace and spans computer security, network security, information security, and data privacy; it plays a vital role in protecting data, applications, and the related infrastructure with the help of policies, technologies, controls, and big data tools. Big data is a data analysis methodology enabled by recent advances in technology and architecture, while cloud computing is a set of IT services provided to a customer over a network on a leased basis, with the ability to scale service requirements up or down; its advantages include scalability, resilience, flexibility, efficiency, and the outsourcing of non-core activities. This paper presents a detailed analysis of big data and cloud computing security issues and challenges, focusing on the cloud computing types and service delivery models, and discusses possible solutions for security issues in cloud computing and Hadoop. Big data entails a large commitment of hardware and processing resources, making adoption costs prohibitive for small and medium-sized businesses; cloud computing offers an innovative business model for adopting IT services without upfront investment, yet despite the potential gains, organizations are slow to accept it because of security issues and associated challenges, which hamper the growth of the cloud. Big data could benefit a small or medium-sized company enough that it would commit resources to implementing the technology in-house, but the level of benefit is difficult to determine without some experience.
The main focus is on security issues in cloud computing that are associated with big data. Moreover, cloud computing, big data, and their applications and advantages are likely to represent some of the most promising new frontiers in science.
Computer-based sensors and actuators such as global positioning systems, machine vision, and laser-based sensors have increasingly been incorporated into mobile robots with the aim of configuring autonomous systems capable of taking over operator activities in agricultural tasks. However, incorporating many electronic systems into a robot impairs its reliability and increases its cost, so hardware minimization, along with software minimization and ease of integration, is essential to obtain feasible robotic systems. A step forward in the application of mechanical equipment in agriculture is the use of fleets of robots, in which a number of specialized robots collaborate to accomplish one or several agricultural tasks.
Disease prediction has always been a matter of research due to the increasing number of health risks. Modern medical systems produce huge amounts of data that need to be organized, and medical data mining plays a vital role in generating efficient results for prediction-based analytics. Data mining turns data into patterns that are useful for analyzing diseases. This research paper focuses on the role of the K-means algorithm in disease prediction.
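The K-means procedure the paper examines can be sketched with Lloyd's algorithm in plain Python. The patient records (blood pressure, glucose) are hypothetical toy data used only to show two clusters separating; real medical data mining would involve feature scaling, validation, and domain review.

```python
import random

def kmeans(points, k, iters=20):
    """Lloyd's algorithm: assign each point to its nearest centroid, then recenter."""
    random.seed(0)
    centroids = random.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        for j, cl in enumerate(clusters):
            if cl:  # keep the old centroid if a cluster goes empty
                centroids[j] = tuple(sum(v) / len(cl) for v in zip(*cl))
    return centroids, clusters

# Hypothetical patient records: (blood pressure, glucose), two rough groups.
data = [(120, 85), (118, 90), (122, 88), (160, 150), (158, 145), (165, 155)]
centroids, clusters = kmeans(data, k=2)
```

After convergence the two centroids sit near the "normal" and "at-risk" groups, and a new patient record can be flagged by which centroid it falls closest to, which is the prediction use the paper discusses.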
Agricultural field monitoring has become very important in recent times. Farmers struggle in many ways to get a good yield, working hard to keep their crops from being damaged, and several technologies are being developed to monitor the status of the fields. This project likewise provides monitoring of field conditions. We continuously check the water in the field using a sensor: if there is no water, a pump motor is switched ON to run the sprinkler, and once there is enough water the motor is switched OFF automatically. In the rainy season, excess water would damage the field outlet (spool), so a second sensor at the outlet opens a gate by motor when the water level is high and closes it otherwise; another motor is used to spray fertilizer. If a fire occurs in the field, the user receives a message that a fire accident has occurred, and the sprinklers turn ON automatically to control the fire. This method is thus very advantageous for monitoring fields.
—To cope with dangerous environmental disasters, robots are being recognized as good candidates to step in as human rescuers. Robots equipped with advanced sensors have been gaining the interest of many researchers in rescue work. In a distributed wireless robot system, the main objective of a rescue system is to track the location of the object continuously. This paper provides a novel idea for tracking and locating humans in a disaster area using a stereo vision system and ZigBee technology. The system recursively predicts and updates the 3D coordinates of a human in the robot's camera coordinate system, which makes the system cost effective. It is built on a ZigBee network, which offers advantages such as low power consumption, self-healing, low data rates and low cost.
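A stereo vision system recovers depth from the disparity between the two camera images; assuming a calibrated, rectified pair, one common formula is Z = f·B/d, sketched below (the numeric values are made up for illustration).

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a calibrated rectified stereo pair: Z = f * B / d.
    focal_px: focal length in pixels; baseline_m: camera separation in metres;
    disparity_px: horizontal shift of the target between the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 0.12 m baseline, 42 px disparity.
z = stereo_depth(700, 0.12, 42)  # 2.0 metres to the detected person
```

Feeding successive depth estimates into a recursive filter (e.g. a Kalman filter) gives the predict-and-update behaviour the abstract describes.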
In industrial processes, unstable systems produce undesirable peak overshoot. A PID controller with a compensator and a set-point filter is therefore designed using the direct synthesis method. The set-point filter reduces the peak overshoot, while the PID controller with the compensator improves the overall response of the system. In this method, the characteristic equation of the system with the PID controller and compensator is compared with a desired characteristic equation, and a single tuning parameter is used to find the controller parameters, the compensator and the set-point filter.
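For readers who want the shape of the controller, here is a minimal discrete PID update plus a first-order set-point filter; the gains, filter constant and function names are illustrative assumptions, not the parameters obtained by the paper's direct synthesis tuning.

```python
def pid_step(state, setpoint, measurement, kp, ki, kd, dt):
    """One update of a discrete PID controller.
    state carries the integral term and previous error between calls."""
    error = setpoint - measurement
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

def filter_setpoint(prev_filtered, setpoint, alpha=0.1):
    """First-order set-point filter: smooths reference steps,
    which is what cuts the peak overshoot."""
    return prev_filtered + alpha * (setpoint - prev_filtered)

state = {"integral": 0.0, "prev_error": 0.0}
u = pid_step(state, setpoint=1.0, measurement=0.0,
             kp=2.0, ki=0.5, kd=0.1, dt=0.01)
```

In a closed loop, the reference seen by `pid_step` would be the filtered set point rather than the raw step, so a unit step reaches the controller gradually.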
Users today share large volumes of images through social sites and thereby inadvertently create a major problem of maintaining confidentiality; tools are needed to help users control access to their shared content. An Adaptive Privacy Policy Prediction (A3P) system is used in this paper to address this confidentiality problem. A3P helps users compose confidentiality settings for their images by examining the roles of social context, image content and metadata as possible indicators of a user's privacy preferences. The system uses a two-level framework that, according to the user's available history on the site, determines the best available privacy policy for each image being uploaded. The solution relies on an image classification framework that groups images into categories which may be associated with similar policies, and on a policy prediction algorithm that automatically generates a policy for each newly uploaded image, also taking the user's social features into account. The generated policies follow the evolution of the user's confidentiality attitude.
Generally, users can issue two sorts of requests: elastic requests, which carry no delay constraint, and inelastic requests, which carry an inflexible delay constraint. The deployment of WSNs in areas such as target tracking on battlefields and environmental monitoring requires optimized communication among the sensors, so that information is served with shorter latency and minimal energy consumption. Cooperative data caching has emerged as a productive technique for achieving both ends simultaneously. The performance of protocols for such networks depends mainly on the selection of the sensors that take on special roles in the caching procedure and make forwarding decisions. A wireless content distribution setting is considered in which there are numerous cellular base stations, each equipped with a cache for storing content; content is typically partitioned into two disjoint sets of inelastic and elastic content. Cooperative caching is shown to be capable of reducing the content provisioning cost, which depends heavily on service and pricing dependencies among several stakeholders, including content providers, web service providers, and end consumers. Here, practical network, service, and economic pricing models are developed and then used to build an optimal cooperative caching strategy based on social community abstraction in wireless networks. Inelastic requests are served by broadcast transmissions, and algorithms are developed for content dissemination driven by both elastic and inelastic requests. The framework includes optimal caching algorithms and analytical models for evaluating the operation of the suggested scheme.
The primary contributions are: i) formulation of economic cost-reward flow models among the WNET stakeholders, ii) development of optimal distributed cooperative caching algorithms, iii) characterization of the impacts of network, user and object dynamics, and iv) investigation of the impact of user non-cooperation.
The credit card is the most widely accepted payment mode, both online and offline, in today's world; it provides cashless shopping at shops in all countries and is the most convenient way to shop online, pay bills, and so on. Consequently, the risk of fraudulent credit card transactions has also been increasing. In existing credit card fraud detection systems, a fraudulent transaction is detected only after the transaction is completed; it is difficult to identify the fraud, and the resulting losses are borne by the issuing authorities. The Hidden Markov Model is a statistical tool used by engineers and scientists to solve various problems. In this paper, it is shown that credit card fraud can be detected using a Hidden Markov Model during the transaction itself. The Hidden Markov Model helps obtain high fraud coverage combined with a low false alarm rate.
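A minimal sketch of the idea: model a cardholder's spending as an HMM over quantized amounts, score a new transaction sequence with the forward algorithm, and raise an alarm when its log-likelihood drops well below the cardholder's typical range. All probabilities below are illustrative assumptions, not trained values.

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of an observation sequence under an HMM
    (scaled forward algorithm)."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    loglik = 0.0
    for t in range(1, len(obs)):
        norm = sum(alpha)
        loglik += math.log(norm)
        alpha = [a / norm for a in alpha]
        alpha = [sum(alpha[s] * trans[s][j] for s in range(n)) * emit[j][obs[t]]
                 for j in range(n)]
    return loglik + math.log(sum(alpha))

# Illustrative (untrained) model: hidden states 0=legitimate, 1=fraudulent;
# observations are quantized amounts 0=low, 1=medium, 2=high.
start = [0.95, 0.05]
trans = [[0.9, 0.1],
         [0.3, 0.7]]
emit = [[0.70, 0.25, 0.05],   # legitimate spending favours low amounts
        [0.10, 0.20, 0.70]]   # fraud favours high amounts
typical = forward_loglik([0, 1, 0, 0], start, trans, emit)
suspect = forward_loglik([2, 2, 2, 2], start, trans, emit)
# The large drop in log-likelihood for the second sequence flags possible fraud.
```

In a deployed system, the model would be trained per cardholder and the alarm threshold chosen to trade fraud coverage against the false alarm rate.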
In the present work, a computer-aided classification system is proposed for classifying mammogram images into normal, benign and cancer classes. The work has been carried out on thirty Digital Database for Screening Mammography (DDSM) cases consisting of 10 normal, 10 benign and 10 cancer images. Regions of interest (ROI) of 256×256 pixels have been extracted from the right Medio-Lateral Oblique (RMLO) view of each mammogram. Texture descriptors based on the gray-level co-occurrence method, computed by varying the inter-pixel distance 'd' from 1 to 8, have been used, and an SVM classifier performs the classification task. The results indicate that GLCM mean and range features computed at d=1 yield the maximum overall classification accuracies of 75% and 65%, respectively.
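To make the texture descriptors concrete, the sketch below computes a normalized GLCM at inter-pixel distance d=1 (horizontal offset) and one Haralick descriptor (contrast) for a toy 4-level ROI; the paper's full feature set, the mean/range over directions and the SVM step are omitted.

```python
def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    h, w = len(img), len(img[0])
    mat = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h - dy):
        for x in range(w - dx):
            mat[img[y][x]][img[y + dy][x + dx]] += 1
            total += 1
    return [[v / total for v in row] for row in mat]

def contrast(p):
    """Haralick contrast: sum over i,j of (i-j)^2 * p(i,j)."""
    n = len(p)
    return sum((i - j) ** 2 * p[i][j] for i in range(n) for j in range(n))

# Toy 4x4 "ROI" quantized to 4 gray levels (a real ROI is 256x256).
roi = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
p = glcm(roi, levels=4)   # inter-pixel distance d=1, horizontal direction
c = contrast(p)
```

Varying `dx`/`dy` reproduces the different distances and directions over which the paper's mean and range features are computed.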
The KDD field is concerned with developing methods and techniques for turning raw data into useful information; real-world practical application issues are also outlined. A common assumption is that predicting financial markets is not possible, which is consistent with remarks many financial professionals have made. In this survey, a large amount of information is collected from the customers and vendors of a shopping mall; vendors need to extract the required combinations of customers. An automated research application is undertaken to find solutions that improve business strategy. This paper highlights the concepts and reviews the current status and limitations.
Data mining is the study of data for relationships that have not previously been discovered. In sociology, discrimination is the hurtful treatment of an individual based on the group, class or category to which that person belongs rather than on individual merit, as in racial and religious intolerance. Along with confidentiality, discrimination is an essential issue when considering the legal and ethical aspects of data mining. Most people do not want to be discriminated against because of their race, gender, religion, nationality or age, especially when those attributes are used for making decisions about them, such as giving them a job, a loan, education or insurance. For this reason, anti-discrimination techniques covering both discrimination discovery and discrimination prevention have been introduced in data mining. Discrimination can be either direct or indirect: direct discrimination occurs when decisions are made on the basis of sensitive attributes, while indirect discrimination occurs when decisions are made on the basis of non-sensitive attributes that are strongly associated with biased sensitive ones. Here, discrimination prevention in data mining is tackled, and new techniques are proposed that are applicable to direct or indirect discrimination prevention, individually or both at the same time. Several decision-making tasks lend themselves to discrimination, such as education, life insurance, loan granting and staff selection, and in many such applications information systems are used for the decision making.
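One simple way to quantify discrimination in a decision table is the disparate impact ratio (the basis of the common "80% rule"); the sketch below computes it over hypothetical loan decisions — the field names and data are invented for illustration.

```python
def disparate_impact(data, protected, outcome):
    """Ratio of positive-outcome rates: protected group over unprotected group.
    Values well below 1 (e.g. under the common 0.8 threshold) suggest the
    decision rule discriminates against the protected group."""
    def rate(group):
        rows = [r for r in data if r[protected] == group]
        return sum(r[outcome] for r in rows) / len(rows)
    return rate(1) / rate(0)

# Hypothetical loan decisions: 'prot' marks membership of the protected group.
decisions = [
    {"prot": 1, "granted": 1}, {"prot": 1, "granted": 0},
    {"prot": 1, "granted": 0}, {"prot": 1, "granted": 0},
    {"prot": 0, "granted": 1}, {"prot": 0, "granted": 1},
    {"prot": 0, "granted": 1}, {"prot": 0, "granted": 0},
]
di = disparate_impact(decisions, "prot", "granted")  # 0.25 / 0.75
```

Discrimination-prevention methods then transform the data or the learned rules until measures such as this one move back into an acceptable range.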
This paper gives an overview of our study in building rare-class prediction models for identifying known intrusions and their variations, and anomaly detection schemes for detecting novel attacks whose nature is unknown. Data mining and machine learning have been the subject of extensive research in intrusion detection, with emphasis on improving the accuracy of the detection classifier; the quality of the feature selection method is one of the important factors affecting the effectiveness of an Intrusion Detection System (IDS). This paper evaluates the performance of the data mining classification algorithms C4.5, J48, Naive Bayes, NB-Tree and Random Forest on the NSL-KDD dataset, focusing on the Correlation-based Feature Selection (CFS) measure. The results demonstrate that NB-Tree and Random Forest outperform the other algorithms in terms of predictive accuracy and detection rate.
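The CFS measure scores a feature subset by combining feature-class and feature-feature correlations; a sketch of the standard merit formula, with illustrative correlation values rather than figures from the paper:

```python
import math

def cfs_merit(k, avg_feat_class_corr, avg_feat_feat_corr):
    """Correlation-based Feature Selection merit of a k-feature subset:
        Merit = k * r_cf / sqrt(k + k*(k-1) * r_ff)
    It is high when features correlate with the class (r_cf) but are not
    redundant with each other (low r_ff)."""
    return (k * avg_feat_class_corr) / math.sqrt(
        k + k * (k - 1) * avg_feat_feat_corr)

# Two hypothetical 5-feature subsets with equal class correlation:
relevant_independent = cfs_merit(5, 0.6, 0.1)  # low inter-feature correlation
redundant = cfs_merit(5, 0.6, 0.9)             # highly redundant features
```

A search over subsets (e.g. best-first, as in Weka's CFS) keeps the subset with the highest merit before handing it to classifiers such as NB-Tree or Random Forest.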
This paper presents the modeling of a Wind Diesel Hybrid System (WDHS) comprising a Diesel Generator (DG), a Wind Turbine Generator (WTG), the consumer load, a Ni–Cd battery based Energy Storage System (BESS) and a Distributed Control System (DCS). All the models of the aforementioned components are presented, and the performance of the WDHS is tested through simulation. Simulation results are given, with graphs for the frequency and voltage of the isolated power system, the active powers generated or absorbed by the different elements, and the battery voltage, current and state of charge, for negative load and wind speed steps. The negative load step reduces the consumed power to a level below the WTG production, so that, to balance active power, a negative DG power (DG reverse power) is needed. As the DG speed governor cannot control system frequency in a DG reverse power situation, it is shown how the DCS orders the BESS to artificially load the system until the DG power falls back into a positive interval. The negative wind step decreases the WTG production, returning the power system to a situation where the needed DG power is positive, so that the BESS is no longer needed to load the system.
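The dispatch logic described for the negative load step can be sketched as a simple power balance: the DG must supply the load minus the wind power, and when that difference goes negative the DCS has the BESS absorb the surplus. The numbers and the `dg_min` floor below are illustrative assumptions.

```python
def dispatch(p_load, p_wtg, dg_min=0.0):
    """Active-power balance in the hybrid system. The diesel generator must
    supply p_load - p_wtg; when that is negative (DG reverse power), the DCS
    orders the BESS to absorb the surplus so the DG stays above dg_min."""
    p_dg_needed = p_load - p_wtg
    if p_dg_needed < dg_min:
        p_bess = p_dg_needed - dg_min   # negative value: battery charges
        p_dg = dg_min
    else:
        p_bess = 0.0
        p_dg = p_dg_needed
    return p_dg, p_bess

# Negative load step: consumption (40 kW) drops below wind production (60 kW),
# so the DG is held at its floor and the BESS absorbs the 20 kW surplus.
p_dg, p_bess = dispatch(p_load=40.0, p_wtg=60.0)
```

After the negative wind step, `p_load - p_wtg` turns positive again and the same function returns a positive DG power with the BESS idle.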
Positioning-based services really picked up pace in the early 90s with emergency response lines such as 911 and 100, which raised public awareness of the location services on offer. These services also extend to transport and logistics; for example, taxis nowadays have in-built location-based services to track the vehicles. Multiple positioning techniques are available, such as GPS, GLONASS, Time of Arrival and Angle of Arrival, each with its own pros and cons. We cover some of these techniques and review their functioning and functionality.
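As an example of how a Time-of-Arrival service computes position, the sketch below turns three ToA measurements into ranges and solves the linearized circle equations for a 2-D fix (anchor positions and times are made up; real systems handle noise and use more anchors).

```python
def toa_position(anchors, toas, c=299_792_458.0):
    """2-D position from time-of-arrival at three synchronized anchors.
    Each ToA becomes a range r = c * t; subtracting the first circle
    equation from the other two gives a linear 2x2 system in (x, y)."""
    d = [c * t for t in toas]
    (x0, y0), (x1, y1), (x2, y2) = anchors
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d[0]**2 - d[1]**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d[0]**2 - d[2]**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Toy setup with c=1 so the "times" are simply distances to point (3, 4).
x, y = toa_position([(0, 0), (10, 0), (0, 10)],
                    [5.0, 65 ** 0.5, 45 ** 0.5], c=1.0)
```

Angle-of-Arrival systems solve an analogous geometric problem from bearings instead of ranges.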
MANETs are autonomous, decentralized networks accessible to both genuine users and malicious attackers. Security issues affect the confidentiality and integrity of data in network services, and MANETs are very vulnerable to attacks from malicious nodes, including classical attacks such as DoS, wormhole, gray-hole and black-hole attacks. In this paper, we calculate the trust value of nodes using decision factors such as direct degree and recommendation degree. To further improve routing, the trust-based Ad-hoc On-demand Multi-path Distance Vector (AOTMDV) protocol is used, an extension of the Ad-hoc On-demand Multi-path Distance Vector (AOMDV) protocol. AOTMDV incorporates additional decision factors (incentive function and active degree) to improve the trust. By enhancing trust, gray-hole attacks are prevented, and network throughput, packet delivery ratio and malicious-attack resistance are also improved effectively. A Mobile Ad-hoc NETwork (MANET) is a self-configuring, infrastructure-less network of mobile devices connected without wires, and it has several unique characteristics. First, the connections between network nodes are wireless and the communication medium is broadcast; since the nodes are free to move, they may come together as needed and form a network. Second, a mobile ad hoc network has no fixed infrastructure, so the network topology is always changing and extremely dynamic: the interconnections between nodes are not permanent and change continually to adapt to this dynamic and arbitrary pattern. Third, the membership is always changing: nodes are free to move anywhere, may leave at any time, and new nodes can join unexpectedly.
There is no mechanism to administer or manage the membership. Fourth, the execution environment is insecure and unfriendly: due to the lack of fixed infrastructure and administration, there are increased chances that malicious nodes can mount attacks, and nodes may behave selfishly, degrading performance or even disabling functionality. Finally, the nodes in a mobile ad hoc network are usually portable mobile devices with constrained resources, such as power, computation ability and storage capacity. The remainder of the paper is organized as follows: Section II discusses related work, Section III presents the proposed trust model, Section IV illustrates the simulation results and analysis using NS2, and Section V gives the conclusion and future work.
II. RELATED WORK
A. AOTMDV Protocol
The Trust-based Ad-hoc On-demand Multi-path Distance Vector (AOTMDV) routing protocol [8] is an extension of the AOMDV routing protocol that computes multiple loop-free and link-disjoint paths using trust values. The routing entries for each destination contain a list of next-hop counts and path trusts; all the next hops have the same sequence number and every node holds a path trust value, which helps in keeping track of a route. For each destination, a node maintains the path trust and the advertised hop count, defined as the maximum hop count over all paths, which is used when sending route advertisements for the destination. Each duplicate route advertisement received by a node defines an alternate path to the destination. Loop freedom is assured for a node by accepting an alternate path to a destination only if its hop count is less than the advertised hop count for that destination, since the advertised hop count is the maximum over all paths.
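A sketch of how the four decision factors might combine into a node trust value, with path trust taken as the weakest link along the route; the weights and clamping are illustrative assumptions, not the actual AOTMDV formulas.

```python
def node_trust(direct, recommended, incentive, active, w=(0.4, 0.3, 0.2, 0.1)):
    """Weighted combination of the four decision factors (direct degree,
    recommendation degree, incentive function, active degree).
    The weights are illustrative, not values from the AOTMDV paper."""
    t = w[0] * direct + w[1] * recommended + w[2] * incentive + w[3] * active
    return max(0.0, min(1.0, t))          # keep trust in [0, 1]

def path_trust(node_trusts):
    """Path trust as the minimum trust of the nodes along the route."""
    return min(node_trusts)

suspect = node_trust(0.2, 0.3, 0.1, 0.5)  # e.g. a suspected gray-hole node
good = node_trust(0.9, 0.8, 0.9, 0.9)
```

With a weakest-link path trust, a single low-trust node drags down every route through it, which is how routes around gray-hole nodes get preferred.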
This paper describes the project 'Online FIR registration and SOS system', the first of its kind, designed to bridge the gap between the police and the common people. There are plenty of applications nowadays for shopping, travel and even gaming; however, there is no application for registering an FIR or for helping people facing emergency situations. We intend to create a system where users can register an FIR under the various IPC sections and inform the police whenever they are in an emergency. We believe this will be a widely used system in the future and will help to bridge the gap between the police department and the people.
In this paper I highlight what we believe to be the key technology dimensions for evaluating data management solutions. The paper defines what is meant by big data; reviews analytics techniques for text, audio, video and social media data; makes the case for new statistical techniques for big data; highlights expected future developments in big data analytics; examines cloud data management architectures; and covers big data analytics and visualization. It considers data management and analytics for vast amounts of unstructured data, and also explores clustering, classification and link analysis of big data. Large-scale big data processing and management is presented as a central source of reference on the data management techniques currently available for large-scale data processing, addressing the fundamental challenges associated with big data processing tools and techniques across a range of computing environments. The basic concepts and tools of large-scale big data processing and cloud computing are reviewed so that readers can get a basic grasp of them. The paper also provides an overview of different programming models and cloud-based deployment models, and examines the use of advanced big data processing techniques in different domains, including the semantic web, graph processing and stream processing, along with concerns such as consistency management, privacy and security. Supplying a comprehensive summary from both the research and applied perspectives, the paper covers recent research discoveries and applications, making it a useful reference for a wide range of audiences, including researchers and academics working on databases, data mining and web-scale data processing.
Readers should gain a fundamental understanding of how to use big data processing tools and techniques effectively across application domains, covering cloud data management architectures, big data analytics and visualization, management and analytics of vast amounts of unstructured data, clustering, classification, link analysis of big data, scalable data mining, and machine learning techniques.
Deduplication yields major advantages in backup environments. Deduplication is the automatic elimination of duplicate data in a storage system and is the most effective technique for reducing storage costs. However, it predictably results in data fragmentation, because logically continuous data becomes spread across many disk locations. Fragmentation is mainly caused by duplicates from previous backups of the same backup set, since such duplicates are frequent due to repeated full backups containing a lot of unchanged data. Systems with in-line deduplication detect duplicates during writing and avoid storing them, so this fragmentation causes data from the latest backup to be scattered across older backups. This survey focuses on various techniques for in-line duplicate detection. The literature shows a need for deduplication approaches that reduce both time and storage space. We propose a novel method that avoids the reduction in restore performance without reducing write performance and without affecting deduplication effectiveness.
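The in-line detection idea — hash each chunk at write time and skip chunks already stored — can be sketched in a few lines; the fixed-size chunking, chunk size and class API are simplifying assumptions (production systems use content-defined chunking and on-disk indexes).

```python
import hashlib

class DedupStore:
    """Toy in-line deduplication: fixed-size chunks keyed by SHA-256.
    A chunk already present in the store is never written again."""
    def __init__(self, chunk_size=8):
        self.chunk_size = chunk_size
        self.chunks = {}          # digest -> chunk bytes
        self.writes_avoided = 0

    def write(self, data):
        """Store data, returning the recipe (list of digests) to read it back."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest in self.chunks:
                self.writes_avoided += 1   # duplicate detected during write
            else:
                self.chunks[digest] = chunk
            recipe.append(digest)
        return recipe

    def read(self, recipe):
        return b"".join(self.chunks[d] for d in recipe)

store = DedupStore()
r1 = store.write(b"full backup Monday..")   # first full backup: all chunks new
r2 = store.write(b"full backup Monday..")   # repeated full backup: all duplicates
```

The `read` path shows where fragmentation comes from: a recipe's digests may point at chunks written by much older backups, scattered across the disk.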
This Android application creates, updates, deletes and retrieves records in a cloud database. Google's Android OS provides a great amount of flexibility for implementing so pervasive an application. The stored information can be of any format, such as a DICOM image or a text data file, and the system can be evaluated using any cloud service. The users are entities such as an administrator and a normal patient. Cloud computing lets us use our data pervasively, ubiquitously and in a distributed manner, and the app stores and retrieves data using these principles. Patient health record management can be achieved with these capabilities of Google Android, including the storage of DICOM images, and the app also predicts the risk of occurrence of a disease.
This article gives an important guideline for an Automated Toll Collection System (ATCS) using NFC together with theft-vehicle detection. ATCS is emerging as a converging technology where time and efficiency are important in today's toll collection systems. In this system, an NFC tag carrying a unique identification number (UIN) and user details is issued by the toll authority; the active NFC tag is attached to the RC (Registration Certificate) book or smart card. When a vehicle passes through the tollbooth, the data on the NFC tag is read by an NFC reader and sent to the server for verification; the server checks the details and the toll amount is deducted from the user's account. Theft-vehicle detection is carried out with the help of algorithms such as OCR and BLOB detection.
Epilepsy is a neurological disorder marked by sudden recurrent episodes of sensory disturbance, loss of consciousness, or convulsions, associated with abnormal electrical activity in the brain. The sudden and seemingly unpredictable nature of seizures is one of the most compromising aspects of the disease. Most epilepsy patients spend only a marginal part of their time actually having a seizure and show no clinical signs of the disease between seizures, the so-called inter-ictal interval, but the constant fear of the next seizure and the feeling of helplessness associated with it often have a strong impact on a patient's everyday life (Fisher et al. 2000). A method capable of reliably predicting the occurrence of seizures could significantly improve the quality of life for these patients and open new therapeutic possibilities. Apart from simple warning devices, fully automated closed-loop seizure prevention systems are conceivable: treatment concepts could move from preventive strategies towards on-demand therapy that resets brain dynamics and minimizes the risks of epilepsy.
The digitization of biomedical signals has brought a drastic change in signal analysis. The electrocardiogram (ECG) is an important tool for the primary diagnosis of heart disease. The ECG signal, the electrical interpretation of cardiac muscle activity, very easily picks up different kinds of noise while being gathered and recorded, so it must be clearly represented and filtered to remove all noise and artifacts. In this paper, a new approach is presented for filtering noise from the ECG signal using a multi-resolution technique based on the wavelet transform. This method gives better results than the other techniques applied in this field.
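As a toy version of wavelet-based multi-resolution denoising, the sketch below does a one-level Haar transform, hard-thresholds the detail coefficients, and inverts; a real ECG pipeline would use several decomposition levels and a clinically suitable mother wavelet, and the threshold here is an arbitrary assumption.

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising on an even-length signal:
    transform, hard-threshold the detail coefficients, inverse transform.
    Small details (assumed to be noise) are zeroed; large ones are kept."""
    s2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s2 for i in range(0, len(signal), 2)]
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s2, (a - d) / s2]
    return out

# Small high-frequency wiggles are smoothed; the large QRS-like bump survives.
noisy = [1.0, 1.1, 1.0, 0.9, 5.0, 5.1, 1.0, 1.05]
clean = haar_denoise(noisy, threshold=0.2)
```

With `threshold=0` the transform is perfectly invertible, which is the property that lets multi-resolution methods remove noise bands without distorting the ECG waveform itself.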
— In today's fast-moving world, many factors are important for the survival of a company, such as cost-effective production, flexibility of operation and on-time production. In this paper we survey a liquid mixing and bottle filling process using a PLC. The importance of using a PLC is that it allows mixing of different liquids in desired amounts, high process speed, and accuracy in the amount of liquid filled into the bottles; moreover, if the liquid is a chemical that is dangerous to human health, both the mixing and filling processes are carried out without human contact.
During the next few years there will be a profound change in the generation and usage of energy. Influenced by factors such as the environment, capital and production costs, as well as geopolitical factors, old power stations around the world will be phased out, while newer power stations with large output capacity will be difficult to permit in many existing political climates. With global warming now apparent, environmental impact and efficient fuel utilization have become significant factors in the adoption of newer and emerging energy conversion technologies. High installation cost is the major obstacle to commercializing fuel cells for distributed power generation. A fuel cell power system that contains a dc–ac inverter tends to draw an ac ripple current at twice the output frequency; such a ripple current may shorten the fuel cell lifespan and worsen the efficiency and output capacity. In this paper, an advanced active control technique is proposed that incorporates a current control loop in the dc–dc converter for ripple reduction, which reduces both the size and the cost of the system. The proposed active ripple reduction method has been verified with MATLAB simulation.
MANET and VANET are active research areas, and many routing protocols have been proposed for use in them. In a MANET, nodes are connected through wireless channels in a network, and each node acts as both a router and a host. A vehicular ad-hoc network (VANET) is one scenario of a MANET. Communication in a VANET requires efficient routing protocols, and the highly dynamic network topology and frequent disconnections make it difficult to design an efficient routing protocol for vehicles. There are two types of VANET communication: V2V (Vehicle to Vehicle) and V2RSU (Vehicle to Road Side Unit). Because accidents happen daily, VANETs are one of the key areas for refining Intelligent Transportation Systems (ITS), which can help ensure passenger and road safety. An ITS provides information about emergencies and about traffic density. The existing routing protocols for VANETs are not efficient enough to meet all traffic scenarios, and worthy routing protocols are required to support communication between vehicles for passenger and road safety in the future. This paper presents a literature survey of reactive and proactive MANET routing protocols such as AODV, DSDV, OLSR, and DSR. The analysis and characterization of these protocols given in the paper can help in further improving existing routing protocols.
This system is implemented to obtain a fully automated electricity billing system. It aims to measure and monitor the electricity consumed by consumers in a locality, transmit the consumption data to the station, and issue the bill for the consumed power automatically. It also aims to detect malpractice at the meter. Using this system, the Electricity Board can access all data regarding the power consumed in each home and at each station whenever required. From these data the Board can also detect power theft, and the system offers a way to charge an extra payment for excess usage of power at peak time (6.00–10.00 pm). Online payment is also possible with the new system. GSM is used to automate the system: consumed-unit transmission, alerts, and bill reception are handled by the GSM module on the client side as configured by the user, and the server station is likewise served by a GSM module for data transmission and reception.
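The peak-time surcharge described above can be sketched as a small tariff function. The rates, the surcharge multiplier, and the per-hour reading format are illustrative assumptions, not values from the paper:

```python
# Hypothetical tariff calculator for the automated billing system described
# above: units consumed during the peak window (18:00-22:00) are charged at
# a surcharge. All rates and the reading format are assumptions.

PEAK_START, PEAK_END = 18, 22      # 6.00 pm - 10.00 pm peak window
NORMAL_RATE = 5.0                  # currency units per kWh (assumed)
PEAK_SURCHARGE = 1.5               # multiplier for peak-time usage (assumed)

def bill_amount(readings):
    """readings: list of (hour_of_day, units_consumed) tuples."""
    total = 0.0
    for hour, units in readings:
        rate = NORMAL_RATE * (PEAK_SURCHARGE if PEAK_START <= hour < PEAK_END else 1.0)
        total += units * rate
    return round(total, 2)

print(bill_amount([(10, 2.0), (19, 2.0)]))  # 2 kWh off-peak + 2 kWh peak = 25.0
```

In a deployment along the lines of the paper, the station would run such a function over the consumed units reported by the client-side GSM module before sending the bill back.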
—In today's scenario, files are not secure. They can be obtained by an eavesdropper through many kinds of attack: cracking PINs, crashing the OS with viruses and malware, and plenty of other ways. We cannot be sure today that file-protection wizards are secure and that data cannot reach an attacker. But if files are encrypted, the original data remains confidential even if the files are accessed. This paper therefore presents a file encryption system based on symmetric-key cryptography. I propose a strategy to encrypt files (multiple files can be handled by compressing them into one 'rar' archive) using Blowfish as the encryption/decryption standard in Cipher Block Chaining (CBC) mode. I implemented a compression function for the 64-bit Initialization Vector (IV), used CBC mode with Blowfish, and used RC4 for a 256-bit keystream. The result is more efficient and secure than a general encryption process.
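The CBC chaining used with Blowfish above can be sketched in a few lines. Since Blowfish itself is not in the Python standard library, a toy XOR "cipher" stands in for it here; the chaining logic (XOR each plaintext block with the previous ciphertext block, or the IV for the first block) is the part being illustrated, and the zero-byte padding is an assumption, not the paper's scheme:

```python
# Sketch of Cipher Block Chaining. A toy XOR "cipher" stands in for Blowfish
# (which needs a third-party library); only the chaining logic is the point.
BLOCK = 8  # Blowfish's 64-bit block size, in bytes

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def toy_encrypt_block(block, key):   # stand-in for Blowfish encryption
    return xor_bytes(block, key)

def cbc_encrypt(plaintext, key, iv):
    pad = (-len(plaintext)) % BLOCK          # zero-pad to whole blocks (illustrative)
    plaintext += b"\x00" * pad
    prev, out = iv, b""
    for i in range(0, len(plaintext), BLOCK):
        block = xor_bytes(plaintext[i:i + BLOCK], prev)  # chain with previous block
        prev = toy_encrypt_block(block, key)
        out += prev
    return out

def cbc_decrypt(ciphertext, key, iv):
    prev, out = iv, b""
    for i in range(0, len(ciphertext), BLOCK):
        block = ciphertext[i:i + BLOCK]
        out += xor_bytes(toy_encrypt_block(block, key), prev)  # XOR cipher inverts itself
        prev = block
    return out

key, iv = b"k" * BLOCK, b"i" * BLOCK
ct = cbc_encrypt(b"secret!!", key, iv)
print(cbc_decrypt(ct, key, iv))  # b'secret!!'
```

A useful property visible even in this toy version: identical plaintext blocks produce different ciphertext blocks, because each block is mixed with the previous ciphertext before encryption.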
The cloud is redefining the IT architectures of various business domains. Organizations have clearly recognized the benefits of the cloud, such as dynamic payloads on the technical side and elastic financial models on the commercial side, which promise greater efficiency than before. The benefits of cloud computing can be fully realized by applying novel technologies in security and risk-management processes. The security risk factors in public cloud computing are much greater than in traditional datacenter-based computing. In a highly shared ad hoc cloud environment with instance-to-instance network connectivity, securing applications and sensitive data is a big challenge faced by cloud providers. Because the entire stack of applications, platform, and infrastructure of the cloud is designed and managed by service providers, cloud users are uncertain about its security. This paper studies the generic security challenges in an ad hoc cloud environment: the security challenges and mitigations are discussed in Section I, and a survey conducted with cloud users from domains such as health care, education, and retail business, together with the analysis of the survey data and the results, is discussed in Section II.
Introduction: Ad hoc clouds rely on scalable virtualization technology that gives users access to a set of non-dedicated computing resources. The users of an ad hoc cloud are in a highly shared environment where resources are dynamically provisioned and organizations share the same remotely located physical hardware with strangers. Security and privacy have been the biggest challenges both for organizations already using the cloud as an end-to-end solution for their IT needs and for those still considering moving to it. In the SPI model of the cloud, SaaS demands security at all levels of user identity and data access while maintaining the integrity and continuity of applications; in IaaS, secure networking and trusted computing should be the prime concern; and PaaS demands protection at the resource-management level. [1]
Wireshark is a network protocol analyser. It can intercept packets transmitted over the network, compile statistics about network usage, allow the user to view content being accessed by other network users, and store usage information for offline access. This paper compares Wireshark with one other similar tool, Network Miner, a Network Forensic Analysis Tool (NFAT), on several parameters: basic graphical user interface, packet information, and traffic analysis. Network Miner can be used as a passive network sniffer/packet-capturing tool and can parse PCAP files for offline analysis.
As in the real world, in the virtual world there are people who want to take advantage of you by exploiting you, whether for your money, your status, or your personal information. Malware helps these people accomplish their goals. The security of modern computer systems depends on the ability of users to keep their software, OS, and antivirus products up to date. To protect legitimate users from these threats, I made a tool (ADVANCED DYNAMIC MALWARE ANALYSIS USING API HOOKING) that reports every task that a piece of software (malware) performs on your machine at run time.
Index Terms— API Hooking, Hooking, DLL injection, Detour
It has always been an arduous task for the election commission to conduct free and fair polls in our country, the largest democracy in the world. Crores of rupees have been spent to make sure that elections are riot free. But nowadays it has become common for some forces to indulge in rigging, which may eventually lead to a result contrary to the actual verdict given by the people. This paper presents a new voting system employing biometrics in order to avoid rigging and to enhance the accuracy and speed of the process. The system uses the thumb impression for voter identification, since the thumb impression of every human being has a unique pattern; thus it would have an edge over present-day voting systems. As a pre-poll procedure, a database consisting of the thumb impressions of all eligible voters in a constituency is created. During elections, the thumb impression of a voter is entered as input to the system and compared with the records in the database. If the pattern matches a record, access to cast a vote is granted. If the pattern does not match any record, or in case of repetition, access is denied or the vote is rejected, and the police station nearest the polling booth is informed of the identity of the imposter. All the voting machines are connected in a network through which data is transferred to the main host; the result is instantaneous, and counting is done at the main host itself. The overall cost of conducting elections is reduced, as is the maintenance cost of the systems.
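The poll-day check described above (match against the pre-poll database, reject repetition) can be sketched as follows. Real fingerprint matching is fuzzy template comparison, not exact hashing; the hash-based lookup, the template bytes, and the return strings here are all illustrative assumptions:

```python
# Illustrative sketch of the voting check: a voter's fingerprint template
# (a bytes object standing in for real minutiae data) is hashed and looked
# up in the pre-poll database; a second attempt by the same voter is
# rejected as repetition, and an unknown template is denied.
import hashlib

registered = {hashlib.sha256(t).hexdigest()
              for t in (b"template-voter-1", b"template-voter-2")}
already_voted = set()

def can_vote(template: bytes) -> str:
    h = hashlib.sha256(template).hexdigest()
    if h not in registered:
        return "denied: no match"        # possible imposter; alert nearby police
    if h in already_voted:
        return "denied: repetition"
    already_voted.add(h)
    return "granted"

print(can_vote(b"template-voter-1"))  # granted
print(can_vote(b"template-voter-1"))  # denied: repetition
print(can_vote(b"unknown"))           # denied: no match
```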
This paper presents an interactive model-based system for managing the production and control processes in a spinning mill using an embedded system and the Internet of Things. The system uses various sensors to measure and store the parameters from which the production rate is calculated. Apart from a comprehensive presentation of the set of modules the system is composed of, together with their interrelationships, these characteristics are analyzed and their impact on the production control system is explained. The system also covers two control processes, an air-cooler controller and a moisture-mixer sprayer controller, which are currently operated manually and are here automated. By automating them, the quality of the yarn produced in such industries can be controlled effectively. The system's attributes are presented with the aid of data-structure diagrams, the complete algorithm for the Arduino module is provided in algorithmic form, and a survey of all such systems is presented.
Most often in our daily life we have to carry a lot of cards: credit cards, debit cards, and special cards for toll systems, ERP, parking, and personal identification. Smart-card implementations can currently be seen around the world, but they are not unified; each developer uses different programming standards and data structures, so a smart card typically serves its user only within one university campus or organization. To make multiple applications accessible with a single card for every individual, we plan to use RFID technology, which is cost effective. Because RFID technology is used in the proposed concept, the programming standards and data structures are unified. Unlike a smart card, the RFID card can be used by every individual to access different applications. Thus a person need not carry a number of cards; a single card can serve different purposes.
Nowadays people tend to seek health-related knowledge or information on the internet through online healthcare services. The basic aim of this system is to bridge the vocabulary gap between health providers and patients by providing instant replies to the questions patients post. Automatically generated content is chosen for healthcare services instead of traditional community-generated systems because it is reliable, compatible, and provides instant replies. This paper proposes a scheme to code medical records using local mining and global learning approaches. Local mining codes a medical record by extracting the medical concepts from the individual record and mapping them to terminologies based on external authenticated vocabularies; it establishes a tri-stage framework to accomplish this task. Global learning learns missing key concepts and propagates precise terminologies among the underlying connected records over a large collection.
— In India, many lives are affected every day because patients are not operated on promptly and properly, and real-time parameter values are not measured efficiently in clinics or hospitals. Hospitals sometimes find it difficult to check patients' conditions frequently, and continuous monitoring of ICU patients is not always possible. Our system is designed to deal with these situations: it is used in hospitals for measuring and monitoring various parameters such as temperature, ECG, and heart beat. The results can be recorded using a Raspberry Pi and displayed on an LCD, and they can also be sent to a server using a GSM module, where doctors can log in to a website and view them.
Image binarization is the procedure of separating pixel values into two parts: black as foreground and white as background. In binarization, a document image is converted into a binary image using thresholding techniques, and many binarization algorithms have been proposed for different types of degraded document images. The main objective of image enhancement is to modify the attributes of an image to make it more suitable for a given task. Binarization techniques are used to remove noise and improve the quality of the document, and thresholding is one such technique used for this purpose. Thresholding is further divided into global and local thresholding techniques. For documents with uniform contrast between background and foreground, global thresholding has been found to be the best technique; local thresholding is an approach for situations in which single-value thresholding does not yield a proper result. To overcome this, a hybrid approach is introduced that combines local thresholding with Otsu's thresholding.
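Otsu's global threshold, the method the hybrid approach builds on, picks the gray level that maximises the between-class variance of the histogram. A minimal pure-Python sketch (the synthetic two-mode "image" at the bottom is an assumption for demonstration):

```python
# Minimal Otsu's global threshold: choose t maximising the between-class
# variance w_b * w_f * (m_b - m_f)^2 of the grayscale histogram.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w_b, sum_b = 0, -1.0, 0, 0.0
    for t in range(levels):
        w_b += hist[t]                 # background pixel count up to t
        if w_b == 0:
            continue
        w_f = total - w_b              # foreground pixel count
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b              # background mean
        m_f = (sum_all - sum_b) / w_f  # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# A bimodal "document": dark ink pixels near 30, light paper near 220.
img = [30] * 40 + [35] * 10 + [215] * 40 + [220] * 10
t = otsu_threshold(img)
print(30 < t < 215)  # True: the threshold falls between the two modes
```

In the hybrid scheme described above, a local method would then refine this global split in regions where a single global value fails.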
[1][2][3]Department of Information Technology [1][2][3 Abstract: Cloud computing promises to significantly change the way we use computers and access and store our personal and business information. Because of these new computing and communication paradigm there arise data security challenges. Even though existing techniques use security mechanisms, data theft attacks prevention fails. To overcome this we can use decoy technology to secure data stored in cloud. Although, Fog Computing is defined as the extension of the Cloud Computing paradigm, its distinctive characteristics in the location sensitivity, wireless connectivity, and geographical accessibility create new security and forensics issues and challenges which have not been well studied in Cloud security and Cloud forensics. We monitor data access in the cloud and detect abnormal data access patterns. When unauthorized access is suspected and then verified using challenge questions, we launch a disinformation attack by returning large amounts of decoy information to the attacker. This protects against the misuse of the user's real data. Experiments conducted in a local file setting provide evidence that this approach may provide unprecedented levels of user data security in a Cloud environment.
— Material management is concerned with planning, controlling, and organizing the flow of material from availability to requirement. Mapping excess material to an appropriate location while considering multiple criteria is an administrative decision-making task. Material utilization is a multi-criteria decision-making problem consisting of several conflicting criteria that must be verified before any decision is taken. This paper proposes a systematic methodology based on the TOPSIS and AHP methods and discusses important mechanisms that guide the decision maker in material mobilization and material utilization. The combination of TOPSIS and AHP provides a faster, more reliable, and more convenient way for the decision maker to evaluate the most important criteria, giving the decision maker a good tool with which material management can be achieved easily.
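The TOPSIS step of the methodology can be sketched compactly: alternatives are ranked by closeness to an ideal solution. The weights and benefit/cost labelling are assumed inputs here (in the paper they would come from the AHP stage), and the sample decision matrix is invented for illustration:

```python
# Compact TOPSIS sketch: normalise, weight, find ideal best/worst, and score
# each alternative by relative closeness d_worst / (d_best + d_worst).
import math

def topsis(matrix, weights, benefit):
    # matrix[i][j]: score of alternative i on criterion j
    n_crit = len(weights)
    # 1. vector-normalise each column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    # 2. ideal best/worst per criterion (max for benefit, min for cost)
    best = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    # 3. closeness to the ideal solution
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Two criteria for moving excess material: transport cost (lower is better)
# and site suitability (higher is better). Data and weights are invented.
scores = topsis([[250, 7], [200, 9], [300, 8]],
                weights=[0.5, 0.5], benefit=[False, True])
print(max(range(3), key=lambda i: scores[i]))  # alternative 1 dominates
```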
Bluetooth technology unplugs our digital peripherals; in short, it is a wireless replacement for many of the cables we currently use to transmit voice and data signals. Bluetooth radio modules use Gaussian Frequency Shift Keying (GFSK) for modulation. Bluetooth employs an FHSS spreading technique, changing frequencies at a rate of 1600 times per second, 160 times the rate at which a wireless LAN changes frequencies. This paper focuses on the attacks and security issues in Bluetooth technology.
In the cloud, we describe ways to protect multimedia content from redistribution. The web has billions of documents, including video, audio, and images, but no central management system, so duplication of content is common; it is said that each and every document has a duplicate copy. This is most prevalent for videos, which are stored in multiple formats, versions, and sizes, and which are modified and republished using video editing tools without the content creators' knowledge. This can lead to security problems, impersonation of the owners' identity, and loss of revenue to content creators, and it also occupies enormous space on the web. Duplication is common in cloud storage too, involving both public and private clouds, though the private cloud is considered more secure than the public cloud. To avoid this situation, several techniques for avoiding duplication of content have been used, focused mainly on 3D video content.
Steganography is used to hide secret information such as passwords, text, pictures, and audio behind an original cover file. The original message is converted into ciphertext using a secret key and then hidden in the LSBs of the original image. The current work describes crypto-steganography for audio and video, a combination of image, audio, and video steganography that makes use of a forensics technique as a tool for authentication. The main aim is to hide secret data behind the image and audio of a video file. Since video consists of many still image frames plus audio, any frame can be selected for hiding the secret information. A suitable algorithm such as AES is used for security, and image processing is used for authentication, so data security can be increased. For data embedding, the 4LSB algorithm is used.
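The 4LSB embedding step named above can be sketched over raw bytes: each cover byte (for example, one colour channel of a frame pixel) gives up its four least-significant bits to carry one nibble of the secret, so two cover bytes carry one secret byte. The cover data here is an invented stand-in for real frame pixels:

```python
# 4LSB steganography sketch: hide each secret byte in the low nibbles of
# two consecutive cover bytes, leaving the high nibbles (and thus most of
# the visual appearance) untouched.
def embed_4lsb(cover: bytes, secret: bytes) -> bytes:
    assert len(cover) >= 2 * len(secret), "cover too small"
    out = bytearray(cover)
    for i, byte in enumerate(secret):
        out[2 * i]     = (out[2 * i]     & 0xF0) | (byte >> 4)    # high nibble
        out[2 * i + 1] = (out[2 * i + 1] & 0xF0) | (byte & 0x0F)  # low nibble
    return bytes(out)

def extract_4lsb(stego: bytes, n: int) -> bytes:
    return bytes(((stego[2 * i] & 0x0F) << 4) | (stego[2 * i + 1] & 0x0F)
                 for i in range(n))

cover = bytes(range(100, 120))   # stand-in for frame pixel data
stego = embed_4lsb(cover, b"hi")
print(extract_4lsb(stego, 2))    # b'hi'
```

In the full scheme described in the abstract, the secret would first be AES-encrypted, and the embedding would be spread across selected video frames and audio samples rather than a single byte string.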
In the modern age of the Internet, the use of social media is growing rapidly, and organizing, interpreting, and supervising user-generated content (UGC) has become one of the major concerns. Posting new topics on the internet is not a big task, but searching for topics on the web within a vast volume of UGC is one of the major challenges. In this paper we deal with web search result clustering to improve the search results returned by search engines. Several algorithms already exist, such as Lingo and k-means. We work on a descriptive-centric algorithm for web search result clustering called the IFCWR algorithm: a maximum number of clusters is randomly selected using Forgy's strategy, and clusters are iteratively merged until the most relevant results are obtained. Every merge operation executes the Fuzzy C-means algorithm for web search result clustering; clusters are merged based on cosine similarity, creating a new (current) solution with the new configuration of centroids. In this paper we investigate the Fuzzy C-means algorithm, perform pre-processing of the search query, and try to deliver the best solution.
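The cosine-similarity merge test at the heart of the clustering loop can be shown in a few lines over term-weight vectors. The merge threshold of 0.8 is an illustrative assumption, not a value from the paper:

```python
# Cosine similarity between cluster centroids (term-weight vectors), and the
# merge test built on it. The threshold is an assumed parameter.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def should_merge(centroid_a, centroid_b, threshold=0.8):
    return cosine_similarity(centroid_a, centroid_b) >= threshold

print(round(cosine_similarity([1, 0, 1], [1, 0, 1]), 3))  # 1.0 (identical)
print(should_merge([1, 2, 0], [0, 0, 3]))                 # False (orthogonal)
```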
Pilot contamination places a fundamental limit on the performance of massive multiple-input multiple-output (MIMO) antenna systems owing to the failure of correct channel estimation. To address this drawback, we propose estimating only the channel parameters of the desired links in a target cell, not those of the interference links from adjacent cells. The desired estimation is nonetheless an underdetermined system. In this paper, we show that if the propagation properties of massive MIMO systems are exploited, it is possible to obtain an accurate estimate of the channel parameters. Our strategy is inspired by the observation that, in a cellular network, the channel from user equipment to a base station consists of only a few clustered paths in space. With a very large antenna array, signals can be observed in extremely sharp regions in space. As a result, if the signals are observed in the beam domain (using the Fourier transform), the channel is approximately sparse: the channel matrix contains only a small fraction of large components, and the other components are near zero. This observation then permits channel estimation based on sparse Bayesian learning, where the sparse channel components can be reconstructed using a small number of observations. Results illustrate that, compared with conventional estimators, the proposed approach achieves far better performance in terms of channel-estimation accuracy and achievable rates in the presence of pilot contamination.
For a broad topic or an ambiguous query, different users may have different search goals when they submit the query to a search engine. Inferring and analyzing user search goals can be very useful in improving search-engine relevance and user experience. In this paper, we propose a novel approach to infer user search goals by analyzing search-engine query logs. First, we propose a framework to discover different user search goals for a query by clustering the proposed feedback sessions; feedback sessions are constructed from user click-through logs (i.e., user responses) and can efficiently reflect users' information needs. Second, we propose a novel approach that generates pseudo-documents to better represent the feedback sessions for clustering. Finally, we propose a new criterion, Classified Average Precision (CAP), to evaluate the performance of inferring user search goals. Experimental results on user click-through logs from a commercial search engine demonstrate the effectiveness of the proposed methods.
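The paper's CAP criterion builds on ordinary average precision; the base quantity is sketched here (the classification-specific extension the paper adds is not reproduced). `relevant` marks which ranked results match the inferred search goal, and the example ranking is invented:

```python
# Average precision over a ranked result list: the mean of precision@k taken
# at each rank k where the result is relevant. This is the standard quantity
# that Classified Average Precision extends.
def average_precision(relevant):
    hits, score = 0, 0.0
    for rank, rel in enumerate(relevant, start=1):
        if rel:
            hits += 1
            score += hits / rank      # precision at this relevant rank
    return score / hits if hits else 0.0

print(average_precision([True, False, True, False]))  # (1/1 + 2/3) / 2
```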
Data mining is the task of discovering useful and interesting patterns in huge amounts of data, where the data can be stored in databases, data warehouses, and other information repositories. Data mining integrates techniques from various disciplines such as data visualization, database technology, information retrieval, high-performance computing, machine learning, and pattern recognition. The classification of multi-dimensional data is one of the major challenges in data mining and data warehousing; in a classification problem, each object is defined by its attribute values in a multidimensional space. Some existing systems assume the data analyst can identify the set of candidate data cubes for exploratory analysis based on domain knowledge. Unfortunately, there are conditions under which such assumptions are not valid, including high-dimensional databases for which pre-calculating the dimensions and cubes is difficult or impossible. The proposed system automatically finds the dimensions and cubes that hold informative and interesting data. In high-dimensional datasets, the data-analysis procedures need to be integrated with each other; an information-theoretic measure, entropy, is used to filter irrelevant data from the dataset in order to formulate a more compact, manageable, and useful schema.
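The entropy-based filtering step can be sketched directly: attributes whose value distribution carries little information (low Shannon entropy) are dropped before building the schema. The entropy cut-off and the sample records are illustrative assumptions:

```python
# Entropy-based attribute filtering: compute the Shannon entropy of each
# attribute's value distribution and keep only attributes above a cut-off.
import math
from collections import Counter

def shannon_entropy(values):
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def filter_attributes(records, min_entropy=0.5):
    # records: list of dicts mapping attribute name -> value
    names = records[0].keys()
    return [a for a in names
            if shannon_entropy([r[a] for r in records]) >= min_entropy]

rows = [{"region": "N", "flag": "y"}, {"region": "S", "flag": "y"},
        {"region": "E", "flag": "y"}, {"region": "W", "flag": "y"}]
print(filter_attributes(rows))  # ['region']: 'flag' is constant (entropy 0)
```

A constant attribute like `flag` contributes nothing to any cube and is filtered out, while `region`, with four equally likely values, has entropy log2(4) = 2 bits and is kept.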
The mobile phone has become an important device for communication in our present life, covering everything from formal and informal talk to the sharing of confidential and secure information, from personal communication to large business deals. There is therefore a need to guarantee security for applications that send confidential information. To meet mobile users' demands, many cryptographic protocols are chosen on the basis of confidentiality, integrity, and authentication; public-key cryptography is a good solution that fulfills these requirements. Applications that use public-key cryptography measure the efficiency of a protocol by its computing power and key size. This work focuses on the performance attributes of the ECC algorithm, modified in its point addition and multiplication after generating a prime number, which leads to a few changes in the algorithm's encryption and decryption parameters to improve performance with limited power consumption. The proposed algorithm, studied and compared with the conventional one, is highly secure for mobile communication while drawing minimal power and current. The protocols' energy efficiency is measured from the current and power consumed over time; experiments show that the proposed protocol consumes less power and current than the conventional ECC algorithm.
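The point addition and scalar multiplication the work modifies are the core ECC group operations. A minimal sketch over a tiny prime field follows; the toy curve parameters are assumptions for demonstration only, and a real mobile protocol would use a standardised curve such as secp256r1:

```python
# ECC group operations on y^2 = x^3 + a*x + b over GF(p), with toy
# parameters chosen for illustration (not the paper's curve).
p, a, b = 97, 2, 3             # toy curve parameters (assumed)
O = None                       # point at infinity (group identity)

def point_add(P, Q):
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O               # P + (-P) = infinity
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def scalar_mult(k, P):         # double-and-add, as used for key generation
    R = O
    while k:
        if k & 1:
            R = point_add(R, P)
        P, k = point_add(P, P), k >> 1
    return R

G = (3, 6)                     # on the curve: 6^2 = 3^3 + 2*3 + 3 = 36 (mod 97)
print(scalar_mult(2, G) == point_add(G, G))  # True
```

`pow(x, -1, p)` (Python 3.8+) computes the modular inverse needed for the slope; this is where most of the cost of point addition lies, which is why the paper's modifications target these operations.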
— Cloud computing and the Internet of Things are tightly coupled with each other. The rapid growth of the Internet of Things (IoT) and the development of supporting technologies have created a widespread connection of "things", which results in the production of large amounts of data that need to be stored, processed, and accessed. Cloud computing is a paradigm for big-data storage and analytics; the IoT is exciting on its own, but the real innovation will come from combining it with cloud computing, which can enable sensing services and powerful processing of sensing-data streams. More things are being connected to address a growing range of business needs; in fact, by the year 2020, more than 50 billion things are expected to connect to the Internet, seven times the human population. Insufficient security will be a critical barrier to large-scale deployment of IoT systems and to broad customer adoption of IoT applications using the cloud; simply extending existing IT security architectures to the IoT and the cloud will not be sufficient. The IoT world requires new security approaches, creating fertile ground for innovative thinking and solutions. This paper discusses key issues believed to have long-term significance for IoT and cloud-computing security and privacy, based on documented problems and exhibited weaknesses.
A sensor is a device that converts a physical quantity to be measured into a signal that can be interpreted, displayed, stored, or used to control other equipment; the signal formed by the sensor corresponds to the quantity being measured. Sensors are used to measure a particular characteristic of an object or device. [1] This paper provides an overview of what sensors are, the different fields where they are used, how they are used in day-to-day life, their advantages and disadvantages, and the types of sensors. The paper also gives an idea of how sensors play an important role in cyber security.
There is an up-and-coming topic in the field of computer science and technology that is attracting a lot of publications these days, and that is Big Data. The term Big Data refers to a collection of large and complex data sets that are very hard to process with database management tools or data processing applications. The volumes are in the range of exabytes and above. Most companies and government agencies are dealing with it in the current fast-moving technology environment. Stand-alone applications as well as today's newer web-based processes are generating large amounts of data. This raises some issues that have to be considered when dealing with big data, among them storage, management and processing. This data also creates new challenges which push framework developers to come up with solutions: they have introduced different frameworks for the storage, management and processing of big data that are scalable and portable with fault-tolerance capabilities, along with newer technologies for the analysis of such large amounts of complex data. These challenges include privacy and security, scale, and heterogeneity, to mention a few. This paper reviews the issues and challenges of Big Data from the data analytics and storage perspectives. We also briefly discuss the frameworks and databases that can be used to tackle the challenges facing the Big Data world.
In the present scenario, some people need to be online all the time, but it is not possible to have wired media available everywhere. In such cases, unbounded (wireless) media are the best solution. Many types of unbounded transmission are available, such as radio transmission, microwaves, infrared and laser/light-wave transmission. In this paper we focus on the uses, requirements, benefits and types of unbounded communication.
Cloud computing is the long-dreamed vision of computing as a utility, where users can remotely store their data in the cloud so as to enjoy on-demand, high-quality applications and services from a shared pool of configurable computing resources. By outsourcing data, users are relieved from the burden of local data storage and maintenance. Thus, enabling public auditability for cloud data storage security is of critical importance so that users can resort to an external audit party to check the integrity of outsourced data when needed. To securely introduce an effective third-party auditor (TPA), two fundamental requirements have to be met: 1) the TPA should be able to efficiently audit the cloud data storage without demanding a local copy of the data; and 2) the auditing should introduce no additional on-line burden to the cloud user. Specifically, our contribution in this work can be summarized in the following aspects: 1) we motivate the public auditing system of data storage security in cloud computing and provide a homomorphic authenticable ring signature mechanism for public auditing of shared data in the cloud, i.e., our scheme enables an external auditor to audit a user's outsourced data in the cloud without learning the data content; 2) to the best of our knowledge, our scheme is the first to support scalable and efficient public auditing in cloud computing; in particular, it achieves batch auditing, where multiple delegated auditing tasks from different users can be performed simultaneously by the TPA.
Natural language processing (NLP) is a field of computer science, artificial intelligence and linguistics which mainly concentrates on the interactions between human (natural) languages and the computer. One of the main challenges in NLP is ambiguity. Every language is ambiguous in nature, in the sense that one word can have multiple meanings and multiple words can have the same meaning. Ambiguities are generally categorized into two groups: lexical and structural. Lexical ambiguity arises where there are two or more possible meanings for a single word. Structural ambiguity appears when a given sentence can be interpreted in more than one way due to an ambiguous sentence structure. Word Sense Disambiguation (WSD) is defined as the task of finding the correct sense of a word in a specific context. This paper presents our preliminary work towards building a WSD system by constructing a corpus. We include a detailed analysis of the factors that affect the WSD algorithm, propose a modified algorithm based on the random walk algorithm, and compare the working of each of these algorithms.
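A common random-walk formulation of WSD runs a PageRank-style walk over a graph whose nodes are candidate senses of the words in a sentence. The sketch below is illustrative only: the sense names, edge weights and graph are made-up assumptions, not the corpus or the modified algorithm of the paper.

```python
import numpy as np

# Toy sense graph (hypothetical): nodes are candidate senses, and W[i, j]
# is the similarity-derived weight of the edge from sense j to sense i.
senses = ["bank_river", "bank_finance", "loan", "shore"]
W = np.array([
    [0.0, 0.0, 0.0, 1.0],   # "shore" supports the river sense of "bank"
    [0.0, 0.0, 1.0, 0.0],   # "loan" supports the finance sense of "bank"
    [0.4, 0.7, 0.0, 0.0],
    [0.6, 0.3, 0.0, 0.0],
])

def random_walk_scores(W, damping=0.85, iters=100):
    """PageRank-style power iteration over the sense graph."""
    n = W.shape[0]
    P = W / W.sum(axis=0, keepdims=True)   # column-stochastic transitions
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - damping) / n + damping * (P @ r)
    return r

scores = random_walk_scores(W)
# The sense of "bank" with the higher stationary score is chosen.
chosen = senses[int(np.argmax(scores[:2]))]
```

Because the context word "loan" feeds more weight into the finance sense, the walk settles on `bank_finance` for this toy graph.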
In wireless sensor networks, packet size optimization is an important issue under energy constraints. We provide a dynamic packet-length optimization scheme to achieve the best performance in terms of channel utilization and energy efficiency; the dynamic packet-length adaptation is applied to 802.11 wireless systems. We increase the data delivery ratio and system throughput, and decrease network congestion and end-to-end delay. In the system, the packet delivery ratio stays high, above 95%, and the link estimation error stays within 10% for 95% of links. The system provides an accurate link estimation method which achieves better performance than previous works.
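As a rough illustration of the underlying trade-off (not the paper's scheme): a longer payload amortizes the per-packet header, but raises the chance the frame is corrupted. With header size H bits and bit error rate p, the throughput efficiency (L/(L+H))·(1−p)^(L+H) has an interior optimum that can be found numerically. The header size, bit error rate and search range below are assumed values.

```python
def throughput_efficiency(payload_bits, header_bits, ber):
    """Share of airtime that carries payload and arrives intact:
    payload fraction times the probability the whole frame is error-free."""
    total = payload_bits + header_bits
    return (payload_bits / total) * (1.0 - ber) ** total

def best_payload_length(header_bits=64, ber=1e-4, max_bits=8192):
    # Exhaustive search over byte-aligned payload sizes (illustrative only).
    return max(range(8, max_bits + 1, 8),
               key=lambda L: throughput_efficiency(L, header_bits, ber))

L_opt = best_payload_length()   # interior optimum: neither tiny nor maximal
```

A dynamic scheme would re-run this kind of optimization as the estimated bit error rate changes, which is why accurate link estimation matters.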
Recent years have seen unprecedented growth in the popularity of social network systems. Social networks are online applications which allow their users to connect by various link types. Online social networks such as Facebook and LinkedIn are used by more and more people, making digital communication technologies sharp tools for extending people's social circles. They have already become an important, integral part of our daily lives, enabling us to contact our friends and families on time. As social networks have developed rapidly, recent research has begun to explore them to understand their structure, advertising and marketing, and data mining of the online population, which represents 1.2 billion users around the world. More and more social network data has been made publicly available and analyzed in one way or another, but some of the information revealed is meant to be private; social network data has thus led to the risk of leakage of individuals' confidential information. Privacy concerns, however, can hold back these efforts. Privacy-preserving publishing of social network data is therefore becoming a more and more important concern. This calls for privacy-preserving social network data mining: the discovery of information and relationships from social network data without violating privacy. Privacy in online social network data has been of utmost concern, yet research in this field is still in its early years. Privacy preservation is a significant research issue in social networking, since the more personalized information is shared with the public, the easier it becomes to violate the privacy of a target user. We argue that the different privacy problems are entangled and that research on privacy in OSNs would benefit from a more holistic approach.
In this paper, after an overview of different research subareas of social network sites (SNSs), we focus on the subarea of privacy protection in social networks and present a brief yet systematic review of the existing anonymization techniques for privacy-preserving publishing of social network data.
With the introduction of smart phones and mobile devices, the rate of technology usage has become widespread. It is very common to see someone using a mobile device on the road, in the bus, at work, at school or even at home, regardless of whether there is a fully functional computer nearby. Significantly, the usage of smart phones is not restricted to basic communication; they have been used as technological tools in our daily lives for years. We can easily find such applications in many different fields, e.g. image processing, audio tuning, video editing and voice recognition. With the help of these smart applications we are able to interact with our environment much faster and consequently make life easier. Apart from the mobile phone business, sharing and staying connected to our environment has become very popular. A person who has a mobile phone with the latest technology wants to communicate with friends, share feelings instantly and, of course, meet new people who understand him well.
In today's world, everyone commutes from one place to another for specific reasons, e.g. school, job, excursions, etc. Carpooling is an Android-based application that helps people share car journeys so that more than one person travels in the same car. By adopting this application, users will help reduce the high amount of CO2 emitted by cars. Carpooling reduces each person's travel costs, such as fuel costs, parking space and tolls, as well as the stress of driving. Sharing journeys in this way is a more environmentally friendly and sustainable way to travel, and carpooling is especially suitable during periods of high fuel prices and high pollution. The application provides security and an easy way to find a car for a journey; because it runs on mobile devices, it is more usable and makes the service more dynamic.
Recommendation techniques are very important in the fields of e-commerce and other web-based services. One of the main difficulties is dynamically providing high-quality recommendations on sparse data. In this paper, a novel dynamic personalized recommendation algorithm is proposed in which information contained in both ratings and profile contents is utilized by exploring latent relations between ratings; a set of dynamic features is designed to describe user preferences in multiple phases, and finally a recommendation is made by adaptively weighting the features. Recommender systems are changing from novelties used by a few e-commerce sites to serious business tools that are reshaping the world of e-commerce. Many of the largest commerce websites already use recommender systems to help their customers find products to purchase. A recommender system learns from a customer and recommends the products she will find most valuable among those available. In this paper we also explain how recommender systems help e-commerce sites increase sales.
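The adaptive-weighting step can be pictured with a minimal sketch: scores from a rating-based feature family and a content-based one are blended with weights proportional to each family's recent accuracy. All numbers below are made-up illustrations, not the paper's features or weighting rule.

```python
import numpy as np

# Hypothetical per-user scores for three items from two feature families:
# one derived from ratings, one from profile content (assumed values).
rating_scores = np.array([0.9, 0.2, 0.5])
content_scores = np.array([0.4, 0.8, 0.6])

# Adaptive weights: proportional to each family's recent predictive
# accuracy on held-out interactions (numbers are illustrative).
accuracy = np.array([0.75, 0.55])
weights = accuracy / accuracy.sum()       # normalize so weights sum to 1

blended = weights[0] * rating_scores + weights[1] * content_scores
recommended = int(np.argmax(blended))     # index of the item to recommend
```

On sparse data the rating family's accuracy would drop, shifting weight toward the content features, which is the spirit of weighting adaptively across phases.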
This paper describes the design and development of a location-based mobile tourism application used to find tourist places. The application runs on the Android platform, and with it people can find exact locations. The frontend of the app is Android; the backend is a XAMPP server running PHP. The app helps anyone find and visit any location in less time: through it, users can plan visits to places without needing a guide, and it recommends places worth visiting. An emergency feature is also provided; when users think they are in danger, they can contact others.
Clustering can be defined as a method of unsupervised classification in which data points are grouped into clusters based on their similarity. K-means is an efficient and widely used algorithm for partition-based clustering, and it is effective in producing clusters for many real-time practical applications. In this work a modified parallel K-means algorithm is proposed that overcomes the problems of a fixed number of input clusters and serial execution. The original K-means algorithm accepts a fixed number of clusters as input, but in real-world scenarios it is very difficult to fix the number of clusters in advance in order to get optimal outcomes. Much research has already been done on improving K-means clustering, but further investigation is always required to answer important questions such as 'is it possible to find the optimal number of clusters dynamically while ignoring empty clusters?' or 'does the parallel execution of a clustering algorithm really improve its performance in terms of speedup?'. This research presents an improved K-means algorithm which calculates the number of clusters dynamically using the Dunn's-index approach and executes in parallel using Microsoft's Task Parallel Library. The original K-means and the improved parallel modified K-means were run on two-dimensional raw data with different numbers of records. The results make clear that the improved K-means is better in all scenarios, whether the number of clusters is increased or the number of records in the raw data is changed. For the same number of input clusters and different data sets, the performance of the modified parallel K-means is 18 to 46 percent better than the original K-means in terms of execution time and speedup.
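The paper's implementation targets .NET's Task Parallel Library; a minimal sketch of the same idea in Python (synthetic data and thread-based parallelism are assumptions of the sketch, not the paper's setup) runs K-means for several candidate cluster counts in parallel and keeps the count with the best Dunn index:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)
# Synthetic stand-in for the paper's raw data: two well-separated 2-D blobs.
data = np.vstack([rng.normal(0, 0.3, (50, 2)),
                  rng.normal(5, 0.3, (50, 2))])

def kmeans(X, k, iters=50):
    # Deterministic init: k points spread across the data set.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):        # ignore empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def dunn_index(X, labels, centers):
    """Minimum inter-centroid distance over maximum intra-cluster radius."""
    ks = list(np.unique(labels))
    inter = min(np.linalg.norm(centers[a] - centers[b])
                for i, a in enumerate(ks) for b in ks[i + 1:])
    intra = max(np.linalg.norm(X[labels == j] - centers[j], axis=1).max()
                for j in ks)
    return inter / intra

# Evaluate candidate cluster counts in parallel and keep the best one.
with ThreadPoolExecutor() as pool:
    runs = list(pool.map(lambda k: (k, *kmeans(data, k)), range(2, 6)))
best_k = max(runs, key=lambda r: dunn_index(data, r[1], r[2]))[0]
```

On the two-blob data above, the Dunn index peaks at k = 2, so the cluster count is discovered rather than fixed in advance.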
When memory must be allocated for relatively huge datasets, it is possible to encounter an OutOfMemoryException, which signals that memory is not available for the allocation. This exception does not necessarily occur because the system's memory limit has been reached; it occurs when virtual address space is not available for that many bytes of data. The root cause is the current memory-stream implementation, which uses a single byte array as a backing store: when the data set is relatively large, the backing store requires more contiguous memory than is available in the virtual address space. If no contiguous region is available, the process hits an OutOfMemoryException even when enough memory is available overall, just not contiguously. In this research we propose an approach for dynamically deciding the best memory allocator for every application. The proposed solution does not require contiguous memory to store the data contained in the stream. Instead, it uses a dynamic list of small blocks as the backing store, allocated on demand as the stream is used. If there is no contiguous memory available for the stream, allocation can still be served from these small blocks of memory.
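The block-list idea can be sketched as below. The class name, block size and byte-at-a-time write are illustrative simplifications; the paper's implementation targets a different runtime.

```python
class ChunkedStream:
    """Stream backed by a list of small fixed-size blocks instead of one
    contiguous byte array, so growth never needs one huge allocation."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = []            # small blocks, allocated on demand
        self.length = 0

    def write(self, data: bytes):
        for byte in data:           # byte-at-a-time for clarity, not speed
            idx, offset = divmod(self.length, self.block_size)
            if idx == len(self.blocks):
                self.blocks.append(bytearray(self.block_size))
            self.blocks[idx][offset] = byte
            self.length += 1

    def read_all(self) -> bytes:
        out = bytearray()
        for i, block in enumerate(self.blocks):
            end = min(self.block_size, self.length - i * self.block_size)
            out += block[:end]
        return bytes(out)
```

Writing 11 bytes with a 4-byte block size allocates three small blocks rather than growing a single contiguous buffer, which is exactly what avoids the contiguity requirement.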
Mobile ad hoc networks (MANETs) are considered a new paradigm of infrastructure-less mobile wireless communication systems. Routing in MANETs is a challenging task due to unpredictable changes in the network topology: nodes can dynamically join and leave the network without any warning, all nodes depend on limited energy, and efficient energy utilization is one of the important criteria in a MANET. In this research work we propose the Dynamic Efficient Power Consumption Congestion Control Scheme (DEPCCCS) for congestion control and for improving the quality of service in mobile networks. Network connections are dynamic and are not maintained for a long time; nodes are often unaware of their energy status, and in some situations routing packets consume considerable energy. In this paper, DEPCCCS is incorporated into the routing protocol to reduce the cost of destination discovery by maintaining a record of the location of each node in the network with respect to the wireless base station: the central base station stores the locations of the mobile nodes in a position table. The proposed protocol dynamically calculates every node's energy status, current location and speed so as to minimize the energy consumption of mobile nodes, and the scheme addresses both congestion control and power consumption. Simulation results show that our proposed technique attains a better delivery ratio and throughput with less delay and energy consumption when compared with the existing technique.
Pipelining is a concept which improves the performance of a processor. A five-stage pipelined RISC processor has the stages instruction fetch, decode, execute, memory access and write-back. RISC has a simpler and faster instruction set architecture. The aim of this paper is to design the instruction fetch unit and the ALU, which are parts of the RISC processor architecture. The instruction fetch unit is designed to read the instructions present in memory; the ALU belongs to the execute stage of the pipeline and performs all computations, i.e. arithmetic and logical operations. Xilinx 8.1i is used to simulate the design, written in the VHDL language.
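The paper's design is written in VHDL; as a rough behavioral illustration of what such an ALU computes (the opcode encoding below is an assumption, not the paper's), a 32-bit ALU with a zero flag can be modeled as:

```python
MASK = 0xFFFFFFFF  # 32-bit datapath

def alu(op, a, b):
    """Behavioral model of a simple 32-bit ALU with a zero flag."""
    ops = {
        0b000: a + b,       # ADD
        0b001: a - b,       # SUB
        0b010: a & b,       # AND
        0b011: a | b,       # OR
        0b100: a ^ b,       # XOR
    }
    result = ops[op] & MASK          # wrap to the 32-bit word size
    return result, result == 0      # zero flag drives branch decisions
```

In hardware this corresponds to a combinational block selected by the opcode, with the zero flag fed to the branch logic of the pipeline.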
Multi-atlas based methods are commonly used in image segmentation. In multi-atlas based image segmentation, atlas selection and combination are considered the two key factors affecting performance. Recently, manifold learning based atlas selection methods have emerged as very promising. However, due to the complexity of structures in raw images, it is difficult to get accurate atlas selection results by measuring only the distance between raw images on the manifolds. Although the distance between the regions to be segmented across images can be readily obtained from the label images, it is infeasible to directly compute the distance between the test image (grey) and the label images (binary). Here we make a small attempt to solve this problem by proposing a label-image-constrained atlas selection method, which exploits the label images to constrain the manifold projection of raw images. Compared with related existing methods, experimental results on prostate segmentation showed that the selected atlases are closer to the target structure and more accurate segmentations were obtained using our proposed method. We present a multi-atlas-based framework for accurate, consistent and simultaneous segmentation of a group of target images. Multi-atlas-based segmentation algorithms concurrently consider complementary information from multiple atlases to produce optimal segmentation outcomes. When segmenting a group of target images, most current methods consider these images independently and disregard their correlation, thus producing inconsistent segmentations of the same structures across different target images.
With recent advances in web technology, many online shopping websites have emerged. Despite its advantages, however, online shopping presents certain drawbacks. One is that it may be difficult for a person to visualize how a given article would look if worn by that person, owing to the rich variation in body size and shape, hair and skin color, etc., in the human population. Testing the fit of clothes is highly important for both customer and trader. Our approach concentrates on how a selected garment fits the user's body and how it would appear on him or her in the real world. This is carried out by identifying critical points on the garment and the user's body dimensions using image processing techniques. In this paper, an application realizing a virtual dressing room was designed, implemented and tested.
Data security in the cloud is an important issue. Important data can be stored in the cloud, and the security of that data is then totally dependent on the cloud. The data might be exposed by a malicious third-party user because of the wireless connection between client and cloud when proper authentication and protection are missing. In this paper we identify the different security issues with the cloud: when data is stored in the cloud, the data should be properly managed, and the cloud has to provide proper security for it. We discuss the different types of issues with the cloud and also mention possible policies for taking care of those issues while examining the security provided by the cloud.
Rapidly advancing mobile communication technology and the decrease in costs make it possible to incorporate mobile technology into smart home systems. We propose a mobile and internet based smart home system that consists of a mobile phone with Android capabilities, an internet based application, and a home server. The home appliances are controlled by an Arduino which receives commands from the server computer, which in turn operates according to the commands received from the mobile application via the wireless network or the internet. In our proposed system the home server is built on Wi-Fi technology; it receives commands from the client, and the commands are processed via the Arduino, allowing a user to control and monitor any parameter related to the home using any Android-capable smart phone or via the internet. This paper presents an innovative low-cost design and implementation of automated control based on weather conditions, appliance control and home security, together with the design of an Android application that enables the smart phone to send commands and receive alerts through the server-based system.
Powering a nation means developing it into a powerful nation, and that depends on energy. Energy is the chief measure of all categories of work done by human beings and nature. Energy sources are of two kinds: renewable and non-renewable. Examples of renewable sources are solar, wind, hydropower, tidal hydropower and geothermal energy; these are generally sustainable and environmentally friendly. Non-renewable sources are extracted or created at considerable damage to the environment; examples are coal, petroleum, natural gas and nuclear power. Geothermal energy is an abundant source of energy available from the Earth's crust, and harnessing this resource offers an alternative to conventional energy resources. Geothermal energy is a clean energy resource: it not only produces electricity but also has many applications such as space heating, drying and industrial processes. In the USA and New Zealand the usage of geothermal energy is very high; in India the use of geothermal energy is at a nascent stage.
A timing channel is a communication channel in which information is encoded in the timing between events. Indeed, while a jammer has the power to disrupt the data contained in the attacked packets, timing information cannot be jammed: even on a jammed channel, timing information can still reach the receiver and achieve its purpose. Transmission over the wireless medium is threatened by the well-known jamming attack. Under this attack, because the nodes and the jammer have opposing interests, their interactions can be modeled by means of game theory. Accordingly, this paper models and analyzes, game-theoretically, the interplay between a jammer and nodes that use timing channels to achieve resilience to jamming attacks. Specifically, the Nash equilibrium is analyzed in terms of existence, uniqueness, and convergence under best-response dynamics. Moreover, the case in which the interacting nodes commit to their strategy and the jammer responds accordingly is modeled and evaluated as a Stackelberg game, considering both perfect and imperfect knowledge of the jammer's utility function. The impact of the network framework on system performance is also shown.
Any large unstructured data set with a size beyond the ability of software tools to manage and process within a tolerable elapsed time is rightly regarded as big data. Cloud computing is the delivery of on-demand computing resources, from applications to data centers, over the internet. Combining these two strong, reliable platforms helps in tackling demanding real-time problems and obtaining solutions to them. Cloud-embedded big data supports inexpensive, reliable storage and tools for analyzing structured, unstructured, semi-structured, streaming, click-stream and various other types of data. The existing system tends to be costlier because of cloud deployment costs, and it is not elastic in nature; the subjective nature of cloud delivery for incoming data streams pulls back the efficiency of the system. This paper aims to minimize the cost of cloud adoption by determining the cloud adoption factors from a net present value (NPV) computation, and it derives a mathematical expression for 'α' (the cloud adoption factor). It also addresses the issues affecting the performance of big data by implementing a subordinate virtual cloud mechanism to overcome the identified bottlenecks.
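The paper derives its adoption factor α from a net present value computation; its exact expression is not reproduced here, but the underlying NPV calculation looks like the following (the discount rate and cashflows are made-up figures):

```python
def npv(rate, cashflows):
    """Net present value of a cashflow series; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical figures: upfront cost followed by yearly savings, 10% rate.
cloud = npv(0.10, [-1000, 400, 400, 400, 400])       # pay-as-you-go option
on_prem = npv(0.10, [-2500, 700, 700, 700, 700])     # owned-hardware option
adopt_cloud = cloud > on_prem   # one plausible adoption signal
```

Comparing the two NPVs over the same horizon captures the trade-off between lower upfront cloud costs and higher long-run on-premise savings; the paper's α condenses such a comparison into a single factor.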
The watermark is embedded by modifying the third-level mid-frequency coefficients of the host image with multiple scaling factors (SFs). As many combinations of SFs are possible, it is difficult to obtain optimal solutions by trial and error. Hence, in order to achieve the highest possible transparency and robustness, optimization of the scaling factors is necessary. This work employs a Genetic Algorithm (GA) to obtain optimum SFs. A GA can search for multiple solutions simultaneously over a wide range, and an optimum solution can be obtained by combining the results appropriately. The aim of this work is to develop an optimal watermarking technique in the DWT domain for grey-scale images. In this paper, a robust and oblivious image watermarking algorithm using maximum wavelet coefficient modulation is proposed. Simulation results show that the performance of the proposed method is superior in terms of Peak Signal-to-Noise Ratio (PSNR) and Normalized Correlation Coefficient (NCC).
Valuable information can be hidden in images; however, little research has discussed data mining on them. Image retrieval means searching, browsing and retrieving images from image databases. There are two different methodologies for image retrieval: text-based image retrieval and content-based image retrieval. The former is obsolete. In the latter, many visual features such as texture, size, pixel intensities and frequencies, and image color are extracted. In query-by-example search, the extracted features are compared with stored ones. In this work an efficient method for extracting image features is considered, based on the intensity histogram of the gray-scale image. A general framework based on the decision tree is used for mining and processing image data. Pixel-wise image features are extracted and transformed into a database-like table, which allows various data mining algorithms to explore it. Finally, the resulting average gradient vectors are compared with a previously stored one-dimensional array of intensities to find similarities in image data.
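The intensity-histogram feature and a simple query-by-example comparison can be sketched as follows; the bin count and the L1 distance are illustrative choices, not necessarily those used in the paper.

```python
def intensity_histogram(pixels, bins=16, max_val=256):
    """Binned intensity histogram of a grayscale image given as a flat
    list of pixel values in [0, max_val)."""
    hist = [0] * bins
    width = max_val // bins
    for p in pixels:
        hist[min(p // width, bins - 1)] += 1
    return hist

def histogram_distance(h1, h2):
    """L1 distance between two histograms: a simple similarity measure
    for query-by-example retrieval (0 means identical histograms)."""
    return sum(abs(a - b) for a, b in zip(h1, h2))
```

A query image's histogram is compared against every stored histogram, and the images with the smallest distances are returned as matches.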
Matrix converters are frequency converters which, unlike conventional frequency converters, do not contain a direct-current link circuit with passive components. Thus, matrix converters may provide a solution for applications where large passive components are not allowed, or where a purely semiconductor-based solution gives an economically more efficient result than conventional frequency converters. The matrix converter (MC) is an alternative AC-AC power converter formed by connecting the input phases directly to the output phases through bidirectional switches, without using any DC link or energy-storing element; it is therefore called an all-silicon converter. Two topologies of the matrix converter are established: the linear topology and the indirect topology. This paper presents the topology of the Very Sparse Matrix Converter (VSMC) and focuses on Easy Commutation Space Vector Modulation (ECSVM) modelling applied to the VSMC.
This research investigated the impact of extensive penetration of photovoltaic (PV) generation on the power system. A model of PV generation suitable for studying its interactions with the power system was developed, and the dynamic response of a PV generation system to rapid changes in irradiance was investigated. An aggregated model of grid-connected PV generation was built and used for simulating the integration of PV generation on a large scale; a voltage control technique was also investigated by simulation. Distributed Generation (DG) units are being connected to the grid increasingly nowadays for several reasons. Most DG units are relatively small and connected to the distribution network, and a large part of them are connected to the grid via power electronic converters. The main task of the converters is to convert the power available from the prime source to the correct voltage and frequency of the grid. The general objective of this paper is to investigate how power electronic converters can support the grid and solve power quality problems. An IEEE 5-bus system is considered in this work to validate the power electronic converter using MATLAB/Simulink.
In the contemporary era, data is the most valuable resource, used in day-to-day life by everyone from individuals to large organizations. A database contains useful and confidential information, so it becomes necessary to protect it from any unauthorized access. An unauthorized user may try to perform unauthorized activities on sensitive data at unauthorized times. So, to prevent any misuse of the database, different security mechanisms are applied to make it secure. This paper focuses on the challenges and security mechanisms in databases.
" The MHD flow and convective heat transfer from water functionalized CNTs over a static/moving wedge in the presence of heat source/sing are studied numerically. Thermal conductivity and viscosity of both single and multiple wall carbon nanotubes (CNTs) within a base fluid (water) of similar volume are investigated to determine the impact of these properties on thermo fluid performance. The governing partial differential equations are converted into nonlinear, ordinary, and coupled differential equations and are solved using bvp4c Matlab solver. The effects of volume fraction of CNTs and magnetic and wedge parameters are investigated and presented graphically. The numerical results are compared with the published data and are found to be in good agreement.
Instant fuzzy search is an important developing technique with which users can find results character by character, for a better search experience. The results must be returned at high speed with good relevance scores, and good ranking functions must be used to select the top results. Many ranking functions consider the proximity of keywords, which ultimately yields a good relevance score. In this paper, proximity information is used to rank query results with good time and space complexity. Many previously proposed techniques incorporate proximity ranking into instant fuzzy search. Most of them first compute all results and then rank them according to some ranking function, but if the dataset is large, computing all results is very time consuming. Here an early-termination technique is used to minimize space and time complexity. In this paper, an incremental computation algorithm is used to overcome the drawbacks of previous systems and compute relevant results. Query logs, which are very useful for most query-suggestion systems, are also used, which reduces time complexity efficiently. Experimental results are reported to show the space and time complexity and the quality of the results.
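The semantics of fuzzy per-keystroke matching can be sketched in a brute-force way: match each query prefix against indexed terms within an edit-distance threshold, then rank and keep the top k. A real instant-search system uses trie traversal and early termination instead of scanning every term; this version only illustrates the matching and ranking behavior.

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def fuzzy_prefix_matches(query, terms, max_dist=1, k=3):
    """Terms whose prefix (of the query's length) lies within max_dist
    edits of the query, ranked by (distance, term), truncated to top k."""
    scored = []
    for t in terms:
        d = edit_distance(query, t[:len(query)])
        if d <= max_dist:
            scored.append((d, t))
    scored.sort()
    return [t for _, t in scored[:k]]
```

As the user types "a", "al", "alg", the function is re-run on each keystroke, which is what makes the search "instant".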
Cloud computing is an upcoming technology which offers various services, including infrastructure, platform and software services, delivered over the Internet. In the cloud, services are available quickly, and the cloud is in high demand in the market. Most organisations and users prefer cloud storage, which is located at a remote place, so the user has no direct control over the stored data. Cloud computing uses resources such as memory, storage and processors which are not physically present at the user's location; rather, they are located outside the premises and managed by a service provider, and the user accesses them via the Internet. The main focus of this paper is checking the data integrity of a file stored on remote cloud storage with low communication overhead. Security in terms of integrity is one of the most vital aspects of a cloud computing environment. In this paper, we focus on the cloud data security problem and also try to achieve security with minimal computational overhead, because all computation is done over the Internet. Different techniques for data integrity in cloud storage are discussed, and their performance is measured in terms of computational overhead.
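A minimal sketch of the integrity-checking idea: the client keeps only per-block SHA-256 digests, later challenges the server for a block, and recomputes the digest for comparison. Real remote-integrity schemes (e.g. proofs of retrievability) avoid downloading blocks at all; this simplified version, with an invented tiny block size, only shows the hash-comparison step.

```python
import hashlib

BLOCK = 4  # tiny block size, chosen only for this demo

def block_digests(data: bytes):
    """Client side: split the file into blocks and keep their digests."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def verify_block(stored_digests, index, block_from_server: bytes) -> bool:
    """Challenge: recompute the digest of the returned block and compare
    it with the locally stored one; a mismatch signals tampering."""
    return hashlib.sha256(block_from_server).hexdigest() == stored_digests[index]
```

The client's storage overhead is one digest per block, far smaller than the file itself, which is the "low communication/computation overhead" trade-off the abstract refers to.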
Intrusion detection has become an essential element of network administration due to the huge number of attacks that persistently threaten our computers. Traditional intrusion detection systems are limited and do not provide a complete solution to the problem. They look for potential malicious activity in network traffic, and they sometimes succeed in finding true security attacks and anomalies. However, in many cases they fail to detect malicious behaviors (false negatives), or they fire alarms when nothing is wrong in the network (false positives). In addition, they require thorough manual processing and human expert interference. Applying data mining (DM) techniques to network traffic data is a promising solution that helps develop better intrusion detection systems. Moreover, Network Behavior Analysis (NBA) is also an effective approach for intrusion detection. In this paper, we discuss DM and NBA approaches for network intrusion monitoring and suggest that a combination of both approaches has the potential to detect intrusions in networks more effectively.
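A toy network-behavior-analysis sketch of the baseline-then-deviation idea: learn a statistical baseline from historical traffic counts and flag windows that deviate by more than a z-score threshold. Real NBA and data-mining detectors use far richer features and models; this only illustrates the principle, with invented numbers.

```python
import statistics

def anomalous_windows(history, current, z_threshold=3.0):
    """Indices of time windows in `current` whose packet count deviates
    from the historical mean by more than z_threshold standard
    deviations (a crude anomaly detector)."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return [i for i, x in enumerate(current)
            if sigma > 0 and abs(x - mu) / sigma > z_threshold]
```

Tuning the threshold trades false negatives against false positives, the same tension the abstract describes for traditional detectors.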
Can WiFi signals be used for sensing purposes? The growing PHY-layer capabilities of WiFi have made it possible to reuse WiFi signals for both communication and sensing. Sensing via WiFi would enable remote sensing without wearable sensors, simultaneous perception and data transmission without extra communication infrastructure, and contactless sensing in a privacy-preserving mode. Due to the popularity of WiFi devices and the ubiquitous deployment of WiFi networks, WiFi-based sensing networks, if fully connected, would potentially rank as one of the world's largest wireless sensor networks. Yet the concept of wireless, sensorless sensing is not a simple combination of WiFi and radar: it seeks breakthroughs beyond dedicated radar systems and aims to balance low cost against high accuracy, to meet the rising demand for pervasive environment perception in everyday life. Despite increasing research interest, wireless sensing is still in its infancy. Through introductions to basic principles and working prototypes, we review the feasibility and limitations of wireless, sensorless and contactless sensing via WiFi. We envision this article as a brief primer on wireless sensing for interested readers, to explore this open and largely unexplored field and create next-generation wireless and mobile computing applications.
The aim of our project is to analyze the amount of power that can be harvested from the surrounding environment and processed to reach energy levels sufficient to charge low-power electronic circuits and devices. Radio-frequency energy harvesting holds a promising future for energizing low-power electronic devices in wireless communication circuits, and it can be extended to higher power requirements. This project presents an RF energy harvesting system that can harvest energy from the ambient surroundings in the downlink frequency range of the GSM-900 and GSM-1200 bands. Energy harvesting may be a key technique for overcoming the barriers that prevent the real-world deployment of wireless sensor networks (WSNs); in particular, solar energy harvesting has commonly been used to overcome this barrier. We develop an RF energy harvesting WSN circuit to show the effectiveness of RF energy harvesting for powering a WSN. The RF energy harvesting technique is effective in long-duration measurement applications that do not require high power consumption.
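A back-of-the-envelope estimate of the available RF power can be made with the Friis free-space equation. All numbers below (transmit EIRP, antenna gains, distance) are illustrative assumptions, not measurements from the project.

```python
import math

def friis_received_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, dist_m):
    """Received power in dBm via the Friis free-space equation:
    Pr = Pt + Gt + Gr - 20*log10(4*pi*d / lambda)."""
    wavelength = 3e8 / freq_hz
    path_loss_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - path_loss_db

def dbm_to_milliwatts(dbm):
    """Convert a dBm level to milliwatts."""
    return 10 ** (dbm / 10)

# Hypothetical GSM-900 downlink scenario: 43 dBm base-station EIRP,
# unity-gain antennas, 100 m range (~935 MHz carrier).
received = friis_received_power_dbm(43, 0, 0, 935e6, 100)
```

The result, roughly -29 dBm (about a microwatt), shows why harvested GSM power suits only very low-power loads with energy accumulation over time.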
Vibration control plays a very important role in the modern world, especially in the control of earthquakes and in aerospace engineering. With reference to this, research is being carried out in the exciting field of smart intelligent structures. Control of vibrations in smart intelligent structures for a single-input single-output case using the fast output sampling method is presented in this paper, with the modeling done in the Matlab-Simulink environment. The model is run and the simulation results are observed on the scopes. The simulation results show the effectiveness of the method presented in this paper and how the vibrations are suppressed in a shorter time.
Modern knowledge bases contain huge amounts of data with complex relations between attributes. Obtaining satisfactory results from such databases is a troublesome task, and traditional predefined query interfaces do not provide satisfactory results for them. The proposed system generates query interface forms with user participation: the user provides feedback by click-through, thereby capturing the user's preferences. The query form is adaptive, since it is dynamically refined until the user gets satisfactory results.
Data mining facilitates the discovery of unrevealed trends in huge datasets. The data warehouse is a key technology for complex data analysis, automatic knowledge extraction and decision making. A multi-dimensional database permits efficient and appropriate storage of data; the dimension tables hold all the different attributes and dimensions. Detecting hidden associations between items is limited in OLAP, and researchers have proposed many ideas to reduce these limitations. This paper presents an approach called association rule classification for multi-dimensional datasets. The proposed work detects hidden associations from OLAP and also categorizes the rule sets effectively.
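The two core measures behind association rules, support and confidence, can be sketched over a list of transactions. A full system (e.g. Apriori or FP-growth over an OLAP cube) prunes the search space; this brute-force version over item pairs, with invented example thresholds, only shows the measures themselves.

```python
from itertools import combinations

def support(transactions, itemset):
    """Fraction of transactions containing every item in itemset."""
    itemset = frozenset(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(transactions, min_support=0.5, min_conf=0.7):
    """All single-item rules {lhs} -> {rhs} meeting both thresholds.
    confidence(lhs -> rhs) = support({lhs, rhs}) / support({lhs})."""
    items = sorted(set().union(*transactions))
    out = []
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            s = support(transactions, {lhs, rhs})
            if s >= min_support and support(transactions, {lhs}) > 0:
                conf = s / support(transactions, {lhs})
                if conf >= min_conf:
                    out.append((lhs, rhs, round(s, 2), round(conf, 2)))
    return out
```

Each transaction is modeled as a frozen set of items; a rule is emitted only when both its support and its confidence clear the chosen thresholds.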
This paper describes a feature extraction method for an optical character recognition system for handwritten documents in Malayalam, a South Indian language. The scanned image is first passed through various preprocessing stages, with operations such as noise removal, binarization, thinning and cropping. After preprocessing, the projection profiles of each character are found, and the 1-D Discrete Cosine Transform (DCT) of the projection profiles is used as the feature. A multilayer artificial neural network (ANN) with a logsig activation function is used for classification. The promising feature of the work is the successful classification of 44 handwritten characters.
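The feature pipeline can be sketched as: reduce a binary character image (rows of 0/1 pixels) to its horizontal projection profile, then take the 1-D DCT-II of that profile as the feature vector fed to the ANN. The plain O(n^2) DCT below is a generic textbook form, not the paper's implementation.

```python
import math

def projection_profile(image):
    """Horizontal projection profile: foreground pixel count per row."""
    return [sum(row) for row in image]

def dct_1d(x):
    """Unnormalized 1-D DCT-II of a sequence (plain O(n^2) form):
    X[k] = sum_i x[i] * cos(pi * k * (2i + 1) / (2n))."""
    n = len(x)
    return [sum(x[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i in range(n))
            for k in range(n)]
```

The first few DCT coefficients summarize the profile's overall shape compactly, which is what makes them useful as a fixed-length feature vector for classification.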
The PCS network uses a two-tier database architecture to locate a mobile terminal (MT). This two-tier architecture has a Visitor Location Register (VLR) and a Home Location Register (HLR). The profile of an MT is permanently stored in the HLR, while the VLR is used as a cache memory. Each VLR may serve one or more registration areas (RAs). When an MT arrives in a new VLR, the profile of the newly joined MT is fetched from the HLR and stored in the VLR. In existing location management there is a single HLR, which may become a bottleneck during busy hours; this situation leads to call mis-routing. When a call is initiated, the current location of the called MT is fetched from the HLR. In the conventional location management and call delivery scheme, the HLR is always queried; when an MT merely moves, the location registration process is executed and the HLR is finally updated. This paper proposes an MT profile forwarding scheme to save the HLR from the bottleneck and from call misrouting. In the conventional scheme, even if both the called and calling MTs reside in the same VLR, the HLR is queried to obtain the current location of the called MT; this is known as the tromboning problem. The proposed scheme is efficient in both cases: location management and call delivery. In the proposed scheme, the entire coverage area of a PCS network is divided into zones, and each zone has its own HLR. When an MT moves from one HLR to another, a copy of its profile is made available to the new HLR, and the serving HLR stores this profile in its database on a temporary basis.
With the increase in smartphone users, much communication happens through messaging. All smartphones offer various applications (apps) which can be used to communicate with each other, but many of these apps send text data through the network in plain-text format. When such apps are used on a public Wi-Fi network, anybody is able to sniff incoming and outgoing messages. Data compression is a common requirement for most computerized applications. There are a number of data compression algorithms dedicated to compressing different data formats, and even for a single data type there are several algorithms which use different approaches. Mobile communication devices have become popular tools for gathering and disseminating information and data. This paper proposes an efficient data compression technique obtained by modifying Huffman coding. The aim of this paper is to compress data sent through Android-based mobiles in order to optimize bandwidth.
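For reference, baseline (unmodified) Huffman coding can be sketched as below; the paper's contribution is a modification of this scheme, which is not reproduced here.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Map each symbol in text to its prefix-free Huffman bit string."""
    heap = [(freq, i, sym) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol input
        return {heap[0][2]: "0"}
    tick = len(heap)                         # unique tiebreaker for the heap
    while len(heap) > 1:                     # merge the two rarest subtrees
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tick, (left, right)))
        tick += 1
    codes = {}
    def walk(node, prefix):                  # read codes off the tree
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(text, codes):
    """Concatenate the bit strings for each symbol of the text."""
    return "".join(codes[c] for c in text)
```

Frequent symbols get shorter codes, so messages dominated by a few characters compress well; the total encoded length for "aaabbc" is 9 bits versus 48 bits of 8-bit ASCII.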
Extraction of linear features from Remote Sensing Images (RSI) has found many applications, as in urban planning, disaster mitigation and environmental monitoring. Many previous studies in this field have appreciated the significance of statistical operators for extracting linear features. But in the RSI domain the task has a different significance, as it involves handling a large set of multiband data with complexities in the spectral, spatial and temporal domains. Most objects in nature are not easily discerned and extracted, as they are often contaminated or mixed with other objects that may influence the spectral character of the object. This effect may be smaller in an urban environment, which exhibits more or less uniform spectral behavior, whereas a natural setting may exhibit complex spectral behavior. The present study demonstrates such complexities in extracting linear features in different settings, urban and coastal, using first-order derivative gradient filters.
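A first-order derivative gradient filter can be sketched with the Sobel operator on a tiny grayscale grid: the gradient magnitude responds strongly along intensity edges, which is how linear features are highlighted. This is a generic single-band sketch (border pixels are skipped for simplicity), not the multiband workflow of the study.

```python
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal derivative
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical derivative

def gradient_magnitude(img):
    """Sobel gradient magnitude of a 2-D list of pixel values; border
    pixels are left at zero."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

On an image with a vertical intensity step, the interior pixels along the step all receive a large magnitude, tracing the linear feature.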
Education is the basic foundation of any country. The purpose of this paper is to examine the different categories of schools in Aurangabad city, Maharashtra, India, and to check the compatibility of GIS with education facilities. We classified the schools of Aurangabad city into four categories, based on the medium of instruction, and mapped them using Google Earth and KML. We traced all these schools with their exact longitude and latitude, and detailed information about each school is also shown using the Google Earth API. In this paper, some management levels are applied by assigning random points anywhere in Aurangabad city, so that people get information about the best schools nearest to a given point. We have assigned rankings to the schools according to their facilities, infrastructure, admission status, outcomes, etc., so people can see the area covered by a particular school and the areas where schools are actually required.
In today's world, one of the major challenges is to defend against Distributed Denial of Service (DDoS) attacks. We cannot completely avoid DDoS attacks, but we can reduce them. In IP traceback schemes, the victim can identify the sources of an attack and block them; however, these methods react to the attack only once it is completed. By then, the critical resources of the victim have already been consumed, and the attacker has reached the goal of blocking access to the victim. To overcome this problem of existing IP traceback schemes, a defense mechanism against DDoS flooding attacks has been proposed, based on the existing Deterministic Flow Marking (DFM) IP traceback method. The fundamental issue for detection frameworks is IP spoofing. This paper proposes a packet marking scheme which marks information into the IP header field of the packet to overcome the problem of IP spoofing. The marked information is utilized at the detecting end to reconstruct the IP address of the ingress router attached to the attack source. The work is deployed in a programmable router in real time, and the attack-source detection mechanisms are carried out. This improves the performance for legitimate traffic.
This paper presents neural networks as image-processing tools for image compression, and in particular a direct-solution-method-based neural network for image compression. Digital images require large amounts of memory for storage, so the transmission of an image from one computer to another can be very time consuming. By using data compression techniques, it is possible to remove some of the redundant information contained in images, requiring less storage space and less time to transmit. Different training algorithms for image compression are observed and compared, and Levenberg-Marquardt is found to be the best among them.
The dropped-call rate (DCR) is the fraction of telephone calls which, due to technical reasons, were cut off before the speaking parties had finished their conversation and before one of them had hung up (dropped calls). This fraction is usually measured as a percentage of all calls. The CDR is defined as the ratio of abnormal disconnections of calls to the total calls established, and the performance of a TSP with respect to call drops is assessed through this parameter. The benchmark set by TRAI for the CDR is < 2%. Since the CDR for a service area as a whole does not reveal the number of areas or localities where the CDR is poor, the Authority also monitors another parameter, "worst affected cells", in which the call drop rate exceeds 3% during the cell busy hour, averaged over a month for a service area; this provides a much more localised view of the network. In this paper we suggest a forced handoff method (directed retry) for cells going down due to a weak transmission plan or other reasons, in order to reduce call drops.
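The two benchmark computations described above can be made concrete with a short worked example; the call counts below are illustrative.

```python
def call_drop_rate(dropped, established):
    """CDR as a percentage: abnormal disconnections / calls established."""
    return 100.0 * dropped / established

def worst_affected_cells(cell_stats, threshold=3.0):
    """Cells whose busy-hour CDR exceeds the 3% 'worst affected' mark.
    cell_stats maps cell id -> (dropped, established)."""
    return [cell for cell, (dropped, established) in cell_stats.items()
            if call_drop_rate(dropped, established) > threshold]

# Illustrative figures: 18 drops out of 1000 calls gives a 1.8% CDR,
# which meets the < 2% TRAI benchmark for the service area, yet an
# individual cell can still breach the 3% per-cell threshold.
area_cdr = call_drop_rate(18, 1000)
bad_cells = worst_affected_cells({"A": (40, 1000), "B": (15, 1000)})
```

This mirrors the abstract's point: an area-wide CDR of 1.8% can hide a cell (here "A", at 4%) that qualifies as worst affected.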
Gesture recognition and pattern recognition have been advancing at an exponential rate in recent years, and hand gesture recognition has been a fascinating research area. A hand gesture recognition system provides a novel, natural, innovative and user-friendly way of communicating with computers. Gesture recognition has a varied area of application, including human-computer interaction, sign language and game playing, and hand gesture recognition has enormous potential in human-computer interaction and robotic machinery. Today the interfaces between human and computer are the mouse and keyboard, but in the near future hand and eye gestures may replace them. Through this application we can identify the number of fingers of a hand and can trigger an event using predefined gesture scenarios. Hand gesture recognition also provides a low-cost interface device for interacting with objects in a virtual environment using hand gestures.
Software Defined Networking (SDN) is an emerging architecture in the field of networking in which the control plane and forwarding plane of traditional networking devices (e.g. switches, routers) are decoupled, so that network-wide traffic flow can be directly programmed. SDN plays an important role in today's enterprises and applications with drastically changing requirements, which are monitored and adapted to through changes in the traffic flows traversing the networking devices. This survey paper provides an outline of the standard communication interface, the characteristics of SDN, and the pros and cons associated with the SDN architecture.
Image fusion aims to reduce uncertainty and minimize redundancy. It is the process of combining the relevant information from a set of images into a single image, wherein the resultant fused image is more informative and complete than any of the input images. To date, image fusion techniques have been DWT-based or pixel-based. These conventional techniques are not very efficient and do not produce the expected results, since edge preservation, spatial resolution and shift invariance are factors that cannot be neglected during image fusion. This paper discusses the implementation of two categories of image fusion: the Stationary Wavelet Transform (SWT) and Principal Component Analysis (PCA). The SWT is a wavelet transform algorithm designed to overcome the lack of translation invariance of the discrete wavelet transform (DWT), whereas PCA is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. To overcome the disadvantages of the earlier image fusion techniques, a new hybrid technique is proposed that combines the SWT and PCA. This hybrid technique is intended to yield a more efficient, better-quality fused image with preserved edges and improved spatial resolution and shift invariance, producing better fusion results than images fused using conventional techniques.
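The PCA half of the hybrid can be sketched as follows: for two source images (flattened pixel lists), the fusion weights are taken as the components of the principal eigenvector of their 2x2 covariance matrix, normalized to sum to one, and the fused image is the weighted sum. The SWT stage is omitted, and the eigenvector convention here is one common choice rather than the paper's exact implementation.

```python
import math

def pca_fusion_weights(img1, img2):
    """Fusion weights from the principal eigenvector of the 2x2
    covariance matrix of the two flattened images."""
    n = len(img1)
    m1, m2 = sum(img1) / n, sum(img2) / n
    c11 = sum((a - m1) ** 2 for a in img1) / n
    c22 = sum((b - m2) ** 2 for b in img2) / n
    c12 = sum((a - m1) * (b - m2) for a, b in zip(img1, img2)) / n
    # largest eigenvalue of [[c11, c12], [c12, c22]] (closed form)
    lam = (c11 + c22) / 2 + math.sqrt(((c11 - c22) / 2) ** 2 + c12 ** 2)
    # corresponding eigenvector; handle the diagonal case c12 == 0
    if c12 != 0:
        v1, v2 = c12, lam - c11
    else:
        v1, v2 = float(c11 >= c22), float(c11 < c22)
    s = v1 + v2
    return v1 / s, v2 / s

def fuse(img1, img2):
    """Pixel-wise weighted sum of the two source images."""
    w1, w2 = pca_fusion_weights(img1, img2)
    return [w1 * a + w2 * b for a, b in zip(img1, img2)]
```

Intuitively, the image carrying more variance (more detail) receives the larger weight, so the fused result favors the more informative source.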
A MANET is a type of ad-hoc network consisting of wireless mobile nodes that communicate with each other without wires, a fixed infrastructure or central administration, and that establish routes from source to destination on their own. In a mobile ad-hoc network (MANET), each node can move freely in any direction, and every node also acts as a router, forwarding traffic for other nodes. Various routing protocols, such as AODV, DSR, DSDV and OLSR, have therefore been designed for routing in ad-hoc networks. This paper reviews the literature on these routing protocols and compares them on the basis of their different schemes.
Cloud computing and the Internet of Things are tightly coupled. The rapid growth of the Internet of Things (IoT) and the development of its enabling technologies have created a widespread connection of "things", producing large amounts of data that need to be stored, processed and accessed. Cloud computing is a paradigm for big data storage and analytics, and while the Internet of Things is exciting on its own, the real innovation will come from combining it with cloud computing, enabling sensing services and powerful processing of sensing data streams. More things are being connected to address a growing range of business needs; by the year 2020, more than 50 billion things are expected to connect to the Internet, seven times the human population. Insufficient security will be a critical barrier to large-scale deployment of IoT systems and to broad customer adoption of IoT applications that use the cloud. Simply extending existing IT security architectures to the IoT and the cloud will not be sufficient; the IoT world requires new security approaches, creating fertile ground for innovative thinking and solutions. This paper discusses key issues believed to have long-term significance for IoT and cloud computing security and privacy, based on documented problems and exhibited weaknesses.
Digital equipment such as telecommunication systems, computers and instruments use microprocessors that operate at high frequencies, allowing them to carry out billions of operations per second. A disturbance in the electrical supply lasting just a few milliseconds can therefore affect millions of basic operations, and the result may be malfunctioning and loss of data, with dangerous or costly consequences (e.g. loss of production). That is why many loads, called sensitive or critical loads, require a protected supply. Many manufacturers of sensitive equipment specify tolerances much stricter than those of the distribution system, one example being the Computer Business Equipment Manufacturers Association specification protecting computer equipment against distribution system disturbances. The design of this uninterruptible power supply (UPS) for a personal computer (PC) is motivated by the need for enhanced portability in desktop workstation design. Beyond its original function as a backup power source, this design incorporates the unit within the system unit casing, reducing the number of separate system components. Embedding the unit also removes the untidiness of connecting wires and makes the whole computer behave like a laptop. Notable too is the choice of an Arduino as a central part of the circuitry, which eliminates the heavy, space-consuming components of the original design and places the UPS in the class of advanced technology devices.
Cancer is one of the most common diseases in developed countries, and early diagnosis plays a significant role in curing patients. Every year, thousands of people die of lung cancer. In this paper, a novel candidate group search algorithm based on an evolutionary approach is proposed. This optimization algorithm allows assistant doctors to identify nodules in the lungs at an early stage, since manual interpretation is time-consuming and highly critical. A genetic algorithm (GA) helps identify genes that classify a patient's lung cancer status with notable predictive performance.
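The evolutionary loop behind a GA of this kind can be sketched generically: selection, one-point crossover and bit-flip mutation over binary chromosomes (here a toy fitness stands in for real gene-subset predictive scoring, which the paper does not detail). All names and parameters below are illustrative assumptions, not the paper's actual algorithm.

```python
import random

random.seed(42)

def fitness(chrom):
    # Toy stand-in: reward selecting "informative" gene positions.
    # A real GA would score a classifier trained on the selected genes.
    return sum(chrom)

def evolve(pop_size=20, n_genes=12, generations=40, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament selection.
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_genes)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [g ^ (random.random() < p_mut) for g in child]  # mutate
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

The same skeleton applies whether the chromosome encodes gene subsets or candidate nodule regions.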
Digital electronic maps are used to track locations in outdoor and indoor environments, but most electronic maps are useful only outdoors; there is no efficient technology for searching indoor locations. Many smartphones track location with electronic maps such as Google Maps, GPS navigation, Waze and offline GPS maps, all of which are useful only outdoors. An indoor location tracking system can be implemented using the IndoorAtlas Android SDK, which provides an API for developers to create applications for navigation inside buildings. The app is also useful for visually impaired people, as it offers speech recognition for the searched location, and it includes event details in the indoor tracking application. The app can also find the shortest path to a desired location.
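Shortest-path search over an indoor floor plan reduces to a graph problem once rooms and corridors are modelled as weighted nodes and edges. A minimal Dijkstra sketch, with a hypothetical floor-plan graph invented for illustration (the SDK's own routing API is not shown here):

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra over a dict-of-dicts adjacency map; returns (cost, path)."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical floor plan: nodes are rooms/waypoints, weights are metres.
floor = {
    "entrance": {"lobby": 5},
    "lobby": {"hall_a": 12, "hall_b": 7},
    "hall_a": {"room_204": 4},
    "hall_b": {"room_204": 11},
}
cost, path = shortest_path(floor, "entrance", "room_204")
```

The resulting path could then be read out via text-to-speech for visually impaired users.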
In the present system, a patient can contact a doctor and take an appointment only by going to that doctor's clinic. People also cannot easily obtain correct information about doctors, their details and the hospitals available in a particular city; the only way to get it is to contact the relevant people in person, which is a serious problem for someone new to the city. Those who want medical information, or an appointment with a particular doctor, from their own home currently cannot get it. To obtain correct information and the right treatment, a patient today has to travel wherever needed. This is a lengthy process that takes a lot of time to manage manually, costs more, and remains limited in scope; it is not possible to gather all the information or to satisfy the user this way. The proposed "E-Clinic" website therefore provides information on the different types of therapy available at a clinic and lets patients book appointments easily.
Wireless spoofing attacks are easy to launch and can significantly impact the performance of networks. Although the identity of a node can be verified through cryptographic authentication, conventional security approaches are not always desirable because of their overhead requirements. In this paper, we propose to use spatial information, a physical property associated with each node that is hard to falsify and not reliant on cryptography, as the basis for (1) detecting spoofing attacks, (2) determining the number of attackers when multiple adversaries masquerade as the same node identity, and (3) localizing multiple adversaries. We propose to use the spatial correlation of received signal strength (RSS) inherited from wireless nodes to detect spoofing attacks. We then formulate the problem of determining the number of attackers as a multiclass detection problem, and develop cluster-based mechanisms to solve it. When training data are available, we explore using the Support Vector Machines (SVM) method to further improve the accuracy of determining the number of attackers. In addition, we developed an integrated detection and localization system that can localize the positions of multiple attackers. We evaluated our techniques on two testbeds, using both an 802.11 (WiFi) network and an 802.15.4 (ZigBee) network, in two real office buildings. Our experimental results show that the proposed methods achieve over 90% hit rate and precision when determining the number of attackers, and our localization results using a representative set of algorithms provide strong evidence of high accuracy in localizing multiple adversaries.
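The cluster-based idea can be illustrated with a minimal sketch: RSS readings from several landmarks form one spatial cluster per physical transmitter, so two well-separated clusters under a single claimed identity suggest a spoofing attacker. The 2-means routine, landmark counts and dBm values below are all invented for the example, not the paper's mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans2(points, iters=50):
    """Minimal 2-means with farthest-point initialisation."""
    c0 = points[0]
    c1 = points[np.argmax(np.linalg.norm(points - c0, axis=1))]
    centroids = np.stack([c0, c1])
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.stack([points[labels == j].mean(axis=0)
                              for j in (0, 1)])
    return centroids, labels

# Two transmitters claiming one identity: RSS vectors (dBm) observed at
# three landmarks form two well-separated spatial clusters.
genuine = rng.normal([-50, -62, -70], 1.5, size=(30, 3))
attacker = rng.normal([-72, -48, -55], 1.5, size=(30, 3))
rss = np.vstack([genuine, attacker])
centroids, labels = kmeans2(rss)
gap = np.linalg.norm(centroids[0] - centroids[1])
```

A large inter-centroid distance relative to in-cluster spread would trigger the spoofing alarm; estimating the attacker count generalises this to more clusters.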
In this competitive world, exams set the parameters for a student's success and decide their fate. But examination practices have been shown over time to be flawed, and many malpractices have crept in. Difficult question papers and out-of-syllabus questions are problems faced in many examinations, and it is the students who bear the brunt of these mistakes. With the software designed here, examination papers are made more flexible and transparent, giving the head of the institution control over all examination question papers. Currently used only in colleges, it can be extended to other institutions and to higher levels.
Artificial intelligence makes human work much easier by letting devices process information from a human perspective. Predictive text is a feature of smartphones running operating systems such as iOS, Android and Windows; it plays a major role in helping the individual select the word that should follow the current word of a text message or statement. In the proposed system, the predictive mechanism is taken to the next level by predicting mathematical solutions and chronological data. The system automatically performs mathematical calculations in the middle of a chat, without making the user switch to the calculator application, and also finds chronological data such as date and day without a switch to the calendar application.
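The in-chat calculation step can be sketched as: spot an arithmetic span in the message with a regular expression, then evaluate it with a restricted AST walker (never `eval`) so only plain arithmetic runs. This is a minimal illustrative sketch; the regex, function names and supported operators are assumptions, not the paper's implementation.

```python
import ast
import operator
import re

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expr):
    """Evaluate +, -, *, / arithmetic only; anything else is rejected."""
    def walk(node):
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -walk(node.operand)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval").body)

# Number, then at least one operator-number pair.
MATH_RE = re.compile(r"\d+(?:\.\d+)?(?:\s*[-+*/]\s*\d+(?:\.\d+)?)+")

def suggest(message):
    """Return an inline answer for the first arithmetic span in a chat line."""
    m = MATH_RE.search(message)
    return safe_eval(m.group()) if m else None

result = suggest("split the bill: 1450/4 each, ok?")
```

Date and day lookups would follow the same pattern with a date-expression matcher feeding the calendar library instead of the evaluator.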
Block compressive sensing has been proposed to exploit the sparse nature of medical images in a transform domain to reduce storage space. Block-based compressive sensing is applied to a DICOM image: the original image is divided into blocks and each block is processed separately. The main advantage of block compressive sensing is that each block is processed independently; combined with parallel processing, this reduces the amount of time required. Compressed sensing exploits the sparsity of images to reduce the volume of data required for storage. Inspired by this, we propose a new image compression algorithm that combines compressed sensing with different transforms. Different sparse bases, the discrete cosine transform (DCT), the discrete wavelet transform and the contourlet transform, are used to compress the original input image. Among these, the DCT suffers from block artifacts [14]; the wavelet transform can overcome the block artifacts introduced in the reconstructed image; and the contourlet transform effectively captures smooth contours [4] and hence provides better reconstruction quality. To reconstruct the original image, techniques such as basis pursuit and orthogonal matching pursuit are used at the decoder.
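The decoder side can be illustrated with a compact orthogonal matching pursuit (OMP) sketch: greedily pick the dictionary column most correlated with the residual, refit by least squares, repeat. This is a generic textbook OMP in NumPy under assumed sizes, not the paper's decoder.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.astype(float), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the chosen support, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 50))
A /= np.linalg.norm(A, axis=0)        # unit-norm measurement columns
x_true = np.zeros(50)
x_true[[5, 17]] = [1.0, -2.0]         # 2-sparse transform coefficients
x_hat = omp(A, A @ x_true, k=2)
```

In the block scheme, an independent `omp` call per image block is what makes the parallel speed-up possible.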
This paper discusses how to create an artificial environment. In this high-tech era, temperature plays an important part in our environment: changes in temperature can affect the behaviour of human beings, plants and even materials such as semiconductors. This project controls the temperature of a given environment such as a baby incubator or an industrial boiler, and can be used for automatic room temperature control or for creating artificial weather. A microcontroller is used to control the temperature in the circuit, keeping it constant at a particular value; the system functions as stated in the program code of the ATmega8 to keep the temperature stable. A simple temperature controller with minimal circuitry has to be designed so that it saves space and is more reliable for an incubator. Present designs, which use a microprocessor as the main controller for digital signal processing combined with complex combinational logic, are redundant and need improved functionality. Replacing the microprocessor with a microcontroller is therefore prudent, owing to its efficiency and reliability, especially in an incubator or boiler.
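The control loop such a microcontroller runs is commonly a simple on/off scheme with hysteresis, which can be sketched in a few lines. This is a generic thermostat sketch with invented setpoint and plant constants, not the ATmega8 firmware described in the paper.

```python
def thermostat(temp, heater_on, setpoint=37.0, band=0.5):
    """On/off control with hysteresis, as in a simple incubator loop.

    The heater switches on below setpoint - band and off above
    setpoint + band, so it does not chatter around the setpoint.
    """
    if temp < setpoint - band:
        return True
    if temp > setpoint + band:
        return False
    return heater_on          # inside the band: keep the previous state

# Crude thermal plant: heating adds 0.3 degC/step, losses drop 0.1 degC/step.
temp, heater = 30.0, False
history = []
for _ in range(60):
    heater = thermostat(temp, heater)
    temp += 0.3 if heater else -0.1
    history.append(temp)
```

After the initial warm-up, the simulated temperature oscillates inside the hysteresis band around the setpoint, which is exactly the "kept constant at a particular value" behaviour.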
The aim of recommender systems (also called collaborative filtering systems) is to suggest items that a client is likely to order. In this paper we survey research related to recommendation systems and introduce the various techniques and approaches they use: the user-based approach, the item-based approach and hybrid recommendation approaches. Recommender systems are normally used online to propose items that users find interesting, benefiting both the user and the merchant: the user receives suggestions for things he is likely to buy, and the business gains through increased sales. We also discuss the challenges and issues in data mining and how to build a recommendation system that improves predictive accuracy by applying these techniques.
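The item-based approach rests on one computation: cosine similarity between item rating vectors across users. A minimal sketch with a hypothetical ratings table (users, items and scores invented for illustration):

```python
from math import sqrt

# Hypothetical user -> {item: rating} data.
ratings = {
    "alice": {"book": 5, "pen": 3, "lamp": 4},
    "bob":   {"book": 4, "pen": 2, "lamp": 5},
    "carol": {"book": 1, "pen": 5},
}

def item_vector(item):
    """Ratings for one item, keyed by the users who rated it."""
    return {u: r[item] for u, r in ratings.items() if item in r}

def cosine(a, b):
    """Cosine similarity between two sparse rating vectors."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[u] * b[u] for u in common)
    return num / (sqrt(sum(v * v for v in a.values())) *
                  sqrt(sum(v * v for v in b.values())))

sim = cosine(item_vector("book"), item_vector("lamp"))
```

Item-based CF then recommends the items most similar to those a user already rated highly; the user-based approach applies the same similarity to user rows instead of item columns.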
Model predictive control (MPC) is a process control technique used in the process industries that predicts the future behaviour of the process state through the predicted change in the dependent variables of the modeled system. It computes the future input at each step by minimizing a cost function over the manipulated and controlled variables. The main goal of this paper is to design a model predictive controller for a multivariable process; here, a distillation column is used as the multivariable process. Finally, the settling time, overshoot, and ISE, IAE and ITAE errors of the MPC controller are compared with those of a PID controller for both SISO and MIMO systems.
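The receding-horizon idea can be shown on the smallest possible plant: stack the horizon predictions as x = F x0 + G u, minimise a quadratic tracking-plus-effort cost in closed form, and apply only the first input. This is a toy unconstrained scalar sketch with invented plant constants, not the paper's distillation-column MPC.

```python
import numpy as np

def mpc_step(x0, a=0.9, b=0.5, horizon=10, setpoint=1.0, lam=0.01):
    # Prediction over the horizon for the scalar plant x[k+1] = a*x[k] + b*u[k]:
    # stacked states are x = F*x0 + G @ u.
    F = np.array([a ** (i + 1) for i in range(horizon)])
    G = np.zeros((horizon, horizon))
    for i in range(horizon):
        for j in range(i + 1):
            G[i, j] = (a ** (i - j)) * b
    r = np.full(horizon, setpoint)
    # Minimise ||x - r||^2 + lam*||u||^2 in closed form.
    H = G.T @ G + lam * np.eye(horizon)
    u = np.linalg.solve(H, G.T @ (r - F * x0))
    return u[0]           # receding horizon: apply only the first input

# Closed-loop simulation from x = 0 toward the setpoint.
x = 0.0
for _ in range(30):
    x = 0.9 * x + 0.5 * mpc_step(x)
```

A MIMO version replaces the scalars a and b with the distillation column's state-space matrices and adds input/output constraints, which turns the closed-form solve into a quadratic program.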
Power quality is a major issue for loads in the distribution system feeding industrial and domestic appliances. In this work, we propose a modified solid-state transformer system using a STATCOM to improve the power quality of loads in the distribution system. The proposed model eliminates voltage sag and swell. A matrix converter is adopted in the modified solid-state transformer design to reduce power loss, and its control strategy uses sinusoidal pulse-width modulation. In addition, harmonics are reduced using a vector proportional-integral controller. The simulation is carried out in MATLAB/Simulink; several case studies show that the proposed system has better voltage regulation than the conventional system.
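Sinusoidal PWM itself is easy to demonstrate: the gate signal is high wherever the sinusoidal reference exceeds a triangular carrier, so the local duty cycle tracks the sine. An illustrative NumPy sketch with assumed frequencies and modulation index (the paper's Simulink model is not reproduced here):

```python
import numpy as np

def spwm(f_ref=50.0, f_carrier=2000.0, m=0.8, fs=200_000, cycles=1):
    """Sinusoidal PWM: gate is high where the sine reference exceeds
    a triangular carrier; m is the modulation index (0..1)."""
    t = np.arange(0, cycles / f_ref, 1 / fs)
    ref = m * np.sin(2 * np.pi * f_ref * t)
    # Triangle carrier in [-1, 1] built from the sawtooth phase.
    phase = (t * f_carrier) % 1.0
    carrier = 4 * np.abs(phase - 0.5) - 1
    gate = (ref > carrier).astype(int)
    return t, ref, gate

t, ref, gate = spwm()
duty = gate.mean()
```

Averaged over a full cycle the duty is 1/2, while the positive half-cycle carries a higher local duty than the negative half, which is what lets the converter synthesise a sinusoidal output.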
Big data plays a major role in all aspects of business and IT infrastructure. Today many organizations, social media networking sites, e-commerce businesses, educational institutions, satellite communication systems, aircraft and others generate huge volumes of data daily, in structured, semi-structured and unstructured form; this huge, voluminous data is what the term big data denotes. Big data must be stored and processed effectively, but in traditional distributed systems it cannot be handled well because of a lack of resources, and this is where Hadoop comes into the picture. Hadoop stores and processes huge volumes of data through its strong ecosystem, which contains modules for processing data, storing data, allocating resources, configuration management, retrieving data and providing a highly fault-tolerant mechanism. This paper focuses on big data concepts, characteristics and real-time examples, and on the Hadoop modules with their pros and cons.
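Hadoop's processing module (MapReduce) can be illustrated with the canonical word-count job, written in the Hadoop-streaming style where the map emits key-value pairs and the reduce sums per key after a sort. The in-memory driver at the bottom stands in for the cluster's shuffle phase and is an illustration only.

```python
from itertools import groupby

def mapper(lines):
    """Hadoop-streaming-style map: emit (word, 1) per token."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def reducer(pairs):
    """Reduce: sum counts per key. Input must arrive sorted by key,
    exactly as the Hadoop shuffle phase guarantees."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(c for _, c in group)

# Local stand-in for split -> map -> shuffle(sort) -> reduce.
lines = ["big data needs hadoop", "hadoop stores big data"]
counts = dict(reducer(sorted(mapper(lines))))
```

On a real cluster the same two functions run as streaming tasks over HDFS blocks, with YARN handling the resource-allocation module mentioned above.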
The rapid growth of wireless content access implies the need for content placement and scheduling at wireless base stations. We study a system under which users are divided into clusters based on their channel conditions, and their requests are represented by different queues at logical front ends. Requests might be elastic (implying no hard delay constraint) or inelastic (requiring that a delay target be met). Correspondingly, we have request queues that indicate the number of elastic requests, and deficit queues that indicate the deficit in inelastic service. Caches are of finite size and can be refreshed periodically from a media vault. We consider two cost models that correspond to inelastic requests for streaming stored content and real-time streaming of events, respectively. We design provably optimal policies that stabilize the request queues (hence ensuring finite delays) and reduce average deficit to zero [hence ensuring that the quality-of-service (QoS) target is met] at small cost. We illustrate our approach through simulations.
Index Terms: Content distribution network (CDN), delay-sensitive traffic, prediction, quality of service (QoS), queueing.
This paper details a model for broadcasting a digital video signal with embedded audio (SDI) using cloud computing, through each transcoding stage of the signal. The SDI signal is coded and multiplexed into an ASI signal; a Multidescriptor transcodes the signal into a transport stream, and with the GT-3 the program can be repackaged into a new TS. A CPC then produces chunks via HLS (HTTP Live Streaming) with different profiles; Anevia receives these chunks and delivers the signal to the cloud. Bandwidth and bit rate are tested for each stage of the signal chain, without losing quality-control standards or the metadata services (V-chip, closed captions, cue tones, DPI, GPI, watermarking, SCTE 35 and 104, etc.). How the stream is received by the end user is also shown.
The Internet is a collection of vast information, and searching web pages for a specific piece of information is common practice. Search engines are used to search for and retrieve information in response to a user's query, and making this search process better and faster has always been an area of interest for researchers in web mining. The search process can be improved by web harvesting, the process by which specialized software collects data from the Internet and places it into files for the end user. It serves a function similar to, but more advanced than, the tasks a search engine performs; web harvesting is also known as web scraping. In this article we explore the field of web harvesting and emphasize its use for fast and effective retrieval of information from the web.
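The core of a harvester is parsing fetched pages and extracting structured fields. A minimal link harvester using only the standard library's `html.parser`; an inline HTML snippet stands in for a real HTTP fetch so the sketch is self-contained.

```python
from html.parser import HTMLParser

class LinkHarvester(HTMLParser):
    """Collect every href from anchor tags: the extraction core of a
    harvester that walks pages and stores fields for the end user."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# In a real harvester the page body would come from an HTTP fetch;
# an inline snippet keeps the sketch self-contained.
page = '<p><a href="/a.html">A</a> and <a href="/b.html">B</a></p>'
h = LinkHarvester()
h.feed(page)
```

Feeding the harvested links back into the fetch queue turns this extractor into the crawling loop that distinguishes harvesting from one-shot scraping.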
In today's world, our lifestyle accumulates too much stress, and this stress is a major factor in deteriorating health. Meditation is one way to reduce stress levels. Electroencephalography (EEG) is characterized by five signals: alpha, beta, gamma, delta and theta. This paper focuses on the analysis of EEG signals from meditating and non-meditating persons. The signals were analyzed using MATLAB and Verilog (Xilinx 14.7) on an FPGA. It was found that, in a person who meditates regularly, the variation of the alpha wave was smaller and the delta wave density was greater than in a person who does not meditate.
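Separating an EEG trace into those five bands amounts to measuring spectral power in fixed frequency ranges. A NumPy sketch of FFT band power on a synthetic alpha-dominant signal (the sampling rate, durations and band edges are conventional values assumed for illustration, not the paper's recording setup):

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power of `signal` in the [lo, hi] Hz band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

fs = 256                       # samples per second
t = np.arange(0, 4, 1 / fs)    # 4 s of synthetic "EEG"
# Alpha-dominant test signal: strong 10 Hz plus weak 3 Hz (delta).
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
alpha = band_power(eeg, fs, 8, 13)    # alpha band
delta = band_power(eeg, fs, 0.5, 4)   # delta band
```

Comparing such band powers between the meditating and non-meditating recordings is the software analogue of the MATLAB analysis described above.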
Network communication is essential and critical in our day-to-day life, and cryptography is the technique for securing data in communication so that its privacy is maintained. Cryptography offers different types of algorithms, among them the Data Encryption Standard (DES), the Advanced Encryption Standard (AES), the Rivest-Shamir-Adleman algorithm (RSA) and the Triple Data Encryption Standard (TDES). Multiprocessors are widely used in this era, as single processors are subject to constraints while computing; multiprocessing is the processing of multiple tasks at a time using two or more central processing units in one system. In this paper, a multiprocessing processor and a lightweight DES algorithm are implemented using Xilinx 14.7. Simulation is done in ModelSim 6.3, and the code is verified by dumping it onto a Spartan-3 field-programmable gate array (FPGA).
The on-demand use, high scalability and low maintenance cost of cloud computing have attracted more and more enterprises to migrate their legacy applications to the cloud environment. Although the cloud platform itself promises high reliability, ensuring a high quality of service is still a major concern, since enterprise applications are usually complicated and consist of a large number of distributed components. Thus, improving the reliability of an application during cloud migration is a challenging and critical research problem. To address it, we propose a reliability-based optimization framework, named ROCloud, that improves application reliability through fault tolerance. ROCloud includes two ranking algorithms: the first ranks components for applications whose components will all be migrated to the cloud; the second ranks components for hybrid applications in which only some of the components are migrated. Based on the ranking result, an optimal fault-tolerance strategy is selected automatically for the most significant components with respect to their predefined constraints. The experimental results show that by refactoring a small number of error-prone components and tolerating faults in the most significant components, the reliability of the application can be greatly improved.
Data is vulnerable at many points in any computer system, and many security techniques and types of functionality can be applied to protect it. The Sulaymaniyah Court is a large and prominent government institution in Iraq. Many clients access its e-court system; the SES (Sulaymaniyah E-Court System) uses a SQL Server database to store and access court cases for different clients. Most database security models available today protect the database from outside, unauthorized users, and cannot provide internal security tied to each user's type of access to the database. This database can be accessed by hundreds of clients. We propose a framework to meet the increased security needs of such database systems: clients are analysed so that sub-channels can be instituted between specific users or groups of users, keeping objects that contain sensitive information away from unauthorized sub-channels and restricting unauthorized users. In this paper we propose a general framework for multilevel security (MLS) in the Sulaymaniyah e-court database.
This paper investigates the self-excited vibration response of double-bay multi-storey building frames, examining the effect of joint stiffening on natural frequencies. One of the frames has normal rigid joints; three others have stiffened joints with stiffened lengths of 250 mm, 400 mm and 750 mm respectively. On account of the limitations of shear frame models, the frames are modeled as frames with flexible horizontal members having multiple degrees of freedom (MDOF). The classical displacement method of analysis is adopted, using stiffness coefficients modified to include the contributions of joint stiffening. The results reveal that joint stiffening increased the natural frequencies of the frame, and that the natural frequencies also increased with the length of stiffening. In addition, joint stiffening enhanced the values of the frame's other dynamic characteristics that are functions of the natural frequencies.
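The qualitative finding, stiffer joints raise the natural frequencies, follows from the undamped MDOF eigenproblem K v = w^2 M v. A NumPy sketch on a hypothetical 2-DOF lumped storey chain (the masses and stiffness values are invented for illustration and are not the paper's double-bay frame model):

```python
import numpy as np

def natural_frequencies(masses, stiffs):
    """Natural frequencies (Hz) of an undamped lumped-mass chain from
    the generalized eigenproblem K v = w^2 M v."""
    n = len(masses)
    M = np.diag(masses)
    K = np.zeros((n, n))
    for i, k in enumerate(stiffs):
        # Storey stiffness k couples level i to the level below it.
        K[i, i] += k
        if i > 0:
            K[i - 1, i - 1] += k
            K[i - 1, i] -= k
            K[i, i - 1] -= k
    w2 = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.abs(w2))) / (2 * np.pi)

# Hypothetical storey masses (kg) and stiffnesses (N/m); joint
# stiffening is represented here as a raised storey stiffness.
plain = natural_frequencies([2000, 2000], [3e6, 3e6])
stiffened = natural_frequencies([2000, 2000], [4e6, 4e6])
```

Every frequency of the stiffened model exceeds its counterpart in the plain one, mirroring the trend the paper reports for increasing stiffened length.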
The field of technology is evolving at a very fast pace and the competition is intense, so the need of the hour is to produce efficient systems. Accomplishing this requires better interaction among all the components of a system, a requirement fulfilled by the Advanced Microcontroller Bus Architecture (AMBA) protocol from Advanced RISC Machines (ARM). AMBA is the on-chip standard for communication among components in application-specific integrated circuits (ASICs) and systems-on-chip (SoCs). This paper focuses on two AMBA protocols, the Advanced High-performance Bus (AHB) and the Advanced Peripheral Bus (APB), and on the APB bridge. The design is coded in Verilog, synthesized in Xilinx ISE 14.7, simulated with the ISim simulator, and implemented on a Spartan-3 FPGA.
Technological evolution has led to the incorporation of more and more features into systems. One such feature is the flexibility to change the hardware and software at fabrication time; this is called reconfiguration. Reconfiguration gives the user more control over the hardware and software even at the final stages of the process, and such reconfiguration is possible in field-programmable gate arrays (FPGAs). Reconfiguration is classified in various ways based on how the device, usually an FPGA, is reconfigured. In this paper we give an overview of partial reconfiguration and the various techniques employed to achieve it.
Brain tumor detection is a very important vision application in the medical field. In this paper, an efficient brain tumor detection method using object detection and a modified Hough metric is proposed. To further improve the detection rate, we integrate the proposed object-based tumor detection with a decision-based alpha-trimmed global mean filter. The proposed technique produces effective results even at high noise densities. The approach has been tested on various images, yielding an efficient and robust technique for automated detection and segmentation of brain tumors.
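The building block behind an alpha-trimmed mean filter of the kind the abstract mentions can be sketched in a few lines: drop the extreme samples in a window, then average the rest. The exact window handling and decision rule in the paper may differ; this is only an illustration of the trimming idea.

```python
# Alpha-trimmed mean: discard the `alpha` smallest and `alpha` largest
# samples in a window, then average the survivors. Impulse noise (salt and
# pepper values) lands in the trimmed extremes and so does not bias the mean.

def alpha_trimmed_mean(values, alpha):
    if 2 * alpha >= len(values):
        raise ValueError("alpha too large for window size")
    s = sorted(values)
    trimmed = s[alpha:len(s) - alpha]
    return sum(trimmed) / len(trimmed)

# A 3x3 window corrupted by salt (255) and pepper (0) impulses:
window = [0, 12, 11, 255, 10, 13, 255, 12, 0]
print(alpha_trimmed_mean(window, alpha=2))  # -> 11.6
```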
Barcoding, since its inception, has emerged as a very secure and quick way to identify data or objects uniquely; another major use of barcoding is securing the data itself. The concept of 2-D barcodes is of great relevance for wireless data transmission between handheld electronic devices. In a typical standalone barcode setup, any file on a cell phone can be transferred to a second cell phone as a series of images on the LCD, which are then captured and decoded by the barcode scanner of the second phone. In the proposed system, a new approach for data modulation in high-capacity 2-D barcodes is introduced, in which a high-capacity 2-D barcode of 448 x 63 pixels encodes the data, which is then transmitted over any wireless or wired medium. The proposed barcode can encode a large amount of data and uses the MD5 hashing algorithm to identify the data uniquely: the data is hashed with MD5 and verified on retrieval by re-hashing the values decoded from the barcode. The proposed barcoding and barcode modulation technique uses current technology for generating and reading barcodes, allowing users to transmit data securely. This paper also surveys existing barcode types and compares them with the high-capacity 2-D barcode.
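The MD5 check described above amounts to hashing the payload before encoding and re-hashing the decoded payload on the receiving side. A minimal sketch, with an illustrative payload format not taken from the paper:

```python
# Sender hashes the payload before encoding it into the barcode; the
# receiver re-hashes what the scanner decoded and compares digests.

import hashlib

def md5_digest(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

payload = b"transfer:42:account:1234"
sent_digest = md5_digest(payload)          # embedded alongside the barcode data

decoded = b"transfer:42:account:1234"      # what the scanner recovered
assert md5_digest(decoded) == sent_digest  # integrity verified

tampered = b"transfer:99:account:1234"
print(md5_digest(tampered) == sent_digest)  # -> False
```

Note that MD5 detects accidental corruption but is no longer collision-resistant, so a deliberate attacker could forge matching digests; a modern deployment would prefer SHA-256.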
Wireless sensor networks (WSNs) have attracted increasing research interest due to their wide range of applications. Advanced WSN topologies form multi-hop WSNs, where the main issues are location-based security and preserving the lifetime of energy resources. In this paper, we propose a Geographical-based Adaptive Lifetime Security Routing (GALSR) protocol and the GALSR algorithm to improve the lifetime and security of WSNs. The algorithm combines greedy forwarding for shortest-path routing with direct random propagation for random walking. We then show that energy consumption is severely disproportional under uniform energy deployment for the given network topology, which greatly reduces the lifetime of the sensor network. To resolve this issue, we introduce non-uniform energy deployment to optimize lifetime and message delivery ratio under the same energy resources and security. Energy balancing is set by an Energy Balance Control (EBC), and the security parameter by a Metric Energy Balancer (MEB). We validate our findings with both quantitative and qualitative measures: the quantitative results come from the NetSim simulator, and our implementation supports the qualitative claims. The proposed model achieves increased lifetime, adaptive security, and a cost-efficient protocol.
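The greedy shortest-path phase mentioned above can be sketched as the classic geographic forwarding rule: among neighbors strictly closer to the sink, pick the closest. The coordinates, function names, and fallback comment below are illustrative assumptions, not details from GALSR.

```python
# Greedy geographic next hop: forward to the neighbor making the most
# progress toward the sink; if no neighbor is closer than we are, we are
# at a local minimum and a scheme like random propagation must take over.

import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_next_hop(current, neighbors, sink):
    candidates = [n for n in neighbors if dist(n, sink) < dist(current, sink)]
    if not candidates:
        return None          # local minimum: fall back to random propagation
    return min(candidates, key=lambda n: dist(n, sink))

current = (0, 0)
neighbors = [(1, 1), (2, 0), (-1, 0)]
sink = (5, 0)
print(greedy_next_hop(current, neighbors, sink))  # -> (2, 0)
```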
Users may search for different types of things from anywhere, but search results depend on the user's query, which must be matched against properties stored in a spatial database. Due to the rapid growth in users, it becomes essential to optimize search results based on the nearest-neighbour property in spatial databases. Conventional spatial queries, such as range search and nearest-neighbour retrieval, involve only the geometric properties of objects. Nowadays, many modern applications aim to find objects satisfying both a spatial condition and a condition on their associated text; this is known as spatial keyword search. For example, instead of considering all hotels, a nearest-neighbour query could ask for the closest hotel among those that provide services such as a pool and internet at the same time. For this type of query, a variant of the inverted index that is effective for multidimensional points is used, with an R-tree built on every inverted list; the minimum bounding method then answers nearest-neighbour queries with keywords in real time.
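The query semantics of the hotel example can be shown with a brute-force scan: filter by keywords, then take the geometrically closest match. A real system would use the inverted index with an R-tree per list as described above; the data set and field names here are invented for illustration.

```python
# Nearest neighbor with keywords, linear-scan version: among objects whose
# tags contain ALL query keywords, return the closest one to the query point.

import math

hotels = [
    {"name": "A", "loc": (1, 1), "tags": {"pool"}},
    {"name": "B", "loc": (2, 2), "tags": {"pool", "internet"}},
    {"name": "C", "loc": (0, 1), "tags": {"internet"}},
]

def nn_with_keywords(query_loc, keywords, objects):
    matches = [o for o in objects if keywords <= o["tags"]]
    if not matches:
        return None
    return min(matches, key=lambda o: math.dist(query_loc, o["loc"]))

print(nn_with_keywords((0, 0), {"pool", "internet"}, hotels)["name"])  # -> B
```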
A pattern is an entity that can be named, e.g. a fingerprint image, a handwritten word, a human face, a speech signal, a DNA sequence, and many more. Pattern recognition has its roots in artificial intelligence (AI); it is the study of how machines can learn to distinguish patterns and make decisions about the categories of those patterns. The medical and healthcare sector is a big industry nowadays, and image-based medical diagnosis is one of its important service areas. Various artificial intelligence techniques, such as artificial neural networks and fuzzy logic, are used for classification problems in medical diagnosis, and most of these computer-based systems are designed using artificial neural network techniques. Electronic health records (EHRs) provide detailed, time-stamped, and highly multivariate data for a large patient population, enabling the use of AI techniques to connect care practices and outcomes.
This paper modifies the existing TCP and AODV system to handle the jellyfish periodic dropping attack, the jellyfish packet reordering attack, and the jellyfish delay variance attack. The proposed system modifies the AODV routing protocol and TCP to handle these jellyfish attack variants, using the E_TCP of the existing system along with the modified AODV routing to obtain effective results. In the E_TCP protocol the buffer stores the sequence number and the acknowledgement time, while in the NAODV_ETCP protocol the forwarding ratio (fr) is also stored in the buffer. The paper analyzes performance using PDR, end-to-end delay, and throughput on various scenarios attacked by different types of jellyfish attack. The analysis shows that NAODV_ETCP performs better than the ETCP protocol.
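The forwarding-ratio (fr) metric buffered by NAODV_ETCP can be sketched as packets forwarded divided by packets handed to the node; a persistently low ratio is the signature of jellyfish-style periodic dropping. The threshold value and function names below are illustrative assumptions.

```python
# Forwarding ratio and a simple suspicion test. A node that silently drops
# a large fraction of the packets it receives will show a low ratio.

def forwarding_ratio(received, forwarded):
    return forwarded / received if received else 1.0

def is_suspect(received, forwarded, threshold=0.5):
    return forwarding_ratio(received, forwarded) < threshold

print(is_suspect(received=100, forwarded=95))  # -> False (healthy node)
print(is_suspect(received=100, forwarded=30))  # -> True  (likely dropper)
```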
Data security is one of the most vigorous fields of study in informatics and computer forensics. Protecting authors' rights over intellectual property is a real challenge, particularly when information is processed and transmitted. Images are one of the electronic forms of digital data; they are widely used in organizations, research institutions, and environments where speed, security, and minimal distortion are needed. The proposed scheme uses the Complement, Rotate, and Mirror with respect to Local Texture using Syndrome Trellis Code (CRMST) algorithm. Embedding data in a binary image is possible by flipping pixels; the flippability decision for a pixel depends on the transitions from the pixel to its eight neighbours in a local window, and flipping a pixel must not destroy the connectivity among pixels, in order to safeguard the good visual quality of the image. The steganography scheme generates the cover vector by dividing the scrambled image into super-pixels. Tests on both simple binary images and a constructed image data set show that the proposed measurement describes distortions well with respect to both visual quality and statistics. In summary, a spatial-domain binary image steganography scheme is proposed that minimizes a unique flipping distortion measurement considering both the human visual system (HVS) and statistics; the measurement employs the weighted sum of CRMST changes to measure the flippability of a pixel. The scheme can be used for military purposes and provides high security.
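The transition count that underlies the flippability decision can be illustrated directly: walk the eight neighbours of a 3x3 window in order and count 0/1 changes. This is only the generic cue; the paper's CRMST measurement refines it, and the window layout below is an assumption.

```python
# Count 0/1 transitions around the eight neighbours of a 3x3 binary window.
# Few transitions mean the centre sits in a smooth region, where a flip
# would be visually obvious; more transitions mean a textured region.

def neighbor_transitions(window):
    """window: 3x3 list of 0/1 values; transitions around the centre pixel."""
    ring = [window[0][0], window[0][1], window[0][2], window[1][2],
            window[2][2], window[2][1], window[2][0], window[1][0]]
    return sum(ring[i] != ring[(i + 1) % 8] for i in range(8))

edge_window = [[0, 0, 1],
               [0, 1, 1],
               [0, 1, 1]]
print(neighbor_transitions(edge_window))  # -> 2
```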
One Time Password (OTP) is one of the preferred security techniques for online transactions, but a comparison of security levels shows the need for improvement in its transaction methods and medium. In this paper, two authorities are considered: the bank domain itself and a third party acting as a trusted authority. The third-party authority has its own secret key based on its own set of attributes; this key is combined with the user's attribute key to form an encrypted key, which is stored in a temporary database for as long as the session is valid. Two new concepts are thus added: the use of a third-party secret key, and the storage of this key in a temporary database for a predefined session time only. Only after these stages can the main database be accessed. This increases the security level of online transactions.
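The key combination and session-limited temporary storage described above can be sketched as follows. The hash-based combination, the session length, and all names here are illustrative assumptions; the paper does not specify this exact mechanism.

```python
# Combine the third party's secret with the user's attribute key, then keep
# the result in a temporary store that expires after a fixed session window.

import hashlib
import time

SESSION_SECONDS = 120

def combined_key(third_party_secret: str, user_attr_key: str) -> str:
    return hashlib.sha256((third_party_secret + user_attr_key).encode()).hexdigest()

temp_db = {}

def store_session_key(user, key, now=None):
    now = time.time() if now is None else now
    temp_db[user] = (key, now + SESSION_SECONDS)

def get_session_key(user, now=None):
    now = time.time() if now is None else now
    entry = temp_db.get(user)
    if entry is None or now > entry[1]:
        temp_db.pop(user, None)   # expired: purge from the temporary database
        return None
    return entry[0]

k = combined_key("tp-secret", "user-attrs")
store_session_key("alice", k, now=0)
print(get_session_key("alice", now=60) == k)  # -> True (within session)
print(get_session_key("alice", now=300))      # -> None (session expired)
```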
Vehicular Ad hoc Networks (VANETs) are a subgroup of mobile ad hoc networks (MANETs). They are an emerging technology for exchanging information between vehicles and are considered one of the most promising technologies for improving the efficiency and safety of transportation systems: VANETs are mainly used to exchange traffic information between vehicles and prevent accidents. The high mobility of nodes is the major concern in VANETs; the resulting dynamic topology makes routes unstable and unreliable for exchanging information or messages among the vehicles in the ad hoc network. To improve the throughput and performance of VANETs, routes between nodes must be reliable, stable, and have low overhead. It is a challenging task to design routing protocols for VANETs that support intelligent transportation systems (ITS) by enhancing driver safety, improving the overall driving experience, and regulating traffic. In this paper, the various challenges and issues of VANET routing protocols are discussed, along with their advantages and disadvantages in VANET scenarios.
This paper reviews an innovative design for a system based on an AVR microcontroller that monitors the voltage, current, and temperature of a distribution transformer in a substation and protects the system from rises in these parameters. Protection is provided by shutting down the entire unit with the aid of radio-frequency (RF) communication; in addition, the system displays the readings on a PC at the main station, which is at a remote location, and can recognize breakdowns caused by overload, high temperature, and over-voltage. The design consists of two units: the substation unit, called the transmitter and display unit, and the controlling unit at the main station. In the substation, the voltage, current, and temperature are monitored continuously by the AVR microcontroller and shown on the display unit, and an RF transmitter transmits the measured signals. The controlling unit at the main station, by means of a PC and an RF receiver, receives the signals transmitted by the transmitter unit and reacts accordingly. Overall, the proposed design lets the user easily recognize a distribution transformer suffering from an open or short circuit or a rise in temperature. The ultimate objective is to monitor the electrical parameters continuously and thereby guard against the burning of a distribution or power transformer due to overload, over-temperature, or high input voltage: if any of these values rises beyond its limit, the entire unit is shut down by the controlling unit.
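The shutdown rule above reduces to a threshold check over the three monitored parameters. The limit values and names in this sketch are illustrative placeholders, not values from the paper, which implements the logic on an AVR microcontroller.

```python
# Threshold monitor: report which parameters exceeded their limits and
# decide whether the controlling unit should shut the transformer down.

LIMITS = {"voltage": 440.0, "current": 100.0, "temperature": 90.0}

def check_transformer(readings, limits=LIMITS):
    """Return the list of parameters that exceeded their limits."""
    return [name for name, value in readings.items() if value > limits[name]]

def should_shut_down(readings):
    return bool(check_transformer(readings))

normal = {"voltage": 415.0, "current": 80.0, "temperature": 65.0}
overload = {"voltage": 415.0, "current": 120.0, "temperature": 95.0}
print(should_shut_down(normal))     # -> False
print(check_transformer(overload))  # -> ['current', 'temperature']
```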
Outlier detection in high-dimensional data presents various challenges resulting from the "curse of dimensionality." A prevailing view is that distance concentration, i.e., the tendency of distances in high-dimensional data to become indiscernible, hinders the detection of outliers by making distance-based methods label all points as almost equally good outliers. In this paper, we provide evidence supporting the opinion that such a view is too simple, by demonstrating that distance-based methods can produce more contrasting outlier scores in high-dimensional settings. Furthermore, we show that high dimensionality can have a different impact, by reexamining the notion of reverse nearest neighbors in the unsupervised outlier-detection context. Namely, it was recently observed that the distribution of points' reverse-neighbor counts becomes skewed in high dimensions, resulting in the phenomenon known as hubness. We provide insight into how some points (antihubs) appear very infrequently in k-NN lists of other points, and explain the connection between antihubs, outliers, and existing unsupervised outlier-detection methods. By evaluating the classic k-NN method, the angle-based technique designed for high-dimensional data, the density-based local outlier factor and influenced outlierness methods, and antihub-based methods on various synthetic and real-world data sets, we offer novel insight into the usefulness of reverse neighbor counts in unsupervised outlier detection.
Index Terms: outlier detection, reverse nearest neighbors, high-dimensional data, distance concentration
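The reverse-neighbor count N_k(x), how often point x appears in the k-NN lists of other points, can be demonstrated on toy one-dimensional data: an isolated point ends up in nobody's k-NN list, making it an antihub. Real experiments use high-dimensional sets; this brute-force sketch only illustrates the quantity itself.

```python
# Compute k-NN lists by brute force, then count how often each point
# appears in the others' lists. Points with count 0 are antihubs.

def knn_lists(points, k):
    lists = []
    for i, p in enumerate(points):
        others = sorted((abs(p - q), j) for j, q in enumerate(points) if j != i)
        lists.append([j for _, j in others[:k]])
    return lists

def reverse_neighbor_counts(points, k):
    counts = [0] * len(points)
    for lst in knn_lists(points, k):
        for j in lst:
            counts[j] += 1
    return counts

points = [0.0, 1.0, 2.0, 3.0, 30.0]   # last point is an isolated outlier
print(reverse_neighbor_counts(points, k=2))  # -> [1, 3, 4, 2, 0]
```

The outlier at 30.0 receives a reverse-neighbor count of zero, matching the antihub-outlier connection the paper examines.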
Named Data Networking (NDN) is a new paradigm for the future Internet in which interest and data packets carry content names, as opposed to the source and destination addresses of the current IP paradigm. Security is built into NDN by embedding a public-key signature in every data packet, enabling verification of the authenticity and integrity of the content. However, existing heavyweight signature and verification algorithms prevent universal integrity verification among NDN nodes, which may lead to content pollution and denial-of-service attacks. We propose a Lightweight Integrity Verification (LIVE) architecture, an extension to the NDN protocol, to address these two issues. Furthermore, it allows a content provider to control content access in NDN nodes by selectively distributing integrity verification tokens to authorized nodes. We also introduce IP address verification to block unauthorized users; with a valid token, a user can access his accounts from another system.
Network administrators constantly face the challenge of identifying threats and, in retrospect, securing the organization's network. The classical approach to identifying vulnerabilities in a network is to use commercially developed tools, which do not take into account the interactions between vulnerabilities of network elements or their behavioural patterns. Network administrators therefore need a holistic method in which vulnerability interrelationships are captured by an attack graph, which helps identify all possible ways an attacker could gain access to critical resources. The objective is to design an attack-graph-based approach for analyzing security vulnerabilities in enterprise networks, and to implement and evaluate its performance. This work proposes an attack-graph-based network security analyser. The attack graph directly illustrates logical dependencies among attack goals and configuration information: a node in the graph is a logical statement, and an edge represents a causality relation between network configurations and an attacker's potential privileges. The benchmark is a collection of Datalog tuples representing the configuration of the synthesized networks, and graph-generation CPU time was compared with Sheyner's attack graph toolkit. The results compare graph-builder CPU time for a fully connected network with 5 vulnerabilities per host, and show that the cost of Sheyner's tool grows exponentially. Important contributions of this work include establishing an attack-graph-based approach for enterprise network security analysis that can capture generic security interactions and specify security-relevant configuration information.
The three chlorine atoms on lumefantrine were replaced with 5-aminotetrazole using both a thermal method (TM) and a microwave method (MW). The products were characterized by GC-MS spectral analyses. The usual isotopic mass percentages of chlorine found in nature, 35 (75%) and 37 (25%), were absent from the products' fragmentation peaks. The gas chromatograms of the two methods showed that the percentage yield of the microwave method (MW) is about double that of the thermal method (TM).
In the previous system, all information had to be viewed in a hard file or on a website [6]; while searching for information, access is difficult and finding the particular website takes a long time. To overcome this problem, a smartphone application built on Android can make the process easier, more secure, and less error-prone, and more useful information can be delivered through such a system. Android is an open-source Linux-based system developed by Google, aimed primarily at mobile handsets and other portable devices, which we use to accomplish our daily tasks. One application that falls into this category is an Android application for a college. This college application provides a wide range of useful information split into several functions, including academics, news, events, facilities, and general college details; users can install the application on their Android mobile to view and make use of all these details. To improve the mobile experience for users [17], Android projects aim to create successful real-world products. Android is a mobile OS, but it is not limited to mobile devices; it is currently used in smartphones and televisions. Android versions have carried code names such as Lollipop, KitKat, Jelly Bean, Ice Cream Sandwich, Froyo, Eclair, and Donut. In this paper we also compare the Android Lollipop version, its features, and its issues with the Android Marshmallow version, and discuss the development of the Campsys mobile application.
Bitcoin is the latest addition to online payment transaction systems. It is a digital currency, also known as a cryptocurrency. The Bitcoin system is the first transaction payment system that deviated from the conventional approach of processing and clearing transactions through trusted third parties, allowing direct transactions between parties and thereby making the whole system decentralized. It is a pure peer-to-peer network that enables every party on the network to keep track of all the transactions taking place on it, and it uses cryptography in its implementation to deal with Internet security threats. This paper aims to explore the need for a decentralized system, the technology used in its implementation, and the key features that make the Bitcoin system unique compared to conventional currency. It also highlights the benefits of the Bitcoin system and its shortcomings.
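The core chaining idea that lets every peer verify the shared transaction history can be shown with a toy hash chain: each block stores the hash of its predecessor, so altering any earlier block invalidates every hash after it. Real Bitcoin adds proof-of-work, Merkle trees, and double SHA-256; this sketch shows only the chaining.

```python
# Toy hash chain: tampering with any block's payload breaks verification
# of the whole chain from that point on.

import hashlib

def block_hash(prev_hash, payload):
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(payloads):
    chain, prev = [], "0" * 64          # genesis predecessor
    for p in payloads:
        h = block_hash(prev, p)
        chain.append({"payload": p, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["payload"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["alice->bob:5", "bob->carol:2"])
print(verify_chain(chain))              # -> True
chain[0]["payload"] = "alice->bob:500"  # tamper with history
print(verify_chain(chain))              # -> False
```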
One important aspect of mobile ad hoc networks is the group mobility of nodes in a topological area, since any node can enter or leave the area at any time. Resources in mobile ad hoc networks are limited, so an increasing number of nodes cannot generate more resources, but they can use and share existing resources with pre-existing nodes in the area. An important aspect of group mobility is the mobility of nodes inside the group with reference to their leader. In this paper we analyze the two protocols AODV and DSDV with equal and unequal distributions of nodes in a group.
In this survey we give a detailed account of clustering. Clustering is not only bounded by the grouping of similar objects into clusters; the clustering approach can also be analyzed to retrieve specific data. This analytical survey focuses on current clustering techniques for categorizing data and retrieving it as quickly as possible from huge amounts of data, because data is growing roughly as the square or cube of its current size. Storing all this information and retrieving it easily will keep facing new challenges as data grows in various domains, whether the population of a country or the data of any field related to it.
Photo sharing is an attractive feature which popularizes Online Social Networks (OSNs). Unfortunately, it may leak users' privacy if they are allowed to post, comment, and tag a photo freely. In this paper, we attempt to address this issue and study the scenario when a user shares a photo containing individuals other than himself/herself (termed co-photo for short). To prevent possible privacy leakage of a photo, we design a mechanism to enable each individual in a photo to be aware of the posting activity and participate in the decision making on the photo posting. For this purpose, we need an efficient facial recognition (FR) system that can recognize everyone in the photo. However, a more demanding privacy setting may limit the number of photos publicly available to train the FR system. To deal with this dilemma, our mechanism attempts to utilize users' private photos to design a personalized FR system specifically trained to differentiate possible photo co-owners without leaking their privacy. We also develop a distributed consensus-based method to reduce the computational complexity and protect the private training set. We show that our system is superior to other possible approaches in terms of recognition ratio and efficiency. Our mechanism is implemented as a proof-of-concept Android application on Facebook's platform.
Sensors are electronic devices meant for recording a particular environmental variable, for example temperature, pressure, or speed. With advancements in the fields of electronics and sensors, it is possible to gather physically measured data from sensors remotely. This is possible because of network facilities such as Ethernet, Wi-Fi, and cellular networks that enable remote access to the data sampled by a sensor. Faults, errors, failures, and attacks are common in wireless communication, and these imperfections compromise effective data collection in a wireless sensing system. In this paper we discuss various challenges and solutions for robust data collection in wireless sensing systems.
Energy consumption is a significant issue in ad hoc networks, since mobile nodes are battery powered. To prolong the lifetime of ad hoc networks, the most critical issue is optimizing the energy consumption of nodes. In this paper, we propose a power-aware routing protocol for choosing energy-efficient optimal routes. The system considers the transmission power of nodes and their residual energy as energy metrics in order to maximize the network lifetime and reduce the energy consumption of mobile nodes. The objective of the proposed system is to find an optimal route based on these two energy metrics when choosing a route for data packets. The system is implemented in NS-2.35. Simulation results show that the proposed routing protocol, E-EPAR, with transmission power and residual energy control, can extend the lifespan of the network and achieve higher performance than the EPAR and DSR routing protocols.
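A route choice on the two energy metrics mentioned above can be sketched as: prefer the route whose weakest node has the most residual energy, breaking ties by total transmission power. The metric names and the exact combination rule here are illustrative assumptions, not taken from E-EPAR itself.

```python
# Score a route by (min residual energy, -total tx power) and pick the max:
# the route that avoids draining its weakest node, at the lowest power cost.

def route_score(route):
    """route: list of (residual_energy_joules, tx_power_mw) per hop."""
    min_residual = min(e for e, _ in route)
    total_tx_power = sum(p for _, p in route)
    return (min_residual, -total_tx_power)   # maximize residual, minimize power

def best_route(routes):
    return max(routes, key=route_score)

route_a = [(5.0, 10), (4.0, 12), (6.0, 10)]   # weakest node: 4.0 J
route_b = [(8.0, 15), (7.0, 15), (9.0, 14)]   # weakest node: 7.0 J
print(best_route([route_a, route_b]) is route_b)  # -> True
```

Maximizing the minimum residual energy (rather than the sum) is what spreads load away from nearly depleted nodes and thus extends network lifetime.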
In this age of universal electronic connectivity when world is becoming a global village ,different threats like
viruses and hackers, eavesdropping and fraud, undeniably there is no time at which security does not matter. In view of large
growing population of vulnerabilities, major challenge is how to prevent exploitation of these vulnerabilities by attackers. The
first step in understanding vulnerabilities is to classify them into a taxonomy based on their characteristics. A good taxonomy
also provides a common language for the study of the field. Properties and requirements of good taxonomy are described in
this paper to lead security experts for the development of secure infrastructure. An analysis of some prominent taxonomies and
their valuable aspects are highlighted that can be used to create a complete useful taxonomy.
Weather forecasting has been one of the most scientifically and technologically challenging problems around the world over the last century. One of the most popular methods for weather prediction is data mining, and over the years many data mining techniques, such as decision trees and artificial neural networks, have been used to forecast weather. In this paper we present a new technique for weather prediction, namely CPT+, which is more efficient than the methods used to date. We present a novel prediction model that losslessly compresses the training data so that all relevant information is available for each prediction. Our approach is incremental, offers low time complexity in its training phase, and is easily adaptable to different applications and contexts.
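The sequence-based prediction idea can be illustrated with a toy model: training sequences are retained (here in a successor index rather than CPT+'s actual compressed prediction tree, which this sketch does not reproduce), and the recent symbols of a new sequence vote for the most likely next symbol. Class and method names are hypothetical.

```python
# Toy illustration of sequence-based next-symbol prediction, in the spirit
# of (but much simpler than) CPT+.
from collections import defaultdict

class ToySequencePredictor:
    def __init__(self):
        # successor counts: symbol -> {next_symbol: count}
        self.successors = defaultdict(lambda: defaultdict(int))

    def train(self, sequences):
        for seq in sequences:
            for i in range(len(seq) - 1):
                self.successors[seq[i]][seq[i + 1]] += 1

    def predict(self, context):
        # Every symbol in the context votes for its observed successors.
        votes = defaultdict(int)
        for sym in context:
            for nxt, count in self.successors[sym].items():
                votes[nxt] += count
        return max(votes, key=votes.get) if votes else None
```

For example, after training on daily weather sequences, `predict(["sunny"])` returns the most frequently observed follower of "sunny" in the training data.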
There is an urgent need for improving security in the banking sector. This paper discusses face recognition technology, an important field of biometrics, which can be employed to check fraud at ATMs. Recent progress in biometric identification techniques has gone a long way towards rescuing users from unsafe situations at the ATM. Several facial recognition techniques are studied, spanning two approaches: appearance based and geometry based. A new facial recognition technique, the 3-D technique, is also reviewed in the paper. These techniques are widely used in e-passports and at airports for the entry of travellers and others. ATM systems today use no more than an access card and PIN for identity verification. If the proposed facial recognition technology becomes widely used, faces would be protected as well as PINs.
— The cloud is expanding from application aggregation and sharing to data aggregation and utilization. To make use of this data, tens of terabytes and even petabytes must be handled; such massive amounts of data are called big data. Range-aggregate queries apply an aggregate function to all tuples within given query ranges. Fast RAQ first divides big data into independent partitions with a balanced partitioning algorithm, then generates a local estimation sketch for each partition. When a range-aggregate query request arrives, Fast RAQ obtains the result directly by summarizing the local estimates from all partitions into a collective result. Conventional data mining can process only structured data, so a big-data approach is adopted throughout the paper. We propose a three-tier architecture: (1) big-data implementation in a multi-system approach; (2) application deployment for banking/insurance; (3) extraction of useful information from unstructured data. We implement this project for the banking domain with two major components. First, a bank server adds new clients and maintains their accounts; during registration, every user must provide an Aadhaar card as ID proof to create an account in any bank. Second, an accounts-monitoring server monitors every user and their account status across banks; it retrieves users who maintain and transact more than Rs. 50,000 per annum across three accounts in different banks using the same ID proof. Map and reduce is applied throughout.
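The partition-then-summarize flow can be sketched in a few lines. Here the "sketch" per partition is an exact local aggregate rather than Fast RAQ's approximate estimation structure, and the round-robin partitioner stands in for its balanced partitioning algorithm; both simplifications are assumptions for illustration.

```python
# Hedged sketch of range-aggregate querying over independent partitions.
# rows: list of (key, value) tuples.

def partition(rows, n_parts):
    """Balanced round-robin split into independent partitions."""
    parts = [[] for _ in range(n_parts)]
    for i, row in enumerate(rows):
        parts[i % n_parts].append(row)
    return parts

def local_estimate(part, lo, hi):
    """Local answer for one partition: SUM over keys in [lo, hi]."""
    return sum(v for k, v in part if lo <= k <= hi)

def range_sum(parts, lo, hi):
    """Collective result: summarize the local estimates of all partitions."""
    return sum(local_estimate(p, lo, hi) for p in parts)
```

Because each partition answers independently, the per-partition work can run in parallel across machines and only the small local results need to travel to the coordinator.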
Cloud computing is becoming popular, and building high-quality cloud applications is a critical research problem. QoS rankings provide valuable information for making an optimal cloud service selection from a set of functionally equivalent service candidates. To obtain QoS values, real-world invocations of the service candidates are usually required, mediated by the cloud broker. To avoid time-consuming and expensive real-world service invocations, this paper proposes a QoS ranking prediction framework for cloud services that takes advantage of the past service usage experiences of other consumers. Our proposed framework requires no additional invocations of cloud services when the cloud broker makes a QoS ranking prediction. Two personalized QoS ranking prediction approaches are proposed to predict the QoS rankings directly, based on cost and ranking. Comprehensive experiments are conducted on real-world QoS data, involving 300 distributed users and 500 real-world web services from all over the world. The experimental results show that our approaches outperform other competing approaches.
The growth of software engineering can justifiably be attributed to advances in software testing. The quality of the test cases used determines the quality of software testing, which is why test cases are primarily crafted manually. However, generating test cases manually is an intense, complex and time-consuming task. There is, therefore, an immediate need for an automated test data generator that accomplishes the task with the same effectiveness as manual crafting of test cases. The work presented here automates the process of test path generation with the goal of attaining maximum coverage, presenting a technique that uses Cellular Automata (CA) to generate test paths and thereby opening the window of cellular automata to software testing. The approach has been verified on programs selected according to their lines of code and utility, and the results obtained have been verified.
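A core ingredient of any CA-based generator is the evolution step of an elementary cellular automaton, whose successive rows of bits can then seed branch decisions along a control-flow graph. The rule number and the circular boundary below are illustrative assumptions; the abstract does not specify which CA the authors use.

```python
# Hedged sketch: one evolution step of an elementary cellular automaton
# (Wolfram rule encoding), usable as a pseudo-random bit source for
# driving test-path/branch selection.

def ca_step(cells, rule=30):
    """Advance a row of 0/1 cells by one step under the given rule."""
    n = len(cells)
    out = []
    for i in range(n):
        # Circular (wrap-around) neighbourhood: left, self, right.
        left, mid, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (mid << 1) | right   # neighbourhood as 3-bit index
        out.append((rule >> idx) & 1)            # look up the rule table bit
    return out
```

Iterating `ca_step` from a seed row yields a deterministic yet well-mixed bit stream; interpreting each bit as a true/false branch choice gives one candidate test path per row.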
In distributed applications, data centers process high volumes of data to serve user requests. Using a SQL analyzer to process user queries is centralized, making it difficult to manage large data sets and to retrieve data from storage; nor can the system execute in parallel by distributing data across a large number of machines. Systems that compute SQL analytics over geographically distributed data operate by pulling all data to a central location, which is problematic at large data scales due to expensive transoceanic links. We therefore implement Continuous Hive (CHIVE), which facilitates querying and managing large datasets residing in distributed storage. Hive provides a mechanism to structure the data and query it using a SQL-like language called HiveQL, and it optimizes query plans to minimize their overall bandwidth consumption. The proposed system optimizes query execution plans and data replication to minimize bandwidth cost.
Cloud computing is an emerging paradigm that provides computing, communication and storage resources as a service over a network. In existing systems, data outsourced to a cloud is unsafe due to eavesdropping and hacking, and minimizing network delays alone does not secure it. In this paper we study data replication in cloud computing data centers. Unlike other approaches in the literature, we consider both security and privacy preservation in the cloud. To address this problem we use the DROPS methodology. The data are encrypted using the Advanced Encryption Standard (AES) algorithm. In this process, the data are divided into multiple fragments, which are replicated over the cloud nodes, each fragment stored on a different node in an individual location. We ensure controlled replication of the file fragments: each fragment is replicated only once, for the purpose of improved security. The results of the simulations reveal that the simultaneous focus on security and performance resulted in an improved security level of the data, accompanied by a slight performance drop.
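The encrypt-fragment-place pipeline can be sketched as below. Note the stand-in cipher: a SHA-256-derived XOR keystream is used purely so the sketch stays self-contained; it is NOT AES and a real deployment would use AES as the abstract states. Fragment sizing and node assignment are likewise illustrative assumptions, not the DROPS placement algorithm.

```python
# Hedged sketch of DROPS-style fragment-and-replicate (placeholder cipher).
import hashlib

def xor_stream(data: bytes, key: bytes) -> bytes:
    """Stand-in cipher: XOR with a SHA-256-derived keystream (NOT real AES)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def fragment(data: bytes, n: int):
    """Split ciphertext into n roughly equal fragments."""
    size = -(-len(data) // n)  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

def place(fragments, nodes):
    """Controlled replication: each fragment stored once, on a distinct node."""
    assert len(nodes) >= len(fragments)
    return {node: frag for node, frag in zip(nodes, fragments)}
```

Because no single node holds two fragments, compromising one node reveals at most one encrypted slice of the file, which is the security rationale behind the single-copy placement.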
— Mining the data an application needs is a crucial activity in a computerized environment, and mining techniques were introduced for this purpose; this project applies them to mobile apps. Ranking fraud in the mobile app market refers to fraudulent or deceptive activities whose purpose is to bump apps up the popularity list. Indeed, it has become more and more frequent for app developers to use shady means, such as inflating their apps' sales or posting phony ratings, to commit ranking fraud. We first propose to accurately locate ranking fraud by mining the active periods, namely leading sessions, of mobile apps. Furthermore, we investigate three types of evidence (ranking based, rating based and review based) by modeling apps' ranking, rating and review behaviors through statistical hypothesis tests. In addition, an optimization-based approach integrates all the evidence for fraud detection, based on the EIRQ (efficient information retrieval for ranked query) algorithm. Finally, we evaluate the proposed system with real-world app data collected from the iOS App Store over a long time period. Experiments validate the effectiveness of the proposed system and show the scalability of the detection algorithm, as well as some regularity in ranking fraud activities.
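Extracting leading sessions from a ranking history can be sketched as follows: a leading event is a maximal run of days where the app sits inside the top-K, and events separated by a short gap merge into one session. The thresholds `k` and `max_gap` are illustrative assumptions, not the paper's calibrated values.

```python
# Hedged sketch of leading-session mining over a daily ranking history.
# ranks: list of chart positions per day (1 = top of chart).

def leading_sessions(ranks, k=10, max_gap=2):
    """Return (start_day, end_day) spans where the app is in the top-k,
    merging runs separated by at most max_gap off-chart days."""
    sessions, current, gap = [], [], 0
    for day, r in enumerate(ranks):
        if r <= k:
            current.append(day)
            gap = 0
        elif current:
            gap += 1
            if gap > max_gap:               # gap too long: close the session
                sessions.append((current[0], current[-1]))
                current, gap = [], 0
    if current:
        sessions.append((current[0], current[-1]))
    return sessions
```

Fraud evidence is then computed per session rather than over the whole history, since a bumped-up app typically shows short, sharp leading sessions.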
— To secure outsourced data in cloud storage against corruption, adding fault tolerance to cloud storage, together with data integrity checking and failure repair, becomes critical. Recently, regenerating codes have gained popularity due to their lower repair bandwidth while providing fault tolerance. Existing remote checking methods for regenerating-coded data provide only private auditing, requiring data owners to always stay online and handle auditing as well as repairing, which is sometimes impractical; moreover, all the distributed data are stored in the same functional location, so search and data retrieval take much time, and this delay affects the efficiency of the distributed storage. In this project, a public auditing scheme for regenerating-code-based cloud storage is proposed, together with an Attribute Based Clustering Technique (ABCT) for distributed data. ABCT resolves the time-delay issue and makes the system more efficient. We randomize the encoding coefficients with a pseudorandom function to preserve data privacy. ABCT achieves much faster data searching and retrieval. Extensive security analysis shows that our scheme is provably secure under the random oracle model, and experimental evaluation shows that it is highly efficient and can be feasibly integrated into regenerating-code-based cloud storage.
Identification of plant diseases is the key to preventing losses in the yield and quality of the agricultural product. The study of pomegranate plant diseases means the study of visually observable patterns seen on the plant. It is very difficult to monitor pomegranate plant diseases manually; hence, image processing is used for their detection. Disease detection involves steps such as image acquisition, image pre-processing, image segmentation, statistical feature extraction and classification. A k-means clustering algorithm is used for segmentation, and a support vector machine is used for classification of the disease.
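The segmentation step can be illustrated with a minimal one-dimensional k-means over pixel intensities; real systems cluster in a colour space such as L*a*b* and feed the segmented regions to the SVM classifier, so this is a deliberately simplified sketch with hypothetical function names.

```python
# Hedged sketch: 1-D k-means over pixel intensity values, illustrating the
# segmentation stage (diseased vs healthy regions separate by brightness here).

def kmeans_1d(values, k=2, iters=20):
    # Spread the initial centers across the sorted value range.
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest center.
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        # Recompute each center as its cluster mean (keep old center if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```

With two well-separated intensity groups, the centers converge to the group means, and thresholding midway between them yields the binary segmentation mask.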
— Ration distribution is one of the big issues involving corruption and smuggling of goods, and the only reason this happens is that every task in the ration shop is performed manually. Irregularities occur such as: incorrect entries in the store's collection record, giving wrong stock information about the products supplied to people; at times, the distribution of lower-quality products than those actually provided by the Government for the public; and misreporting of the available stock quantity that the Government supplies to a ration shop for the public. In this paper we propose replacing the manual work in the public distribution system with an automated system that can be installed at the ration shop with ease. This would bring transparency to the rationing system, as there will be direct communication between users and the Government through this system.
— Bug triage, the first step in the process of fixing a bug, aims to properly assign a developer to a new bug, and software companies spend much of their budget dealing with these bugs. To reduce the time and cost of bug triaging, an automated approach is developed to predict a developer with relevant experience to solve a newly arriving report. In the proposed approach, data reduction is performed on the bug data set, which reduces the scale of the data while increasing its quality; instance selection and feature selection are used simultaneously with historical bug data. Previously, text classification techniques were applied to conduct bug triage; the problem is obtaining quality bug data sets, as they are very large. The proposed system addresses both reducing the size and improving the quality of bug data. First, pre-processing removes unimportant attributes and identifies missing terms. Then instance selection is combined with feature selection, using a dimensionality reduction technique to simultaneously reduce the data size in the bug dimension and the word dimension. A PSO algorithm determines the reduction order using a fitness value, producing a quality bug data set. The results show that the proposed system can effectively reduce the data size and improve the accuracy of bug triage, providing an approach that leverages data-processing techniques to form reduced, high-quality bug data for software development and maintenance.
The agile methods, such as Scrum and Extreme Programming (XP), have been a topic of much discussion in
the software community over the last few years. While these have gained importance in the industry because of
their approach on the issues of human agility and return on investment, usually within a context of small-tomedium
size projects with significant requirements volatility, the ones who do not support these methods have
expressed serious concerns about the effectiveness of the methods. Scrum attempts to build the work in short
iterations where each iteration consists of short time boxes. This paper reviews several papers on Scrum, its
framework including its artifacts and the ceremonies which are involved. This paper gives an insight of the
Scrum Methodology to any beginner.
Keywords: Agile methods, methodology, Scrum, software process, Sprint, Backlog, Artifacts
— In general, keypad-based authentication systems admit several possibilities of password guessing through observation of shoulder movements. Shoulder surfing is an attack on password authentication that has traditionally been hard to defeat, and this problem calls for a new solution. Devising a user authentication scheme based on personal identification numbers (PINs) that is both secure and practically usable is a challenging problem. The greatest difficulty lies in the susceptibility of the PIN entry process to direct observational attacks, such as human shoulder surfing and camera-based recording. PIN entry is widely used for authenticating a user; it is a popular scheme because it nicely balances the usability and security aspects of a system. However, if this scheme is used in a public system, it may suffer from shoulder-surfing attacks, in which an unauthorized user fully or partially observes the login session, or even records it to recover the actual PIN later. In this paper, we propose an intelligent user interface, known as Color Pass, to resist shoulder surfing, so that any genuine user can enter the session PIN without disclosing the actual PIN. Color Pass is based on a partially observable attacker model. Experimental analysis shows that the Color Pass interface is safe and easy to use, even for novice users.
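The partially observable idea can be illustrated with a toy round: digits 0-9 are randomly regrouped under a few colours each round, and the user presses only the colour holding the current PIN digit. An observer who sees the colour but not that round's private grouping learns little about the digit. This mirrors the principle only; it is not the actual Color Pass interface, and all names below are hypothetical.

```python
# Hedged sketch of a partially observable colour-based PIN entry round.
import random

COLORS = ("red", "green", "blue", "yellow")

def deal_colors(rng, colors=COLORS):
    """Privately assign each digit 0-9 a colour for this round."""
    digits = list(range(10))
    rng.shuffle(digits)
    return {d: colors[i % len(colors)] for i, d in enumerate(digits)}

def respond(pin, round_maps):
    """The user reveals one colour per PIN digit, never the digit itself."""
    return [m[d] for d, m in zip(pin, round_maps)]
```

Since each colour covers two or three digits and the grouping is re-randomized every round, a single observed session narrows the PIN only slightly, which is the partially observable security argument.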
The main objective is to preserve data privacy during communication. In this paper, we show how external aggregators or multiple parties can compute algebraic statistics over their private data without compromising data privacy, assuming all channels and communications are open to eavesdropping attacks. First, we propose several protocols that guarantee data privacy. We then propose advanced protocols that tolerate up to k passive adversaries who do not try to modify the computation.
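One of the simplest protocols with this flavour is additive secret sharing, sketched below as an illustration (the paper's own protocols may differ): each party splits its value into random shares, one per peer, so that any observer sees only uniformly random shares, yet the aggregator recovers the exact sum.

```python
# Hedged sketch: privacy-preserving sum via additive secret sharing.
import random

M = 2**31 - 1  # arithmetic modulus (illustrative choice)

def share(value, n, rng):
    """Split value into n shares that sum to value mod M; any n-1 shares
    are uniformly random and reveal nothing about value."""
    shares = [rng.randrange(M) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % M)
    return shares

def aggregate(all_shares):
    """Each party j sums the shares it received (column j); the aggregator
    combines only these column sums, never seeing any private value."""
    n = len(all_shares[0])
    col_sums = [sum(s[j] for s in all_shares) % M for j in range(n)]
    return sum(col_sums) % M
```

Passive adversaries who pool fewer than all share-holding parties still see only random values, matching the "up to k passive adversaries" tolerance described above.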
— Sources such as wind power and solar power are expected to be promising energy sources when connected to the power grid. Wind generators have a significant impact on power quality, voltage profile and power flow for customers and electricity suppliers. The power drawn from these sources varies with environmental conditions; due to the fluctuating nature of the wind, wind power injection into an electric grid affects power quality. The influence of wind sources on the grid system concerns power quality measures such as reactive power, active power, voltage variation, harmonics and electrical behaviour during switching operations [1]. A demonstration of a grid-connected wind turbine is considered here, along with the problems arising from such a system. At the point of common coupling, a Static Synchronous Compensator with a Battery Energy Storage System (STATCOM/BESS) can regulate four-quadrant active and reactive power, which is an ideal scheme for solving the problems of wind power generation. As the power from wind generation varies with time, battery energy storage is used to maintain constant real power despite the varying wind power; the power generated by the wind generator can be stored in the batteries during low-demand hours [2-4]. The combination of battery storage with a wind energy generation system synthesizes the output waveform by absorbing or injecting reactive power, enabling the real power flow required by the load. The control strategy coordinates the charge and discharge of the batteries with the reactive power compensation of the STATCOM and balances the battery capacity. If required, the amount of energy consumed from or supplied to the grid can be observed through an online smart meter connected in the circuit.
The research work in this paper aims to develop an unmanned aerial vehicle (UAV) equipped with modern technologies for various civil and military applications. The shrinking size and increasing capabilities of microelectronic devices in recent years have opened the doors to more capable autopilots and pushed for more real-time UAV applications. The UAV market is expected to grow dramatically by 2020, as military, civil and commercial applications continue to develop. Potential changes in air traffic management include the creation of an information management system to exchange information among air traffic management users and providers, the introduction of new navigation methods, and the development of alternative separation procedures. The impact of each scenario on future air traffic and surveillance is summarized, and the associated issues are identified; the paper concludes by describing the need for a UAV roadmap to the future. This paper aims to provide a simple, low-cost autonomous aerial surveyor that can perform aerial surveillance, recognize and track various objects, and build a simple 3D map.
In the modern world, growth is a factor every company seeks, and growth is determined by the revenue generated, the number of consumers satisfied, and so on. For product-based companies, achieving a greater number of sales is the key objective, attained by understanding customers' requirements, employing various marketing strategies, carrying out relevant analysis, advertising, etc. All of these factors pose a great challenge for every enterprise and company around the world, but understanding the customer's interest is the greatest challenge of all. Every time we step into a mall or exhibition, it is a person's natural tendency to spend more time with the object that meets their interest; yet until now, the interaction between consumers and products with respect to time has not been tracked. In our project, with the help of wireless communication, ubiquitous sensors such as RFID, and a proper GUI, we develop a consumer-interest tracking device capable of gathering valuable information about the time spent by an individual at various stores or products in an exhibition or shopping mall. From the information collected we determine the consumer's interest, which in turn helps the company manufacture better products, take smarter decisions and ensure a safer future for the enterprise. The information gathered is made available for real-time monitoring, or can be stored for future analysis.
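The core analytic, turning raw RFID read events into per-product dwell times, can be sketched as follows. The event format `(tag_id, zone, timestamp)` and the rule that consecutive same-zone reads accumulate dwell time are assumptions for illustration, since the abstract does not specify the data model.

```python
# Hedged sketch: aggregate dwell time per (tag, zone) from raw RFID reads.
from collections import defaultdict

def dwell_times(events):
    """events: iterable of (tag_id, zone, timestamp_seconds).
    Returns {(tag_id, zone): total_seconds_spent}."""
    totals = defaultdict(float)   # (tag, zone) -> accumulated seconds
    last = {}                     # tag -> (zone, last_timestamp)
    for tag, zone, ts in sorted(events, key=lambda e: e[2]):
        if tag in last and last[tag][0] == zone:
            # Still in the same zone: extend the dwell interval.
            totals[(tag, zone)] += ts - last[tag][1]
        last[tag] = (zone, ts)
    return dict(totals)
```

Ranking zones by total dwell time across all tags then gives the store a direct, quantitative reading of which products hold consumer interest longest.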
— Classical fuzzy system modeling methods assume data generated from a single task; when the data are acquired from the perspective of multiple tasks, such modeling has an intrinsic inconsistency. In this project, a multitask fuzzy system modeling method that mines the common hidden structure among tasks is proposed to overcome this weakness of classical TSK-based fuzzy modeling for multitask learning. When classical fuzzy modeling methods are applied to multitask datasets, they usually focus on task-independence information and ignore the correlation between different tasks. Here we mine the common hidden structure among multiple tasks to realize multitask TSK fuzzy system learning, making good use of both the independence information of each task and the correlation information captured by the common hidden structure among all tasks. Thus, the proposed learning algorithm can effectively improve both the generalization and the fitting performance of the learned fuzzy system for each task. Our experimental results demonstrate that the proposed MTCS-TSK-FS has better modeling performance and adaptability than existing TSK-based fuzzy modeling methods on multitask datasets. Learning multiple tasks across different datasets is a challenging problem, since the feature space may not be the same for different tasks, and the data may be of any type, such as text datasets. Index Terms: common hidden structure, fuzzy modeling, multitask learning, Takagi-Sugeno-Kang (TSK) fuzzy systems.
Department of Information Technology. Abstract: In biometric verification, a person can be uniquely identified by one or more of their biological traits. These unique identifiers include eye patterns, hand patterns, fingerprints, earlobe geometry, retinas, DNA, voice waves, and signatures. Recent research in the biometrics field has gone into finding more ways to identify someone through the different traits proposed. Among these, earlobe biometrics is considered one of the most stable in the light of the aging of the human body: unlike other traits, earlobe geometry does not change with time or emotion. The geometric features considered in earlobe biometrics are the ear's height, corresponding angles, the inner ear curve, and reference-line cut points. Random orientation is applied, and the approach shows greater accuracy than the previous model. Recognition accuracy is further increased by training on images in databases. This class of passive identification is ideal for biometrics because the features are robust and can be reliably extracted from a distance.
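Once the geometric features are extracted, verification reduces to comparing a probe feature vector against enrolled templates. A sketch under that assumption; the feature ordering, the enrolled names, and the distance threshold are all hypothetical, and a real system would normalize each feature to a comparable scale before matching:

```python
import math

def match_ear(probe, templates, threshold=0.15):
    """Return the enrolled identity whose feature vector is closest
    to the probe, or None if no distance falls under the threshold.

    Feature vectors hold normalized values for ear height,
    corresponding angle, inner-curve length, and cut-point offset.
    """
    best_id, best_d = None, float("inf")
    for person_id, tmpl in templates.items():
        d = math.dist(probe, tmpl)   # Euclidean distance in feature space
        if d < best_d:
            best_id, best_d = person_id, d
    return best_id if best_d <= threshold else None

templates = {"alice": [0.62, 0.30, 0.81, 0.12],
             "bob":   [0.55, 0.41, 0.70, 0.25]}
print(match_ear([0.60, 0.31, 0.80, 0.13], templates))  # prints "alice"
```

The threshold trades off false accepts against false rejects and would be tuned on a labeled gallery.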
Nowadays, signal processing plays an important role in the analysis and interpretation of ECG signals. An important objective of ECG signal processing is to provide a filtered result with maximum accuracy, along with information that cannot readily be extracted from visual assessment of the ECG. ECG signals are obtained by placing electrodes on the surface of the human body, which leads to contamination of the signal with noise: baseline wander, power-line interference, electromyographic (EMG) noise, electrode-motion artifacts, and more. These noises act as hurdles during processing of the ECG signal, so pre-processing to remove and reject them is an important task. Filtering techniques are therefore the primary tool for pre-processing any signal, including the ECG. The only care that must be taken is that the real information in the ECG signal is not distorted. In this paper, the main concentration is on filtering out the baseline wander and the power-line interference.
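The two filtering stages targeted here can be sketched with standard IIR designs: a high-pass around 0.5 Hz for baseline wander and a narrow notch at the mains frequency for power-line interference. The cutoff, notch Q, and mains frequency below are illustrative choices, and zero-phase filtering via `filtfilt` is one way to honor the requirement that the ECG waveform itself not be distorted:

```python
import numpy as np
from scipy import signal

def clean_ecg(ecg, fs, mains_hz=50.0):
    """Remove baseline wander and power-line interference.

    fs: sampling rate in Hz. Zero-phase filtering (filtfilt)
    preserves the timing and shape of the QRS complex.
    """
    # High-pass Butterworth at 0.5 Hz removes slow baseline drift.
    b_hp, a_hp = signal.butter(2, 0.5, btype="highpass", fs=fs)
    x = signal.filtfilt(b_hp, a_hp, ecg)
    # Narrow notch at the mains frequency removes 50/60 Hz hum.
    b_n, a_n = signal.iirnotch(mains_hz, Q=30.0, fs=fs)
    return signal.filtfilt(b_n, a_n, x)

# Synthetic check: a 10 Hz "ECG-like" tone corrupted by 0.2 Hz drift
# and 50 Hz hum should come back close to the clean tone.
fs = 500.0
t = np.arange(0, 4, 1 / fs)
clean = np.sin(2 * np.pi * 10 * t)
noisy = clean + 0.8 * np.sin(2 * np.pi * 0.2 * t) \
              + 0.5 * np.sin(2 * np.pi * 50 * t)
out = clean_ecg(noisy, fs)
```

On real recordings the mains frequency (50 or 60 Hz) and the high-pass cutoff would be matched to the local power grid and the lowest ECG frequency of clinical interest.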