The present drinking water supply control system faces many problems related to testing, purification, pumping, storage and distribution of water. The major problems in the water supply system are leakage or wastage of water, and the fact that much of the public uses suction motors to draw water directly from the main supply connection. To address these problems, an automated system is proposed that enhances water purification, storage and distribution, reduces wastage of water and identifies water theft. A GSM modem is used for wireless communication so that information can be passed to the responsible officer's cell phone for immediate action. The water supply system thus integrates water distribution, water quality control and monitoring.
The Multiple-Input Multiple-Output (MIMO) based wireless system is a promising high-data-rate interface technology. This work presents a method based on Space-Time Block Coding (STBC) with a MIMO set-up for use in wireless channels. A special version of STBC, the Alamouti code, is used to exploit the performance of MIMO under adaptive modulation. Higher communication speed without compromising accuracy is the prime focus of research in wireless communication, and space-time coding is one technique that enables higher speed while maintaining the error rate. In this paper, an analysis of the BER performance and capacity of a space-time-coded OFDM system is given. The simulation has been performed with different modulation schemes, and the comparative results are presented.
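Since the abstract centers on the BER behavior of the Alamouti scheme, a minimal simulation sketch may help make the combining step concrete. The following Python/NumPy snippet is an illustration, not the paper's simulation code; the BPSK mapping and the 2x1 antenna configuration are assumptions.

```python
# Hedged sketch: BER of a 2x1 Alamouti STBC with BPSK over a flat Rayleigh channel.
import numpy as np

rng = np.random.default_rng(0)

def alamouti_ber(snr_db, n_pairs=200_000):
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, size=(n_pairs, 2))
    s = 2 * bits - 1                                   # BPSK symbols s1, s2 per pair
    h = (rng.standard_normal((n_pairs, 2)) +
         1j * rng.standard_normal((n_pairs, 2))) / np.sqrt(2)   # h1, h2
    noise_std = np.sqrt(1 / (2 * snr))                 # unit total transmit power
    n = noise_std * (rng.standard_normal((n_pairs, 2)) +
                     1j * rng.standard_normal((n_pairs, 2)))
    s1, s2 = s[:, 0] / np.sqrt(2), s[:, 1] / np.sqrt(2)   # half power per antenna
    h1, h2 = h[:, 0], h[:, 1]
    r1 = h1 * s1 + h2 * s2 + n[:, 0]                      # time slot 1
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + n[:, 1]   # time slot 2
    y1 = np.conj(h1) * r1 + h2 * np.conj(r2)              # Alamouti combining
    y2 = np.conj(h2) * r1 - h1 * np.conj(r2)
    est = np.stack([y1, y2], axis=1).real > 0
    return np.mean(est != bits)

for snr_db in (0, 5, 10, 15):
    print(f"SNR {snr_db:2d} dB  BER {alamouti_ber(snr_db):.4f}")
```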
Image re-ranking is one of the most effective techniques to improve the results of web-based image search, and it is adopted by current commercial search engines such as Bing and Google. In this paper we propose a novel image re-ranking framework with two stages, offline and online. In the offline stage, different semantic spaces for different query keywords are automatically learned, and the visual features of images are projected into their related semantic spaces to obtain semantic signatures. At the online stage, images are re-ranked by comparing their semantic signatures obtained from the semantic space specified by the query keyword. The proposed system significantly improves both the accuracy and efficiency of image re-ranking by making use of query-specific semantic signatures.
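To make the two-stage idea concrete, here is a minimal sketch of how a query-specific semantic signature could be formed and used for re-ranking. The linear reference classifiers, feature dimensions and random data are assumptions for illustration only; the paper's actual learning procedure is not reproduced.

```python
# Hedged sketch: re-ranking by query-specific semantic signatures (illustrative only).
import numpy as np

rng = np.random.default_rng(1)

def semantic_signature(visual_feature, reference_classifiers):
    """Project a raw visual feature into the query's semantic space:
    one score per reference class learned in the offline stage."""
    scores = reference_classifiers @ visual_feature          # linear scorers (assumed)
    return scores / (np.linalg.norm(scores) + 1e-12)

# Toy data: 100 candidate images with 512-D visual features,
# and 20 reference classes learned offline for the query keyword.
features = rng.standard_normal((100, 512))
classifiers = rng.standard_normal((20, 512))

signatures = np.array([semantic_signature(f, classifiers) for f in features])
query_sig = semantic_signature(features[0], classifiers)     # clicked query image

# Online stage: rank by similarity between short signatures, not raw features.
similarity = signatures @ query_sig
reranked = np.argsort(-similarity)
print("top-5 re-ranked image indices:", reranked[:5])
```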
Biometric systems are vulnerable to spoofing attacks. A reliable and efficient countermeasure is needed in order to combat the epidemic growth in identity theft. To ensure biometric system security, liveness assessment can be applied to guard against such harmful spoofing attacks. In this paper, we present a novel software-based protection measure against fraudulent biometric system access attempts. A legitimate biometric sample contains information that can be analyzed to discriminate it from a self-manufactured, synthetic or reconstructed fake biometric trait used in a fraudulent access attempt. The proposed software-based method computes more than 25 reference and no-reference image quality features of a biometric sample to verify its legitimacy, and these extracted features are sufficient to distinguish between legitimate and impostor samples. The approach makes the biometric system more user-friendly, fast, less complex than hardware-based systems, and more suitable for real-time applications.
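As a rough illustration of the kind of image-quality measures involved, the sketch below computes a handful of reference-style and no-reference features with NumPy/SciPy. The specific features and the smoothed-copy reference are assumptions; the paper uses more than 25 features and its own classifier.

```python
# Hedged sketch: a few illustrative image-quality features for liveness assessment.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def quality_features(img):
    """img: 2-D grayscale array in [0, 255]."""
    img = img.astype(float)
    smoothed = gaussian_filter(img, sigma=1.0)        # reference-style feature:
    mse = np.mean((img - smoothed) ** 2)              # distortion vs. smoothed copy
    psnr = 10 * np.log10(255 ** 2 / (mse + 1e-12))
    sharpness = np.var(laplace(img))                  # no-reference blur measure
    contrast = img.std()
    return np.array([mse, psnr, sharpness, contrast])

# These feature vectors would then be fed to a binary classifier
# (legitimate vs. fake sample) trained on labelled data.
sample = np.random.default_rng(2).random((128, 128)) * 255
print(quality_features(sample))
```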
With the huge advancement in wireless communication and its applications, efficient use of the spectrum has become a major issue for researchers. This has driven demand for cognitive radio (the intelligent network), which analyzes the spectrum to increase its efficiency by utilizing it properly. This paper focuses on sensing the spectrum in order to find licensed but underutilized bands and allocate them to secondary users without interfering with the primary user. Two spectrum sensing methods, Energy Detection (ED) and Cyclostationary Feature Detection (CFD), are evaluated over three fading channels (AWGN, Rayleigh, and Rician) at different SNR values, where prior knowledge of the signal is not required. The simulation results are plotted and compared in this paper, and they show that CFD is preferable at low SNR.
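A minimal sketch of the energy detection (ED) branch may clarify the test: sum the sample energies and compare against a threshold set from the noise variance for a target false-alarm rate. The Gaussian threshold approximation and the BPSK primary signal below are assumptions for illustration.

```python
# Hedged sketch: energy detection of a primary user in AWGN.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def energy_detect(samples, noise_var, pfa=0.05):
    n = len(samples)
    energy = np.sum(np.abs(samples) ** 2)
    # Gaussian approximation of the test statistic under H0 (noise only)
    threshold = noise_var * (n + norm.ppf(1 - pfa) * np.sqrt(2 * n))
    return energy > threshold

n, noise_var, snr_db = 1000, 1.0, -8
snr = 10 ** (snr_db / 10)
primary = np.sqrt(noise_var * snr) * np.sign(rng.standard_normal(n))  # BPSK primary user
noise = np.sqrt(noise_var) * rng.standard_normal(n)

print("H1 (signal present) detected:", energy_detect(primary + noise, noise_var))
print("H0 (noise only)     detected:", energy_detect(noise, noise_var))
```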
This project describes controlling a robot using wireless Bluetooth technology via an Android application. The application provides the robot-control interaction display, and a camera is fixed to the robot to monitor it through an Android mobile device. The prototype of the mobile robot is based on wireless technology, and this kind of robot can be helpful for spying in war fields.
An institutional repository is an online resource for storing academic materials in digital form, such as theses, dissertations and research articles, on behalf of a university or other institution. This paper focuses on the requirements, functions and use of digital preservation in an institutional repository context. It highlights various features and the importance of both an Institutional Digital Repository (IDR) and the open-source digital library software DSpace. The Central Library of IIT Kharagpur uses DSpace; the IDR of the Indian Institute of Technology Kharagpur (IITKGP) collects, preserves and makes available in digital format the scholarly output of the IIT Kharagpur community. Its interface provides for easy self-archiving by faculty and organizes documents in a logical, easily retrieved fashion.
The ceramic heat exchanger is studied to determine its heat transfer performance and pressure drop by numerical computation. The numerical computation was performed over the whole domain, including the fluid region in the exhaust-gas-side rectangular ducts, the ceramic core and the fluid region in the air-side rectangular duct, with the air and exhaust gas in cross flow. The main aim is to reduce the hot-side temperature from 1100°C to 600°C so that the gas subsequently passes through the metallic heat exchanger at temperatures below 600°C. Increasing the Reynolds number on the cold-air side increases the velocity of the cold fluid and hence the heat transfer rate; the velocity is further increased by using a nozzle effect on the cold-air slot. The main purpose of using ceramic is that it withstands higher temperatures than metal.
In this paper, the seismic behavior of an infilled reinforced concrete frame having a re-entrant corner is evaluated. The plan configuration of these space frames contains a re-entrant corner, where both projections of the structure beyond the re-entrant corner are 33 percent of the plan dimension of the structure in the given direction. Bare and infilled frames are considered. It is found that the infill has a beneficial effect in R.C. frames having a re-entrant corner.
This paper presents an experimental study of the thermal behaviour of single-phase flow through an automobile radiator. The radiator is an important accessory of a vehicle engine: it is normally used as the cooling system of the engine, and water is generally the heat transfer medium. In this liquid-cooled system, waste heat is removed via the circulating coolant surrounding the devices or entering the cooling channels in devices. Nanofluids have attracted attention as a new generation of heat transfer fluids in automotive cooling applications because of their excellent thermal performance. This study attempts to investigate the heat transfer characteristics of an automobile radiator using water-based CuO nanofluids as coolants. The thermal performance of an automobile radiator operated with nanofluids is compared with that of a radiator using conventional coolants.
A Langmuir probe is being developed for measurement of electron densities, ion densities, and electron temperatures in the Mars ionosphere. This article describes how a Langmuir probe can be used to study ionospheric irregularities. It begins with the basic principle of the Langmuir probe, the ionospheric regions where it can be used, and various sizes and shapes of the Langmuir probe sensor. In this paper, we present the design and implementation of a Langmuir probe instrument, including a block diagram of the instrument (see figure 2). The Langmuir probe experiment consists of a cylindrical sensor: a conducting probe is exposed to the medium under study, and the current collected by the probe is measured to study various plasma parameters. The probe can be kept at the ambient plasma potential, or given a negative voltage (to collect ion current) or a positive voltage (to collect electron current); the plasma parameters are then analyzed from the current-voltage characteristics. An operational amplifier is connected to the sensor, and a floating sweep voltage supply is applied to the sensor to collect the current. The collected current goes to a differential amplifier, which amplifies the difference between the two signals. A low-pass filter removes the unwanted frequency components of the signal, and an analog-to-digital converter generates a digital signal for processing. The output is displayed on an LCD screen using a microcontroller.
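As a small worked illustration of the analysis step, the electron temperature can be estimated from the slope of ln(I) versus V in the electron-retardation region of the sweep. The synthetic current values below are assumptions standing in for measured probe data.

```python
# Hedged sketch: electron temperature from the exponential part of a Langmuir-probe I-V sweep.
import numpy as np

kTe_true = 2.0                       # electron temperature in eV (assumed for the demo)
V = np.linspace(-5, 0, 50)           # bias voltages in the retardation region
I_e = 1e-6 * np.exp(V / kTe_true)    # electron current ~ exp(eV / kTe)

# In the transition region, ln(I_e) is linear in V with slope 1/kTe.
slope, _ = np.polyfit(V, np.log(I_e), 1)
print(f"estimated electron temperature: {1/slope:.2f} eV")
```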
In these rapidly changing times, and especially in India where the service sector is witnessing exponential growth, the online marketing sector is set for a bright future. The increased use of the internet in India provides greater prospects for online shopping. Despite this increased internet use, several factors affect Indian consumers' online buying behaviour, and if online retailers make themselves aware of these factors they can further develop their prospects and convert potential customers into active ones. In this research paper an effort has been made to find out the perceived risks of Indian customers with reference to online shopping. The risk in online shopping is mainly concerned with the misuse of credit cards, leakage of personal information, product risk and risk of convenience.
This paper presents a preemptive scheduling approach focused on providing a solution to the online scheduling problem for real-time tasks; we present a system that uses virtualization technology to allocate resources to tasks that are approaching their deadlines. The primary objective of real-time preemptive scheduling is to increase throughput and minimize average response time rather than merely meeting deadlines. Previously, real-time tasks were scheduled non-preemptively with the objective of maximizing the total utility, but a task with higher priority had to wait until the currently running task completed; if the waiting task reached its deadline, it was aborted. Before a task is aborted, it consumes system resources including network bandwidth, storage space and processing power, which affects the overall system performance and the response time of the task. A preemptive algorithm is therefore proposed in which tasks with higher priority are executed first, and tasks that are approaching their deadlines are assigned to a virtual machine; this increases the total system performance.
The ubiquity of smartphones, together with their ever-growing computing, networking, and sensing powers, has been changing the landscape of people's daily life. The goal of our application 'GAPS' is to manage the user's smartphone activities with respect to location. The features of GAPS are automated profile management with auto-response, location-based reminders, and a log maintainer. GAPS makes use of the GPS service to track the user's current location and Google Maps to fetch information about the detected location; this information is used in various ways across the application's modules. Automated profile management is the feature that recognizes the type of location and configures the audio profiles, and an auto-generated message is sent in response to a caller when the user is unable to receive the call. The location-based reminder is a unique feature of the application which reminds the user not only with respect to time but also location. The log maintainer keeps track of the locations the user has visited along with the timestamps.
This paper deals with automatically collecting customers' water consumption and detecting leakages in the water distribution system. Water leakage is an important component of water losses. In addition to raising consumer awareness of their water use, metering is also an important way to identify and localize water leakage. A leak detection program can be highly proactive, helping water utilities automate water systems, detect problem areas earlier, give customers tools to monitor water use, provide more accurate rates and reduce demand. The objective is to overcome the disadvantages of current meter technology and make the billing and troubleshooting process faster while reducing the wastage of water. GSM and ZigBee technology automatically collects consumption, diagnostic and status data from the meter and transfers that data to a central database for billing, troubleshooting, and analysis. This advancement saves utility providers the expense of periodic trips to each physical location to read a meter. Another advantage is that billing can be based on near-real-time consumption rather than on estimates from previous or predicted consumption. This timely information, coupled with analysis, can help both utility providers and customers better control water use.
Big Data is a collection of data that is too large or complex to process using on-hand database management tools or data processing applications; it is becoming very difficult for companies to store, retrieve and process this ever-increasing data. In other words, Big Data is the term given to a humongous amount of data that is difficult to store and process, and the issue with traditional systems is how to store and analyze it. Risk prediction involves the integration of clinical factors with socio-demographic factors such as health conditions, disease parameters, hospital care quality parameters, and a variety of variables specific to each health care provider, making the task increasingly complex. Unsurprisingly, many of these factors need to be extracted independently from different sources and integrated back to improve the quality of predictive modeling; such sources are typically voluminous, diverse, and vary significantly over time. This project uses Apache Hadoop as an intrinsic part of storing, retrieving, evaluating and processing huge volumes of data effectively. In this work, we study big-data-driven solutions to predict the 30-day risk of readmission for congestive heart failure (CHF) incidents. We predict this risk using Logistic Regression and Naive Bayes classification on the basis of data collected from patients, and the results of comparing the two techniques are presented through confusion matrices.
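A brief sketch of the classification comparison, using scikit-learn in place of the Hadoop pipeline and synthetic records in place of patient data (both assumptions), shows how the two models and their confusion matrices could be produced.

```python
# Hedged sketch: Logistic Regression vs. Naive Bayes on synthetic readmission-style data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

X, y = make_classification(n_samples=2000, n_features=20, weights=[0.8, 0.2],
                           random_state=0)       # 1 = readmitted within 30 days (toy)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("naive bayes", GaussianNB())]:
    model.fit(X_tr, y_tr)
    print(name, "\n", confusion_matrix(y_te, model.predict(X_te)))
```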
A biometric time and attendance system is one of the most successful applications of biometric technology, serving as an alternative to the traditional manual signing process. We designed and implemented a reliable, scalable and cost-effective biometric attendance system over GSM technology by which human interaction to update the data can be avoided. In the proposed system, the primary step is enrolling the user's fingerprints into memory. At the time of verification, if the user is already enrolled, the proposed system records the user's attendance; otherwise it reports an invalid user. The results show that the fingerprint biometric identifier is suitable for the employee attendance management system of the organization.
File security is a field that is less secure than the current technological environment demands. Existing systems use normal password-based encryption techniques, which can easily be bypassed: even when the password is unknown, the security can be defeated using techniques like phishing and brute force. Files can be protected by simple algorithms, but if the password becomes known to a third person, the security can still be breached. Our system promotes the concept of security for a file that belongs to, and can be accessed by, only authenticated people. The system also proposes a higher security ring that complements the security of password-protected encryption, and it acts as a monitoring system in addition to a security system.
A new method is proposed in this paper to discriminate faults from switching transients in a power system. Fast fault detection and fast fault clearing, before the occurrence of the first peak of the fault current, are essential for a stable power system; an example is an industrial system where bulk power is desired but high short-circuit currents cannot be accommodated. We propose a method based on a Phase-Locked Loop (PLL) to perform the task of discriminating faults from transients. The power system model is developed and simulated for fault performance analysis. MATLAB simulations have been performed, and the output of the PLL is found to be completely different for a fault compared with a switching transient. This difference is used in this work to discriminate between a fault and a switching transient, a capability that is useful for the safety and stability of the power system.
Cloud services are required at a large scale for users to share private data, such as health records and transactional data, for data analysis or mining, which raises privacy concerns. Recently, because of new social behavior, social transformation and the vast spread of social systems, many cloud applications have grown in accordance with the Big Data trend, making it a challenge for commonly used software tools to capture, manage and process such large-scale data within an acceptable elapsed time. In this paper, we implement a scalable two-phase top-down specialization approach to anonymize large-scale data sets of electronic health records using the MapReduce framework on the cloud. In both phases, we design a group of innovative MapReduce jobs to concretely accomplish the specialization computation in a highly scalable way.
In the proposed work, a feature extraction algorithm, the Mel Frequency Cepstrum Coefficient (MFCC), is used for a speaker identification system. The extracted speech features (MFCCs) of a speaker are quantized to a number of centroids using the vector quantization algorithm. The distance between the centroids of an individual speaker in the testing phase and the MFCCs of each speaker in the training phase is measured, and the speaker is identified according to the minimum distance. The code performs the identification satisfactorily and is developed in the MATLAB environment.
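A rough Python equivalent of the matching stage is sketched below; librosa for MFCC extraction and k-means for the vector-quantization codebook are stand-ins for the paper's MATLAB implementation, and the enrolled file paths are hypothetical.

```python
# Hedged sketch: MFCC + vector-quantization speaker identification (illustrative stand-in).
import numpy as np
import librosa
from sklearn.cluster import KMeans

def train_codebook(wav_path, n_centroids=16):
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T     # frames x 13
    return KMeans(n_clusters=n_centroids, n_init=10).fit(mfcc).cluster_centers_

def distortion(wav_path, codebook):
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T
    d = np.linalg.norm(mfcc[:, None, :] - codebook[None, :, :], axis=2)
    return d.min(axis=1).mean()          # average distance to nearest centroid

# Hypothetical usage with enrolled speakers:
# codebooks = {name: train_codebook(path) for name, path in enrolled.items()}
# identified = min(codebooks, key=lambda n: distortion("test.wav", codebooks[n]))
```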
In recent years many multimedia documents have been captured and stored with advances in computer technology, and hence the demand for recognizing and retrieving such documents has increased tremendously. In such an environment, the large volume of data and the variety of scripts make manual identification unworkable; the ability to automatically determine the script, and further the language, of a document would reduce the time and cost of document handling. The development of script identification for multilingual document images, and the subsequent retrieval of document images by matching against a query image, has therefore become an important task. Research in this field is relatively thin, and more work remains to be done, particularly for machine-printed and handwritten documents. Here we present a method developed to identify the script in machine-printed document images automatically, without manual intervention. The objective of this paper is to develop a procedure to identify different text portions of a document. In this work, eight features, namely top max row, bottom max row, top horizontal lines, vertical lines, bottom components, tick components, top holes and bottom holes, are used to identify the script type. Using these features, two methods, heuristic-based algorithms and a KNN approach, are proposed to identify the script type for Telugu, Hindi, English and Bangla scripts. A large number of approaches to recognize scripts are currently available in OCR systems; in this paper we identify the script of multilingual documents, considering different Indian languages such as English, Devanagari, Kannada, Gurumukhi and Bangla.
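The KNN stage could look roughly like the following sketch, which assumes the eight structural features have already been extracted into numeric vectors; the training values here are placeholders, not data from the paper.

```python
# Hedged sketch: KNN classification of script type from pre-extracted structural features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Rows: feature vectors (top max row, bottom max row, top horizontal lines, vertical
# lines, bottom components, tick components, top holes, bottom holes). Placeholder data.
X_train = np.random.default_rng(4).random((40, 8))
y_train = np.repeat(["Telugu", "Hindi", "English", "Bangla"], 10)

knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
block_features = np.random.default_rng(5).random((1, 8))
print("predicted script:", knn.predict(block_features)[0])
```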
The migration from wired to wireless networks has been a global trend in the past few decades. The mobility and scalability brought by wireless networks have made many applications possible, and the Mobile Ad hoc NETwork (MANET) is one of the most important and unique among them. In contrast to traditional network architecture, a MANET does not require a fixed network infrastructure; every single node works as both a transmitter and a receiver. With improvements in technology and cuts in hardware costs, we are witnessing a current trend of expanding MANETs into industrial applications. To adjust to this trend, we believe it is vital to address the potential security issues. In this paper, we propose and implement a new intrusion-detection system named Enhanced Adaptive ACKnowledgment (EAACK), specially designed for MANETs. Compared to contemporary approaches, EAACK demonstrates higher malicious-behavior-detection rates in certain circumstances while not greatly affecting network performance.
Electricity has become a basic need in this modern world, and we need to achieve a state where we get a 24x7 electric supply without harming or polluting nature. Several attempts have already been made to capture the excess energy of moving vehicles. In this paper, electricity is generated through a flip-plate mechanism; to obtain electricity through this mechanism, a prototype model was developed and studied, and the findings from this research work are discussed. The prototype uses a permanent-magnet D.C. generator producing 12 V D.C., which is stored in a 12-volt lead-acid battery. The electricity stored in the battery is used to power lights, fans, etc., and by increasing the capacity of the battery the power rating can be increased.
The objective of the study was to characterize the properties of a magnesium alloy welded by friction stir welding. The results led to a better understanding of the relationship between this process and the microstructure and anisotropic properties of alloy materials. Welding principally leads to a large reduction in grain size in welded zones due to the phenomenon of dynamic recrystallization. The most remarkable observation was that crystallographic textures appeared from a base metal without texture in two zones: the thermo-mechanically affected and stir welded zones. The latter zone has the peculiarity of possessing a marked texture with two components on the basal plane and the pyramidal plane. These characteristics disappeared in the thermo-mechanically affected zone (TMAZ), which had only one component following the basal plane. These modifications have been explained by the nature of the plastic deformation in these zones, which occurs at a moderate temperature in the TMAZ and high temperature in the SWZ.
In this era of information, the need for protection of data is more pronounced than ever. Secure communication is necessary to protect sensitive information in military and government institutions as well as for private individuals. Current encryption standards are used to encrypt and protect data not only during transmission but in storage as well. The Advanced Encryption Standard (AES) is a Federal Information Processing Standard (FIPS) categorized as a computer security standard. The AES algorithm is a block cipher that can encrypt and decrypt digital information using cryptographic keys of 128, 192, and 256 bits. A major advantage lies in the non-linearity of the key schedule, which eliminates the possibility of weak and semi-weak keys. This encryption algorithm is virtually crack-proof to date, but research has concluded that side-channel attacks can be a concern if the encryption and the attack are running on the same server. In this paper we introduce the concept of hybridizing the AES and DES standards, with a comparison against MAES using row-shift techniques. The MAES algorithm also uses cryptographic keys of 128, 192, and 256 bits to encrypt and decrypt data. The methodology uses a VHDL implementation on FPGA: the design was programmed in Xilinx 12.1 XST software and implemented on the Spartan-2 and Spartan-3 FPGA families.
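For reference, a standard AES-128 CBC round trip in Python (via PyCryptodome) is sketched below. This only illustrates the block-cipher interface the paper builds on; it is not the proposed MAES/hybrid design or the VHDL implementation.

```python
# Hedged sketch: standard AES-128 encryption/decryption in CBC mode with PyCryptodome.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(16)                    # 128-bit key (192/256-bit also allowed)
iv = get_random_bytes(16)
plaintext = b"sensitive payload to protect"

cipher = AES.new(key, AES.MODE_CBC, iv)
ciphertext = cipher.encrypt(pad(plaintext, AES.block_size))

decipher = AES.new(key, AES.MODE_CBC, iv)
recovered = unpad(decipher.decrypt(ciphertext), AES.block_size)
assert recovered == plaintext
print(ciphertext.hex())
```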
This project aims at implementing a distributed On-Board Diagnostics (OBD) system with an HMI that can be used for many purposes such as safety, vehicle management and maintenance. For this implementation, the various electronic subsystems in the vehicle, such as the antilock braking system (ABS), the engine control unit (ECU), fuel level, exhaust gas sensors, load cell and many other subsystems that are critical to either the condition of the vehicle or the safety of the users, are interfaced onto a single embedded network within the vehicle using the CAN bus for communication. The embedded network is in turn connected to an internal HMI that can gather, analyze and take respective action when required. Such a system, if interfaced with a communication option like GSM or Bluetooth, can interact with smartphones to automatically give contextual information to the users from time to time to ensure proper maintenance of the vehicle. The same channel can also be used to automatically convey information to service centers about problems with the vehicle and the maintenance that is needed.
This study deals with CFD simulation of a concentric tube heat exchanger, with and without a twisted-tape insert, used for heating air, using ANSYS FLUENT software with steel as the material. Nowadays, heat exchangers with twisted-tape inserts are widely applied for enhancing convective heat transfer in various industries such as thermal power plants, chemical processing plants, air conditioning equipment, refrigerators, and petrochemical, biomedical and food processing plants. In general, a twisted-tape insert introduces swirl into the bulk flow, which consequently disrupts the thermal boundary layer on the tube surface. The heat exchanger and insert were designed in SolidWorks, the fluid domain was formed in ANSYS Workbench, meshing was done with the default ANSYS mesh tool, and the solution was developed using ANSYS Fluent; the results are compared between the two designs for parallel flow.
This work aims to investigate the effect of slip angle on the stress distribution and fatigue life of a wheel rim. To improve the quality of aluminum wheels, a new method for evaluating their fatigue life is proposed in this paper. The ABAQUS software was used to build a static-load finite element model of the aluminum wheel to simulate the rotary fatigue test. The results of the rotary fatigue bench test showed that the baseline wheel failed the test, with crack initiation around the hub bolt hole area, which agreed with the simulation. The results indicate that the proposed method of integrating finite element analysis with the nominal stress method is an efficient way to predict the fatigue life of aluminum wheels.
Relation extraction is a tedious process in an environment where there is no information present for binding related data, and even if data is extracted in such an environment, it is difficult to prove that the extracted information is truthful. In order to overcome this drawback, a context-based approach is proposed which attempts to prove the truthfulness of the extracted data. The proposed technique strives to satisfy the relation completion (RC) criterion, which is the serious drawback of the existing approach. The approach identifies certain key terms from the user request and groups them as entities; these entities are then framed as individual incomplete instance pairs. Depending on the instance pairs, specific queries are generated based on the proposed context-based approach, and the query is supplied with test data for better accuracy of the retrieved relations. Hence relation completion is achieved, and the retrieved relations have better precision and recall in comparison with the existing approach. The results were obtained on real-time dynamic data gathered and grouped into data sets.
Mobile sensing applications play a role in collecting desired statistics from a large number of mobile users. In today's world, cell phones come with multiple sensors, which enables an aggregator to provide collective information. In existing systems, there is either bidirectional interchange between the mobile nodes and the aggregator or high processing overhead. Communication between the untrusted aggregator and the mobile nodes requires a high level of privacy; by generating secret keys that are not known to each other, privacy is achieved. Key generation is implemented using a sum aggregation protocol, and it scales to large systems.
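The core aggregation idea can be sketched with additive blinding: give each node a secret key such that all keys sum to zero modulo M, so the aggregator can recover only the total, never an individual reading. The key-distribution step and the paper's exact protocol are not shown; the values below are illustrative.

```python
# Hedged sketch: privacy-preserving sum aggregation via additive blinding keys.
import secrets

M = 2 ** 32                                   # modulus large enough for the sum
readings = [17, 42, 8, 23]                    # private values held by 4 nodes

# Keys: random for all but the last node, which cancels the rest (keys sum to 0 mod M)
keys = [secrets.randbelow(M) for _ in readings[:-1]]
keys.append((-sum(keys)) % M)

blinded = [(x + k) % M for x, k in zip(readings, keys)]   # what each node transmits
aggregate = sum(blinded) % M                              # all the aggregator learns
print("aggregate:", aggregate, "true sum:", sum(readings))
```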
A location-aware keyword query returns ranked objects that are near a query location and whose textual descriptions match the query keywords. The IR2-tree is used for finding nearest neighbors, with a compression scheme and an object aggregation method based on a spatial inverted index. In the spatial inverted index, collections of spatial points are stored together with collections of keywords; the compression scheme reduces the space cost, and object aggregation groups objects to retrieve the location- and text-dependent data. A priority-level search ranks objects based on the priority the user assigns to the keywords. This method has a few drawbacks, so to overcome them the objects are grouped using a group nearest neighbor (GNN) search algorithm. The GNN algorithm uses two techniques, location-based services and Euclidean distance: the location-based service retrieves the location, and the Euclidean distance formula computes the distance. Using these algorithms, the search is efficient and more accurate, and the optimized groups of objects are retrieved.
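A brute-force sketch of the group nearest neighbor step is shown below: among keyword-matching objects, pick the one minimizing the total Euclidean distance to all query locations. The IR2-tree / inverted-index pruning is omitted, and the toy objects are assumptions.

```python
# Hedged sketch: brute-force group nearest neighbor query with a keyword filter.
import numpy as np

objects = [
    {"pos": np.array([1.0, 2.0]), "keywords": {"cafe", "wifi"}},
    {"pos": np.array([4.0, 1.0]), "keywords": {"cafe"}},
    {"pos": np.array([3.0, 5.0]), "keywords": {"library", "wifi"}},
]
group = np.array([[0.0, 0.0], [2.0, 3.0], [5.0, 2.0]])    # query locations
required = {"cafe"}

def total_distance(obj):
    # Sum of Euclidean distances from the object to every query point
    return np.linalg.norm(group - obj["pos"], axis=1).sum()

candidates = [o for o in objects if required <= o["keywords"]]
best = min(candidates, key=total_distance)
print("GNN result:", best["pos"], "cost:", round(total_distance(best), 2))
```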
There has been exponential growth in mobile computing, and as the capabilities of smartphones have increased, so have malware threats. Enterprises and individual users have extensively adopted Android mobile devices, and users can download apps from unofficial marketplaces, which poses a security risk. We review Android malware detection methods and showcase an architecture for a malware detection system offered as an in-cloud service. The architecture uses automated static methods to decompile APK source code and then utilizes machine learning methods to classify risky, malicious and benign apps according to the permissions requested and the installation origin. We present our system and experimental results on a dataset of 300 malware and 500 benign applications; our trained model achieves a detection rate of 89% on evaluation.
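A compact sketch of the classification stage follows: permission strings extracted by static analysis are vectorized and fed to a classifier. The permission lists, labels and choice of random forest are illustrative assumptions, not the paper's dataset or exact model.

```python
# Hedged sketch: permission-based Android app classification with scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier

apps = ["INTERNET SEND_SMS READ_CONTACTS",          # permissions per apk (placeholders)
        "INTERNET ACCESS_FINE_LOCATION",
        "INTERNET SEND_SMS RECEIVE_BOOT_COMPLETED",
        "INTERNET CAMERA"]
labels = ["malicious", "benign", "malicious", "benign"]

vec = CountVectorizer(token_pattern=r"\S+")          # one feature per permission name
X = vec.fit_transform(apps)
clf = RandomForestClassifier(random_state=0).fit(X, labels)

new_app = vec.transform(["INTERNET SEND_SMS"])
print("prediction:", clf.predict(new_app)[0])
```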
This paper presents a fault-tolerant control system design, which plays a vital role in the system design of any application. Fault-tolerant control systems are used in time-dependent applications; if fault tolerance or redundancy is not present, catastrophic failures can occur. A redundant control system provides automatic monitoring of primary and secondary CPUs, and redundancy is a common approach to improve the reliability and availability of a system. This paper describes the use of a hot-standby redundancy approach to observe the switch-over between CPUs using the open-source RTOS Xenomai.
An object that appears frequently in a set or sequence of images is defined as a thematic object, and in a collection of image sequences the thematic object serves as a key object. Detecting common objects that appear frequently in a set of images is an intriguing problem, especially in the absence of prior knowledge about the common pattern. A system is proposed to overcome these problems. It consists of pre-processing, which enhances the data images prior to computational processing, and the SURF algorithm, which is used for detecting, extracting and matching feature points with respect to geometric transformations. The feature points obtained from the different images are compared to determine the common features among the data images. This solution helps to locate the thematic objects much more easily and in less time.
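A short OpenCV sketch of the detect-extract-match pipeline is given below. Because SURF is non-free in many OpenCV builds, ORB is used here as a stand-in for the feature detector, and the image paths are hypothetical.

```python
# Hedged sketch: keypoint detection, description and matching between two frames.
import cv2

img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input images
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)                     # stand-in for SURF
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Feature points shared across many images suggest the thematic object region.
print(f"{len(matches)} candidate correspondences between the two images")
```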
This project proposes a multipurpose robot for surveillance and rescue operations in remote locations and inside buildings. It is mainly designed for military use and can be used by military personnel combating terrorism, as it requires no human presence to keep a complete lookout over any area along the borders and thus reduces the loss of human lives. The movement of the robot is controlled with the help of radio frequency transceivers. A mounted ultrasonic sensor measures the distance the robot travels and thus gives its exact distance from the place of control. The robot is fitted with various sensors such as a PIR sensor, an obstacle sensor and a gas sensor. The PIR sensor, which detects thermal radiation, senses any human presence in a particular area, even behind walls; the obstacle sensor detects obstacles of any form, such as walls or other objects, and brings the robot to a halt; and the gas sensor detects the presence of any poisonous gases. The system also has a video camera mounted on it so that the controller can have a clear view of the area where the robot is deployed, which helps in responding with a counterattack in case of any terrorist presence. The detected information is sent to the control area through a GSM module fitted to the robot. The exact location of the robot can be sensed with the mounted GPS module and shared with the control personnel, making it easy to send in rescue measures in an emergency so that anyone lying injured can be rescued. The robot can also be used during natural calamities, where people trapped inside collapsed buildings can be found with the help of the attached sensors. As a further development, the robot can be fitted with a mounted weapon which can be triggered from a remote location.
Cloud computing revolutionizes IT and business by offering computing as a utility over the internet. The evolution from the internet to a cloud computing platform, the emerging development paradigms and technologies, and how these will change the way enterprise applications are architected for cloud deployment all play an important role, and these enterprise technologies are critical to cloud computing. New cloud analytics and business intelligence (BI) services can help businesses better manage big data and cloud applications. Analysing and gathering business intelligence has never been easy, but today BI is complicated further by overwhelming data loads and the number of data entry and access points. New cloud analytics advancements may offer BI relief and even profit-increasing predictability for enterprises. These new cloud analytics applications can deliver functional capabilities that can be deployed easily, quickly and economically, producing tangible and measurable benefits far more rapidly than in the past. Many organizations have recognized that effectively analysing their business needs and providing the data required to make the right business decisions depends on a combination of internally generated data and externally available data.
The ability of gas turbine blades to withstand elongation is a major consideration in their design because they are subjected to high tangential, axial and centrifugal forces during working conditions. The first-stage rotor blade of a two-stage gas turbine has been analyzed structurally and thermally using ANSYS 12, a powerful finite element software package. In the process of obtaining the thermal stresses, the temperature distribution in the rotor blade was evaluated using this software. This project shows how the program makes effective use of the ANSYS pre-processor to analyse complex turbine blade geometries and apply boundary conditions to examine the steady-state thermal and structural performance of the blade for N-155, Hastelloy X and Inconel 625 materials.
This study focuses on an advanced mobile security system to provide rapid, highly secure and human-friendly M-commerce transactions. An M-commerce transaction works as a multistep process involving user authentication, merchant authentication, message authentication, and secure transmission of payment details, and M-commerce must provide availability, reliability and security in the transaction phases. In the proposed method, higher security, efficiency and accuracy are provided by using Minutiae Maps (MM) at the fingerprint feature extraction level. The user sends the finger image to the biometric server using the discrete wavelet transform and watermarking techniques, so the finger image is hidden. The user's finger image is checked and compared on the biometric server using the MM method; after the comparison, the fingerprint match level is determined, and if it lies within the threshold level (60-99%) the transaction is completed successfully.
Automobiles invariably comprise digital control systems, and in order to avoid vehicle theft, minor low-cost changes can be made that protect the vehicle from burglary. This paper presents two electronic locking systems. The first system uses a Radio Frequency (RF) module to unlock the door from a remote control: when the user presses the remote key to unlock the car door, the signal transmitter in the remote transmits a signal to the RF receiver at the door of the car. The signal sent by the RF transmitter to the RF receiver is a unique code, so the door cannot be locked or unlocked without this remote; hence the system provides authentication at the door. For further authentication, the user has to enter a unique password to establish whether the owner or an intruder is accessing the car. The electrical ignition of the vehicle can be unlocked only if the correct password is entered, and a message is sent to the owner. If the user fails to enter the correct password in three trials, a Short Message Service (SMS) message with the vehicle number is sent to the owner and the police, the location is tracked using a Global Positioning System (GPS) module through a Global System for Mobile communication (GSM) module to report the unauthorized usage, and a buzzer is triggered. If the burglar takes away the car, the owner can control the car through GSM. All these subsystems communicate over a Controller Area Network (CAN) bus interfaced with a Nuvoton microcontroller.
Cloud services allow users to store and share their data with relative ease. Unfortunately, these data are stored on untrusted servers, so maintaining the integrity of the shared data is doubtful, and hardware and software failures can also affect data integrity. Several mechanisms have been defined to allow a public auditor to verify the integrity of data, but these existing mechanisms reveal the identity of data owners. By using a ring signature mechanism, the public auditor can audit the shared data without compromising user identity. Athos (AuTHenticated Outsourced Storage) is a platform-independent architecture that allows authentication of outsourced data in untrusted storage. In this paper we utilize both the ring signature and the Athos architecture for efficient public auditing and for ensuring data freshness (the latest version of the stored data) of shared data.
An induction motor is an asynchronous AC (alternating current) motor; the least expensive and most widely used type is the squirrel cage motor. Interest in sensorless drives for induction motors (IM) has grown significantly over the past few years due to advantages such as mechanical robustness, simple construction, and low maintenance. Applications include pumps and fans, paper and textile mills, subway and locomotive propulsion, electric and hybrid vehicles, machine tools and robotics, home appliances, heat pumps and air conditioners, rolling mills, wind generation systems, etc. Induction motors have therefore been used increasingly in industrial variable-speed drive systems with the development of vector control technology, which requires a speed sensor such as a shaft encoder for speed control. This paper presents a novel design of a Takagi-Sugeno fuzzy logic control scheme for controlling some of the parameters of the induction motor, such as speed, torque, flux and voltage. Induction motors are characterized by highly non-linear, complex and time-varying dynamics and by the inaccessibility of some states and outputs for measurement, and hence their control can be considered a challenging engineering problem. The development of advanced control techniques has only partially solved induction motor speed control problems because these techniques are sensitive to drive parameter variations. Fuzzy-logic-based controllers are considered potential candidates for such an application. Further, the Takagi-Sugeno control strategy, coupled with a rule-based approach in a fuzzy system, yields excellent results for the induction motor compared with other methods, since it becomes a hybrid and integrated approach. Such a mixed implementation leads to a more effective control design with improved system performance, cost-effectiveness, efficiency, dynamism and reliability. The closed-loop speed control of the induction motor using this technique thus provides a reasonable degree of accuracy, as can be observed from the results depicted at the end. The simulation results presented in this paper show the effectiveness of the developed method, which has a wide range of advantages in the industrial sector and can be converted into a real-time application using interfacing cards.
In wireless sensor networks (WSNs), the multi-hop routing procedure offers some protection against identity deception through routing information, yet many harmful attacks, such as sinkhole attacks, wormhole attacks and Sybil attacks, misdirect the routing information. Traditional cryptographic techniques are used to enhance trust-aware routing in WSNs, but they do not detect such severe problems. To protect WSNs against misdirection of information during the multi-hop routing procedure, we developed and implemented a protocol named TARF, a trust-aware routing framework for WSNs. TARF provides a trustworthy and energy-efficient path without any need for tight time synchronization or geographic information, and it maintains an effective path during routing despite harmful attacks. TARF's ability to recover information that would otherwise be misdirected during the routing process is demonstrated through both extensive simulation and empirical evaluation with large-scale WSNs. We have implemented a ready-to-use TinyOS module of TARF with low overhead, which can be incorporated into existing routing protocols with minimal effort. Based on TARF, a mobile target detection application using an anti-detection mechanism is also demonstrated.
Lung diseases showing a honeycomb texture are diagnosed based on the affected part of the lung tissue in CT images, and texture segmentation is an essential part of Computer-Aided Diagnosis (CAD) systems. In this paper, the abnormality in lung tissue shown in a CT image is segmented by a wavelet transform method, which uses the honeycomb texture in the lung region for segmentation. High-resolution areas are extracted from the vertical sub-image of the lung region, and the regions that have low pixel intensities are placed in a separate region that is grown for segmentation.
Steganography is the technique of hiding secret data into a cover image, audio, text using a key which can be later recovered using the same key. The owner of the sensitive cover image cannot hide the details of the image from the data hider. A Reversible Data Hiding (RDH) technique on a compressed image is proposed along with encryption of the cover image. The cover image is first compressed. The compressed cover image is encrypted with using encryption key.  The contents of the image are not disclosed to anyone without the encryption key. The data hider embeds the secret data into the encrypted cover image. At the receiver end the embedded data is extracted from the cover image using the embedding key. The cover image can be decrypted using the encryption key. The cover image can be also decrypted without extraction of the embedded data using the encryption key to produce an approximate image.
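The Python sketch below is a much-simplified, hypothetical illustration of the separable encrypt-then-embed idea: stream-cipher-style XOR encryption followed by LSB replacement at key-selected positions. It is not the paper's reversible scheme (a true RDH method vacates room, e.g. by compression, so the cover can be restored exactly); all keys and values are toy assumptions.

import numpy as np

rng_img = np.random.default_rng(7)        # toy cover-image content
rng_enc = np.random.default_rng(1234)     # encryption keystream (assumed key)
rng_embed = np.random.default_rng(5678)   # embedding-key position selector (assumed key)

cover = rng_img.integers(0, 256, size=(8, 8), dtype=np.uint8)
keystream = rng_enc.integers(0, 256, size=cover.shape, dtype=np.uint8)
encrypted = cover ^ keystream             # stream-cipher-style encryption

secret_bits = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)
positions = rng_embed.choice(cover.size, size=secret_bits.size, replace=False)

marked = encrypted.ravel().copy()
marked[positions] = (marked[positions] & 0xFE) | secret_bits   # replace LSBs
marked = marked.reshape(cover.shape)

# A receiver holding the embedding key extracts the payload...
extracted = marked.ravel()[positions] & 1
assert np.array_equal(extracted, secret_bits)

# ...and a receiver holding only the encryption key gets an approximate image
# (exact except, at most, in the LSBs of the marked positions).
approx = marked ^ keystream
print("max pixel error:", int(np.abs(approx.astype(int) - cover.astype(int)).max()))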
Research Interests:
Cloud computing is an emerging field of computer science that addresses the business need for quick, elastic, high-performance services. Successful development of the cloud computing paradigm necessitates accurate performance evaluation of cloud data centres. Because exact modelling of cloud centres is not feasible, given their nature and the diversity of user requests, this work describes a novel approximate analytical model for performance evaluation of cloud server farms and solves it to obtain accurate estimates of the complete probability distribution of the request response time and other important performance factors. The model allows cloud users to relate the number of servers and the input buffer size to performance factors such as the mean number of tasks in the system, the blocking probability, and the probability that a task receives immediate service. Using these performance indicators, both under- and over-provisioning of resources can be avoided, so that reliability is maintained and the performance of the queueing model improves.
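As an illustration of the kind of queueing calculation such a model builds on, the Python sketch below evaluates the Erlang-C waiting probability and mean response time of a simple M/M/c system; the arrival rate, service rate and server count are assumed example values, not figures from the paper.

from math import factorial

def mmc_metrics(lam, mu, c):
    """Return (P_wait, mean response time) for an M/M/c queue."""
    rho = lam / (c * mu)
    assert rho < 1, "system must be stable"
    a = lam / mu
    # Erlang-C probability that an arriving task must wait.
    num = (a ** c / factorial(c)) * (1 / (1 - rho))
    den = sum(a ** k / factorial(k) for k in range(c)) + num
    p_wait = num / den
    wq = p_wait / (c * mu - lam)       # mean waiting time in the queue
    return p_wait, wq + 1 / mu         # response time = waiting + service

if __name__ == "__main__":
    p_wait, resp = mmc_metrics(lam=180.0, mu=10.0, c=20)   # assumed workload
    print(f"P(wait) = {p_wait:.3f}, mean response time = {resp:.3f} s")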
Research Interests:
When fuel prices rise, people's thoughts naturally turn to alternative fuel sources. Stringent emissions standards enforced in many countries, such as EURO 5, the ACEA agreement, and EURO 6, are pushing vehicle manufacturers to comply. Because of these norms, research on internal combustion engines in general, and diesel engines in particular, has gained great importance, along with growing demands on fuel consumption. High demands are therefore placed on large gas engines in the areas of performance, fuel consumption, and emissions. One way to reach this goal is to use new combustion concepts such as Homogeneous Charge Compression Ignition (HCCI). HCCI engines promise high thermal efficiency combined with low levels of nitric oxide and particulate matter emissions. However, because there is no immediate means of triggering ignition, stable operation over a wide range of conditions and transient control have proven very challenging and have so far prevented commercialization, even as new technical avenues such as micro-hybridization and biofuels open up. Most alternative fuel conversions involve reconfiguring a gasoline or diesel vehicle or engine to operate on natural gas, propane, alcohols, or a blend of conventional and alternative fuels. Clean alternative fuels widen fuel supply choices and can help consumers address concerns about fuel costs, energy security, and emissions. HCCI engines can operate on gasoline, diesel fuel, and most alternative fuels. HCCI combustion is achieved by controlling the temperature, pressure, and composition of the fuel-air mixture so that it ignites spontaneously in the engine. This control problem is fundamentally more challenging than using a spark plug or fuel injector to determine ignition timing, as in SI and DI engines respectively. The purpose of this study is to summarize the effect of alternative fuels on the HCCI engine combustion process.
Research Interests:
This paper presents applicable finite element analysis (FEA) methods for the study of pressure vessels. Discontinuity stresses in a cylindrical pressure vessel connected to flat and hemispherical heads under internal pressure are determined using the principles specified in ASME Section VIII, Division 1. In the present study, 3D, axisymmetric, and symmetric models of pressure vessels are analyzed using ANSYS. The three models and element types used are a 3D model with solid elements, an axisymmetric model with plane elements, and a symmetric model with shell elements. Theoretical values obtained by discontinuity analysis are checked against the ANSYS results, and the best methods for pressure vessel analysis are stated. Contact elements were tested to determine their usefulness in modelling the interaction between the cylinder walls and the heads. When modelled correctly, contact elements proved useful, but the analyst must also be able to interpret the results properly. Issues such as local stress risers, unrealistic displacements, and knowing how to use such data become extremely important in this kind of analysis. This highlights the key to proper use of finite element analysis: the analyst should be able to approximate the solution using classical methodology (hand calculations) in order to verify the FEA solution.
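A minimal example of the classical hand calculation recommended above for sanity-checking FEA output: thin-wall membrane stresses in the cylindrical shell and hemispherical head. The pressure and geometry are assumed example values.

def cylinder_stresses(p, r, t):
    """Thin-wall cylinder under internal pressure p, mean radius r, thickness t."""
    hoop = p * r / t                 # circumferential (hoop) stress
    longitudinal = p * r / (2 * t)   # axial stress
    return hoop, longitudinal

def hemispherical_head_stress(p, r, t):
    return p * r / (2 * t)           # equal biaxial membrane stress in the head

if __name__ == "__main__":
    p, r, t = 2.0, 500.0, 10.0       # MPa, mm, mm (assumed)
    sh, sl = cylinder_stresses(p, r, t)
    print(f"cylinder: hoop = {sh:.1f} MPa, longitudinal = {sl:.1f} MPa")
    print(f"head: membrane = {hemispherical_head_stress(p, r, t):.1f} MPa")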
Research Interests:
Implementation of a DMA controller on the AMBA bus with two masters is described in this paper. The DMA controller is connected to the AMBA AHB bus. The Direct Memory Access (DMA) controller is a hardware feature that enables movement of blocks of data from peripheral to memory, memory to peripheral, memory to memory, and peripheral to peripheral. This data movement reduces the load on the processor, and a DMA controller can also save power by allowing the CPU to enter a low-power state while the data are moved. The architecture of the DMA controller for the AMBA bus consists of a DMA system, a host, and an arbiter; the arbiter responds to requests from the DMA system and the host. Three buses are defined within the AMBA specification: the Advanced High-performance Bus (AHB), the Advanced System Bus (ASB), and the Advanced Peripheral Bus (APB). The proposed architecture grants bus access to one master at a time for improved speed and performance.
Research Interests:
Protected data transmission has become a great challenge, especially in the education system. Question paper transmission during examinations requires a quick, highly protected, and legally sound system. In this paper, a secure and fast method of direct end-to-end data transmission is proposed. The system acquires data from a storage device and transmits it securely to another storage device over a wireless network. The microcontroller-based secure data transmission system communicates through a GSM/GPRS modem. It supports question paper broadcast just before the commencement of an examination, so that the approved person at the head office or university can send the paper directly to the educational institution.
Research Interests:


Network-based technology and cloud computing are becoming more popular day by day as many enterprise applications and data move onto cloud or network-based platforms. Because of their distributed and easily accessible nature, these services are provided over the Internet using well-known networking protocols, standards, and formats, under the supervision of various management tools and programming languages. Existing bugs and vulnerabilities in the underlying technologies and legacy protocols open doors for intrusion, so attacks such as Denial of Service (DDoS), buffer overflows, sniffer attacks, and application-layer attacks have become common. Recent security incidents and analyses show that manual response to such attacks is no longer feasible, as Internet and network applications face numerous attacks every day. This paper discusses intrusion prevention and the IDS tools employed to detect such attacks, reviews some open source tools for intrusion prevention and detection, and explains how these tools can be used in practice. Snort is an open source Network Intrusion Detection System (NIDS) available free of cost. A NIDS is a type of Intrusion Detection System (IDS) used to scan data flowing on the network. There are also host-based intrusion detection systems, which are installed on a particular host and detect attacks targeted at that host only. Although intrusion detection methods are still relatively new, Snort is ranked among the top-quality systems available today.
Research Interests:
We live in a world where it is critical for a physician to predict the disease a person is suffering from and to suggest medicine based on the diagnosis. Evidence based medicine (EBM) is the judicious use of the available evidence in making decisions to improve the care given to individual patients. Every year, a significant number of studies that can potentially serve as evidence are reported. With this burgeoning amount of health records from tens of thousands of patients, researchers and medics are challenged not only to leverage the full potential of EBM but also to use it to treat their patients for the best results. We develop a system that analyses a patient's health through a series of questions and determines the disease based on previously gathered evidence. We also aim to provide the user with the best known medicine for that particular disease, likewise based on EBM.
Research Interests:
Mobile ad hoc networks (MANETs) are composed of roaming, battery-powered mobile nodes forming a temporary network. This paper proposes a set of performance metrics for evaluating energy efficiency in MANETs and studies the energy consumption of a MANET from several aspects: at the MAC and network layers, in different operating modes (idle, transmit, and receive), and with the AODV routing protocol. Parameters such as beacon interval length, beacon window, multi-hop transmission indication map (MTIM) window, mobility speed, and node density are varied. Extensive simulations, implemented in C, were run for various scenarios, and the results and analysis reveal some important findings. A substantial amount of energy is consumed at the MAC layer, especially in idle mode, and IEEE 802.11 shows a markedly different energy-efficiency pattern when combined with the AODV routing protocol. Parameter values in the P-MANET protocol are also varied, and synchronization and NAV are used, with comparisons between different values of beacon interval, beacon window, MTIM window, node density, and mobility speed.
Research Interests:
A major problem for many companies today is how to mine useful knowledge from their data sources while also retrieving datasets quickly, so this remains a rich area for researchers developing new techniques. Many researchers are trying to find and develop new methods for fast retrieval of itemsets or for generating new datasets. Data mining offers many techniques for extracting useful knowledge from data sources, such as classification, association, generalization, regression, and clustering. Among them, this survey focuses on cyclic association rule mining. Cyclic association rules are a category of temporal association rules, which in turn are a category of association rules. Cyclic association rules were introduced in order to discover rules from items characterized by regular variation over time, that is, items that are cyclic in nature. There are many algorithms for generating such rules that can yield significant performance benefits compared with other algorithms and can also provide predictions that support decision making. This survey paper therefore presents a comparison of the different algorithms introduced by various authors over the past years.
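The Python sketch below illustrates only the core idea behind cyclic association rules, namely keeping itemsets that are frequent in every time slot of a cycle; the transactions, slot structure and support threshold are toy assumptions, not any particular algorithm from the surveyed literature.

from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Return 1- and 2-itemsets whose relative support meets the threshold."""
    counts = Counter()
    for t in transactions:
        for size in (1, 2):
            for combo in combinations(sorted(t), size):
                counts[combo] += 1
    n = len(transactions)
    return {iset for iset, c in counts.items() if c / n >= min_support}

# One transaction list per hourly time slot (toy data).
slots = [
    [{"coffee", "bread"}, {"coffee", "milk"}, {"coffee", "bread"}],
    [{"coffee", "bread"}, {"bread", "milk"}, {"coffee", "bread"}],
]
per_slot = [frequent_itemsets(s, min_support=0.6) for s in slots]
cyclic = set.intersection(*per_slot)   # itemsets frequent in every slot of the cycle
print("cyclic frequent itemsets:", cyclic)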
Research Interests:
Characterization and prediction of events are essential in many applications, such as forecasting economic growth and financial decision making. This can be done by processing temporal patterns, since observed event data sequences are often closely related to certain time-ordered structures. Among existing methods, the reconstructed phase space works well, but only for univariate data sequences. We therefore propose a multivariate reconstructed phase space that uses supervised clustering for the characterization and prediction of events from dynamic data sequences. An optimization method is finally applied to estimate the parameters of the classifier, which defines an optimal decision boundary in the multivariate RPS.
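A small Python sketch of the embedding step: a multivariate reconstructed phase space built by concatenating time-delayed copies of every channel. The delay, embedding dimension and test signal are assumed for illustration; the supervised clustering and classifier optimization stages are not shown.

import numpy as np

def multivariate_rps(series, dim=3, delay=2):
    """series: (T, channels) array -> (N, channels*dim) embedded points."""
    T, ch = series.shape
    n = T - (dim - 1) * delay
    points = np.empty((n, ch * dim))
    for i in range(n):
        window = [series[i + k * delay] for k in range(dim)]
        points[i] = np.concatenate(window)
    return points

if __name__ == "__main__":
    t = np.linspace(0, 8 * np.pi, 200)
    data = np.column_stack([np.sin(t), np.cos(2 * t)])   # toy 2-channel signal
    emb = multivariate_rps(data, dim=3, delay=2)
    print("embedded shape:", emb.shape)                  # (196, 6)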
Research Interests:
Websites provide a great deal of information to users. A website is a collection of web pages containing much information, and finding or retrieving a particular page or piece of information among them is a difficult task. To make this task easier, different web page classification methods can be used to identify web pages and classify them based on their content. Web page classification is an area of web mining, which integrates information gathered by traditional data mining methodologies and techniques with information gathered over the WWW. Web page classification retrieves web pages based on different features or parameters, and it can be performed in many ways, such as using the content of the web page, using the structure of the web page, or using clustering methods. This paper focuses on web page classification using inverse document frequency.
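As a minimal illustration of the inverse-document-frequency weighting mentioned above, the Python sketch below computes TF-IDF scores for a few toy pages; the pages and terms are invented examples.

import math
from collections import Counter

pages = {
    "p1": "cricket score match cricket",
    "p2": "python code compiler code",
    "p3": "match schedule cricket",
}

def tf_idf(pages):
    docs = {name: Counter(text.split()) for name, text in pages.items()}
    n = len(docs)
    # Document frequency: number of pages containing each term.
    df = Counter(term for tf in docs.values() for term in tf)
    weights = {}
    for name, tf in docs.items():
        total = sum(tf.values())
        weights[name] = {t: (c / total) * math.log(n / df[t]) for t, c in tf.items()}
    return weights

for page, w in tf_idf(pages).items():
    top = max(w, key=w.get)
    print(f"{page}: most discriminative term = '{top}' ({w[top]:.3f})")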
Research Interests:
The present scenario shows a high environmental impact on water resources. Water is an essential resource for living with nature, yet contamination of groundwater resources is rising in many areas. In this paper, an attempt is made to analyze the physical and chemical parameters of groundwater in the city of Tiruppur. Tiruppur is famous for its textile industry; there is high water scarcity in the city owing to its various demands, and the available water is highly polluted, with growing industrial activity raising effluent contamination. Four samples were collected and analyzed for pH, temperature, turbidity, electrical conductivity, total hardness, calcium hardness, magnesium hardness, sulphates, chlorides, BOD, COD, DO, TS, TVS, and TFS. The results were compared with the IS 10500:2012 standards and showed that sulphates and Biochemical Oxygen Demand (BOD) exceeded the permissible limits. The groundwater should therefore be treated before it is used for drinking purposes.
Research Interests:
Image segmentation is the process of grouping image pixels into regions corresponding to particular characteristics. Many methods have been developed for image segmentation. This paper presents a graph-based method that partitions an image according to a global criterion: it implements the normalized cut algorithm and compares it with graph-based image segmentation.
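The Python sketch below shows a normalized-cut-style segmentation using scikit-learn's spectral clustering on a pixel affinity graph of a synthetic image; the image, affinity scaling and cluster count are assumptions made only for the example.

import numpy as np
from sklearn.feature_extraction import image
from sklearn.cluster import spectral_clustering

# Synthetic image: a bright disc on a dark background, plus noise.
x, y = np.indices((40, 40))
img = ((x - 20) ** 2 + (y - 20) ** 2 < 100).astype(float)
img += 0.2 * np.random.default_rng(0).standard_normal(img.shape)

# Build a pixel adjacency graph weighted by intensity gradients,
# then turn gradients into similarities for the spectral (normalized-cut) step.
graph = image.img_to_graph(img)
graph.data = np.exp(-graph.data / graph.data.std())

labels = spectral_clustering(graph, n_clusters=2, eigen_solver="arpack",
                             random_state=0)
segmentation = labels.reshape(img.shape)
print("pixels per segment:", np.bincount(labels))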
Research Interests:
Reinforced concrete structures may need to be upgraded for various reasons, such as structural inadequacy of materials (concrete and steel), deterioration of materials, initial deficiencies in design or construction, poor or improper maintenance, unexpected loads not considered in the original design, and revisions in design codes and practices. The use of FRP to repair and rehabilitate damaged steel and concrete structures has become increasingly attractive owing to the well-known mechanical properties of this material. The strength, stiffness, and ductility of selected elements or of a whole building can be altered by retrofitting. Ferrocement jacketing is a retrofitting technique that is economical, easy to apply with relatively unskilled labour, lightweight, and high in tensile strength. In this study of shear-deficient RC beam-column joints, deflections and loads were recorded after stressing the specimens to the ultimate load and to load levels below the ultimate. The specimens were then retrofitted with ferrocement jackets and tested again, and the resulting strength and other parameters were compared with those of the control specimens. An increase in load-carrying capacity was observed as a result of the retrofitting, and the yield load was also observed to increase.
Research Interests:
The aim is to develop a robotic vehicle with a wireless camera, controlled through a brain-robot interface, for monitoring purposes. The robot with its camera can wirelessly transmit real-time video with night vision capability, and such a robot can be helpful for surveillance in war fields. A major challenge in two-class Brain Computer Interface (BCI) systems is the low bandwidth of the communication channel, especially when communicating with and controlling assistive devices that require multiple motion commands such as forward, left, right, backward, and start/stop. BCIs are systems that bypass conventional channels of communication (i.e., muscles and thoughts) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. A wireless camera is mounted on the robot body for surveillance, even in complete darkness, using infrared lighting.
Research Interests:
This paper shows that the LwIP stack offers good performance, comparable with the TCP/IP stack implementations in various operating systems. The Internet of Things (IoT) allows interconnection between the virtual world of computers and the physical world of our everyday lives. Recent advances in sensor networks and the standardization of Internet protocols for constrained environments allow seamless integration of low-power sensor nodes into the Internet with Web-based services. SensorML is one information-model standard used to describe sensors and their processes. The proposed design implements the LwIP Ethernet protocol stack on a hardware platform of two nodes, an ARM Cortex-M3 and a Cortex-A8. Sensor data are transmitted over Ethernet to the host device, which represents the client's subscribed application data based on SensorML with an Apache Tomcat server.
Research Interests:
Due to the increasing availability of digital data, the number of text documents continues to grow, hence the need for text mining. These digital documents comprise normal body text as well as side information. The side information comes in different formats, for example hyperlinks, and may contain useful information for mining. It is of utmost importance that the value of the side information be ascertained before it is included in the data selected for text mining, as it may otherwise have an adverse impact on the quality of the mined text. A principled way to perform the mining process is therefore required in order to maximize the benefit of side information. In this paper, we use the Naive Bayes model to create an effective text mining approach.
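A short Python sketch of the Naive Bayes idea with side information treated as extra features: hypothetical hyperlink/domain tokens are appended to the body text before vectorization. The documents, labels and the choice of link tokens as side information are toy assumptions.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

body = [
    "stock prices fell sharply amid market fears",
    "the team won the championship final last night",
    "central bank raises interest rates again",
    "star striker injured before the derby match",
]
side = ["LINK:finance.example.com", "LINK:sports.example.com",
        "LINK:finance.example.com", "LINK:sports.example.com"]
labels = ["finance", "sports", "finance", "sports"]

# Append the side-information tokens to the body text so they become features.
docs = [f"{b} {s}" for b, s in zip(body, side)]
clf = make_pipeline(CountVectorizer(token_pattern=r"[^\s]+"), MultinomialNB())
clf.fit(docs, labels)

test = "market rally lifts bank shares LINK:finance.example.com"
print(clf.predict([test])[0])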
Research Interests:
This research studied the workability and strength properties of pervious concrete with various percentages of rice husk ash (RHA) blended with cement. RHA was added as a partial replacement of cement to increase the strength of the pervious concrete. RHA is an agricultural waste: about 20% of the rice grain is husk, and when this husk is combusted it yields RHA that can be blended partially with cement. RHA contains silica, which is a good admixture for increasing the strength of concrete. Using RHA reduces environmental impact, is eco-friendly, and is essentially free of cost; since India is the largest producer of rice, it is available in plenty. In this study the components used for the pervious concrete are OPC 53 grade cement, RHA, 12.5 mm and 20 mm coarse aggregate, and water. Since pervious concrete has no standard mix ratio, the mix was obtained by trial and error; mix proportions of 1:1, 1:2, 1:3, 1:4, 1:5, 1:6, 1:7, and 1:8 were considered, and the best strength was obtained with a 1:4 ratio of cement to coarse aggregate and a water-cement ratio of 0.3. The strengths of pervious concrete with RHA replacements of 20%, 25%, and 30% and coarse aggregate sizes of 12.5 mm and 20 mm are compared. The pervious concrete was cast in 150 x 150 x 150 mm cube moulds and 170 x 150 x 150 mm beam moulds to find the compressive and flexural strengths. When the RHA content is increased beyond 30%, the compressive and flexural strengths reduce compared with conventional pervious concrete.
Research Interests:
Public private partnerships (PPPs) play an important role in bringing private sector competition to public monopolies in infrastructure development and service provision, and in merging the resources of the public and private sectors to serve public needs better. The structure of PPP contracts has a vital influence on the economic and policy success of privately financed roads throughout their lifecycle. Starting from the fundamentals of PPP and the public-policy distinction between the public interest and public objectives, several approaches for establishing the key contract parameters of concession length and risk mitigation are explored. Patterns of PPP contract strategies corresponding to common policy objectives, such as managing congestion, are identified through evaluation of five national highway project agreements. The decision-making tools resulting from this work are illustrated through application to current PPP concession agreements, covering (1) the PPP process, (2) the framework of PPP (BOT) contracts, (3) comparisons, and (4) recommendations and suggestions; the paper concludes with thoughts on the appropriate role of PPPs in infrastructure delivery.
Research Interests:
In wireless communication, multipath propagation results in severe fading effects. Equalizing such long fading channels with traditional single-carrier time-domain equalization becomes infeasible because of its large computational complexity. Single-carrier frequency-domain equalization (SC-FDE) offers low complexity, a low peak-to-average power ratio, and less sensitivity to carrier frequency offset compared with orthogonal frequency division multiplexing (OFDM). In this paper we propose SC-FDE combined with space-time block coding (an Alamouti-like scheme); a linear zero-forcing equalizer achieves significant diversity gain at low computational complexity over frequency-selective fading channels. We also show that the diversity depends not only on the antenna configuration and channel memory but also on the data block length and data transmission rate.
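The Python sketch below shows the single-antenna core of SC-FDE with a zero-forcing equalizer (cyclic prefix, FFT, per-tone division, IFFT); the Alamouti STBC extension and the noise model are omitted, and the channel taps and block length are assumed toy values.

import numpy as np

rng = np.random.default_rng(0)
N = 64                                      # data block length (assumed)
h = np.array([1.0, 0.5, 0.2])               # frequency-selective channel taps (assumed)
cp = len(h) - 1                             # cyclic prefix length

bits = rng.integers(0, 2, 2 * N)
symbols = (2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)   # QPSK mapping
symbols /= np.sqrt(2)

tx = np.concatenate([symbols[-cp:], symbols])     # add cyclic prefix
rx = np.convolve(tx, h)[:len(tx)]                  # pass through the channel
rx = rx[cp:cp + N]                                 # strip the prefix

H = np.fft.fft(h, N)                               # channel frequency response
eq = np.fft.ifft(np.fft.fft(rx) / H)               # FFT -> zero-forcing -> IFFT

recovered = np.empty(2 * N, dtype=int)
recovered[0::2] = (np.real(eq) > 0).astype(int)
recovered[1::2] = (np.imag(eq) > 0).astype(int)
print("bit errors:", int(np.sum(recovered != bits)))   # 0 in the noiseless case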
Research Interests:
Gaming involves heavy concentration, presence of mind, and quick responses from gamers. Many users interact with gaming applications through conventional input devices such as a keyboard and mouse. These conventional devices reduce physical activity and increase the chances of obesity, mental tiredness, and other unhealthy outcomes. They can be replaced using HCI technology that gives the computer the ability to recognize hand gestures and body motions of the user. Human Computer Interaction (HCI) techniques such as hand gesture recognition and voice identification can increase physical activity and make the user more engaged, thus reducing health-related risks.
Research Interests:
The aim of this work is to implement RTnet in an industrial distributed control system environment. RTnet is a hard real-time network protocol stack that provides hardware-independent, flexible communication in real-time environments. Nowadays, real-time communication is a key feature of distributed control systems. In this work, we propose a prototype system for real-time data acquisition and monitoring with real-time-compliant data transfer, both qualitative and quantitative, in industry. The system consists of one master node and two slave nodes, all running under Xenomai (a real-time operating system) with the RTnet protocol stack built on top. The slave nodes are connected to field elements for data acquisition via RS-485 and communicate with the master node via RTnet. The system achieves low latency and jitter and gives deterministic behaviour in industrial distributed control systems.
Research Interests:
An EtherCAT-based data acquisition system acquires data from field sensors through many slaves. It consists of a master with many slaves: the personal computer on which the controller runs is the EtherCAT master (SOEM), while the devices that make the data of connected I/O devices available to the master are called slaves. Ethernet has become the most popular network for data acquisition systems, reducing media cost and achieving higher performance. This paper presents the development of a standard networked data acquisition system based on the EtherCAT communication protocol. In particular, we focus on the times necessary to transfer given quantities of data among the slave stations.
Research Interests:
Biodiesel, consisting of the alkyl esters of fatty acids from vegetable oils or animal fats, is a promising alternative to petroleum diesel fuel. Biodiesel produced from vegetable oils is considered a carbon-neutral fuel, because the atmospheric carbon dioxide (CO2) it releases is consumed during plant growth. Research has shown that biodiesel-fuelled engines produce less carbon monoxide (CO), unburned hydrocarbon (HC), and particulate matter (PM) emissions than engines running on petroleum diesel fuel. However, biodiesel has a major drawback: its cold flow properties are worse than those of petroleum diesel, which prevents its use in cold atmospheric conditions. This work investigates the effect of a biodiesel additive (Wintron XC 30) on cold flow properties and examines the performance and emissions of a direct injection (DI) diesel engine. Wintron XC 30 is added at 0.25 to 2.0 vol.% in different diesel-biodiesel blends. A low percentage of biodiesel (5%) in the blend with additive shows slightly better cold flow properties than diesel. Brake specific fuel consumption (bsfc) and brake thermal efficiency (ηth) are measured as engine performance parameters, and CO, HC, NO, NO2, NOx, CO2, and O2 are measured in the emissions. B5 and B10 with lower amounts of additive show engine performance and emissions comparable to those of diesel.
Research Interests:
Power transformers represent the largest portion of capital investment in transmission and distribution substations, and power transformer outages have a considerable economic impact on the operation of an electrical network. One of the most important parameters governing a transformer's life expectancy is the hot-spot temperature. The classical approach has been to consider the hot-spot temperature as the sum of the ambient temperature, the top-oil temperature rise in the tank, and the hot-spot-to-top-oil (in-tank) temperature gradient. When fibre-optic probes came into use, winding hottest-spot temperatures higher than those predicted by the loading guides were observed during transient states after load current increases, before the corresponding steady states had been reached. This paper presents a new, more accurate calculation method for finding the hot-spot temperature based on the IEEE standard, implemented as a computer program.
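For illustration, the Python sketch below evaluates an IEEE C57.91-style hot-spot estimate: ultimate top-oil and hot-spot rises for a load factor K with a first-order exponential top-oil transition after a load step. All rated rises, exponents, time constants and the loss ratio are assumed example values, not data from this paper.

import math

def hot_spot_temperature(K, t_hours,
                         theta_amb=30.0,        # ambient temperature, C (assumed)
                         dtheta_to_rated=45.0,  # rated top-oil rise, C (assumed)
                         dtheta_hs_rated=25.0,  # rated hot-spot-to-top-oil rise, C (assumed)
                         R=5.0,                 # ratio of load to no-load losses (assumed)
                         n=0.8, m=0.8,          # cooling-mode exponents (assumed)
                         tau_to=3.0,            # top-oil time constant, h (assumed)
                         dtheta_to_initial=20.0):
    # Ultimate rises at the new load factor K.
    dtheta_to_ult = dtheta_to_rated * ((K**2 * R + 1) / (R + 1)) ** n
    dtheta_hs_ult = dtheta_hs_rated * K ** (2 * m)
    # Exponential approach of the top-oil rise towards its ultimate value.
    dtheta_to = dtheta_to_ult + (dtheta_to_initial - dtheta_to_ult) * math.exp(-t_hours / tau_to)
    return theta_amb + dtheta_to + dtheta_hs_ult

for t in (0.5, 1.0, 3.0, 8.0):
    print(f"t = {t:4.1f} h : hot spot = {hot_spot_temperature(K=1.2, t_hours=t):.1f} C")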
Research Interests:
In critical situations many vehicles meet with accidents, and as a result many people lose their lives. Some could be saved if the victim received medical care as soon as possible, but this often does not happen because no information about the location reaches rescue services when an accident occurs. This paper provides a solution to this drawback: the system identifies the accident and sends the information to the emergency care unit. If the accident is minor, the victim can press an emergency switch so that the emergency care unit skips the rescue process. The impact of the accident is detected using a vibration sensor, and an image comparison technique is introduced to protect the vehicle from theft. Speed breakers on roads have become very common owing to the increase in vehicle speeds, but these speed breakers, meant to reduce accidents, sometimes cause them: especially at night, speeding vehicles tend to overturn, heavy vehicles with loads find it difficult to stop at speed breakers, and traffic jams build up at such places. This paper proposes a solution by attaching an automated speed reduction unit to the vehicle, which reduces the speed at critical junctions automatically without the driver's intervention. An RF transmitter is placed at sign boards along the road, and the RF receiver in the vehicle communicates with the transmitter to help reduce the vehicle's speed. An alcohol sensor detects whether the driver is drunk; if it senses alcohol consumption, it prevents ignition of the vehicle. A temperature sensor measures the engine temperature and displays it on the mounted LCD display. A heartbeat sensor monitors the driver's heartbeat; if the heartbeat becomes abnormal, the vehicle speed is reduced and an alert message is sent to the emergency unit via the GSM module, with the vehicle's location shared through the GPS module mounted alongside. A gas sensor monitors the vehicle's emission level, which helps in avoiding air pollution.
Research Interests:
Wire Electro-Discharge Machining (wire EDM) is a metal-working process in which material is removed from a conductive workpiece by electrical erosion. Although not a new process, wire EDM has rapidly become a key component in the manufacture of injection moulds and metal stamping dies. This paper reviews work on machining aluminium alloy 6082, examining material removal rate and surface roughness. The input process parameters considered are wire speed and wire tension. The effect of the input parameters on performance measures such as material removal rate and surface roughness was noted experimentally, and the parameters were optimized with the help of RNS.
Research Interests:
A darknet is a private network in which connections are made only between trusted friends. In the field of computer security, a honeypot is an Internet-attached server that acts as a decoy, trapping attackers in order to study their activities and monitor how they break into a system. In this paper we present the results of operating an SSH honeypot that lures attackers who target the SSH service in order to gain illegal access. A medium-interaction honeypot offers a relatively high level of interaction to the attacker: when a connection attempt is made to a system port, the honeypot replies with specially crafted packets that emulate real network services. The fake system remained online and fully operational, capturing attacks and logging all malicious activity. Finally, we collected the data and analyzed the information.
Research Interests:
Most existing routing protocols are based on the assumption that a path exists between the sender and the receiver. Owing to intermittent connectivity, applications of delay tolerant networks (DTNs) are often characterized by network partitions. Intermittent connectivity can be the result of mobility, limited wireless range, sparsity, power management, or malicious attacks. Many real networks fall under delay tolerant networks, for example inter-planetary networks, wildlife tracking sensor networks, and military networks. However, the lack of rich contact opportunities still causes poor delivery ratios and long delays in DTN routing, especially for large-scale networks. As a consequence, routing in delay tolerant networks has received considerable attention in recent years. In this paper we present a Filtered Flooding routing protocol for delay tolerant networks using WSN nodes. We have tried to maximize the message delivery rate without compromising on the number of messages discarded. The protocol is based on the idea of exploiting nodes as carriers of messages among network partitions to achieve delivery. Filter-based prediction techniques and utility theory are used to decide which messages must be sent or dropped by the nodes at each contact event. The performance of the routing protocol is evaluated using the network simulator ns-2, and the results are presented graphically.
Research Interests:
The explosion of social media has created unprecedented opportunities for citizens to voice their opinions publicly, but it has also created serious bottlenecks when it comes to making sense of these opinions. At the same time, the urgency of gaining a real-time understanding of citizens' concerns has grown: because of the viral nature of social media, where attention is distributed very unevenly and rapidly, some issues become important quickly and unpredictably through word of mouth. The World Wide Web has opened many new channels for human interaction; users can express their opinions about various topics and discuss other users' views in blogs, forums, and online communities, all places where people can write about their feelings. Classical work in text categorization pays much attention to determining whether a text is related to a given topic, such as education, finance, or entertainment. As research progresses, however, a subtler problem is how to classify the semantic orientation of the text. Sentiment analysis is a computational-linguistics task that identifies positive and negative opinions, emotions, and evaluations.
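As a minimal baseline for the sentiment-analysis task described above, the Python sketch below scores polarity with a tiny hand-made lexicon; the word lists and example posts are invented for illustration.

# Toy lexicon-based polarity scoring, a common baseline for sentiment analysis.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "angry"}

def polarity(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "I love the new transport policy, great decision",
    "terrible service and poor communication from the council",
    "the meeting is scheduled for Monday",
]
for p in posts:
    print(f"{polarity(p):8s} <- {p}")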
Research Interests:
College Management Software is web-enabled software designed to manage the entire operations of an institution. It is a simple yet powerful integrated platform that connects the various departments of an institution, such as administration, accounts, the student section, students themselves, and many more specialized modules. Over the years, notice boards and important academic notifications have been handled manually in almost all educational institutions; this process is not only time-consuming but also inefficient. Today, paper-based notice boards no longer need to be maintained, and this system adds a web-based component through which notifications issued by the institute can be received. Following this idea, we have developed a system based on web services, implemented as an Android mobile application as well as on the PC, which communicates with a database residing on a remote server. A unique ID system provides unique identification numbers to the persons using the system; the UID number not only helps the administrator track down individuals, but also makes life far easier for users, who no longer have to submit multiple documents each time because those documents are already available for use.
Research Interests:
A delay tolerant network (DTN) is a computer network architecture approach for networks that may lack continuous connectivity, and epidemic routing is a common approach in DTNs. For efficient transmission of messages in the network, the message delivery ratio should be high and the delay should be minimal, so an effective scheduling strategy must be determined. The proposed concept is a new scheduling framework for epidemic and two-hop forwarding routing in heterogeneous DTNs, so that the forwarding decision made at a node yields an optimal message delivery ratio and minimal delay.
Research Interests:
India is blessed with rich solar energy, and if it is exploited efficiently the country has the potential to produce trillions of kilowatt-hours of electricity. Sunlight is converted to electricity directly when it falls on solar photovoltaic (SPV) modules. Systems and devices for various applications are built from SPV modules connected to suitably designed power conditioning units to meet electricity requirements. These systems are designed to work in off-grid mode, usually supported by batteries to allow use when sunlight is low or at night. In recent years solar PV systems have become viable and attractive; utility-scale, ground-mounted plants are being set up worldwide under promotional mechanisms. Available rooftop area on buildings can also be used for solar PV power plants, dispensing with the requirement for free land. The electricity generated by SPV systems can also be fed to the distribution or transmission grid after conditioning for grid integration. The whole world is currently in the midst of an energy revolution that is fundamentally changing the future of rural electrification, so this paper presents a review of today's policy and the status of grid-connected rooftop PV systems in Rajasthan.
Research Interests:
Materials have been in use since time immemorial. Our world is all about materials, which is why Materials Science and Engineering is taking a prominent, centre-stage position in many developed and developing countries. Over the years there have been changes in man's choice of materials for engineering activities, and the ages and periods of human activity on earth are often referred to by the materials in vogue at the time, such as the Stone Age, the Iron Age, and the current Silicon Age. The challenges of the present world are constantly fuelling the discovery and development of new kinds of materials with the desired properties and the right cost. This article therefore reviews the advances made in engineering materials, their classification, and the role and importance of engineering materials in the present-day world with respect to their properties and areas of application.
Research Interests:
Traffic congestion has become a major problem in the populous cities of India, and Ahmedabad is no exception, owing to its high population density, rapid urbanization, industrialization, and lack of traffic discipline. Transportation researchers brainstormed many alternatives to resolve the issue of traffic congestion, and finally a BRT system for Ahmedabad, named "Janmarg", was settled upon. Initially it was thought that BRTS alone would be a sure remedy for the shortcomings of the public transportation services provided by AMTS (Ahmedabad Municipal Transportation Services), as there was no other major transportation system in the city. In parallel, planners also considered that a multi-modal transportation system was now a must for a city like Ahmedabad, the most populous city and the financial capital of Gujarat; with these considerations, a rail project was announced as early as 2003. The objective of this paper is to provide a literature review of the comparative analysis of BRTS with other transportation systems, justifying the need for and the analysis of the same.
Research Interests:
Malicious and selfish nodes are a serious threat to delay tolerant networks (DTNs). A delay tolerant network is an intermittently connected network that can tolerate larger delays than other networks. Misbehaving nodes must be identified so that data can be transferred accordingly. Most delay tolerant networks use incentive schemes to make selfish or misbehaving nodes participate effectively in packet transmission and to reduce the rate of packet loss. Currently, a probabilistic misbehaviour detection scheme is used for effective transmission of packets in the delay tolerant network. This paper is a survey of delay tolerant networks and deals with the different techniques involved in them.
Research Interests:
In this paper we compare an optical EXOR gate with and without an additional input beam, using the two nonlinear properties of an SOA, cross-gain modulation (XGM) and cross-phase modulation (XPM). A CW laser or clock signal is used as the additional input beam. The extinction ratio of the EXOR gate with the additional input beam is 44.679 dB, and without the additional input beam it is 11 dB; both gates are operated at 10 Gb/s.
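The extinction ratio quoted above is a power ratio expressed in decibels; the short Python sketch below shows the conversion, with sample high/low power levels chosen only to land near the reported 44.7 dB figure.

import math

def extinction_ratio_db(p_high, p_low):
    """Extinction ratio in dB from high and low output power levels."""
    return 10 * math.log10(p_high / p_low)

# e.g. an output swinging between 2.94 mW and 0.1 uW gives roughly 44.7 dB
print(f"{extinction_ratio_db(2.94e-3, 1e-7):.1f} dB")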
Research Interests:
Over the past few years, concern about security has been increasing day by day; without proper protection, any part of any network can be susceptible to attacks or unauthorized activity. A ring signature is a type of digital signature that enables a user to sign a message so that a ring of possible signers is identified, without revealing which member of that ring actually generated the signature. Ring signatures are completely ad hoc in nature: there is no requirement for any central authority or coordination among different users. In this paper, we review and summarize ring signature schemes, scrutinize their relationships with other existing cryptographic schemes, and discuss the uses and mechanisms of ring signatures.
Research Interests:
A magnetorheological (MR) fluid comprising magnetically soft particles suspended in a carrier solvent is described. The MR fluid also includes additive particles, smaller than the magnetically soft particles, and a bridging polymer; the additive particles and polymer form a gel-like material that provides a blanket or coating around the magnetically soft particles. Such MR fluids possess improved stability and redispersibility as well as favourable mechanical properties. The concept of vibration control with smart fluids embedded in flexible structures has been a centre of interest for the past two decades. Although much research has been done on structures with embedded electrorheological (ER) fluids, there has been little investigation of magnetorheological (MR) fluid adaptive structures. The magnetorheological response of MR fluids results from the polarization induced in the suspended particles by the application of an external field; the interaction between the resulting induced dipoles causes the particles to form columnar structures parallel to the applied field. The relationship between the magnetic field and the complex shear modulus of MR materials in the pre-yield regime is investigated using oscillatory rheometry techniques. A structural dynamic modelling approach is discussed, and the vibration characteristics of MR adaptive structures are predicted for different magnetic field levels. In addition to the model predictions, an actual MR adaptive beam is fabricated and tested. Both studies illustrate the vibration minimization capabilities of the MR adaptive beam at different magnetic field levels.
Research Interests:
Respiration ensures the supply of oxygen to the human body, and respiratory parameters are used to characterize a person's state of health. These parameters need to be measured unobtrusively over long durations, so a wireless portable sensor system is proposed to monitor them. The monitoring system measures the user's respiratory air flow with a thermal flow sensor, body posture with a tri-axis accelerometer, and oxygen saturation with a photoelectric sensor. Algorithms are proposed for deriving respiration parameters and estimating body posture. The monitored parameters are transmitted wirelessly via a ZigBee transmitter to a PC in the hospital so that the results can be recorded and analyzed, or forwarded to a remote centre or to physicians. The respiratory flow sensor can detect weak respiration, and the tri-axis accelerometer that detects body posture provides a reference for respiration movement. Respiratory parameters such as tidal volume, peak inspiratory flow, minute ventilation, and respiration rate are derived from the acquired respiratory flow data. The system can act as both a sleep recorder and a spirometer, making it capable of monitoring and diagnosing various respiratory diseases such as obstructive sleep apnea, asthma, and chronic obstructive pulmonary disease. A laptop or mobile phone connected to the Internet enables the system to provide remote monitoring and timely risk alarms if any respiratory distress occurs, making telemedicine achievable.
Research Interests:
Different users usually have different information needs when they use search engines to find information on the web, but current web search engines provide the same results to every user. To overcome this problem we propose a personalized web search engine: based on the user profile and domain knowledge, the system keeps updating the user profile into an enhanced user profile, which is then used to suggest relevant URLs and web pages. With the use of re-ranking, the system provides relevant information based on the user's searching and browsing history.
Research Interests:
With increasing concern about occupational safety, people pay more and more attention to mechanical protection. At present, relevant research focuses on the protective performance of high-performance, high-strength materials. The influence of mechanical properties on the protective performance of a material is discussed in this paper. The material considered is made of graphene, SWNT, and aluminium layers, and the model is created in the Pro/E software. When such a material is used for defence and for personal safety in wartime, the problem is that when an impact is applied at a point, deformation occurs at that point and the local deflection is very high, so a person wearing the material as body armour experiences severe pain. We therefore introduce a new composite material design (i.e., with aluminium layers attached at the front and back) that converts a point load into a load distributed across the material, thereby reducing the point deformation. In this paper, validation of the experimental and numerical results of impact tests on the composite laminate (aluminium layers, graphene, SWNT) has been carried out using ANSYS. The mechanical response and energy absorption characteristics of a material under high-speed projectile impact depend on its intrinsic constitutive relations, construction parameters such as material type and construction, areal density, and projectile shape, and impact conditions such as impact velocity and boundary conditions.
Research Interests:
Wireless Sensor Networks (WSNs) are networks containing a large number of sensor nodes with limited sensing, computation and communication capabilities; in essence, a WSN is a network of sensing devices with communication capabilities. WSNs have advantages and disadvantages depending on the field in which they are used. To increase the lifetime of the network, it is necessary to reduce the number of bits transmitted over the channel; doing so automatically increases the network lifetime. This paper introduces the basics of WSNs, their security issues, and data aggregation, a technique for reducing data transmission over the network. Data aggregation raises many security issues, for example data integrity, confidentiality and freshness, so it becomes crucial when the WSN is deployed in a remote or hostile environment where sensors are prone to node failures and compromises. Secure data aggregation schemes help achieve security in WSNs. In this paper, we propose a secure data aggregation scheme which provides end-to-end data privacy. Through this technique the average number of bits transmitted per node is reduced by 35%-50%.
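To illustrate the basic saving (not the paper's end-to-end privacy scheme), the sketch below assumes a hypothetical 16-bit reading size and a two-hop member-to-head-to-sink path, and compares the bits sent with and without aggregation at the cluster head:

READING_BITS = 16  # hypothetical payload size per reading

def no_aggregation_bits(readings):
    # Every reading travels two hops: member -> cluster head -> sink.
    return len(readings) * READING_BITS * 2

def aggregation_bits(readings):
    # Members still send one hop to the head, but only one aggregate
    # value (e.g., the average) crosses the link to the sink.
    aggregate = sum(readings) / len(readings)
    return len(readings) * READING_BITS + READING_BITS, aggregate

readings = [21.5, 21.7, 22.0, 21.9, 21.6]
plain = no_aggregation_bits(readings)
agg, value = aggregation_bits(readings)
print(f"without aggregation: {plain} bits, with aggregation: {agg} bits (avg {value:.2f})")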
Research Interests:
Modern mobile communication systems require high-gain, large-bandwidth and small antennas that are capable of providing better performance over a wide range of the frequency spectrum. This paper presents the design of a microstrip rectangular patch antenna with a center frequency of 1.176 GHz for IRNSS applications. A four-by-one (1x4) rectangular microstrip patch array with microstrip line feeding based on quarter-wave impedance matching was designed and simulated using the Advanced Design System (ADS) tool. The performance of the designed antenna was then compared with a single rectangular patch antenna in terms of return loss, Voltage Standing Wave Ratio (VSWR), bandwidth, directivity, radiation pattern and gain. The array antenna was designed on an FR-4 substrate with a dielectric constant of 4.6 and a thickness of 1.6 mm.
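A minimal sketch of the quarter-wave matching calculation is shown below; the assumed patch edge resistance and the use of sqrt(eps_r) for the guided wavelength are simplifying assumptions, not values from the paper:

import math

def quarter_wave_match(z0=50.0, z_patch=200.0, f=1.176e9, eps_r=4.6):
    # z0:      feed-line characteristic impedance (ohm)
    # z_patch: assumed patch edge resistance (ohm) -- illustrative only
    # f:       design frequency (Hz), 1.176 GHz for IRNSS L5
    # eps_r:   substrate relative permittivity (FR-4 ~ 4.6)
    c = 3.0e8
    z_t = math.sqrt(z0 * z_patch)            # quarter-wave transformer impedance
    # Rough guided wavelength using sqrt(eps_r); a full design would use the
    # effective permittivity of the actual microstrip line width.
    lam_g = c / (f * math.sqrt(eps_r))
    return z_t, lam_g / 4.0

z_t, length = quarter_wave_match()
print(f"transformer impedance ~ {z_t:.1f} ohm, length ~ {length * 1000:.1f} mm")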
Research Interests:
Density-based clustering algorithms are a fundamental technology for data clustering, with many attractive properties and applications. In high-dimensional data, clusters are embedded in various subsets of the dimensions. Density-based subspace clustering algorithms treat clusters as dense regions compared to noise or border regions. The major challenge of high-dimensional data is the curse of dimensionality, meaning that distance measures become increasingly meaningless as the number of dimensions in the data set increases. Another major challenge is that high-dimensional data contains many dimensions that are irrelevant to clustering. These irrelevant dimensions confuse clustering algorithms by hiding clusters in noisy data. The task is to reduce the dimensionality of the data without losing important information.
Research Interests:
Cloud computing plays a major role in parallel and distributed data processing, and the Hadoop environment is mainly used for the storage and processing of such data. In Hadoop, the MapReduce framework is a programming model which processes terabytes of data in very little time. The MapReduce framework uses a task scheduling method to schedule tasks, and various methods are available for scheduling tasks in MapReduce. Scheduling jobs in parallel across nodes is a major concern in a distributed file system. The discussion of these research papers emphasizes the scheduling vulnerabilities of existing as well as new techniques in the field of distributed file systems.
Research Interests:
Large and heterogeneous data sets are maintained by modern databases and web databases. Dynamic Query Form (DQF) is a novel database query form interface which is able to generate query forms dynamically. The purpose of DQF is to capture user feedback and rank query form components, helping the user make decisions. At each iteration the system automatically generates a ranked list of form components and the user includes the desired components into the query form; the user then fills the query form and submits it to view the query result. In this way, a query form can be dynamically refined until the user is satisfied with the query results. We use the expected F-measure to quantify the goodness of a query form, and a selection model is developed for estimating this goodness in DQF. Our evaluation demonstrates the effectiveness and efficiency of the system. This is an independent approach for efficiently generating dynamic search interfaces over databases, where the system provides interaction among the user, experts and the IR system. If the user needs experts, a query can also be submitted to the hundreds of thousands of experts in our system. The search then proceeds on the basis of query expansion; widely used query expansion methods for this purpose are global analysis and local analysis. We present a comparative study of query expansion with dynamic document analysis and thesaurus-based analysis using an HTML parser.
Research Interests:
History is evidence of the destruction caused by tropical cyclones. Researchers in the past have tried to analyze the trends in the annual occurrence of these tropical cyclones so as to forecast their occurrence in upcoming years. However, the variation in the frequency of severe cyclonic storms (tropical cyclones of higher intensity) has a random nature, and conventional statistical techniques prove incapable of analyzing the trends. In this paper, a unique Artificial Neural Network (ANN) based technique is proposed to analyze the trends in the frequency of severe cyclonic storms in the Bay of Bengal region. The proposed ANN-based technique makes use of the invertible nature of exponential smoothing to enhance the learning process: the ANN is trained using smoothed target data and the output of the ANN is inverse-smoothed to obtain the forecast. The ANN-based method maps the data much better than conventional statistical methods and gives a fairly accurate forecast, which will help to mitigate the horrific effects of tropical cyclones.
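A minimal sketch of the smoothing and inverse-smoothing steps is given below; the storm counts, the smoothing factor and the stand-in for the ANN output are hypothetical:

import numpy as np

def exp_smooth(x, alpha):
    # Simple exponential smoothing: s[0] = x[0], s[t] = alpha*x[t] + (1-alpha)*s[t-1].
    s = np.empty_like(x, dtype=float)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s

def inverse_smooth(s_next, s_prev, alpha):
    # Invert one smoothing step: recover x[t] from s[t] and s[t-1].
    return (s_next - (1 - alpha) * s_prev) / alpha

# Illustrative use (numbers are made up): train any model on the smoothed
# series, then map its smoothed prediction back to the original scale.
counts = np.array([3, 7, 2, 9, 4, 6, 8], dtype=float)   # yearly storm counts (hypothetical)
alpha = 0.4
smoothed = exp_smooth(counts, alpha)
predicted_smoothed = smoothed[-1]                        # stand-in for an ANN output
forecast = inverse_smooth(predicted_smoothed, smoothed[-1], alpha)
print(smoothed, forecast)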
Research Interests:
For wireless sensor networks with a large number of energy-constrained sensors, it is very important to find a method of organizing sensors into clusters so as to minimize the energy used to communicate information from all nodes to the processing center. In this paper, we look at data compression methods which can have a significant impact on the overall energy dissipation of these networks. Based on our findings that conventional direct transmission uses much more energy than minimum-transmission-energy or cluster-based transmission, and that spatial correlation may not be efficient for large sensor networks, we propose implementing in-network compression and optimum clustering together for large WSNs.
Research Interests:
Sometimes it is very important to establish relationships between various performance and control parameters. Today many modeling techniques are available to establish relationships between performance parameters and control parameters for material handling equipment. Application of the fuzzy logic technique provides a time-saving, non-functional, free-form approach to capture the effect of control parameters like cable length, cable diameter and travel speed on performance parameters like inner diameter of the drum, number of teeth of the sprocket and motor RPM for a cable reeling drum. The results of the fuzzy model are also compared with a mathematical model developed using DataFit software. Three membership functions are used for both the input and output parameters in a Mamdani-type fuzzy inference engine in MATLAB.
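The sketch below illustrates Mamdani-style inference with triangular membership functions in Python; the universes, rule base and membership ranges are illustrative assumptions, not the calibrated model from the paper:

import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function with feet a, c and peak b.
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical output universe (motor RPM) with two fuzzy sets.
rpm = np.linspace(0.0, 100.0, 501)
low_rpm = trimf(rpm, 0.0, 25.0, 50.0)
high_rpm = trimf(rpm, 50.0, 75.0, 100.0)

def infer(cable_length):
    # Input memberships over cable length in metres (illustrative ranges).
    short = trimf(cable_length, 0.0, 15.0, 30.0)
    long_ = trimf(cable_length, 20.0, 40.0, 60.0)
    # Mamdani min-max: IF short THEN high RPM; IF long THEN low RPM.
    aggregated = np.maximum(np.minimum(short, high_rpm), np.minimum(long_, low_rpm))
    return (rpm * aggregated).sum() / aggregated.sum()   # centroid defuzzification

print(infer(25.0))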
Research Interests:
The suspension system is a crucial part of automotive design from the point of view of passenger comfort. The Dr. Nano Inc. laboratory has introduced a flat composite C-spring to replace the coil spring in the rear suspension of a light passenger vehicle. The spring is made of E-glass epoxy material, and the particular spring taken up for study is for the compact sedan Maruti Suzuki Swift Dzire. The dissertation work carried out deals with the stress and modal analysis of this composite spring by FEM using the ANSYS 15 software. The results show that the composite C-spring has much lower stresses, a higher natural frequency and a higher strength-to-weight ratio.
Research Interests:
After a cryogenic fluid has been liquefied and purified to the desired level, it must be stored and transported. Cryogenic fluid storage-vessel and transfer-line design has progressed rapidly as a result of the growing use of cryogenic liquids in many areas of engineering and science. The development of the Dewar vessel represented such an improvement in cryogenic fluid storage vessels that it could be classed as a "breakthrough" in container design. The high-performance storage vessels in use today are based on the Dewar design principle: a double-walled container with the space between the two vessels filled with insulation and evacuated. Detailed conventional cryogenic-fluid storage vessel design is covered in standards such as the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section VIII (1983), and British Standards Institution standards 1500 and 1515. Most users require that vessels be designed, fabricated and tested according to the code for sizes larger than about 250 dm3 (66 U.S. gallons), because of the proven safety of code design. Cryogenic storage vessels are pressure vessels used for storing cryogenic liquids with as little heat in-leak from the outside as possible. The design challenge is to use materials that do not lose their desirable properties at such low temperatures. Here the utmost care is taken to design a storage vessel satisfying both the mechanical and the thermal design, and the results will be compared with an existing industrial vessel.
Research Interests:
Hadoop YARN is a software framework that supports data-intensive distributed applications. Hadoop creates clusters of machines and coordinates the work among them. It includes two major components, HDFS (Hadoop Distributed File System) and MapReduce. HDFS is designed to store a large amount of data reliably and provide high availability of data to user applications running at the client. It creates multiple data blocks and stores each block redundantly across a pool of servers to enable reliable, extremely rapid computation. MapReduce is a software framework for analyzing and transforming a very large data set into the desired output. This paper focuses on how replicas are managed in HDFS to provide high availability of data under extreme computational requirements. Later, the paper focuses on possible failures that will affect the Hadoop cluster and on the failover mechanisms that can be deployed to protect the cluster.
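The sketch below illustrates the MapReduce programming model itself (map, shuffle, reduce) in plain Python; it is not Hadoop's Java API, and the word-count example is only a stand-in for the analyses the paper discusses:

from collections import defaultdict

def map_phase(record):
    # "Map": emit (key, value) pairs for every word in an input record.
    for word in record.split():
        yield word.lower(), 1

def reduce_phase(key, values):
    # "Reduce": fold all values for one key into a single result.
    return key, sum(values)

def mapreduce(records):
    groups = defaultdict(list)
    for record in records:                  # map over the input splits
        for key, value in map_phase(record):
            groups[key].append(value)       # shuffle: group values by key
    return dict(reduce_phase(k, v) for k, v in groups.items())

print(mapreduce(["HDFS stores blocks", "MapReduce processes blocks"]))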
Research Interests:
The Variable Frequency Drive (VFD) is the most widely used drive in industry nowadays. However, the conventional VFD incurs some energy loss because dynamic braking is used. There are several methods for converting and reusing the energy regenerated when the motor acts as a generator, so that this energy can be saved rather than dissipated as heat in dynamic braking. Some of these methods are presented in this paper.
Research Interests:
This paper deals with the static stress analysis of a quench tank. The quenching process is used to enhance the hardness and strength of some automobile and railway bearing parts. The main objective of this paper is to study the theory behind the stress analysis of a quench tank due to the stored salt, which is the medium through which quenching is done. The article uses finite-element analysis to determine the stress distribution of a quench tank, particularly one of rectangular shape. Numerical simulation is carried out to determine the plate thickness required for the internal pressure. The stresses developed in the quench tank are analyzed using ANSYS, a versatile finite-element package. Finally, the theoretical values and the ANSYS values are compared with an experimental setup for the quench tank analysis. The results can also significantly help in strengthening the tank walls to sustain the stresses developed during the quenching operation.
Research Interests:


In recent years, advances in hardware technology have led to an increase in the capability to store and record personal data about consumers and individuals. This has led to concerns that personal data may be misused for a variety of purposes. Privacy preservation is an important issue in the areas of data mining and security. The aim of privacy-preserving data mining is to develop algorithms that modify the original dataset so that the privacy of confidential information remains preserved and no confidential information can be revealed as a result of applying data mining tasks. The data set complementation approach expands the sample storage size (in the worst case, the storage size equals (2|TU-1|*|TS|)); perturbation can improve the storage size using algorithms such as C5.0. We optimize the processing time when generating a decision tree from those samples and their functional dependencies. This work aims to optimize processing time, improve storage size, and handle functional dependencies.
Delay Tolerant Networks (DTNs) support data transfer in challenging environments where a fully connected end-to-end path may never exist between a source and a destination. These networks deal with large transmission delays, frequently disconnected paths, high link and path error rates, and limited resources; examples of this kind of network are satellite communication, ad-hoc and sensor networks, and vehicular networks. DTN routing protocols utilize the mobility of the nodes and the buffering of messages, making it possible for a node to carry a message and in that way bridge partitions in the network; this is also known as store-carry-forward. In this paper, we propose a reliable routing protocol for DTNs based on the network coding Multi Generation Mixing policy, which increases reliability while increasing data transmission delay compared to the best-performing protocols. We simulate a network coding-based information delivery method in wireless networks which improves network performance and reliability. Network coding is a paradigm for efficiently broadcasting data in wireless networks in which data flows coming from multiple sources are combined to reduce delay and enhance robustness against node failures. Simulation results show that the proposed Multi Generation Mixing Epidemic Routing Protocol (Epidemic_MGM) increases the delivery probability compared to the traditional epidemic routing protocol.
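As a much-simplified illustration of the coding idea (the proposed protocol mixes packets across generations with random linear coding, which is not shown here), the sketch below XORs two packets at a relay so that each destination can decode the packet it is missing:

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # XOR two equal-length packets byte by byte.
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"HELLO_NODE_B"
p2 = b"HELLO_NODE_A"
coded = xor_bytes(p1, p2)            # the relay broadcasts one coded packet

# Node B already holds p1, node A already holds p2; each decodes the missing packet.
recovered_at_b = xor_bytes(coded, p1)
recovered_at_a = xor_bytes(coded, p2)
assert recovered_at_b == p2 and recovered_at_a == p1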
By implementing feature-based CAM software from Delcam and investing in CNC mills, industries can reduce machining time and part-programming time. Determined to be competitive not only in quality but also in manufacturing methods, manufacturing companies are upgrading their capability with the help of Delcam software to take models of their components from 3D modeling software to production quickly. Typical machining tolerances in the specified range can be consistently achieved. In this thesis, the time taken to manufacture the main housing used in a weighing machine is estimated. The main housing is drawn in the 3D-modeling, feature-based software Pro/Engineer, and the models are then imported into the feature-based CAM software Delcam. The time taken for modeling, part programming and machining using Pro/Engineer and Delcam is compared and analyzed.
The objective of this paper was to predict the failure load of carbon/epoxy composite test specimens using online acoustic emission (AE) monitoring and artificial neural networks (ANN). The test specimens were carbon/epoxy rings made of carbon T700 fibers and epoxy resin; these rings were tested on a BISS 300 kN servo-hydraulic Universal Testing Machine (UTM) with split-disk test fixtures to ensure uniform distribution of load on the ring, with AE sensors fixed on the specimen at discrete locations. A series of 24 carbon/epoxy rings was monitored with an acoustic emission system while being loaded up to failure, and the AE signals emitted due to different failure modes in the tensile specimens were recorded. Amplitude, duration, energy, counts, etc., were the effective parameters for classifying the different failure modes in composites, viz. matrix crazing, fiber cut and delamination, with several subcategories such as matrix splitting, fiber/matrix debonding and fiber pullout. A multi-layer backpropagation neural network was generated to predict the failure load of the tensile specimens. The network was trained with the amplitude-distribution data of AE collected up to 50%, 60% and 70% of the failure load, respectively, along with the slope of the cumulative amplitude distribution plot. Ten specimens were in the training set with their corresponding failure loads. The trained network was able to predict the failure loads of the remaining 14 specimens within an acceptable error tolerance. Comparing the results, we found that the network trained with 60% of the data had the better prediction performance.
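A hedged sketch of such a network is given below using scikit-learn's MLPRegressor; the feature layout, bin counts and synthetic data are placeholders, not the paper's AE measurements:

import numpy as np
from sklearn.neural_network import MLPRegressor

# Features: a binned AE amplitude distribution recorded up to a fraction of the
# expected failure load, plus the slope of the cumulative amplitude plot.
# Target: measured failure load in kN. All values below are synthetic.
rng = np.random.default_rng(0)
X_train = rng.random((10, 11))          # 10 training rings, 10 amplitude bins + slope
y_train = 80 + 40 * rng.random(10)      # failure loads (kN), synthetic

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

X_test = rng.random((14, 11))           # 14 remaining rings
predicted_failure_loads = model.predict(X_test)
print(predicted_failure_loads.round(1))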
Cable-supported structures have distinctive dynamic behaviour. The extradosed bridge is intermediate between a girder bridge and a cable-stayed bridge; owing to its shallow cables, the structural behaviour of an extradosed bridge differs from that of a cable-stayed bridge. The forced vibration of a structure for a given earthquake time history is governed by the peak acceleration. For cable-stayed structures such as the extradosed cable-stayed bridge, it is difficult to predict the dynamic response using the usual methods of dynamic analysis applied to other bridge structures, such as response spectrum analysis, while an accurate analysis such as time history analysis is time consuming and has time and cost implications; nonlinearities can only be considered in time history analysis. The proposed method correlates the peak ground acceleration (PGA) and the earthquake deformation ratio (EDR), which can be used for simplified dynamic analysis and can prove a handy tool for structural engineers to assess earthquake-related serviceability at the initial design stages without much complicated analysis. This ratio can also be used to present seismic damage indices. The proposed method considers an extradosed bridge as an example.
Nowadays speech recognition is used widely in many applications. In computer science and electrical engineering, speech recognition (SR) is the translation of spoken words into text; it is also known as "automatic speech recognition" (ASR), "computer speech recognition", or simply "speech to text" (STT). A hidden Markov model (HMM) is a statistical Markov model in which the system being modelled is assumed to be a Markov process with unobserved (hidden) states; an HMM can be presented as the simplest dynamic Bayesian network. Dynamic time warping (DTW) is a well-known technique for finding an optimal alignment between two given (time-dependent) sequences under certain restrictions; intuitively, the sequences are warped in a nonlinear fashion to match each other. An ANN is a non-linear, data-driven, self-adaptive approach; it can identify and learn correlated patterns between an input dataset and the corresponding target values, and after training it can be used to predict the outcome of new independent input data.
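A minimal DTW implementation is sketched below; the toy sequences stand in for feature tracks of two utterances of the same word spoken at different speeds:

import numpy as np

def dtw_distance(a, b):
    # Classic O(len(a)*len(b)) dynamic time warping distance between two sequences.
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Best of match, insertion and deletion from the neighbouring cells.
            cost[i, j] = d + min(cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1])
    return cost[n, m]

print(dtw_distance([1, 2, 3, 4, 3, 2], [1, 1, 2, 3, 4, 4, 3, 2]))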
A vehicle is a structural assembly consisting of many components coupled together to make the vehicle run under initial conditions as well as under various load conditions. Vehicles such as trucks, buses and cars are designed to carry loads. Different types of vehicles with a wide variety of applications are present in the market, with capacities varying from 1 tonne to 40 tonnes and more. The present scenario in the automotive industry is an increase in demand for trucks, not only on the cost and weight aspects but also for improved complete-vehicle features and overall work performance. The chassis plays an important role in the design of any truck. Chassis design in general is a complex methodology, and arriving at a solution which yields good performance is a tedious task. Since the chassis has a complex geometry and loading pattern, there is no well-defined analytical procedure to analyze it, so a numerical method of analysis is adopted, of which the finite element technique is the most widely used. The vehicle chassis is an important part which supports the major load of the vehicle assembly. As the chassis plays such a vital role, its design has to be subjected to structural analysis to validate it against all possible cases of load application and failure, and thereby strengthen the design.
A truck's chassis frame forms the structural backbone of a commercial vehicle. The main function of the truck chassis frame is to support the components and payload placed upon it. When the truck travels along the road, the chassis is subjected to vibration induced by road roughness and excitation by vibrating components mounted on it. This paper presents a study of the vibration characteristics of the truck chassis, including the natural frequencies and mode shapes. The responses of the truck chassis, including the stress distribution and displacement under various loading conditions, are also observed. The method used in the numerical analysis is the finite element technique. The results show that road excitation is the main disturbance to the truck chassis, as the chassis natural frequencies lie within the road excitation frequency range. The mode shape results determine suitable mounting locations for components like the engine and suspension system. Some modifications are also suggested to reduce the vibration and to improve the strength of the truck chassis.
In this paper a robust system for enabling robots to detect and identify humans in domestic environments is proposed, and a sample model of a live human detection and tracking system based on a microcontroller is presented. In recent days high-speed wireless technologies are improving continuously in the field of automation applications. Using these advanced technologies, a fast, accurate robotic controlling device based on advanced control algorithms is developed. The critical part of the system is the microcontroller unit interfaced to the robotic circuitry; the mechanical movements are monitored and controlled by the microcontroller in the control circuitry. To detect a live person we use a PIR (passive infrared) sensor, which detects whether a person is alive. The remote uses a certain range of radio frequencies to transfer commands from the remote to the robot, by which the direction of the robot can be changed. The CPU of the robot is a microcontroller (AT89C52). Power is derived from a set of three 9 V batteries, and locomotion is provided by two geared motors attached to the chassis.
We have been using passwords for a long time; they have been one of the best and most widely adopted security measures in various applications. There are many ways of giving and receiving passwords, including text, audio, picture and problem-solving, but one of the most widely used is text-mode authentication. However, text-mode passwords have a serious drawback: they are vulnerable to being traced. This is what needs to be improved, and it is the focus of this research.
Cluster analysis divides data into groups. It was developed mainly for the purpose of summarization and improved understanding. Examples of cluster analysis include grouping related documents for browsing, finding genes and proteins that have similar functionality, and using clustering as a means of data compression. The term clustering has a long history, and a large number of clustering techniques have been developed in statistics and pattern recognition. This paper provides a short introduction to cluster analysis and then focuses on the challenge of clustering high-dimensional data. We present a brief overview of several recent techniques, including a more detailed description of our own recent work which uses a concept-based clustering approach.
Cognitive radio is widely expected to be the next Big Bang in wireless communications. Spectrum sensing, that is, detecting the presence of the primary users in a licensed spectrum, is a fundamental problem for cognitive radio. In this paper spectrum sensing techniques are reviewed.
In today's world, where technologies are emerging at high speed and new regulations and standards for noise emission are tightening in the automotive industry, there is a need to rethink the current approach to noise reduction and to make improvements in decreasing engine noise. Nowadays, perforated reactive mufflers, which have an effective damping capability, are used specifically for this purpose, and new designs should be analyzed with respect to both acoustics and back pressure. In this study, a reactive perforated muffler is investigated numerically and experimentally. For the acoustical analysis, the transmission loss, which is independent of the sound source, of the present cross-flow perforated muffler was computed. To validate the numerical results, the transmission loss was measured experimentally. The back pressure was obtained from the flow-field analysis and was also compared with experimental results. The numerical results show an approximate error of 20% compared to the experimental results.
Instrumentation transformers act as the eyes and ears of a power system. Many measurement and protection activities depend on current transformers (CTs) as the primary sensing unit. Hence, it is of utmost importance that the output of a CT be absolutely trustworthy. However, CTs show a tendency to saturate, which leads to an erroneous secondary waveform and can cause malfunctioning of the systems that depend on the CT. This paper proposes a technique to enhance ANN-based reconstruction of the erroneous secondary current waveform. The proposed technique uses an artificial neural network to forecast the ideal waveform. The network uses two inputs: (1) the erroneous secondary waveform, and (2) an exponentially smoothed secondary waveform, which acts as an assisting input. The smoothing factor is determined using a genetic algorithm. Extensive simulations indicate that the proposed technique efficiently generates the reconstructed CT secondary waveform.
The cloud is an extreme data center for outsourced data. The quality of video service becomes poor when there is long buffering and sporadic disturbance. As the demand for video traffic across mobile networks has grown, the wireless link capacity cannot keep up with the traffic demand. The cloud is a well-known tool because of its functional architecture, and data sharing is another common function for cloud users. Such data sharing includes video data in social networks, frequently through mobile devices like smartphones, tablets and laptops. Meeting video traffic demand across mobile networks has become a difficult task, and the quality of service is reduced because of the gap between the traffic demand and the link capacity; mobile networks have limited bandwidth and long buffering times. To solve these issues, adaptive mobile video streaming (AMoV) and efficient social video sharing (ESoV) have been proposed. AMoV and ESoV build a private agent to provide video streaming services for each mobile user, and the two approaches show scalable results in a social network environment. With this framework, excessive buffering time and interference can be avoided.
Web usage mining is an application of web mining which implements various data mining techniques for the discovery and analysis of patterns in clickstream and associated data collected or generated as a result of user interactions with web resources on one or more web sites. It consists of three phases: data preprocessing, pattern discovery and pattern analysis. In the pattern analysis phase, interesting knowledge is extracted from frequent patterns, and these results are used for website modification. In this paper, a hybrid approach is used to fetch both HTML and XML contents from a web page. In this approach, the combined effort of the FP-Growth algorithm and a decision tree is applied for pattern discovery, which helps in finding effective usage patterns. The FP-Growth algorithm is used to remove unimportant information from the contents, and the decision tree is used to fetch the contents from a web page.
A Mobile Ad-Hoc Network (MANET) is a multi-hop, dynamic and autonomous network composed of a number of wireless mobile nodes. Building on a commonly known tree-based multicast routing protocol, MAODV (Multicast Ad-hoc On-Demand Distance Vector), this paper proposes a way to improve the protocol's performance in terms of throughput and to reduce the number of mobile nodes participating in multicast routing, which significantly reduces the overall routing-related control overhead. The ring search approach, based on the TTL value used by the route discovery process, is modified, and the performance of the modified algorithm is evaluated.
Advancement in technology is affecting almost all fields, and automotive does not remain untouched. Every automobile manufacturer spends millions on the development of cutting-edge technologies to equip their vehicles with the latest innovations that keep drivers safe and accident-free and give them a better driving experience. These technologies are known in the industry as Advanced Driver Assistance Systems, and they are controlled by real-time embedded systems. This paper presents a new method for a driver assistance system.
This paper proposes a novel control strategy for a grid-connected VSC with LCL filters using a fuel cell and a PV cell. The proposed control strategy is inherently capable of attenuating the resonance phenomenon of such systems, which is an advantage over existing methods that require additional damping techniques. Moreover, the proposed vector control strategy is able to fully decouple the direct (d) and quadrature (q) components of the current in a rotating reference frame. The design procedure comprises a constrained optimization-based loop shaping. It utilizes the multi-input multi-output (MIMO) nonparametric model of the system along with a high-order, linearly parameterized MIMO controller to form an open-loop transfer function matrix. By minimizing the second norm of the error between the open-loop transfer function matrix and a desired one, the coefficients of the controller are optimally determined. Through several reference-tracking scenarios, the performance of the proposed vector controller is evaluated both by time-domain simulation studies in MATLAB/Simulink and by experimental results.
Technologies like voice, video and FTP have made Mobile Ad hoc Networks (MANETs) important in real life. A MANET is a distinctive type of ad-hoc network whose stations change their locations frequently and configure themselves automatically. Nodes can move freely in a MANET while transmitting and receiving data traffic over wireless radio waves. Consequently, routing has become a vital factor and a key challenge for finding the best route in many MANET environments. The main focus of this paper is to analyze the impact of real-time traffic (voice, video conferencing) and non-real-time traffic (HTTP, FTP, email) on four routing protocols, Optimized Link State Routing (OLSR), Dynamic Source Routing (DSR), Ad-hoc On-demand Distance Vector (AODV) and the Geographical Routing Protocol (GRP), over MANETs, and also to investigate their performance based on average end-to-end delay and throughput. Extensive simulation results show that the OLSR protocol produces the highest throughput and the lowest delay compared to the other protocols for non-real-time and real-time video traffic, and that OLSR also produces the lowest delay and acceptable throughput for real-time voice traffic.
Cloud computing is a computing paradigm shift in which computing is moved away from personal computers or an individual application server to a "cloud" of computers. Users of the cloud only need to be concerned with the computing service being requested, as the underlying details of how it is achieved are hidden. This method of distributed computing is realized by pooling all computing resources together and managing them with software rather than by humans. The services requested of a cloud are not limited to web applications; they can also be IT management tasks such as requesting systems, a software stack or a specific web appliance.
A Vehicular Ad-Hoc Network, or VANET, is a technology that uses moving cars as nodes to create a mobile network. Communication is typically over Dedicated Short Range Communications (DSRC) at 5.9 GHz; an example protocol is IEEE 802.11p. VANETs are sparse ad hoc networks in which no contemporaneous path exists between source and destination most of the time. In a VANET, the connectivity graph of the network changes over time, either due to mobility or due to the sleep-wake cycles of the nodes, so routing protocols proposed for VANETs follow the 'store-carry-forward' paradigm, in which two nodes exchange messages with each other only when they come into contact. Vehicle connectivity can fairly be considered a future killer application which will add value to the car industry. We develop a multi-copy routing protocol for anycasting in VANETs which uses network coding to reduce communication overhead.
The main aim of the automotive industry is to increase vehicle safety and to design more precise active safety systems that alert the occupants before a collision. In this paper, the designed system analyzes the drowsiness and unconsciousness of the driver using active sensor data. The active sensors (a wheel grip sensor and an IR sensor) sense the wheel grip under drowsiness, and the infrared (IR) sensor senses the position and sitting condition of the driver. The sensor data are analyzed by the embedded system (microcontroller) algorithm, which gives the state of consciousness of the driver. GPS and GSM are interfaced with the microcontroller in this project to track the exact vehicle location, and V2I (vehicle-to-infrastructure) communication gives passive safety after a collision. This implementation gives better outcomes compared to conventional drowsiness detection systems such as eye-tracking and lane detection with image processing.
A Mobile Ad-hoc Network is an autonomous system of mobile nodes which can move freely and communicate with each other without a fixed infrastructure. These nodes work either as routers or as hosts. In a MANET there is no centralized controlling authority and the topology is not static, so this network is more vulnerable than wired and other wireless networks. Many protocols in MANETs, such as AODV, work in an on-demand fashion. A rushing attacker exploits the duplicate suppression mechanism of AODV to perform the attack. In this paper we discuss the rushing attack and its prevention technique: by modifying some properties of AODV, the attack can be avoided or its effect reduced. We show the results of the prevention and its effect for different network sizes and different numbers of attackers.
Nowadays, as technology has advanced tremendously, the need for aircraft in the field of defense aviation has increased greatly, which has led to the development of light combat fighter aircraft. This work designs and analyzes the delta-wing payloads of the tailless, compound delta-wing indigenous 4th-generation aircraft (TEJAS). It includes the design of a supersonic delta wing with a customized airfoil according to the aircraft performance requirements, and integrates technologies such as relaxed static stability, a fly-by-wire flight control system, multi-mode radar, an integrated digital avionics system, composite material structures and a flat-rated engine. The aircraft is supersonic and highly maneuverable, and is the smallest and lightest in its class of contemporary combat aircraft. The delta wing structure is designed using CATIA software and includes payloads such as drop tanks and missiles. Meshing is done in HyperMesh, and static analyses of the delta wing with payloads, including the weights for different cases, are carried out.
The objective of this paper was the design, development and analysis of a propeller using a bamboo composite fibre material for contra-rotating advanced airboat propellers; the airboat propeller shows the characteristics of its non-aviation usage. The propeller design has broad, flattened blades and squared-off tips because the aerodynamics favor the high drag that is useful for slowing an airboat. A simple method of predicting the performance of a propeller is Blade Element Theory. In this method the propeller is divided into a number of independent sections along its length. At each section a force balance is applied involving the 2D section lift and drag together with the thrust and torque produced by the section, while at the same time a balance of axial and angular momentum is applied. This produces a set of nonlinear equations that can be solved by iteration for each blade section. The resulting values of section thrust and torque can be summed to predict the overall performance of the propeller.
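A highly simplified sketch of this procedure is given below; it omits the induced-flow (momentum) iteration and assumes constant chord, constant pitch and a thin-airfoil lift slope, with all numeric inputs being illustrative rather than taken from the paper:

import math

def bet_thrust_torque(rpm, v_axial, radius, chord, n_blades=2, rho=1.225,
                      n_sections=20, cl_alpha=5.7, pitch=math.radians(12), cd0=0.02):
    # Very simplified blade element sketch: no induced-flow iteration,
    # constant chord and pitch; every input here is illustrative only.
    omega = rpm * 2.0 * math.pi / 60.0
    dr = radius / n_sections
    thrust = torque = 0.0
    for i in range(n_sections):
        r = (i + 0.5) * dr
        u_t, u_a = omega * r, v_axial            # tangential and axial velocities
        phi = math.atan2(u_a, u_t)               # inflow angle
        alpha = pitch - phi                      # section angle of attack
        w2 = u_t ** 2 + u_a ** 2
        cl = cl_alpha * alpha
        cd = cd0 + 0.05 * cl ** 2
        dl = 0.5 * rho * w2 * chord * cl * dr    # section lift
        dd = 0.5 * rho * w2 * chord * cd * dr    # section drag
        thrust += n_blades * (dl * math.cos(phi) - dd * math.sin(phi))
        torque += n_blades * (dl * math.sin(phi) + dd * math.cos(phi)) * r
    return thrust, torque

print(bet_thrust_torque(rpm=2400, v_axial=5.0, radius=0.9, chord=0.12))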
Buckling of a stiffened composite fuselage cylinder is a very complex phenomenon that involves intricate interactions between the skin and the stiffeners. Depending on the configurations of the skin and stiffeners, different buckling failure modes and failure loads are observed in stiffened cylinders. In this work the failure modes and buckling loads of stiffened composite cylinders under uniaxial loading are investigated using analytical and experimental approaches. Initially, a model is developed for the buckling problem of an isogrid-stiffened composite cylinder. In this model the stiffness contributions of the stiffeners are computed by analyzing the moment and force effects of the stiffener on the unit cell. The equivalent stiffness of the stiffener/shell panel is computed by superimposing the stiffness contributions of the stiffeners and the shell. Once the equivalent stiffness parameters are determined for the whole panel, the buckling load is calculated using the energy method. A 3-D finite-element model was also built which takes into consideration the exact geometric configuration and the orthotropic properties of the stiffeners and the shell. Based on the finite-element model, the different buckling failure modes observed are discussed. The results of these analysis methods are compared, comments are made on the reliability of the analytical models developed, and finally a parametric study is carried out and general conclusions are drawn regarding the optimum configurations of the different parameters of the grid-stiffened cylinder.
This paper proposes a hybrid cascaded H-bridge (CHB) multilevel grid to reduce the multiple dc-ac-dc or ac-dc-ac conversion processes in an individual ac or dc grid. The CHB multilevel grid consists of both ac and dc networks connected together by multiple bidirectional CHB multilevel converters. AC sources and loads are connected to the ac network, whereas dc sources and loads are tied to the dc network; energy storage systems can be connected to the dc or ac links. The CHB multilevel inverters increase the output voltage level and enhance power quality. The hybrid power system (HPS) employs fuel cell (FC) and photovoltaic sources as the main power sources and supercapacitors as the complementary power sources. Fast transient response, high performance, high power density and low FC fuel consumption are the main advantages of the proposed HPS. The proposed control strategy consists of a power management unit for the HPS and a voltage controller for the CHB multilevel inverter. Each distributed generation unit employs a multi-proportional-resonant controller to regulate the bus voltages even when the loads are unbalanced and/or nonlinear. The proposed CHB multilevel grid can operate in a grid-tied or autonomous mode, and coordination control algorithms are proposed for smooth power transfer between the ac and dc links and for stable system operation under various generation and load conditions. The CHB multilevel grid system operates under normal conditions, which include normal room temperature in the case of solar energy and normal operation of the PV, FC, MTG, BES and SC units. Power-balancing control simulation results are presented to illustrate the operating principle, feasibility and reliability of the proposed CHB multilevel grid system.
Cloud computing aims to enable end-users to easily create and use software without needing to worry about the technical implementation, the software's physical hosting location, hardware specifications, or the efficiency of data processing. Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services). Users can store their data remotely without maintaining a local copy, so the integrity of the data is a major problem in cloud storage. Recently, many works have focused on providing data dynamics and public verifiability for checking remote integrity with the help of third-party verifiers. Integrity in cloud computing is achieved by the Boneh-Lynn-Shacham (BLS) signing algorithm, which signs the data blocks before the data are sent to the cloud. To remove the online burden on the user, we provide public auditability for cloud data storage, using an external audit party to check the integrity of the outsourced data. We introduce a third-party auditor (TPA) to audit the outsourced data without demanding a local copy from the user. There is no additional online burden for the cloud user, which is achieved by using a privacy-preserving, public-key-based homomorphic authenticator. The homomorphic authenticator efficiently supports public-key-based auditability without requiring retrieval of the data blocks.
Using surveillance systems, we obtain video-level processing techniques so that items or objects can be recognized in any video file. Many developed countries, as well as some developing countries, are making use of farm surveillance systems so that a farm can be analyzed from any place in the world. In this work we capture videos from a farm surveillance system and, on that basis, detect animals; when the camera identifies an animal, an alarm rings. This can be very useful for protecting the farm from crop damage by animals. The paper gives a concise examination of diverse object recognition techniques and of many background subtraction techniques such as the Kalman filter, frame differencing, the optical flow method, the mixture of Gaussians model, and the combination of GMM and optical flow. Further, to identify an object as an animal there are special techniques like contour-based methods, template matching, edge-based methods and skeleton extraction. After a survey of these methods, and by combining their best attributes, a system for animal detection is proposed. We use the normalized cross-correlation method for template matching so that an object can be identified as an animal. The proposed system uses the combination of GMM, template matching and the optical flow method for background subtraction.
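A minimal sketch of normalized cross-correlation template matching with OpenCV is shown below; the file names and the detection threshold are placeholders, not values from the paper:

import cv2

# Placeholder file names; in practice the frame comes from the surveillance feed
# and the template is a reference image of the animal to detect.
frame = cv2.imread("farm_frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("animal_template.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation over all template positions in the frame.
result = cv2.matchTemplate(frame, template, cv2.TM_CCORR_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.85:                      # illustrative threshold
    h, w = template.shape
    cv2.rectangle(frame, max_loc, (max_loc[0] + w, max_loc[1] + h), 255, 2)
    print("animal detected, raise alarm", max_val)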
In recent years composites have attracted substantial importance as potential structural materials, owing to the basic and common attractive features that make them useful for industrial applications. Conventional materials alone are not enough to meet dynamic requirements, so by combining traditional materials with non-traditional materials, hybrid properties can be achieved; this is the origin of composite materials. The objective of the present work is to use the industrial waste slag as a particulate filler material in epoxy matrix composites, produced by a molding technique with different weight fractions (0%, 5%, 10%, 15%, 20%), in order to study the mechanical behaviour of the reinforced polymer composite material. The effect of the slag weight fraction is studied under different tests, such as tensile, bending and impact tests. The results help predict the mechanical behavior of the composite, and the various slag contents resulted in better mechanical properties. The composite can be regarded as a useful lightweight engineering material.
The design and analysis of propellant grain configurations for determining the grain geometry is an important and critical step in the design of solid propellant rocket motors, because accurate calculation of grain geometrical properties plays a vital role in performance prediction. The performance of a solid rocket motor can be predicted easily if the burn-back steps of the grain are known. In this study, grain burn-back analysis for 3-D star grain geometries for a solid rocket motor was investigated. The design process involves parametric modeling of the geometry in CATIA software through dynamic variables that define the complex configuration. The initial geometry is defined in the form of a surface which describes the grain configuration. Grain burn-back is achieved by making new surfaces at each web increment and calculating the geometrical properties at each step. The equilibrium pressure method is used to calculate the internal ballistics. The procedure adopted can be applied to any complex geometry in a relatively simple way for the preliminary design of a grain configuration. As the propellant burns, the area of the remaining propellant reduces, which changes the pressure in the solid rocket motor with respect to time; this change in pressure causes a variation in mass flow rate, and in this paper the variation of thrust with respect to time is calculated. The areas of the grain are found using MATLAB with a 0.05 mm step, which gives the area of the remaining grain in the solid rocket motor. The numerical results from CATIA are checked against MATLAB to verify the correct area of the remaining propellant.
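A minimal sketch of the equilibrium pressure step is given below; the propellant constants and burn areas are illustrative placeholders, not data for the motor studied in the paper. At each web increment, chamber pressure balances the mass generated by burning (rho_p * a * Pc^n * Ab) against the mass expelled through the throat (Pc * At / c*), which gives Pc = (rho_p * a * c* * Ab / At) ** (1 / (1 - n)):

rho_p = 1750.0        # propellant density, kg/m^3 (placeholder)
a = 5.0e-5            # burn-rate coefficient, m/s per Pa^n (placeholder)
n = 0.35              # burn-rate exponent (placeholder)
c_star = 1500.0       # characteristic velocity, m/s (placeholder)
A_t = 0.01            # throat area, m^2 (placeholder)

def equilibrium_pressure(burn_area):
    # Equilibrium chamber pressure for a given instantaneous burn area.
    return (rho_p * a * c_star * burn_area / A_t) ** (1.0 / (1.0 - n))

# Burn areas produced by the CATIA/MATLAB burn-back steps would be fed in here.
for A_b in (0.8, 1.0, 1.2):            # m^2, example web increments
    print(f"Ab = {A_b:.2f} m^2 -> Pc = {equilibrium_pressure(A_b) / 1e6:.2f} MPa")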
Transparent BIST schemes for RAM modules ensure that the memory contents are preserved during periodic testing. Symmetric transparent built-in self-test (BIST) schemes skip the signature-prediction phase required in traditional transparent BIST, achieving a considerable reduction in test time. Previous work on both the offline MARCH-C testing scheme and online symmetric transparent BIST schemes requires a separate BIST module for each RAM under test. Given the large number of memories in current chips, this approach increases the hardware overhead of the BIST circuitry. In this paper we propose a symmetric transparent online BIST scheme that can test RAMs of different word widths; hence, several RAMs can be tested in a roving manner, and in this paper five RAMs with word lengths from 3 to 7 bits are tested. For typical memory configurations, the hardware required by the proposed scheme is smaller than that of the previously proposed symmetric transparent schemes.
This study explores the load-settlement behavior of six footing specimens of different shapes (square, circular, rectangular, hexagonal, octagonal and triangular) under monotonic and incremental cyclic loading. For this purpose, footing specimens with a surface area of 150 cm² and a plate thickness of 8 mm were used, with their dimensions chosen accordingly, and were tested on yellow soil. Monotonic and cyclic loading are two parameters considered in foundation design, so monotonic and cyclic loading tests were conducted on all the specimens and load intensity versus settlement curves were plotted. A tank of size 125 cm x 75 cm x 45 cm was filled up to 40 cm with yellow soil in four layers of 10 cm. The same load was applied to each specimen, the settlement was noted, and the curves were drawn; the plotted curves give comparative results for the six footing specimens under monotonic and incremental cyclic loading. The square footing specimen shows the least settlement and the triangular footing specimen the highest, while the other specimens give intermediate settlements; that is, the load-carrying capacity of the square footing specimen is higher than that of the other specimens.
The objective of the project is to design and analyze a fitting for the adjustable horizontal tail of a sports utility aircraft. In designing the fittings, the modeling strategy adopted is "relational design", so that every part is related to the aerodynamic datum surfaces provided by the organization. The parts are designed in CATIA V5. After the design, analysis of the fitting is carried out in MSC Nastran/Patran. The loads applied are the reactions acting on the fitting resulting from the aerodynamic loads on the whole horizontal tail; for a fail-proof design, 2-sigma loading conditions are considered. The FEM analysis is also validated by classical methods, and the margins of safety are calculated and shown to be above zero. Finally, drafting is done in CATIA V5 after the dimensions are finalized.
Data mining is the process of extracting potentially useful knowledge from raw data; in the context of databases this is usually referred to as the knowledge discovery process. Recently, ontologies have come into the picture as an integral part of knowledge structuring for creating knowledge-intensive systems. An ontology is an explicit formal conceptualization of some domain of interest which helps in the portrayal of concepts and their relationships for that domain. To build an ontology, one needs a domain expert who declares all the domain concepts and the relationships between them for a specialized domain. This paper presents the problem of assessing a given ontology against a particular criterion of an application, typically by determining which of several ontologies would best suit the current application domain. It also focuses on techniques that incorporate an ontology during the data mining process. It further proposes a methodology for building an ontology from the output of a data mining result, and the effects of the generated ontology on improving the data mining process are studied.
Map-Reduce is a widely used model for data-parallel applications that enables easy development of scalable programs on clusters of commodity machines. Advances in disk capacity have greatly outpaced those in disk access time and bandwidth, so disk-based systems find it increasingly difficult to meet the demands of a cluster-based system. A cache memory has a faster data access rate than an ordinary disk. In this paper we implement a method to improve the performance of Map-Reduce by using a distributed in-memory cache as middleware between the map and reduce phases, backed by a secured memcached server. Moreover, the approach performs partial combining of data within the map task itself, so-called partial pre-shuffling. The project also provides assurance of intermediate key-value pairs via a web interface. The results of the overall setup were promising on a Hadoop cluster.
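To make the idea of partial pre-shuffling concrete, here is a toy word-count sketch in Python; an ordinary in-process dictionary stands in for the memcached layer described above, and all names and data are illustrative rather than taken from the paper's implementation.

    from collections import defaultdict

    def map_with_partial_combine(records):
        # Partial pre-shuffling: combine key counts inside the map task itself,
        # so far fewer intermediate key-value pairs reach the shuffle/cache layer.
        local = defaultdict(int)
        for line in records:
            for word in line.split():
                local[word] += 1
        return dict(local)

    def shuffle(cache, partials):
        # Stand-in for pushing intermediate pairs to the distributed cache (memcached).
        for partial in partials:
            for key, count in partial.items():
                cache.setdefault(key, []).append(count)

    def reduce_phase(cache):
        return {key: sum(counts) for key, counts in cache.items()}

    splits = [["big data big cluster"], ["data cache cache"]]
    cache = {}                       # in-memory stand-in for the memcached server
    shuffle(cache, [map_with_partial_combine(s) for s in splits])
    print(reduce_phase(cache))       # {'big': 2, 'data': 2, 'cluster': 1, 'cache': 2}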
In this paper I have tried to show the various types of CNTFETs manufactured so far, their manufacturing processes, and their various applications and performance, as well as their use in various logic gates.
Power transmission is a crucial part of a country's power supply. Continuous assessment and inspection can effectively improve the overall productivity of these energy systems, and deploying mobile robots can provide an effective solution to this problem. In this paper, inspection tasks performed by a robot that carries detection instruments and rolls/crawls along the overhead ground wires are considered. Because of the various uncertainties in the inertial properties of the transmission line and the robot, obtaining an accurate mathematical model of the system is difficult, which in turn increases the complexity of control. However, coupling the dynamics of the robot and its moving path can give better control over motion precision. This paper develops a dynamic model of the cable-maneuvering robot that enables traction control and centroid compensation. These analyses help in understanding the system's stability and achieving effective control. Path planning and obstacle avoidance schemes are developed based on these results.
Jamuna River is characterized by its extremely dynamic and unstable alluvial channels. This 240km long braided river is located in lowest reach of the Brahmaputra River in Bangladesh. The Brahmaputra River catchment supplies enormous quantities of sediment from the actively uplifting mountains in the Himalayas, its erosive foothills and the great alluvial deposits stored in the Assam Valley. Large discharge and heavy sediment load during floods causes the Jamuna River to be extremely unstable, because of which it consistently migrates laterally. This tendency of lateral channel migration, results in erosion of the river banks which causes severe problem to the people living in the floodplain of the river Jamuna. This particular study was carried out using the satellite imagery of the last decade as well as some old historical images to show the pattern and extent of channel migration and bank erosion of the Jamuna River. A comprehensive analysis was carried out in this study using the state of the art GIS technology to identify the vulnerable reaches of the river in respect of bank erosion.
Ginger (Zingiber officinale) is used traditionally for many therapeutic purposes. The aim of this study was to investigate the possible hepatoprotective role of ginger against leaded-gasoline-induced hepatotoxicity in rats. Sixty adult male albino rats (120-150 g) were divided into ten groups (n=6). Group 1 served as the control. Groups 2-5 inhaled leaded gasoline at a nominal concentration of 18.18 ppm for 3, 6, 9 and 12 hours/day, respectively, for 14 successive days. Group 6 orally received 100 mg/kg ginger per day for 14 days. Groups 7-10 inhaled gasoline under the same conditions and exposure times as groups 2-5 and, in addition, orally received 100 mg/kg ginger during the 14 days of exposure. After sacrifice, the livers of the rats were taken for histological preparation. The results showed that subchronic exposure to gasoline produced changes in some hepatocytes, mainly hydropic degeneration, necrosis and occasional fatty change, together with congestion of the central vein and infiltration of inflammatory cells in the portal area. It can be concluded that ginger administration (100 mg/kg) shows a mild hepatoprotective action against gasoline-induced histological liver damage in rats.
Air pollution remains an ongoing area of concern, posing a considerable danger to health throughout the world (WHO, 2005). For this reason, air pollution issues and prevention approaches are an essential area of study, more than ever in developing countries such as Malaysia. An experimental air pollution potential forecasting study for the Klang Valley, Malaysia, from January to December 2009 is described and analysed. In this study, the effects of wind speed, rainfall, temperature and the stability of the lower atmosphere on air pollution potential are evaluated, with the Joukoff and Malet (1982) model used as the initial model. These meteorological factors are first used to calculate the meteorological air pollution potential (MPI). The MPI is then evaluated against the meteorological factors using time-series analysis and linear regression. The results discourage the use of the aforementioned meteorological factors, except temperature, for forecasting air pollution potential in Malaysia and in regions with similar climate characteristics. Finally, a model for forecasting air pollution potential that considers temperature as the most effective factor is developed for application in Malaysia.
Data mining is one of the fastest-growing areas today; it is used to extract important knowledge from large data collections, but these collections are often divided among several parties. This paper addresses secure mining of association rules over horizontally partitioned data. The method incorporates the protocol of Kantarcioglu and Clifton, well known as the K&C protocol. This protocol is based on an unsecured distributed version of the Apriori algorithm, the Fast Distributed Mining (FDM) algorithm of Cheung et al. The main ingredients of the protocol are two novel secure multi-party algorithms: one that computes the union of the private subsets held by each of the interacting players, and another that tests whether an element held by one player is included in a subset held by another. The protocol offers enhanced privacy with respect to earlier protocols. In addition, it is simpler and significantly more efficient in terms of communication cost, communication rounds and computational cost. We present a multiparty algorithm for efficiently discovering frequent itemsets with minimum support levels without any player (site) revealing its private data to the other players.
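As background for the FDM structure used above, here is a small non-secure sketch in Python of the underlying skeleton: each site computes its locally frequent itemsets with Apriori-style support counting, the union of candidates is formed, and global support is then checked. The secure union and secure inclusion tests that the K&C protocol adds are deliberately not reproduced here, and the transaction data are toy values.

    from itertools import combinations

    def local_support(transactions, itemset):
        return sum(1 for t in transactions if itemset <= t)

    def locally_frequent(transactions, min_frac, size=2):
        # Candidate itemsets of a fixed size that meet the local support threshold.
        items = sorted({i for t in transactions for i in t})
        out = set()
        for combo in combinations(items, size):
            s = frozenset(combo)
            if local_support(transactions, s) >= min_frac * len(transactions):
                out.add(s)
        return out

    # Two "sites" holding horizontally partitioned transaction data (toy data).
    site1 = [frozenset(t) for t in [{"a", "b"}, {"a", "b", "c"}, {"b", "c"}]]
    site2 = [frozenset(t) for t in [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}]]
    min_frac = 0.5

    # In FDM the union of locally frequent candidates is exchanged
    # (securely, in the K&C protocol).
    candidates = locally_frequent(site1, min_frac) | locally_frequent(site2, min_frac)

    # Global support check over all partitions.
    all_tx = site1 + site2
    globally_frequent = {c for c in candidates
                         if local_support(all_tx, c) >= min_frac * len(all_tx)}
    print(sorted(tuple(sorted(c)) for c in globally_frequent))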
Physical impairment can limit the physical function or fine/gross motor ability of an individual's limbs; such an individual is then considered handicapped. For individuals with loss of limbs and next to no residual capacity, it is very difficult to take part in daily activities as well as in employment, education, independent living, etc. This paper describes the replacement of the upper limb of an amputee by a myoelectric prosthetic arm. Prosthetics is a part of rehabilitation engineering, whose goal is the reintegration of individuals with impairments into society. The primary purpose of an arm prosthesis is to mimic the appearance and replace the function of a missing limb. It requires effective use of assistive systems to restore the motor functions of an amputee, and it should also be cosmetically appealing. These requirements, and advances in science and technology, have led to the development of externally powered prostheses that interface directly with the neuromuscular system and recreate some of a normal hand's sophisticated proprioceptive control. The concept of extended physiologic proprioception (EPP) is introduced for the control of gross arm movement, whereby the central nervous system is retrained through residual proprioception to coordinate gross actions in the geometry of the new extended limb. The device, known as a myoelectric arm, is based on biological electronic sensors, is battery operated, is controlled by a microprocessor and is driven by motors. Once attached, the prosthesis uses electronic sensors to detect minute muscle, nerve and EMG activity. The muscle activity triggered by the user is translated, via a microcontroller and a filtering circuit, into information that the electric motors use to control the artificial limb's movements through a motor driver IC. The end result is that the artificial limb moves much like a natural limb, according to the mental stimulus of the user. The myoelectric artificial limb does not require any unwieldy straps or harnesses to function; instead, it is custom made to fit and attach to the remaining limb. Myoelectric hand/arm components perform better than conventional prostheses in terms of function, weight, comfort and cosmetics.
The demand for an antenna in the present scenario depends on its size, and miniaturization has therefore become a necessity. Microstrip patch antennas meet this requirement, are versatile in terms of realization and are easy to fabricate. This paper presents a circular microstrip patch antenna at 1.5 GHz designed using Computer Simulation Technology (CST) Microwave Studio. We focus on optimizing the size, efficiency, return loss and other parameters of the circular patch antenna by varying its feed length. These parameters were also computed manually, and finally the results obtained for different feed lengths of the circular microstrip patch antenna are compared.
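For the manual computation step mentioned above, the standard circular-patch design equation (as given in textbook treatments such as Balanis) can be evaluated directly; the sketch below uses placeholder substrate values, since the abstract does not list the substrate, and is not the paper's CST model.

    import math

    f_r = 1.5e9        # resonant frequency, Hz
    eps_r = 4.4        # relative permittivity (placeholder, e.g. FR-4)
    h_cm = 0.16        # substrate height in cm (placeholder)

    # Standard circular-patch design equations (dimensions in cm):
    #   F = 8.791e9 / (f_r * sqrt(eps_r))
    #   a = F / sqrt(1 + (2h / (pi*eps_r*F)) * (ln(pi*F/(2h)) + 1.7726))
    F = 8.791e9 / (f_r * math.sqrt(eps_r))
    a = F / math.sqrt(1 + (2 * h_cm / (math.pi * eps_r * F))
                      * (math.log(math.pi * F / (2 * h_cm)) + 1.7726))
    print(f"physical patch radius approx. {a:.2f} cm")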
The emerging pay-per-use paradigm of cloud computing has led to the outsourcing of many business-critical functions to cloud service providers. Delegation of computation is a useful concept only insofar as the results of the computation can be verified. Such verifiability has been achieved through interactive proof systems, which lead to high communication costs; other methods rely on cryptographic primitives that impose a high computation cost, rendering them unsuitable for mobile clients. We propose a protocol with reduced communication cost for lightweight devices that uses cryptographic primitives to generate proofs. The proof-generating function is incorporated into the delegated function to make it fool-proof.
In this paper we present the design of single-feed, compact, single-layer rectangular, square and circular microstrip patch antennas for WLAN and WiMAX applications. A comparative study has been carried out to compare various parameters of these patch antennas with and without slots. The design and simulation were done with the EM simulation software IE3D. The antennas are designed on a Rogers ULTRALAM substrate with a dielectric constant of 2.5, and the comparison covers factors such as bandwidth, return loss and gain.
"Cloud computing" is the next natural step in the evolution of on-demand information technology services and product.It is expected to provide quick, agile, stable, reliable services. Even though cloud has simplified cloudnetwork... more
"Cloud computing" is the next natural step in the evolution of on-demand information technology services and product.It is expected to provide quick, agile, stable, reliable services. Even though cloud has simplified cloudnetwork architecture, there is some complexity. Some of them are portability and interoperability between different Cloud Computing Services Providers. These problems handicap the widely deployment and quick development of cloudComputing. As a solution to these problems we put forward an inevitable approach "Open Cloud Computing Federation "for the wide use of cloud computing and to realize the value of it. In this paper we propose the MABOCCF mechanism. It will combinethe advantages ofMobile Agents andCloud Computing.ThusMABOCCFcan span over multiple heterogeneous Cloud Computing Platforms and realizes portability and interloper ability.In this paper we also present how cloud schedules user jobs and motivation for the combination of Mobile Agents and Cloud Computing.
Security in a Mobile Ad-Hoc Network (MANET) is the most important concern for the basic functionality of the network. Availability of network services, and confidentiality and integrity of the data, can be achieved by assuring that security issues have been addressed. MANETs often suffer from security attacks because of their features: an open medium, dynamically changing topology, lack of central monitoring and management, cooperative algorithms and no clear defense mechanism. These factors have changed the battlefield situation for MANETs against security threats.
Network-based technology and cloud computing are becoming more popular every day as many enterprise applications and data move into cloud or network-based platforms. Because of their distributed and easily accessible nature, these services are provided over the Internet using known networking protocols, standards and formats, under the supervision of various management tools and programming languages. Existing bugs and vulnerabilities in the underlying technologies and legacy protocols tend to open doors for intrusion, so attacks such as denial of service (DDoS), buffer overflows, sniffer attacks and application-layer attacks have become common. Recent security incidents and analyses show that manual responses to such attacks are no longer feasible. Internet and network applications face various types of attacks every day. Firewalls and spam filters are in place, but they have simple rules, such as allowing or denying protocols, ports or IP addresses, and some DoS and other attacks are too complex for today's firewalls, so firewalls alone cannot prevent all attacks. In this paper we define and discuss the various types and techniques of intrusion detection and intrusion prevention, the IDS tools that are employed to detect these attacks, and some open-source tools for intrusion prevention and detection and how they can be used in a system.
Renewable energy and energy efficiency are sometimes called the "twin pillars" of sustainable energy policy. Both resources must be developed in order to stabilize and reduce carbon dioxide emissions. Efficiency slows energy demand growth so that rising clean energy supplies can make deep cuts in fossil fuel use, using computer-based remote control and automation. Together they offer many benefits to utilities and consumers, mostly seen as large improvements in energy efficiency on the electricity grid and in users' homes and offices. Although solar, wind and biomass energy are three of the most viable renewable energy sources, little research has been done on operating these sources alongside one another to take advantage of their complementary characters. A solar thermal power plant (STPP) and a wind plant cannot operate stably and continuously on their own because of the variability of solar irradiation. We propose an optimal operating mode for a hybrid STPP-wind power plant with biomass for continuous electricity generation, and develop an optimal design for the hybrid solar-wind-biomass plant in which the design variables are optimized with the goal of minimizing cost. The system is first designed as a stand-alone system; when production exceeds consumption, the surplus is fed back to the grid. Simulation studies and sensitivity analysis reveal that the hybrid plant is able to exploit the complementary nature of the three energy sources and deliver energy reliably throughout the year. Operating alongside the regular power supply grid and synchronizing with it provides an excellent power backup, enhances sustainable energy production and reduces harmful effects such as global warming.
The Second Generation Digital Video Broadcasting for Satellite systems standard (DVB-S2) is the successor of DVB-S, the broadcast standard for television. The available encapsulation options for both the first and the second generation are MPE and ULE, which support fixed-size TS packets. For DVB-S2 links, GSE provides native compatibility with BaseBand Frames (BBFs) and avoids the double overhead of TS-over-BBF encapsulation. Transmitting variable-length network-layer (IP) packets over satellite links with fixed frame lengths requires IP encapsulation. This survey paper presents these three encapsulation protocols for DVB-S2 and compares MPE, ULE and GSE, presenting their manually calculated efficiencies against theoretical efficiency simulations.
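The manual efficiency calculation referred to above reduces, in its simplest form, to payload divided by payload plus per-packet overhead. The sketch below uses commonly quoted nominal overhead figures (ULE: 4-byte header plus 4-byte CRC; MPE: section header plus CRC; GSE: an unfragmented PDU with a 6-byte label) purely as placeholders, and it ignores TS framing and BBFrame padding; the exact values per configuration should be taken from the standards rather than from this sketch.

    # Encapsulation efficiency = IP payload bytes / total bytes carried on the link.
    # Per-packet overheads below are illustrative nominal figures only.
    OVERHEAD_BYTES = {"MPE": 16, "ULE": 8, "GSE": 10}

    def efficiency(ip_packet_size, scheme):
        oh = OVERHEAD_BYTES[scheme]
        return ip_packet_size / (ip_packet_size + oh)

    for size in (40, 576, 1500):          # typical IP packet sizes in bytes
        row = ", ".join(f"{s}: {efficiency(size, s):.3f}" for s in OVERHEAD_BYTES)
        print(f"IP {size:4d} B -> {row}")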
This abstract discusses a simple presentation of the seven product rules of Strassen's matrix multiplication algorithm.
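For concreteness, the seven Strassen products for a 2x2 (block) matrix multiplication are sketched below in Python; plain numbers stand in for the matrix blocks, and the recursion over larger matrices is omitted.

    def strassen_2x2(A, B):
        # A and B are 2x2 matrices given as ((a11, a12), (a21, a22)).
        (a11, a12), (a21, a22) = A
        (b11, b12), (b21, b22) = B

        # The seven Strassen products (instead of the usual eight multiplications).
        m1 = (a11 + a22) * (b11 + b22)
        m2 = (a21 + a22) * b11
        m3 = a11 * (b12 - b22)
        m4 = a22 * (b21 - b11)
        m5 = (a11 + a12) * b22
        m6 = (a21 - a11) * (b11 + b12)
        m7 = (a12 - a22) * (b21 + b22)

        # Recombination into the four blocks of the product.
        c11 = m1 + m4 - m5 + m7
        c12 = m3 + m5
        c21 = m2 + m4
        c22 = m1 - m2 + m3 + m6
        return ((c11, c12), (c21, c22))

    print(strassen_2x2(((1, 2), (3, 4)), ((5, 6), (7, 8))))   # ((19, 22), (43, 50))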
This study intends to provide a better understanding of the Last Planner System (LPS) which is a Lean Construction concept by analyzing the various schedule systems involved in the concept. The Last Planner System (LPS) is a production control system for managing projects. It supplements or replaces a typical management system based on activities and a defined schedule produced by a project manager. The LPS produces predictable workflow and rapid learning. This produces maximum value to the owner by eliminating waste caused by unpredictable workflow. Its use has enabled contractors to reduce the delivery time of a project and at the same time allowed specialty contractors to improve utilization of their resources. This paper focuses on implementing Last Planner System in residential construction by comparing the present scheduling techniques used in the industry. The data is collected through questionnaire survey. A total of 25 respondents are interviewed and the results are analyzed using the software Statistical Package for Social Sciences (SPSS). The results indicated that the respondents are not familiar with the LPS concept. The residential contractors are presently following the Master Schedule method to track their projects. Previous usages of LPS in construction projects proved that the system helps to improve the schedule performance and to avert the possible mistakes. It was concluded that with effective training in the concepts of LPS, the builders can overcome schedule delay and can improve the standards of the projects.
Cognitive radio networks have been proposed to solve the problems in wireless networks caused by the limited available spectrum and spectrum inefficiency. However, they impose unique challenges because of the high fluctuation in the available spectrum as well as diverse quality of service requirements of various applications. In this paper, a method for spectrum decision is introduced to determine a set of spectrum bands by considering the channel dynamics in the cognitive radio network as well as the application requirements. First, a novel spectrum capacity model is defined that considers unique features in cognitive radio networks. Based on this capacity model, a minimum variance-based spectrum decision is developed for real-time applications, which minimizes the capacity variance of the decided spectrum bands subject to the capacity constraints. For best effort applications, a maximum capacity-based spectrum decision is proposed where spectrum bands are decided to maximize the total network capacity. Simulation results show the performance of cognitive radio network for real time applications and best-effort applications.
An integrated approach to a precision agriculture system is used for the prediction of soil productivity and soil fertility, with expert knowledge as one of its components for achieving sustainability. The prediction system is implemented using fuzzy logic theory. The methodology consists of the selection of dependent and independent soil parameters, fuzzification, fuzzy inference rules, membership functions and defuzzification. This work shows that a fuzzy inference system is an efficient way to encode expert knowledge and apply it through the fuzzy inference and defuzzification process. The paper develops a fuzzy logic model using the Mamdani fuzzy inference system; the inference rules are framed from expert knowledge in the form of IF...THEN structures. The FIS tool in MATLAB is used to build the prediction model, which predicts soil productivity, as a percentage, from the essential soil parameters.
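To show the Mamdani steps (triangular membership functions, min for rule firing, max aggregation, centroid defuzzification) outside of MATLAB's FIS tool, here is a hand-rolled miniature example in Python; the soil parameters, fuzzy sets and rules are made up for illustration and are not the paper's rule base.

    import numpy as np

    def tri(x, a, b, c):
        # Triangular membership function with corners a, b, c.
        return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                     (c - x) / (c - b + 1e-9)), 0.0)

    # Output universe: soil productivity in percent.
    prod = np.linspace(0, 100, 501)
    prod_low = tri(prod, 0, 0, 50)
    prod_high = tri(prod, 50, 100, 100)

    # Crisp inputs (illustrative): organic carbon % and pH.
    oc, ph = 0.9, 6.8

    # Input membership degrees (made-up fuzzy sets).
    oc_low, oc_high = tri(oc, 0.0, 0.0, 1.0), tri(oc, 0.5, 1.5, 1.5)
    ph_ok = tri(ph, 5.5, 6.5, 7.5)

    # Two IF...THEN rules, fired with min and aggregated with max (Mamdani).
    r1 = min(oc_high, ph_ok)    # IF OC is high AND pH is ok THEN productivity is high
    r2 = oc_low                 # IF OC is low THEN productivity is low
    aggregated = np.maximum(np.minimum(r1, prod_high), np.minimum(r2, prod_low))

    # Centroid defuzzification gives the predicted productivity in percent.
    productivity = (prod * aggregated).sum() / aggregated.sum()
    print(f"predicted soil productivity approx. {productivity:.1f} %")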
Project planning plays a crucial role in implementing a software project and minimizing its cost. Existing models suffer from a large search space and from a lack of flexibility in human resource allocation introduced to simplify the model. In the proposed model, task identification and scheduling are implemented using an Event-Based Scheduler (EBS), while employee efficiency and task allocation are handled through Ant Colony Optimization (ACO). With the EBS, the beginning and end of each task are treated as events, which enables optimized resource utilization. The task precedence graph explored by the ACO is used to find the shortest path, which reflects the most suitable employee for a given task. Combining the EBS and ACO techniques minimizes the search space for employee allocation and hence the cost of the project. The experimental results show that the proposed method is efficient and effective.
This paper is about the importance of the human assets of organizations and their proposed position in the balance sheet. It is often felt that humans, the most important part of every organization, do not receive their real worth and are undervalued in most places. Humans are the most valuable assets, operating all other assets of an organization, whether in government, the public sector, the private sector or an NGO, yet human assets are not given a proper and suitable position. The value of an organization is normally measured by the traditional balance sheet, which is viewed as a sufficient reflection of an organization's current and fixed assets. This traditional valuation has been called into question with the recognition that human assets are a very important part of an organization's total value in the current scenario. This leads to three important questions: 1) How can the value of human assets be assessed effectively and accurately? 2) Where should human assets be shown in the balance sheet? 3) What should the ratio of human assets to total assets be? This paper offers appropriate answers to these questions.
A large amount of short, single-shot videos are created by personal camcorder every day, such as the small video clips in family albums. Thus, a solution for presenting and managing these video clips is highly desired. From the perspective of professionalism and artistry, long-take/shot video, also termed one shot video, is able to present events, persons or scenic spots in an informative manner. In this project a novel video composition system “Video Puzzle” is proposed which generates aesthetically enhanced long-shot videos from short video clips. Automatic composition of several related single shots into a virtual long-take video is done. A novel framework is designed to compose descriptive long-take video with content-consistent shots retrieved from a video pool . For each video, frame-by-frame search is performed over the entire pool to find start-end content correspondences through a coarse-to-fine partial matching process. The content correspondence here is general and can refer to the matched regions or objects, such as human body and face. The content consistency of these correspondences enables us to design several shot transition schemes to seamlessly stitch one shot to another in a spatially and temporally consistent manner. The entire long-take video thus comprises several single shots with consistent contents and fluent transitions. Meanwhile, with the generated matching graph of videos, the proposed system can also provide an efficient video browsing mode.
In the current era of the digital world, users and investigators depend heavily on digital data. Digital data are very large in size and are stored in various formats, so a major problem for the user or investigator is determining whether incoming data are genuine or false. Different methods and techniques are adopted to overcome this problem, and forensic methods are used for the validation of data. A computer forensic method can be used for detecting the different types of forgeries and computer crime, which are major concerns of the digital world; many techniques and methods have been applied in search of a proper solution to these forgery problems. Occurrences of digital crime or forgery are investigated using forensics. Initially, a general survey was carried out to understand the different methods used in computer forensics to track the evidence that could be useful for detecting computer crime and forgery. Data may be changed or tampered with either offline or online; different rule sets and methods are defined to detect the errors arising from changes to, and tampering with, data in different Windows file systems, and this research is chiefly concerned with offline data. Digital evidence, which stores information in digital form, can be used to detect forgery and computer crime. In this paper, a computer forensic method for detecting timestamp forgery in the Windows NTFS file system is presented. The accuracy of timestamp-forgery detection can be further improved by using file attributes such as size and time, and the tool can be used for all types of files.
Security of digital products (audio, video, image and text) on the internet is essential these days, and digital watermarking techniques are used for this purpose. Digital watermarking is the process of embedding data into digital multimedia content; it is used to validate the reliability of the content or to identify the owner of the digital content. Sophisticated watermarking techniques have recently been developed, and they can embed watermarks in printed text, graphics or images as well as in digital images, graphics, audio or video. This survey concerns the DWT (Discrete Wavelet Transform) technique used in watermarking. The DWT is more computationally efficient than other transform methods because of its outstanding localization properties, which offer compatibility with the Human Visual System (HVS).
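As a minimal illustration of DWT-domain watermarking, the sketch below (using the PyWavelets library) additively embeds a watermark in one detail sub-band and extracts it non-blindly; the Haar wavelet, the chosen sub-band and the embedding strength are illustrative choices, not schemes surveyed in the paper.

    import numpy as np
    import pywt

    def embed(image, watermark, alpha=0.05):
        # One-level 2-D Haar DWT; add the watermark to one detail sub-band.
        LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), "haar")
        HL_marked = HL + alpha * watermark
        return pywt.idwt2((LL, (LH, HL_marked, HH)), "haar")

    def extract(marked, original, alpha=0.05):
        # Non-blind extraction: difference of the sub-bands of marked and original.
        _, (_, HL_m, _) = pywt.dwt2(marked.astype(float), "haar")
        _, (_, HL_o, _) = pywt.dwt2(original.astype(float), "haar")
        return (HL_m - HL_o) / alpha

    rng = np.random.default_rng(1)
    host = rng.integers(0, 256, size=(64, 64)).astype(float)
    mark = rng.standard_normal((32, 32))          # sub-band is half the host size
    recovered = extract(embed(host, mark), host)
    print("correlation:", np.corrcoef(mark.ravel(), recovered.ravel())[0, 1])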
This study has been conducted to develop the Water Quality Index (WQI) of the Mahi River based on water quality parameters of DO, MPN, pH, BOD5, COD, Turbidity and Total Dissolved Solids (TDS). The study included surface water quality sampling & its analysis, rating of parameters by experts and determining the water quality using value function graphs. WQI is one of the most effective tools to communicate information on the quality of water to the concerned citizens and policy makers or water resource managers. WQI is calculated from the point of view of the suitability of surface water for fish, wildlife, bathing and other beneficial uses.
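One common aggregation form for such an index is a weighted average of the 0-100 sub-index scores read off the value-function graphs; the sketch below shows that arithmetic with placeholder scores and weights, which are purely illustrative and are not the expert ratings used in the study.

    # Weighted-arithmetic WQI: sum(w_i * q_i) / sum(w_i), where q_i is the 0-100
    # sub-index from the parameter's value-function graph and w_i its expert weight.
    # All numbers below are placeholders for illustration.
    samples = {
        "DO":        {"q": 82, "w": 0.17},
        "MPN":       {"q": 40, "w": 0.15},
        "pH":        {"q": 90, "w": 0.12},
        "BOD5":      {"q": 55, "w": 0.15},
        "COD":       {"q": 60, "w": 0.14},
        "Turbidity": {"q": 70, "w": 0.13},
        "TDS":       {"q": 75, "w": 0.14},
    }

    wqi = sum(p["q"] * p["w"] for p in samples.values()) / \
          sum(p["w"] for p in samples.values())
    print(f"WQI = {wqi:.1f}")   # higher is better on a 0-100 scale in this convention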
Sound attenuation is achieved by either an active or a passive sound attenuator. In active sound attenuation an anti-noise source is used to reduce the intensity of the noise, whereas in passive sound attenuation various sound-absorbing porous materials are used to absorb the sound energy. In a passive sound attenuator, a substantial portion of the mechanical pressure wave penetrates the pores before encountering the solid surface; a large number of interactions take place, energy is transferred to the solid structure through frictional losses, and the sound intensity is reduced. A passive sound attenuator is used in a heating, ventilation and air conditioning (HVAC) duct to reduce the sound produced by the blower. Two porous materials with different sound absorption coefficients are used in the attenuator, which is specifically designed to study the effect at low sound frequencies. The sound frequency is measured with the help of LabVIEW software, the sound intensity with a digital decibel (dB) meter, and the pressure drop with a digital anemometer. Along with the combination of porous materials, the effects of the thickness of the porous absorber and of the angle of the porous absorbers relative to the direction of the air stream are studied.
The present study deals with profit analysis of a parallel system of two identical units by giving priority to preventive maintenance of one unit over replacement of the other. Each unit has two modes- operative and complete failure. A single server is provided immediately to conduct the repair activities whenever needed. The preventive maintenance of the unit is done after a maximum operation time up to which no failure occurs. The failed unit is replaced by new one in case its repair is not possible by the server in a given maximum repair time. The unit performs with full efficiency as new after repair and preventive maintenance. All random variables are statistically independent. The distributions for failure time, replacement time and the rate by which unit undergoes for preventive maintenance are taken as negative exponential while that of  preventive maintenance, repair and replacement rates are assumed as arbitrary with different probability density functions. The semi-Markov process and regenerative point technique are adopted to derive the expressions for some measures of system effectiveness in steady state. The variation of mean time to system failure (MTSF), availability and profit function has been observed graphically for arbitrary values of various parameters and costs.
This article discusses a technique for remote monitoring of greenhouse systems using LabVIEW and the changes it requires in the software and hardware implementation. The paper focuses on achieving high efficiency in real-time monitoring at low cost, which can significantly reduce the workload of greenhouse environment monitoring. The major factors that usually govern the development of greenhouse systems are high efficiency, satisfactory automation, a user-friendly computer interface, manageable computational complexity and profitability, while minimizing unintended effects on the greenhouse environment.
The theoretical study of a coaxial feed and the analysis of such a feed line are presented in this paper. The antenna chosen is a printed antenna, which can support a large variety of feeds, so a comparative study of the antenna's behavior can be made. Assuming the intended application is WLAN, the antenna is designed for 2.4 GHz. The analysis considers the most common and widely used rectangular structure. The substrate is 1.6 mm thick RT/duroid 5880 with a dielectric constant of 2.2. The simulation is carried out using Ansoft's High Frequency Structure Simulator. The designed patch antenna has a return loss of -24 dB, which is acceptable given the scope of the antenna, except for the low gain, which is due to the inclusion of the coaxial feed.
With today's advanced technology, security problems are becoming more important. Security measurement and monitoring help system developers design and assure secure systems. Security metrics are now used in a variety of fields, including the software development process. Security metrics for software products provide a quantitative measure of the degree of trustworthiness of software systems. Good metrics should be specific, measurable, repeatable and time dependent. Besides planning for security so that the software can endure attack, software needs to be clearly blueprinted according to secure design principles. This paper discusses software security metrics together with design principles and identifies the characteristics of good metrics.
Compressed Sensing (CS) is an emerging signal acquisition theory that provides a universal approach for characterizing signals which are sparse or compressible in some basis, at a sub-Nyquist sampling rate. Using an over-complete dictionary specialized for speech signals as the sparse basis, CS sampling and reconstruction of speech signals are realized. Furthermore, we propose to choose the sensing matrix adaptively, according to the energy distribution of the original speech signal. Experimental results show a significant improvement in speech reconstruction quality with this adaptive approach compared with a traditional random sensing matrix. The key objective in compressed sensing (also referred to as sparse signal recovery or compressive sampling) is to reconstruct a signal accurately and efficiently from a set of few non-adaptive linear measurements.
The compressed sensing field has provided many recovery algorithms, most with provable as well as empirical results. There are several important traits that an optimal recovery algorithm must possess. The algorithm needs to be fast, so that it can efficiently recover signals in practice. The algorithm should provide uniform guarantees, meaning that given a specific method of acquiring linear measurements, the algorithm recovers all sparse signals (possibly with high probability). Ideally, the algorithm would require as few linear measurements as possible. However, recovery using only this property would require searching through the exponentially large set of all possible lower dimensional subspaces, and so in practice is not numerically feasible. Thus in the more realistic setting, we may need slightly more measurements. Finally, we wish our ideal recovery algorithm to be stable.
This means that if the signal or its measurements are perturbed slightly, then the recovery should still be approximately accurate. This is essential, since in practice we often encounter not only noisy signals or measurements, but also signals that are not exactly sparse, but close to being sparse. The conventional scheme in signal processing, acquiring the entire signal and then compressing it, was questioned by Donoho. Indeed, this technique uses tremendous resources to acquire often very large signals, just to throw away information during compression.
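One widely used fast recovery algorithm with the properties discussed above is orthogonal matching pursuit (OMP); the sketch below is a compact NumPy version for the noiseless case, using a random Gaussian sensing matrix rather than the adaptive sensing matrix proposed in this work, and with toy dimensions chosen only for illustration.

    import numpy as np

    def omp(Phi, y, sparsity):
        # Greedy recovery: repeatedly pick the column most correlated with the
        # residual, then re-fit the signal on the selected support by least squares.
        residual, support = y.copy(), []
        x_hat = np.zeros(Phi.shape[1])
        for _ in range(sparsity):
            support.append(int(np.argmax(np.abs(Phi.T @ residual))))
            coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coeffs
        x_hat[support] = coeffs
        return x_hat

    rng = np.random.default_rng(0)
    n, m, k = 256, 64, 5                      # signal length, measurements, sparsity
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
    y = Phi @ x                               # non-adaptive linear measurements
    print("recovery error:", np.linalg.norm(omp(Phi, y, k) - x))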
The unified power quality conditioner [UPQC] is one of the advanced forms of custom devices, which consists of a hybrid series filter for compensating the voltage disturbances and a shunt active power filter for eliminating current distortions. This custom power device is mainly employed to overcome the power quality [PQ] problems by improving the quality of supply voltage and its waveforms, and to minimize present harmonic contents in nonlinear load currents in power distribution systems. In this paper, the configuration of UPQC, the designs of various components of UPQC, such as series and shunt active filter VSI parameters, the controllers, etc have been designed for simulation studies against nonlinear loads using MATLAB/ Simulink environments. The simulation studies on both conventional and modified topology of UPQC are presented, which have been carried out by considering a case example of voltage sag, swell, interruption, and harmonic distortions. From the simulation results on conventional and modified UPQC topology, it can be concluded that the modified topology has less average switching frequency, less THDs in load voltages and currents from the source. It has been also observed that the modified UPQC system gives reduced dc-link voltage than the conventional UPQC topology.
This paper presents a fault-tolerant ALU with triple modular redundancy (TMR) on an FPGA. The TMR technique mitigates single-event upsets in the module and gives fault-tolerant results, but at the cost of additional area. TMR is used in aviation and space applications, where radiation effects occur.
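The core of any TMR scheme is a bit-wise majority voter over three redundant copies; the behavioural sketch below is written in Python purely to illustrate how a single-event upset in one copy is masked, and it is not the paper's FPGA implementation (the ALU operations and the injected fault bit are made up).

    def alu(op, a, b, fault=None):
        # Toy 8-bit ALU; 'fault' flips one result bit to emulate a single-event upset.
        result = {"add": a + b, "sub": a - b, "and": a & b, "or": a | b}[op] & 0xFF
        return result ^ (1 << fault) if fault is not None else result

    def tmr_alu(op, a, b, faulty_copy=None):
        # Three redundant ALU copies; a bit-wise majority vote masks a fault in any one.
        results = [alu(op, a, b, fault=3 if i == faulty_copy else None)
                   for i in range(3)]
        r0, r1, r2 = results
        return (r0 & r1) | (r1 & r2) | (r0 & r2)

    # The voted output matches the fault-free result even though copy 1 was upset.
    print(alu("add", 23, 42), tmr_alu("add", 23, 42, faulty_copy=1))   # 65 65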
Multiprocessor system-on-chip is emerging as a new trend for system-on-chip design, but wire and power design constraints are forcing the adoption of new design methodologies. Researchers have pursued a scalable solution to this problem: the Network on Chip (NoC). A network-on-chip architecture better supports the integration of an SoC and consists of an on-chip packet-switched network. The idea is borrowed from large-scale multiprocessors and the wide-area network domain and envisions networks of on-chip routers. Cores access the network by means of proper interfaces and have their packets forwarded to their destinations over multi-hop routing paths. In order to implement a competitive architecture, the router should be designed efficiently, as it is the central component of the NoC. A five-port router was designed, simulated with ModelSim 6.5e and synthesized using Xilinx ISE 10.1i.
The main purpose of this paper is to remove additive noise, such as Gaussian noise and signal-dependent noise, from images; the denoising concentrates on removing both impulse noise and Gaussian noise. The DTBDM requires a long computation time, and its denoising performance degrades because the algorithm also operates on noise-free pixels. The proposed method works only on noisy pixels and requires less computation time. To remove the noise, a new noise detection mechanism is introduced and filtering is performed using the Non-Local Means (NLM) method. To make the detection results more robust and accurate, a coarse-to-fine strategy and an iterative method are used. The numerical results confirm that the proposed method yields better performance in terms of peak signal-to-noise ratio (PSNR).
Fingerprints are one of the most mature bio-metric technologies and are considered legitimate proofs of evidence in courts of law all over the world. In recent times, more and more civilian and commercial applications are either using or actively considering using fingerprint-based identification because of the availability of inexpensive and compact solid state scanners as well as its superior and proven matching performance over other bio-metric technologies.
This paper presents an algorithm for image quality assessment based on fuzzy logic. First, a simple model of the human visual system, consisting of a nonlinear function and a 2-D filter, processes the input and evaluates detail losses and additive impairments for image quality assessment. Detail loss refers to the loss of useful visual information, which affects content visibility, while additive impairment represents redundant visual information whose appearance in the test image distracts the viewer's attention from the useful content, causing an unpleasant viewing experience. Noise is usually quantified by the percentage of pixels that are corrupted; corrupted pixels are either set to the maximum value or have single bits flipped. The main objective of this dissertation work is therefore to recover an image close to the original from the corrupted image and then to find the fine edges in the image using fuzzy logic.
The clustering approach is widely used in biomedical applications, particularly for brain tumor detection in abnormal magnetic resonance (MRI) images. Fuzzy clustering using the fuzzy C-means (FCM) algorithm has proved superior to other clustering approaches in terms of segmentation efficiency. MRI is one of the core methods for identifying brain tumors and assessing their existence, size and thickness. MRI images are prone to high noise, as the whole principle relies on strong magnetic fields. We remove the noise using the Artificial Bee Colony algorithm and then extract the tumor region using fuzzy C-means clustering.
In conventional correlation receiver, the capacity of a single cell using CDMA is limited by Multiple Access Interference (MAI). To overcome this drawback, several advanced receiver structures have been proposed. Unlike the conventional receiver which treats multiple access interference (MAI) as if it were AWGN, multiuser receivers treat MAI as additional information to aid in detection. In this paper I present a comparative study of the most widely discussed receiver structures: the Conventional Matched Filter receiver, the Decorrelator receiver, the Minimum Mean Square Error (MMSE) receiver, and the Successive Interference Cancellation receiver.  BER performances of the above mentioned multiuser receivers are studied as a function of SNR, number of users and processing gain.
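To make the comparison concrete, the sketch below sets up a toy synchronous CDMA system in NumPy and applies three of the linear detectors discussed: the conventional matched filter, the decorrelating detector (R^-1 y) and the MMSE detector ((R + sigma^2 I)^-1 y); the spreading codes, user count, and noise level are illustrative values, and the successive interference cancellation receiver is not sketched here.

    import numpy as np

    rng = np.random.default_rng(2)
    K, N, sigma = 3, 16, 0.4                   # users, processing gain, noise std
    S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)   # spreading codes (columns)
    b = rng.choice([-1.0, 1.0], size=K)                     # transmitted bits
    r = S @ b + sigma * rng.standard_normal(N)              # received chip vector

    y = S.T @ r                      # matched-filter (correlator) outputs
    R = S.T @ S                      # cross-correlation matrix of the codes

    b_mf = np.sign(y)                              # conventional detector (MAI as noise)
    b_dec = np.sign(np.linalg.solve(R, y))         # decorrelating detector
    b_mmse = np.sign(np.linalg.solve(R + sigma**2 * np.eye(K), y))   # MMSE detector

    print("sent  :", b)
    print("MF    :", b_mf)
    print("DECORR:", b_dec)
    print("MMSE  :", b_mmse)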
Deep Drawing is a widely used sheet metal shape transformation process with material retention. From Automobile components to plumbing and sanitary ware, this process is used for mass production of industrial and domestic artifacts with the depth greater than corresponding diameter.  Deep drawing process using a die-punch pair on a press induces radial stresses on flange region and compressive stresses on the center of a blank to provide a permanent shape change.  The production quality and cost associated depend upon obtained thickness distribution and an accurate prediction of formability. This demands a coherent set of correlations between various parameters associated with deep drawing.  This study reviews various theoretical and experimental attempts made to connect punching force, blank holding force, dome height, thickness of blank etc. with thickness distribution on final products. The focus is also drawn onto various modes of failures for different shapes of products and associated problems including creasing and wrinkling, crushing and cracking of sheet metal parts. Conclusions are drawn to suggest impact of isolated deviation of one or more parameters on the resulted product.
Recent analysis has shown that cooperative caching can improve system performance in wireless P2P networks such as ad hoc networks and mesh networks. However, these analyses are at a high level, leaving several design and implementation issues unanswered. This paper explains the design and implementation of cooperative caching in wireless P2P networks and recommends solutions that help find the best place to cache the data.
Rolling element bearing faults are among the main causes of breakdown in rotating machines. The present study shows the effect of different individual working parameters on vibration signals while keeping the other parameters constant. It is also important to know what happens if all the parameters are varied simultaneously. From this, one can identify the significant parameter that affects the performance of the machine, so that sudden breakdown can be avoided by altering that particular parameter and the machine can be kept running until the scheduled shutdown after the first defect or fault is detected in the working bearing. This paper presents a mathematical model which treats the contacts between the balls and races as nonlinear springs; Hertzian contact deformation theory is applied to this model. The intensity of the vibration signals varies according to the working parameters. In order to find the significant parameter which affects the vibration signal, i.e. the performance of the machine, Taguchi's methodology is used. The results, in terms of vibration signals and analysis, are presented in the paper and give an idea of the sensitivity of the vibration signals.
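For reference, the nonlinear spring used in such ball-race contact models is the Hertzian point-contact relation F = K δ^(3/2); the short sketch below evaluates it for an assumed contact stiffness, which is purely illustrative and not a value from the paper.

```python
import numpy as np

def hertz_contact_force(deflection_m, k_contact=1.0e10):
    """Hertzian point contact: F = K * delta^(3/2) (ball-race contact).

    k_contact [N/m^1.5] is an assumed illustrative stiffness; for a real
    bearing it follows from ball/race geometry and material properties.
    """
    delta = np.maximum(deflection_m, 0.0)   # no tensile force when contact is lost
    return k_contact * delta ** 1.5

# Example: contact force for a range of elastic deflections (metres)
deflections = np.array([0.0, 1e-6, 5e-6, 10e-6])
print(hertz_contact_force(deflections))
```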
In search engines, both NLP (Natural Language Processing) and statistically-based systems are used to process the query. A statistical system recognizes the terms to search for and also provides the stems and the singular and plural forms of words; it may also provide a weight for every term. A natural language processing system tags parts of speech and identifies objects, verbs, subjects, agents, synonyms and alternative forms of proper nouns. It is then able to create an unambiguous representation of the submitted query, and the term weights are computed. For a particular query, the search engine retrieves a list of documents from the database; using the keywords, it obtains the results for the submitted query. Stemming algorithms and stop-lists/stop-words are used to reduce the disk space consumed: 'the', 'is' and 'an' are examples of stop-words, while 'reading', 'playing' and 'watches' are examples of words reduced by a stemming algorithm. In information retrieval systems, the vector space model and the Boolean model are used for document ranking. Search engine optimization starts with submitting keywords to the search engine; these should be clear and understandable for query processing, and one should know which keywords are more relevant and will perform well for better results. In this paper, a new technique that combines 'two keywords' is proposed for retrieving documents from the database and rearranging the list of documents in order of weight.
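A minimal sketch of the preprocessing step described above, stop-word removal plus a naive suffix-stripping stemmer written from scratch (no particular stemming library is assumed); the tiny stop-list and suffix rules are illustrative only.

```python
# Illustrative query preprocessing: stop-word removal and naive suffix stripping.
STOP_WORDS = {"the", "is", "an", "a", "of", "and", "in"}   # tiny example list
SUFFIXES = ("ing", "es", "ed", "s")                        # crude stemming rules

def stem(word: str) -> str:
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def preprocess(query: str) -> list[str]:
    tokens = [t.lower() for t in query.split()]
    return [stem(t) for t in tokens if t not in STOP_WORDS]

print(preprocess("reading an article about playing and watches"))
# -> ['read', 'article', 'about', 'play', 'watch']
```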
Sheet metal spinning is a metal forming process in which a flat metal blank is rotated at high speed and formed into an axisymmetric part by a roller which gradually forces the blank onto a mandrel bearing the final shape of the spun part. The aim is to design the workpiece parameters, process parameters and tooling parameters so as to reduce workpiece failure, wrinkling and material deformation in metal spinning on a general-purpose lathe; with the help of these three parameter groups, the mechanical properties and quality of the product can be improved.
Concrete has secured its place among today's modern materials. It is a building construction material consisting of fine and coarse aggregates bonded by cement and water, with various types of admixtures that are available on the market or obtained from waste materials. A mix design procedure for high-performance concrete mixes is presented in this paper. Since economic parameters and compressive strength are fundamental properties of concrete at two different stages of production, a correlation between cost parameters and compressive strength has been used instead of the usual water-cement ratio versus compressive strength relationship. If the water-cement ratio is maintained and various types of admixtures are added to the concrete, the compressive strength can be improved and additional strength obtained very economically. In the proposed method, the designer is able to estimate parameters such as compressive strength and cost at the design stage for a given target strength, in addition to the ingredients of the concrete.
Solar energy has a large potential to become the fuel of the future. The challenge, however, remains to effectively capture the available solar energy and efficiently convert it into electrical energy. The project is a definitive attempt to explore the opportunities in effectively capturing solar energy by designing a mechanical system and support structure to rotate a set of photovoltaic modules capable of generating 1 kWh of electricity. Large-scale solar power generation is the broad framework of the current problem statement. The literature review reveals that tracking the sun in both directions can improve the power output by 25 to 30 percent. This improvement can play a vital role in adopting solar power as a major power generating source, as it opens the scenario of large-scale grid-connected power generation. Within this framework, the present study aims to design a solar tracking system and its support structure that allow the photovoltaic solar panels, capable of generating 1 kWh of electrical energy, to efficiently absorb solar radiation, thereby improving the electrical output of the structure. The system has to track the sun in both directions, withstand dynamic loads such as wind loads, remain safe under the most adverse conditions and offer ease of assembly and manufacturability.
An AODVB (Ad hoc On-Demand Distance Vector with Black hole Avoidance) protocol is proposed for avoiding black-hole attacks. AODVB forms link-disjoint multiple paths during path discovery to provide greater path selection, so as to avoid malicious nodes in the path using a legitimacy table maintained by each node in the network. Non-malicious nodes gradually isolate the black-hole nodes based on the values collected in their legitimacy tables and avoid them while establishing the path between source and destination.
Nowadays, industry gives high priority to the development of System on Chip (SoC) devices with reusable IP cores. The major challenge is to ensure proper lossless communication between these IP cores in an SoC device, which can be achieved with the help of standard communication protocols such as AMBA from ARM Ltd. In this paper we design and synthesize efficient Finite State Machines (FSMs) for the master and slave interfaces in AMBA AHB. The interfaces are capable of responding to split, retry and error responses during a simple read and write transfer. The AMBA AHB system is designed in a hardware description language (Verilog) using the ModelSim tool and synthesized using the Xilinx ISE tool.
This paper illustrates a prototype Unmanned Ground Vehicle (UGV) developed for military purposes. There are four different modes to control the UGV: command-controlled mode, self-controlled mode, gesture-controlled mode and raptor-controlled mode. Our prototype UGV is built to undertake missions like border patrol, surveillance and active combat, both as a standalone unit (automatic) and in coordination with human soldiers (manual). A person at a remote place can comfortably control the motion of the robot wirelessly (manual mode), and in situations where manual control is not prudent, the vehicle is capable of reaching a pre-programmed destination on its own (automatic mode). In other cases, the UGV can use gesture mode and raptor mode. The complete setup and working of the UGV are described in the paper.
In this paper a single-server retrial queue with a second optional service is considered, in which each customer's obligatory service time takes the discrete value D_j with probability p_j and the second optional service time takes the value F_k with probability q_k, where each customer requests the optional service with probability θ. We also consider two disciplines for servicing customers: in the first discipline, customers receive the second optional service (if requested) immediately after the obligatory service; in the second discipline, customers requesting the second optional service have to leave the service area after completion of the obligatory service, join the retrial orbit, and attempt to be serviced again after an exponentially distributed time. The joint steady-state distribution of the state of the server and the number of customers in orbit is derived for both disciplines.
Mobile Ad Hoc Networks (MANETs) are collections of wireless mobile nodes connected by wireless links, forming a temporary network without the aid of any infrastructure or centralized administration. The nodes communicate with each other on the basis of mutual trust. Each node can act as a host, a router, or both at the same time, and the nodes can form arbitrary topologies depending on their connectivity in the network. This characteristic makes MANETs more vulnerable to exploitation by an attacker inside the network, and the wireless links make it easier for an attacker to enter the network and gain access to ongoing communication. Owing to their mobility and broadcast nature, limited physical security, dynamically changing topology, energy-constrained operation and lack of centralized administration, MANETs are more vulnerable than traditional wired networks to various active and passive attacks. MANETs often suffer from security attacks because of features such as an open medium, lack of central monitoring and management, cooperative algorithms and the absence of a clear defense mechanism. In particular, black hole attacks can be easily deployed in a MANET by an adversary. Our objective is to thoroughly capture and analyze the impact of black hole attacks on MANET performance using the reactive AODV routing protocol with a varying number of black hole nodes in the MANET. We use performance metrics, i.e. throughput, packet delivery ratio and packet drop ratio, to analyze the impact of the black hole attack on the AODV routing protocol in a MANET using the NS-2 simulator.
Recommender systems suggest items to users by utilizing collaborative filtering techniques based on historical records of items that users have purchased. Recommender systems make use of data mining techniques to determine the similarity among a huge collection of data items, analysing historical user data and then extracting hidden useful information or patterns. The goal of collaborative filtering is to find the relationships among individuals and the existing data items in order to determine similarity and provide recommendations. This paper proposes a context-based collaborative filtering recommender system which can be used for any commercial online marketing. Experimental evaluation of the results, compared with the traditional collaborative filtering approach, shows that the context-based collaborative approach provides dramatically better performance than traditional algorithms, while at the same time providing better recommendations from the customer's point of view.
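A minimal sketch of plain item-based collaborative filtering on a user-item rating matrix (the context-based extension described in the paper is not reproduced here); the tiny rating matrix is fabricated for illustration.

```python
import numpy as np

# Rows = users, columns = items; 0 means "not rated" (fabricated example data).
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

def cosine_item_similarity(ratings):
    norms = np.linalg.norm(ratings, axis=0, keepdims=True)
    norms[norms == 0] = 1.0                       # avoid division by zero
    unit = ratings / norms
    return unit.T @ unit                          # item-item cosine similarity matrix

def predict(ratings, sim, user, item):
    rated = ratings[user] > 0                     # items this user has rated
    weights = sim[item, rated]
    if weights.sum() == 0:
        return 0.0
    return float(weights @ ratings[user, rated] / weights.sum())

sim = cosine_item_similarity(R)
print(round(predict(R, sim, user=0, item=2), 2))  # predicted rating of item 2 for user 0
```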
Orthogonal Frequency Division Multiplexing (OFDM) is a multi-carrier modulation technique which divides the available spectrum into many carriers. OFDM uses the spectrum more efficiently than FDMA by spacing the channels much closer together and making all carriers orthogonal to one another to prevent interference between the closely spaced carriers. The main advantage of OFDM is its robustness to channel fading in a wireless environment; here, an MB-OFDM transmitter baseband is designed in order to provide higher application speed than plain OFDM. The objective of this project is to design and implement the baseband of an MB-OFDM transmitter on FPGA hardware that provides very high application speed. The project concentrates on developing the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT); the work also includes the design of a mapping module and serial-to-parallel and parallel-to-serial converter modules. The design uses the FFT and IFFT for the processing module, which operates on the input data. All modules are designed in the VHDL programming language and implemented on an Apex 20KE board, which is connected to a computer through a serial port; input and output data are displayed on the computer. The software and tools used in this project include VHDL design entry and the Altera Quartus-II software, which assist the design process and the downloading process into the FPGA board, while the Apex board is used to execute the designed modules.
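To make the baseband processing chain concrete, here is a hedged NumPy sketch of one OFDM symbol: QPSK mapping, serial-to-parallel, IFFT, and cyclic-prefix insertion; the 64-subcarrier size and 16-sample prefix are generic illustrative choices, not the MB-OFDM parameters used in the project.

```python
import numpy as np

N_SC = 64          # number of subcarriers (illustrative)
CP_LEN = 16        # cyclic prefix length (illustrative)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N_SC)                  # 2 bits per QPSK symbol

# QPSK mapping: pairs of bits -> complex constellation points
pairs = bits.reshape(-1, 2)
symbols = ((1 - 2 * pairs[:, 0]) + 1j * (1 - 2 * pairs[:, 1])) / np.sqrt(2)

# Serial-to-parallel + IFFT (one OFDM symbol) + cyclic prefix
time_domain = np.fft.ifft(symbols, n=N_SC)
tx_symbol = np.concatenate([time_domain[-CP_LEN:], time_domain])

# Receiver side (ideal channel): strip prefix and apply FFT
rx = np.fft.fft(tx_symbol[CP_LEN:], n=N_SC)
print(np.allclose(rx, symbols))                           # True for an ideal channel
```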
In recent years, Multi-State Systems (MSS) have been rigorously studied because of their inherent complexity. In this paper, we present an approach for steering multi-state systems in an industrial context over a given life cycle. This approach aims at identifying the best choices of maintenance policies for multi-state systems according to the state transition parameters and the chosen performance indicators. The degradation process of the system, based on a Markov model, has been transformed into a dynamic Bayesian model to exploit its advantages. The decision-making for maintenance policies through the dynamic Bayesian network is examined and compared with the simulation results obtained from different test cases.

Resolution enhancement (RE) is a process of improving the quality of an image. Resolution enhancement is basically obtained by regaining the high-frequency content of the image, without which the output is a blurred image. In this paper a new approach in the wavelet domain using the dual-tree complex wavelet transform (DT-CWT) and a nonlocal means (NLM) filter for RE of images is implemented. An input image is decomposed by DT-CWT to obtain high-frequency sub-bands. The high-frequency sub-bands and the low-resolution (LR) input image are interpolated using bicubic interpolation in the proposed work. The high-frequency sub-bands are passed through an NLM filter to reduce the artifacts generated by DT-CWT. The filtered high-frequency sub-bands and the interpolated LR input image are combined using the inverse DT-CWT to obtain the resolution-enhanced image. DT-CWT provides superior resolution and a greater performance ratio because of its nearly shift-invariant and directionally selective properties. The results are compared with existing techniques in terms of PSNR and MSE.
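A rough sketch of the wavelet-domain idea, assuming a plain real DWT from PyWavelets as a stand-in for the DT-CWT and simple linear interpolation in place of bicubic interpolation and the NLM filter; it only illustrates the decompose / interpolate / reconstruct flow, not the paper's exact pipeline.

```python
import numpy as np
import pywt
from scipy.ndimage import zoom

def dwt_resolution_enhance(lr_image, factor=2):
    """Toy wavelet-domain upscaling (stand-in for the DT-CWT/NLM pipeline):
    decompose, interpolate the sub-bands, and reconstruct at a larger size."""
    cA, (cH, cV, cD) = pywt.dwt2(lr_image, "haar")                   # sub-bands of size N/2
    cH, cV, cD = (zoom(b, factor, order=1) for b in (cH, cV, cD))    # -> factor * N/2
    low = zoom(lr_image, factor / 2, order=1)                        # LR image acts as the LL band
    return pywt.idwt2((low, (cH, cV, cD)), "haar")                   # output size factor * N

rng = np.random.default_rng(0)
lr = rng.random((64, 64))
hr = dwt_resolution_enhance(lr, factor=2)
print(lr.shape, "->", hr.shape)                                      # (64, 64) -> (128, 128)
```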
Transmission and routing of video data over a wireless network is a challenging task because of wireless interference. To improve the performance of video-on-demand transmission over wireless networks, multipath algorithms are used. IPD/S (Iterative Path Discovery/Selection) and PPD/S (Parallel Path Discovery/Selection) are two algorithms used for discovering the maximum number of edge-disjoint paths from source to destination for each VoD request, taking the effects of wireless interference into account. In this paper a performance evaluation of these multipath discovery algorithms for VoD (video on demand) streaming in a wireless mesh network is presented. The algorithms are evaluated on the basis of the number of paths discovered, packet drop ratio and delay. Simulation results show that PPD/S works better than IPD/S because it is able to discover more paths under the same circumstances.
Image fusion is a data fusion technique which combines the information of two images carrying different information to form a new single image. The objective is to fuse an MR image and a CT image of the same organ to obtain a single image containing as much information as possible. In this paper the Wavelet Transform and the Fast Curvelet Transform are highlighted to perform fusion of MR and CT images. The Wavelet Transform has good time-frequency characteristics in one dimension, but this cannot be extended to two or more dimensions because the wavelet has very poor directivity. Since medical images contain many objects and curved shapes, the Fast Curvelet Transform is expected to perform better in their fusion. In this project, image fusion based on the Wavelet Transform and the Fast Curvelet Transform was implemented. The experimental results show the superiority of the Fast Curvelet Transform over the Wavelet Transform in the fusion of MR and CT images, from both the visual quality and peak signal-to-noise ratio (PSNR) points of view.
Reed-Solomon codes are very useful and effective against burst errors in noisy environments. In the decoding process, one or two errors can be handled easily using the procedure of the Peterson-Gorenstein-Zierler algorithm. When decoding three or more errors, the key equation can be solved with the Berlekamp-Massey algorithm. This paper gives a detailed discussion of the procedures of the Peterson-Gorenstein-Zierler and Berlekamp-Massey algorithms and shows the advantages of a modified version of the Berlekamp-Massey algorithm, together with its steps.
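To illustrate the structure of the Berlekamp-Massey iteration, here is a hedged sketch of its binary (GF(2)) special case, which synthesizes the shortest LFSR generating a bit sequence; full Reed-Solomon decoding runs the same iteration over GF(2^m) on the syndrome sequence, which is not shown here.

```python
def berlekamp_massey_gf2(s):
    """Return (L, C): length and connection polynomial of the shortest LFSR
    that generates the binary sequence s (Berlekamp-Massey over GF(2))."""
    n = len(s)
    C = [1] + [0] * n          # current connection polynomial
    B = [1] + [0] * n          # previous connection polynomial
    L, m = 0, 1
    for i in range(n):
        # Discrepancy between the LFSR prediction and the actual bit s[i]
        d = s[i]
        for j in range(1, L + 1):
            d ^= C[j] & s[i - j]
        if d == 0:
            m += 1
        elif 2 * L <= i:
            T = C[:]
            for j in range(n + 1 - m):
                C[j + m] ^= B[j]
            L, B, m = i + 1 - L, T, 1
        else:
            for j in range(n + 1 - m):
                C[j + m] ^= B[j]
            m += 1
    return L, C[: L + 1]

# Bits satisfying s[i] = s[i-1] XOR s[i-3]
print(berlekamp_massey_gf2([1, 0, 0, 1, 1, 1, 0, 1]))   # -> (3, [1, 1, 0, 1])
```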
The objective of this work is to compare the ligament system with a composite fibre-resin lamina, which is not identical but closely matches the ligament structure. Skeletal ligaments are short bands of tough fibrous connective tissue, which behave like a composite material. This paper presents a 3D finite element analysis performed for a composite fibre-resin lamina and compares the result with the orthotropic stress-strain relationship formulation to validate the result. A nonlinear static analysis is performed for the composite material to evaluate its stress-strain relationship.
This paper introduces a useful approach to transient response analysis of a pendulum system using the graphical programming language LabVIEW. Transient response analysis is the most general method for computing forced dynamic response; its purpose is to compute the behavior of a structure subjected to time-varying excitation. Once the virtual state-space model of the pendulum system is obtained, it is very easy to analyze and identify the parameters of the transient response of the pendulum system.
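A small sketch, assuming a linearized pendulum, of the kind of transient response computation described above: the state-space model x' = Ax with x = [theta, theta_dot] is integrated with a simple explicit Euler loop in Python rather than LabVIEW; the length, damping and initial angle are illustrative values.

```python
import numpy as np

g, L, c = 9.81, 1.0, 0.3        # gravity, pendulum length, damping (illustrative)
A = np.array([[0.0, 1.0],
              [-g / L, -c]])    # linearized state matrix for x = [theta, theta_dot]

dt, t_end = 0.001, 10.0
x = np.array([0.2, 0.0])        # initial angle 0.2 rad, at rest
trajectory = []
for _ in range(int(t_end / dt)):
    x = x + dt * (A @ x)        # explicit Euler step of x' = A x
    trajectory.append(x[0])

theta = np.array(trajectory)
print("peak angle:", round(theta.max(), 3), "rad; final angle:", round(theta[-1], 4), "rad")
```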
The model is developed to reduce the wastage of electricity due to careless and improper switching in households, schools, colleges, offices etc. It saves energy by maximizing the use of daylight. This is an automatic system which employs solar energy through PV. The system is capable of controlling the lights, fans and air conditioners in a room depending upon various parameters such as the LUX level, room temperature and motion. All these parameters are measured through various sensors and the control is done by a microcontroller. A PIR sensor detects occupancy in the room. The microcontroller reads the LUX level of the room from the ambient light sensor; if the daylight value is below a preset threshold, the lights are turned on, and vice versa. An LM35 sensor reads the room temperature, which is compared with a preset value, and the fan is switched on or off accordingly. The model has been developed and tested in the Department of Energy, MANIT Bhopal. The model itself consumes very little power and helps save a significant amount of energy. It can be applied to government offices, private firms, residential buildings, schools, colleges etc. to avoid wastage of electricity and make maximum use of daylight; it also reduces our dependence on conventional energy and will help in conserving energy.
Laminated composite materials have found extensive application in the mechanical, aerospace, marine and automotive industries due to their high fatigue life and high strength-to-weight ratio. Prediction of the failure of composite laminate structures, mathematical modelling, finite element analysis of composite laminates and the load that a laminate can carry have become important research topics and have drawn close attention in recent years. Accurate prediction of the failure of composite laminate structures is challenging because of the non-isotropic and nonlinear material properties. This paper presents a literature review on the modelling and finite element analysis of composite laminate plates, devoted to the different finite element methods based on the various laminated plate theories.
Self-defence and self-protection are an important priority for women and men. Some women find themselves at greater risk of becoming the victim of serious assault or murder, and such violence is often about a feeling of dominance of one person over another. Even though women are targeted, the overall point is that their needless victimization could have been prevented; even women who take the necessary steps to stay alive, such as getting a restraining order, hiding and filing criminal charges, still end up dead. Therefore, avoiding a violent attack is better than attempting to survive one. The proposed self-defence and alert system for individuals aims to prevent these crimes when a person is alone or in badly lit areas. Whenever there is a threat, the person presses the switch on the equipment to activate all the modules, i.e. voice alert, self-defence device, GPS and GPRS. A web camera attached to the equipment captures an image and sends an MMS to the registered mobile number, providing further evidence of the crime. The voice alert warns the surrounding people when the equipment is switched on, the Global Positioning System gives the location of the threat as latitude and longitude and sends them by SMS to the registered number, and General Packet Radio Service (GPRS) sends the MMS of the captured image to the registered number. The self-defence device works as follows: when the person presses the switch, a spray bottle containing chloroform is automatically discharged over the area where the crime is taking place, rendering the criminal unconscious.
In this paper, a new domino circuit is proposed, which has a lower leakage and higher noise immunity without dramatic speed degradation for wide fan-in gates. The technique which is utilized in this paper is based on comparison of mirrored current of the pull-up network with its worst case leakage current. The proposed circuit technique decreases the parasitic capacitance on the dynamic node, yielding a smaller keeper for wide fan-in gates to implement fast and robust circuits. Thus, the contention current and consequently power consumption and delay are reduced. The leakage current is also decreased by exploiting the footer transistor in diode configuration, which results in increased noise immunity. Simulation results of wide fan-in gates designed using a 16-nm high-performance predictive technology model demonstrate 51% power reduction and at least 2.41× noise-immunity improvement at the same delay compared to the standard domino circuits for 64-bit OR gates
Frequently, high-power pulsed ion cyclotron range of frequency experiments are limited by breakdown at the vacuum feedthrough. This paper describes the development and testing of vacuum feedthroughs to increase both reliability and capability. The ultimate goal of this review is to develop a continuous-wave feedthrough for the next generation of fusion experiments. The feedthrough in the interface, which isolates the ADITYA Tokamak vacuum from that of the interface, is the most critical component and is vulnerable to RF voltage breakdown and arcing. The use of dielectric material between the inner and outer conductors introduces the threat of surface discharge under vacuum conditions. The feedthroughs are crucial because their failure not only affects the RF system but also compromises the tokamak vacuum.
Eichhornia crassipes is a noxious weed which clogs water bodies, interferes with navigation, recreation and power generation due to its fast spread and congested growth. Composting is a bio-oxidative process which converts water hyacinth into useful product. Therefore, the aim of the present study was the comparative investigation of biological and physical characteristics through pile composting of water hyacinth taken from two different sources but of same composition i.e. from Agricultural site and Disposal/Landfill site. The composting material of pile of disposal site reached thermophilic stage within three days indicating high rate of microbial decomposition. Water hyacinth obtained from disposal site contains more organic matter than that from agricultural site. The compost obtained from pile of disposal site was more stable than that from agricultural site as evaluated by the stability parameters i.e. oxygen uptake rate and CO2 evolution rate.
Automation is the use of control systems (such as numerical control, programmable logic control and other industrial control systems), together with other applications of information technology, to control industrial machinery and processes, reducing the need for human intervention. Whereas mechanization provides human operators with machinery to assist them with the physical requirements of work, automation greatly reduces the need for human sensory and mental involvement as well. Processes and systems can also be automated. Automation is a discipline that has been in existence for the past ten to fifteen years, and it is a field that is continuously explored by experts for the betterment and ease of the engineering industry. The title of our research is "Separation and Packaging of Acidic and Basic Solution", carried out under our esteemed faculty member Mr. Dipesh Shah, who was the guiding factor behind this research. The automation of this work is done using a PLC, and the graphical user interface (GUI) may be implemented using SCADA.
Carry Select Adder (CSLA) is one of the fastest adders used in many data-processing processors to perform fast arithmetic functions. From the structure of the CSLA, it is clear that there is scope for reducing the area and power consumption in the CSLA. This work uses a simple and efficient gate-level modification to significantly reduce the area and power of the CSLA. The proposed design has reduced area and power as compared with the regular SQRT CSLA with only a slight increase in the delay. This work evaluates the performance of the proposed design in terms of area, power. The results analysis shows that the proposed CSLA structure is better than the regular SQRT CSLA.
In multiuser systems the system resources must be divided among multiple users. This paper discusses the various techniques for allocating resources to multiple users, as well as the drawbacks of multiuser systems. We also discuss multiuser channel capacity for both the uplink and the downlink.
The advancement in data mining techniques plays an important role in many applications. In context of privacy and security issues, the problems caused by association rule mining technique are investigated by many research scholars. It is proved that the misuse of this technique may reveal the database owner’s sensitive and private information to others. Many researchers have put their effort to preserve privacy in Association Rule Mining. In this paper, we have presented the survey about the techniques and algorithms used for preserving privacy in association rule mining with horizontally partitioned database.
Over the last few decades there has been considerable development in the field of composites, and composites have been used extensively in various applications because they possess high stiffness, high strength and low weight. In the past few years natural fibers have been used in composites, since natural fibers are eco-friendly and their raw materials are low cost. In the present experiment, composite materials prepared using cashew nut shell liquid (CNSL) resin as the matrix and teak wood saw dust as the reinforcing agent were tested for their tensile strength.
This paper reviews an image inpainting method by which we can reconstruct a damaged or missing portion of an image. A fast image inpainting algorithm based on the TV (total variational) model is proposed on the basis of an analysis of local characteristics, which shows that the more information appears around damaged pixels, the faster the information diffuses. The algorithm first stratifies and filters the pixels around the damaged region according to priority, and then iteratively inpaints the damaged pixels from outside to inside, again on the grounds of priority. With this algorithm the inpainting speed is higher and the visual impact is greater.
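As a hedged illustration of diffusion-style inpainting (a much simpler stand-in for the TV model reviewed above), the sketch below repeatedly replaces each masked pixel with the mean of its four neighbours, which propagates information from the boundary inwards.

```python
import numpy as np

def diffuse_inpaint(image, mask, n_iter=500):
    """Fill masked pixels (mask == True) by iterative neighbour averaging.
    A crude harmonic/diffusion stand-in for TV inpainting, for illustration."""
    img = image.astype(float).copy()
    img[mask] = img[~mask].mean()                 # rough initial fill
    for _ in range(n_iter):
        # Mean of the 4-neighbourhood, computed with edge padding
        padded = np.pad(img, 1, mode="edge")
        neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                      padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        img[mask] = neighbours[mask]              # update only the damaged pixels
    return img

# Example: damage the centre of a smooth gradient image and repair it
x = np.linspace(0, 1, 64)
clean = np.outer(x, x)
mask = np.zeros_like(clean, dtype=bool)
mask[28:36, 28:36] = True
restored = diffuse_inpaint(np.where(mask, 0.0, clean), mask)
print(float(np.abs(restored - clean)[mask].max()))   # small residual error
```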
Smart Energy is an approach to using energy cleverly and effectively, and providing this capability via Zigbee is called Zigbee SEP. This literature review provides an overview of the different Zigbee-based SEP profiles, i.e. SEP 1.0, SEP 1.x and SEP 2.0, shows a comparison of different wireless technology standards, describes the Zigbee-based SEP architecture and how it can be implemented on existing technology, and summarizes the features offered by Zigbee SEP.
This paper presents the stability enhancement of four parallel-operated offshore Wind Turbine Generators (WTGs) connected to an onshore power system using a Static Synchronous Compensator (STATCOM). The operating characteristics of each of the four WTGs are simulated by a wind-driven permanent magnet synchronous generator, while the onshore power system is simulated by a Synchronous Generator (SG) fed to an infinite bus through two parallel transmission lines. A fuzzy controller for the proposed STATCOM is designed to provide adequate damping to the dominant modes of the SG. A frequency-domain approach based on a linearized system model using eigenvalue analysis is performed, and a time-domain scheme based on the linear system model subject to disturbances is also carried out. This paper presents the voltage enhancement of the offshore wind turbines using the fuzzy controller, maintaining the voltage at 1 p.u.
The aim of data mining is to extract significant and useful knowledge from data. Data stored in a database may be of any type, such as text, images, videos and so on. As the size of the data increases, storing such data is becoming complex, and data mining algorithms face challenges in handling it. Graphs have become more important for storing and visualizing this complicated data (e.g. chemical datasets, biological datasets, XML datasets, social network datasets, web datasets etc.). This paper discusses different algorithms and techniques used for graph mining.
This paper is a survey evaluating the performance of the Ad hoc On-Demand Distance Vector routing protocol in mobile ad hoc networks with different network parameters using a network simulator. Our basic goal is to present extensive information related to the AODV protocol and the modifications made to it, and to analyze its performance for TCP and CBR traffic using different performance metrics such as packet delivery ratio, average end-to-end delay, routing load, throughput and packet drop rate.
A parallel program should be evaluated to determine its efficiency, accuracy and benefits. This paper describes how parallel programs differ from sequential programs. A brief discussion of the effect of increasing the number of processors on execution time is given. Some of the important units used for measuring the performance of a parallel program are discussed, and the main performance laws used to measure speedup, Amdahl's law and Gustafson's law, are presented.
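For reference, the two speedup laws mentioned above can be written as follows, where $f$ is the serial (non-parallelizable) fraction, measured on the sequential run for Amdahl and on the parallel run for Gustafson, and $p$ is the number of processors.

```latex
% Amdahl's law: speedup for a fixed problem size
S_{\text{Amdahl}}(p) = \frac{1}{f + \dfrac{1 - f}{p}}

% Gustafson's law: scaled speedup when the problem grows with p
S_{\text{Gustafson}}(p) = f + p\,(1 - f) = p - f\,(p - 1)
```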
This paper presents an efficient approach to Security Constrained Optimal Power Flow (SCOPF) that determines the dispatch schedule of power generators with minimum cost. Satisfying system constraints and transmission constraints along with the mandate to ensure optimal power flow has led to SCOPF. The revenue paid for electricity is reduced considerably when the generation cost is reduced. PowerWorld Simulator employs a linear programming method for finding the optimal power flow. The proposed method has been tested and examined on the IEEE 7-bus system, and the results show it to be more advantageous than other conventional methods for solving OPF with security constraints for the same case.
Image quality is most often measured in terms of resolution. The clarity of an image is determined by its resolution: the higher the resolution, the clearer the image, which is required in most applications. Higher resolution can be achieved with good sensors and optics, but these can be very expensive and the pixel density within the image is still limited. Instead, image processing methods can be used to obtain a high-resolution image from low-resolution images, which is a very effective and cheap solution. This kind of image enhancement is called super-resolution image reconstruction. This paper focuses on the definition, implementation and analysis of well-known super-resolution techniques. The comparison and analysis are the main concerns, to understand the improvement of super-resolution methods over single-frame interpolation techniques; in addition, the comparison gives insight into the practical uses of super-resolution methods. As a result of the analysis, a critical examination of the techniques and their performance evaluation is achieved. Super-resolution is particularly useful in forensic imaging, where the extraction of minute details in an image can help to solve major crime cases. Super-resolution image restoration has been one of the most important research areas in recent years; its goal is to obtain a high-resolution (HR) image from low-resolution (LR) blurred, noisy, under-sampled and displaced images.
The paper presents an area estimation and mapping of sand deposition around the Kosi river due to the several floods this river has witnessed over the last many decades. A remote sensing technique has been used for this estimation, with the difference in reflectance of different soil types used as a tool to delineate the areas with sand deposition.
In this paper, a low-power laser projector and a webcam are used to design a monitoring system for body breath detection, which monitors and records the patient's breathing and sends the information to a server. Our system consists of two parts: in the first part, a webcam captures images of the reflection from the low-power laser projector and transmits them to a PC; in the second part, an image processing program installed on the PC uses a temporal differencing algorithm to detect the movement of the reflective mark and determine the breathing rate. The low-power laser used in this project is safe for patients.
On the distribution side, the D-STATCOM gives a good power quality response. The D-STATCOM operates with a VSI and an energy storage device; the conventional topology can compensate reactive power in the line, but the rating of the storage device is increased. In this topology, if an arrangement of small dc-link capacitors is used, then the size of the VSI and the dc-link voltage can be reduced without compromising the compensation. In this paper, the proposed method reduces the rating of the VSI and the dc-link voltage through an arrangement of small dc-link capacitors. The topology is verified by simulation in the MATLAB environment.
The six sigma method is a project-driven management approach to improving an organization's products, services and processes by continually reducing defects. This paper examines the evolution, benefits and challenges of six sigma practices and identifies the key factors influencing successful six sigma project implementations. It integrates the lessons learned from successful six sigma projects and considers further improvements to the six sigma approach. Effective six sigma principles and practices succeed by continuously refining the organizational culture. The purpose of this paper is to survey the six sigma process and its impact on the total productivity and final quality of the four-wheeler platform truck at a leading manufacturer of material handling equipment. Key concepts and the six sigma problem-solving process have been studied, along with important fields such as DMAIC, the six sigma and productivity applied program, and other advantages of six sigma. After successful implementation of this concept, the total sigma level for the four-wheeler platform truck will be improved from 1.2 to 3.2.
In recent years research on active filters has increased. This paper gives a literature review on active filters. The types of active power filter and the reference current generation techniques are classified into time-domain and frequency-domain methods. The gating pulse generation techniques are then described, namely the linear control technique and the hysteresis control technique.
This paper presents the simulation and analysis of a DSTATCOM for voltage sag mitigation, followed by harmonic distortion and power factor improvement using an LCL passive filter with the DSTATCOM in a distribution system. The model is based on a two-level voltage source converter which requires only voltage measurements; reactive power measurements are not required. The simulated control method for the DSTATCOM (Distribution Static Synchronous Compensator) is implemented in MATLAB SIMULINK R2009b.
The Load Calculation of Automobile Air Conditioning System is presented. From the load calculation, cooling capacity can be calculated & thus tonne of refrigeration required is found out. The Heat Balance Method (HBM) is used for estimating the heating and cooling loads encountered in a vehicle cabin. Mathematical models of heat transfer phenomena are used to calculate the different load categories. Mathematical load calculation models are devised and collected from various sources for load estimation. Case study of Wagon R car is introduced under arbitrary driving conditions. Simplified geometry & typical material properties of a Wagon R car are considered as input parameters for studies. In the case study of Wagon R Car, the engine, exhaust, and reflected radiation loads may be neglected from consideration. On the other hand, the direct and diffuse radiation loads are important AC loads that tend to give rise to the cabin temperature. The cabin temperature decreases from a soak temperature of 80°C to the comfort temperature after almost 10 minutes represented by the pull-down time. After the pull-down time, a steady-state situation is achieved where the loads are balanced and a zero net load is maintained for the cabin for the rest of the period. This research aims to provide a basis for estimating thermal loads in vehicle cabins. Cooling capacity i.e. tonne of refrigeration required is found out from this load calculation. The result of this study can be used by HVAC engineers to design more efficient car AC systems.
Frequent pattern mining is one of the most important tasks for discovering useful, meaningful patterns from large collections of data. Mining association rules from frequent patterns in massive collections of data is of interest to many industries, as it can provide guidance in decision-making processes such as cross marketing, market basket analysis, promotion assortment etc. The techniques for discovering association rules from data have traditionally focused on identifying relationships between items that predict some aspect of human behavior, usually buying behavior. This paper studies three classical frequent pattern mining methods, Apriori, Eclat and FP-growth, and discusses some issues related to these algorithms.
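A minimal sketch of the Apriori idea (one of the three algorithms named above): generate candidate itemsets level by level and keep only those meeting a minimum support; the tiny transaction list and support threshold are illustrative assumptions.

```python
from itertools import combinations

def apriori(transactions, min_support=2):
    """Return frequent itemsets (as frozensets) with their support counts."""
    transactions = [set(t) for t in transactions]
    items = {item for t in transactions for item in t}
    current = {frozenset([i]) for i in items}        # level 1 candidates
    frequent = {}
    k = 1
    while current:
        counts = {c: sum(c <= t for t in transactions) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets
        k += 1
        keys = list(level)
        current = {a | b for i, a in enumerate(keys) for b in keys[i + 1:]
                   if len(a | b) == k}
    return frequent

baskets = [["milk", "bread"], ["milk", "diaper", "beer"],
           ["milk", "bread", "diaper"], ["bread", "diaper", "beer"]]
for itemset, support in sorted(apriori(baskets).items(), key=lambda kv: -kv[1]):
    print(set(itemset), support)
```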
With recent development and research, various network services like FTP (file transfer), VoIP, videoconferencing (streaming) and e-mail are becoming more widely used. Some of these services, such as e-mail, web browsing (HTTP) and FTP (file transfer), are not very sensitive to the delay of transmitted data, while services like VoIP (voice) and video conferencing are very sensitive to delay, jitter (delay variation) and packet losses. They therefore require traffic management for efficient data transmission, and queuing disciplines are one such traffic management mechanism in network services. The queuing algorithms FIFO (first in, first out), PQ (priority queuing) and WFQ (weighted fair queuing) are implemented in OPNET, and parameters including end-to-end delay and packets received are studied; the effect of the various queuing algorithms on these parameters is analysed.
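To illustrate how the disciplines differ, here is a small sketch that schedules the same packet arrivals under FIFO and under strict priority queuing using Python's heapq; the packet list and priority classes are fabricated for illustration, and WFQ is not shown.

```python
import heapq
from collections import deque

# (arrival_order, priority, name): lower priority number = more urgent (fabricated data)
packets = [(0, 2, "email-1"), (1, 0, "voip-1"), (2, 2, "ftp-1"),
           (3, 0, "voip-2"), (4, 1, "video-1")]

# FIFO: serve strictly in arrival order
fifo = deque(p[2] for p in packets)
print("FIFO:", list(fifo))

# Strict priority queuing: always serve the most urgent queued packet first
pq = []
for order, prio, name in packets:
    heapq.heappush(pq, (prio, order, name))
served = [heapq.heappop(pq)[2] for _ in range(len(packets))]
print("PQ  :", served)
```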
Operating the Internet infrastructure in an energy-efficient manner has become a challenging issue in recent years with the recognition of its significant energy consumption due to its large scale. Yet only a limited number of contributions are available to address the issue, mainly by putting idle routers and links into sleep mode at the price of low global resource utilization efficiency and degraded network performance. This work follows the same line of research but investigates an optimal solution for configuring the IP network topology to minimize the number of active routers and links, and hence the energy consumption, while keeping the interference with network operation minimal. An optimal energy-efficient network model along with a heuristic algorithm is proposed for identifying the network elements to place in sleep mode so as to attain approximately the best trade-off between energy efficiency and performance degradation. Numerical experiments are carried out to assess the solution for a range of network and load scenarios, and the results clearly demonstrate its effectiveness; the QoS parameters are also analysed.
The natural resources of Eastern India are still comparatively less exploited. The availability of sustainable water resource in urban environment is a key to development and human health of the Region. 116 Eastern Indian cities with more than 0.1 million population are having adequate water resources. These cities are hotspots for progressive economic development of the country. Water requirement and availability of ground water resources up to present extractable depth of 250 meter has been assessed for 7 capital cities of Eastern India as a case study. The results so obtained clearly indicate availability of ground water is adequate to cope up with the pace of growing population and urbanisation in these cities. Aquifer based water security as sustainable solution of these cities of Eastern India has given the confidence for futuristic development. The result of these studies can be used to predict the aquifer based water security of other Eastern Indian Cities.
Underwater inspection in offshore and subsea applications is an area of research and understanding in which many problems are still unsolved. In the present paper, a brief description of the different commercial NDT inspection techniques is given. The problems in underwater inspection are also discussed in the context of the existing inspection techniques. A detailed description of the Alternating Current Field Measurement (ACFM) inspection technique is provided, along with an example where an ACFM probe is used in a hazardous environment.
Nowadays, MANET security has become a big issue and one of the central concerns of researchers. A MANET is a network of wirelessly connected, self-configuring nodes which operates under particular routing protocols. There is no central administration or fixed infrastructure in a MANET; any node can move to another network at any time, and other nodes can also join the MANET. This makes MANETs highly vulnerable to different attacks. This paper focuses on the gray hole attack among the different types of attacks possible in a MANET. A gray hole attack is a type of active attack in which packets are dropped during routing from source to destination. This paper also includes a comparison of different techniques to deal with the gray hole attack.
Image segmentation partitions an image into meaningful regions with respect to a particular application. Its purpose is to simplify and/or change the representation of an image into something that is more meaningful and easier to analyze. The result of image segmentation is a set of segments that collectively cover the entire image, where all pixels in a region are similar with respect to some characteristic or computed property such as color or intensity. The purpose of clustering is to identify natural groupings in a large data set so as to produce a concise representation of a system's behavior. Fuzzy c-means is a data clustering technique in which a dataset is grouped into a chosen number of clusters, with every data point belonging to every cluster to a certain degree; a minimal sketch of this idea is given below.
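The sketch below is a minimal NumPy illustration of the standard fuzzy c-means iteration (alternating center and membership updates) applied to toy pixel intensities; the cluster count, fuzzifier value and synthetic data are assumptions for illustration and not parameters from the paper.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, eps=1e-5, seed=0):
    """Minimal fuzzy c-means sketch: X is (n_samples, n_features), c clusters, fuzzifier m > 1."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # random memberships, rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]            # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U_new = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy usage: cluster gray-level pixel intensities into 3 fuzzy regions.
pixels = np.concatenate([np.random.normal(mu, 5, 200) for mu in (40, 120, 200)])
centers, memberships = fuzzy_c_means(pixels.reshape(-1, 1), c=3)
```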
Silicon-on-insulator (SOI) is a recent fabrication technology that is easier and cheaper to work with, and owing to its characteristics it is fast becoming a standard in IC fabrication. After giving an overview of Mach-Zehnder and Sagnac resonator designs, this paper focuses on developing the optical resonator that delivers the maximum optical power; both resonators are designed here using SOI technology.
A mechanical gripper is used as an end effector in a robot for grasping objects with its mechanically operated fingers. In industry, two fingers are usually enough for holding purposes, although three or more fingers can also be used depending on the application. As most fingers are of the replaceable type, they can easily be removed and replaced. The force that a robotic gripper applies to a part is typically what engineers use to select a gripper, but in practice this alone is not enough. Many factors must be considered while designing a gripper, such as the gripping mechanism, trajectories, and the parameters that influence the gripping task; this paper discusses those factors in brief.
Scene segmentation is the well-known problem of identifying the different elements of a scene in image processing. Traditionally, either color or depth information extracted from a single view of the scene is used, but this turns out to be a poorly conditioned operation that remains very challenging. Fusing color and depth information is therefore required for better scene description. Obtaining depth information that allows recognizing objects of the same color lying at different depths is difficult, so for greater reliability stereo images are used. This paper investigates the stereo vision algorithms best suited to extracting such depth information.
Privacy protection has recently received considerable attention in location-based services, and a large number of location cloaking algorithms have been proposed for protecting the location privacy of mobile users. This work considers the scenario in which different location-based query requests are continuously issued by mobile users while they are moving. Most existing k-anonymity location cloaking algorithms are concerned with snapshot user locations only and cannot effectively prevent location-dependent attacks when users' locations are continuously updated. Therefore, adopting both location k-anonymity and cloaking granularity as privacy metrics, a new incremental clique-based cloaking algorithm, called ICliqueCloak, is introduced to defend against location-dependent attacks. The main idea is to incrementally maintain the maximal cliques needed for location cloaking in an undirected graph that takes into consideration the effect of continuous location updates, so that a qualified clique can be quickly identified and used to generate the cloaked region when a new request arrives. The efficiency and effectiveness of the proposed ICliqueCloak algorithm are validated by a series of carefully designed experiments, and the results show that the price paid for defending against location-dependent attacks is small, while security and the data management of the database on the user side are maintained.
This paper presents a simple broadband printed Yagi-Uda antenna operating at a resonant frequency of 400 MHz that can be used for Ultra High Frequency (UHF) applications such as Radio Frequency Identification (RFID) systems. The antenna is horizontally polarized and consists of a dipole, a reflector and a set of three directors placed 3 m above the ground. The impedance bandwidth of the proposed antenna is about 21.5 %, the maximum gain in the pass-band frequency range is 16.3 dB, and the return loss is -19 dB. The design formulas and various antenna parameters such as return loss, Voltage Standing Wave Ratio (VSWR), input impedance and gain are examined, and the simulation is carried out using the electromagnetic simulation tool CADFEKO. The results show that the designed antenna is well suited for UHF applications such as RFID systems.
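The paper's own design formulas are not reproduced here, but as a rough companion the sketch below computes common textbook starting dimensions for a Yagi-Uda at 400 MHz; the 0.47 λ driven element, roughly 5 % longer reflector, shorter directors and 0.2 λ spacing are generic rules of thumb, not the dimensions of the proposed antenna.

```python
# Hedged starting-point calculator for Yagi-Uda element lengths (rules of thumb only).
C = 3e8  # speed of light, m/s

def yagi_starting_dimensions(f_hz, n_directors=3):
    wavelength = C / f_hz
    driven = 0.47 * wavelength                      # driven dipole, slightly below lambda/2
    reflector = 1.05 * driven                       # reflector a few percent longer
    directors = [0.45 * wavelength * (0.97 ** i) for i in range(n_directors)]
    spacing = 0.2 * wavelength                      # typical element spacing
    return {"wavelength_m": wavelength, "driven_m": driven,
            "reflector_m": reflector, "directors_m": directors, "spacing_m": spacing}

print(yagi_starting_dimensions(400e6))  # 400 MHz, as in the paper
```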
This paper presents a recent development for monitoring patient health through continuous observation. In today's world of automation the field of biomedicine is no longer aloof: the application of engineering and technology has proved its significance in this field, not only making doctors more efficient but also helping them improve the overall process of medication. The patient monitoring system is a further step in the automation of medical supervision. The basic idea behind this project is that whether a person is at home, on a trip, or at the workplace, he or she can stay connected with the doctor, who can take immediate action if necessary. The telemedicine system provides the solution, continuously supplying the doctor with the heart pulse rate, blood pressure and drug level. As in a hospital, the same system can be used by a person who is not under the continuous observation of a doctor to check his or her vital signs using the sensors in this project; if a sensor output starts fluctuating above the normal range, an indication is immediately sent to the doctor's mobile phone through the GSM network.
Virtualization is rapidly gaining popularity and affects multiple levels of the computing stack. Virtualization decouples the OS from the hardware, and because it provides high flexibility through dynamic reallocation of resources, migration helps with proper load balancing. In this paper we survey different methods used to implement dynamic allocation of virtual machines, which helps improve the overall responsiveness of the system. The paper also presents migration strategy issues, such as when a migration is triggered, which virtual machine should be migrated, and where the destination host of the virtual machine should be.
Negative Bias Temperature Instability (NBTI) is an important lifetime reliability problem in microprocessors. SRAM-based structures within the processor are especially susceptible to NBTI, since one of the pMOS devices in the cell always has an input of '0'. SRAMs are widely used in controller and processor memories and support high-speed implementation. The SRAM exhibits the NBTI problem while storing the data '0', and a large amount of energy is wasted. In this paper we remove this hazard by modifying the SRAM circuit using recovery boosting, and we design a 4-bit SRAM with recovery boosting to improve the read and write ability of the circuit. In this way the efficiency of the circuit is improved through a reduction in power consumption. The design, simulation and layout are carried out using the DSCH and MICROWIND tools.
Mobile ad hoc networks are wireless, infrastructure-less networks. Due to mobility and limited radio range, every node has the dual responsibility of acting as a host for different services and as a router forwarding information. Different routing algorithms, such as DSDV, DSR and AODV, are used for transmitting information. These algorithms were designed without taking security aspects into account, so the transmitted information and the network are vulnerable to different types of attacks. The most popular attack in a MANET is the blackhole attack, which has a severe impact on the network. In this paper we discuss the blackhole attack, its impact, and different techniques for detecting and mitigating its effect on DSR-based MANETs.
Index Terms—MANET, DSR, Black Hole
Today, due to the rapid improvement in communication technology, accessing content through the Internet is easier than ever, and the main difficulty with digital content is protecting it against unauthorized distribution and copying. Audio watermarking has been proposed as a solution to this issue: it is a technique that hides copyright information in an audio signal without affecting the original quality of the signal. This paper proposes an audio watermarking technique with embedding and extraction procedures in the DCT domain, as sketched below.

Keywords— Audio watermarking, Copyright protection, DCT, Embedding, Extraction
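The following is a minimal illustration of the general idea of DCT-domain embedding and extraction (quantization-style embedding of one bit per frame in a mid-band coefficient); the frame size, coefficient index and quantization step are assumptions for illustration and are not the parameters of the proposed scheme.

```python
import numpy as np
from scipy.fft import dct, idct

def embed_bits(signal, bits, frame=1024, coeff=200, delta=0.05):
    """Embed one bit per frame by quantizing a mid-band DCT coefficient."""
    out = signal.astype(float).copy()
    for i, bit in enumerate(bits):
        seg = out[i * frame:(i + 1) * frame]
        c = dct(seg, norm="ortho")
        q = np.round(c[coeff] / delta)
        if int(q) % 2 != bit:            # force the quantized coefficient to the bit's parity
            q += 1
        c[coeff] = q * delta
        out[i * frame:(i + 1) * frame] = idct(c, norm="ortho")
    return out

def extract_bits(signal, n_bits, frame=1024, coeff=200, delta=0.05):
    bits = []
    for i in range(n_bits):
        c = dct(signal[i * frame:(i + 1) * frame].astype(float), norm="ortho")
        bits.append(int(np.round(c[coeff] / delta)) % 2)
    return bits

# Toy usage on a synthetic tone.
audio = np.sin(2 * np.pi * 440 * np.arange(8 * 1024) / 44100)
marked = embed_bits(audio, [1, 0, 1, 1, 0, 1, 0, 0])
assert extract_bits(marked, 8) == [1, 0, 1, 1, 0, 1, 0, 0]
```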
"Data mining is the process of discovering or extracting new patterns from large data sets involving methods from statistics and artificial intelligence. Classification and prediction are the techniques used to make out important data... more
"Data mining is the process of discovering or extracting new patterns from large data sets involving methods from statistics and artificial intelligence. Classification and prediction are the techniques used to make out important data classes and predict probable trend .The Decision Tree is an important classification method in data mining classification. It is commonly used in marketing, surveillance, fraud detection, scientific discovery. As the classical algorithm of the decision tree ID3, C4.5, C5.0 algorithms have the merits of high classifying speed, strong learning ability and simple construction. However, these algorithms are also unsatisfactory in practical application. When using it to classify, there does exists the problem of inclining to choose attribute which have more values, and overlooking attributes which have less values. This paper provides focus on the various algorithms of Decision tree their characteristic, challenges, advantage and disadvantage.
Keywords- Decision tree algorithms, ID3, C4.5, C5.0, classification techniques
"
Distributed arithmetic (DA) is used to design bit-level architectures for vector-vector multiplication with a direct application to the implementation of convolution, which is necessary for digital filters. In this brief, a novel DA-based implementation scheme is proposed for an adaptive finite-impulse-response filter, in which the filter coefficients are updated frequently in order to minimize the output error. Least-mean-square (LMS) adaptation is performed to update the coefficients and minimize the mean square error between the estimated and desired outputs. The scheme reduces the LUT (look-up table) size to one-fourth of the conventional DA LUT based on Anti-symmetric Product Coding (APC) and modified Odd Multiple Storage (OMS).
Keywords - Distributed arithmetic (DA), Anti-symmetric Product Coding(APC) and modified Odd Multiple Storage(OMS), Least-mean-square(LMS).
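A minimal reference model of the LMS coefficient update mentioned above is sketched below, in plain floating-point arithmetic rather than the DA/LUT hardware realization; the step size, filter length and toy identification setup are illustrative assumptions.

```python
import numpy as np

def lms_fir(x, d, n_taps=8, mu=0.01):
    """Adaptive FIR filter trained with LMS: w <- w + mu * e[n] * x_vec[n]."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        x_vec = x[n - n_taps + 1:n + 1][::-1]   # current and past samples, newest first
        y[n] = w @ x_vec                        # filter output
        e[n] = d[n] - y[n]                      # error against the desired signal
        w += mu * e[n] * x_vec                  # LMS coefficient update
    return w, y, e

# Toy system identification: learn an unknown 4-tap channel from noisy observations.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, _, _ = lms_fir(x, d, n_taps=4, mu=0.02)
print(np.round(w, 3))   # should be close to h
```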
Due to the rapid evolution of Internet technology and high-speed networks, the use of digital data has increased. Digital data such as audio, video and images are easily created, copied, processed, stored and distributed among users. To ensure the security and protection of digital data, a technology called digital watermarking has been developed. Digital watermarking embeds additional information in the host image to ensure the security and protection of the digital data without affecting the original data. The purpose of digital watermarking is not to restrict the use of digital data but to provide copyright protection and authentication against unauthorized use. This paper presents a detailed study of the definition and concept of watermarking, the situations in which watermarking should be used, and the features, techniques, applications, challenges and performance metrics of watermarking, together with a comparative analysis of watermarking techniques.

Keywords-Digital Image Watermarking, Wavelet Transform, Discrete Cosine Transform, Attacks
Static and dynamic analysis of biomedical implants and fixators has been carried out using a fatigue-corrosion machine. Millions of patients all over the world undergo surgery for implants and orthopedic fixators. These implants and fixators undergo fatigue [1] and corrosion in the human body due to body fluids. Corrosion reduces the strength of the implants, and the debris dissolves in the blood, leading to further medical complications such as kidney failure. Fatigue occurs in the implants and fixators due to repeated loading and unloading; when corrosion is combined with fatigue, the strength of the material reduces to 10% [5] of its original strength, which may lead to catastrophic failure of the implants and fixators. As a result, several tonnes of 316L stainless steel implant metal are wasted, and even though stainless steel is reproducible, this has an adverse effect on the environment; with the depletion of natural resources it will lead to a scarcity of material for the future population. A fatigue-corrosion machine was designed and fabricated, and the results showed fatigue corrosion taking place in the implants. This paper recommends the possibility of new implant metals such as 304SS [12] and fabrication technologies such as surface modification, which are sustainable and have less effect on the environment.
Our main objective is to implement a monitoring system that monitors the heart pulse of a patient. This work presents a novel, easy-to-use system intended for fast and non-invasive monitoring of the Lead I electrocardiogram (ECG) signal using a wireless steering wheel; the steering wheel used here is a prototype model. A novel heart-rate detection algorithm based on the continuous wavelet transform has been implemented, specially designed to be robust against the most common sources of noise and interference present when acquiring the ECG at the hands. Skin electrodes were used to record the nerve voltages for monitoring the heart pulse. The recorded voltages are sent to an instrumentation amplifier, which amplifies the signal, and then to a filter that removes the noise. The analog signal is then fed to the analog-to-digital converter (ADC) of an Arduino, where the analog voltages are converted to digital values and stored in the Arduino's EEPROM. The stored values are sent to a PC wirelessly via XBee (IEEE 802.15.4), and a serial port is opened in MATLAB using a serial object. A GUI is programmed to make the user interface interactive and simple. Using a real-time plot, the values received by the XBee module are plotted as a running waveform that updates whenever MATLAB sends a query to the Arduino.

Keywords - ECG, Arduino Uno, Zigbee.
Multi-tenancy has shown promising results in achieving high operational cost efficiency by sharing hardware and software resources among multiple customer organizations, called tenants. In the context of cloud computing, this paradigm enables cloud providers to reduce operational costs by dividing resources among tenants and to simplify application management and maintenance. These benefits come with the associated challenges of isolation, dynamic scaling and elasticity. This paper explores these issues in the context of multi-tenant Database-as-a-Service. In addition, we propose a solution, "virtual tenants", to address the problems of elasticity and isolation in multi-tenant database applications.

Index Terms – multitenancy, Cloud computing, database as a service, elasticity.
In many wireless systems where multiuser detection techniques may be applied, the known linear multiuser detectors are designed for communication systems under the additive white Gaussian noise (AWGN) assumption. The possibility and efficiency of their use in systems with non-Gaussian noise therefore remain open questions, since for external noise there is no sound basis for accurately determining its probability distribution. In this paper the performance of several multiuser detectors in the presence of multiple-access interference, Gaussian noise and non-Gaussian noise in code-division multiple-access (CDMA) communication systems is analysed. Simulation results show that the linear multiuser detectors perform worse in the presence of non-Gaussian noise than in AWGN.
Index Terms – multiuser detection, matched filter, decorrelating, MMSE, CDMA.
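For reference, the sketch below illustrates the three linear detectors named in the index terms for a synchronous CDMA model r = S b + n (matched filter, decorrelating, and linear MMSE); the spreading codes, user count and noise level are illustrative assumptions, not the simulation settings of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 4, 16                                             # users, spreading gain
S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)    # unit-energy spreading codes (columns)
R = S.T @ S                                              # cross-correlation matrix
b = rng.choice([-1.0, 1.0], size=K)                      # transmitted BPSK bits
sigma = 0.3

r = S @ b + sigma * rng.standard_normal(N)               # received chip vector
y = S.T @ r                                              # matched-filter (conventional) outputs

b_mf = np.sign(y)                                        # matched filter decisions
b_dec = np.sign(np.linalg.solve(R, y))                   # decorrelating detector
b_mmse = np.sign(np.linalg.solve(R + sigma**2 * np.eye(K), y))   # linear MMSE detector

print(b, b_mf, b_dec, b_mmse)
```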
Classification is a data mining technique used to predict the class membership of patterns. Pattern classification involves building a function that maps the input feature space to an output space of two or more classes. Neural networks (NN) are an effective tool in the field of pattern classification, and their success depends strongly on the performance of the training process and hence on the training algorithm. Many training algorithms have been proposed to improve the performance of neural networks. Usually a traditional backpropagation learning algorithm (BPLA), which minimizes the mean squared error (MSE) cost function over the training data, is used to train neural networks. However, an MSE-based learning algorithm is not robust in the presence of outliers that may pollute the training data. In this work we present alternative cost functions on which the backpropagation learning algorithm can be based, in order to improve the robustness of neural network training by employing a family of robust statistical estimators, commonly known as M-estimators, and hence obtain robust NN classifiers. A comparative study between robust and non-robust (traditional) classifiers is carried out in the paper using a crab classification problem.

Index Terms- Robust Statistics, Feed-Forward Neural Networks, M-Estimators, Classification, Robust classifier.
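As a minimal illustration of replacing the MSE cost with an M-estimator, the sketch below uses the Huber function (one common M-estimator) and its derivative, which would replace the plain residual term in the backpropagation output-layer delta; the threshold value is an illustrative assumption and the paper may use other estimators from the M-estimator family.

```python
import numpy as np

def huber(e, k=1.345):
    """Huber M-estimator cost: quadratic for small residuals, linear for large ones."""
    small = np.abs(e) <= k
    return np.where(small, 0.5 * e**2, k * np.abs(e) - 0.5 * k**2)

def huber_grad(e, k=1.345):
    """d(cost)/d(error): replaces the plain residual used by MSE backpropagation."""
    return np.clip(e, -k, k)

# With MSE the output-layer delta is e * f'(net); with an M-estimator it becomes
# huber_grad(e) * f'(net), so large (outlier) residuals no longer dominate the update.
errors = np.array([0.1, -0.4, 5.0])      # last residual is an outlier
print(huber(errors), huber_grad(errors))
```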
This work presents a performance study of a DC micro-grid in which a voltage droop technique is used to regulate the grid voltage and to control the load sharing between different sources such as photovoltaic cells, fuel cells, a DG set and batteries. A small model of a DC micro-grid comprising micro-sources and loads was implemented in the Matlab environment. Some aspects of centralized (master-slave) and decentralized (voltage droop) control strategies, as well as the procedures for designing the droop controllers, are presented and discussed.

Index Terms—dc-dc converter, dc micro-grid, voltage droop control
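The essence of the decentralized droop strategy is that each converter lowers its voltage reference in proportion to its output current, V_ref = V_nom - R_d * I_out, so sources with smaller virtual droop resistance pick up a larger share of the load. The sketch below illustrates this with invented numbers, not the parameters of the studied micro-grid.

```python
# Minimal DC voltage-droop illustration (illustrative values only).
V_NOM = 380.0          # nominal DC bus voltage, volts

def droop_reference(i_out, r_droop, v_nom=V_NOM):
    """Each converter's voltage reference falls linearly with its output current."""
    return v_nom - r_droop * i_out

# Two sources sharing a load: the one with the smaller droop resistance
# supplies a proportionally larger share of the current.
sources = {"PV": 0.5, "battery": 1.0}          # virtual droop resistances (ohms)
i_load = 30.0                                  # total load current (A)

# At steady state both converters sit at the same bus voltage, so the
# currents split inversely to the droop resistances.
g = {name: 1.0 / r for name, r in sources.items()}
shares = {name: i_load * g_i / sum(g.values()) for name, g_i in g.items()}
v_bus = droop_reference(shares["PV"], sources["PV"])
print(shares, v_bus)
```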
This paper briefly discusses the performance optimization challenges of ad hoc networks and cross-layer congestion control in ad hoc networks. To improve the performance of the wireless network, the MAC layer adapts the transmission data rate based on channel signal-strength information from the physical layer and congestion information from the network layer. The MAC-layer utilization is passed to DSDV as a congestion-aware routing metric for optimal route discovery. The simulations show that rate adaptation in the MAC layer improves network performance in terms of throughput, delivery ratio and packet transfer delay, and that using congestion information from the MAC layer in route discovery further improves performance by balancing the overall network load.
Over the past few years, there has been increasing emphasis on extending the services available on wired public telecommunications networks to mobile, non-wired telecommunications users. At present, only low-bit-rate data services of 100 to 150 kbps are available to mobile network users, yet demand for wireless broadband multimedia communication systems (WBMCS) is increasing, and high-bit-rate transmission of at least several Mbps is necessary for upcoming technologies. If digital data are transmitted at a rate of several Mbps, the delay of the multipath components becomes greater than one symbol time. Adaptive equalization is one possible solution, but there are practical difficulties in operating such equalizers at several megabits per second with compact, low-cost hardware. To overcome this issue and achieve WBMCS, orthogonal frequency division multiplexing (OFDM) is applied as a parallel-data transmission scheme, which reduces the influence of multipath fading and makes complex equalizers unnecessary.
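To make the parallel-transmission idea concrete, the sketch below modulates one OFDM symbol by mapping data onto subcarriers with an IFFT and prepending a cyclic prefix, then demodulates it again; the FFT size, prefix length and QPSK mapping are illustrative assumptions rather than parameters from the paper.

```python
import numpy as np

N_FFT, N_CP = 64, 16        # subcarriers and cyclic-prefix length (illustrative)

def ofdm_modulate(bits):
    """Map bit pairs to QPSK symbols, place them on subcarriers, IFFT, add cyclic prefix."""
    assert len(bits) == 2 * N_FFT
    b = np.asarray(bits).reshape(-1, 2)
    symbols = ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)   # QPSK
    time_signal = np.fft.ifft(symbols, N_FFT)
    return np.concatenate([time_signal[-N_CP:], time_signal])             # prepend CP

def ofdm_demodulate(rx):
    symbols = np.fft.fft(rx[N_CP:], N_FFT)
    bits = np.empty((N_FFT, 2), dtype=int)
    bits[:, 0] = (symbols.real < 0).astype(int)
    bits[:, 1] = (symbols.imag < 0).astype(int)
    return bits.ravel().tolist()

rng = np.random.default_rng(0)
tx_bits = rng.integers(0, 2, 2 * N_FFT).tolist()
assert ofdm_demodulate(ofdm_modulate(tx_bits)) == tx_bits   # loopback check
```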
Multiprocessor system-on-chip is emerging as a new trend in SoC design, but wire and power design constraints are forcing the adoption of new design methodologies. Researchers have pursued a scalable solution to this problem, namely the Network on Chip (NoC). The NoC architecture better supports the integration of an SoC and consists of an on-chip packet-switched network. Here we develop a router for a packet-based protocol; taking its functionality from an actual router, the design is implemented on a single chip using Verilog code.
The router drives an incoming packet arriving at the input port to one of the output ports based on the address contained in the packet. The router has one input port through which packets enter and three output ports through which packets are driven out, together with an active-low synchronous input resetn that resets the router. The idea is borrowed from large-scale multiprocessors and the wide-area-network domain and envisions an on-chip network based on routers. This helps to understand how the router steers traffic from source to destination based on the header address, when data have to be extracted for a particular port, and whether a port is full or empty.
This method removes most of the problems cited above and improves the performance of the router. The most familiar routers are home and small-office routers that simply pass data, such as web pages and email, between the home computers and the owner's cable or DSL modem, which connects to the Internet through an ISP. Routers may also be used to connect two or more logical groups of computer devices known as subnets, each with a different sub-network address.
Multicore hardware systems are proving more efficient with each passing day, and so are the scheduling algorithms for these systems. The potential speedup of applications has motivated the widespread use of multiprocessors in recent years, yet optimal multiprocessor scheduling algorithms remain a challenge for researchers. Out of the number of algorithms proposed and analyzed, we compare and examine three: the classic global EDF, the optimal P-fair algorithm, and the newer LLREF, which builds on the strengths of P-fair. They are compared in terms of task migrations, the required number of scheduler invocations, and the schedulability of a variety of task sets. The results are verified on a set of randomly generated tasks.
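As a point of reference for this comparison, the sketch below shows a single dispatch decision of global EDF, which simply runs the m ready jobs with the earliest absolute deadlines; the job list and core count are invented for illustration, and P-fair and LLREF require considerably more machinery than this.

```python
import heapq

def global_edf_pick(ready_jobs, n_cores):
    """One global-EDF dispatch step: run the n_cores ready jobs with the earliest deadlines.
    ready_jobs is a list of (absolute_deadline, job_id) tuples."""
    return heapq.nsmallest(n_cores, ready_jobs)

# Illustrative dispatch on 2 cores.
jobs = [(12, "A"), (5, "B"), (9, "C"), (7, "D")]
print(global_edf_pick(jobs, 2))   # -> [(5, 'B'), (7, 'D')]
```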
In the software industry a large number of projects continue to fail due to non-technical issues such as communication gaps, requirement problems and poor execution. The authors identify these reasons and show where the available software development life cycles fall short of dealing with them. They also propose a system development approach for the software development life cycle. In this paper, the concept of system development and the SDLC are further explored and a number of related concepts are discussed.
Quality assurance makes sure the project will be completed according to the previously approved specifications, standards and functionality, without defects and avoidable problems. It monitors and tries to improve the development process from the start of the project. Software Quality Assurance (SQA) spans the entire software development process, including software design, coding, source code control, code review, change management, configuration management and release management. In this paper we describe solutions for the key problems of software testing in quality assurance. Existing software practices suffer from problems such as poor testing practices, the attitude of users, and the culture of organizations; these three problems combine into further issues such as shortcuts in testing, reduced testing time, and poor documentation. We recommend strategies that provide solutions to the problems mentioned above.
A simple Log Periodic Dipole Array (LPDA) antenna for Very High Frequency (VHF) applications is discussed in this paper. The antenna is fed using a line feeding technique. A printed LPDA has several dipoles connected together by non-radiating transmission lines, each placed at a finite distance from the next. The dimension of the largest dipole depends upon the lowest frequency of operation. This paper presents the design and simulation of a Log Periodic Dipole Array antenna for 50 MHz using the electromagnetic simulator CADFEKO. Various antenna parameters such as return loss, VSWR, input impedance and gain of the designed antenna are examined. The results show that the designed wideband antenna is well suited for very high frequency applications such as television and FM broadcasting.
Cloud computing is a distributed computing network with the ability to run a program or application on many connected computers at the same time. It is an efficient and scalable network, but maintaining the stability of so many concurrently processing applications is a very complex problem, and load balancing is the solution. Good load balancing makes cloud computing more efficient and improves performance. To improve the load balancing strategy, this paper introduces an algorithm that applies game theory concepts to cloud partitioning.
The aim of this paper is to present a detailed study of the public transport network of the VTCOS city bus service, based on a vehicle occupancy survey, in order to understand the improvement parameters required for the five major routes in Gandhinagar City. Gandhinagar is not only the capital of Gujarat state but also carries all the accolades of being the most developed city of the state. Carrying its chain of developments forward, we try to develop an effective, economical and time-efficient public transport network in Gandhinagar, including the areas that connect the major parts of the city such as educational institutions, government sectors, shopping malls and theatres. It will cover not only the city area but also the areas under GUDA (Gandhinagar Urban Development Authority) and some parts of Ahmedabad city.
At present, bus operation under a Public Private Partnership is awarded to the private operator "VTCOS" strictly for a contractual period of five years only. The capacity of the buses is inadequate and effective transportation is not provided; the service does not cover the major part of the city area and the routes are not effective.
SOI stands for Silicon on Insulator. This type of transistor has a silicon-insulator-silicon substrate, which differs from the conventional MOSFET structure in which a metal layer is used on top of the insulator [1, 2]. Nowadays, the thickness of the oxide of a MOSFET has been reduced from 300 nm to 1.2 nm and even less with technology scaling. If it is reduced further, leakage problems (mainly sub-threshold leakage) come into play [3]. In order to solve this problem and let the technology scale further, a thin silicon strip is placed on the oxide, leading to the next generation of SOI MOSFETs. This provides the added advantage of reduced parasitic capacitance, which improves performance and thereby increases the speed of operation by decreasing delay values.
In this paper, we use the BSIMSOI model to simulate analog circuits using EDA tools such as Cadence and thereby verify the modeling of SOI MOSFETs.
Watermarking is a technique for hiding information inside digital media so that unauthorized persons cannot access it. It is a very important field that helps protect copyrighted digital material, and it is growing rapidly in today's world because information sharing has become much easier due to the Internet. Many researchers are studying and working in this field. Using this technique, watermarked digital data can be accessed and modified by authorized persons only. Different techniques are available for watermarking digital images. This paper provides an analytical survey of digital image watermarking based on the representation domain, with a comparison between frequency-domain and spatial-domain approaches.
A wide-tuning-range Gm-C continuous-time analog filter based on a CMOS operational transconductance amplifier for low-power, wide-tuning-range filter applications is proposed. The transconductor can work from the weak inversion region to the strong inversion region to maximize the transconductance tuning range, and the transconductance can be tuned by changing its bias current. A fifth-order elliptic low-pass filter implemented with the transconductors is proposed and simulated in SPICE using CMOS devices. The filter can operate with a cutoff frequency of 250 Hz to 1 MHz. The wide-tuning-range filter would be suitable for multi-mode applications, especially under the consideration of saving chip area. A third-order inter-modulation of -40 dB was measured over the tuning range with two-tone input signals, and the power consumption is 0.8 mW at a 1 MHz cutoff frequency with a 1.8 V supply. Cost and power consumption are the two most important factors for such products. Digital circuits can benefit from supply-voltage reduction, but analog circuits cannot necessarily decrease their power consumption as the supply voltage decreases, so new basic analog building blocks should be re-designed to meet different low-power specifications. Many previously published papers have focused on improving the speed, linearity, or dynamic range of transconductor circuits.
With the development of mobile communication technology, mobile phones offer not only call functions but also strong features such as dialing records, call recording, and sending and receiving text messages and email. There is, however, no function to measure body temperature, even though body temperature is a basic parameter for monitoring and diagnosing human health. The phone is a portable communication tool whose form factor is increasingly small and ideal to carry, so frequent monitoring of body temperature could easily be provided as an application in the phone. A temperature sensing unit can be included as an integral part of the mobile phone, resulting in an additional feature (body temperature measurement) provided with the phone. This is a very reasonable design for a day-to-day body temperature measurement function, but such products or reports have yet to be seen.
Web applications such as online banking, e-stores, e-commerce and military systems have grown explosively. Such valuable resources are intruded upon by hackers through attacks such as query injection and cross-site scripting. Among these, one of the most common web vulnerabilities is SQL injection, an attack that compromises the database and breaks through the security wall to access protected data. Prior research has implemented a parameterised-query transformation model to prevent SQL vulnerabilities, but this technique may not be sufficient to handle second-order SQL injection vulnerabilities, cannot effectively discover vulnerable spots at the function level, and may produce a high false injection rate. In the proposed system we present a pattern-based query evaluation technique that reduces the false injection rate and makes vulnerable spots easier to handle. The simulation results show that the proposed approach improves the detection of vulnerable spots and offers superior input-time complexity.
This paper presents simulation results for a grid-interactive Uninterruptible Power Supply (UPS) system using fuel-cell and ultra-capacitor storage. The system incorporates a 40 kW grid input supply, a 45 kVA power conditioning unit capable of operating in both inverting and charging modes, and a 16 Ah battery bank. The aim is to demonstrate the capability of the system to provide uninterrupted power, demand-side management and load-voltage stabilization in a grid that experiences frequent blackouts and under/over-voltage problems. Fuel cells (FCs) are being considered as a potential long-term substitute for diesel/gasoline combustion engines in vehicles and emergency power sources; however, the high cost and sluggish dynamic response of FCs remain the main hurdles to wider application. To remedy this, energy storage systems such as ultra-capacitors (UCs) with adequate power capacity have to be incorporated. UCs are in general much faster in charging and discharging and can also help improve the power factor at the point of interconnection. The results demonstrate the effectiveness of the fuel cell and ultra-capacitor in maintaining an uninterruptible supply to the load centres.
Today, the world is moving onto the Internet, which is a collection of web pages, each containing a great deal of information. Finding or retrieving a particular page or piece of information within this mass is a difficult task, and it is hard for a search engine to identify the right web page. To make this task easier, different web page classification methods exist; web page classification is an area of web mining through which web pages can be identified and retrieved based on their content and structure. This paper presents the results of different classification methods and compares them.
This paper describes the concept of harmonics in power systems and explains different methods to reduce harmonic problems. In the power system, harmonics are generated by the use of highly nonlinear devices and degrade the performance of the system, so it is necessary to examine and evaluate the different harmonic problems and introduce suitable solution techniques. The paper first examines the propagation of harmonic currents and voltages in the power system network and assesses their consequences for both utility equipment and end-user components. Through the examination of harmonic waveforms and the idea of cancellation, effective mitigation methods based on filters are introduced. Based on the harmonic distortion, the filter is designed and added to the panels to reduce the harmonic distortion.
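One standard figure of merit used when assessing such problems is total harmonic distortion, THD = sqrt(sum of V_h^2 for h >= 2) / V_1. The short sketch below computes it from a sampled waveform using an FFT; the synthetic 50 Hz waveform and harmonic amplitudes are illustrative assumptions.

```python
import numpy as np

def thd(samples, fs, f0, n_harmonics=10):
    """Total harmonic distortion of a sampled periodic waveform.
    THD = sqrt(V2^2 + V3^2 + ...) / V1, taken from FFT magnitudes at k*f0."""
    spectrum = np.abs(np.fft.rfft(samples)) / len(samples)
    bin_of = lambda f: int(round(f * len(samples) / fs))
    v1 = spectrum[bin_of(f0)]
    harmonics = [spectrum[bin_of(k * f0)] for k in range(2, n_harmonics + 1)]
    return np.sqrt(sum(v**2 for v in harmonics)) / v1

# Synthetic 50 Hz wave with 10% 5th and 5% 7th harmonic -> THD about 0.112.
fs, f0 = 10_000, 50
t = np.arange(0, 1, 1 / fs)
v = np.sin(2*np.pi*f0*t) + 0.10*np.sin(2*np.pi*5*f0*t) + 0.05*np.sin(2*np.pi*7*f0*t)
print(thd(v, fs, f0))
```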
Digital video applications have become more and more popular on mobile terminals such as smartphones, laptops and tablets, but raw video data are impractical to store and transmit. These data must be compressed to a suitable size at the source and decompressed at the destination; compression reduces the data size without excessively reducing the quality of the data. A number of compression and decompression algorithms are used for this purpose, e.g. arithmetic coding (AC), run-length encoding (RLE), Huffman coding, and Shannon-Fano coding. In this paper we discuss, based on previous work, which encoding techniques are most suitable for particular data compression tasks in terms of compression ratio and performance. We find that arithmetic coding is better than the others in terms of compression ratio, but that Huffman coding performs better than all the other encoding techniques.
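As a point of reference for the comparison above, the sketch below builds a Huffman code table for a short string; this is the textbook construction, not the exact variant evaluated in the cited work.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table for the symbols in `data`."""
    heap = [[count, i, {sym: ""}] for i, (sym, count) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        # Prefix the codes of the two cheapest subtrees with 0 and 1.
        for sym in lo[2]:
            lo[2][sym] = "0" + lo[2][sym]
        for sym in hi[2]:
            hi[2][sym] = "1" + hi[2][sym]
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], {**lo[2], **hi[2]}])
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
print(codes, len(encoded), "bits vs", 8 * len(text), "bits uncompressed")
```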
With the advances in wireless technology, mobile vehicular networks are likely to become the most relevant form of MANET. Vehicular ad hoc networks (VANETs) are wireless networks that facilitate information exchange between mobile vehicles without requiring a permanent network infrastructure. Each vehicle captures and disseminates information such as location and speed and processes the information received from other vehicles in the network. Event-driven safety messages have stringent requirements on delay and reliability. In a dense network, a large number of vehicles broadcast beacon messages at high frequencies, so the control channel (CCH) is easily congested, and it is very important to keep the CCH free from congestion. As the number of vehicles increases rapidly, especially in cities whose economies are booming, this situation gets even worse. In this paper we present the detection of traffic congestion using the proposed approach and analyse the results.
The semantic web is the web with a meaning that computers can understand. In order to create a web with semantics, the information available in unstructured or semi-structured web data has to be extracted and converted into a structured form that can be interpreted by computers. Different web mining techniques are used for extracting useful information from web data. In this paper, the main focus is on extracting concepts and conceptual relationships from unstructured textual data using web content mining in order to create an ontology. The ontology can then be used to better serve user queries.
Friction stir welding (FSW) is a well-established yet advanced welding method that can also join non-ferrous materials. The present study examines the effect of the tool pin's depth of penetration. The main parameters, including the tool's rotational speed, feed and axial force, have already been analysed by many researchers, and these parameters are taken into consideration here in order to understand the effect of penetration. It was found that the depth of penetration is directly related to the microstructure of the obtained joint, and that it can be controlled both by the tool itself and by the axial force. This paper aims to correlate the joint parameters with the depth of penetration in order to obtain optimum results.
This research concerns the effect of the FSW tool pin profile on the FSW joint. Previous research has shown that square and tapered cylindrical tool pin profiles give optimum results. The present work will be carried out using different tool pin profiles, namely tapered cylindrical, square, tapered hexagonal, and threaded cylindrical. Test specimens will be prepared from the resulting joints and various tests (tensile and bending) will be carried out to identify the optimum joints. On the basis of these results and the parameters used during the experiments, the effect of the tool pin profile will be understood.
This paper concerns security attacks related to the spreading Internet Protocol version 6 (IPv6). Since IPv6 is not yet the default network protocol, there are no established best practices from the perspective of network administrators, and there is no assurance that the deployed IPv6 protocol stacks and security techniques are free of bugs. This paper therefore surveys most of the known IPv6 security attacks.
The back-propagation (BP) training algorithm is a renowned representative of the iterative gradient descent algorithms used for supervised learning in neural networks. It is designed to minimize the mean square error (MSE) between the actual output of a multilayer feed-forward neural network and the desired output. BP has the great merit of simplicity of implementation and calculation compared with other mathematically complex techniques. It is this simplicity that has attracted researchers over the years, and many improvements and variations of the BP learning algorithm have been reported to overcome its limitations of slow convergence and convergence to local minima. It has been applied to a wide range of practical problems and has successfully demonstrated its power. This paper summarizes basic BP and the gradual improvements over the back-propagation technique used for classification in artificial neural networks (ANN), compares it with newer methods such as genetic algorithms (GA), and shows why it is still effective and where it can still be improved.
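For concreteness, the sketch below trains a tiny one-hidden-layer network with plain back-propagation on the XOR problem; the layer size, learning rate and epoch count are illustrative assumptions, not values from any of the surveyed works.

```python
import numpy as np

# Minimal one-hidden-layer network trained with plain back-propagation (MSE cost) on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(20_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the MSE cost with respect to each parameter.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2))   # should approach [[0], [1], [1], [0]] for most initializations
```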
Recently, Worldwide Interoperability for Microwave Access (WiMAX) has emerged as a major low-cost solution for last-mile communication, and it can also provide an easy solution for Internet access, specifically in rural areas. This paper addresses the problem of wasted allocated bandwidth in a TDD WiMAX network by establishing a mechanism that redistributes the allocated bandwidth between the logical uplink and downlink sub-channels according to the current state of the network and the Quality of Service (QoS) requirements. Through OPNET system simulation, the results verify that the proposed scheme can enhance the overall network throughput, offer a flexible and robust system, and reduce the total WiMAX delay by about 87%.
Semantic web mining is a combination of two areas, the semantic web and web mining, which together improve the results of the World Wide Web. The semantic web can make mining much easier, and web mining can build new structure for the web. Web mining applies data mining techniques to web content, structure and usage. Usage mining methods can profit from an enriched description of the web pages visited, which leads to better utilization of web pages and to recommendation and personalization of websites. Web personalization may include providing recommendations to users, creating new index pages, or generating targeted advertisements using semantic web mining. This paper presents an overview of web personalization using semantic web mining.
Mobile nodes in a Mobile Ad-hoc Network (MANET) have limited power capacity, so energy efficiency combined with good throughput is very important, and improving energy consumption while maintaining throughput remains an open issue. The DREAM (Distance Routing Effect Algorithm for Mobility) protocol, extended for better throughput and energy, maintains information about output per interval and a routing table that records the energy consumed by each individual node. The throughput is the average of the throughputs of all hosts active in the network, and unnecessary flooding is removed from the network. This paper is based on several performance measurements of AOMDV (Ad hoc On-demand Multipath Distance Vector); our results are based on several parameters such as PDR (Packet Delivery Ratio), routing load and node residual energy.
In the world of information technology, the security of data is of prime importance. Security threats that seek to steal information are everywhere, so data protection is required. Cryptography, steganography and watermarking are some of the well-known data protection techniques. Steganography and cryptography are data hiding techniques, while watermarking gives a unique identity to objects such as images, audio and video to protect them from forgery. The benefit of steganography over cryptography is that no one except the sender and the intended recipient can see that a message exists. This survey paper concentrates on steganography techniques, and mainly on those that use a web document as a carrier to hide data. HTML steganography has the benefit that the data do not look suspicious, because HTML web pages are fundamental elements of modern Internet technology and are used in websites everywhere.
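One simple way such carriers are often illustrated is trailing-whitespace encoding, where each line of an HTML file ends with a space for a 1 bit and no trailing space for a 0 bit; the sketch below shows that idea as a generic illustration, not one of the specific techniques surveyed.

```python
def embed(html: str, bits: list[int]) -> str:
    """Hide one bit per line: a trailing space encodes 1, no trailing space encodes 0."""
    lines = html.split("\n")
    assert len(bits) <= len(lines), "carrier too small"
    out = []
    for i, line in enumerate(lines):
        line = line.rstrip(" ")
        if i < len(bits) and bits[i] == 1:
            line += " "
        out.append(line)
    return "\n".join(out)

def extract(html: str, n_bits: int) -> list[int]:
    return [1 if line.endswith(" ") else 0 for line in html.split("\n")[:n_bits]]

page = "<html>\n<body>\n<p>Hello</p>\n<p>World</p>\n</body>\n</html>"
secret = [1, 0, 1, 1, 0]
assert extract(embed(page, secret), len(secret)) == secret
```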
Multi-tenancy, which allows a single application to emulate multiple application instances, has been proposed as a solution to the cost of running many separate instances. By sharing one application across many tenants, multi-tenancy attempts to replace many small application instances with one or a few large instances, thus bringing down the overall cost of IT infrastructure. In this paper we present the importance of multi-tenancy in cloud computing, the degrees of multi-tenancy, and multi-tenancy in databases. This should help in understanding multi-tenancy and its benefits for Software-as-a-Service in a cloud computing environment.
Security at the different layers is a most important and essential requirement in MANETs (mobile ad hoc networks). Here we present a proposed scheme for detecting and recovering from the gray hole attack in an AODV-based MANET, based on a study of the AODV protocol and of the gray hole attack among the network-layer attacks. In a MANET, multiple senders and receivers can communicate with each other at the same time, resources are limited, there is no centralized authority, and the network topology is dynamic; these characteristics make a MANET more vulnerable to different security attacks. Attacks in a MANET are basically active or passive. The gray hole attack belongs to the network layer and is an active attack; active attacks attempt to destroy or alter the data being transferred in the network and can be internal or external. An attack carried out by an internal node of the network is known as an internal active attack, while an attack carried out by a node that does not belong to the network is known as an external active attack. We therefore give an algorithm for detecting and recovering from the gray hole attack.
Cryptography, derived from a Greek word, is the art of protecting information by converting it into an unreadable format. Cryptography is needed to prevent unwanted users from gaining access to the data. This paper surveys various modification approaches applied to the standard RSA algorithm in order to enhance it. RSA provides stronger security than many other algorithms, but its main disadvantage is its computation time, so many researchers have applied various techniques to improve the speed of RSA, and some have added techniques that can also be used for data integrity. This paper presents a detailed study of such techniques and summarizes their results.
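As background for the survey above, the following is a minimal textbook RSA sketch (key generation, encryption, decryption). It is for illustration only and is not one of the enhanced variants discussed in the paper; the primes are toy values and no padding is used.

```python
# Minimal textbook RSA sketch: tiny primes, no padding, not secure for real use.
from math import gcd

def keygen(p: int, q: int, e: int = 65537):
    n = p * q
    phi = (p - 1) * (q - 1)
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (e, n), (d, n)        # public key, private key

def encrypt(m: int, pub) -> int:
    e, n = pub
    return pow(m, e, n)

def decrypt(c: int, priv) -> int:
    d, n = priv
    return pow(c, d, n)

pub, priv = keygen(61, 53, e=17)     # toy primes for demonstration
c = encrypt(42, pub)
assert decrypt(c, priv) == 42
```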
An intelligent farm surveillance system for animal detection refers to video-level processing techniques for recognizing objects in farm video. Many countries use intelligent farm surveillance systems to take care of farms remotely from anywhere. In this paper we study various techniques proposed by different researchers and summarize their results. The paper is based on the essential concepts of segmentation and analysis algorithms for color image segmentation and detection.
Ad-hoc networks are temporarily established wireless networks that do not require a fixed infrastructure; they are also called infrastructure-less networks. Each mobile node functions as a base station and as a router, forwarding packets for other mobile nodes in the network. Among all attacks, the wormhole attack is one of the most dangerous. In this attack an attacker captures packets at one node in the network and sends them to another attacker node at a distant location through a tunnel, which can be established in different ways such as packet encapsulation, high-power transmission, or directional antennas. The wormhole attack is powerful and hard to detect, and it may enable other attacks such as sinkhole or selective forwarding. Using cryptographic techniques alone is not enough to prevent it. In this paper we review some wormhole detection methods and investigate their weaknesses and strengths.
Reducing the energy consumption of wireless devices is paramount to widespread adoption of mobile applications. Cellular communication imposes high energy consumption on mobile devices due to its radio resource allocation, which differs from other networks such as WiFi. Most applications are unaware of the energy consumption characteristics of third-generation cellular communication (3G) and of the Global Positioning System (GPS). This makes the background small data transfers of undisciplined applications an energy burden due to inefficient utilisation of resources. To cover this gap, our work realises an existing energy saving algorithm within the Android platform and measures its energy footprint; classifiers such as linear discriminant analysis, k-nearest neighbour and support vector machines are explored and compared on synthetic traces and on user traces from real-world usage studies. To maximize the lifetime of an ad hoc network, it is essential to prolong each individual node's life by minimizing the total transmission energy consumed for each communication request. This paper proposes techniques that exploit the types of spatial locality available on Android mobile devices. In the experimental results, an average improvement of 24% in energy savings is achieved compared to state-of-the-art prior work on energy-efficient location sensing.
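To make the classifier comparison concrete, the sketch below compares the three classifiers named above on a synthetic trace using scikit-learn. It is an illustrative assumption, not the study's actual features, labels or evaluation protocol.

```python
# Minimal sketch (not the paper's code): comparing LDA, kNN and SVM
# on a synthetic trace using scikit-learn.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                  # hypothetical sensor features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # hypothetical "moving/still" label

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```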
A cognitive radio wireless sensor network is a major area in which cognitive techniques can be used for opportunistic spectrum access. Research in this area is still in progress. The aim of this study is to classify the existing literature of this fast-emerging application area of cognitive radio wireless sensor networks and to indicate open problems, research trends, and open research issues.
In public key cryptography two different keys (a key pair) are used, one for encryption and the other for decryption. The main advantage of this technique is that no one can decrypt the text/message without the corresponding key pair. This paper surveys various improvements to the RSA algorithm obtained by applying various modifications to enhance it. RSA is a highly secure algorithm but has a high computation time, so many researchers have applied various techniques to improve the speed of RSA. This paper presents a detailed study of these techniques and summarizes their results.
Advances in wireless sensor network (WSN) technology have provided the availability of tiny, low-cost sensor nodes capable of sensing various physical and environmental conditions, processing data, and communicating wirelessly. However, the characteristics of wireless sensor networks require more effective methods of data forwarding and processing. In a wireless sensor network the sensor nodes have a limited transmission range, and their processing and storage capabilities as well as their energy resources are also limited. Routing protocols for wireless sensor networks are responsible for maintaining routes in the network, ensuring reliable communication, saving energy resources and increasing network lifetime. We therefore present a survey of routing protocols for wireless sensor networks, compare their strengths and limitations, and identify which are more energy efficient. LEACH and LEACH-C are clustering protocols that provide energy-efficient routing; we consider optimizing the LEACH-C protocol to obtain better network lifetime and energy consumption.
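As background for the LEACH/LEACH-C discussion, the sketch below shows the standard LEACH cluster-head election rule (the threshold T(n) from the original LEACH protocol). It is the textbook rule for intuition, not the paper's optimized LEACH-C variant; parameter values are arbitrary.

```python
# Minimal sketch of the standard LEACH cluster-head election rule.
# p is the desired fraction of cluster heads per round.
import random

def leach_threshold(p: float, round_no: int) -> float:
    """T(n) = p / (1 - p * (r mod 1/p)) for nodes not yet elected in this epoch."""
    return p / (1 - p * (round_no % round(1 / p)))

def elects_itself(p: float, round_no: int) -> bool:
    """A node becomes cluster head if its random draw falls below the threshold."""
    return random.random() < leach_threshold(p, round_no)

# Example: with p = 0.05, roughly 5% of eligible nodes become cluster heads per round.
heads = sum(elects_itself(0.05, round_no=3) for _ in range(100))
print(heads)
```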
Today, at the low end of communication protocols there are two main protocols: the Inter-Integrated Circuit (I2C) and the Serial Peripheral Interface (SPI) protocols. Both protocols are well suited for communication between integrated circuits and with on-board peripherals. SPI is one of the most commonly used serial protocols for both inter-chip and intra-chip low/medium speed data-stream transfer. In conformity with the design-reuse methodology, this paper introduces a high-quality SPI IP with a one-master, one-slave configuration and 8-bit data transfer, incorporating the features required by modern ASIC/SoC applications. The designed SPI is used for communication between different peripherals and a processor in an SoC application, and it is implemented and verified using SystemVerilog to show its code coverage and functional correctness. The RTL design code is written in Verilog for synthesis and the verification code is written in SystemVerilog (IEEE 2005).
Batch adsorption studies were carried out for the removal of chromium using commercial-grade granular activated carbon. Synthetic chromium effluent solutions were prepared using chromium chloride. The effects of various factors such as pH, agitation rate, and initial concentration were examined. The adsorption capacity was found to increase with initial concentration over the range studied (300, 400 and 500 mg/l) and was maximum in the pH range 4-5. From the kinetic and isotherm studies, the pseudo-second-order and Langmuir models were found to fit the experimental data best, with R2 values greater than 0.99. The goodness of fit was checked using MATLAB R2009a.
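For reference, the two models named above take the following standard textbook forms (general expressions only; the fitted parameter values are those reported in the study, not shown here):

```latex
% Langmuir isotherm and pseudo-second-order kinetic model (standard forms)
q_e = \frac{q_{\max} K_L C_e}{1 + K_L C_e}
\qquad
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
```

Here q_e is the equilibrium adsorption capacity, C_e the equilibrium concentration, q_max and K_L the Langmuir constants, and k_2 the pseudo-second-order rate constant.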
In this paper, a fuzzy logic based load frequency control (LFC) system with a Superconducting Magnetic Energy Storage (SMES) unit in a multi-area electric power system is explained. If a large power imbalance suddenly occurs in a multi-area power system, both the generation units and the consumer side are affected by the distortion of the energy balance between the two sides. The imbalance is initially absorbed by the kinetic energy of the system's rotating components, such as turbines, generators and motors, but eventually the frequency changes. Load frequency control is therefore considered one of the most challenging issues in power system control and operation. PID-type controllers are the conventional solution for LFC, and their parameters have traditionally been tuned by hand. In this paper, a PID controller is applied to the LFC problem to stabilize the system after disturbances such as load changes, and the remaining oscillations are damped using SMES. To illustrate the method, a multi-area network with some uncertainties is considered. Finally, the results of the load frequency controller are compared with and without SMES, the effect of varying the PI parameter (Ki) is studied, and the performance of the system with PI and fuzzy logic control (FLC) is examined. The simulation results show the effectiveness of the PID controller in comparison with the FLC controller.
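The sketch below shows a generic discrete PID controller driving a crude first-order frequency-deviation model back to zero after a load step. It is only an illustration of the PID structure discussed above; the gains and the toy plant model are assumptions, not the paper's tuned values or the multi-area system.

```python
# Minimal discrete PID controller sketch (gains and plant are hypothetical).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, error: float) -> float:
        self.integral += error * self.dt
        derivative = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy frequency-deviation loop: drive delta_f back towards zero after a load step.
pid, delta_f = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01), 0.2
for _ in range(1000):
    u = pid.step(-delta_f)                          # control action from frequency error
    delta_f += (-0.5 * delta_f + 0.1 * u) * 0.01    # crude first-order plant
print(round(delta_f, 4))                            # should be close to zero
```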
In today's security-sensitive environment, many biometric technologies, including face recognition, are available for person recognition and are coming of age because of the need to address security concerns in the 21st century. However, a single biometric technique is not adequate for person identification, since it is not both sufficiently accurate and user-acceptable for universal application. The main challenge of face recognition today is handling and implementing its different stages, such as face capture, feature extraction, color segmentation, skin-region detection, and template acquisition and classification, when these are spatially and functionally distributed, with complex hierarchies of security levels and interacting user requirements. An approach based on an innovative multi-agent computing paradigm is promising for face recognition systems deployed in such distributed environments.
The principal goal in designing any encryption algorithm is that it must be secure against unauthorized attacks. The Data Encryption Standard (DES) is a symmetric key algorithm used to secure data; it works on 64-bit data blocks with a 56-bit key. Several enhancements of DES are available: some work by increasing the key length, some use a more complex S-box design, and others increase the number of states in which the information is represented. The DES-96 improved DES security algorithm uses an 84-bit key instead of the original 56-bit key to resist brute-force attack, giving 2^84 ≈ 1.934 * 10^25 trials instead of 2^56 ≈ 7.205 * 10^16. Increasing the key length increases the number of key combinations, which makes a brute-force attack harder for an intruder. As the S-box design becomes more complex, the avalanche effect improves; and as the number of states in which the information is represented grows beyond a binary representation, it becomes harder for an intruder to recover the actual information. The Block Encryption Standard for Transfer of Data algorithms minimize memory requirements and execution time complexity; the total number of combinations required to decipher a 4-byte text is 2^32 * 2^10 * 2^24 * 2^6 = 2^72 units.
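The key-space figures quoted above can be checked directly; the short snippet below verifies them.

```python
# Quick arithmetic check of the key-space figures quoted above.
print(2**84)            # 19342813113834066795298816  (≈ 1.934e25)
print(2**56)            # 72057594037927936            (≈ 7.206e16)
print(2**84 // 2**56)   # 268435456 = 2**28, the brute-force work increase
assert 2**32 * 2**10 * 2**24 * 2**6 == 2**72
```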


Wire EDM (WEDM) is a versatile non-traditional machining process used to cut materials of high hardness and to produce very complex and intricate shapes in a wide variety of materials. Hardness of the material is not a constraint for WEDM, which allows it to cut very hard materials with ease. The parameters that affect the material removal rate (MRR) and surface roughness mainly include pulse-on time, pulse-off time, discharge current, servo voltage, wire tension, and flushing pressure, so it is essential to set the parameters that control the outputs to an optimized condition in order to obtain the maximum output. Work has been carried out to find the optimal parameter settings for maximum MRR and minimum kerf (width of cut) for a nickel-based alloy, Hastelloy C276, a high-temperature, corrosion-resistant alloy, using both Taguchi methods and Grey Relational Analysis (GRA).
This device helps a visually challenged person live like any other person on this planet without a personal guide. It does not require any implant or surgery to enable vision; rather it is a device that can be carried. Nature has an answer for everything: the device makes use of the concept of echolocation used by bats.
People these days are technically advanced and computers are widely available, yet people are more prone to diseases due to the changing environment, and expert consultation is not always available in time. Every disease has an associated set of symptoms, so a system can be developed using neural networks in which patients input the symptoms they observe and the system recognizes the disease from that set of symptoms without the need for a medical expert. The input is the set of symptoms and the output is a disease. The system can be further enhanced with a layer of medicines for the diseases: on recognizing the disease, the system can then also select a medicine. The system can be used in emergency situations where medical consultation is not easily available.
This paper presents a car license plate detection (CLPD) system in which a Vertical Edge Detection Algorithm (VEDA) and a structured-component algorithm are used for accurate identification. The recognition system starts with character identification based on number plate extraction, character splitting and text matching. The system model uses already captured images for the recognition process and different plate images for identifying the characters in the input image. After character recognition, the identified group of characters is compared with number plates in a database for authentication and granting of access.
This study focuses on an advanced mobile security system that provides rapid, highly secure and user-friendly M-commerce transactions. An M-commerce transaction is a multi-step process involving user authentication, merchant authentication, message authentication and secure transmission of payment details, and it must provide availability, reliability and security in every phase. The M-commerce phases involve steps such as offering goods, searching for available goods, ordering the goods, paying, delivering and distributing. The proposed work improves security in user authentication through the Wireless Application Protocol (WAP) gateway, which provides end-to-end security using a double encryption model. In this model, the authentication data and the product order are transferred using the Transport Layer Security (TLS) protocol instead of the Wireless Transport Layer Security (WTLS) protocol, because end-to-end security over an all-IP technology is required to overcome the WAP gateway security breaches. The SSL/TLS protocol is used to transfer data between the mobile terminal and the WAP gateway and between the WAP gateway and the server. WTLS and SSL/TLS use a message authentication code (MAC) to provide data integrity. Our proposed work uses the RC4 algorithm for encryption, since a stream cipher algorithm is better suited here than a block cipher algorithm. Fuzzy logic is applied in the biometric server to decide whether a fingerprint matches, with threshold levels of exactly 100%, 60-99%, or below 60%. Merchant authentication is done by a trusted third party.
Maintenance is one of the main software development activities in terms of allocated resources. As existing software ages, newly developed systems are built to improve upon existing ones; as software evolves, its modularization structure degrades and at some point it becomes a challenging task to maintain the software. Recently, clustering techniques have been used to help with the issues of software evolution and maintenance: it is a well-known fact that a well-modularized software system is easier to understand and maintain. Software module clustering is an important task during maintenance whose main goal is to achieve well-modularized software, and in recent years this problem has been recast as a search-based software engineering problem. All previous work on software module clustering used a single-objective formulation of the problem; that is, the twin objectives of high cohesion and low coupling were combined into a single objective called modularization quality. We introduce a hybrid clustering approach that improves the modular structure of the software system, and we present results comparing it with the existing single-objective formulation on seven real-world module clustering problems. The results of this empirical study show that the hybrid clustering approach produces significantly better solutions than the existing single-objective approach.
The unquestionable relevance of the web in our society has led to an enormous growth of websites offering all kinds of services to users. In the field of software engineering, usability drives software system demand and use. HCI is the study of interaction between humans and computers, and one of its key aspects is usability, which is considered one of the most important quality factors. To help software engineers, sets of usability guidelines are defined for software development. A web application, or web app, is any application software that runs in a web browser and relies on a common web browser to render the application. Usability has been evaluated by taking into account the user's satisfaction using different evaluation methods. To evaluate the usability of web applications, the WUEP (Web Usability Evaluation Process) method is proposed. In this paper we present WebML, a notation for specifying complex websites at the conceptual level, together with different metrics for measuring usability. Additionally, the Sirius framework is defined to perform expert evaluations based on heuristics.
We propose a new model for large-scale software development for products and services with high quality expectations. It is based on investing upfront in the software architecture of the system; designing with software product monitoring and alerting logic in place; end-to-end user experience; experimentation; and quality of service based on Poka-Yoke principles. The basic idea behind developing this new model is to build high-quality software products and services faster, cheaper and better, so that they can scale with demand in various scenarios, deliver an outstanding user experience, and be fail-safe against the SDLC bottlenecks that arise in both conventional and agile software development. The proposed model covers the following areas:
- Get the right Software Architecture in place
- Ensure high quality software is developed
- It is based on POKA-YOKE principles
- Focus is on user experience
- Ensuring the need for the software is identified
- Architecture design follows 12 factor principles
The rapid advancement in the production of inexpensive CMOS devices, extremely small cameras and microphones, and tiny batteries has led to the development of wireless multimedia sensor networks (WMSNs). WMSNs have wide applications in multimedia surveillance networks, environment monitoring and traffic avoidance and control systems, but sensor devices are constrained in terms of memory, data rate and processing capability. When transmitting video using compressed sensing through these WMSNs, the memory and PSNR constraints have a significant effect. With a view to minimizing these constraints, this paper presents a compressed sensing scheme that adopts the Hadamard matrix for dimensionality reduction and uses the CoSaMP (Compressive Sampling Matching Pursuit) recovery algorithm to reconstruct the video. The advantage of the Hadamard matrix over other measurement matrices is that it uses significantly fewer distinct elements, and the CoSaMP algorithm reduces time complexity in comparison with other recovery algorithms.
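The sketch below illustrates the measurement side of such a scheme: building a subsampled Hadamard measurement matrix and taking compressive measurements of one sparse frame vector. It is a toy illustration under assumed sizes, not the paper's pipeline, and the recovery step (for example CoSaMP) would follow separately.

```python
# Minimal sketch: subsampled Hadamard measurement matrix and compressive
# measurements of a single k-sparse frame vector (toy sizes, not the paper's).
import numpy as np
from scipy.linalg import hadamard

n, m, k = 256, 64, 8                      # signal length, measurements, sparsity
rng = np.random.default_rng(0)

x = np.zeros(n)                           # hypothetical k-sparse frame
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

H = hadamard(n) / np.sqrt(n)              # orthonormal Hadamard basis (n must be 2**p)
rows = rng.choice(n, m, replace=False)    # keep a random subset of rows
Phi = H[rows, :]                          # m x n measurement matrix (+-1/sqrt(n) entries)

y = Phi @ x                               # compressive measurements (m << n)
print(y.shape)                            # (64,)
```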
Predicting defect-prone software components is an economically important activity. Software defect prediction work focuses on three areas: 1) estimating the number of defects remaining in software systems, 2) discovering defect associations, and 3) classifying the defect proneness of software components. We describe a software defect prediction methodology that supports an unbiased and comprehensive comparison between competing prediction systems; it comprises 1) scheme evaluation and 2) defect prediction components. Scheme evaluation analyses the prediction performance of competing learning schemes for given historical data sets, while the defect predictor builds models according to the evaluated learning scheme and predicts software defects for new data using the constructed model. In the evaluation stage, different learning schemes are evaluated and the best one is selected; in the prediction stage, the best learning scheme is used to build a predictor from all historical data, and this predictor is finally used to predict defects on new data. The system classifies the defect-proneness of software components into two classes, defect-prone and non-defect-prone.
Secure distance-based localization in the presence of cheating beacon (or anchor) nodes is an important problem in mobile wireless ad hoc and sensor networks. Despite significant research efforts in this direction, some fundamental questions still remain unaddressed: In the presence of cheating beacon nodes, what are the necessary and sufficient conditions to guarantee a bounded error during a two-dimensional distance-based location estimation? Under these necessary and sufficient conditions, what class of localization algorithms can provide this error bound? In this paper, we attempt to answer these and other related questions by following a careful analytical approach. Specifically, we first show that when the number of cheating beacon nodes is greater than or equal to a given threshold, there do not exist any two-dimensional distance-based localization algorithms that can guarantee a bounded error. Furthermore, when the number of cheating  beacons is below this threshold, we identify a class of distance-based localization algorithms that can always guarantee a bounded localization error.
Every company employee has a unique username and password. If a colleague knows the username and password of another employee, there is a chance that company details and important documents will be leaked, so this must be avoided. If a hacker types a username and password into a PC without permission, an intimation message is sent to the authorized user's mobile. The authorized person can then log the account out and change the password of the unique ID through the mobile, so the unauthorized person is not able to take the file or document: the unique ID is logged out and the hacker is unable to access the account the next time.
Communication systems have been marching towards rapid developments due to advancements in science and technology. Wireless sensor networks (WSNs) have been gaining importance recently due to the development of low-cost nodes and electronics, but WSNs always come with constraints such as battery power and limited resources. In order to extend the network lifetime and minimize energy consumption, it is essential to propose techniques that serve this purpose. In clustered sensor networks, cooperative multiple-input multiple-output (CMIMO) can be used to bring diversity among the different nodes in a cluster: all the nodes in the cluster cooperatively transmit the sensed data to the sink or access point (AP). Data aggregation is a further technique that can be combined with CMIMO to reduce the number of bits in the data transmitted to the destination or AP. In any communication, ninety percent of the energy is spent in transmission of the data, so it is essential to compress the data before transmission. In clustered WSNs, nodes that are spatially close to each other sense more or less the same information, for example temperature, pressure or humidity; the resulting redundancy should be removed to enable energy-efficient transmission. The technique combining CMIMO and data aggregation is named CMIMO-A.
This paper deals with a biosensor using a microfabricated array of micromechanical cantilevers, used to detect tuberculosis. The sensor consists of an antibody layer immobilized onto gold-coated cantilevers, which interacts with the antigen. The patient's blood sample is placed on the cantilever surface; if the sample contains the disease-causing antigen, the immobilized antibody binds with it. This antigen-antibody binding increases the surface stress, and the mass added by the binding causes the cantilever to bend. The deflection of the cantilever beams can be detected using techniques such as piezoresistive, piezoelectric or capacitive sensing. Detection of pathogens requires an extremely sensitive cantilever, and the sensitivity of a microcantilever biosensor can be increased by changing the shape of the cantilever. IntelliSuite software is used to analyse the proposed microcantilever with increased sensitivity.
Given users' varied requirements, composition of web services is becoming increasingly important in order to fulfil user demand. Traditional web service composition systems do not support variability in users' requirements, so we introduce the user's requirements into the composition system and extend its modules. In this paper we deal with the issues of reconfigurable service modelling and efficient service composition decision making, design a web services composition framework based on the user's requirements, and describe the composition module in detail. User requirements are captured by defining the semantic information of the web service; defining the user requirement provides trust for web services and extends the ability to understand the exact attributes of web services in all aspects, such as QoS. Choreographies and trust assist composition, along with an automation framework for reconfiguration.
The use of information technology and management systems for the betterment of health care is increasingly important. However, current efforts mainly focus on the informatization of hospitals or medical institutions within those organizations, and very few are directly oriented to patients, their families, and other ordinary people. The growing demand for various medical and public health care services from customers calls for the creation of powerful, individual-oriented, personalized health care service systems. In this paper, we present and implement a software-modelled healthcare Information Service Platform based on such technologies. It can support numerous health care tasks, provide individuals with many intelligent and personalized services, and support basic remote healthcare. In order to realize personalized customization and active recommendation of intelligent services, techniques such as decision making are integrated.
Cloud computing is the use of computing resources that are delivered as a service over a network. Cloud computing can be used in different ways; in the present market it is an attractive option that can be used by everyone. Load balancing is a cost-effective concept in a cloud computing environment and has a strong influence on performance: optimal load balancing makes cloud computing more cost effective and gives satisfying results to users. Load balancing is performed in the public cloud based on the cloud partitioning concept, and different methods are chosen for different situations. In the existing system, the Round-Robin algorithm is used for the idle status of the system and game theory for the normal status, as well as for the load balancing process. When a large number of clients send requests to the cloud at the same time, traffic builds up and tasks cannot be scheduled to the partitions; to avoid this traffic we use the KP (Koutsoupias, Papadimitriou) model. By its nature, the KP model can be used to reduce the compilation of tasks received from users so that tasks can be scheduled to partitions without facing traffic.
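For intuition only, the sketch below shows the round-robin dispatch of incoming requests to cloud partitions mentioned above. The partition names and request identifiers are hypothetical, and the KP-model scheduling is not represented here.

```python
# Minimal sketch (hypothetical, not the paper's scheduler): round-robin
# dispatch of incoming requests to cloud partitions.
from itertools import cycle

partitions = ["partition-A", "partition-B", "partition-C"]   # hypothetical names
rr = cycle(partitions)

def dispatch(requests):
    """Assign each request to the next partition in round-robin order."""
    return [(req, next(rr)) for req in requests]

print(dispatch(["req1", "req2", "req3", "req4"]))
# [('req1', 'partition-A'), ('req2', 'partition-B'),
#  ('req3', 'partition-C'), ('req4', 'partition-A')]
```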
Traditional service discovery methods are based on centralized UDDI registries and can easily suffer from performance bottlenecks such as single-node failures and load-balancing problems. To address these problems, many approaches based on distributed architectures have been proposed, such as peer-to-peer decentralized service discovery approaches (CAN, Chord, Chord4S, etc.), which in turn suffer from routing inefficiency and cause overhead delays. Hence, a new hybrid approach that combines a decentralized peer-to-peer system with a centralized global repository is defined to address these problems. Based on Chord4S, it also supports service queries with wildcards. Load balancing and data availability are further improved by distributing domain-specific services to their respective peers, and a Request Mediator is introduced to reduce the overhead delay and improve the performance of the system.
As the number of available web services increases, the problem of discovering and selecting the most suitable service has become complex. A new discovery mechanism is needed, since existing discovery techniques fail to find the similarities between web service capabilities. A new discovery framework is proposed in this paper that includes techniques such as part-of-speech tagging, lemmatization, and word sense disambiguation. After detecting the senses of relevant words gathered from web service descriptions and from the user's query, a matching process takes place and displays the relevant services. The proposed technique is implemented and its components are validated using test samples; the results of this experiment suggest that the proposed framework will have a positive impact on the discovery process.
Web services are gaining momentum in academia and industry as they hold the promise of developing loosely coupled business processes; this momentum is witnessed in the widespread adoption of web services in multiple research projects and application domains. An LCS is a dynamic collaboration between autonomous web services that collectively provide a value-added service. The change-reaction process goes through a set of steps, such as discovering web services based on the requirement and regenerating a collaboration among outsourced web services. A short-term LCS has a very limited lifetime: once the goal is reached, the collaboration among its component services is dissolved. A framework is presented to detect and react to the exceptional changes that can arise inside a workflow-driven web application. The provisioning of web services drastically reduces the capital required to start a business, since web services are readily available for integration and orchestration; LCSs enable wide integration of business entities for the business organization. Web services have distinguishing features such as global availability and standardization.
Enterprise service architecture creates an IT environment in which standardized components work together to reduce complexity. Service discovery finds a web service that meets our requirements and hence describes a particular service. Since there is no complete and clear understanding of the behaviour of a service, an ontology is used to provide relevant services to the end user. Even though the Web Service Definition Language (WSDL) and Universal Description, Discovery and Integration (UDDI) standards are used to define service interfaces and service registries, they do not give a service consumer enough basis for a full understanding of the behaviour of a service. An ontology is generic knowledge that represents agreed domain semantics which can be reused by different kinds of applications or tasks, and it is an efficient way to provide a clear notion of a service. Hence, a generic service specification framework for an enterprise using ontology is proposed to give a clear view of the service provided and its functionality. The service specification framework is based on a founded theory, the ψ theory, which can be applied both to specifying human services (services accomplished by human beings) and IT services (services accomplished by IT systems). The ψ theory is about the operation of organizations and the communication between, and production by, social actors, which underlies the notion of services.
The purpose of this paper is to provide prioritization techniques that are effective in improving the rate of fault detection. In this paper, dependency structure algorithms are used to make application testing more efficient and reliable; the priority is based on a graph coverage value. Test case prioritization techniques organize the test cases in a test suite by ordering them such that the most beneficial are executed first, thus increasing the effectiveness of testing. One of the performance goals, the fault detection rate, is a measure of how quickly faults are detected during the testing process. Previous work on test case prioritization demonstrates system-based evaluation, but these approaches do not consider application-based testability. The nature of our techniques preserves the dependencies in the test ordering. Experimental evaluation on two applications indicates that our techniques offer a solution to the prioritization problem in the presence of test cases with dependencies: both applications were tested over different code and show better results than the existing approach, demonstrating a cost-effective technique. The results of both experiments provide clear evidence that dependency structure algorithms have the potential to be used to prioritize test suites. Based on our experimental results and analysis, we can also comment on how well prioritization may perform on much larger programs; large programs in which the output values are very sensitive to each individual computation are likely to show even greater improvement with our approach.
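To make the coverage-based intuition concrete, here is a minimal "additional coverage" greedy prioritization sketch. This is a standard strategy shown only for orientation; it is not the paper's dependency structure algorithm, and the test names and coverage sets are hypothetical.

```python
# Minimal greedy coverage-based prioritization sketch (illustrative only).
def prioritize(coverage: dict[str, set[str]]) -> list[str]:
    """Order tests so each next test adds the most not-yet-covered elements."""
    remaining, covered, order = dict(coverage), set(), []
    while remaining:
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        covered |= remaining.pop(best)
        order.append(best)
    return order

tests = {
    "t1": {"n1", "n2"},
    "t2": {"n2", "n3", "n4"},
    "t3": {"n5"},
}
print(prioritize(tests))   # ['t2', 't1', 't3']
```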
Software complexity and malware are increasing exponentially, since the number of web users is increasing exponentially; exhaustive and extensive testing of websites has therefore become a necessity. But testing a website can never be 100% exhaustive, due to the page explosion problem. The basis test paths obtained from Page-Test-Trees (PTTs) are reused for white-box testing of websites. This saves a significant amount of the time required to generate test paths, and hence test cases, compared with existing white-box testing approaches, and it also minimizes cost and effort. The proposed technique ensures better website testing coverage, as white-box testing provides better results than black-box testing.
Software testing is one of the most important techniques used in industry. In the testing process, the targeted application is tested in order to verify whether the actual result matches the expected result as per the requirements. Some of the crucial challenges confronted by software testers during regression testing are the lack of test cases and test data for the modified version of the application. To overcome this problem, we propose a method to generate an optimized test suite based on a comparison of the initial and modified versions of the software. The approach has advantages in terms of code coverage and the number of iterations required.
Static test generation (STG) is a technique used to generate a test paper automatically according to the user's specification. The generated test paper is attempted by the user over the web and submitted for solution checking; STG automatically checks the solution and publishes the result to the user. Tabu Search (TS) is the algorithm used to improve the quality of the constructed test paper. It allows further changes to the question paper after the test paper is generated: the administrator can add and update individual questions after the question paper has been generated. The system also provides feedback to the user automatically according to their performance.
Large applications such as web servers (e.g. Apache), databases (e.g. MySQL) and application servers (e.g. Tomcat) are required to be customizable so that they can adapt to particular runtime contexts and application scenarios. One way to support software customization is to provide configuration options through which the behaviour of the system can be controlled, but the configuration spaces of modern software systems are too large to test exhaustively. The proposed method, traditional combinatorial interaction testing, samples covering arrays and test cases in order to test such highly configurable applications. In combinatorial interaction testing, configuration options are generated and test cases are then applied to each configuration; this can cause a masking effect, in which some relevant configuration options are skipped. The resulting system targets highly configurable software using traditional combinatorial interaction testing, which generates test cases with configuration options and uses test-case-specific constraints and seeding to avoid the masking effect.
A time-dependent landmark graph is used to model dynamic traffic patterns as well as the intelligence of experienced drivers, so as to provide a user with the practically fastest route to a given destination at a given departure time. A variance-entropy-based clustering method is used to estimate the distribution of travel time between two landmarks in different time slots. Based on this graph, a two-stage routing algorithm computes the practically fastest and customized route for end users.
The reliability of a software system can be enhanced through the Reliability Growth Testing (RGT) process. Considering the impact of reliability on the success or failure of a software product, it is essential to model the RGT process using Software Reliability Growth Models (SRGMs). A large number of SRGMs have been proposed by researchers, but the availability of software tools for this purpose is limited. Four flexible SRGMs proposed by Subburaj et al. seem to fit software failure data adequately irrespective of wide fluctuations in time between failures, the presence or absence of a learning phenomenon in the testing team, and the quality of debugging. A software tool that facilitates estimation of the parameters of these four models has been developed by the authors. The new SRGM tool has a graphical user interface (GUI) and is built with Java and the R language. This dedicated Open Source Tool (OST) is found to be as accurate as general-purpose tools such as MATLAB, and it will help software reliability professionals apply SRGMs in their projects easily and obtain accurate results.
Phishing is a major attack in online banking and is carried out through web spoofing. This paper proposes an Anti-Phishing Prevention Technique (APPT) based on the concept of preventing phishing attacks by using a combination of a one-time random password and an encrypted token for user machine identification. The method starts by delivering the password by SMS or to an alternate email address. During login, the end user requests the password from the server, and the request contains the encrypted token; if the end user is valid, the password together with the encrypted token is sent through SMS or email. Using the login ID and the one-time password, the user can access the website. To generate the encrypted token, an X.509 certificate (X509Certificate2) uses the IP address, and the token is encrypted with the RSA algorithm.
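The sketch below illustrates the two ingredients named above: generating a one-time random password and deriving a token bound to the client's IP address. It is a hypothetical illustration, not the paper's implementation; in particular, a hash stands in for the paper's RSA-encrypted token.

```python
# Minimal sketch (hypothetical): one-time password plus an IP-bound token.
import secrets, hashlib

def generate_otp(length: int = 6) -> str:
    """Numeric one-time password suitable for SMS/email delivery."""
    return "".join(secrets.choice("0123456789") for _ in range(length))

def machine_token(ip_address: str) -> str:
    """Opaque token derived from the client IP plus a random nonce.
    (The paper encrypts its token with RSA; a hash stands in here.)"""
    nonce = secrets.token_hex(8)
    return hashlib.sha256(f"{ip_address}:{nonce}".encode()).hexdigest()

print(generate_otp())                   # e.g. '483920'
print(machine_token("203.0.113.7"))     # 64-hex-character token
```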
A keylogger is a highly specialized tool designed to record every keystroke made on a machine, giving the attacker the ability to silently steal large amounts of sensitive information. The primary objective of this project is to detect keylogger applications and prevent data loss and leakage of sensitive information. The project aims to identify the set of permissions and the storage level owned by each application and hence differentiate applications with proper permissions from keylogger applications that can abuse permissions. This technique of detecting keyloggers is completely black-box: it is based on behavioural characteristics common to all keyloggers and does not rely on the internal structure of the keylogger. The paper intends to develop a machine-learning-based keylogger detection system for mobile phones to detect malware applications.
A wireless sensor network is an emerging technology that consists of a number of sensor nodes that sense various parameters in their workspace. These nodes are operated with the help of batteries, which in most cases are not replaceable, so during the design of such networks it is essential that the sensor nodes consume as little energy as possible. Three transmission mechanisms, 1) cooperative communication, 2) a coalition-based data transmission mechanism, and 3) cluster-based data transmission, are discussed in this paper; they allow energy-efficient and at the same time reliable data transmission from the source node to the destination node. The first mechanism depends on two parameters, namely the number of cooperative nodes and the end-to-end error probability; the second and third mechanisms depend on the number of cooperative nodes required. The mechanisms are studied and their corresponding results are simulated.
This paper presents the design of a private key algorithm based on 2-dimensional cellular automata. An initial implementation of the stream cipher is done using MATLAB to analyse its functionality and security. With advances in computing technology, applications such as mobile communication, PDAs and navigational devices are now a part of everyone's life, but these devices are restricted in computational power, power consumption, memory storage and data rate, and they need security services such as confidentiality, data integrity and authentication.
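For orientation only, the sketch below shows the general idea of using a 2-D cellular automaton as a keystream generator for a stream cipher: evolve the grid, read off bits, and XOR them with the plaintext. The update rule, grid size and seed are arbitrary assumptions and not the paper's cipher, and this toy construction makes no security claim.

```python
# Minimal sketch (illustrative only, not the paper's cipher): a 2-D cellular
# automaton as a keystream generator, XOR-ed with the plaintext.
import numpy as np

def step(grid: np.ndarray) -> np.ndarray:
    """One CA step: each cell becomes the XOR of its 4 von Neumann neighbours."""
    return (np.roll(grid, 1, 0) ^ np.roll(grid, -1, 0) ^
            np.roll(grid, 1, 1) ^ np.roll(grid, -1, 1))

def keystream(seed: np.ndarray, nbytes: int) -> bytes:
    grid, out = seed.copy(), bytearray()
    while len(out) < nbytes:
        grid = step(grid)
        out.append(int("".join(map(str, grid.flatten()[:8])), 2))  # 8 cells -> 1 byte
    return bytes(out[:nbytes])

seed = (np.random.default_rng(42).random((16, 16)) > 0.5).astype(np.uint8)  # key/seed
plaintext = b"hello"
ks = keystream(seed, len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, ks))
recovered  = bytes(c ^ k for c, k in zip(ciphertext, ks))
assert recovered == plaintext
```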
Mobile ad-hoc networks (MANETs) are an emerging area of mobile computing. Various challenges are faced in an ad-hoc environment, mostly due to the resource poverty of these networks. They are usually set up in emergency situations, for temporary operations, or simply where there are no resources to set up elaborate networks, and they therefore raise new requirements and problems in all areas of networking. The solutions for conventional networks are usually not sufficient to provide efficient ad-hoc operation. The wireless nature of communication and the lack of any security infrastructure raise several security problems; with little protection against tampering, nodes are susceptible to compromise, so the networks are vulnerable to denial-of-service (DoS) attacks through compromised nodes or intruders. Many denial-of-service attacks are possible in a MANET, and one of them is the flooding attack, in which a malicious node sends useless packets to consume valuable network resources; flooding attacks are possible in almost all on-demand routing protocols. This paper therefore introduces an opportunistic routing technique to forgo this attack; the opportunistic routing takes into account the relative velocity rather than the distance between nodes.
A wireless sensor network (WSN) consists of several sensor nodes deployed in inaccessible areas to monitor temperature, pressure, vibration, sound, motion, etc. WSNs are used in a variety of applications such as military, civil, industrial automation, medical, home automation, fleet monitoring, habitat monitoring and theft prevention. The availability of inexpensive hardware such as CMOS cameras and microphones has led to the development of wireless multimedia sensor networks (WMSNs), which are used for image and video applications. In video applications the captured data is too large to be transmitted as-is, so it has to be compressed before transmission. Compression in traditional video encoding makes use of motion estimation and motion compensation, which require intensive operations that lead to significant energy consumption and high storage requirements. This drawback can be addressed by compressed sensing, an emerging technique that directly obtains the desired samples, thereby reducing the energy consumption, storage capacity and bandwidth used in the network. It reconstructs a signal from M << N measurements obtained from sparse or compressible signals, where N is the number of samples required for Nyquist sampling. Compressed sensing can overcome the drawbacks of traditional video encoders by simultaneously sensing and compressing the data at low complexity; the original signal can be recovered from the measurements using basis pursuit or greedy algorithms. The objective of this paper is to implement a video compressed sensing framework using a Gaussian measurement matrix, reconstruct the video using the Orthogonal Matching Pursuit algorithm, and analyse the transmission energy of the video compressed sensing framework.
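To illustrate the recovery step named above, here is a minimal Orthogonal Matching Pursuit (OMP) sketch with a Gaussian measurement matrix. It is a toy, noise-free example with assumed sizes and sparsity, not the paper's full video framework or its parameter choices.

```python
# Minimal OMP sketch with a Gaussian measurement matrix (toy sizes).
import numpy as np

def omp(Phi: np.ndarray, y: np.ndarray, k: int) -> np.ndarray:
    """Recover a k-sparse x from y = Phi @ x by greedy support selection."""
    n = Phi.shape[1]
    residual, support = y.copy(), []
    x_hat = np.zeros(n)
    for _ in range(k):
        correlations = np.abs(Phi.T @ residual)            # match residual to columns
        support.append(int(np.argmax(correlations)))        # pick best new atom
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs              # update residual
    x_hat[support] = coeffs
    return x_hat

rng = np.random.default_rng(1)
n, m, k = 128, 60, 5
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)     # k-sparse test signal
Phi = rng.normal(size=(m, n)) / np.sqrt(m)                   # Gaussian measurement matrix
y = Phi @ x                                                  # compressive measurements
print(np.allclose(omp(Phi, y, k), x, atol=1e-6))             # should print True here
```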
The capability of annotating pictures by computer can lead to breakthroughs in a wide range of applications, including Web image search, online picture-sharing communities, and scientific experiments. The image retrieval work described here is the first technique to achieve real-time performance with a level of accuracy useful in certain real applications, and the first attempt to manually assess the large-scale performance of an image annotation system. Image retrieval is a particular case of information retrieval that adds more complex mechanisms: visual content analysis and/or additional textual content. Image auto-annotation is a technique that associates text with images and permits image documents to be retrieved as textual documents, as in information retrieval; it is therefore an effective technology for improving image retrieval. In our implementation, in addition to the conceptual information provided, we integrate ranking features so that images are ranked automatically based on their metadata features; the user is also given the option of ranking the results manually.
A traditional enterprise BI solution has the capability to extract internal organizational data and transform it into useful information, but it ignores the importance of data available in social media. In recent times, social networks have become virtual business goldmines. The organizations best able to differentiate themselves are those that can incorporate social media analytics into their customer processes, monetize their investments, and integrate the resulting insight into their customer data, building relationships with new customers and stronger connections with existing ones. This work characterizes an e-commerce framework for fetching data from social networks using the QlikView tool, and proposes the implementation of the principal elements of an e-commerce solution. It is becoming clear that BI can enable better decision-making when corporate data is analyzed in integration with social media and real-time data.
Large datasets are being outsourced with the help of data owners and mining experts. Such datasets are mined to extract hidden knowledge and patterns that assist decision makers in making effective, efficient and timely decisions in an ever more competitive world. In general, the owner must define usability constraints manually to preserve the knowledge contained in the dataset. This paper aims at creating a framework to define usability constraints in an automated fashion. The proposed formal model enables a data owner to define usability constraints, to preserve the knowledge contained in the dataset automatically, and to provide security to the outsourced dataset. We implemented and tested our model on different datasets. Our model not only preserves the knowledge contained in the datasets but also significantly enhances their security compared with the existing system.
Advancements in technology lead to many new inventions. Among them, face detection plays a key role in image processing: the process analyses skin colour in an image and returns the face location. Earlier, personal computers or digital cameras were used to detect faces, but because of their limitations in computation, power capacity and speed, embedded smart cameras and mobile devices with built-in cameras are now preferred over general-purpose workstations. In this work we merge the design and implementation of Ma's algorithm and the Pyramid-like FAce Detection (P-FAD) algorithm into a real-time face detection and recognition system built on general embedded devices. P-FAD is a hierarchical detection scheme with a three-stage coarse, shift, and refine procedure that yields low computational overhead and high speed and accuracy without sacrificing performance. Our proposed approach scans the target image fully and identifies or verifies criminal faces through a multi-layered algorithmic process combining P-FAD and Ma's algorithm. The client side loads or crops the image, pre-processes it, and sends it to the relevant crime or police department over a secured path such as MMS, Bluetooth or e-mail. The received image is then matched against the existing images in the database using matching algorithms. The system is implemented on Android phones and on a laptop with an embedded web camera. The proposed methodology is tested with image detection using a template matching algorithm, and the results prove to be efficient.
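As a hedged illustration of only the final database-matching step (not of P-FAD or Ma's algorithm themselves), the sketch below compares a probe face image against stored images with OpenCV's normalized cross-correlation template matching; the file paths and threshold are placeholders.

```python
# Compare a probe face against database images with template matching.
import cv2

def match_against_database(face_img_path, database_paths, threshold=0.7):
    face = cv2.imread(face_img_path, cv2.IMREAD_GRAYSCALE)
    best_score, best_match = -1.0, None
    for path in database_paths:
        candidate = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if face is None or candidate is None:
            continue
        # resize the candidate to the probe size so the comparison is like with like
        candidate = cv2.resize(candidate, (face.shape[1], face.shape[0]))
        score = cv2.matchTemplate(face, candidate, cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_score, best_match = score, path
    return (best_match, best_score) if best_score >= threshold else (None, best_score)

# Example usage with placeholder paths:
# print(match_against_database("probe_face.png", ["db/suspect1.png", "db/suspect2.png"]))
```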
Comparing entities is an important part of decision making. Several approaches have been reported for mining comparable entities from Web sources to improve the user experience of comparing entities online. However, these efforts extract only entities explicitly compared in the corpora, and may exclude entities that occur less frequently but are potentially comparable. A prerequisite step of this task is finding the comparable entities. In this paper, we propose a novel bootstrapping algorithm that mines comparable entities from comparative questions in a question collection (e.g., questions users posted online). A weakly supervised bootstrapping method is used to identify comparative questions and comparative patterns and to extract comparable entities. The comparable entities can be used to help users make alternative decisions by comparing relevant mined entities, instead of providing mere recommendations, as is done currently. The bootstrapping technique extracts entities bearing a specific relation; however, our task differs from previous work in that it requires not only extracting entities but also ensuring that they are extracted from comparative questions (comparative question identification), which is generally not required in information extraction. Finally, a ranking of the listed entities is provided based on the user's perspective. Experimental results demonstrate that our proposed framework outperforms the baseline systems.
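A toy sketch of the bootstrapping loop follows, under the simplifying assumption that patterns are whole-question templates with two entity slots (the paper's patterns and classifier are more elaborate): seed pairs induce patterns from comparative questions, and the patterns then extract new comparable pairs.

```python
# Toy bootstrapping: seed pairs -> patterns -> new comparable pairs.
import re

questions = [
    "which is better, python or java?",
    "which is better, iphone or android?",
    "should i buy a laptop or a tablet?",
    "should i buy a canon or a nikon?",
]
seed_pairs = {("python", "java")}

def induce_patterns(questions, pairs):
    """Replace known entity pairs with capture slots to obtain reusable patterns."""
    patterns = set()
    for q in questions:
        for a, b in pairs:
            if a in q and b in q:
                patterns.add(re.escape(q).replace(re.escape(a), r"(\w+)", 1)
                                         .replace(re.escape(b), r"(\w+)", 1))
    return patterns

def extract_pairs(questions, patterns):
    found = set()
    for q in questions:
        for p in patterns:
            m = re.fullmatch(p, q)
            if m and len(m.groups()) == 2:
                found.add((m.group(1), m.group(2)))
    return found

patterns = induce_patterns(questions, seed_pairs)
print(extract_pairs(questions, patterns))   # also finds ("iphone", "android")
```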
The largest challenge in software development is to ensure that the software is correct and easy to maintain and extend. Identifier names are one of the most important sources of information about program entities: they constitute the majority of tokens in source code and are the primary source of conceptual information for program comprehension, so the quality of identifier names can influence the quality of source code. One approach to conveying meaning to programmers and tools is to expand abbreviations into full words. Problems related to equally typed arguments are hard to analyze because they involve the semantics of the program, which is not explicit in the source code but exists only in the mind of the programmer. Our goal is to statically detect anomalies involving equally typed method arguments. The approach leverages identifiers to infer the semantics of arguments and their intended positions. It is an automatic, mostly language-agnostic static analysis that detects anomalies in the order of equally typed method arguments and reports warnings about potentially erroneous call sites of methods with such arguments. We evaluate our approach on mature and well-tested Java programs. Short and meaningless names not only confuse programmers but also prevent our analysis from inferring the semantics of arguments, so our work preprocesses naming examples before searching for anomalies.
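A minimal sketch of the core check, assuming argument and parameter identifier names are already available (name extraction and abbreviation expansion are outside this snippet): a call whose equally typed arguments match the callee's parameter names better after swapping is flagged. The similarity measure and margin are illustrative choices, not the paper's.

```python
# Flag call sites whose equally typed arguments look swapped.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def swapped_argument_warning(param_names, arg_names, margin=0.2):
    """Return (i, j) if swapping arguments i and j improves the total
    name-to-parameter similarity by more than `margin`, else None."""
    base = sum(similarity(p, a) for p, a in zip(param_names, arg_names))
    n = len(arg_names)
    for i in range(n):
        for j in range(i + 1, n):
            swapped = list(arg_names)
            swapped[i], swapped[j] = swapped[j], swapped[i]
            score = sum(similarity(p, a) for p, a in zip(param_names, swapped))
            if score - base > margin:
                return i, j
    return None

# void copy(String source, String target) called as copy(targetPath, sourcePath)
print(swapped_argument_warning(["source", "target"], ["targetPath", "sourcePath"]))
# -> (0, 1): the arguments appear to be in the wrong order
```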
The sun produces an unbelievable amount of energy that reaches the earth: the energy absorbed by the earth in one hour is more than mankind uses in one year, and the total solar energy reaching the earth in one year is twice as much energy as has ever existed in all sources of coal, oil, natural gas, and uranium combined. With its abundance of sunlight, India has tremendous potential to emerge as one of the leaders in solar power generation. The Government of India's policy for the solar sector, the Jawaharlal Nehru National Solar Mission (JNNSM), has set a target of 20 GW of solar installations by 2022. India is endowed with vast solar energy potential: about 5,000 trillion kWh per year is incident over India's land area, with most parts receiving 4-7 kWh per sq. m per day. Solar energy intensity varies geographically across India, but the western Thar region of Rajasthan receives the highest annual solar radiation; this region is blessed with abundant natural resources and more than 325 days of sunshine every year, and Jodhpur is popularly known as the Sun City of India. A dedicated 400 kV network with an associated strong 220 kV and 132 kV transmission network has been created in the Barmer, Jaisalmer, Jodhpur and Bikaner area; indeed, Rajasthan is the only state in India that has established a strong power evacuation network in a desert area. This paper therefore discusses the current status, the various issues, and the regulatory policies and incentives for the promotion of solar PV power parks in Rajasthan, along with a case study, site report and geotechnical investigation for a photovoltaic solar power plant.
Keywords - PV Solar Power, Renewable Energy, JNNSM, Research and Development in Solar Projects of Rajasthan.
Research Interests:
India is blessed with rich solar energy and, if exploited efficiently, the country has the potential of producing trillions of kilowatt-hours of electricity. Sunlight is converted to electricity directly when it falls on solar photovoltaic (SPV) modules. Systems and devices for various applications are built from SPV modules connected to suitably designed power conditioning units to meet electricity requirements. These systems and devices are designed to work in off-grid mode, usually supported with batteries to allow use when sunlight is low or at night. In recent years solar PV systems have become viable and attractive; utility-scale plants, typically set up on the ground surface, are being installed worldwide under promotional mechanisms. The available rooftop area on buildings can also be used for setting up solar PV power plants, thus dispensing with the requirement for free land. The electricity generated from SPV systems can also be fed to the distribution or transmission grid after conditioning to suit grid integration. Currently, the whole world is in the midst of an energy revolution that is fundamentally changing the future of rural electrification. We therefore present in this paper a review of today's policy and the status of grid-connected rooftop PV systems in Rajasthan.
Research Interests:
The objective of this study is to identify barriers to supply chain management (SCM) in manufacturing organizations through a systematic literature review of the past ten years, and to identify the most critical barriers that hinder supply chain performance. A list of barriers identified through a comprehensive literature review is presented here. The paper identifies 23 key SCM barriers, which can help industrial practitioners and academic experts implement SCM. Nowadays manufacturing organizations compete supply chain against supply chain rather than organization against organization; effective supply chain management helps an organization secure its position in this competitive environment and improves organizational performance.
Research Interests:
This project describes the development of a motorcycle security system that uses a microcontroller to detect theft and inform the owner via mobile phone when a theft occurs. The system protects the motorcycle from theft and provides motorcyclists with a reliable security system at an affordable price. The microcontroller is the interface between the GSM module and the vehicle: it drives the module for message forwarding and is programmed to switch OFF the engine once it receives a message from the user, which is significant for preventing vehicle hijack. The process happens in a short period of time and hence proves to be vital for theft control. For security, an EEPROM is interfaced to store the password, and the coordinates (latitude and longitude) obtained from GPS, i.e., the vehicle location, are also sent to the user. Microcontrollers are small and simple enough to include all the required functions on a single chip; using one is of great benefit, as it has a low design cost and adds intelligence to the system.
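As a hedged prototype sketch of the alerting step only, written in Python with pyserial for a Linux-capable board rather than as the bare microcontroller firmware described above, the code below sends the owner an SMS containing the GPS coordinates through a GSM modem using the standard AT commands AT+CMGF and AT+CMGS; the serial port, phone number and coordinates are placeholders.

```python
# Send a theft-alert SMS with vehicle coordinates through a GSM modem.
import time
import serial   # pip install pyserial

def send_theft_alert(port, owner_number, lat, lon):
    with serial.Serial(port, baudrate=9600, timeout=2) as gsm:
        gsm.write(b"AT\r")                      # check the modem is responding
        time.sleep(0.5)
        gsm.write(b"AT+CMGF=1\r")               # text mode for SMS
        time.sleep(0.5)
        gsm.write(f'AT+CMGS="{owner_number}"\r'.encode())
        time.sleep(0.5)
        body = f"THEFT ALERT! Vehicle at LAT {lat}, LONG {lon}. Engine cut off."
        gsm.write(body.encode() + b"\x1a")      # Ctrl+Z terminates the message
        time.sleep(2)
        return gsm.read_all().decode(errors="ignore")

# Example with placeholder values:
# print(send_theft_alert("/dev/ttyUSB0", "+911234567890", 26.2389, 73.0243))
```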
Research Interests:
Cognitive radio is a seemingly promising technology for the future, and spectrum sensing is its most important function. Most of the spectrum sensing techniques proposed so far become dysfunctional in low signal-to-noise-ratio environments. This paper proposes a reliable spectrum sensing algorithm for a cooperative cognitive network. A dynamic threshold updating method is developed in which the threshold changes with the environmental noise. The energy detector is used as the basic building block. The reliability of the sensing technique is evaluated in terms of the probability of error and the minimum signal-to-noise ratio it tolerates. The work includes a study of existing spectrum sensing techniques, namely the single-threshold and double-threshold energy detectors, and their implementation in a cooperative sensing environment, where they are finally compared. The optimal number of cognitive radios required in the decision-making process is also calculated.
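A minimal sketch of energy detection with a noise-tracking threshold follows, under assumed parameters (the paper's exact threshold update rule is not reproduced): the decision threshold is derived from a running estimate of the noise power instead of being fixed.

```python
# Energy detection with a threshold tied to an updatable noise-power estimate.
import numpy as np

def energy(x):
    return np.mean(np.abs(x) ** 2)

def sense(samples, noise_power_est, margin_db=2.0):
    """Declare the band occupied if the measured energy exceeds the current
    noise-power estimate by a margin; updating the estimate over time is what
    makes the threshold dynamic."""
    threshold = noise_power_est * 10 ** (margin_db / 10.0)
    return energy(samples) > threshold, threshold

rng = np.random.default_rng(1)
n = 1000
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)   # unit-power noise

# Noise estimate taken from a frame previously declared idle; in practice this
# would be a running average over idle frames so it tracks the environment.
noise_est = energy(noise)

snr_db = 0.0
signal = np.sqrt(10 ** (snr_db / 10)) * np.exp(1j * 2 * np.pi * 0.1 * np.arange(n))
occupied, thr = sense(signal + noise, noise_est)
print(f"noise estimate={noise_est:.2f}, threshold={thr:.2f}, "
      f"energy={energy(signal + noise):.2f}, occupied={occupied}")
```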
Research Interests:
Load flow has always been a critical problem and has been approached by several researchers. Techniques like Newton-Raphson and Gauss-Seidel work well but have severe computational disadvantages. The purpose of this paper is to analyse the voltage stability of an electric power distribution system. A new method is established for computing a voltage stability index (VSI) for each node of any distribution network; the most sensitive node is the node with the minimum value of the index. The paper also proposes a novel direct load flow method for distribution systems. The system has been simulated for constant-power loads considering losses. The methodology has been implemented in MATLAB R2013b; the voltage of each bus is found, the real and reactive losses are calculated, and the most sensitive node is identified. The results show an improvement over traditional methods.
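For concreteness, one widely used node voltage stability index for radial distribution feeders (the Chakravorty-Das form) is sketched below; the paper's exact index may differ. The node with the smallest index value is taken as the most sensitive.

```python
# One commonly used voltage stability index for a branch of a radial feeder.
def voltage_stability_index(v_send, r, x, p_recv, q_recv):
    """v_send        : sending-end voltage magnitude (p.u.)
    r, x             : branch resistance and reactance (p.u.)
    p_recv, q_recv   : total real/reactive power fed through the receiving node (p.u.)
    Values near 1 indicate a lightly stressed node; values approaching 0
    indicate proximity to voltage collapse."""
    return (v_send ** 4
            - 4.0 * (p_recv * x - q_recv * r) ** 2
            - 4.0 * (p_recv * r + q_recv * x) * v_send ** 2)

# Example with illustrative per-unit data:
print(voltage_stability_index(1.0, 0.02, 0.04, 0.1, 0.05))   # ~0.984
```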
Research Interests:
Load flow has always been a critical problem and has been approached by several researchers. Techniques like Newton-Raphson and Gauss-Seidel work well but have severe computational disadvantages. Since only two of the required four parameters are known at each bus, a load flow approach has to be implemented. This paper proposes a novel direct load flow method for distribution systems that uses the losses and saves computation power. The system has been simulated for constant-power loads considering losses. The methodology has been implemented in MATLAB R2013b; the voltage of each bus is found, the real and reactive losses are calculated, and the most sensitive node is identified. The results show an improvement over traditional methods.
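As a hedged illustration of a direct load-flow calculation for radial feeders, the sketch below runs a backward/forward sweep with constant-power loads on a tiny example feeder; the per-unit data are illustrative and the paper's exact formulation may differ.

```python
# Backward/forward sweep load flow for a 3-bus radial feeder (illustrative data).
import numpy as np

# Radial feeder: bus 0 (slack) - bus 1 - bus 2. branch[i] feeds bus i+1.
z_branch = np.array([0.02 + 0.04j, 0.03 + 0.06j])        # branch impedances (p.u.)
s_load   = np.array([0.0, 0.10 + 0.05j, 0.08 + 0.04j])   # complex bus loads (p.u.)
v = np.ones(3, dtype=complex)                             # initial voltage guess

for _ in range(20):
    # backward sweep: load currents, then accumulate branch currents
    i_load = np.conj(s_load / v)
    i_branch = np.array([i_load[1] + i_load[2], i_load[2]])
    # forward sweep: update voltages from the slack bus outward
    v_new = v.copy()
    v_new[1] = v_new[0] - z_branch[0] * i_branch[0]
    v_new[2] = v_new[1] - z_branch[1] * i_branch[1]
    if np.max(np.abs(v_new - v)) < 1e-8:
        v = v_new
        break
    v = v_new

losses = np.sum(np.abs(i_branch) ** 2 * z_branch)          # total P + jQ losses
print("bus voltages (p.u.):", np.round(np.abs(v), 4))
print("total losses P+jQ (p.u.):", np.round(losses, 5))
```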
Research Interests:
Obstacle detection is a key capability of autonomous vehicles, and robust obstacle detection is required when working with large robots in unstructured environments. Some of the existing methods are suited mainly to environments in which the ground is comparatively flat and roughly the same colour throughout the terrain. The novel procedure proposed in this work uses a monocular camera for real-time performance. We compute the homography between two successive frames by computing the fundamental matrix between them; estimation of the fundamental matrix is followed by triangulation to estimate the distance of the object from the camera. We examine a difficulty intrinsic to any fundamental-matrix-based approach to this task and show how the proposed method resolves it to a large extent. An obstacle detection and distance estimation system based on visual features and stereo vision is thus presented in this work.
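A hedged sketch of the geometric core using OpenCV, assuming matched keypoints and known camera intrinsics K are supplied by the caller: estimate the fundamental matrix between two successive frames, recover the relative pose, and triangulate the matches to obtain (scale-ambiguous) depths.

```python
# Fundamental-matrix estimation, pose recovery and triangulation with OpenCV.
import cv2
import numpy as np

def estimate_obstacle_depths(pts1, pts2, K):
    """pts1, pts2: Nx2 float arrays of matched pixel coordinates in frames 1 and 2;
    K: 3x3 camera intrinsic matrix."""
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    inl1 = pts1[mask.ravel() == 1]
    inl2 = pts2[mask.ravel() == 1]

    E = K.T @ F @ K                               # essential matrix from F
    _, R, t, _ = cv2.recoverPose(E, inl1, inl2, K)

    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])    # frame-1 projection matrix
    P2 = K @ np.hstack([R, t])                           # frame-2 projection matrix
    pts4d = cv2.triangulatePoints(P1, P2, inl1.T, inl2.T)
    pts3d = (pts4d[:3] / pts4d[3]).T              # up to scale, since monocular
    return pts3d[:, 2]                            # depths along the camera axis

# depths = estimate_obstacle_depths(pts1, pts2, K)   # nearest obstacle: depths.min()
```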
Research Interests: