EXCEL International Journal of Multidisciplinary Management Studies, 2013
The incessant growth in the size and use of the Internet demands new methods for the design and development of online information services. Most Web structures are large and complicated, and users often miss the purpose of their inquiry or get ambiguous results when they try to navigate through them. The Internet is an enormous compilation of multivariate data, and several problems prevent effective and efficient knowledge discovery; for better knowledge management it is important to retrieve accurate and complete data. The hidden Web, also known as the deep Web or invisible Web, has given rise to a novel issue in Web mining research. A huge number of documents in the hidden Web, such as pages hidden behind search forms, specialized databases and dynamically generated Web pages, are not accessible to general Web mining applications. In this research we propose a system with a robust ability to access this hidden Web: the Effectiveness Improvement Model (EIM), a set of techniques for better invisible Web resource selection and integration. Modern Web pages rely on dynamic content generation, where user forms collect information from a particular user and store it in a database; the link structure behind these forms cannot be accessed through conventional mining procedures. The research therefore proposes a new technique for invisible Web resource selection and integration, constructed for real-world domains from database schemas and Web query interfaces, that improves traditional information retrieval methods. Applications of invisible Web query interface mapping and intelligent recognition of user query intention based on our domain knowledge base are also introduced briefly. The approach is valuable for large-scale applications of intelligent invisible Web integration and information retrieval.
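The abstract does not spell out how EIM reaches content behind search forms; as a rough illustration of the kind of form handling such a crawler needs, the sketch below (plain Python, with a made-up domain knowledge base and field names) extracts a page's search forms and fills them from stored domain values.

```python
# Minimal sketch of hidden-Web form handling, not the paper's EIM system.
# The knowledge base and field names below are illustrative assumptions.
from html.parser import HTMLParser
from urllib.parse import urlencode, urljoin

DOMAIN_KB = {"title": "databases", "author": "", "year": "2013"}  # hypothetical

class FormExtractor(HTMLParser):
    """Collects each <form> action and the names of its <input> fields."""
    def __init__(self):
        super().__init__()
        self.forms, self._current = [], None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self._current = {"action": attrs.get("action", ""), "fields": []}
        elif tag == "input" and self._current is not None and "name" in attrs:
            self._current["fields"].append(attrs["name"])

    def handle_endtag(self, tag):
        if tag == "form" and self._current is not None:
            self.forms.append(self._current)
            self._current = None

def build_query(base_url, form):
    """Fill form fields from the domain knowledge base and build a GET URL."""
    values = {f: DOMAIN_KB.get(f, "") for f in form["fields"]}
    return urljoin(base_url, form["action"]) + "?" + urlencode(values)

html = '<form action="/search"><input name="title"><input name="year"></form>'
parser = FormExtractor()
parser.feed(html)
print([build_query("http://example.org/", f) for f in parser.forms])
```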
In this work, statistical information about the various formats of item collections in a group is obtained using various searching and sorting techniques. These techniques are time-consuming and memory-intensive, and while working with the items the user has to rely on manual effort to trace the selected items across transactions in the various itemset collections in order to identify the most frequent itemset among all items; there is no established method for searching out the itemset result. We therefore apply searching techniques together with product sales data to find the profit on traded items and produce the required output.
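The abstract names no concrete algorithm, so the following is only a minimal sketch of the idea: count item frequencies across transactions with a sort-by-frequency step and combine them with made-up per-sale profit figures.

```python
# Minimal sketch of finding the most frequent item across transactions
# and its trading profit; the data and profit table are illustrative.
from collections import Counter

transactions = [["milk", "bread"], ["bread", "eggs"], ["bread", "milk", "eggs"]]
profit_per_sale = {"milk": 0.40, "bread": 0.25, "eggs": 0.30}  # hypothetical

counts = Counter(item for basket in transactions for item in basket)
# Sorting by frequency replaces the manual tracing described above.
for item, freq in counts.most_common():
    print(f"{item}: sold {freq} times, profit {freq * profit_per_sale[item]:.2f}")
```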
Active queue management (AQM) refers to a family of packet-dropping mechanisms for router queues that has been suggested to support end-to-end congestion control mechanisms in the Internet. The AQM algorithm recommended by the IETF is Random Early Detection (RED). The RED algorithm permits network operators to achieve high throughput and low average delay simultaneously, but the resulting average queue length is comparatively sensitive to the level of congestion. Refined Adaptive RED (RARED) is used to decrease the sensitivity to the parameters that affect RED performance. The CoDel algorithm has been proposed for controlling congestion in routers. We compare these three algorithms on the basis of simulation results. CoDel is parameterless and treats good queue and bad queue differently, and we find that the CoDel scheme improves almost every aspect of network performance.
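For readers unfamiliar with CoDel, the sketch below shows a simplified version of its dequeue-side drop decision; it follows the published algorithm's structure (target delay, interval, drop rate growing with the square root of the drop count) but omits details such as queue-size floors, so it is an illustration rather than the paper's simulation code.

```python
# Simplified sketch of the CoDel drop decision made as each packet is dequeued.
# TARGET and INTERVAL are CoDel's built-in constants, not tuning knobs.
from math import sqrt

TARGET = 0.005    # 5 ms acceptable standing-queue delay
INTERVAL = 0.100  # 100 ms sliding window

class CoDel:
    def __init__(self):
        self.first_above_time = 0.0  # when sojourn first exceeded TARGET
        self.dropping = False
        self.drop_next = 0.0
        self.count = 0               # drops in the current dropping state

    def should_drop(self, sojourn, now):
        """Return True if the packet dequeued at `now` should be dropped."""
        if sojourn < TARGET:
            self.first_above_time = 0.0
            self.dropping = False
            return False
        if self.first_above_time == 0.0:
            self.first_above_time = now + INTERVAL  # start watching the queue
            return False
        if not self.dropping and now >= self.first_above_time:
            self.dropping, self.count = True, 1     # bad queue persisted
            self.drop_next = now + INTERVAL
            return True
        if self.dropping and now >= self.drop_next:
            self.count += 1
            # Drop faster the longer the queue stays bad: interval/sqrt(count).
            self.drop_next = now + INTERVAL / sqrt(self.count)
            return True
        return False
```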
2014 6th Computer Science and Electronic Engineering Conference (CEEC), 2014
DSDV (Destination-Sequenced Distance Vector), AODV (Ad hoc On-Demand Distance Vector), AOMDV (Ad hoc On-Demand Multipath Distance Vector) and DSR (Dynamic Source Routing) are among the most widely used routing protocols in Mobile Ad-hoc Networks (MANET) due to their compatibility with multi-hop routing environments and scalability towards increased traffic and mobility. On the other hand, due to the differences in the routing schemes, where DSDV is a table-driven routing scheme and AODV, AOMDV and DSR are on-demand routing schemes, each routing protocol has its advantages and disadvantages. In this paper we investigate the performance of these routing protocols in Ship Ad-hoc Networks (SANET) in a maritime environment where ships communicate using Very High Frequency (VHF) as the physical layer. A mobile ad-hoc network is achieved so that ships can use it to share data networking facilities or to send particular sensor data, such as sea depth, temperature, wind speed and direction, etc., to a central server to produce a public information map.
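As a small illustration of the table-driven scheme mentioned above, the sketch below applies DSDV's route-update rule, where destination sequence numbers keep routes fresh and loop-free; the ship IDs and metrics are invented, and real DSDV also handles periodic/triggered advertisements and route settling time.

```python
# Minimal sketch of a DSDV-style routing-table update with sequence numbers.
routing_table = {}  # dest -> (next_hop, hop_count, seq_no)

def update_route(dest, next_hop, hop_count, seq_no):
    """Accept a route if its sequence number is fresher, or equally fresh
    with a shorter path -- the core DSDV loop-freedom rule."""
    current = routing_table.get(dest)
    if (current is None
            or seq_no > current[2]
            or (seq_no == current[2] and hop_count < current[1])):
        routing_table[dest] = (next_hop, hop_count, seq_no)

update_route("ship-B", "ship-A", 2, 10)
update_route("ship-B", "ship-C", 1, 10)   # same seq, shorter path: replaces
update_route("ship-B", "ship-A", 3, 8)    # stale seq: ignored
print(routing_table)                       # {'ship-B': ('ship-C', 1, 10)}
```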
The traditional approach to image indexing typically computes a single signature for each image based on color histograms, texture, wavelet transforms, regions, etc., and returns as the query result the images whose coordinates match closest to the inserted query image. It can therefore be very hard to match similar coordinates when objects are scaled differently, appear at different locations, or only certain regions of the images match. This research addresses image feature representation and extraction and multi-dimensional indexing for Content-Based Image Retrieval. Second, a variation of Guttman's R-trees, the R+-tree, that avoids overlapping rectangles in intermediate nodes of the tree is introduced. Algorithms for searching, updating, initial packing and reorganization of the structure are discussed in detail. Finally, we provide analytical results indicating that R+-trees achieve up to 50% savings in disk accesses compared to an R-tree when searching...
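A minimal sketch of the single-signature baseline criticized here, assuming a global one-channel color histogram compared by L1 distance; the images are random stand-ins, and a real system would index the signatures in a tree structure rather than scanning them linearly.

```python
# Minimal sketch of single-signature image matching via color histograms.
import numpy as np

def color_signature(image, bins=8):
    """Global histogram over one channel, normalized to sum to 1."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    return hist / hist.sum()

def l1_distance(sig_a, sig_b):
    return float(np.abs(sig_a - sig_b).sum())

rng = np.random.default_rng(0)
query = rng.integers(0, 256, size=(32, 32))        # stand-in grayscale image
database = [rng.integers(0, 256, size=(32, 32)) for _ in range(5)]

sig_q = color_signature(query)
best = min(range(len(database)),
           key=lambda i: l1_distance(sig_q, color_signature(database[i])))
print("closest image index:", best)
```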
International Journal of Artificial Intelligence and Knowledge Discovery, 2011
The Web spider is one of the most important components of a search engine and is used for accessing Web pages on the World Wide Web (WWW). When we crawl Web pages of different data sizes, we can access only a limited number of pages per unit time, according to the transfer rate of the total data size. As the data size of Web pages increases, the number of pages crawled decreases, which affects the performance of the Web spider. This paper describes the issue of the data size of pages crawled by the Web spider. To achieve high spider performance, we apply a compression technique that reduces the data size of crawled Web pages with a high compression ratio.
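The codec is not specified in this abstract; as a stand-in, the sketch below fetches a page with the standard library and compresses it with zlib to show how a compression ratio on crawled content would be measured.

```python
# Minimal sketch of shrinking crawled pages before storage or transfer;
# zlib is an illustrative stand-in for whatever codec the spider uses.
import urllib.request
import zlib

def fetch_and_compress(url):
    with urllib.request.urlopen(url) as resp:
        raw = resp.read()
    packed = zlib.compress(raw, level=9)
    ratio = len(raw) / len(packed)
    return packed, ratio

page, ratio = fetch_and_compress("http://example.com/")
print(f"compressed to 1/{ratio:.1f} of the original size")
```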
International journal of engineering research and technology, 2013
Continuous queries are used to monitor changes to time-varying data and to provide results useful for online decision making. In these queries, a client specifies a coherency requirement as part of the query. For continuous queries, a network of data aggregators is a low-cost and scalable technique. Because an individual node cannot determine its inclusion in the query result by itself, these queries present algorithmic challenges different from aggregate and selection queries. Each data item can be served at specific coherencies by a set of data aggregators. The technique involves disseminating a query as sub-queries and executing the sub-queries on the chosen data aggregators. We build a query cost model that can be used to estimate the number of refresh messages required to satisfy a client-specified incoherency bound. Each data aggregator serves a set of data items at specific coherencies. Performance results show that, with our query cost model, queries can be executed using less than one third ...
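A minimal sketch of the incoherency-bound mechanism described above, assuming a simple value-change model: an aggregator pushes a refresh only when the source value drifts past the client's bound, so counting pushes gives a rough refresh-message estimate. The data series is illustrative.

```python
# Minimal sketch of enforcing a client-specified incoherency bound.
def refresh_messages(values, bound):
    """Count pushes needed to keep the client within `bound` of the source."""
    last_sent = values[0]
    pushes = 1  # the initial value is always sent
    for v in values[1:]:
        if abs(v - last_sent) > bound:
            last_sent = v
            pushes += 1
    return pushes

stock_price = [100.0, 100.4, 101.2, 100.9, 102.5, 102.6]
print(refresh_messages(stock_price, bound=1.0))  # tighter bound -> more messages
```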
International Journal of Advances in Engineering Sciences, 2011
It is widely recognized that data security, the influence of the processor types used, resources, and the architecture's effect on throughput play a central role in IT systems. Elliptic Curve Cryptography (ECC) is a sort of public-key cryptography that is an alternative to other public-key algorithms like RSA and DSA. It is widely accepted because it uses smaller parameters than other public-key cryptosystems at the same level of security, and it reduces the memory consumption of hardware and software through special functionality and improved versions of the algorithm, gaining a larger margin over previous systems by using an extension of ECC called the Koblitz curve. It is becoming increasingly common to implement systems that take a shorter computational time to execute an instruction, whereas previous versions used arithmetic computation that takes longer. I have emphasized fuzzy modular arithmetic, which takes a very short time toward the computational ...
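Since the fuzzy modular arithmetic is not detailed in this abstract, the sketch below shows only the standard ECC primitive it would accelerate: scalar multiplication by double-and-add, on a toy curve with illustrative parameters.

```python
# Minimal sketch of ECC scalar multiplication (double-and-add) over a toy
# curve y^2 = x^3 + ax + b (mod p). Nothing here is the paper's Koblitz-curve
# or fuzzy-arithmetic scheme, and real ECC needs cryptographic-size primes.
P_MOD, A, B = 97, 2, 3          # toy curve; (3, 6) lies on it since 36 = 27+6+3
O = None                        # point at infinity (group identity)

def point_add(p1, p2):
    if p1 is O: return p2
    if p2 is O: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                # p2 is the inverse of p1
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, point):
    """Compute k*point bit by bit: double always, add when the bit is set."""
    result, addend = O, point
    while k:
        if k & 1:
            result = point_add(result, addend)
        addend = point_add(addend, addend)
        k >>= 1
    return result

print(scalar_mult(5, (3, 6)))   # 5P on the toy curve
```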
International Journal of Computer Applications, 2012
Routing protocols for Mobile Ad Hoc Networks (MANETs) have been explored extensively in recent years. Much of this work is targeted at finding a feasible route from a source to a destination without considering current network traffic or application requirements. Therefore, the network may easily become overloaded with too much traffic, and the application has no way to improve its performance under a given network traffic condition. While this may be acceptable for data transfer, many real-time applications require Quality-of-Service (QoS) support from the network. The main idea of the proposal is that QoS support can be achieved more efficiently by considering the link stability and signal strength of intermediate channels. We propose an on-demand QoS routing scheme named Signal Stability based QoS Routing (SSQR) that provides QoS support in terms of bandwidth and end-to-end delay in mobile ad hoc networks (MANETs). SSQR is designed over Signal Stability based Adaptive Routing (SSA) and...
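As a sketch of the core idea, assuming a made-up topology with per-link signal strengths, the code below restricts route discovery to links above a stability threshold; SSQR itself additionally enforces bandwidth and delay constraints not modeled here.

```python
# Minimal sketch of signal-stability routing: discover routes only over links
# whose measured signal strength clears a threshold. Figures are illustrative.
from collections import deque

links = {  # (node, node) -> signal strength in dBm (made up)
    ("S", "A"): -60, ("A", "D"): -85,   # short path but one weak link
    ("S", "B"): -55, ("B", "C"): -58, ("C", "D"): -62,
}
THRESHOLD = -70  # links weaker than this are considered unstable

def stable_route(src, dst):
    """BFS over links that meet the strength threshold."""
    neighbors = {}
    for (u, v), strength in links.items():
        if strength >= THRESHOLD:
            neighbors.setdefault(u, []).append(v)
            neighbors.setdefault(v, []).append(u)
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in neighbors.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(stable_route("S", "D"))  # ['S', 'B', 'C', 'D']: avoids the weak A-D link
```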
Nowadays, users upload thousands of videos to the Internet, and every day they are shared. Among these uploaded videos, a considerable number are manipulated or illegal replicas of existing media, so it becomes difficult for copyright management to keep media from being stolen on the Internet. In recent times, copyright infringement and data piracy across the growing number of online video sources have become serious concerns. There are two methods to detect infringements: the first is centered on watermarking, and the second is founded on Content-Based Copy Detection (CBCD). Their disadvantages are overcome in the suggested Temporally Informative Representative Images-Discrete Cosine Transform (TIRI-DCT) system. In the suggested TIRI-DCT method, two quick searching methods are used for fingerprint matching: inverted-file-based similarity search and cluster-based similarity search. For a small section of video, a TIRI holds spatial and tempora...
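A minimal sketch of a TIRI-DCT-style fingerprint, assuming illustrative parameters rather than the paper's exact ones: blend a short frame sequence into one representative image, take its 2D DCT, and binarize the low-frequency block against its median.

```python
# Minimal sketch of a TIRI-DCT-style video fingerprint; weights, block size,
# and the random "video" are illustrative assumptions.
import numpy as np
from scipy.fft import dctn

def tiri(frames, gamma=0.65):
    """Temporally informative representative image: weighted frame average."""
    weights = np.array([gamma ** k for k in range(len(frames))])
    weights /= weights.sum()
    return np.tensordot(weights, np.stack(frames), axes=1)

def fingerprint(frames, block=8):
    rep = tiri(frames)
    coeffs = dctn(rep, norm="ortho")[:block, :block]  # low-frequency block
    bits = (coeffs > np.median(coeffs)).astype(np.uint8)
    return bits.flatten()

rng = np.random.default_rng(1)
clip = [rng.random((64, 64)) for _ in range(10)]  # stand-in grayscale frames
fp = fingerprint(clip)
print(fp[:16], "...", f"{fp.size}-bit fingerprint")
```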