Computer Applications: An International Journal (CAIJ) is a quarterly, open-access, peer-reviewed journal that publishes articles contributing new results in all areas of computer science applications. The journal is devoted to the publication of high-quality papers on theoretical and practical aspects of computer science applications. Its goal is to bring together researchers and practitioners from academia and industry to focus on advances in computer science applications and to establish new collaborations in these areas. Original research papers and state-of-the-art reviews are invited for publication in all areas of computer science applications.
In this paper we analytically model the technological processes used to manufacture multiemitter heterotransistors, taking into account the relaxation of mismatch-induced stress. Based on this modeling, recommendations are formulated for increasing the sharpness of the p-n junctions included in the transistors and for increasing the compactness of the transistors.
“If your compassion does not include yourself, it is incomplete” – Gautama Buddha. Telemedicine is the use of telecommunication and information technology services that permit communication between users with convenience and fidelity, as well as the transmission of medical images and health informatics data. Numerous image processing applications, such as satellite imaging, medical imaging, and video, involve images or streams that are too large to store or transmit in their original form without large amounts of space or high bandwidth. Preserving the integrity of transmitted medical images and informatics data, without any compromise, is an essential requirement of telecommunication and information technology. There is therefore a pressing need for an adequate compression methodology for medical images and data that accommodates constraints such as bandwidth, resolution, and storage while preserving the validity and precision of the data for subsequent diagnosis; this places strict limits on the reconstruction error. In this paper we survey the literature on image processing methodologies based on region-of-interest (ROI) techniques for Digital Imaging and Communications in Medicine (DICOM). Such a review highlights the limitations of existing lossless compression techniques and motivates the search for a better, unified image compression technique.
Embedded systems shape our world today; it is almost impossible to imagine daily life without them. Examples include cell phones, home appliances, energy generators, satellites, and automotive components. Development is even more complex when real-time and interface constraints are involved. Developing real-time embedded systems is significantly challenging: results must be not only correct but also delivered in a timely manner. New software development approaches are needed because of the complexity of the functional and non-functional requirements of embedded systems. Given the complex context in which embedded systems operate, defects can cause life-threatening situations, delays can create huge costs, and insufficient productivity can impact the entire industry. The rapid evolution of software engineering technologies will be a key factor in the successful development of even more complex embedded systems. Software development is shifting from manual programming to model-driven engineering (MDE). One of the most important challenges is to manage the increasing complexity of embedded software development while maintaining product quality, reducing time to market, and reducing development cost.
MDE is a promising approach that has emerged in recent years. Instead of directly coding software in programming languages, developers model software systems using expressive graphical notations that provide a higher level of abstraction than programming languages; this is called Model Based Development (MBD). When accompanied by Model Based Validation (MBV), MBD helps identify problems early and thus reduces rework cost. Applying tests based on the designed models not only enables early detection of defects but also supports continuous quality assurance, since testing can start in the first iteration of the development process. As a result of the model-based approach, and in addition to the major advantage of early defect detection, several time-consuming tasks of the classical software development life cycle can be eliminated. For embedded systems development, following a more time-efficient approach is essential.
In this paper the generation of binary sequences derived from chaotic sequences defined over Z4 is proposed. The chaotic maps considered are the Logistic, Tent, Cubic, Quadratic, and Bernoulli maps. Using these chaotic map equations, sequences over Z4 are generated and then converted to binary sequences using a polynomial mapping. Segments of different lengths are tested for cross-correlation and linear complexity properties, and some segments are found to have good cross-correlation and linear complexity. The bit error rate performance of DS-CDMA communication systems using these binary sequences is found to be better than that of Gold and Kasami sequences.
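The abstract does not give the paper's exact quantization scheme or polynomial mapping, but the overall pipeline it describes can be sketched roughly: iterate a chaotic map, quantize each value into a Z4 symbol, and expand each symbol into bits. Here the logistic map stands in for any of the maps, and a simple 2-bit natural binary mapping stands in for the paper's polynomial mapping:

```python
def logistic_z4_binary(x0, n, r=3.99):
    """Binary sequence from a logistic-map orbit quantized over Z4.

    The four-bin quantization and the 2-bit natural mapping are
    illustrative choices, not the paper's polynomial mapping.
    """
    x = x0
    bits = []
    for _ in range(n):
        x = r * x * (1.0 - x)                 # logistic map iteration, x stays in (0, 1)
        symbol = min(int(x * 4), 3)           # quantize [0, 1) into Z4 = {0, 1, 2, 3}
        bits.extend(((symbol >> 1) & 1, symbol & 1))  # 2 bits per Z4 symbol
    return bits

seq = logistic_z4_binary(0.3141, 8)
print(seq)  # 16 bits, two per Z4 symbol
```

Because the orbit is deterministic for a given seed `x0`, the same key always regenerates the same spreading sequence, which is the property DS-CDMA relies on.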
Many people want to know how blood can be donated to patients online without delay. Patients need blood urgently in emergencies, so people need to know how to contact blood donors online. This system shows how to obtain blood quickly in critical situations. The matcher system is implemented with rules derived from a decision tree and a decision table, applied according to blood donation practice at blood banks in Myanmar. Information about donors and patients is stored in the system so that blood can be donated instantly.
In this paper, we present a novel approach for image retrieval based on the extraction of low-level features using techniques such as Directional Binary Code (DBC), the Haar wavelet transform, and Histogram of Oriented Gradients (HOG). The DBC texture descriptor captures the spatial relationship between any pair of neighbourhood pixels in a local region along a given direction, while the Local Binary Patterns (LBP) descriptor considers the relationship between a given pixel and its surrounding neighbours. DBC therefore captures more spatial information than LBP and its variants, and it can also extract more edge information. Hence, we employ DBC to extract grey-level texture features (texture maps) from each RGB channel individually; the computed texture maps are then combined to represent the colour texture features (colour texture map) of an image. Next, we decompose the extracted colour texture map and the original image using the Haar wavelet transform. Finally, we encode the shape and local features of the wavelet-transformed images using HOG for content-based image retrieval. The performance of the proposed method is compared with existing methods on two databases, Wang's Corel image database and Caltech 256. The evaluation results show that our approach outperforms the existing methods for image retrieval.
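The HOG step of the pipeline above can be illustrated with a minimal single-cell version: accumulate gradient magnitudes into orientation bins. The full descriptor also divides the image into cells and normalizes over blocks, which is omitted here; the 9-bin choice and the toy ramp image are illustrative assumptions.

```python
import math

def orientation_histogram(img, bins=9):
    """Simplified HOG-style descriptor: one unsigned-orientation histogram
    for the whole patch (cells, blocks, and normalization omitted)."""
    h, w = len(img), len(img[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]      # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]      # vertical gradient
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180  # unsigned orientation
            hist[min(int(ang / (180 / bins)), bins - 1)] += mag
    return hist

patch = [[x for x in range(8)] for _ in range(8)]   # pure horizontal intensity ramp
hist = orientation_histogram(patch)
print(hist)  # all gradient energy falls into the first (0-degree) bin
```

A horizontal ramp has only horizontal gradients, so all of the magnitude lands in bin 0, which is the behaviour the descriptor is meant to capture.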
Today is a knowledge age, so the world needs to become a richer place for everyone. Students can study their lectures and do their exercises on the web, either individually or collaboratively with their peers as directed by the teacher, using the think-pair-share technique. The system gives students the ability to decide clearly on their answers to the questions. The K-means clustering method is used to update the pair state and to support determining students' class grades. The main objective of this study is to design a model for a Java programming learning system that facilitates collaborative learning activities in a virtual classroom.
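The K-means step can be sketched as a minimal one-dimensional clustering of quiz scores into two performance groups. The abstract does not specify the actual feature set or pairing rules, so the scores, k = 2, and min/max initialization below are illustrative assumptions:

```python
def kmeans_1d(scores, k=2, iters=20):
    """Minimal 1-D k-means on student scores (k = 2, min/max init)."""
    centers = [min(scores), max(scores)]            # simple deterministic init
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for s in scores:
            nearest = min(range(k), key=lambda j: abs(s - centers[j]))
            groups[nearest].append(s)
        centers = [sum(g) / len(g) if g else centers[j]   # keep empty-cluster center
                   for j, g in enumerate(groups)]
    # final label for each score: index of its nearest center
    return [min(range(k), key=lambda j: abs(s - centers[j])) for s in scores]

scores = [35, 40, 42, 80, 85, 90]
labels = kmeans_1d(scores)
print(labels)  # the three low scorers and three high scorers form two groups
```

In a think-pair-share setting, one plausible use of the two groups is to pair a student from the higher-scoring cluster with one from the lower-scoring cluster.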
This article presents a fully automatic segmentation method for liver CT scans using fuzzy c-means clustering and level sets. First, the contrast of the original image is enhanced to make boundaries clearer; second, spatial fuzzy c-means clustering combined with anatomical prior information is used to extract the liver area automatically; third, a distance-regularized level set is used for refinement; finally, morphological operations are applied as post-processing. The experimental results show that the method achieves high accuracy (0.9986) and specificity (0.9989). Compared with the standard level set method, our method deals more successfully with over-segmentation.
Simultaneous Multi-Threading (SMT) is a technique by which multiple independent threads can be executed simultaneously, increasing system throughput. To further improve machine performance, pipelining is implemented, allowing the SMT machine to execute more than one function in the same cycle. Pipelining optimizes several stages: fetch, decode, dispatch, issue, execute, and commit. Our focus is on the dispatch stage, in which instructions are dispatched from the individual re-order buffers to the shared issue queue. Branch misprediction reduces the throughput of an SMT machine because of instruction flush-outs: the pipeline has to be refilled starting from the branch instruction after the branch direction is corrected, causing a delay before instructions from that thread start to commit again and adversely affecting overall system throughput. Our approach is to implement a technique called selective context blocking after context-branch evaluation. Under this technique, the number of instructions a thread is allowed to dispatch is reduced if it has a high branch misprediction percentage over a certain number of recent branch instructions. As a result, we obtain an overall throughput increase of up to 4% compared to the default algorithm.
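The throttling idea can be sketched as a small per-thread dispatch gate that tracks recent branch outcomes and shrinks the dispatch allowance when the misprediction rate is high. The window size, threshold, and allowance values below are illustrative assumptions, not the paper's parameters:

```python
from collections import deque

class DispatchGate:
    """Sketch of selective context blocking: a thread's dispatch allowance
    shrinks when its recent branch-misprediction rate exceeds a threshold."""
    def __init__(self, window=16, threshold=0.25, full=4, reduced=1):
        self.history = deque(maxlen=window)   # 1 = mispredicted, 0 = correct
        self.threshold = threshold
        self.full, self.reduced = full, reduced

    def record_branch(self, mispredicted):
        self.history.append(1 if mispredicted else 0)

    def dispatch_allowance(self):
        if not self.history:
            return self.full                  # no evidence yet: full allowance
        rate = sum(self.history) / len(self.history)
        return self.reduced if rate > self.threshold else self.full

gate = DispatchGate()
for outcome in [False] * 12 + [True] * 4:     # 4 of the last 16 branches mispredicted
    gate.record_branch(outcome)
print(gate.dispatch_allowance())  # rate 0.25 does not exceed the threshold -> 4
```

In a real SMT dispatch stage this decision would be made per cycle per thread, with the freed dispatch slots given to threads with better prediction behaviour.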
Secure localization is a critical issue in wireless sensor networks. Providing a certain degree of localization accuracy in the presence of malicious beacons is a challenging task. Robust Position Estimation (ROPE) is one of the secure localization systems; however, it suffers from compromised network entities. Trust-Based Secure Robust Position Estimation filters out compromised network entities using a trust model. Simulation results and analytical reasoning show that this scheme provides security with minimal energy consumption.
Task scheduling on multiple processors with precedence constraints is NP-hard and occupies a prominent place in parallel computing and combinatorial optimization. It is difficult to achieve an optimal solution to this problem with traditional optimization approaches owing to its high computational complexity. Among the metaheuristics, Simulated Annealing (SA) and Genetic Algorithms (GA) are powerful combinatorial optimization methods with complementary strengths and weaknesses. Borrowing the respective advantages of the two paradigms, an effective combination of GA and SA, called hybrid GASA, is proposed for multiprocessor task scheduling problems with precedence constraints. A bi-criteria objective function, the weighted sum of makespan and total completion time, is considered for the analysis. Comparative analysis using a defined performance index on standard problems shows that the proposed hybrid GASA provides better solution quality than simple GA or SA alone.
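The SA half of the hybrid can be sketched as follows. For brevity this toy version drops the GA component and the precedence constraints, and minimizes makespan alone by moving single tasks between m identical processors; the task durations and annealing schedule are illustrative assumptions:

```python
import math
import random

def sa_schedule(durations, m=2, steps=2000, t0=10.0, alpha=0.995, seed=1):
    """Toy simulated annealing for task-to-processor assignment
    (no precedence constraints, makespan objective only)."""
    rng = random.Random(seed)
    assign = [rng.randrange(m) for _ in durations]

    def makespan(a):
        loads = [0] * m
        for task, proc in enumerate(a):
            loads[proc] += durations[task]
        return max(loads)

    cost = makespan(assign)
    best, best_cost, t = assign[:], cost, t0
    for _ in range(steps):
        i = rng.randrange(len(durations))
        old = assign[i]
        assign[i] = rng.randrange(m)          # move one task to a random processor
        new_cost = makespan(assign)
        # accept improvements always, worse moves with temperature-dependent probability
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
            if cost < best_cost:
                best, best_cost = assign[:], cost
        else:
            assign[i] = old                    # reject the move
        t *= alpha                             # geometric cooling
    return best, best_cost

tasks = [2, 3, 4, 6, 2, 2, 5, 4]               # total work 28; lower bound 14 on 2 procs
assignment, span = sa_schedule(tasks)
print(span)
```

In the hybrid described above, a GA would evolve a population of such assignments and SA would refine the promising individuals; here only the refinement step is shown.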
Great knowledge of and experience in microbiology are required for accurate bacteria identification. Automating bacteria identification is desirable because skilled microbiologists and clinicians may be in short supply at times of great need. We propose an automatic bacteria identification framework that classifies three well-known classes of bacteria, namely Cocci, Bacilli, and Vibrio, from microscopic morphology using the Naïve Bayes classifier. The proposed framework comprises two steps. In the first step, the system is trained on a set of microscopic images containing Cocci, Bacilli, and Vibrio; the input images are normalized to emphasize diameter and shape features, and edge-based descriptors are extracted from them. In the second step, the Naïve Bayes classifier performs probabilistic inference based on the input descriptors. We used 64 images per class as the training set and 222 images, comprising the three classes of bacteria plus other random images such as humans and airplanes, as the test set; no images overlap between the training set and the test set. The system accurately discriminates the three classes of bacteria, and it also rejects images that do not belong to any of the three classes. These preliminary results demonstrate that a simple machine learning classifier with a set of simple image-based features can achieve high classification accuracy; they also demonstrate the efficacy and efficiency of our two-step automatic bacteria identification approach and motivate us to extend the framework to identify a variety of other types of bacteria.
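The classification step can be sketched with a minimal categorical Naïve Bayes built from scratch. The symbolic morphology features below (shape, elongation) are hypothetical stand-ins for the paper's edge-based image descriptors, and the Laplace smoothing constant is an assumed choice:

```python
import math
from collections import defaultdict

class NaiveBayes:
    """Minimal categorical Naive Bayes with Laplace smoothing."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.prior = {c: y.count(c) / len(y) for c in self.classes}
        self.counts = defaultdict(lambda: defaultdict(int))  # (class, feature) -> value counts
        self.totals = defaultdict(int)                       # class -> sample count
        for features, label in zip(X, y):
            for i, v in enumerate(features):
                self.counts[(label, i)][v] += 1
            self.totals[label] += 1
        return self

    def predict(self, features):
        def log_posterior(c):
            lp = math.log(self.prior[c])
            for i, v in enumerate(features):
                lp += math.log((self.counts[(c, i)][v] + 1) /
                               (self.totals[c] + 2))         # Laplace smoothing
            return lp
        return max(self.classes, key=log_posterior)

# Hypothetical morphology features: (shape, elongation)
X = [("round", "low"), ("round", "low"), ("rod", "high"),
     ("rod", "high"), ("curved", "high"), ("curved", "high")]
y = ["Cocci", "Cocci", "Bacilli", "Bacilli", "Vibrio", "Vibrio"]
clf = NaiveBayes().fit(X, y)
print(clf.predict(("round", "low")))  # -> Cocci
```

Working in log space avoids underflow when many features are multiplied, and the smoothing term keeps unseen feature values from zeroing out a class, which is what lets the classifier remain robust on descriptors it has rarely seen.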
Cross-platform applications in the mobile computing paradigm are applications that are coded in one standard language, compiled and built using platform-specific native APIs (Application Programming Interfaces), and able to run on different operating systems (such as Amazon Fire OS, Android, BlackBerry 10, Firefox OS, iOS, Ubuntu, Windows Phone, Windows 8, and Tizen). In this paper we propose a context-based cross-platform mobile application for Android that provides a service for generating unreserved tickets for Mumbai local trains. The system creates a ticket in the form of a QR code from basic information about the journey. With this system, passengers can avoid standing in long queues at stations for their tickets.
This study proposes a new router architecture to improve the performance of dynamic virtual channel allocation. The proposed router is designed to reduce hardware complexity while simultaneously improving power and area consumption. In the new structure, all of the controlling components are implemented sequentially inside the router's allocator modules. This optimizes communication between the controlling components and eliminates most of the hardware overhead of modular communication; eliminating these additional communications also reduces hardware complexity. To validate the proposed design on real hardware, the router has been implemented on a Field-Programmable Gate Array (FPGA). Since the implementation of a Network-on-Chip (NoC) requires a certain amount of chip area, the suggested approach also reduces the demand for hardware resources: the internal memory of the FPGA, which is fast and can be used with specific access patterns, is used to implement the control units. Using the FPGA memory saves hardware resources and enables the implementation of an FPGA-based NoC.
We have studied many papers on web page clustering techniques and their algorithms. Most papers published on the Web use popular algorithms; among them, similarity and clustering methods are widely used for extracting data from the Web. We summarize a selection of these well-known methods. Because different papers apply different methods, we are motivated to collect them and summarize the differences between them. This paper presents a comprehensive analysis of the different methods and aims to make them better understood. Most sections give detailed explanations of the papers studied, organized as a numbered list and a table. Today the Internet is saturated with web pages; content is extracted from clusters of web pages using related methods, and clustering systems are useful for extracting knowledge. The results obtained show that similarity and clustering methods provide the best results for those who study the Web.
In wireless sensor networks, adversaries can easily launch multiple attacks on the application layer and the network layer, such as false report injection attacks and wormhole attacks. These attacks drain finite energy resources through the false reports of compromised nodes and devastate the constructed routing paths through the illegal messages of adversary nodes. Statistical en-route filtering (SEF) has been proposed to drop false reports and counter the false report injection attack, and the localized encryption and authentication protocol (LEAP) has been proposed to detect illegal messages and protect against wormhole attacks. When these attacks occur at the same time, SEF and LEAP must be operated simultaneously to confront the false reports and illegal messages in the sensor network. In this paper, we propose a method that improves energy efficiency and the security level compared to the simultaneous application of SEF and LEAP. In the proposed method, three new types of keys are designed to detect these attacks effectively while identifying and eliminating redundancies. The effectiveness of the proposed method is verified through experiments whose results are compared to the simultaneous application of SEF and LEAP when the two attacks occur. The experimental results show that our method saves up to 8% more energy, improves the security level against compromised nodes by up to 10%, and maintains the same detection power against adversary nodes.
We have surveyed many papers on the Internet describing cryptographic techniques implemented by various algorithms. Most papers published on the Web use popular algorithms; among them, RSA is a very popular algorithm for Web security. We summarize the RSA algorithm in terms of its well-known parameters and attributes. Some papers target the cloud while others target different environments, which motivates us to collect them and summarize the differences between them. This paper presents a comprehensive analysis of different modified RSA algorithms and provides an in-depth account of RSA. Most sections give detailed explanations of the papers studied, organized as a numbered list and a table.
In real applications, graphs may be huge in terms of the number of nodes and edges. Many graph drawing algorithms have been developed, but most of them have difficulty with large graphs containing thousands of nodes. Clustering is one efficient method for drawing large graphs, alongside other techniques such as fisheye views, hyperbolic geometry, and distortion-oriented presentation. A clustered graph can significantly reduce visual complexity by replacing a set of nodes in a cluster with one abstract node; moreover, a hierarchically clustered graph can reveal superimposed structures over the original graph through a recursive clustering process. The link structure of the Web is modelled by the Web graph, and the results of such studies can be applied to Web algorithms such as crawling, searching, and tracing Web communities. This system proposes to track community similarities over time.
Augmented Reality (AR) allows the real-time blending of exhibited objects with digital information such as 3D models, still images, video clips, and web pages. This blending presents exhibition visitors with a more interesting and exciting experience. This paper focuses on the creation and presentation of artworks using mobile AR technology.
In this paper we analytically model technological processes of manufacturing of multiemitter hete... more In this paper we analytically model technological processes of manufacturing of multiemitter heterotransistors with account relaxation of mismatch-induced stress. Based on this modeling some recommendations to increase sharpness of p-n- junctions, which included into the transistors, and increasing compactness of the transistors have been formulated
“If your compassion does not include yourself, it is incomplete” – Gautama Buddha Telemedicine; u... more “If your compassion does not include yourself, it is incomplete” – Gautama Buddha Telemedicine; use of telecommunication and information technological services, which permits the communication between the users with convenience and fidelity, as well transmitting medical, images and health informatics data. Numerous image processing applications like Satellite Imaging, Medical Imaging and Video has images with too large size or stream size, with a large amount of space or high bandwidth
for communication in its original form. Integrity of the transmitted medical images and the informatics
data, without any compromise in the data is an essential product of telecommunication and information
technology. A colossal need for an adequate compression methodology, in adoption for the compression of
medical images /data, to domicile for various metrics like high bandwidth, resolution factors, storage of the
images/data, the obligation to perpetuate the validity and precision of data for subsequent perceived
diagnosis transactions. This leverages exacting coercions on the restoration error. In this paper we survey
the literature related to the Image Processing Methodologies based on ROI technique/s for Digital Imaging
and Communication for Medicine (DICOM). A scrutiny as such persuades with the several congestions
related to prospective techniques of lossless compression, recommending for a better and a unique image
compression technique.
Embedded systems shape our world nowadays. It’s almost impossible to imagine our day to day life ... more Embedded systems shape our world nowadays. It’s almost impossible to imagine our day to day life without it. Examples can include cell phones, home appliances, energy generators, satellites, automotive components …etc. it is even far more complex if there are real-time and interface constraints. Developing real-time embedded systems is significantly challenging to developers. Results need not be only correct, but also in a timely manner. New software development approaches are needed due to the
complexity of functional and non-functional requirements of embedded systems. Due to the complex context of embedded systems, defects can cause life threatening situations. Delays can create huge costs, and insufficient productivity can impact the entire industry. The rapid evolution of software engineering technologies will be a key factor in the successful future development of even more
complex embedded systems. Software development is shifting from manual programming, to model-driven engineering (MDE). One of the most important challenges is to manage the increasing complexity of embedded software development, while maintaining the product’s quality, reducing time to market, and reducing development cost.
MDE is a promising approach that emerged lately. Instead of directly coding the software using programming languages, developers model software systems using expressive, graphical notations, which provide a higher abstraction level than programming languages. This is called Model Based Development (MBD). Model Based Development if accompanied by Model Based Validation (MBV), will help identify problems early thus reduce rework cost. Applying tests based on the designed models not only enable early detection of defects, but also continuous quality assurance. Testing can start in the first iteration of the development
process. As a result of the model based approach, and in addition to the major advantage of early defects detection,
several time consuming tasks within the classical software development life cycle will be excluded. For embedded systems development, it’s really important to follow a more time efficient approach.
In this paper generation of binary sequences derived from chaotic sequences defined over Z4 is pr... more In this paper generation of binary sequences derived from chaotic sequences defined over Z4 is proposed. The six chaotic map equations considered in this paper are Logistic map, Tent Map, Cubic Map, Quadratic Map and Bernoulli Map. Using these chaotic map equations, sequences over Z4 are generated which are converted to binary sequences using polynomial mapping. Segments of sequences of different lengths are tested for cross correlation and linear complexity properties. It is found that some segments of different length of these sequences have good cross correlation and linear complexity properties. The Bit Error Rate performance in DS-CDMA communication systems using these binary sequences is found to be better than Gold sequences and Kasami sequences.
Most of people desire to know about online blood donation to the patients at once. Patients want ... more Most of people desire to know about online blood donation to the patients at once. Patients want to get blood to live at emergency time. At present people are needed to know how to contact blood donors online. This system provides how to get blood at their serious time to be longer life time. Matcher system is implemented with Decision Tree and Decision Table by rules. This matcher applies the rules based on
Blood Donation in Blood Bank in Myanmar. Information about donors and patients has been reserved in the system so that it is ready to donate blood instantly.
In this paper, we present a novel approach for image retrieval based on extraction of low level f... more In this paper, we present a novel approach for image retrieval based on extraction of low level features
using techniques such as Directional Binary Code (DBC), Haar Wavelet transform and Histogram of
Oriented Gradients (HOG). The DBC texture descriptor captures the spatial relationship between any pair
of neighbourhood pixels in a local region along a given direction, while Local Binary Patterns (LBP)
descriptor considers the relationship between a given pixel and its surrounding neighbours. Therefore,
DBC captures more spatial information than LBP and its variants, also it can extract more edge
information than LBP. Hence, we employ DBC technique in order to extract grey level texture features
(texture map) from each RGB channels individually and computed texture maps are further combined
which represents colour texture features (colour texture map) of an image. Then, we decomposed the
extracted colour texture map and original image using Haar wavelet transform. Finally, we encode the
shape and local features of wavelet transformed images using Histogram of Oriented Gradients (HOG) for
content based image retrieval. The performance of proposed method is compared with existing methods on
two databases such as Wang’s corel image and Caltech 256. The evaluation results show that our
approach outperforms the existing methods for image retrieval.
Today is a knowledge age so that world needs to become a more richer palace for everyone. Student... more Today is a knowledge age so that world needs to become a more richer palace for everyone. Students can
learn their lectures and students can do their exercises on the web as individually or collaboratively with
their peers like directed by the teacher by using the think-pair-share technique. The system provides the
ability to clear to decide on their choices about the questions. The K-means clustering method is used to
modify the pair state and support for determining students’ grade of classes. The main objective of this
study is to design a model for java programming learning system that facilitates the collaborative learning
activities in a virtual classroom.
This article presents a fully automatic segmentation method of liver CT scans using fuzzy cmean
... more This article presents a fully automatic segmentation method of liver CT scans using fuzzy cmean
clustering and level set. First, the difference of unique image is improved to make
boundaries clearer; second, a spatial fuzzy c-mean clustering combining with anatomical
previous information is engaged to extract liver area automatically Thirdly, a distance
regularized level set is used for modification; finally, morphological operations are used as postprocessing.
The experiment result shows that the method can achieve high accuracy (0.9986)
and specificity (0.9989). Comparing with standard level set method, our method is more
successful in dealing with over-segmentation difficulty.
Simultaneous Multi-Threading (SMT) is a technique by which multiple independent threads can be
executed simultaneously, increasing system throughput. To further improve machine performance,
another technique known as pipelining is implemented. Pipelining allows the SMT machine to execute
more than one function in the same cycle and optimizes several stages such as fetch, decode,
dispatch, issue, execute, and commit. Our focus is on the dispatch cycle, in which instructions are
dispatched from the individual re-order buffers to the shared issue queue. Branch misprediction
reduces the throughput of an SMT machine due to instruction flush-outs: the pipeline has to be
refilled starting from the branch instruction after the branch direction changes. This causes a
delay before instructions from that particular thread start to commit again, which adversely
affects overall system throughput. Our approach is to implement a technique called selective
context blocking after context-branch evaluation. Under this technique, the number of instructions
a thread is allowed to dispatch is restricted if the thread has a high branch misprediction
percentage over a certain number of recent branch instructions. As a result, we obtain an overall
throughput increase of up to 4% compared to the default algorithm.
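The throttling idea can be sketched as follows. The window size, threshold, and budgets here are illustrative placeholders, not values from the paper, and the class name is invented for the sketch:

```python
from collections import deque

class DispatchThrottle:
    """Per-thread dispatch limiter driven by recent branch-misprediction rate.

    Tracks the last `window` branch outcomes; when the misprediction rate
    exceeds `threshold`, the thread's per-cycle dispatch budget is reduced.
    All parameter values are illustrative, not taken from the paper.
    """
    def __init__(self, window=64, threshold=0.2, full_budget=8, reduced_budget=2):
        self.outcomes = deque(maxlen=window)
        self.threshold = threshold
        self.full_budget = full_budget
        self.reduced_budget = reduced_budget

    def record_branch(self, mispredicted):
        self.outcomes.append(bool(mispredicted))

    def dispatch_budget(self):
        if not self.outcomes:
            return self.full_budget
        rate = sum(self.outcomes) / len(self.outcomes)
        return self.reduced_budget if rate > self.threshold else self.full_budget
```

In a cycle-level simulator, the dispatch stage would query `dispatch_budget()` per thread each cycle instead of using a fixed share of the issue queue.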
Secure localization is a critical issue in wireless sensor networks. Providing a certain degree of
localization accuracy in the presence of malicious beacons is a challenging task. Robust Position
Estimation (ROPE) is one of the secure localization systems; however, it suffers from compromised
network entities. Trust Based Secure Robust Position Estimation filters out compromised network
entities using a trust model. Simulation results and analytical reasoning show that this scheme
provides security with minimal energy consumption.
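A generic trust-filtered position estimate can be sketched as a trust-weighted centroid of beacon positions; this is an illustrative stand-in, not the exact ROPE or trust-model computation, and the cutoff value is assumed:

```python
def trust_weighted_position(beacons):
    """Estimate a node's position as the trust-weighted centroid of beacon
    positions. `beacons` is a list of (x, y, trust) tuples; beacons whose
    trust falls below a cutoff are filtered out first."""
    CUTOFF = 0.5                                   # illustrative trust threshold
    trusted = [(x, y, t) for x, y, t in beacons if t >= CUTOFF]
    total = sum(t for _, _, t in trusted)
    x = sum(x * t for x, _, t in trusted) / total
    y = sum(y * t for _, y, t in trusted) / total
    return x, y
```

A beacon reported by the trust model as compromised (low trust) then simply stops influencing the estimate.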
Task scheduling on different processors with precedence constraints is NP-hard and finds a prominent
place in the fields of parallel computing and combinatorial optimization. However, it is quite
difficult to achieve an optimal solution to this problem with traditional optimization approaches
owing to the high computational complexity. Amongst the metaheuristics, Simulated Annealing (SA) and
Genetic Algorithms (GA) represent powerful combinatorial optimization methods with corresponding
strengths and weaknesses. Borrowing the respective advantages of the two paradigms, an effective
combination of GA and SA called hybrid GASA has been proposed for multiprocessor task scheduling
problems with precedence constraints. A bi-criteria objective function, the weighted sum of makespan
and total completion time, has been considered for the analysis. Comparative analysis using a
defined performance index on standard problems shows that the proposed hybrid GASA provides better
solution quality than simple GA or SA alone.
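The SA half of the hybrid can be illustrated on a toy version of the problem: assigning independent tasks to two processors to minimize makespan. This sketch ignores precedence constraints and the bi-criteria objective for brevity, so it shows only the annealing mechanics, not the paper's GASA:

```python
import math, random

def makespan(assign, times):
    """Makespan of assigning tasks (durations `times`) to 2 processors."""
    loads = [0.0, 0.0]
    for task, proc in enumerate(assign):
        loads[proc] += times[task]
    return max(loads)

def anneal(times, steps=5000, t0=10.0, alpha=0.999, seed=0):
    rng = random.Random(seed)
    cur = [rng.randrange(2) for _ in times]
    cost = makespan(cur, times)
    best, best_cost = cur[:], cost
    temp = t0
    for _ in range(steps):
        cand = cur[:]
        cand[rng.randrange(len(times))] ^= 1   # move one task to the other processor
        cand_cost = makespan(cand, times)
        # accept improvements always, worsenings with Boltzmann probability
        if cand_cost <= cost or rng.random() < math.exp((cost - cand_cost) / temp):
            cur, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = cur[:], cost
        temp *= alpha                          # geometric cooling schedule
    return best, best_cost
```

In the hybrid, GA would evolve a population of such assignments while SA refines individual candidates between generations.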
Great knowledge of and experience in microbiology are required for accurate bacteria identification.
Automating bacteria identification is desirable because there may be a shortage of skilled
microbiologists and clinicians at a time of great need. We propose an automatic bacteria
identification framework that can classify three well-known classes of bacteria, namely Cocci,
Bacilli, and Vibrio, from microscopic morphology using the Naïve Bayes classifier. The proposed
framework comprises two steps. In the first step, the system is trained using a set of microscopic
images containing Cocci, Bacilli, and Vibrio. The input images are normalized to emphasize the
diameter and shape features, and edge-based descriptors are then extracted from them. In the second
step, we use the Naïve Bayes classifier to perform probabilistic inference based on the input
descriptors. 64 images per class of bacteria were used as the training set, and 222 images
consisting of the three classes of bacteria plus other random images, such as humans and airplanes,
were used as the test set; no images overlap between the training set and the test set. The system
was found to accurately discriminate the three classes of bacteria and, moreover, to reject images
that did not belong to any of the three classes. These preliminary results demonstrate how a simple
machine learning classifier with a set of simple image-based features can achieve high
classification accuracy. They also demonstrate the efficacy and efficiency of our two-step automatic
bacteria identification approach and motivate us to extend this framework to identify a variety of
other types of bacteria.
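The classification step can be sketched with a minimal Gaussian Naïve Bayes over feature vectors (the paper does not specify its exact descriptor distribution; a Gaussian model per feature is one common choice):

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian Naive Bayes: one Gaussian per (class, feature) pair,
    features assumed conditionally independent given the class."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.logprior = np.log(np.array([(y == c).mean() for c in self.classes]))
        return self

    def predict(self, X):
        # log p(x|c) summed over independent features, plus log prior
        ll = -0.5 * (np.log(2 * np.pi * self.var)[None]
                     + (X[:, None, :] - self.mu[None]) ** 2 / self.var[None]).sum(-1)
        return self.classes[np.argmax(ll + self.logprior[None], axis=1)]
```

Rejection of non-bacteria images could be added by thresholding the winning class's log-likelihood, though the abstract does not state how the framework implements it.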
Cross-platform applications in the mobile computing paradigm are applications that are coded using
one standard language, compiled and built using platform-specific native APIs (Application
Programming Interfaces), and able to run on different operating systems (such as Amazon Fire OS,
Android, Blackberry 10, Firefox OS, iOS, Ubuntu, Windows Phone, Windows 8, and Tizen). In this paper
we propose a context-based cross-platform mobile application for the Android OS which provides a
service for generating unreserved tickets for Mumbai local trains. This system creates a ticket in
the form of a QR code using basic information about the journey. With this system, commuters will be
able to avoid standing in long queues at the stations for their tickets.
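The ticket payload a QR code would encode can be sketched with the standard library. The field names and the keyed integrity digest below are illustrative assumptions; the paper does not specify its payload format, and a real app would render the string with a QR library:

```python
import json, hashlib

def make_ticket_payload(source, destination, fare, date, secret=b"demo-key"):
    """Build the string a ticket QR code would encode: journey details plus a
    short keyed digest so a tampered ticket can be detected at inspection."""
    ticket = {"src": source, "dst": destination, "fare": fare, "date": date}
    body = json.dumps(ticket, sort_keys=True)
    digest = hashlib.sha256(secret + body.encode()).hexdigest()[:16]
    return body + "|" + digest

def verify_ticket_payload(payload, secret=b"demo-key"):
    """Recompute the digest over the body and compare."""
    body, _, digest = payload.rpartition("|")
    return hashlib.sha256(secret + body.encode()).hexdigest()[:16] == digest
```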
This study proposes a new router architecture to improve the performance of dynamic allocation of
virtual channels. The proposed router is designed to reduce hardware complexity and to improve power
and area consumption simultaneously. In the new structure, all of the controlling components have
been implemented sequentially inside the router's allocator modules. This optimizes communication
between the controlling components and eliminates most of the hardware overhead of modular
communication; eliminating the additional communication also reduces hardware complexity. To show
the validity of the proposed design on real hardware, the router has been implemented on a
Field-Programmable Gate Array (FPGA). Since the implementation of a Network-on-Chip (NoC) requires a
certain amount of area on the chip, the suggested approach is also able to reduce the demand for
hardware resources. In this method, the internal memory of the FPGA is used to implement the control
units; this memory is faster and can be used with specific access patterns. The use of FPGA memory
saves hardware resources and enables an FPGA-based NoC implementation.
We have studied many papers about web page clustering techniques and their algorithms. Most papers
published on the Web use popular algorithms; among them, similarity methods and clustering methods
are very popular for extracting data from the Web. We summarize these well-known methods, and since
different papers apply different methods, we were motivated to collect them and summarize the
differences between them. This paper presents a comprehensive analysis of the different methods and
intends to make them better recognized. Most sections give detailed explanations of the papers
studied, as a numbered list and a table. Today the Internet is saturated with web pages, and the
content of web pages is extracted from clusters of web pages using related methods; clustering
systems are useful for extracting knowledge. The results obtained show that similarity and
clustering methods provide the best results for people who study the Web.
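A representative similarity measure from this literature is cosine similarity over term-frequency vectors, sketched here in its simplest bag-of-words form (real systems add stemming, stop-word removal, and TF-IDF weighting):

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity between two documents' term-frequency vectors,
    a common pairwise measure used by web page clustering methods."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0
```

Clustering methods then group pages whose pairwise similarity exceeds a threshold, or feed the similarity matrix to an algorithm such as k-means or hierarchical clustering.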
In wireless sensor networks, adversaries can easily launch multiple attacks on the application layer
and the network layer, such as false report injection attacks and wormhole attacks. These attacks
drain finite energy resources through the false reports of compromised nodes and devastate the
constructed routing paths through the illegal messages of adversary nodes. Statistical en-route
filtering (SEF) has been proposed to drop false reports and counter the false report injection
attack, and a localized encryption and authentication protocol (LEAP) has been proposed to detect
illegal messages and protect against wormhole attacks. When these attacks occur at the same time,
SEF and LEAP must operate simultaneously to confront the false reports and illegal messages in the
sensor network. In this paper, we propose a method to improve the energy efficiency and the security
level compared to the simultaneous application of SEF and LEAP. In the proposed method, three types
of new keys are designed to effectively detect these attacks while identifying and eliminating
redundancies. The effectiveness of the proposed method is verified through experimentation, and the
results are compared to the simultaneous application of SEF and LEAP when the two attacks occur.
The experimental results show that our proposed method saves up to 8% more energy, improves the
security level against compromised nodes by up to 10%, and maintains the same detection power
against adversary nodes.
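The core of en-route filtering is that a report carries message authentication codes from several detecting nodes, and forwarding nodes verify the MAC for whichever key they share. A simplified sketch of that check (not the full SEF key-pool protocol, and the key IDs here are invented):

```python
import hmac, hashlib

def make_report(event, keys):
    """A report carries MACs computed with several detecting nodes' keys;
    any en-route node that shares one of those keys can verify its MAC."""
    macs = {kid: hmac.new(k, event.encode(), hashlib.sha256).hexdigest()
            for kid, k in keys.items()}
    return {"event": event, "macs": macs}

def en_route_check(report, kid, key):
    """A forwarding node verifies the MAC for the one key it shares;
    a report with a missing or wrong MAC is dropped."""
    expected = hmac.new(key, report["event"].encode(), hashlib.sha256).hexdigest()
    return report["macs"].get(kid) == expected
```

A false report injected by a node that lacks the legitimate keys fails this check at the first forwarder that shares one of the claimed keys, so its energy cost stops there.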
On the Internet we have searched for papers describing cryptographic techniques implemented by many
algorithms. Most papers published on the Web use popular algorithms; among them, RSA is a very
popular algorithm for security on the Web. We use attributes to summarize the RSA algorithm by means
of well-known parameters. Some papers target the cloud and others different environments, which
motivated us to collect them and summarize the differences between them. This paper focuses on a
comprehensive analysis of different modified RSA algorithms and supports an intensive recognition of
RSA. Most sections give detailed explanations of the papers studied, as a numbered list and a table.
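For reference, textbook RSA with its well-known parameters (n, e, d) fits in a few lines of Python. The tiny primes below are for illustration only and are never secure:

```python
def toy_rsa(p=61, q=53, e=17):
    """Textbook RSA key generation with tiny primes (illustration only).
    Returns (encrypt, decrypt) closures over the generated key pair."""
    n = p * q                    # public modulus
    phi = (p - 1) * (q - 1)     # Euler's totient of n
    d = pow(e, -1, phi)         # private exponent: e*d = 1 (mod phi)
    return (lambda m: pow(m, e, n)), (lambda c: pow(c, d, n))

encrypt, decrypt = toy_rsa()
```

The modified RSA variants surveyed typically alter one of these parameters or steps, e.g. using more than two primes or changing how d is derived.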
In real applications, graphs may be huge in terms of the number of nodes and edges. Many graph
drawing algorithms have been developed, but most of them have difficulty dealing with large graphs
with thousands of nodes. Clustering graphs is one efficient method for drawing large graphs, even
though other techniques exist, such as fisheye views, hyperbolic geometry, and distortion-oriented
presentation. A clustered graph can significantly reduce visual complexity by replacing a set of
nodes in a cluster with one abstract node. Moreover, a hierarchically clustered graph can reveal
structures superimposed over the original graph through a recursive clustering process. The link
structure of the Web is modelled using the Web graph, and the results of such studies can be applied
to Web algorithms such as crawling, searching, and tracing of Web communities. The proposed system
tracks the similarities of Web communities over time.
Augmented Reality (AR) allows the real-time blending of exhibited objects with digital information
such as 3D models, still images, video clips, and web pages. The blending presents exhibition
visitors with a more interesting and exciting experience. This paper focuses on the creation and
presentation of artworks using mobile AR technology.
The rapid development of multi-core systems and the increase in data-intensive applications in
recent years call for larger main memories. Traditional DRAM can increase its capacity by reducing
the feature size of the storage cell, but further scaling of DRAM now faces great challenges, and
the frequent refresh operations of DRAM bring considerable energy consumption. As an emerging
technology, Phase Change Memory (PCM) is a promising candidate for main memory. It draws wide
attention due to its advantages of low power consumption, high density, and non-volatility, while
it suffers from limited endurance and relatively long write latency. To handle the write problem,
optimizing the cache replacement policy to protect dirty cache blocks is an efficient approach. In
this paper, we construct a systematic multilevel structure and, based on it, propose a novel cache
replacement policy called MAC. MAC can effectively reduce write traffic to PCM memory with low
hardware overhead. We conduct simulation experiments on GEM5 to evaluate the performance of MAC
against related work. The results show that MAC performs best in reducing the number of writes (by
25.12% on average) without increasing program execution time.
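The general idea of protecting dirty blocks in the replacement policy can be sketched as a clean-first variant of LRU. This illustrates the principle only; it is not the paper's MAC policy or its multilevel structure:

```python
from collections import OrderedDict

class DirtyAwareCache:
    """LRU-like cache that prefers evicting clean blocks, so fewer dirty
    write-backs reach a write-limited memory such as PCM."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()            # addr -> dirty flag, in LRU order

    def access(self, addr, write=False):
        """Touch a block; returns the address written back to memory, if any."""
        writeback = None
        if addr in self.blocks:
            dirty = self.blocks.pop(addr) or write
        else:
            dirty = write
            if len(self.blocks) >= self.capacity:
                # evict the least-recently-used CLEAN block if one exists,
                # falling back to the LRU block when everything is dirty
                victim = next((a for a, d in self.blocks.items() if not d),
                              next(iter(self.blocks)))
                if self.blocks.pop(victim):
                    writeback = victim         # dirty eviction -> memory write
        self.blocks[addr] = dirty
        return writeback
```

Counting the write-backs returned by `access` under a trace gives exactly the PCM write traffic such a policy is trying to minimize.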
Biomimetic robotic fish design is an upcoming research area in which undulatory motion has its own
set of parameters that determine its efficiency. To analyse swimming efficiency, several existing
robotic models using undulatory motion have been thoroughly studied, and a few critical parameters
such as fin size, shape, material flexibility, and fin flapping rate have been identified. These
critical parameters were then simulated using a MATLAB-based CFD tool, mainly to design research
prototypes focused on optimizing swimming efficiency. Several combinations of the critical
parameters are discussed for robotic fish prototype design, with the aim of maximizing the swimming
efficiency of the robotic fish based on these properties. The fin models have been tested through
simulation using the MATLAB-based CFD tool, and the swimming speed and direction have also been
tested for various models. An improved benchmark for designing robotic fish models is proposed in
order to simplify the selection of components, architecture, and the cost involved.
In this paper, we propose a novel multi-hop cluster architecture for a location service protocol in
vehicular ad hoc networks. The proposed scheme uses two parameters, the connectivity between
vehicles and vehicle mobility, to select cluster heads. The performance of the proposed scheme is a
tradeoff between vehicle location accuracy and communication overhead. The proposed scheme is not
only scalable but also reliable, and it achieves high load balancing with fast convergence. The
clusters constructed by the proposed scheme are more stable than those of existing vehicular ad hoc
network clustering schemes; specifically, the proposed scheme can increase cluster-head lifetime by
up to 50%. The reason behind this achievement is that the proposed scheme considers vehicle mobility
in terms of average link expiration time while selecting cluster heads.
This paper deals with different loss mechanisms within single mode fiber (SMF) in optical fiber
communication. A number of mechanisms are responsible for signal attenuation within optical fibers:
as an optical signal propagates over a long stretch of fiber, it becomes attenuated by absorption,
scattering by material impurities, fiber bends, and other effects. High-bandwidth transmission can
handle vast amounts of information, and it can be further improved by reducing fiber losses,
increasing data rates and distances, and using an appropriate operating wavelength in optical fiber
communication. Recent developments in the area of fiber optic communication, as well as advances in
different fiber types and their properties such as attenuation (loss) and bandwidth, are also
discussed in this paper. The impact of the different losses, such as Rayleigh scattering, Stimulated
Brillouin Scattering (SBS), Stimulated Raman Scattering (SRS), and bending loss, among the various
loss mechanisms in fiber optic communication is shown through simulations.
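Fiber attenuation is usually quoted in dB/km, and the received power over a span follows P_out = P_in * 10^(-alpha*L/10). A quick calculation under typical assumed values:

```python
def received_power_mw(p_in_mw, alpha_db_per_km, length_km):
    """Power remaining after `length_km` of fiber with attenuation
    `alpha_db_per_km`, using the standard dB attenuation model."""
    loss_db = alpha_db_per_km * length_km
    return p_in_mw * 10 ** (-loss_db / 10)

# e.g. 1 mW launched into 50 km of SMF at 0.2 dB/km (typical near 1550 nm)
p_out = received_power_mw(1.0, 0.2, 50.0)   # 10 dB total loss -> 0.1 mW
```

The nonlinear effects the paper simulates (SBS, SRS) add power-dependent penalties on top of this linear attenuation.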
In this paper, we examine a WiMAX-based network and evaluate its performance with respect to quality
of service (QoS) using IEEE 802.16 technology. In our models, the study uses a multiprocessor
architecture organized through an interconnection network. OPNET Modeler is used to simulate the
architecture and to calculate the performance criteria (i.e. throughput, delay, and dropped data)
that are of concern in network evaluation. It is concluded that our models considerably shorten the
time needed to obtain performance measures such as end-to-end delay and throughput, and can be used
as an effective tool for this purpose.
In today’s network-based cloud computing era, software applications play a big role, and the
security of these applications is paramount to their successful use. These applications utilize
cryptographic algorithms to secure data over the network through encryption and decryption. The use
of parallel processors is now common in both mobile and cloud computing scenarios, and cryptographic
algorithms are compute-intensive and can significantly benefit from parallelism. This paper
introduces a parallel approach to the symmetric stream cipher RC4A, one of the strong variants of
RC4. We present an efficient parallel implementation of the compute-intensive PRGA (pseudo-random
generation algorithm) portion of the RC4A algorithm; the resulting algorithm is named PARC4-I. We
have added functionality in terms of lookup tables: the modified algorithm has four lookup tables
instead of two and is capable of returning four distinct output bytes at each iteration. Further,
with the help of a parallel additive stream cipher structure and loop unrolling,
encryption/decryption is performed on a multi-core machine. Finally, the results show that PARC4-I
is a time-efficient algorithm.
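For reference, the serial RC4 that RC4A builds on (RC4A keeps two such S-boxes and alternates their outputs; PARC4-I's four-table variant is not reproduced here) can be sketched as:

```python
def rc4_keystream(key, n):
    """Plain RC4: key-scheduling (KSA) followed by n bytes of keystream
    (PRGA). `key` is bytes; returns `n` keystream bytes."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # KSA: key-dependent permutation
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for _ in range(n):                         # PRGA: walk and swap the S-box
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)
```

XORing this keystream with the plaintext gives the ciphertext; the sequential dependence of `j` on the evolving S-box is precisely what makes parallelizing the PRGA, as PARC4-I does, non-trivial.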
Papers by Computer Applications: An International Journal (CAIJ) (ISSN: 2393-8455)
for communication in its original form. The integrity of transmitted medical images and informatics
data, without any compromise of the data, is an essential product of telecommunication and
information technology. There is a colossal need for an adequate compression methodology for medical
images/data, one that accommodates various constraints such as high bandwidth, resolution factors,
storage of the images/data, and the obligation to preserve the validity and precision of the data
for subsequent diagnosis; this places exacting constraints on the restoration error. In this paper
we survey the literature on image processing methodologies based on ROI techniques for Digital
Imaging and Communications in Medicine (DICOM). Such a scrutiny reveals the several constraints of
prospective lossless compression techniques and recommends a better and unique image compression
technique.
complexity of functional and non-functional requirements of embedded systems. Due to the complex
context of embedded systems, defects can cause life-threatening situations, delays can create huge
costs, and insufficient productivity can impact the entire industry. The rapid evolution of software
engineering technologies will be a key factor in the successful future development of even more
complex embedded systems. Software development is shifting from manual programming to model-driven
engineering (MDE). One of the most important challenges is to manage the increasing complexity of
embedded software development while maintaining product quality, reducing time to market, and
reducing development cost. MDE is a promising approach that has emerged lately. Instead of directly
coding the software using programming languages, developers model software systems using expressive
graphical notations, which provide a higher abstraction level than programming languages; this is
called Model Based Development (MBD). Model Based Development, if accompanied by Model Based
Validation (MBV), helps identify problems early and thus reduces rework cost. Applying tests based
on the designed models not only enables early detection of defects but also continuous quality
assurance, since testing can start in the first iteration of the development process. As a result of
the model-based approach, and in addition to the major advantage of early defect detection, several
time-consuming tasks within the classical software development life cycle can be excluded. For
embedded systems development, it is important to follow a more time-efficient approach.
blood donation in blood banks in Myanmar. Information about donors and patients is kept in the
system so that blood is ready to be donated instantly.
using techniques such as Directional Binary Code (DBC), the Haar wavelet transform, and Histograms
of Oriented Gradients (HOG). The DBC texture descriptor captures the spatial relationship between
any pair of neighbourhood pixels in a local region along a given direction, while the Local Binary
Patterns (LBP) descriptor considers the relationship between a given pixel and its surrounding
neighbours; therefore, DBC captures more spatial information than LBP and its variants, and it can
also extract more edge information than LBP. Hence, we employ the DBC technique to extract grey
level texture features (a texture map) from each RGB channel individually, and the computed texture
maps are then combined to represent the colour texture features (colour texture map) of an image.
We then decompose the extracted colour texture map and the original image using the Haar wavelet
transform. Finally, we encode the shape and local features of the wavelet transformed images using
Histograms of Oriented Gradients (HOG) for content based image retrieval. The performance of the
proposed method is compared with existing methods on two databases, Wang's Corel images and
Caltech 256. The evaluation results show that our approach outperforms the existing methods for
image retrieval.
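The LBP baseline that the abstract compares DBC against can be sketched in a few lines: each interior pixel gets an 8-bit code, one bit per neighbour that is at least as bright as the centre. (DBC itself compares neighbour pairs along a direction; that variant is not reproduced here.)

```python
import numpy as np

def lbp(image):
    """Basic 3x3 Local Binary Patterns: each interior pixel gets an 8-bit
    code, one bit per neighbour with value >= the centre pixel."""
    img = np.asarray(image, dtype=np.int32)
    c = img[1:-1, 1:-1]                        # interior (centre) pixels
    # 8 neighbour offsets of the 3x3 window, one bit each
    shifts = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[dy:dy + c.shape[0], dx:dx + c.shape[1]]
        codes |= (nb >= c).astype(np.int32) << bit
    return codes
```

A histogram of these codes over an image region is the texture descriptor; the paper's pipeline computes the analogous DBC map per RGB channel before the wavelet and HOG stages.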
learn their lectures, and students can do their exercises on the web individually or collaboratively
with their peers, as directed by the teacher, using the think-pair-share technique. The system
provides students the ability to decide clearly on their choices about the questions. The K-means
clustering method is used to modify the pair state and to support determining students' grades in
classes. The main objective of this study is to design a model for a Java programming learning
system that facilitates collaborative learning activities in a virtual classroom.
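The K-means step used for pairing and grading can be illustrated with a minimal Lloyd's-algorithm sketch over student feature vectors (e.g. scores); this is generic K-means, not the system's specific pairing logic:

```python
import numpy as np

def kmeans(X, k=2, iters=50, seed=0):
    """Minimal K-means (Lloyd's algorithm): returns (centers, labels)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)              # assign each point to nearest center
        for j in range(k):                     # recompute each non-empty centroid
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels
```

Clustering students by performance in this way could, for instance, pair a student from a stronger cluster with one from a weaker cluster in the think-pair-share phase.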
clustering and level set. First, the difference of unique image is improved to make
boundaries clearer; second, a spatial fuzzy c-mean clustering combining with anatomical
previous information is engaged to extract liver area automatically Thirdly, a distance
regularized level set is used for modification; finally, morphological operations are used as postprocessing.
The experiment result shows that the method can achieve high accuracy (0.9986)
and specificity (0.9989). Comparing with standard level set method, our method is more
successful in dealing with over-segmentation difficulty.
simulta-neously executed and it increases the system throughput. For further improving the performance of
the machine one more technique known as pipelining is implemented. Pipelining allows the SMT machine
to execute more than one function in the same cycle. Pipelining optimizes several processes such as fetch,
decode, dispatch, issue, execute and commit. Our focus is on the Dispatch Cycle in which instructions get
Dispatched from the individual re-order buffers to the shared Issue Queue. Branch Miss-Prediction
reduces the throughput of SMT machine due to the instruction flush-outs. The pipeline then has to be
refilled starting from the branch instruction, after changing the branch direction. It causes a delay in the
pipeline after which instructions will start to commit from that particular thread and this affects the overall
system throughput adversely. Our approach is to implement a technique called as selective context
blocking after context-branch evaluation. According to this technique the number of instructions a thread is
allowed to dispatch is controlled, if it has a higher Branch Miss-Prediction percent-age within the past
certain number of Branch Instructions. As a result of this we get an overall throughput increase of up to 4
% in the performance when compared to the default algorithm.
localization accuracy at the presence of malicious beacons becomes a challenging task. Robust
Position Estimation (ROPE) is one of the secure localization systems. However, this system suffers
from compromised network entities. Trust Based Secure Robust Position Estimation filters the
compromised network entities with the use of trust model. Simulation results and analytical
reasoning shows that the security is provided by this scheme with minimal energy consumption.
place in the field of parallel computing and combinatorial optimization. However, it is quite difficult to
achieve an optimal solution to this problem with traditional optimization approaches owing to the high
computational complexity. Amongst the metaheuristics, Simulated Annealing (SA) and Genetic Algorithm
(GA) represent the powerful combinatorial optimization methods with corresponding strengths and
weaknesses. Borrowing the respective advantages of the two paradigms, an effective combination of GA
and SA called hybrid GASA has been proposed for multiprocessor task scheduling problems with
precedence constraints. The bi-criteria objective function, including the weighted sum of makespan and
total completion has been considered for the analysis. Comparative analysis with the help of defined
performance index on the standard problems shows that the proposed hybrid GASA provides better results
when compared to simple GA and SA alone in terms of solution quality.
Automation of bacteria identification is required because there might be a shortage of skilled
microbiologists and clinicians at a time of great need. We propose an automatic bacteria identification
framework that can classify three famous classes of bacteria namely Cocci, Bacilli and Vibrio from
microscopic morphology using the Naïve Bayes classifier. The proposed bacteria identification framework
comprises two steps. In the first step, the system is trained using a set of microscopic images containing
Cocci, Bacilli, and Vibrio. The input images are normalized to emphasize the diameter and shape features.
Edge-based descriptors are then extracted from the input images. In the second step, we use the Naïve
Bayes classifier to perform probabilistic inference based on the input descriptors. 64 images for each class
of bacteria were used as the training set and 222 images consisting of the three classes of bacteria and
other random images such as humans and airplanes were used as the test set. There are no images
overlapped between the training set and the test set. The system was found to be able to accurately
discriminate the three classes of bacteria. Moreover, the system was also found to be able to reject images
that did not belong to any of the three classes of bacteria. The preliminary results demonstrate how a
simple machine learning classifier with a set of simple image-based features can result in high
classification accuracy. The preliminary results also demonstrate the efficacy and efficiency of our two-step
automatic bacteria identification approach and motivate us to extend this framework to identify a variety of
other types of bacteria.
standard language ,compiled and build using platform specific native API (Application Programming
Interface) and it can run on different operating systems (such as Amazon Fire OS, Android, Blackberry 10,
Firefox OS, iOS, Ubuntu, Windows Phone, Windows 8, Tizen).In this paper we propose a context based
cross platform mobile application for android OS which provide service for generating unreserved ticket
for Mumbai local trains. This system will create a ticket in the form of QR code using basic information of
the journey. With this system the will be able to avoid to stand in long queues at the stations for their
tickets.
channels. The proposed router is designed to reduce the hardware complexity and to improve power and
area consumption, simultaneously. In the new structure of the proposed router, all of the controlling
components have been implemented sequentially inside the allocator router modules. This optimizes
communications between the controlling components and eliminates the most of hardware overloads of
modular communications. Eliminating additional communications also reduces the hardware complexity.
In order to show the validity of the proposed design in real hardware resources, the proposed router has
been implemented onto a Field-Programmable Gate Array (FPGA). Since the implementation of a
Network-on-Chip (NoC) requires certain amount of area on the chip, the suggested approach is also able
to reduce the demand of hardware resources. In this method, the internal memory of the FPGA is used for
implementing control units. This memory is faster and can be used with specific patterns. The use of the
FPGA memory saves the hardware resources and allows the implementation of NoC based FPGA.
algorithms. Almost papers published in the Web are used popular algorithms. Among them similarity
methods and clustering methods are very popular algorithms for extraction data in the Web. We use some
methods to summarize by means of well known those methods. Some papers are applied to different
methods. It is motivated to collect and summarize what are differences between them. This paper presents a
comprehensive analysis of different methods and intends to be more recognized those methods. Most of
sections are well composed for detail explanations about papers studied as a list of numbers and a table.
Today web pages are saturated on the Internet. Content of web pages are extracted from cluster of web
page using related methods. Clustering systems are useful to extract knowledge. The results obtained show
that the status of using similarity and clustering method provides the best results for people who study
about web.
network layer, such as false report injection attacks and wormhole attacks. These attacks drain finite
energy resources through false reports of compromised nodes, and devastate the constructed routing paths
through the illegal messages of adversary nodes. Statistical en-route filtering (SEF) is proposed in order to
drop the false reports against the false report injection attack, and a localized encryption and
authentication protocol (LEAP) is proposed to detect the illegal messages to protect against wormhole
attacks. When these attacks occur at the same time, SEF and LEAP should be operated simultaneously to
confront the false reports and the illegal message in the sensor network. In this paper, we propose a
method to improve the energy efficiency and the security level as compared to the simultaneous application
of SEF and LEAP. In the proposed method, three types of new keys are designed to effectively detect these
attacks, identifying and eliminating the redundancies. The effectiveness of the proposed method is verified
through experimentation, and the results are compared to the simultaneous application of SEF and LEAP
when the two attacks occur. The experiment results show that our proposed method saves up to 8% more
energy, improves the security level of the compromised nodes up to 10%, and maintains the same detection
power of the adversary nodes.
imposed by many algorithms. Most papers published on the Web rely on popular algorithms; among
them, RSA is one of the most widely used algorithms for Web security. We summarize the RSA
algorithm and its variants in terms of well-known parameters. Some papers target the cloud, while
others address different environments, which motivates us to collect and summarize the differences
between them. This paper presents a comprehensive analysis of different modified RSA algorithms and
provides a thorough characterization of RSA. Most sections give detailed explanations of the papers
studied, organized as a numbered list and a table.
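As background for the surveyed variants, a textbook RSA sketch (toy primes, no padding) illustrates the parameters — n, e, d — that the modified algorithms adjust:

```python
# Textbook RSA sketch: key generation, encryption, decryption.
# Illustration only -- real deployments need large primes and padding.
from math import gcd

def keygen(p, q, e=65537):
    n, phi = p * q, (p - 1) * (q - 1)
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)        # modular inverse of e (Python 3.8+)
    return (n, e), (n, d)      # public key, private key

def encrypt(m, pub):
    n, e = pub
    return pow(m, e, n)

def decrypt(c, priv):
    n, d = priv
    return pow(c, d, n)

pub, priv = keygen(61, 53, e=17)   # toy primes, toy exponent
c = encrypt(42, pub)
assert decrypt(c, priv) == 42
```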
algorithms have been developed, but most of them have difficulty dealing with large graphs with thousands
of nodes. Clustering graphs is one efficient method to draw large graphs even though other techniques
exist, such as fisheye view, hyperbolic geometry and distortion-oriented presentation. A clustered graph
can significantly reduce visual complexity by replacing a set of nodes in a cluster with one abstract node.
Moreover, a hierarchically clustered graph can find superimposed structures over the original graph
through a recursive clustering process. The link structure of the Web is modelled as a Web graph. The
results of such studies can inform Web algorithms such as crawling, searching, and tracing of Web
communities. The proposed system tracks similar communities over time.
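The abstraction step described above — replacing a cluster's nodes with one abstract node — can be sketched as follows; this is a minimal illustration with invented node and cluster names:

```python
# Collapse a clustered graph: each cluster becomes one abstract node,
# intra-cluster edges disappear, inter-cluster edges are kept once.
def collapse(edges, cluster_of):
    """edges: iterable of (u, v); cluster_of: node -> cluster id."""
    abstract = set()
    for u, v in edges:
        cu, cv = cluster_of[u], cluster_of[v]
        if cu != cv:                          # intra-cluster edges vanish
            abstract.add((min(cu, cv), max(cu, cv)))
    return sorted(abstract)

edges = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e")]
cluster_of = {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2}
assert collapse(edges, cluster_of) == [(1, 2)]   # five nodes -> two
```

Visual complexity drops because only one edge between each pair of clusters survives, whatever the number of underlying links.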
blending of exhibited objects with digital information such as 3D models, still images, video clips, and
web pages. The blending presents exhibition visitors with a more interesting and exciting experience.
This paper focuses on the creation and presentation of artworks using mobile AR technology.
for communication in its original form. Preserving the integrity of transmitted medical images and
informatics data, without any compromise, is an essential requirement of telecommunication and
information technology. There is a pressing need for an adequate compression methodology for medical
images and data that accommodates metrics such as bandwidth, resolution, and storage, while
preserving the validity and precision of the data for subsequent diagnosis. This places exacting
constraints on the restoration error. In this paper we survey the literature on image processing
methodologies based on region-of-interest (ROI) techniques for Digital Imaging and Communications
in Medicine (DICOM). Such a survey reveals the bottlenecks of prospective lossless compression
techniques and recommends a better, unique image compression technique.
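As an illustration of the ROI idea surveyed here, a minimal sketch keeps the region of interest lossless while coarsely quantizing the background; real DICOM ROI coding (e.g. JPEG 2000 ROI scaling) is far more involved, and the quantization step below is an arbitrary assumption:

```python
# ROI-based compression sketch: pixels inside the region of interest
# are kept exactly; background pixels are coarsely quantized (lossy).
def compress(image, roi, step=32):
    """image: 2D list of 0-255 ints; roi: (r0, r1, c0, c1) kept lossless."""
    r0, r1, c0, c1 = roi
    out = []
    for r, row in enumerate(image):
        out.append([
            px if r0 <= r < r1 and c0 <= c < c1
            else (px // step) * step          # lossy background
            for c, px in enumerate(row)
        ])
    return out

img = [[10, 200], [37, 90]]
res = compress(img, (0, 1, 0, 2), step=32)    # first row is the ROI
assert res[0] == [10, 200]                    # ROI untouched
assert res[1] == [32, 64]                     # background quantized
```

The coarsened background compresses far better under any entropy coder, while the diagnostically relevant region incurs zero restoration error.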
complexity of functional and non-functional requirements of embedded systems. Due to the complex context of embedded systems, defects can cause life threatening situations. Delays can create huge costs, and insufficient productivity can impact the entire industry. The rapid evolution of software engineering technologies will be a key factor in the successful future development of even more
complex embedded systems. Software development is shifting from manual programming, to model-driven engineering (MDE). One of the most important challenges is to manage the increasing complexity of embedded software development, while maintaining the product’s quality, reducing time to market, and reducing development cost.
MDE is a promising approach that emerged lately. Instead of directly coding the software using programming languages, developers model software systems using expressive, graphical notations, which provide a higher abstraction level than programming languages. This is called Model Based Development (MBD). Model Based Development if accompanied by Model Based Validation (MBV), will help identify problems early thus reduce rework cost. Applying tests based on the designed models not only enable early detection of defects, but also continuous quality assurance. Testing can start in the first iteration of the development
process. As a result of the model based approach, and in addition to the major advantage of early defects detection,
several time consuming tasks within the classical software development life cycle will be excluded. For embedded systems development, it’s really important to follow a more time efficient approach.
Blood Donation in Blood Bank in Myanmar. Information about donors and patients has been reserved in the system so that it is ready to donate blood instantly.
using techniques such as Directional Binary Code (DBC), Haar Wavelet transform and Histogram of
Oriented Gradients (HOG). The DBC texture descriptor captures the spatial relationship between any pair
of neighbourhood pixels in a local region along a given direction, while Local Binary Patterns (LBP)
descriptor considers the relationship between a given pixel and its surrounding neighbours. Therefore,
DBC captures more spatial information than LBP and its variants, also it can extract more edge
information than LBP. Hence, we employ DBC technique in order to extract grey level texture features
(texture map) from each RGB channels individually and computed texture maps are further combined
which represents colour texture features (colour texture map) of an image. Then, we decomposed the
extracted colour texture map and original image using Haar wavelet transform. Finally, we encode the
shape and local features of wavelet transformed images using Histogram of Oriented Gradients (HOG) for
content based image retrieval. The performance of proposed method is compared with existing methods on
two databases such as Wang’s corel image and Caltech 256. The evaluation results show that our
approach outperforms the existing methods for image retrieval.
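For reference, the basic 3x3 LBP code that DBC is compared against can be computed as follows; this is the standard textbook formulation, not the paper's code:

```python
# Basic LBP: threshold the 8 neighbours of a pixel against its centre
# value and pack the results into an 8-bit code.
def lbp_code(patch):
    """patch: 3x3 grey-level values; returns the 8-bit LBP code."""
    c = patch[1][1]
    # neighbours clockwise from top-left
    nbrs = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
            patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, v in enumerate(nbrs):
        if v >= c:                 # neighbour >= centre -> bit set
            code |= 1 << bit
    return code

# upper-left edge: bits 0,1,2,7 set -> 0b10000111 = 135
assert lbp_code([[9, 9, 9], [9, 5, 1], [1, 1, 1]]) == 0b10000111
```

DBC differs in that it thresholds first-order derivatives between neighbour pairs along a fixed direction rather than each neighbour against the centre, which is why it captures more directional edge information.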
learn their lectures, and students can do their exercises on the web individually or collaboratively with
their peers, as directed by the teacher, using the think-pair-share technique. The system enables students
to decide clearly on their answers to the questions. The K-means clustering method is used to modify
the pair state and to support determining students' grade classes. The main objective of this study is to
design a model for a Java programming learning system that facilitates collaborative learning activities
in a virtual classroom.
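The K-means grouping step can be sketched in one dimension, assuming students are clustered by an exercise score; the scores, initial centres, and cluster count below are invented for illustration:

```python
# Minimal 1-D k-means: assign each score to the nearest centre, then
# recompute centres as group means, and repeat.
def kmeans_1d(xs, centers, iters=20):
    for _ in range(iters):
        groups = {c: [] for c in centers}
        for x in xs:
            nearest = min(centers, key=lambda c: abs(x - c))
            groups[nearest].append(x)
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

scores = [35, 40, 42, 80, 85, 90]          # hypothetical exercise scores
centers = kmeans_1d(scores, [0.0, 100.0])  # two grade classes
assert centers == [39.0, 85.0]
```

Students whose scores fall in the same cluster can then be paired or assigned the same grade class.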
clustering and level set. First, the contrast of the original image is enhanced to make boundaries
clearer; second, spatial fuzzy c-means clustering combined with anatomical prior information is
employed to extract the liver area automatically; third, a distance-regularized level set is used for
refinement; finally, morphological operations are applied as post-processing. The experimental results
show that the method achieves high accuracy (0.9986) and specificity (0.9989). Compared with the
standard level set method, our method is more successful in dealing with the over-segmentation
problem.
simultaneously executed, which increases system throughput. To further improve the performance of
the machine, another technique known as pipelining is implemented. Pipelining allows the SMT machine
to execute more than one function in the same cycle and optimizes several stages such as fetch,
decode, dispatch, issue, execute, and commit. Our focus is on the dispatch cycle, in which instructions
are dispatched from the individual re-order buffers to the shared issue queue. Branch misprediction
reduces the throughput of an SMT machine due to instruction flush-outs. The pipeline then has to be
refilled starting from the branch instruction, after changing the branch direction. This causes a delay in
the pipeline before instructions start to commit from that particular thread, which adversely affects
overall system throughput. Our approach is to implement a technique called selective context blocking
after context-branch evaluation. Under this technique, the number of instructions a thread is allowed to
dispatch is limited if it has a high branch-misprediction percentage within a recent window of branch
instructions. As a result, we achieve an overall throughput increase of up to 4% compared to the default
algorithm.
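The throttling idea can be sketched as follows; the window size, threshold, and dispatch widths are illustrative assumptions, not the paper's values:

```python
# Selective context blocking sketch: a thread's per-cycle dispatch
# bandwidth is reduced when its branch-misprediction rate over a
# recent window exceeds a threshold.
from collections import deque

class ThreadCtx:
    def __init__(self, window=100, threshold=0.2, full=8, reduced=2):
        self.hist = deque(maxlen=window)   # recent branch outcomes
        self.threshold = threshold
        self.full, self.reduced = full, reduced

    def record_branch(self, mispredicted: bool):
        self.hist.append(mispredicted)

    def dispatch_limit(self) -> int:
        """Instructions this thread may dispatch this cycle."""
        if not self.hist:
            return self.full
        rate = sum(self.hist) / len(self.hist)
        return self.reduced if rate > self.threshold else self.full

hot = ThreadCtx()
for _ in range(10):
    hot.record_branch(True)        # heavily mispredicting thread
assert hot.dispatch_limit() == 2   # throttled
cold = ThreadCtx()
cold.record_branch(False)
assert cold.dispatch_limit() == 8  # full bandwidth
```

Throttled threads waste fewer shared issue-queue entries on instructions likely to be flushed, which is where the throughput gain comes from.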
localization accuracy in the presence of malicious beacons becomes a challenging task. Robust
Position Estimation (ROPE) is one of the secure localization systems; however, it suffers from
compromised network entities. Trust-Based Secure Robust Position Estimation filters out compromised
network entities using a trust model. Simulation results and analytical reasoning show that this scheme
provides security with minimal energy consumption.
place in the field of parallel computing and combinatorial optimization. However, it is quite difficult to
achieve an optimal solution to this problem with traditional optimization approaches owing to the high
computational complexity. Amongst the metaheuristics, Simulated Annealing (SA) and Genetic Algorithm
(GA) represent the powerful combinatorial optimization methods with corresponding strengths and
weaknesses. Borrowing the respective advantages of the two paradigms, an effective combination of GA
and SA called hybrid GASA has been proposed for multiprocessor task scheduling problems with
precedence constraints. The bi-criteria objective function, a weighted sum of makespan and total
completion time, has been considered for the analysis. Comparative analysis with the help of a defined
performance index on the standard problems shows that the proposed hybrid GASA provides better results
when compared to simple GA and SA alone in terms of solution quality.
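The bi-criteria objective can be written directly; the weight w below is an assumption, since the exact weighting is not stated here:

```python
# Bi-criteria scheduling objective: weighted sum of makespan (latest
# finish time) and total completion time (sum of all finish times).
def bicriteria(finish_times, w=0.5):
    makespan = max(finish_times)
    total_completion = sum(finish_times)
    return w * makespan + (1 - w) * total_completion

# three tasks finishing at t = 3, 5, 8
assert bicriteria([3, 5, 8], w=0.5) == 0.5 * 8 + 0.5 * 16   # = 12.0
```

Both GA and SA (and their hybrid) would evaluate candidate schedules with a fitness function of this shape.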
Automation of bacteria identification is required because there might be a shortage of skilled
microbiologists and clinicians at a time of great need. We propose an automatic bacteria identification
framework that can classify three well-known classes of bacteria, namely Cocci, Bacilli, and Vibrio, from
microscopic morphology using the Naïve Bayes classifier. The proposed bacteria identification framework
comprises two steps. In the first step, the system is trained using a set of microscopic images containing
Cocci, Bacilli, and Vibrio. The input images are normalized to emphasize the diameter and shape features.
Edge-based descriptors are then extracted from the input images. In the second step, we use the Naïve
Bayes classifier to perform probabilistic inference based on the input descriptors. 64 images for each class
of bacteria were used as the training set and 222 images consisting of the three classes of bacteria and
other random images, such as humans and airplanes, were used as the test set. No images overlap
between the training set and the test set. The system was found to be able to accurately
discriminate the three classes of bacteria. Moreover, the system was also found to be able to reject images
that did not belong to any of the three classes of bacteria. The preliminary results demonstrate how a
simple machine learning classifier with a set of simple image-based features can result in high
classification accuracy. The preliminary results also demonstrate the efficacy and efficiency of our two-step
automatic bacteria identification approach and motivate us to extend this framework to identify a variety of
other types of bacteria.
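The probabilistic inference step can be sketched with a minimal Gaussian Naïve Bayes over a toy one-dimensional feature standing in for the edge-based descriptors; the data values are invented:

```python
# Gaussian Naive Bayes sketch: fit a per-class normal distribution to a
# feature, then classify by maximum log-likelihood.
import math

def fit(data):                       # data: {label: [feature values]}
    model = {}
    for label, xs in data.items():
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / len(xs) + 1e-9
        model[label] = (mu, var)
    return model

def predict(model, x):
    def loglik(mu, var):
        return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)
    return max(model, key=lambda label: loglik(*model[label]))

# hypothetical "diameter-like" feature per class
model = fit({"cocci": [1.0, 1.2, 0.9], "bacilli": [4.0, 4.5, 3.8]})
assert predict(model, 1.1) == "cocci"
assert predict(model, 4.2) == "bacilli"
```

Rejection of out-of-class images, as described above, could be added by thresholding the winning log-likelihood.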
standard language, compiled and built using a platform-specific native API (Application Programming
Interface), and it can run on different operating systems (such as Amazon Fire OS, Android, BlackBerry
10, Firefox OS, iOS, Ubuntu, Windows Phone, Windows 8, and Tizen). In this paper we propose a
context-based cross-platform mobile application for the Android OS which provides a service for
generating unreserved tickets for Mumbai local trains. The system creates a ticket in the form of a QR
code from basic information about the journey. With this system, passengers will be able to avoid
standing in long queues at stations for their tickets.
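A hedged sketch of what such a ticket payload might look like before QR rendering: journey details packed into a signed string. The field names, signing scheme, and key are illustrative assumptions, and an external library (e.g. the third-party `qrcode` package) would render the final image:

```python
# Hypothetical ticket payload: journey details as JSON plus an HMAC so
# a conductor's scanner can detect tampering.
import hashlib
import hmac
import json

SECRET = b"railway-demo-key"   # assumption: server-side signing key

def make_ticket(src, dst, fare, date):
    payload = json.dumps({"src": src, "dst": dst,
                          "fare": fare, "date": date}, sort_keys=True)
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "|" + sig      # string to encode as a QR code

def verify(ticket):
    payload, sig = ticket.rsplit("|", 1)
    good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, good)

t = make_ticket("Churchgate", "Borivali", 15, "2016-01-05")
assert verify(t)
assert not verify(t.replace('"fare": 15', '"fare": 5'))   # tampered fare
```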
channels. The proposed router is designed to reduce hardware complexity and to reduce power and
area consumption simultaneously. In the new structure of the proposed router, all of the controlling
components have been implemented sequentially inside the router's allocator modules. This optimizes
communication between the controlling components and eliminates most of the hardware overhead of
modular communication. Eliminating additional communication also reduces hardware complexity.
To show the validity of the proposed design on real hardware resources, the proposed router has been
implemented on a Field-Programmable Gate Array (FPGA). Since the implementation of a
Network-on-Chip (NoC) requires a certain amount of chip area, the suggested approach is also able
to reduce the demand for hardware resources. In this method, the internal memory of the FPGA is used
to implement the control units; this memory is faster and can be used with specific access patterns. The
use of FPGA memory saves hardware resources and allows the implementation of an NoC on the FPGA.
algorithms. Most papers published on the Web rely on popular algorithms; among them, similarity
methods and clustering methods are widely used for extracting data from the Web. We summarize a
number of these well-known methods. Since different papers apply different methods, we are motivated
to collect and summarize the differences between them. This paper presents a comprehensive analysis
of the different methods and aims to make them better understood. Most sections give detailed
explanations of the papers studied, organized as a numbered list and a table. Today the Internet is
saturated with web pages, and content is extracted from clusters of web pages using related methods.
Clustering systems are useful for extracting knowledge. The results obtained show that using similarity
and clustering methods provides the best results for people who study the Web.
is promising as a main memory technology. It draws wide attention due to its low power consumption, high density, and non-volatility, while it suffers from finite endurance and relatively long write latency. To mitigate the write problem, optimizing the cache replacement policy to protect dirty cache blocks is an efficient approach. In this paper, we construct a systematic multilevel structure and, based on it,
propose a novel cache replacement policy called MAC. MAC can effectively reduce write traffic to PCM memory with low hardware overhead. We conduct simulation experiments on GEM5 to evaluate the performance of MAC and other related works. The results show that MAC performs best in reducing the number of writes (by 25.12% on average) without increasing program execution time.
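A policy in the spirit of MAC — protecting dirty blocks so fewer writes reach PCM — can be sketched as a clean-first LRU variant; the structure below is illustrative, not the paper's actual multilevel design:

```python
# Clean-first eviction sketch: on a miss, evict the least-recently-used
# *clean* block if one exists, so dirty blocks (pending PCM writes)
# stay cached longer.
from collections import OrderedDict

class CleanFirstCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()      # addr -> dirty flag, LRU order

    def access(self, addr, write=False):
        dirty = self.blocks.pop(addr, False) or write
        if len(self.blocks) >= self.capacity:
            self._evict()
        self.blocks[addr] = dirty        # most recently used at the end

    def _evict(self):
        for a, d in self.blocks.items():     # oldest first
            if not d:                        # LRU clean block, if any
                del self.blocks[a]
                return
        self.blocks.popitem(last=False)      # all dirty: plain LRU

c = CleanFirstCache(2)
c.access(1, write=True)    # dirty block
c.access(2)                # clean block
c.access(3)                # miss: evicts clean block 2, keeps dirty 1
assert set(c.blocks) == {1, 3}
```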
of parameters that decide its efficiency. To analyse swimming efficiency, several existing robotic models
using undulatory motion have been thoroughly studied, and a few critical parameters such as fin size,
shape, material flexibility, and fin flapping rate have been identified. These critical parameters are
simulated using a MATLAB-based CFD tool, mainly to design research prototypes that concentrate on
optimizing swimming efficiency. Several combinations of the critical parameters are discussed for
robotic fish prototype design, identifying the critical parameters and maximizing the swimming
efficiency of the robotic fish based on the above properties. The fin models have been tested through
simulation using the MATLAB-based CFD tool, and the swimming speed and direction have also been
tested for various models. An improved benchmark for designing robotic fish models is proposed in
order to simplify the selection of components, architecture, and the cost involved.
reliable and is able to achieve high load balancing with fast convergence. The cluster constructed by the
proposed scheme is more stable than existing vehicular ad hoc network clustering schemes. Specifically, the proposed scheme can increase the cluster-head lifetime by up to 50%. The reason behind this achievement is that the proposed scheme considers vehicle mobility in terms of average link expiration time while selecting the cluster-head.
communication. A number of mechanisms are responsible for signal attenuation within optical fibers.
As the optical signal propagates over a long stretch of fiber, it becomes attenuated because of absorption
and scattering caused by material impurities, fiber bends, and other effects. Transmission using high
bandwidth can handle vast amounts of information, and can be further improved by reducing fiber
losses, increasing data rates and distances, and using an appropriate operating wavelength in optical
fiber communication. Recent developments in the area of fiber-optic communication, as well as
advances in different fiber types and their properties such as attenuation (loss) and bandwidth, are
also discussed in this paper. The impact of different loss mechanisms, such as Rayleigh scattering,
Stimulated Brillouin Scattering (SBS), Stimulated Raman Scattering (SRS), and bending loss, on
fiber-optic communication is shown through simulations.
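All of these loss mechanisms contribute to a total attenuation coefficient that follows the standard decibel relation P_out = P_in · 10^(−αL/10), sketched below:

```python
# Fiber attenuation: alpha in dB/km, length in km, power in mW.
def output_power(p_in_mw, alpha_db_per_km, length_km):
    return p_in_mw * 10 ** (-alpha_db_per_km * length_km / 10)

# 1 mW over 50 km at 0.2 dB/km (typical near 1550 nm) loses 10 dB,
# i.e. a factor of 10 in power.
p = output_power(1.0, 0.2, 50.0)
assert abs(p - 0.1) < 1e-9
```

Lowering α (by choice of wavelength and fiber quality) directly extends the reach before amplification is needed.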
these software applications is paramount to their successful use. These applications utilize cryptographic algorithms to secure data over the network through encryption and decryption processes. The use of parallel processors is now common in both mobile and cloud computing scenarios. Cryptographic algorithms are compute-intensive and can significantly benefit from parallelism. This paper introduces a parallel approach to a symmetric stream cipher security algorithm known as RC4A, which is
one of the strong variants of RC4. We present an efficient parallel implementation of the compute-intensive PRGA (pseudo-random generation algorithm) portion of the RC4A algorithm; the resulting algorithm is named PARC4-I. We have added functionality in terms of lookup tables:
the modified algorithm has four lookup tables instead of two and is capable of returning four distinct output bytes at each iteration. Further, with the help of a parallel additive stream cipher structure and the loop-unrolling method, encryption/decryption is performed on a multi-core machine. Finally, the results show that PARC4-I is a time-efficient algorithm.
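For reference, the standard RC4 KSA and PRGA that PARC4-I parallelizes can be written as follows; RC4A itself runs two such states side by side:

```python
# Standard RC4: key-scheduling algorithm (KSA) followed by the
# pseudo-random generation algorithm (PRGA).
def rc4_keystream(key: bytes, n: int) -> bytes:
    S = list(range(256))
    j = 0
    for i in range(256):                       # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    i = j = 0
    out = bytearray()
    for _ in range(n):                         # PRGA
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# Well-known RC4 test vector: key "Key" -> keystream EB 9F 77 81 ...
assert rc4_keystream(b"Key", 4).hex() == "eb9f7781"
```

The PRGA loop is the serial bottleneck (each output byte depends on the previous state swap), which is why parallel variants restructure it, e.g. with extra lookup tables as described above.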