The current interest in the Internet of Things (IoT) evokes the establishment of countless services producing huge, dynamic, and varied information sets. Within it, an enormous mass of heterogeneous data is generated and exchanged by billions of devices, which can lead to severe information traffic congestion and affect network efficiency. To overcome this issue, there is a need for an effective, smart, distributed, and in-network technique that uses a cooperative effort to aggregate data along the pathway from the network edge to its sink. We propose an information organization blueprint that systematizes data aggregation and transmission within the Edge domain, from the front-end to the Cloud. A social consensus technique obtained through statistical analysis is employed within the blueprint to derive and update a policy on how to aggregate and transmit data according to the order of information consumption inside the network. The proposed technique, Consensus Aggregation, uses statistical Machine Learning to consolidate the approach and appraise its performance. In normal operation, data aggregation is performed with the use of the data distribution. Notable information delivery efficiency was obtained with a nominal loss in precision when the blueprint was tested in a particular environment as a case study. The evaluation showed that the consensus approach outperformed the individual ones in several respects.
Senior citizens consider the most significant part of aging to be the changes that occur to their bodies. For this reason, they avoid doing physical exercise and are therefore often prone to heart disease, muscle disorders and many other ailments. Although weakening of bones and muscles is inevitable as we grow old, we can still maintain our health by doing regular physical exercise. Encouraging elderly people to exercise can be challenging; therefore the Robot-Trainer is introduced to make their exercising journey exciting, accompanied by a friendly humanoid robot. The main goal of the system is to program an IoT-Assistive Robot to interact with elderly people, conduct physical exercise training sessions with them, and maintain their physical fitness and mental health.
The world today, as we know it, is profuse with information about humans and objects. Datasets generated by cyber-physical systems are orders of magnitude larger than their current information processing capabilities. Tapping into these big data flows to uncover much deeper insights into the functioning, operational logic and attainable levels of smartness has been investigated for quite a while. Knowledge Discovery and Representation capabilities across multiple modalities hold much scope in this direction, with regard to their information-holding potential. This paper investigates the applicability of a mathematical tool, Tensor Decompositions and Factorizations, in this scenario. Higher-order datasets are decomposed for anomaly pattern capture, which encases intelligence along multiple modes of data flow. Preliminary investigations based on data derived from the Smart Grid Smart City Project are consistent with our hypothesis. The results show that abnormal patterns detected in the decomposed tensor factors capture the deep information content of Big Data as effectively as other Pattern Extraction and Knowledge Discovery frameworks, while saving time and resources.
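A minimal sketch of the decomposition-based anomaly capture idea, assuming the tensorly library and a random placeholder tensor in place of the Smart Grid Smart City data, is shown below: a CP/PARAFAC model is fitted and entries with large reconstruction residuals are flagged as anomaly candidates.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Placeholder third-order tensor (time x meter x feature); the actual data in
# the paper comes from the Smart Grid Smart City project.
rng = np.random.default_rng(0)
data = rng.normal(size=(48, 30, 6))
data[37, 12, :] += 8.0                      # inject one anomalous fibre

tensor = tl.tensor(data)
cp = parafac(tensor, rank=3, n_iter_max=200)   # CP/PARAFAC factorisation
approx = tl.cp_to_tensor(cp)                   # low-rank reconstruction

# Entries the low-rank model cannot explain get large residuals and are
# flagged as anomaly candidates.
residual = np.abs(tl.to_numpy(tensor) - tl.to_numpy(approx))
idx = np.unravel_index(np.argmax(residual), residual.shape)
print("largest residual at (time, meter, feature):", idx)
```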
Network virtualization is an inherent component of future internets. Network resources are virtualized and provisioned to users on demand. Virtual network embedding entails two processes: node mapping and link mapping. However, efficient and practical solutions to the link mapping problem in software-defined networks (SDN) and data centers are still lacking. This paper proposes a solution to the link mapping (LiM) process that can dynamically interact with the routing protocols of the substrate network to allocate virtual link requests to the underlying substrate links while concurrently optimizing cost, minimizing energy consumption, and avoiding congestion (CEVNE). CEVNE LiM is realized as a composite application on top of the SDN controller running the Segment Routing (SR) application. The performance of the CEVNE LiM algorithm is compared with the k-shortest path link mapping algorithm, showing superior performance in terms of overall runtime, average path length, average node stress, average link stress, and overall energy consumption.
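For reference, the k-shortest path baseline that CEVNE LiM is compared against can be sketched with networkx as follows; the substrate topology, link weights and selection criterion are placeholders, not the paper's evaluation setup.

```python
from itertools import islice
import networkx as nx

def k_shortest_paths(graph, src, dst, k, weight="weight"):
    """Loop-free k shortest paths, as used by the baseline link mapper."""
    return list(islice(nx.shortest_simple_paths(graph, src, dst, weight=weight), k))

# Placeholder substrate network: nodes are switches, weights are link costs.
substrate = nx.Graph()
substrate.add_weighted_edges_from([
    ("A", "B", 1), ("B", "C", 1), ("A", "D", 2),
    ("D", "C", 1), ("B", "D", 1), ("C", "E", 1), ("D", "E", 3),
])

# Map a virtual link (A -> E): try candidate substrate paths in cost order and
# pick the first one with enough residual bandwidth (checked elsewhere).
for path in k_shortest_paths(substrate, "A", "E", k=3):
    print(path)
```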
Accurately identifying and authenticating a human without requiring proximity, yet with high reliability, is crucial for application areas such as surveillance, dynamic authentication, and proof of identity. While significant research has been conducted on human authentication, achieving authentication in an ambient manner with high reliability and security is still far from perfect under the currently deployed methods. This is due to the requirement of a specific posture and proximity to the device in many approved and deployed methods such as fingerprint, facial recognition, and retinal scan. An ambient authentication system is highly dependent on biometric features that can accurately discriminate between one human and another. These biometric features should be insensitive to variation in posture, proximity, and aging. Extraction of a unique feature set that allows a human to be identified smoothly, even without proximity, under varying conditions is one of the main steps needed for advancement in this field. In this paper, a review of current and emerging mechanisms is provided. The proposed method uses a feature set that allows the system to achieve dynamic detection compared to currently used technologies such as fingerprint, facial recognition and retinal scan. The proposed feature set is made up of a combination of facial minutiae and thermal contours, which are extracted from the human face on the fly, even without the subject's cooperation.
These days identification of a person is an integral part of many computer-based solutions. It is a key characteristic for access control, customized services, and proof of identity. Over the last couple of decades, many new techniques have been introduced to identify human faces. The purpose of this paper is to introduce yet another innovative approach to face recognition. The human face consists of multiple features that, when considered together, produce a unique signature that identifies a single person. Building upon this premise, we study the identification of faces by producing ratios from the distances between the different features on the face and their locations, in an explainable algorithm, with the possibility of future inclusion of multi-spectrum and 3D images for data processing and analysis.
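A rough sketch of the ratio-based idea is shown below: pairwise distances between facial landmarks are normalised into a scale-invariant signature and compared with a simple tolerance. The landmark set, normalisation and matching threshold here are illustrative assumptions, not the algorithm proposed in the paper.

```python
import numpy as np
from itertools import combinations

def ratio_signature(landmarks):
    """Build a scale-invariant signature from pairwise landmark distances.

    landmarks: (N, 2) coordinates of facial features (eyes, nose, mouth...).
    All pairwise distances are divided by a reference distance so the signature
    is unaffected by image resolution or subject distance from the camera.
    """
    pts = np.asarray(landmarks, dtype=float)
    dists = np.array([np.linalg.norm(pts[i] - pts[j])
                      for i, j in combinations(range(len(pts)), 2)])
    return dists / dists.max()            # normalise by the largest distance

def match(sig_a, sig_b, tol=0.05):
    """Declare a match when the mean absolute ratio difference is small."""
    return float(np.mean(np.abs(sig_a - sig_b))) < tol

# Placeholder landmarks (left eye, right eye, nose tip, mouth corners).
face_a = [(30, 40), (70, 40), (50, 60), (38, 80), (62, 80)]
face_b = [(x * 1.5, y * 1.5) for x, y in face_a]   # same face, different scale
print(match(ratio_signature(face_a), ratio_signature(face_b)))  # True
```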
Cognitive Radio can be considered a mandatory part of Internet of Things applications. It helps to solve the scarcity issues in the frequency bands of the wireless network component of the technology. However, security is the primary challenge that needs to be carefully mitigated, specifically defending the Cognitive Radio mechanism against jamming attacks. The aim of this research paper is to investigate and provide a reliable and adaptive Cognitive Radio protection method against jamming attacks, thus improving the performance of the wireless network of IoT technology, enhancing the bandwidth and addressing the scarcity of the frequency bands. These objectives are accomplished with the aid of game theory, where the problem is modelled as an anti-jamming game, and by adapting multi-armed bandit (MAB) policies. To address the scarcity issue in the frequency band spectrum of the cognitive radio, several MAB policies were adapted, such as Upper Confidence Bound (UCB), Thompson Sampling and Kullback-Leibler Upper Confidence Bound (KL-UCB). The results show improvements to the scarcity problem in the frequency band spectrum. In conclusion, the Thompson Sampling MAB policy was the best choice for solving the problem, as it yielded the lowest regret and highest rewards compared to the other MAB policies.
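As an illustration of how such MAB policies can drive channel selection under jamming, the sketch below implements UCB1 and Thompson Sampling over a toy set of channels; the jamming probabilities, horizon and reward model are assumptions for illustration and are not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical environment: a channel yields reward 1 when the transmission
# is not jammed. Jamming probabilities are illustrative only.
jam_prob = np.array([0.8, 0.6, 0.3, 0.1, 0.5])
n_channels, horizon = len(jam_prob), 5000

def pull(ch):
    """Return 1 on a successful (un-jammed) transmission, else 0."""
    return int(rng.random() > jam_prob[ch])

def ucb1(horizon):
    counts = np.zeros(n_channels)
    rewards = np.zeros(n_channels)
    total = 0
    for t in range(1, horizon + 1):
        if t <= n_channels:                      # play each channel once
            ch = t - 1
        else:
            means = rewards / counts
            bonus = np.sqrt(2 * np.log(t) / counts)
            ch = int(np.argmax(means + bonus))   # optimism under uncertainty
        r = pull(ch)
        counts[ch] += 1; rewards[ch] += r; total += r
    return total

def thompson(horizon):
    alpha = np.ones(n_channels); beta = np.ones(n_channels)  # Beta(1,1) priors
    total = 0
    for _ in range(horizon):
        ch = int(np.argmax(rng.beta(alpha, beta)))  # sample a success rate per channel
        r = pull(ch)
        alpha[ch] += r; beta[ch] += 1 - r; total += r
    return total

best = horizon * (1 - jam_prob.min())
for name, algo in [("UCB1", ucb1), ("Thompson", thompson)]:
    reward = algo(horizon)
    print(f"{name}: reward={reward}, regret ~ {best - reward:.0f}")
```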
Telecollaboration (TC), a next-generation Internet application, demands particular Quality of Service (QoS) requirements in order to function properly, for which best-effort (BE) delivery is inadequate. This paper proposes a solution in terms of redefining the ...
This chapter introduces a new linear sensor model of low-cost micro-electro-mechanical-systems (MEMS) accelerometers. A modelled algorithm has been developed to reduce different errors in the system. Experiments with Matlab analysis are presented in the chapter. Accelerometer transmission over both cable and wireless links has been measured and presented from 1500 samples. The linear sensor model has been applied to the experimental results to optimize the accuracy of the accelerometer data. The method can be used in different positions of the accelerometer, and to calibrate and align the system before operation. Therefore, the sensor model may also be used to reduce errors in different sensor networks, such as Bluetooth, beacon and ZigBee.
This paper introduces a new generation of error modelling for wireless indoor localization. A newly designed algorithm based on the Kalman Filter is developed to optimize a wireless indoor localization system built on iBeacon sensors. It is designed to detect and minimize the errors. Matlab simulations, implementation, and validation are presented in the paper.
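A minimal sketch of a scalar Kalman filter of the kind that can smooth noisy iBeacon RSSI readings is given below, assuming a constant-signal process model; the noise variances and the synthetic measurement trace are illustrative assumptions, not the paper's error model.

```python
import numpy as np

def kalman_1d(measurements, process_var=1e-3, meas_var=4.0):
    """Scalar Kalman filter for a constant-signal model x_k = x_{k-1} + w_k."""
    x, p = measurements[0], 1.0          # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + process_var               # predict: covariance grows
        k = p / (p + meas_var)            # Kalman gain
        x = x + k * (z - x)               # update with the measurement residual
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Synthetic RSSI trace (dBm): true level -65 plus heavy measurement noise.
rng = np.random.default_rng(1)
raw = -65 + rng.normal(0, 2.0, size=200)
smoothed = kalman_1d(raw)
print(f"raw std = {raw.std():.2f} dB, smoothed std = {smoothed.std():.2f} dB")
```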
This research paper proposes a Middleware model for a Localization System that may be applied in Healthcare environments such as Hospitals or Nursing Homes to track staff, patients, visitors and equipment. It investigates the literature on indoor localization methods and their limitations to determine a suitable algorithm that may be implemented in infrastructure-oriented software. The methodology used to build and test the software is explained. The paper then illustrates the concept of the Localization Middleware and how it might be used when deployed in indoor premises such as hospital wards. In terms of functional responsibilities, it is expected to offer an effective implementation of the distance measurement algorithm for Received Signal Strength and the Linear Least-Squares localization algorithm. The simulations of the localization algorithm and the resulting simulation figures look promising. However, the real-time tests demonstrated that the range measurement was insufficiently precise to be reliable. Given a more accurate and reliable distance measurement, a more precise localization result could be attained.
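The two building blocks named above can be sketched as follows, assuming a log-distance path-loss model for converting Received Signal Strength to range and a linearized least-squares solver for position; the beacon layout and path-loss parameters are placeholders, not the middleware's actual configuration.

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model; tx_power is the assumed RSSI at 1 m."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def lls_position(anchors, distances):
    """Linear least-squares trilateration from three or more anchors."""
    x0, y0 = anchors[0]
    d0 = distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]        # beacon positions in metres
true = np.array([3.0, 4.0])
dists = [np.linalg.norm(true - np.array(a)) for a in anchors]
print("estimate:", lls_position(anchors, dists))       # approximately [3. 4.]
```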
Artificial Endocrine Systems are the study of nature's highly evolved and advanced resource management system that sustains life. The basic units of the endocrine system are the chemical messengers, otherwise known as hormones, which in sufficient numbers produce cascading effects in the host organism that promote growth, maintain metabolism and much more. This behaviour is in effect a living metaphor for the high throughput, constant flux and stochastic interactions that exist in complex topologies such as large-scale IoT infrastructures.
In today's life, images play a significant role in many Big Data application fields for various purposes. Image processing faces huge challenges because images are created in a digital format, which leads to huge data volumes. Joint Photographic Experts Group 2000 (JPEG2000) compression techniques are used to meet the diverse types of real-time applications. Lossless compression schemes such as JPEG2000 are used to minimize the consumption of resources such as hard disk space and transmission bandwidth. This experimental work presents an improved lossless color image compression that uses a wavelet-based Human Visual System (HVS). The Reversible Color Compression Transform (RCCT) method produces an efficient algorithm to compress the image without loss of information. JPEG2000 in lossless mode is utilized for bit-preserving and as the global reference for the encoding and decoding processes. The Reversible Color Transform (RCT) is used in JPEG2000 with wavelets, which provide a mathematical way to encode the information so that it is layered according to the level of detail, using HVS attributes in the quantization stage. In this research, the goal of lossless image compression is to decrease the number of bits required to store and transmit images, and hence the demand on computing resources, without any loss of information.
Blockchain has been evolving and gaining new heights over the years. The shift in perspective is allowing new use cases beyond the cryptocurrency space. Cryptocurrencies are digital assets supported by the complexities of cryptography, game theory and peer-to-peer networks. Blockchain has become a popular platform for decentralized applications, as well as a valuable tool for start-ups seeking fundraising. The aim of this research paper is to review and assess the status quo for each branch of use cases, and then analyze the enabling and inhibiting factors influencing the adoption of blockchain. These findings permit a broader comprehension of the concepts backing blockchain. They will help new users to establish strategies, develop solutions and encourage the employment of blockchain technology.
Tunneling is one of the key mechanisms which can help in the transition from the current IPv4 to the IPv6 protocol. The function of the automatic tunneling process is to encapsulate IPv6 packets into IPv4 packets. The main components involved in the tunneling mechanism are Teredo, ISATAP, and 6to4. In some cases, however, these components have certain issues related to source routing, neighbor discovery and NAT holes. This paper aims to demonstrate how a serious problem related to the Teredo mechanism, called "Teredo NAT Holes", can be solved. The NAT Holes problem increases the attack surface in Teredo and thus causes the NAT service to become vulnerable to attacks. This research work proposes an approach called Packet Authentication and Integrity Services (PAIS) that takes advantage of Certificate Authentication (CA) combined with the Diffie-Hellman key exchange and Hash Message Authentication Code (HMAC) algorithms to provide a suitable solution for the problem. The proposed method creates the PAIS at the tunnel's starting point and then verifies it at the tunnel's end point by recreating the value of md, which is subsequently inserted into the md field and compared against the md' field in the packet. The proposed methodology adds the md field in place of the next header in the packet header structure. The Diffie-Hellman algorithm is used for the key exchange. The IPv6 protocol supports a loopback virtual network, which is used in the experimental test bed to validate the efficiency of the method. The experimental results show that the method offers good performance and is able to adequately mitigate NAT Holes issues in Teredo clients.
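The general pattern of deriving a shared key with Diffie-Hellman and using an HMAC as the md authentication field can be sketched as below; the toy group parameters, key derivation and packet layout are illustrative assumptions, not the PAIS specification.

```python
import hashlib
import hmac
import secrets

# Toy Diffie-Hellman group (illustration only; real deployments use large,
# standardized primes).
p, g = 23, 5
a = secrets.randbelow(p - 2) + 2             # tunnel entry point private key
b = secrets.randbelow(p - 2) + 2             # tunnel end point private key
A, B = pow(g, a, p), pow(g, b, p)            # public values exchanged over the tunnel
assert pow(B, a, p) == pow(A, b, p)          # both ends derive the same shared secret
key = hashlib.sha256(str(pow(B, a, p)).encode()).digest()

def make_md(packet: bytes) -> bytes:
    """Compute the md authentication field inserted at the tunnel entry point."""
    return hmac.new(key, packet, hashlib.sha256).digest()

def verify_md(packet: bytes, md: bytes) -> bool:
    """Recompute md' at the tunnel end point and compare in constant time."""
    return hmac.compare_digest(make_md(packet), md)

pkt = b"encapsulated IPv6-in-IPv4 payload"
md = make_md(pkt)
print(verify_md(pkt, md), verify_md(pkt + b"tampered", md))   # True False
```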
In recent years, multimedia computing has emerged as a major area of research. This has led to the development of various types of applications involving video, images and graphics. Data compression is particularly valuable during communication, as it enables electronic devices to store and transmit data in a smaller number of bits. The following research considers data compression as a possible solution for retrieving, storing and transmitting data. It also aims to balance processing time, quality and compression rate based on Human Visual System (HVS) perception. The best compression method for multimedia conversion would convert at the highest possible rate with the minimum amount of distortion. For this reason, the proposed method uses a wavelet-based HVS approach integrated with JPEG2000 to achieve a high compression ratio, faster execution and better quality, all in real time. This experimental work improves on the performance of the previous method and achieves the best data compression result using the wavelet-based HVS approach.
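As a rough illustration of wavelet-domain compression (not the proposed HVS-weighted JPEG2000 pipeline), the sketch below uses PyWavelets to transform an image, discard small coefficients and reconstruct it; the wavelet choice, keep fraction and random test image are assumptions.

```python
import numpy as np
import pywt

def compress(img, wavelet="haar", level=3, keep=0.05):
    """Wavelet-transform the image and keep only the largest `keep` fraction of
    coefficients, a crude stand-in for HVS-driven quantisation."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1 - keep)
    arr[np.abs(arr) < thresh] = 0                       # discard small detail
    kept = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
    return pywt.waverec2(kept, wavelet)

rng = np.random.default_rng(0)
img = rng.random((128, 128))
rec = compress(img)[:128, :128]
print("PSNR:", 10 * np.log10(1.0 / np.mean((img - rec) ** 2)))
```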
The paper presents the concept of a renewable energy management system. The idea behind the system is to exploit the potential of renewable energy generation sources so as to provide additional energy services and participation in a competitive energy market. These actions can significantly shorten the payback period of an individual customer's investment in renewable energy sources. The paper contains a concept of an Electricity Consumption and Supply Management System (ECSM) with application of blockchain technology. ECSM provides functionality to continuously monitor and record information about inbound and outbound energy to/from the power grid. In addition to monitoring inbound and outbound energy, the solution will provide the possibility to manage, automatically or manually, when energy should be sent to the grid. Information about inbound/outbound energy will be part of a smart contract, which will be confirmed and stored in every node.
Every organization defines Principles for its Enterprise Architecture (EA) practice. As there is no set standard, the number of principles identified often exceeds the 20 recommended by TOGAF. The more principles are defined, the more likely they are to be ignored by Enterprise Architects rather than referred to in their decision making. In this paper, we identify the ideal number of principles that will motivate Architects to refer to them when performing their tasks.
Hardware resources require efficient scaling because the future of computing technology appears to be intensively multithreaded. One of the main challenges in the scalability of computer hardware is the memory hierarchy. Chip multiprocessors (CMPs) rely on large, multi-level cache hierarchies to reduce resource cost and improve system performance. These multi-level hierarchies also help to address limited bandwidth and minimize the latency of main memory. Almost half of the chip area and a large percentage of system energy are used by caches. One of the main problems limiting the scalability of cache hierarchies is cache associativity: caches consume a lot of energy to implement associative lookups, which affects system performance by reducing cache efficiency. This paper describes a new cache design that we call the Adaptive Hashing and Replacement Cache (AHRC). This design is able to maintain high associativity with an advanced replacement policy. AHRC can improve associativity while keeping the number of possible locations in which each block may be placed as small as possible. Several workloads were simulated on a large-scale CMP with AHRC as the last-level cache. We propose an Adaptive Reuse Interval Prediction (ARIP) scheme for AHRC, which is superior to the NRU scheme described by Seznec. Results demonstrate that AHRC has better energy efficiency and higher performance compared to conventional caches. Additionally, large caches that utilize AHRC are most suitable in many-core CMPs, providing a more significant improvement and better scalability than smaller caches. However, AHRC with a higher-level replacement policy may lead to energy loss for workloads that are not sensitive to the replacement policy.
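For context, the NRU baseline mentioned above can be sketched for a single cache set as follows; this simplified single-bit scheme is only the point of comparison, not the AHRC or ARIP design proposed in the paper.

```python
class NRUSet:
    """Simplified not-recently-used (NRU) replacement for one cache set.

    Each block carries a single reference bit; on a hit the bit is set, and a
    victim is any block whose bit is clear. When all bits are set they are
    cleared, which is the baseline behaviour that reuse-interval prediction
    schemes refine.
    """

    def __init__(self, ways):
        self.ways = ways
        self.blocks = []                       # list of [tag, ref_bit]

    def access(self, tag):
        for blk in self.blocks:
            if blk[0] == tag:                  # hit: mark as recently used
                blk[1] = 1
                return "hit"
        if len(self.blocks) < self.ways:       # cold miss: free way available
            self.blocks.append([tag, 1])
            return "miss"
        victim = next((b for b in self.blocks if b[1] == 0), None)
        if victim is None:                     # all recently used: clear the bits
            for b in self.blocks:
                b[1] = 0
            victim = self.blocks[0]
        victim[0], victim[1] = tag, 1          # replace the victim block
        return "miss"

cache_set = NRUSet(ways=4)
for t in ["A", "B", "C", "D", "A", "E", "B"]:
    print(t, cache_set.access(t))
```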
The ability to learn without instruction is a powerful enabler for learning systems. One mechanism for this, self-play, allows reinforcement learning to develop high-performing policies without large datasets or expert knowledge. Despite these benefits, self-play is known to be less sample efficient and to suffer from unstable learning dynamics. This is in part due to a non-stationary learning problem in which an agent's actions influence its opponents and, as a consequence, the training data it receives. In this paper we demonstrate that competitive pressures can be utilised to improve self-play. This paper leverages coevolution, an evolution-inspired process in which individuals are compelled to innovate and adapt, to optimise the training of a population of reinforcement learning agents. We demonstrate that our algorithm improves the final performance of a Rainbow DQN trained on the game Connect Four, achieving a 15% higher win percentage than the next leading self-play algorithm. Furthermore, our algorithm exhibits more stable training with less variation in evaluation performance.
The technology landscape has evolved from the Mainframe to the Digital platform. In this paper, we propose the skills that are essential for an Enterprise Architect to be successful in the Digital Era.
At this stage of Data Preservation, the challenge is how to keep the attributes of the data and how to preserve its originality; in effect, how to keep the living part of the data. This is where the concept of Heritage makes sense. Heritage is the concrete data; it provides the interconnection to other aspects of reality. Nowadays the physical value and aspects of items complete the relevance of the information. The relation between Preservation and digital patterns of Heritage is close because of the two aspects to consider: Accessibility and Context.
Recent advances in Information and Communication Technology (ICT) have stimulated the need to adopt new models of teaching and learning in educational institutions. From the beginning, the use of technology in education has gained a lot of attention and has been applied to many areas in the academic domain. The introduction of ICT has motivated scholars to find out how various ICT tools could be deployed and used efficiently. Due to rapid changes in ICT, an extended TAM (ETAM) has been developed to highlight the demand for using advanced ICT tools in the education process. The idea is to support the acceptance of technology implementations by modifying the conventional model with the pre-acknowledgement of additional elements that reflect state-of-the-art technological advances. These additional factors aim to increase the motivation and acceptance levels of instructors and students alike towards studio-style ICT teaching and learning.
This paper describes the Studio experiences created for Data and Electronic Engineering students at the University of Technology Sydney. It describes the purpose of the Studios and their structure. It concludes with a retrospective of what worked and what did not, and suggests future changes.
The consequences of road accidents that involve a motorcycle are far more fatal for the rider than for other drivers. Yet, there has not been an effective vehicle alert system that can eliminate those avoidable motorcycle accidents caused by other drivers failing to notice motorcycles. One of the major flaws in existing vehicle alert systems is that they treat motorcycles the same as other vehicles, even though motorcycles take much longer to brake than cars do. Therefore, this project aimed to find an effective method to identify motorcycles and alert other drivers when motorcyclists are within a 20-meter radius of them. After an extensive literature review, the best method to solve the problem is to use road-side infrastructure based on the Internet of Things (IoT) that divides the network into a set of clusters. In this method, a vehicle is identified by identifying the driver or rider through a smartphone application that beacons custom, unique Media Access Control (MAC) addresses via Bluetooth or Wi-Fi to the IoT probes. The probe differentiates the users, registers them when they arrive in the network, alerts drivers about motorcycles around them, and removes them from the database when they move to another cluster. The whole scenario is simulated using the OMNeT++ simulator and the INET framework to demonstrate how the methodology works. If the concept is implemented in real life, many valuable lives of motorists will be much safer on the road.
The manufacturing industry based on steam power, known as Industry 1.0, is evolving into Industry 4.0, a digital ecosystem consisting of interconnected automated systems with real-time data. This paper investigates and proposes how the digital ecosystem, complemented with Enterprise Architecture practice, will ensure the success of digital transformation.
The use of Telecollaboration (TC) systems based on sensor networks in education is an emerging field of research. It is feasible to connect many small wireless micro-sensors ("smart dust") [36], PDAs and cellular phones into a network that extends a TC education system. ...
Multiple-core designs have become commonplace in the processor marketplace and are therefore a major focus in modern computer architecture research. Thus, for both product development and research, multicore processor performance evaluation is a mandatory step. Multicore computing has presented many challenges for system designers, one of which is data consistency between a shared cache or memory and the local caches of the chip, also known as cache coherency. Cache coherence mechanisms are a key component in accomplishing the goal of continuing exponential performance growth through widespread thread-level parallelism. In the scope of this research, we have studied the available efficient methods and protocols used to achieve cache coherence in multicore architectures. These protocols were further modeled and evaluated using the Simics simulator for multicore architectures. We also explored the weaknesses and strengths of the different protocols and discussed ways of improving them.
Present-day cyber-physical systems (CPS) are the confluence of very large data sets, tight time constraints, and heterogeneous hardware units, ridden with latency and volume constraints and demanding newer analytic perspectives. Their system logistics can be well defined by the behavioral trends of data streams across various modalities, without numerical restrictions, favoring resource saving over methods that investigate individual component features and operations. The aim of this paper is to demonstrate how behavior patterns and related anomalies comprehensively define a CPS. Tensor decompositions are hypothesized as the solution in the context of multimodal smart-grid-originated Big Data analysis. Tensorial data representation is demonstrated to capture the complex knowledge encompassed in these data flows. The uniqueness of this approach is highlighted in the modified multiway anomaly pattern models. In addition, higher-order data preparation schemes, the design and implementation of tensorial frameworks, and experimental analysis are final outcomes.
This research paper presents a Machine Learning-based human detection model focused on improving the precision of detecting human movement in video frames. The problem is addressed by focusing on pre-processing and an efficient feature extraction methodology. A combination of features is extracted, including histograms of gradients (HoG), histograms of colors (HoC), and histograms of bars (HoB). These feature sets are combined to form the final feature vector that describes the human shape, and a Support Vector Machine (SVM)-based classifier is used for classification. Improving the precision allows the human movement detector to make better detections by reducing false positives and missed detections, which are problems faced by current detection techniques. The algorithm is trained on the INRIA dataset and tested on sequences depicting moving humans in different environments. In the testing phase, the search space is reduced using an upper-body detector based on Haar features. The reduced space is then used to carry out human detection with the proposed feature extraction technique. The proposed detector performs well and the number of missed detections is reduced. Some false detections still occur, but this is due to the fact that some objects resemble humans. The proposed model is benchmarked against current state-of-the-art detectors using a challenging test dataset. Receiver Operating Characteristic (ROC) curves for the precision-recall and true-positive rates are plotted to compare and evaluate the results. The proposed model outperforms most of the current state-of-the-art detectors.
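To illustrate the general HoG-plus-SVM pipeline the model builds on (only one of the three feature sets is shown), a minimal sketch using scikit-image and scikit-learn follows; the window size, SVM parameters and the random placeholder training windows are assumptions, not the paper's configuration or the INRIA data.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def hog_features(window):
    """HoG descriptor for a 128x64 detection window (one of the feature sets)."""
    return hog(window, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys')

# Placeholder training windows; in practice these would be the positive
# (person) and negative (background) crops from the training dataset.
rng = np.random.default_rng(0)
pos = rng.random((20, 128, 64))
neg = rng.random((20, 128, 64))
X = np.array([hog_features(w) for w in np.concatenate([pos, neg])])
y = np.array([1] * len(pos) + [0] * len(neg))

clf = LinearSVC(C=0.01).fit(X, y)          # linear SVM over the HoG vectors
window = rng.random((128, 64))
print("person" if clf.predict([hog_features(window)])[0] == 1 else "background")
```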
In this paper we present a mathematical model and an algorithm for solving a task scheduling problem in a computing cluster. The problem is considered as a 2D packing problem. Each multi-node task is treated as a set of separate subtasks with common constraints. For optimization, the tabu search metaheuristic algorithm is applied.
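A minimal sketch of a tabu search over task orderings, with a greedy nodes-by-time packing decoder for the makespan objective, is given below; the instance, decoder and tabu tenure are illustrative assumptions, not the paper's mathematical model.

```python
# Hypothetical instance: each task needs (nodes, duration); the cluster has NODES nodes.
NODES = 8
tasks = [(3, 4), (2, 6), (5, 3), (4, 5), (1, 8), (6, 2), (2, 3)]

def makespan(order):
    """Greedy decoder: place tasks in the given order at the earliest time slot
    with enough free nodes (a 2D packing view: nodes x time)."""
    free = {}                                    # time slot -> free nodes
    end = 0
    for nodes, dur in (tasks[i] for i in order):
        t = 0
        while True:
            if all(free.get(t + k, NODES) >= nodes for k in range(dur)):
                for k in range(dur):
                    free[t + k] = free.get(t + k, NODES) - nodes
                end = max(end, t + dur)
                break
            t += 1
    return end

def tabu_search(iters=200, tenure=7):
    current = list(range(len(tasks)))
    best, best_cost = current[:], makespan(current)
    tabu = {}
    for it in range(iters):
        candidates = []
        for i in range(len(tasks)):
            for j in range(i + 1, len(tasks)):
                if tabu.get((i, j), -1) >= it:    # swap move is still tabu
                    continue
                neigh = current[:]
                neigh[i], neigh[j] = neigh[j], neigh[i]
                candidates.append((makespan(neigh), (i, j), neigh))
        cost, move, neigh = min(candidates)       # best admissible neighbour
        current = neigh
        tabu[move] = it + tenure                  # forbid reversing the swap
        if cost < best_cost:
            best, best_cost = neigh, cost
    return best, best_cost

print(tabu_search())
```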
Coronary Artery Disease (CAD) is the prime causal factor in cardiovascular disease in the 21st century throughout the world. In Australia, CAD-related diseases result in a 12% morbidity and mortality rate. This paper summarizes the non-invasive methods of diagnosing CAD. The association between medical science and biomedical engineering has led to the development of non-invasive methods of diagnosing CAD. New technology that exploits IoT and Body Area Networks, using wearable sensor devices on the patient's body, allows medical experts to diagnose CAD. Progress in the clinical assessment, diagnosis, and evaluation of CAD has been achieved in the last decade. The current treatment plan for CAD focuses on clinical prevention, surgery, or a combination of both, depending on the severity of the disease. The paper also covers the analysis of coronary artery disease and chest pain, and the elements involved in the assessment of the patient's history, together with risk stratification and the non-invasive tests used in the diagnosis of CAD.
This paper presents a robust machine learning-based computational solution for human detection. The proposed mechanism is specifically applicable to pose-variant situations in video frames. In order to address the pose variance problem, features are extracted using an improved variant of Histograms of Gradients (HoG) and Local Binary Pattern (LBP) features. The two feature sets are combined to form a feature vector based on different poses and human shapes, while a support vector machine (SVM)-based classifier is used for detection. Common issues faced by current approaches, namely false and missed detections, are addressed with robust feature sets consisting of improved HoG features and LBP features with rotation information. The proposed detector model performs efficiently; miss rates are reduced, true positives are increased, and accuracy is improved. Some false detections of human-look-alike objects are also observed. A diverse dataset depicting different poses is used for training. A challenging test dataset is used to compare the performance of the proposed approach against current state-of-the-art detectors. Receiver operating characteristic (ROC) curves are plotted to compare and evaluate the results based on miss rates and true positives, which demonstrate that the proposed model achieves optimal results.
