Classically, end point detection during fluid bed drying has been performed using indirect parameters, such as the product temperature or the humidity of the outlet drying air. This paper aims at comparing those classic methods to both in-line moisture and solid-state determination by means of Process Analytical Technology (PAT) tools (Raman and NIR spectroscopy) and a mass balance approach. The six-segmented fluid bed drying system, part of a fully continuous from-powder-to-tablet production line (ConsiGma™-25), was used for this study. A theophylline:lactose:PVP (30:67.5:2.5) blend was chosen as the model formulation. For the development of the NIR-based moisture determination model, 15 calibration experiments in the fluid bed dryer were performed. Six test experiments were conducted afterwards, and the product was monitored in-line with NIR and Raman spectroscopy during drying. The results (drying endpoint and residual moisture) obtained via the NIR-based moisture determination model, the classical approach using indirect parameters and the mass balance model were then compared. Our conclusion is that the PAT-based method is the most suited for use in a production set-up. Secondly, the different size fractions of the dried granules obtained during the different experiments (fines, yield and oversized granules) were compared separately, revealing differences in both the solid state of theophylline and the moisture content between the different granule size fractions.
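The mass balance idea mentioned in the abstract can be sketched as follows: the water leaving the granules must equal the water carried away by the drying air (outlet minus inlet absolute humidity, integrated over time). The function names, units and numbers below are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of a drying mass balance: water removed from the
# granules equals water picked up by the drying air in each time step.

def water_removed_kg(air_flow_kg_per_s, y_out, y_in, dt_s):
    """Water carried off by the air in one time step.
    y_out / y_in: absolute humidity of outlet / inlet air (kg water per kg dry air).
    """
    return air_flow_kg_per_s * (y_out - y_in) * dt_s

def residual_moisture(initial_water_kg, dry_solids_kg, humidity_samples,
                      air_flow_kg_per_s, dt_s):
    """Residual moisture (% w/w, wet basis) after integrating outlet-air samples.
    humidity_samples: list of (y_out, y_in) pairs, one per time step.
    """
    removed = sum(water_removed_kg(air_flow_kg_per_s, y_out, y_in, dt_s)
                  for y_out, y_in in humidity_samples)
    water = max(initial_water_kg - removed, 0.0)
    return 100.0 * water / (water + dry_solids_kg)
```

The drying endpoint is then simply the time step at which `residual_moisture` crosses the target specification.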
Towards a real time release approach for manufacturing tablets using NIR spectroscopy, Journal of Pharmaceutical and Biomedical Analysis (2014), http://dx.
In a search for aspartic proteinase inhibitors (APIs) in kiwifruit seeds, we observed pepsin inhibitory activity (PIA) in an abundant globulin fraction, extracted in high salt buffer, with an Mr of ∼148 kDa by gel-filtration. On an SDS-polyacrylamide gel, a major protein band of 54 kDa was observed under non-reducing conditions. This band was largely replaced by two subunits of Mr 33.5 and 20 kDa under reducing conditions. N-terminal sequencing of the smaller subunit, which had associated PIA, revealed a β-subunit of the 11S globulin-like protein (11S-GLP), legumin. After trypsin, chymotrypsin or papain digestion, the α-subunit of the kiwifruit legumin (11S-GLP) was degraded to varying degrees, but there was no effect on the β-subunit or on the PIA. This 11S-GLP also appeared to inhibit bovine spleen cathepsin D, Candida albicans secreted aspartic proteinases (SAPs) 1, 2 and 4, the plant-pathogenic fungus Glomerella cingulata SAP, as well as apple seed aspartic proteinase, but to a much lesser extent kiwifruit seed aspartic proteinase. Through kinetic analysis of pepsin inhibition, the 11S-GLP and the purified β-subunit were found to fit a Michaelis–Menten model for competitive inhibition rather than the tight-binding model characteristic of typical proteinase inhibitors. Extrapolated complete inhibition was only obtained at an 11S-GLP concentration ∼900 times that of the pepsin. Further investigation revealed that the 11S-GLP and the β-subunit acted as weak alternative substrates for pepsin: as they were degraded, the PIA declined in parallel. We discuss these results in terms of the substrate recognition site of pepsin compared with other proteinases.
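The competitive-inhibition fit referred to above follows the standard Michaelis–Menten rate law (the symbols here are the textbook ones, not taken from the paper):

```latex
v = \frac{V_{\max}\,[S]}{K_m\left(1 + \frac{[I]}{K_i}\right) + [S]}
```

where [S] is the substrate (pepsin's) concentration, [I] the inhibitor concentration and K_i the inhibition constant. Competitive inhibition raises the apparent K_m while leaving V_max unchanged, which is consistent with the observation that inhibition could be overcome only at a very large inhibitor excess (∼900-fold).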
Highlights: Sample preparation strategies used for crude oil in the last ten years are discussed. Methods related to wet digestion with concentrated acids or combustion are covered. Trends in sample preparation are discussed, as well as speciation analysis. Methods focusing on the further determination of metals and non-metals are discussed. Official methods are covered and certified reference materials are summarized.
Please cite this article in press as: Ali, M.F.M., et al., Development of Java based RFID application programmable interface for heterogeneous RFID system. Abstract: Developing RFID-based applications is a painstakingly difficult endeavor. The difficulties include non-standard software and hardware peripherals from vendors, interoperability problems between different operating systems, and a lack of expertise in low-level programming for RFID (i.e., a steep learning curve). In order to address these difficulties, a reusable RFIDTM API (RFID Tracking & Monitoring Application Programmable Interface) for heterogeneous RFID systems has been designed and implemented. The API has been successfully employed in a number of application prototypes, including tracking of inventories as well as human/object tracking and tagging. The module has been tested on a number of different types and configurations of active and passive readers, including LF and UHF readers.
ABSTRACT Central Processing Units (CPUs) are task-parallel, latency-oriented processors, while Graphics Processing Units (GPUs) are data-parallel, throughput-oriented processors. Besides their traditional use as graphics coprocessors, GPUs have in recent years also been used for general-purpose computation. The rapid development of graphics hardware has led to extensive use in both scientific and commercial applications, and numerous papers report high speedups in various domains. This paper presents an effort to bring GPU computing closer to programmers and a wider community of users. GPU computing is explored through the NVIDIA Compute Unified Device Architecture (CUDA), currently the most mature application programming interface (API) for general-purpose computation on GPUs.
Measuring the dynamic release of aroma compounds from ethanolic solutions by direct gas phase mass spectrometry (MS) techniques is important for flavor chemists but presents technical difficulties, as the changing ethanol concentration in the source makes quantitative measurements impossible. The effect of adding ethanol into the source via the sweep gas (0–565 μL ethanol/L N2), to act as the proton transfer reagent ion and thereby control ionization, was studied. With increasing concentrations of ethanol in the source, the water ions were replaced by ethanol ions above 3.2 μL/L. The effect of source ethanol on the ionization of eleven aroma compounds was then measured. Some compounds showed reduced signal (10-40%), others increased signal (150-400%), when ionized via ethanol reagent ions compared to water reagent ions. Noise also increased in most cases, so there was no overall increase in sensitivity. Provided the ethanol concentration in the source was >6.5 μL/L N2 and maintained at a fixed value, ionization was consistent and quantitative. The technique was successfully applied to measure the partition of the test volatile compounds from aqueous and 12% ethanol solutions at equilibrium. Ethanolic solutions decreased the partition coefficient of most of the aroma compounds, as a function of hydrophobicity.
This paper presents a 3D graphics engine which is specifically designed to minimize hardware cost while providing sufficient computing capability for consumer electronics with small to medium screen sizes (up to 800×600), such as digital television. The presented 3D engine consists of a fixed full 3D graphics pipeline for both geometry and rendering operations. The engine provides a standard AHB interface that makes it easy to integrate into an AMBA-based SoC. The development of the 3D engine has gone through a rigorous design process: from system modeling (using SystemC), through RTL implementation, hardware/software co-simulation and FPGA verification, to test chip fabrication. The 3D engine provides 3.3 M vertices/s and 278 Mpixels/s maximum performance at 139 MHz using 0.18 μm silicon technology with 987 K gates, which is sufficient for most digital television applications. At the same time, a complete OpenGL-ES 1.1 API, windowing system, Linux operating system, device driver and a 3D performance monitoring tool have been developed for the 3D engine. The performance monitoring tool provides run-time performance information including frame rate, triangle rate, pixel rate, the list of OpenGL functions involved, function counts, memory utilization, etc. Moreover, a built-in real-time AHB bus tracer is provided to monitor the bus activities of the 3D engine and other components on the system bus. The bus tracer captures on-chip bus signals at either cycle-accurate or transaction level and applies real-time compression to both levels of signals. With the performance monitoring tool and the bus tracer, 3D application developers can easily analyze the communication between components and fine-tune the 3D application to optimize overall SoC system performance and satisfy the performance/cost constraints of consumer electronics.
Both the hardware and the software have been carefully verified and demonstrated on FPGA using an ARM Versatile SoC development board.
In the last two decades, networks have changed in response to rapidly changing requirements. Current Data Center Networks have a large number of hosts (tens of thousands) with special bandwidth needs, as cloud networking and multimedia content computing grow. Conventional Data Center Networks (DCNs) are strained by the increasing number of users and bandwidth requirements, and consequently face many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling the control and forwarding planes. Moreover, despite the rapid increase in the number of applications, websites and storage space, some network resources remain underutilized due to static routing mechanisms. To overcome these limitations, a Software Defined Network based OpenFlow Data Center network architecture is used to obtain better performance parameters and to implement a traffic load-balancing function. The load balancing distributes traffic requests over the connected servers to diminish network congestion and reduce the underutilization of servers. As a result, SDN affords more effective configuration, enhanced performance, and more flexibility for dealing with huge network designs.
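As a toy illustration of the load-balancing function described above (the server addresses are invented, and a real SDN controller would install OpenFlow rules on switches rather than run Python per request), the simplest distribution policy is round-robin over the connected servers:

```python
from itertools import cycle

class RoundRobinBalancer:
    """Hand out back-end servers in rotation, spreading requests evenly."""

    def __init__(self, servers):
        self._next = cycle(servers)

    def pick(self):
        # Each call returns the next server in the rotation.
        return next(self._next)

# Three back-end servers; six successive requests hit each one twice.
lb = RoundRobinBalancer(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
assignments = [lb.pick() for _ in range(6)]
```

In an OpenFlow deployment, the decision made in `pick` would be expressed as a flow rule pushed to the ingress switch, so subsequent packets of the same flow bypass the controller.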
Nowadays, typical desktop computer processors have four or more independent CPU cores, called multi-core processors, to execute instructions. Parallel programming therefore comes into play to execute instructions concurrently on multi-core architectures, for example using OpenMP. Users prefer cryptographic algorithms to encrypt and decrypt data in order to send it securely over an unsafe environment like the internet. This paper describes the parallel implementation of the Caesar cipher and RSA cryptographic algorithms using the OpenMP API 3.1 standard, together with test results and a performance analysis. According to our test results, the parallel design approach for the security algorithms exhibits improved performance over the sequential approach in terms of execution time.
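OpenMP itself targets C/C++/Fortran, so the following is only a language-neutral sketch of the same idea: split the plaintext into chunks and encrypt the chunks concurrently, just as an OpenMP `parallel for` splits the character loop across cores. The chunking and pool details are my own illustration, not the paper's code.

```python
from concurrent.futures import ThreadPoolExecutor

def caesar_encrypt(text, shift):
    # Shift each uppercase letter cyclically; leave other characters alone.
    out = []
    for ch in text:
        if "A" <= ch <= "Z":
            out.append(chr((ord(ch) - 65 + shift) % 26 + 65))
        else:
            out.append(ch)
    return "".join(out)

def parallel_caesar(text, shift, workers=4):
    # One chunk per worker -- the same data decomposition an OpenMP
    # `#pragma omp parallel for` applies to the character loop.
    size = max(1, -(-len(text) // workers))  # ceiling division
    chunks = [text[i:i + size] for i in range(0, len(text), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return "".join(pool.map(lambda c: caesar_encrypt(c, shift), chunks))
```

The Caesar cipher is embarrassingly parallel because each character is independent, which is why the chunked results can simply be concatenated in order.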
There exists a lack of 'off the shelf' and user-friendly computational tools that allow architects and other design consultants to quickly analyse and simulate circulation patterns of buildings. Other developments of such tools have so far failed to penetrate the mainstream market for architectural design software. The research presented in this paper focuses on the development of a graph-based building navigation and distance measurement tool, the Spatial Analysis and Query Tool (SQ&AT), which plugs into the most common design documentation software (Autodesk Revit) with the ability for extension into other tools. The resulting software allows users to test point-to-point shortest distances, produces several grid-based metrics and allows scenarios to be built. These results may expose areas where the designer's intuition and bias have led them to make inaccurate assumptions about quantitative design aspects.
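A point-to-point distance query of the kind described above can be sketched with Dijkstra's algorithm over a room-adjacency graph. The graph and room names below are invented for illustration; the actual tool would derive its graph from the Revit model.

```python
import heapq

def shortest_distance(graph, start, goal):
    """Dijkstra's algorithm over a dict-of-dicts adjacency map.
    graph[u][v] is the walking distance from node u to node v.
    """
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")  # goal unreachable

# Hypothetical building graph: edge weights are walking distances in metres.
rooms = {
    "lobby": {"corridor": 5.0},
    "corridor": {"lobby": 5.0, "office": 3.0, "stairs": 4.0},
    "stairs": {"corridor": 4.0, "office": 6.0},
    "office": {},
}
```

Running grid-based metrics then amounts to evaluating such queries from every grid cell to points of interest.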
There has been increasing interest among a wide number of scholars and researchers in promoting applications for Peer-to-Peer (P2P) communication on Android-based smartphones as ubiquitous devices. This interest has been attributed to the growing pervasiveness of these devices. Many P2P applications are in wide use on mobile devices, including IM, VoIP, file sharing, social networks, and video streaming. This paper describes a proposed Android-based middleware that acts as the interface point between mobile nodes and higher application layers for mobile ubiquitous computing. The middleware mainly aims at supporting and enhancing the protocols for direct P2P communication among users in the ensemble mobile environment. Moreover, the paper presents a discussion of the available P2P middleware for the mobile environment along with its applications. Furthermore, the paper outlines the limitations of mobile devices and the challenges encountered in adopting P2P communication technology in the mobile environment. Finally, the paper concludes by presenting directions for future research: to develop a middleware with the necessary APIs and to implement an enhanced P2P protocol in the proposed middleware on Android-based mobile devices.
In this paper we present a fully implemented system for tracking a user's location with the help of the Google API. Location tracking is helpful for various applications. Using this system, we obtain the current geographical position of the hosting device or its user.
An ongoing project, in preparation for the Database Developer Test at the TIC Timor Agency, Unique ID Secretariat, Prime Minister's Cabinet, Palacio do Governo, Timor Leste.
I am developing this project as my own preparation for the test, which will be held in a few upcoming days. I am using technologies such as MySQL, PHP, HTML, CSS and JavaScript.
I would welcome assistance and feedback on this project from senior IT developers.
In this review, sample preparation strategies used for crude oil digestion in the last ten years are discussed, focusing on the subsequent determination of metals and non-metals. One of the main challenges of the proposed methods has been to overcome the difficulty of bringing crude oil samples into solution in a form compatible with the analytical techniques used for element determination. In this respect, the review summarizes sample preparation methods for metal and non-metal determination in crude oil, including those based on wet digestion, combustion, emulsification, extraction and sample dilution with organic solvents, among others. Conventional methods related to wet digestion with concentrated acids or combustion are also covered, with special emphasis on closed systems. Trends in sample digestion, such as microwave-assisted digestion using diluted acids combined with high-efficiency decomposition systems, are also discussed. (P.A. Mello et al. / Analytica Chimica Acta 746 (2012) 15-36.) Keywords: ICP-OES; ICP-MS; direct analysis.
In 2001, a multidisciplinary team of analytical scientists and statisticians at sanofi-aventis published a methodology which has since governed the transfer of release monographs from R&D sites to Manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, and in particular the risk management perspective introduced by ICH Q9. Although some alternative strategies have been introduced into our practices, the comparative testing strategy, based on equivalence testing as the statistical approach, remains the standard for assays bearing on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risk involved at two levels in analytical decisions within transfer studies: the risk, for the receiving laboratory, of making poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory despite its insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process established within our company for better integration of the transfer study into the method life-cycle, as well as proposals for generic acceptance criteria and designs for assay and related-substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards increased efficiency in transfer operations.
As technology advances, more and more systems are introduced that look after the user's comfort. A few years ago, hard switches were used as keys. Traditional QWERTY keyboards are bulky and offer very little in terms of enhancements. Nowadays, soft touch keypads are very popular in the market; these keypads give an elegant look and a better feel. Current keyboards are static, and their interactivity and usability would increase if they were made dynamic and adaptable. Various on-screen virtual keyboards are available, but it is difficult to accommodate a full-sized keyboard on the screen, as it hinders viewing the document being typed. A virtual keyboard has no physical appearance. Although other forms of virtual keyboards exist, they provide solutions using specialized devices such as 3D cameras, so a practical implementation of such keyboards is not feasible. The virtual keyboard that we propose uses only a standard web camera, with no additional hardware, making the technology more beneficial and more user-friendly.
With ever increasing regulatory and compendial stringency on the control of impurities (IMPs) and degradation products (DPs) (including genotoxic impurities) in drug substances and finished pharmaceutical formulations, profound emphasis is being placed on their characterization and analysis at trace levels. Fortunately, there have been parallel tremendous advancements in the instrumental techniques that allow rapid characterization of IMPs and/or DPs at the prescribed levels of ∼0.1%. With this, there is a perceptible shift from the conventional protocol of isolation and spectral analysis to on-line analysis using modern sophisticated hyphenated tools, like GC-MS, LC-MS, CE-MS, SFC-MS, LC-NMR, CE-NMR, LC-FTIR, etc. These are already being extensively used by industry, and there is a tremendous increase in publications in the literature involving their use. This write-up critically reviews the literature on the application of hyphenated tools in impurity and degradation product profiling of small molecules. A brief mention is made of possible pitfalls in the experimentation and data interpretation. Appropriate strategies are proposed, following which one can obtain unambiguous characterization of unidentified IMPs and/or DPs.
SIMPEG API is an application programming interface which provides services from BKPP Kota Bogor to public developers. SIMPEG API is a multiplatform service that can be accessed from web and mobile. The method used in developing this system was incremental. The development of SIMPEG API proceeded in three increments, which provide 31 methods for public developers plus 4 methods that are used by every other method. The methods were tested through user acceptance testing. Public developers can therefore use SIMPEG API to provide information to users by consuming its services.
This paper briefly examines the interpretation of RESTful APIs, their methods, and their responses. It explains how REST (Representational State Transfer) is preferred over SOAP (Simple Object Access Protocol). The majority of IoT devices are not directly connected to each other; they are connected via services that provide interfaces to users. In this paper, the author addresses how the Internet of Things (IoT) can profit from APIs. This paper also introduces a guide for Application Programming Interface (API) documentation. Index Terms: REST, API, SOAP, JSON, IoT, HTTP, XML. I. INTRODUCTION. Currently, data is often made available to users on servers via specialized RESTful APIs. APIs connect modern applications, and the majority of applications use APIs to establish connections and transmit data. An API exchange consists of a request and a response, which differ from each other. Originally, REST was known for connecting Web Services, but nowadays it is becoming a common method for developing applications. A RESTful Web Service is implemented using Web standards that include HTTP, XML, URI and REST principles. The paper is organized as follows: Section II defines RESTful APIs in depth. Section III justifies the need for REST. Section IV states the principles of REST APIs, and its methods are covered in Section V. Section VI discusses JSON and includes the key differences between XML and JSON. Section VII explains why REST is preferred over SOAP. Section VIII shows how APIs can be used to ease IoT. Section IX guides how to document APIs correctly. It is followed by the Conclusion, Future Scope and References.
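The REST method semantics the paper reviews can be sketched by mapping each verb onto one CRUD operation over an in-memory resource collection. The store, the status codes chosen, and the resource IDs below are illustrative only; no real HTTP server is involved.

```python
import json

store = {}  # resource_id -> resource representation (a dict)

def handle(method, resource_id, body=None):
    """Dispatch one REST-style request and return (status_code, body)."""
    if method == "GET":        # read
        return (200, store[resource_id]) if resource_id in store else (404, None)
    if method == "POST":       # create
        store[resource_id] = body
        return (201, body)
    if method == "PUT":        # full replace of an existing resource
        if resource_id not in store:
            return (404, None)
        store[resource_id] = body
        return (200, body)
    if method == "DELETE":     # remove
        return (204, None) if store.pop(resource_id, None) is not None else (404, None)
    return (405, None)         # method not allowed

# On the wire a resource travels as JSON text; in Python that is one
# dumps/loads pair around the dispatch.
wire = json.dumps({"id": 1, "name": "thermostat"})
status, _ = handle("POST", "1", json.loads(wire))
```

The uniform interface is the point: clients need to know only the verb semantics and the resource URI, not any service-specific operation names as in SOAP.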
The provision of an Application Programming Interface (API) in BIM-enabled tools can contribute to facilitating BIM-related research. APIs are useful links for running plug-ins and external programmes, but they are yet to be fully exploited in expanding the BIM scope. The modelling of n-Dimensional (nD) building performance measures can potentially benefit from BIM extension through API implementations. Sustainability is one such measure associated with buildings; for the structural engineer, recent design criteria have put great emphasis on sustainability credentials alongside the traditional criteria of structural integrity, constructability and cost. This paper examines the utilization of APIs in BIM extension and presents a demonstration of an API application to embed sustainability issues into the appraisal process of structural conceptual design options in BIM. It concludes that API implementations are useful in expanding the BIM scope. Also, the approach demonstrated in the API implementation, including process modelling, algorithms and object-based instantiations, can be applicable to other nD building performance measures relevant to the various professional platforms in the construction domain.
This paper describes the design and implementation of an Android-application-based Bluetooth toy car. In this work, the Android mobile platform is used for controlling the toy car, with an Android application providing the graphical user interface. The toy car is composed of three geared DC motors, a Bluetooth module, a microcontroller unit, an H-bridge and an LCD. Two motors are used for controlling the speed and direction of the toy car, and one motor is used for general purposes. The Bluetooth module accepts control signals from the Android mobile and sends the data to the microcontroller unit, which processes the received data and generates control signals for the DC motors and the LCD. Keil software is used for microcontroller programming, and the Android application is developed with Eclipse and the Android SDK.
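The control path can be sketched as a one-byte command decoder of the kind the microcontroller firmware would run. The byte values and the direction encoding below are hypothetical; the paper does not publish its protocol.

```python
# Hypothetical command protocol: one byte received over Bluetooth maps to a
# (left_motor, right_motor) direction pair: 1 = forward, -1 = reverse, 0 = stop.

COMMANDS = {
    b"F": (1, 1),    # forward: both wheels forward
    b"B": (-1, -1),  # backward: both wheels reverse
    b"L": (0, 1),    # turn left: right wheel only
    b"R": (1, 0),    # turn right: left wheel only
    b"S": (0, 0),    # stop
}

def decode(cmd_byte):
    """Return the motor direction pair for one received byte.
    Unknown bytes default to stop, a safe failure mode for a moving car."""
    return COMMANDS.get(cmd_byte, (0, 0))
```

On the real hardware, each direction pair would be translated into H-bridge pin levels, with PWM duty cycles setting the speed.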
Pharmaceutical excipients for topical use may contain impurities, which are often neglected from a toxicity qualification viewpoint. The possible impurities in the most frequently used topical excipients were evaluated in silico for their toxicity hazard. Acetol, an impurity likely present in different topical pharmaceutical excipients such as propylene glycol and glycerol, was selected for the evaluation of its health risk after dermal exposure.
Zero-day or unknown malware are created using code obfuscation techniques that can modify the parent code to produce offspring copies which have the same functionality but different signatures. Current techniques reported in the literature lack the capability of detecting zero-day malware with the required accuracy and efficiency. In this paper, we have proposed and evaluated a novel method of employing several data mining techniques to detect and classify zero-day malware with high levels of accuracy and efficiency based on the frequency of Windows API calls. This paper describes the methodology employed for the collection of large data sets to train the classifiers, and analyses the performance results of the various data mining algorithms adopted for the study, using a fully automated tool developed in this research to conduct the various experimental investigations and evaluations. Through the performance results of these algorithms from our experimental analysis, we are able to evaluate and discuss the advantages of one data mining algorithm over another for accurately detecting zero-day malware. The data mining framework employed in this research learns by analysing the behavior of existing malicious and benign code in large datasets. We have employed robust classifiers, namely the Naïve Bayes (NB) algorithm, the k-Nearest Neighbor (kNN) algorithm, the Sequential Minimal Optimization (SMO) algorithm with four different kernels (SMO-Normalized PolyKernel, SMO-PolyKernel, SMO-Puk, and SMO-Radial Basis Function (RBF)), the Backpropagation Neural Networks algorithm, and the J48 decision tree, and have evaluated their performance.
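One of the classifiers named above, Naïve Bayes over API-call frequencies, can be sketched in a few lines. The API-call names, labels and training data below are invented toy examples, not the paper's dataset, and a real system would use the large feature sets and tooling the paper describes.

```python
from collections import Counter
import math

# Toy multinomial Naive Bayes with add-one smoothing: classify an
# executable as malware or cleanware from the frequency of its API calls.

def train(samples):
    """samples: list of (label, list_of_api_calls)."""
    counts = {}        # label -> Counter of API-call frequencies
    priors = Counter() # label -> number of training samples
    vocab = set()
    for label, calls in samples:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(calls)
        vocab.update(calls)
    return counts, priors, vocab

def classify(counts, priors, vocab, calls):
    """Return the label with the highest posterior log-probability."""
    total = sum(priors.values())
    best, best_lp = None, -math.inf
    for label, prior in priors.items():
        lp = math.log(prior / total)
        n = sum(counts[label].values())
        for c in calls:
            # Add-one (Laplace) smoothing keeps unseen calls from zeroing out.
            lp += math.log((counts[label][c] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

The same frequency vectors feed the other classifiers in the study (kNN, SMO variants, backpropagation networks, J48); only the decision rule changes.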
This paper proposes a scalable approach for distinguishing malicious files from clean files by investigating the behavioural features using logs of various API calls. We also propose, as an alternative to the traditional method of manually identifying malware files, an automated classification system using runtime features of malware files. For both projects, we use an automated tool running in a virtual environment to extract API call features from executables and apply pattern recognition algorithms and statistical methods to differentiate between files. Our experimental results, based on a dataset of 1368 malware and 456 cleanware files, provide an accuracy of over 97% in distinguishing malware from cleanware. Our techniques provide a similar accuracy for classifying malware into families. In both cases, our results outperform comparable previously published techniques.
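The behavioural-feature pipeline described above can be sketched as follows: runtime API-call logs are reduced to per-file feature sets and a prediction is scored against known labels. The log format, the toy detection rule, and the file names are assumptions for illustration, not the paper's classifiers.

```python
# Hypothetical sketch: derive behavioural features from runtime API-call
# logs (one "<file_id> <api_name>" record per line) and evaluate a simple
# rule against known labels.
from collections import defaultdict

def parse_log(lines):
    """Group the set of API-call names by the executable that issued them."""
    features = defaultdict(set)
    for line in lines:
        file_id, api = line.split()
        features[file_id].add(api)
    return features

def rule_based_label(apis):
    """Toy behavioural rule: flag files that both write the registry and
    create remote threads (a common code-injection pattern)."""
    return "malware" if {"RegSetValueExW", "CreateRemoteThread"} <= apis else "clean"

log = [
    "a.exe RegSetValueExW", "a.exe CreateRemoteThread",
    "b.exe CreateFileW", "b.exe ReadFile",
]
labels = {"a.exe": "malware", "b.exe": "clean"}
feats = parse_log(log)
predictions = {f: rule_based_label(apis) for f, apis in feats.items()}
correct = sum(predictions[f] == labels[f] for f in labels)
print(f"accuracy = {correct / len(labels):.2f}")  # → accuracy = 1.00
```

The reported 97%+ accuracy comes from statistical classifiers over much larger feature sets, not a hand-written rule like this one.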
Cyberspace supports opportunities to capture cultures through machine and human interfaces. Art disciplines, particularly the media arts, are not only enhanced by such a network of connectivity; they have been significantly changed by it. The Internet, as a public interface, is becoming the place to foster the development of media art disciplines, because the transmission and movement of data through the interface have permitted artists to develop new ways to capture, access, and select content. Well beyond simple conversations, networked information exchange and working transactions allow social issues and cultural production to be facilitated in a rich and transparent manner. Ultimately, this leads to new creations made fundamentally possible through the interface and its ease of use. Through the resulting collaborations, the interface can alter the way we understand “data” generated in the public domain. Via reuse and reproduction into another form, the data is able to “live” again in many different ways, respective to each artist’s, designer’s, and scientist’s reinterpretation. The public interface becomes a dynamic of network culture and further advances new levels of artistic and cultural purpose.
Shopping and purchasing are an integral part of our day-to-day life, and we are all aware of the long queues at the billing counters of shopping malls. The present billing system depends on manual entries and is connected to printers for printing the bills. The prints are made with low-cost ink to keep them economical; this wastes paper, and the bills are also difficult to keep because the ink fades after a few days. Most malls have arranged payments through debit and credit cards, but online payment options are often not available. We have identified the need for a smart system that sends the bill to the customer online, sends an SMS, collects feedback, and accepts payment through online options. The implementation of this system is discussed in detail in this paper. The system is compatible with Android and Windows.
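A minimal sketch of the paperless-billing idea, assuming a simple item list and message template (both invented here): compute the bill total and format the text that would be pushed to the customer online or by SMS.

```python
# Hypothetical sketch of paperless bill generation; item names, prices
# and the message template are illustrative.
def bill_total(items):
    """items: list of (name, unit_price, quantity) tuples."""
    return sum(price * qty for _, price, qty in items)

def bill_sms(customer, items):
    """Format the short text message carrying the itemised bill."""
    total = bill_total(items)
    lines = [f"{name} x{qty}: {price * qty:.2f}" for name, price, qty in items]
    return f"Dear {customer}, your bill:\n" + "\n".join(lines) + f"\nTotal: {total:.2f}"

items = [("Soap", 25.0, 2), ("Rice 1kg", 60.0, 1)]
print(bill_sms("Asha", items))
```

In the full system this message would be handed to an SMS gateway and mirrored in the customer's online account alongside the payment link.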
Despite the security community's emphasis on the importance of building secure software, the number of new vulnerabilities found in our systems is increasing. In addition, vulnerabilities that have been studied for years, such as buffer overflows, are still commonly reported in vulnerability databases. Historically, the common response has been to blame developers for their lack of security education. This paper discusses a new hypothesis to explain this problem and introduces a new security paradigm in which software vulnerabilities are viewed as blind spots in developers' heuristic-based decision-making processes. Humans have been hardwired through evolution to use heuristics when confronted with problems and tasks. Heuristics are simple computational models for solving problems without considering all the information available, like taking shortcuts. They are an adaptive response to our limited working memory because they require less cognitive effort, but they can lead to errors. This paper's thesis is that, as software vulnerabilities represent corner cases that exercise unusual information flows, security thinking tends to be left out of the repertoire of heuristics used by developers during their programming tasks. Leveraging this paradigm, the paper introduces a novel methodology for capturing and understanding security-related blind spots in Application Programming Interfaces (APIs). Finally, it discusses how this methodology can be applied to the design and implementation of the next generation of automated diagnosis tools.
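A concrete example of such an API blind spot: Python's os.path.join silently discards all earlier components when a later component is an absolute path, so path-building code that looks obviously correct can let user input escape an intended base directory. The helper names below are illustrative.

```python
# A real API "blind spot" of the kind the paper studies: os.path.join
# drops earlier components when a later one is absolute.
import os.path

def unsafe_resource_path(base, user_supplied):
    # Looks correct at a glance, which is exactly the heuristic trap:
    # with user_supplied = "/etc/passwd" the base directory is dropped.
    return os.path.join(base, user_supplied)

def safer_resource_path(base, user_supplied):
    # Normalise and verify that the result stays under the base directory.
    candidate = os.path.normpath(os.path.join(base, user_supplied))
    if not candidate.startswith(os.path.normpath(base) + os.sep):
        raise ValueError("path escapes base directory")
    return candidate

print(unsafe_resource_path("/srv/app/data", "/etc/passwd"))  # → /etc/passwd
```

The unsafe variant exercises exactly the "unusual information flow" the paper describes: the dangerous behaviour only appears for inputs the developer's heuristic never considered.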
In order to enable accelerated preclinical and clinical pharmaceutical development of formulations based on nanomilling, a screening media mill was developed and evaluated for the production of nanoparticulate active pharmaceutical ingredients (APIs). The screening media mill is based on the principle of a conventional planetary mill, equipped with up to 24 milling beakers of 0.05-1.0 mL individual milling chamber volume. The applicability of the screening media mill was evaluated with 0.7-12.8 mg of naproxen (2 wt %) per batch, using a nanomilling formulation known from the literature. A case study for preclinical formulation development is presented, in which one operator performed 42 screening experiments within 5 working days, using 110 mg of API in total. Promising nanomilling formulations with median particle sizes below 200 nm could be identified, suitable for preclinical in vivo studies. A second case study, for the early clinical development of another proprietary API, showed successful formulation development within 12 working days. Up-scaling to a miniaturized stirred media mill of 10 mL milling chamber volume resulted in satisfying comparability of the selected formulations, with some variation in performance. Further up-scaling of the most promising formulation to a laboratory-scale stirred media mill showed the successful production of 250 g of API with a median particle size of 140 nm.
Large software frameworks and APIs can be hard to learn and use, impeding software productivity. But what are the specific challenges that programmers actually face when using frameworks and APIs in practice? What makes APIs hard to use, and what can be done to alleviate the problems associated with API usability and learnability? To explore these questions, we conducted an exploratory study in which we manually analyzed a set of newsgroup discussions about specific challenges that programmers had with a software framework. Based on this data set, we identified several categories of obstacles in using APIs and discuss what could be done to help overcome them.
This paper describes Lipi Toolkit (LipiTk) - a generic toolkit whose aim is to facilitate the development of online handwriting recognition engines for new scripts and to simplify the integration of the resulting engines into real-world applications. The toolkit provides robust implementations of the tools, algorithms, scripts and sample code necessary to support handwriting data collection and annotation, training and evaluation of recognizers, and packaging of engines and their integration into pen-based applications. The toolkit is designed to be extended with new tools and algorithms to meet the requirements of specific scripts and applications, and it attempts to satisfy the requirements of a diverse set of users, such as researchers, commercial technology providers, do-it-yourself enthusiasts and application developers. In this paper we describe the first version of the toolkit, which focuses on isolated online handwritten shape and character recognition.
The purpose of this study was to produce a dry powder for inhalation (DPI) of a poorly soluble active ingredient (itraconazole: ITZ) that would present an improved dissolution rate and enhanced solubility with good aerosolization properties. Solid dispersions of amorphous ITZ, mannitol and, when applicable, d-α-tocopherol polyethylene glycol 1000 succinate (TPGS) were produced by spray-drying hydro-alcoholic solutions in which all agents were dissolved. These dry formulations were characterized in terms of their aerosol performance and their dissolution, solubility and physical properties. Modulated differential scanning calorimetry and X-ray powder diffraction analyses showed that ITZ recovered from the different spray-dried solutions was in an amorphous state and that mannitol was crystalline. The inlet drying temperature and, indirectly, the outlet temperature selected during spray-drying were critical parameters; the outlet temperature should be kept below the ITZ glass transition temperature to avoid severe particle agglomeration. The formation of a solid dispersion between amorphous ITZ and mannitol allowed a dry powder to be produced with an improved dissolution rate, greater saturation solubility than bulk ITZ and good aerosol properties. The use of a polymeric surfactant (such as TPGS) was beneficial in terms of dissolution rate acceleration and solubility enhancement, but it also reduced aerosol performance. For example, significant dissolution rate acceleration (f2 < 50) and greater saturation solubility were obtained when introducing 1% (w/w) TPGS (the mean dissolution time dropped from 50.4 min to 36.9 min and the saturation solubility increased from 20 ± 3 ng/ml to 46 ± 2 ng/ml). However, the fine particle fraction dropped from 47 ± 2% to 37.2 ± 0.4%. This study showed that mannitol solid dispersions may provide an effective formulation type for producing DPIs of poorly soluble active ingredients, as exemplified by ITZ.
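The f2 value cited above (f2 < 50 indicating dissimilar profiles) is the standard similarity factor for comparing dissolution profiles. A short sketch of its computation, with invented example profiles rather than the study's data:

```python
# Standard f2 similarity factor for dissolution profiles:
# f2 = 50 * log10(100 / sqrt(1 + mean squared difference)).
from math import log10

def f2(reference, test):
    """Profiles are percent-dissolved values at matching time points."""
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50 * log10(100 * (1 + msd) ** -0.5)

# Identical profiles give the maximum value of 100; f2 >= 50 is the
# conventional threshold for declaring two profiles similar.
print(round(f2([20, 45, 70, 90], [20, 45, 70, 90]), 1))  # → 100.0
```

With this convention, the reported f2 < 50 for the TPGS formulation versus the reference confirms a genuinely different (faster) dissolution profile rather than measurement noise.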
High-potency active pharmaceutical ingredients (HPAPIs) are the fastest-growing segment of the pharmaceutical industry, and many new companies are entering this market segment. HPAPIs require different regulations, handling, and containment, as well as different manufacturing facilities. In addition, the plant design and machinery required differ from those used in the conventional manufacturing of dosage forms. This review focuses on the current HPAPI market, classifications, regulatory aspects, and manufacturing and handling issues.
Blending of active pharmaceutical ingredient (API) and excipients is a prerequisite to the dry manufacture of solid dosage forms intended for oral use, whether or not granulation steps are employed prior to compaction. Excipients and APIs are known to be subject to tribo-electric charging, against each other and against the materials in which the blends are manufactured (e.g. stainless steel 316). This study aimed to assess and compare the magnitude of tribo-electric charging of excipients and APIs using a material-sparing technique. Intra-sample variability in tribo-electric charging was found to be generally low. The results showed that excipients had lower charge levels and smaller variability than the API materials; some of the APIs tested charged extensively, to levels in excess of ±150 nC/g. It was also found that the extent of particle adhesion to the container walls was considerably greater for charged APIs than for the excipients. These results suggest that the extent and variability of tribo-electric charging of APIs is the predominant contributor to variability in the electrostatic charge of pharmaceutical blends and to any related formulation issues. It is therefore reasonable to conclude that, to control the electrostatic properties of a formulation, it is a priority to control the particle properties of the API.
This paper describes the design, implementation and performance of an open, high performance, dynamically extensible router under development at Washington University in St. Louis. This router supports the dynamic installation of software...
The principal objective of this paper is to present an effective solution for storing and retrieving a cancer patient’s medical history in hospitals, clinics and wherever else it is needed. We have used recent technologies: Near Field Communication (NFC) as the medium of communication, a MySQL server for storing the database, i.e. the Electronic Health Records (EHR) of patients, and an Android application that provides the user interface.
In the present study, the air pollution tolerance index (APTI) of fifteen tree species collected from a semi-urban area was determined. The study examined the air pollution tolerance indices of plant species around three areas of Virudhunagar. The physiological and biochemical parameters relative leaf water content (RWC), ascorbic acid content (AA), total leaf chlorophyll (TCH) and leaf-extract pH were used to compute the APTI values. The anticipated performance index (API) of these plant species was also calculated by considering their APTI values together with other socio-economic and biological parameters. According to the API, the plant species most tolerant for green-belt development in the Virudhunagar area were identified: Mangifera indica and Ficus religiosa, with the highest scoring (69%), were assessed as good for heavy-traffic areas and for planting along roadsides, and Ficus religiosa and Eugenia jambolana were assessed as good (69% scoring) for planting in areas around industries.
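The four parameters listed above enter the standard APTI formula, APTI = [A(T + P) + R]/10, with A the ascorbic acid content (mg/g), T the total chlorophyll (mg/g), P the leaf-extract pH and R the relative water content (%). A small sketch with illustrative values:

```python
# Standard APTI formula; the sample parameter values below are
# illustrative, not measurements from the study.
def apti(ascorbic_acid, total_chlorophyll, ph, rwc):
    """APTI = [A * (T + P) + R] / 10."""
    return (ascorbic_acid * (total_chlorophyll + ph) + rwc) / 10.0

# e.g. A = 4 mg/g, T = 2.5 mg/g, pH = 6.5, RWC = 75%
print(apti(4.0, 2.5, 6.5, 75.0))  # → 11.1
```

Higher APTI values indicate species better able to tolerate air pollution, which is how the tolerant candidates for green-belt planting are ranked before the socio-economic parameters enter the API.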
NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.
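Since both APIs ultimately read and write NeuroML's XML representation, a stdlib-only sketch can make that representation concrete. The element and attribute names below follow NeuroML v2 conventions, but this toy writer is an illustration, not libNeuroML itself.

```python
# Build a minimal NeuroML-style XML document with the standard library
# to show the kind of structure the libNeuroML object model maps onto.
import xml.etree.ElementTree as ET

root = ET.Element("neuroml", id="example_doc")
ET.SubElement(root, "izhikevich2007Cell", id="cell0",
              C="100pF", k="0.7nS_per_mV")  # parameter values are illustrative
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

libNeuroML replaces this manual element handling with typed Python objects validated against the NeuroML Schema, which is what makes programmatic model construction and round-tripping practical at scale.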
Current intrusion detection systems work in isolation from access control for the application the systems aim to protect. The lack of coordination and inter-operation between these components prevents detecting and responding to ongoing attacks in real time, before they cause damage. To address this, we apply dynamic authorization techniques to support fine-grained access control and application level intrusion detection and response capabilities. This paper describes our experience with integration of the Generic Authorization and Access Control API (GAA-API) to provide dynamic intrusion detection and response for the Apache Web Server. The GAA-API is a generic interface which may be used to enable such dynamic authorization and intrusion response capabilities for many applications.
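A hedged sketch of the dynamic-authorization idea (not the actual GAA-API interface): the access decision consults live intrusion-detection state, here a per-source failure counter, so that an ongoing attack tightens access in real time. The policy, threshold and request fields are invented for illustration.

```python
# Hypothetical sketch: an authorization check that combines a static
# access rule with dynamic intrusion-detection state.
class DynamicAuthorizer:
    def __init__(self):
        self.failed_logins = {}  # per-source failure counter (the IDS state)

    def report_failure(self, source):
        """Called by the detection side when a request from `source` fails."""
        self.failed_logins[source] = self.failed_logins.get(source, 0) + 1

    def authorize(self, source, resource):
        # Dynamic rule: sources with repeated failures are locked out
        # everywhere; this is the real-time intrusion-response part.
        if self.failed_logins.get(source, 0) >= 3:
            return False
        # Static rule: only /public is world-readable.
        return resource.startswith("/public")

auth = DynamicAuthorizer()
print(auth.authorize("10.0.0.5", "/public/index.html"))  # → True
for _ in range(3):
    auth.report_failure("10.0.0.5")
print(auth.authorize("10.0.0.5", "/public/index.html"))  # → False
```

An isolated IDS could only log those failures after the fact; coupling them into the authorization decision is what lets the server respond before damage is done.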
Mobile phones are the primary means of accessing information or communicating for those who live at the base of the pyramid (BoP). It is likely that the mobile phone will therefore also be the preferred medium to provide value-added services to those at the BoP, whether they are private users or informal businesses, for the foreseeable future. Although the prepaid mobile model has brought voice and text services to this group, sustainable, replicable models for enhanced services, products and applications are far more limited. The purpose of the study is to investigate the demand for mobile applications, services and products, with a view to increasing economic opportunities and improving well-being for users at the BoP. The key objectives of the study are the following: i) to increase understanding of the actual usage of mobile services, products and applications at the BoP, and to understand their potential for economic and social empowerment; ii) to identify scalable examples of ...
Building Information Modeling (BIM) has developed into a powerful solution for construction requirements throughout a building's life cycle. Compared to conventional methods, BIM offers a simpler, faster and more accurate methodology for modelling, estimation and analysis. In this paper, a novel Autodesk Revit add-in tool named the “Electrical System Estimation and Costing Tool” (ESECT) is proposed for the simultaneous estimation of electrical connected load, demand load, volt-amperes per square meter, the cost of electrical system construction and the monthly bill from a residential building model; the add-in is developed using the Visual C# language. Research results indicate that a wide range of automated BIM tools is possible for code-checking and estimation for design analysis at all stages of electrical system development, eventually leading to better design and cost reduction.
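To make the estimation side concrete, here is a hedged sketch of the arithmetic such a tool automates: connected load summed over fixtures, a demand factor applied to obtain demand load, and volt-amperes per square meter for the modelled floor area. The fixture list, demand factor and floor area are illustrative, and watts are treated as volt-amperes at unity power factor.

```python
# Hypothetical sketch of residential electrical load estimation;
# all input values are illustrative, not code-mandated figures.
def connected_load(fixtures):
    """fixtures: list of (name, watts, count) tuples; returns total watts."""
    return sum(watts * count for _, watts, count in fixtures)

def demand_load(connected, demand_factor=0.75):
    """Apply a demand factor (illustrative value) to the connected load."""
    return connected * demand_factor

fixtures = [("LED light", 9, 20), ("Fan", 75, 6), ("AC", 1500, 2)]
cl = connected_load(fixtures)   # 180 + 450 + 3000 = 3630 W
dl = demand_load(cl)            # 3630 * 0.75 = 2722.5 W
va_per_m2 = cl / 120.0          # assuming a 120 m^2 dwelling, unity power factor
print(cl, dl, va_per_m2)        # → 3630 2722.5 30.25
```

In the add-in, the fixture quantities and the floor area would be read directly from the Revit building model rather than entered by hand, which is what removes the manual take-off step.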