A strategy for adaptive control and energetic optimization of aerobic fermentors was implemented, with both air flow and agitation speed as manipulated variables. The strategy is separable into its components: control, optimization, and estimation. We optimized parameter estimation (from the usual KLa correlation) using sinusoidal excitation of the air flow and agitation speed, and implemented the estimation through a recursive least squares algorithm with a forgetting factor. We ran separate trials on the control, optimization, and estimation algorithms, using an original computational simulation environment with noise- and delay-generating facilities for data sampling and filtering.
Our results show the convergence and robustness of the estimation algorithm, improved by the use of both the forgetting factor and a KLa dead-band facility. The control algorithm compares favorably with PID under the integrated-area criterion for the deviation between oxygen molarity and the critical molarity (set point). The optimization algorithm clearly reduces energy consumption while respecting the critical molarity. Integration of the control, optimization, and adaptive algorithms was implemented, but future work is needed on stability; methods for stability improvement were defined and implemented. We also implemented data acquisition and computer manipulation of air flow and agitation speed for actual fermentors.
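The recursive least squares update with a forgetting factor mentioned above is a standard algorithm; a minimal sketch follows. The toy linear model and all variable names here are hypothetical illustrations, not the authors' actual KLa correlation or fermentor data.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One recursive least squares step with forgetting factor lam.

    theta: current parameter estimate; P: covariance matrix;
    x: regressor vector; y: new measurement.
    """
    x = x.reshape(-1, 1)
    K = P @ x / (lam + x.T @ P @ x)            # gain vector
    theta = theta + (K * (y - x.T @ theta)).flatten()
    P = (P - K @ x.T @ P) / lam                # discount old information
    return theta, P

# Hypothetical example: identify y = 2*u1 - 1*u2 from noisy observations
rng = np.random.default_rng(0)
true_params = np.array([2.0, -1.0])
theta = np.zeros(2)
P = np.eye(2) * 1e3                            # large initial covariance
for _ in range(500):
    x = rng.normal(size=2)
    y = x @ true_params + 0.01 * rng.normal()  # small measurement noise
    theta, P = rls_update(theta, P, x, y)
```

The forgetting factor (here 0.98) down-weights old samples exponentially, which is what lets the estimator track slowly drifting process parameters.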
In this article, estimation methods for the semiparametric generalized linear model known as the generalized partial linear model (GPLM) are reviewed. These methods are based on kernel smoothing functions for estimating the nonparametric component of the model. We derive the algorithms for the estimation process and develop them for the GPLM with
stkerhaz computes nonparametric estimates of the baseline hazard or baseline SMR and draws a graph of the results. The command can be used after stcox; in that case it requires that you previously specified stcox's basech option. Otherwise, you can specify a varname storing the cumulative baseline hazard.
The Diplomatarium Norvegicum are problematic sources for medieval Norwegian: we usually don’t know how charter language has been influenced by exemplars, who wrote and who dictated texts, or how ‘standard’ forms of writing interfered with the representation of speech. Yet in other senses these are rich historical sources that could throw light on many external and internal linguistic questions. Charters are almost always precisely dated and record rich information about signatories and locality. As noisy and complex sources, they call for sophisticated and nuanced methods when tracing change or dialectal distributions. In this talk I will present a new historical dialectological approach to tracing language change in the DN: kernel density estimation. I will focus on the loss of the dual-plural distinction in the first person pronoun, asking: when was this distinction lost? Was it prompted by contact with Low German or East Nordic, or was it a community-internal development? How did it lead to the diversity of plural forms in Modern Norwegian?
An affective interface that acquires and detects the user's emotion can potentially enhance the human-computer interface experience. In this paper, an affective brain-computer interface (ABCI) is proposed to perform affective computation on electroencephalogram (EEG) correlates of emotion. The proposed ABCI extracts EEG features from subjects exposed to six emotionally related musical and vocal stimuli, using kernel smoothing density estimation (KSDE) and Gaussian mixture model (GMM) probability estimation. A classification algorithm is subsequently used to learn and classify the extracted EEG features. An inter-subject validation study on healthy subjects assesses the performance of the ABCI with a selection of classification algorithms. The results show that the ABCI employing the Bayesian network and the one-rule classifier yielded a promising inter-subject validation accuracy of 90%.
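The paper's EEG pipeline is not reproduced here, but the kernel smoothing density estimation step it names is a generic technique; a minimal sketch of a Gaussian KSDE on synthetic one-dimensional data (all data and names hypothetical) looks like this:

```python
import numpy as np

def kde(samples, grid, bandwidth):
    """Gaussian kernel density estimate evaluated at each grid point."""
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    kernel = np.exp(-0.5 * diffs**2)           # Gaussian kernel, unnormalized
    norm = len(samples) * bandwidth * np.sqrt(2 * np.pi)
    return kernel.sum(axis=1) / norm

# Hypothetical stand-in for a scalar EEG feature, not real recordings
rng = np.random.default_rng(1)
samples = rng.normal(0.0, 1.0, size=2000)
grid = np.linspace(-4.0, 4.0, 161)
dens = kde(samples, grid, bandwidth=0.3)
```

In a classifier, density estimates like `dens`, built per emotion class, give the class-conditional likelihoods that a Bayesian-style decision rule compares.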
Hidden process models are a conceptually useful and practical way to simultaneously account for process variation in animal population dynamics and measurement errors in observations and estimates made on the population. Process variation, which can be both demographic and environmental, is modeled by linking a series of stochastic and deterministic subprocesses that characterize processes such as birth, survival, maturation,
We study the estimation of the mean function of a continuous-time stochastic process and its derivatives. The covariance function of the process is assumed to be nonparametric and to satisfy mild smoothness conditions. Assuming that $n$ independent realizations of the process are observed at a sampling design of size $N$ generated by a positive density, we derive the asymptotic bias and variance of the local polynomial estimator as $n,N$ increase to infinity. We deduce optimal sampling densities, optimal bandwidths, and propose a new plug-in bandwidth selection method. We establish the asymptotic performance of the plug-in bandwidth estimator and we compare, in a simulation study, its performance for finite sizes $n, N$ to the cross-validation and the optimal bandwidths. A software implementation of the plug-in method is available in the R environment.
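A local polynomial (here local linear) estimator of a mean function is a standard construction; the sketch below illustrates it on synthetic curves. The sine target, sample sizes, bandwidth, and all names are hypothetical, and this is plain local linear smoothing, not the paper's plug-in bandwidth method.

```python
import numpy as np

def local_linear(x_obs, y_obs, x_eval, h):
    """Local linear estimate of the mean function at x_eval,
    using a Gaussian kernel with bandwidth h."""
    est = np.empty_like(x_eval)
    for i, x0 in enumerate(x_eval):
        u = x_obs - x0
        w = np.exp(-0.5 * (u / h) ** 2)            # kernel weights around x0
        X = np.column_stack([np.ones_like(u), u])  # local linear design matrix
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_obs)
        est[i] = beta[0]                           # intercept = fitted mean at x0
    return est

# n independent realizations observed at a common design of N points
rng = np.random.default_rng(2)
n, N = 30, 80
t = np.sort(rng.uniform(0.0, 1.0, N))
curves = np.sin(2 * np.pi * t) + rng.normal(0.0, 0.2, size=(n, N))
ybar = curves.mean(axis=0)                         # pointwise average over realizations
t_eval = np.linspace(0.1, 0.9, 9)                  # interior points, away from boundaries
mhat = local_linear(t, ybar, t_eval, h=0.06)
```

Averaging the realizations first and then smoothing is the usual two-step route when all curves share the design; the bandwidth `h` plays exactly the role that the paper's plug-in selector is designed to choose automatically.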
The estimation of spatial intensity and relative risk are important inference problems in spatial epidemiologic studies. A standard component of data assimilation in these studies is the assignment of a geocode, i.e. point-level spatial coordinates, to the address of each subject in the study population. Unfortunately, when geocoding is performed by the pervasive method of street-segment matching to a
Estimation and inference for the hemodynamic response function (HRF) using multi-subject fMRI data are considered. Within the context of the General Linear Model, two new nonparametric estimators for the HRF are proposed. The first is a kernel-smoothed estimator, which is used to construct hypothesis tests on the entire HRF curve, in contrast to only summaries of the curve as in most existing tests. To cope with the inherent large data variance, we introduce a second approach which imposes Tikhonov regularization on the ...
Summary Competing risks arise naturally in time-to-event studies. In this article, we propose time-dependent accuracy measures for a marker when we have censored survival times and competing risks. Time-dependent versions of sensitivity or true positive (TP) fraction naturally correspond to consideration of either cumulative (or prevalent) cases that accrue over a fixed time period, or alternatively to incident cases that are observed among event-free subjects at any select time. Time-dependent (dynamic) specificity (1–false positive (FP)) can be based on the marker distribution among event-free subjects. We extend these definitions to incorporate cause of failure for competing risks outcomes. The proposed estimation for cause-specific cumulative TP/dynamic FP is based on the nearest neighbor estimation of bivariate distribution function of the marker and the event time. On the other hand, incident TP/dynamic FP can be estimated using a possibly nonproportional hazards Cox model for the cause-specific hazards and riskset reweighting of the marker distribution. The proposed methods extend the time-dependent predictive accuracy measures of Heagerty, Lumley, and Pepe (2000, Biometrics 56, 337–344) and Heagerty and Zheng (2005, Biometrics 61, 92–105).
This paper presents a new method for the reconstruction of three-dimensional objects surveyed with a terrestrial laser scanner. The method is a 2.5D surface modelling technique based on the application of statistical nonparametric regression methods, specifically kernel-smoothing techniques, for point cloud regularization and mesh smoothing. The proposed algorithm was first tested on a theoretical model, with simulations carried out to evaluate the method's ability to filter the random noise and oscillations introduced during data acquisition in the field; the results were satisfactory. The method was then applied, as a reverse engineering tool, to real-world data in the field of naval construction. It provides a precise solution to the problem of obtaining realistic surfaces and sections of large industrial objects from 3D laser point clouds, and has proved efficient in terms of computational time.
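The core regularization step, smoothing noisy scanned heights onto query locations with a kernel regression, can be sketched with a plain Nadaraya-Watson smoother. The paraboloid test surface, noise level, and bandwidth below are hypothetical stand-ins, not the paper's naval-construction data or its exact estimator.

```python
import numpy as np

def nw_smooth(points, values, queries, h):
    """Nadaraya-Watson kernel smoother for a 2.5D surface:
    estimate height z at query (x, y) locations from noisy scattered samples."""
    d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-0.5 * d2 / h**2)               # Gaussian weights by planar distance
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical scan: a paraboloid z = x^2 + y^2 with additive scanner noise
rng = np.random.default_rng(3)
pts = rng.uniform(-1.0, 1.0, size=(3000, 2))   # scanned (x, y) locations
z_true = pts[:, 0] ** 2 + pts[:, 1] ** 2
z_noisy = z_true + rng.normal(0.0, 0.05, size=len(z_true))
queries = np.array([[0.0, 0.0], [0.5, 0.0], [0.0, -0.5]])
z_hat = nw_smooth(pts, z_noisy, queries, h=0.1)
```

Evaluating the smoother on a regular grid of queries yields the regularized 2.5D mesh from which surfaces and sections can be extracted.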
Precise knowledge of the number and nature of the species belonging to a fossil assemblage, as well as the age and sex structure at death of each species, is of great importance in zooarchaeology. Mixture Analysis based on the method of Maximum Likelihood is a modern statistical technique that addresses the problem of samples consisting of several components whose composition is not known. Nonparametric Bootstrap and Jackknife techniques are used to calculate a confidence interval for each estimated parameter (prior probability, mean, standard deviation) of each group. The Bootstrap method is also used to evaluate mathematically how many groups are present in a sample. Experimental density smoothing using the kernel method appears to be a better solution than histograms for estimating a distribution. Some preliminary results concerning sex ratio assessments using bone metric data of goats (Capra) and fallow deer (Dama mesopotamica) from Neolithic sites are presented and discussed.
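The nonparametric bootstrap interval named above is easy to illustrate. The sketch below uses a simulated two-component sample as a stand-in for bone measurements; the measurement values, component parameters, and function names are all hypothetical, and this shows only the percentile interval for a single statistic, not the full mixture analysis.

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile confidence interval for stat(data)."""
    rng = np.random.default_rng(seed)
    reps = np.array([
        stat(rng.choice(data, size=len(data), replace=True))  # resample with replacement
        for _ in range(n_boot)
    ])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

# Hypothetical bone-metric sample drawn from a two-sex mixture
rng = np.random.default_rng(4)
measurements = np.concatenate([
    rng.normal(50.0, 2.0, 120),    # e.g. one sex, smaller bones
    rng.normal(58.0, 2.0, 80),     # e.g. the other sex, larger bones
])
lo, hi = bootstrap_ci(measurements, np.mean)
```

The same resampling loop, wrapped around the mixture-model fit instead of `np.mean`, yields intervals for each group's prior probability, mean, and standard deviation as described in the abstract.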
In this paper, a weighted random subspace method (RSM) with automatic subspace dimensionality selection is proposed for classifying hyperspectral image data. The dimensionality selection is based on the importance distribution of dimensionality, estimated by a kernel smoothing technique during algorithm training. Two feature weighting methods, based on normalized re-substitution accuracy and Fisher's LDA separability, are introduced to improve the original RSM. Experimental results show that ...