This article presents a new estimation method for the parameters of a time series model. We consider here composite Gaussian processes that are the sum of independent Gaussian processes, each of which explains an important aspect of the time series, as is the case in engineering and the natural sciences. The proposed estimation method offers an alternative to classical likelihood-based estimation that is straightforward to implement and often the only feasible estimation method for complex models. The estimator is obtained by optimizing a criterion based on a standardized distance between the sample wavelet variance (WV) estimates and the model-based WV. Indeed, the WV provides a decomposition of the process variance across different scales, so that it contains information about different features of the stochastic model. We derive the asymptotic properties of the proposed estimator for inference and perform a simulation study to compare our estimator to t...
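As a rough illustration of the kind of criterion described in this abstract, the sketch below (my own toy code, not the authors' implementation; the helper names and the simple identity weighting are illustrative) computes an unnormalized Haar-filter sample wavelet variance and a quadratic distance matching it against a model-implied WV:

```python
import numpy as np

def haar_wavelet_variance(x, J):
    """Sample wavelet variance at scales 2^1, ..., 2^J using a Haar filter:
    each coefficient is the difference between the means of two adjacent
    half-windows (an unnormalized textbook variant)."""
    wv = []
    for j in range(1, J + 1):
        m, half = 2 ** j, 2 ** (j - 1)
        coeffs = np.array([x[t + half:t + m].mean() - x[t:t + half].mean()
                           for t in range(len(x) - m + 1)])
        wv.append(coeffs.var(ddof=1))
    return np.array(wv)

def wv_objective(nu_hat, nu_model, omega):
    """Standardized (quadratic-form) distance between sample and model WV."""
    diff = nu_hat - nu_model
    return float(diff @ omega @ diff)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=4096)        # white noise with variance 1
nu_hat = haar_wavelet_variance(x, J=4)
# Under this Haar variant, white noise with variance s2 implies WV s2 * 2^(2-j)
model = lambda s2: s2 * 2.0 ** (2 - np.arange(1, 5))
print(wv_objective(nu_hat, model(1.0), np.eye(4)))
```

Minimizing `wv_objective` over the model parameters (here a single variance `s2`) is the matching step; the true parameter yields a much smaller distance than a misspecified one.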
Inequality measures are often used to summarise information about empirical income distributions. However, the resulting picture of the distribution, and of changes in the distribution, can be severely distorted if the data are contaminated. The nature of this distortion will in general depend upon the underlying properties of the inequality measure. We investigate this issue theoretically using a technique based on the influence function, and illustrate the magnitude of the effect using a simulation. We consider both direct nonparametric estimation from the sample, and indirect estimation using a parametric model. In the latter case we demonstrate the application of a robust estimation procedure.
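The distortion from contamination can be illustrated numerically. The following sketch (my own toy simulation, not the paper's) computes the Gini coefficient on a clean lognormal "income" sample and on the same sample with a single extreme observation appended:

```python
import numpy as np

def gini(x):
    """Gini coefficient via the sorted-index (mean-absolute-difference) formula:
    G = 2 * sum_i(i * x_(i)) / (n * sum(x)) - (n + 1) / n, i = 1..n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1.0) / n

rng = np.random.default_rng(0)
clean = rng.lognormal(mean=0.0, sigma=0.5, size=5000)
contaminated = np.append(clean, clean.max() * 100)  # one extreme "income"
print(gini(clean), gini(contaminated))
```

A single contaminated observation out of 5001 noticeably inflates the estimated inequality, which is the kind of distortion the influence-function analysis quantifies.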
Average (bio)equivalence tests are used to assess whether a parameter, such as the mean difference in treatment response between two conditions, lies within a given equivalence interval, hence allowing one to conclude that the conditions have "equivalent" means. The Two One-Sided Tests (TOST) procedure, which consists in testing whether the target parameter is significantly greater than a pre-defined lower equivalence limit and significantly lower than an upper one, is typically used in this context, usually by checking whether the confidence interval for the target parameter lies within these limits. This intuitive and visual procedure is, however, known to be conservative, especially in the case of highly variable drugs, where it shows a rapid power loss, often reaching zero, hence making it impossible to conclude for equivalence when it actually holds. Here, we propose a finite sample correction of the TOST procedure, the α-TOST, which consists in a correction of the significanc...
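For reference, the standard TOST (before any correction such as the α-TOST proposed here) can be sketched as two one-sided t-tests on a sample of paired differences; the equivalence limits and sample sizes below are purely illustrative:

```python
import numpy as np
from scipy import stats

def tost(diff, lower, upper, alpha=0.05):
    """Two One-Sided Tests on (paired) differences: declare equivalence when
    the mean is significantly above `lower` AND significantly below `upper`,
    i.e. both one-sided p-values are below alpha. This is equivalent to the
    100(1 - 2*alpha)% confidence interval lying inside (lower, upper)."""
    diff = np.asarray(diff, dtype=float)
    n = len(diff)
    m, se = diff.mean(), diff.std(ddof=1) / np.sqrt(n)
    p_lower = stats.t.sf((m - lower) / se, df=n - 1)   # H0: mean <= lower
    p_upper = stats.t.cdf((m - upper) / se, df=n - 1)  # H0: mean >= upper
    return bool(max(p_lower, p_upper) < alpha)

rng = np.random.default_rng(1)
equivalent = tost(rng.normal(0.02, 0.1, size=200), -0.2, 0.2)
not_equivalent = tost(rng.normal(0.5, 0.1, size=200), -0.2, 0.2)
print(equivalent, not_equivalent)
```

The conservativeness discussed in the abstract arises because the double-testing construction rejects less often than a size-α procedure would, with the power loss most severe when the standard error is large relative to the equivalence margin.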
The evolution of volcanic plumbing systems towards eruptions of different styles and sizes largely depends on processes at crustal depths that are outside our observational capabilities. These processes can be modeled, and the outputs of the simulations can be compared with the chemistry of the erupted products and with geophysical and geodetic data to retrieve information on the architecture of the plumbing system and the processes leading to eruption. The interaction between magmas with different physical and chemical properties often precedes volcanic eruptions. Thus, sophisticated numerical models have been developed that describe in detail the dynamics of interacting magmas, specifically aimed at evaluating pre-eruptive magma mingling and mixing timescales. However, our ability to explore the parameter space in order to match petrological and geophysical observations is limited by the extremely high computational cost of these multiphase, multicomponent computational fluid dynamics simulati...
Aims. The primary difficulty in understanding the sources and processes that powered cosmic reionization is that it is not possible to directly probe the ionizing Lyman-continuum (LyC) radiation at that epoch, as those photons have been absorbed by the intervening neutral hydrogen. It is therefore imperative to build a model that accurately predicts LyC emission using other properties of galaxies in the reionization era. Methods. In recent years, studies have shown that the LyC emission from galaxies may be correlated with their Lyman-alpha (Lyα) emission. In this paper we study this correlation by analyzing thousands of simulated galaxies at high redshift in the SPHINX cosmological simulation. We post-process these galaxies with the Lyα radiative transfer code RASCAS and analyze the Lyα–LyC connection. Results. We find that the Lyα and LyC luminosities are strongly correlated with each other, although with dispersion. There is a positive correlation between the escape fractions of Lyα ...
We present a new framework for the robust estimation of time series models which is fairly general and, for example, covers models ranging from ARMA to state-space models. This approach provides estimators which are (i) consistent and asymptotically normally distributed, (ii) applicable to a broad spectrum of time series models, (iii) straightforward to implement and (iv) computationally efficient. The framework is based on the recently developed Generalized Method of Wavelet Moments and a new robust estimator of the wavelet variance. Compared to existing methods, the latter directly estimates the quantity of interest while performing better in finite samples and requiring milder conditions for its asymptotic properties to hold. Hence, not only does this paper provide an alternative estimator which allows one to perform wavelet variance analysis when data are contaminated, but also a general approach to robustly estimate the parameters of a variety of time series models. The simulation studies...
An important challenge in statistical analysis concerns the control of the finite sample bias of estimators. This problem is magnified in high dimensional settings where the number of variables p diverges with the sample size n. However, it is difficult to establish whether an estimator θ̂ of θ0 is unbiased, and the asymptotic order of E[θ̂] − θ0 is commonly used instead. We introduce a new property to assess the bias, called phase transition unbiasedness, which is weaker than unbiasedness but stronger than asymptotic results. An estimator satisfying this property is such that ‖E[θ̂] − θ0‖₂ = 0 for all n greater than a finite sample size n*. We propose a phase transition unbiased estimator obtained by matching an initial estimator computed on the sample and on simulated data. It is computed using an algorithm which is shown to converge exponentially fast. The initial estimator is not required to be consistent and thus may be conveniently chosen for computational efficiency or for other prope...
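The matching idea can be illustrated with a deliberately simple toy example (my own sketch under simplifying assumptions, not the paper's algorithm): an iterative scheme that adjusts the parameter until the initial estimator, averaged over data simulated at the current value, matches its value on the observed sample:

```python
import numpy as np

def iterative_matching(pi_hat, simulate, pi_fn, theta0, n_iter=20, H=500, rng=None):
    """Bias correction by matching: repeatedly move theta by the gap between
    the sample estimate pi_hat and the Monte Carlo average of the same
    (possibly biased) estimator on data simulated from the current theta."""
    rng = rng or np.random.default_rng(0)
    theta = theta0
    for _ in range(n_iter):
        sims = np.mean([pi_fn(simulate(theta, rng)) for _ in range(H)])
        theta = theta + (pi_hat - sims)    # shrink the matching gap
    return theta

# Toy example: np.var with ddof=0 underestimates a normal variance by the
# factor (n - 1) / n; matching pulls the estimate back up by roughly n / (n - 1).
rng = np.random.default_rng(42)
n, true_var = 10, 4.0
x = rng.normal(0.0, np.sqrt(true_var), size=n)
pi_hat = np.var(x)                         # biased initial estimate
simulate = lambda v, r: r.normal(0.0, np.sqrt(max(v, 1e-8)), size=n)
corrected = iterative_matching(pi_hat, simulate, np.var, theta0=pi_hat)
print(pi_hat, corrected)
```

Note that, as in the abstract, the initial estimator need not be unbiased or even consistent; it only needs to carry enough information about the parameter for the matching gap to have a well-behaved fixed point.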
We present an algorithm for determining the nature of stochastic processes, together with their parameters, based on the analysis of time series of inertial errors. The algorithm is suitable mainly (but not only) for situations in which several stochastic processes are superposed. In such cases, classical approaches based on the analysis of the Allan variance or the power spectral density (PSD) are likely to fail due to the difficulty of separating the underlying error processes in the spectral domain. The developed alternative is based on the recently proposed method called the Generalized Method of Wavelet Moments (GMWM), whose resulting estimator was proven to be consistent and asymptotically normally distributed. The principle of this method is to match the empirical and model-based wavelet variances (WV). In this study we propose a goodness-of-fit criterion which can be used to determine the suitability of a candidate model, and apply it to low-cost inertial sensors. The suggested approach of model selection relies on ...
Papers by Maria-Pia Victoria-Feser