Pando Georgiev


    ABSTRACT In many applications of Independent Component Analysis (ICA) and Blind Source Separation (BSS), the estimated source signals and the mixing or separating matrices have some special structure, or constraints are imposed on the matrices, such as symmetry, orthogonality, non-negativity, sparseness, and a specified invariant norm of the separating matrix. In this paper we present several algorithms and review some known transformations which allow us to preserve several important constraints.
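    One standard transformation for preserving an orthogonality constraint on a separating matrix — offered here only as a minimal NumPy illustration, not as the specific algorithm of the paper — is to project the current matrix onto the nearest orthogonal matrix via its singular value decomposition:

```python
import numpy as np

def nearest_orthogonal(W):
    """Project W onto the orthogonal group: for W = U S V^T, the nearest
    orthogonal matrix in Frobenius norm is U V^T."""
    U, _, Vt = np.linalg.svd(W)
    return U @ Vt
```

    Applying this projection after each update of a learning rule keeps the separating matrix exactly orthogonal at every step.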
    SUMMARY In many applications of Independent Component Analysis (ICA) and Blind Source Separation (BSS), the estimated source signals and the mixing or separating matrices have some special structure, or constraints are imposed on the matrices, such as symmetry, orthogonality, non-negativity, sparseness, and a specified invariant norm of the separating matrix. In this paper we present several algorithms and review some known transformations which allow us to preserve several important constraints. Computer simulation experiments confirmed the validity and usefulness of the developed algorithms.
    In many applications of independent component analysis (ICA) and blind source separation (BSS), the mixing or separating matrices have some special structure, or constraints are imposed on the matrices, such as symmetry, orthogonality, non-negativity, sparseness, and unit (or specified invariant) norm of the matrix. We present several algorithms and review some known transformations which allow us to preserve such constraints. In particular, we propose algorithms for a blind identification problem with non-negativity constraints.
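    A common way to impose non-negativity in blind identification — sketched here with the classical Lee–Seung multiplicative updates for non-negative matrix factorization, which is one standard technique and not necessarily the authors' algorithm — is to factor a non-negative data matrix X into non-negative factors W and H:

```python
import numpy as np

def nmf(X, r, n_iter=200, eps=1e-9):
    """Lee-Seung multiplicative updates for X ~ W H with W, H >= 0.
    The updates multiply by non-negative ratios, so non-negativity of
    W and H is preserved automatically at every iteration."""
    rng = np.random.default_rng(0)
    m, n = X.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

    The small constant `eps` (a numerical safeguard, chosen here for illustration) prevents division by zero when an entry shrinks toward zero.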
    Preface.- List of Contributors.- 1 Optimizing Organ Allocation and Acceptance: Oguzhan Alagoz, Andrew J. Schaefer, Mark S. Roberts.- 2 Can We Do Better? Optimization Models for Breast Cancer Screening: Julie Simmons Ivy.- 3 Optimization Models and Computational Approaches for Three-dimensional Conformal Radiation Treatment Planning: Gino J. Lim.- 4 Continuous Optimization of Beamlet Intensities for Intensity Modulated Photon and Proton Radiotherapy: Rembert Reemtsen and Markus Alber.- 5 Multicriteria Optimization in Intensity Modulated Radiotherapy Planning: Karl-Heinz Küfer, Michael Monz, Alexander Scherrer, Philipp Süss, Fernando Alonso, Ahmad Saher Azizi Sultan, Thomas Bortfeld, Christian Thieke.- 6 Algorithms for Sequencing Multileaf Collimators: Srijit Kamath, Sartaj Sahni, Jatinder Palta, Sanjay Ranka, Jonathan Li.- 7 Image Registration and Segmentation Based on Energy Minimization: Michael Hintermüller, Stephen L. Keeling.- 8 Optimization Techniques for Data Representations wi...
    ... of International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2004), Montreal, Canada, May 17-21, 2004. [GCB04] Georgiev P., Cichocki A. ... [HKO01] A. Hyvärinen, J. Karhunen and E. Oja, Independent Component Analysis, John Wiley & Sons, 2001. ...
    Kernel Principal Component Analysis (KPCA) is a dimension reduction method closely related to Principal Component Analysis (PCA). This report gives an overview of kernel PCA and presents an implementation of the method in MATLAB. The implemented method is tested in a transductive setting on two databases. Two methods for labeling data points are considered, the nearest neighbor method and kernel regression, together with some possible improvements of these methods.
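    The core of kernel PCA can be sketched in a few lines — shown here in Python/NumPy rather than the report's MATLAB, with an RBF kernel and the `gamma` bandwidth as illustrative choices: form the kernel matrix, centre it in feature space, and take the leading eigenvectors.

```python
import numpy as np

def kernel_pca(X, n_components, gamma=1.0):
    """Kernel PCA with an RBF kernel. X has one data point per row.
    Returns the projections of the training points onto the leading
    kernel principal components."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # centre the kernel matrix in feature space
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigendecomposition of the centred kernel; keep the top components
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # scale eigenvectors by sqrt of eigenvalues to obtain projections
    return vecs * np.sqrt(np.maximum(vals, 0))
```

    In a transductive setting, labeled and unlabeled points are stacked into `X` together before the decomposition, so the unlabeled points receive coordinates in the same component space.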
    We extend the idea of reproducing kernel Hilbert spaces (RKHS) to Banach spaces, developing a theory of pairs of reproducing kernel Banach spaces (RKBS) without requiring the existence of a semi-inner product (a requirement already explored in another construction of RKBS). We present several natural examples, which involve RKBS of functions with the supremum norm and with the l_p-norm (1 ≤ p ≤ ∞). Special attention is devoted to the case of a pair of RKBS \((B,{B}^{\sharp })\) in which B has the sup-norm and \({B}^{\sharp }\) has the l_1-norm. Namely, we show that if \((B,{B}^{\sharp })\) is generated by a universal kernel and B is furnished with the sup-norm, then \({B}^{\sharp }\), furnished with the l_1-norm, is linearly isomorphically embedded in the dual of B. We reformulate the classical classification problem (support vector machine classifier) in RKBS and suggest that it will have sparse solutions when the RKBS is furnished with the l_1-norm.
    ... For example, for a mixture of two uniformly distributed sources, independent components and sparse components are quite ... 2. ROBUST SPARSE SIGNAL REPRESENTATIONS Sparse Component Analysis (SCA) and sparse signal representations (SSR) arise in many scientific ...
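    Sparse signal representations are typically computed via l1-regularized least squares. As a minimal illustration of the idea — the classical iterative soft-thresholding algorithm (ISTA), offered as a generic sketch and not as the robust method developed in this paper — one can solve min_x 0.5‖Ax − y‖² + λ‖x‖₁:

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=200):
    """Iterative soft-thresholding for the l1-regularized least-squares
    problem min_x 0.5*||Ax - y||^2 + lam*||x||_1. Each step is a gradient
    step on the quadratic term followed by soft-thresholding, which
    promotes sparsity in x."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - y)
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x
```

    Starting from x = 0, the objective is non-increasing at every iteration, and larger values of `lam` drive more coefficients exactly to zero.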
    Abstract: We show how the necessary conditions for a local minimum can be used to obtain a first-order sufficient optimality condition for a global minimum of a locally Lipschitz function on a closed convex set. Using a theorem of Clarke, we obtain a short proof and an extension to Banach spaces of a result of Hiriart-Urruty and Ledyaev.
    Abstract— We define a fuzzy subspace skeleton of data points and propose an algorithm for finding it. Such a skeleton is connected with data representation: if the data points (represented as columns of a given matrix X) belong exactly to this fuzzy skeleton, then under some ...
    We introduce identifiability conditions for the blind source separation (BSS) problem, combining second- and fourth-order statistics. We prove that under these conditions, well-known methods (such as eigenvalue decomposition and joint diagonalization) can be applied with probability one, i.e. the set of parameters for which such a method does not solve the BSS problem has measure zero.
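    A classical second-order-statistics method of the eigenvalue-decomposition family (an AMUSE-style sketch, given here for orientation rather than as the exact procedure analyzed in the paper) whitens the mixtures and then diagonalizes a time-lagged covariance:

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE-style BSS sketch. X is (n_sources, n_samples) of mixtures.
    Step 1: whiten via the eigendecomposition of the sample covariance.
    Step 2: rotate by the eigenvectors of a symmetrized lagged covariance."""
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T   # whitening matrix
    Z = W @ X                                  # whitened mixtures
    C1 = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
    C1 = (C1 + C1.T) / 2                       # symmetrize the lagged covariance
    _, V = np.linalg.eigh(C1)
    return V.T @ Z                             # estimated sources
```

    The identifiability conditions in the abstract concern exactly when the eigenvalues of such matrices are distinct, so that the rotation `V` (and hence the sources, up to permutation and scale) is uniquely determined.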
    In this paper, we consider several kinds of convexity and semicontinuity of multifunctions with respect to ordering cones, and we investigate properties of convexity and semicontinuity inherited through scalarization of multifunctions. We introduce four kinds of scalarizing functions to characterize images of a multifunction by using Tchebyshev scalarization; these (real-valued) scalarizing functions have the same sorts of convexity and semicontinuity as their parent multifunctions.
    We present two methods for data representations based on matrix factorization: Independent Component Analysis (ICA) and Sparse Component Analysis (SCA). Our presentation focuses on the mathematical foundation of ICA and SCA based on optimization theory, which appears to be enough for a rigorous justification of the methods, although ICA methods are usually justified by principles from physics, such as ...
    Abstract Solutions φ(x) of the functional equation φ(φ(x)) = f(x) are called iterative roots of the given function f(x). They are of interest in dynamical systems, chaos and complexity theory, and also in the modeling of certain industrial and financial processes. The problem of computing this “square root” of a function or operator remains a hard task. While the theory of functional equations provides some insight for real- and complex-valued functions, iterative roots of nonlinear mappings from R^n to R^n are less studied from a theoretical and ...
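    For the special case of a linear map f(x) = Ax with A symmetric positive definite, an iterative root can be written down explicitly as φ(x) = Bx with B = A^{1/2}, computed from the eigendecomposition of A — a simple worked example of the "square root of a function" idea, not a method for the harder nonlinear case the abstract discusses:

```python
import numpy as np

def matrix_iterative_root(A):
    """For the linear map f(x) = A x with A symmetric positive definite,
    phi(x) = B x with B = A^{1/2} satisfies phi(phi(x)) = f(x).
    B is computed by taking square roots of the eigenvalues of A."""
    d, V = np.linalg.eigh(A)
    return V @ np.diag(np.sqrt(d)) @ V.T
```

    Composing φ with itself then reproduces f exactly: B(Bx) = (BB)x = Ax. The nonlinear case has no such closed form, which is what makes it hard.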
    ABSTRACT We consider four variants of Fan's type inequality for vector-valued multifunctions in topological vector spaces with respect to a cone preorder in the target space. To establish these results, we first prove a two-function result of Simons directly from the scalar Fan inequality; then, with its help, we derive a new two-function result, which forms the basis of our proofs.
