Gradient dynamical systems with discontinuous right-hand sides are designed using Persidskii-type nonsmooth Lyapunov functions to work as support vector machines (SVMs) for the discrimination of nonseparable classes. The gradient systems are obtained from an exact penalty method applied to the constrained quadratic optimization problems, which are formulations of two well-known SVMs. Global convergence of the trajectories of the gradient dynamical systems to the solution of the corresponding constrained problems is shown to be independent of the penalty parameters and of the parameters of the SVMs. The proposed gradient systems can be implemented as simple analog circuits as well as using standard software for the integration of ODEs, and in order to use efficient integration methods with adaptive stepsize selection, the discontinuous terms are smoothed in a neighborhood of the discontinuity surface by means of the boundary layer technique. The scalability of the proposed gradient systems is also demonstrated by an implementation on parallel computers, resulting in shorter processing times than traditional SVM packages.
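The combination of techniques described above (an exact penalty reformulation, a gradient flow toward the minimizer, and boundary-layer smoothing of the discontinuous penalty terms) can be sketched on a toy problem. This is a hypothetical illustration, not the paper's SVM formulation: the QP, the penalty weight K, and the layer width eps are all chosen here for demonstration.

```python
import numpy as np

# Toy QP (illustrative assumption, not the SVM dual of the paper):
#     minimize 1/2 ||x||^2   subject to   x1 + x2 = 1,  x >= 0,
# whose solution is x = (0.5, 0.5). The exact penalty energy is
#     E(x) = 1/2 ||x||^2 + K |a'x - b| + K * sum_i max(0, -x_i),
# and the gradient flow is dx/dt = -grad E(x). The discontinuous
# sign term is smoothed with a boundary layer of width eps so that
# a fixed-step Euler integration remains stable.

def sat(s):
    """Boundary-layer approximation of sign(s): linear inside the layer."""
    return np.clip(s, -1.0, 1.0)

def penalty_gradient(x, K, eps):
    a, b = np.array([1.0, 1.0]), 1.0
    r = a @ x - b                     # equality-constraint residual
    grad = x.copy()                   # gradient of 1/2 ||x||^2
    grad += K * sat(r / eps) * a      # smoothed gradient of K |a'x - b|
    grad -= K * (x < 0)               # subgradient of K * max(0, -x_i)
    return grad

# Explicit Euler integration of the gradient flow.
K, eps, h = 10.0, 0.05, 0.002         # penalty weight, layer width, stepsize
x = np.zeros(2)
for _ in range(5000):
    x -= h * penalty_gradient(x, K, eps)

print(x)  # close to (0.5, 0.5), up to an O(eps) boundary-layer bias
```

Note the trade-off the boundary layer introduces: a smaller eps gives an equilibrium closer to the exact constrained solution but stiffens the dynamics, forcing a smaller integration stepsize; adaptive-stepsize ODE solvers mitigate exactly this issue, which is the motivation stated in the abstract.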
This paper presents the implementation of two neural networks for SVM training on parallel computers. The results obtained are compared with two well-known SVM training packages. The parallel implementation shows that the neural-network approach can be as accurate as the traditional packages and, since the proposed gradient-based neural networks are easily parallelized, the approach is scalable and training times can be considerably reduced.
Papers by Leonardo Ferreira