2006, Nonconvex Optimization and Its Applications
1999
Abstract. The isotonic regression problem has applications in statistics, operations research, and image processing. In this paper, a general framework for isotonic regression algorithms is proposed. Under this framework, we discuss the isotonic regression problem in the case where the directed graph specifying the order restriction is a directed tree with n vertices. A new algorithm is presented for this case, which can be regarded as a generalization of the PAV algorithm of Ayer et al.
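The tree-order algorithm itself is not given in the abstract, but the classical pool-adjacent-violators procedure it generalizes is easy to state. The following is a minimal Python sketch of PAV for the weighted least-squares, linear-order case; the function name and the stack-of-blocks representation are illustrative choices, not taken from the paper.

def pav(y, w=None):
    """Nondecreasing least-squares isotonic fit of y on a linear order,
    with optional positive weights w (classical pool-adjacent-violators)."""
    w = [1.0] * len(y) if w is None else list(w)
    blocks = []  # each block: [fitted level (weighted mean), total weight, point count]
    for yi, wi in zip(y, w):
        blocks.append([yi, wi, 1])
        # pool adjacent blocks while they violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2, c1 + c2])
    fit = []
    for level, _, count in blocks:
        fit.extend([level] * count)
    return fit

print(pav([3.0, 1.0, 2.0, 5.0, 4.0]))  # [2.0, 2.0, 2.0, 4.5, 4.5]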
2019
This note is a status report on the fastest known isotonic regression algorithms for various Lp metrics and partial orderings. The metrics considered are unweighted and weighted L0, L1, L2, and L∞. The partial orderings considered are linear, tree, d-dimensional grids, points in d-dimensional space with componentwise ordering, and arbitrary orderings (posets). Throughout, “fastest” means for the worst case in O-notation, not in any measurements of implementations. This note will occasionally be updated as better algorithms are developed. Citations are to the first paper to give a correct algorithm with the given time bound, though in some cases two are cited if they appeared nearly contemporaneously.
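As a concrete instance of one of the simplest entries in this catalog, unweighted L∞ isotonic regression over a linear order has a linear-time solution that takes, at each position, the midpoint of the running prefix maximum and suffix minimum. The sketch below illustrates this; it is not drawn from the note itself.

def linf_isotonic(y):
    """Unweighted L-infinity isotonic fit on a linear order in O(n):
    midpoint of the running prefix maximum and suffix minimum."""
    n = len(y)
    prefix_max, suffix_min = [0.0] * n, [0.0] * n
    running = float("-inf")
    for i in range(n):
        running = max(running, y[i])
        prefix_max[i] = running
    running = float("inf")
    for i in range(n - 1, -1, -1):
        running = min(running, y[i])
        suffix_min[i] = running
    return [(a + b) / 2.0 for a, b in zip(prefix_max, suffix_min)]

print(linf_isotonic([1.0, 3.0, 2.0, 5.0, 4.0]))  # [1.0, 2.5, 2.5, 4.5, 4.5]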
2010
A new algorithm for isotonic regression is presented based on recursively partitioning the solution space. We develop efficient methods for each partitioning subproblem through an equivalent representation as a network flow problem, and prove that this sequence of partitions converges to the global solution. These network flow problems can further be decomposed in order to solve very large problems. Success of isotonic regression in prediction and our algorithm's favorable computational properties are demonstrated through simulated examples as large as 2 × 10⁵ variables and 10⁷ constraints.
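In the standard formulation, a partitioning step of this kind amounts to a maximum-weight-closure problem, which reduces to a minimum s-t cut. The Python sketch below, assuming the networkx library, illustrates one such split for the least-squares case; the function name, data layout, and choice of library are illustrative assumptions, not the authors' implementation.

import networkx as nx

def split_block(block, y, w, edges):
    """One illustrative partitioning step: split a block of variables that
    currently share a common fitted value into a 'lower' and an 'upper'
    group by solving a maximum-weight closure as a minimum s-t cut."""
    mean = sum(w[v] * y[v] for v in block) / sum(w[v] for v in block)
    gain = {v: w[v] * (y[v] - mean) for v in block}  # >0: prefers a value above the block mean

    G = nx.DiGraph()
    G.add_nodes_from(["s", "t"])
    for v in block:
        if gain[v] > 0:
            G.add_edge("s", v, capacity=gain[v])
        elif gain[v] < 0:
            G.add_edge(v, "t", capacity=-gain[v])
    # order constraints fit[i] <= fit[j]: the upper group must be closed under
    # successors, so these edges carry (implicitly) infinite capacity
    for i, j in edges:
        G.add_edge(i, j)

    _, (source_side, _) = nx.minimum_cut(G, "s", "t")
    upper = [v for v in block if v in source_side]
    lower = [v for v in block if v not in source_side]
    return lower, upper

# Tiny example: a chain 0 <= 1 <= 2 with data [5, 1, 4] splits into {0, 1} and {2}.
print(split_block([0, 1, 2], {0: 5.0, 1: 1.0, 2: 4.0},
                  {0: 1.0, 1: 1.0, 2: 1.0}, [(0, 1), (1, 2)]))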
arXiv (Cornell University), 2021
2009
This paper gives an approach for determining isotonic regressions for data at points in multidimensional space, with the ordering given by domination. Recent algorithmic advances for 2-dimensional isotonic regressions have made them useful for significantly larger data sets, and here we provide an advance for dimensions 3 and larger. Given a set V of n d-dimensional points, it is shown that an isotonic regression on V can be determined in Θ̃(n²), Θ̃(n³), and Θ̃(n) time for the L1, L2, and L∞ metrics, respectively. This improves upon previous results by a factor of Θ̃(n). The core of the approach is in extending the regression to a set of points V′ ⊃ V where the domination ordering on V′ can be represented with relatively few edges.
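For context, the domination (componentwise) ordering that defines the constraints can be built explicitly, but the naive construction may produce Θ(n²) edges, which is exactly the blow-up the auxiliary point set V′ is designed to avoid. The sketch below only shows the naive constraint graph; the helper names are hypothetical and the paper's construction is not reproduced.

from itertools import combinations

def dominates(p, q):
    """True if q dominates p: p[k] <= q[k] in every coordinate."""
    return all(a <= b for a, b in zip(p, q))

def domination_edges(points):
    """All ordered pairs (i, j) with points[i] dominated by points[j];
    each pair is a constraint fit[i] <= fit[j]."""
    edges = []
    for i, j in combinations(range(len(points)), 2):
        if dominates(points[i], points[j]):
            edges.append((i, j))
        elif dominates(points[j], points[i]):
            edges.append((j, i))
    return edges

pts = [(0, 0, 1), (1, 2, 1), (2, 1, 3), (3, 3, 3)]
print(domination_edges(pts))  # [(0, 1), (0, 2), (0, 3), (1, 3), (2, 3)]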
Applied Mathematics and Parallel Computing, 1996
Proceedings of the seventeenth annual ACM-SIAM symposium on Discrete Algorithms - SODA '06, 2006