- extended-abstract, December 2024
Competitive Equilibrium for Chores: from Dual Eisenberg-Gale to a Fast, Greedy, LP-based Algorithm
EC '24: Proceedings of the 25th ACM Conference on Economics and Computation, Page 40. https://doi.org/10.1145/3670865.3673516
We study the computation of competitive equilibrium for Fisher markets with n agents and m divisible chores. Prior work showed that competitive equilibria correspond to the nonzero KKT points of the Nash welfare minimization program, which is a non-...
- research-article, February 2024
A First-Order Primal-Dual Method for Nonconvex Constrained Optimization Based on the Augmented Lagrangian
Mathematics of Operations Research (MOOR), Volume 49, Issue 1, Pages 125–150. https://doi.org/10.1287/moor.2022.1350
Nonlinearly constrained nonconvex and nonsmooth optimization models play an increasingly important role in machine learning, statistics, and data analytics. In this paper, based on the augmented Lagrangian function, we introduce a flexible first-order ...
- research-article, January 2022
First-Order Methods for Problems with $O(1)$ Functional Constraints Can Have Almost the Same Convergence Rate as for Unconstrained Problems
SIAM Journal on Optimization (SIOPT), Volume 32, Issue 3, Pages 1759–1790. https://doi.org/10.1137/20M1371579
First-order methods (FOMs) have recently been applied and analyzed for solving problems with complicated functional constraints. Existing works show that FOMs for functional constrained problems have lower-order convergence rates than those for ...
- research-article, January 2021
Dual Space Preconditioning for Gradient Descent
SIAM Journal on Optimization (SIOPT), Volume 31, Issue 1, Pages 991–1016. https://doi.org/10.1137/19M130858X
The conditions of relative smoothness and relative strong convexity were recently introduced for the analysis of Bregman gradient methods for convex optimization. We introduce a generalized left-preconditioning method for gradient descent and show that ...
- research-article, January 2021
A Primal-Dual Algorithm with Line Search for General Convex-Concave Saddle Point Problems
SIAM Journal on Optimization (SIOPT), Volume 31, Issue 2, Pages 1299–1329. https://doi.org/10.1137/18M1213488
In this paper, we propose a primal-dual algorithm with a novel momentum term using the partial gradients of the coupling function that can be viewed as a generalization of the method proposed by Chambolle and Pock in [Math. Program., 159 (2016), pp. 253--287]...
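As background for this entry, a minimal sketch of the basic Chambolle-Pock primal-dual iteration with fixed step sizes (no line search, which is the paper's actual contribution); the toy lasso problem, function names, and step-size choice below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdhg_lasso(A, b, lam, n_iter=2000):
    """Fixed-step primal-dual (Chambolle-Pock) iteration for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1, viewed as the saddle problem
    min_x max_y <Ax, y> - f*(y) + g(x) with f(z) = 0.5*||z - b||^2 and
    g(x) = lam*||x||_1.  Here f*(y) = 0.5*||y||^2 + <b, y>, so
    prox_{sigma f*}(v) = (v - sigma*b) / (1 + sigma)."""
    m, n = A.shape
    L = np.linalg.norm(A, 2)            # operator norm of the coupling matrix
    tau = sigma = 0.9 / L               # step sizes with tau*sigma*L^2 < 1
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    for _ in range(n_iter):
        # dual ascent step through prox of sigma*f*
        v = y + sigma * (A @ x_bar)
        y = (v - sigma * b) / (1.0 + sigma)
        # primal descent step through prox of tau*g (soft-thresholding)
        x_new = soft_threshold(x - tau * (A.T @ y), tau * lam)
        # extrapolation ("momentum") on the primal iterate
        x_bar = 2.0 * x_new - x
        x = x_new
    return x
```

With `A = I` the minimizer is the soft-thresholding of `b`, which gives a cheap sanity check of the iteration.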
- research-article, January 2018
Relatively Smooth Convex Optimization by First-Order Methods, and Applications
SIAM Journal on Optimization (SIOPT), Volume 28, Issue 1, Pages 333–354. https://doi.org/10.1137/16M1099546
The usual approach to developing and analyzing first-order methods for smooth convex optimization assumes that the gradient of the objective function is uniformly smooth with some Lipschitz constant $L$. However, in many settings the differentiable ...
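As generic background for the Bregman-gradient methods this entry analyzes, a minimal instance with the negative-entropy reference function on the probability simplex, where the Bregman step has a closed form (exponentiated gradient); the step size and the toy linear objective in the usage check are illustrative assumptions, not from the paper:

```python
import numpy as np

def bregman_gradient_simplex(grad_f, x0, step=0.5, n_iter=200):
    """Bregman gradient descent with reference function
    h(x) = sum_i x_i*log(x_i) on the simplex.  The update
        x_{k+1} = argmin_x <grad_f(x_k), x> + (1/step) * D_h(x, x_k)
    has the closed form x_{k+1, i} proportional to
    x_{k, i} * exp(-step * grad_f(x_k)_i)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        w = x * np.exp(-step * grad_f(x))
        x = w / w.sum()                  # renormalize onto the simplex
    return x
```

For a linear objective f(x) = c·x over the simplex, the iterates concentrate on the coordinate with the smallest c_i.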
- article, January 2018
Extragradient Method in Optimization: Convergence and Complexity
Journal of Optimization Theory and Applications (JOPT), Volume 176, Issue 1, Pages 137–162. https://doi.org/10.1007/s10957-017-1200-6
We consider the extragradient method to minimize the sum of two functions, the first one being smooth and the second one convex. Under the Kurdyka–Łojasiewicz assumption, we prove that the sequence produced by the extragradient method converges to a ...
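A minimal sketch of the classical extragradient scheme in the smooth-plus-convex composite setting the abstract describes: a prediction step followed by a correction step, both passed through the prox of the convex term. The step size, prox choice, and the toy ℓ1 problem in the check are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def soft(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def extragradient_composite(grad_f, prox_g, x0, step, n_iter=300):
    """Extragradient iteration for min_x f(x) + g(x), f smooth, g convex:
    a prediction step evaluates the gradient at the current point, then a
    correction step re-uses the gradient at the predicted point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        y = prox_g(x - step * grad_f(x))   # prediction
        x = prox_g(x - step * grad_f(y))   # correction with gradient at y
    return x
```

On f(x) = 0.5*||x - b||^2 and g = λ||·||_1 the minimizer is the soft-thresholding of b, which makes the scheme easy to sanity-check.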
- research-article, June 2017
Katyusha: the first direct acceleration of stochastic gradient methods
STOC 2017: Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing, Pages 1200–1205. https://doi.org/10.1145/3055399.3055448
Nesterov's momentum trick is famously known for accelerating gradient descent, and has been proven useful in building fast iterative algorithms. However, in the stochastic setting, counterexamples exist and prevent Nesterov's momentum from providing ...
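For reference, a sketch of the deterministic Nesterov momentum scheme the abstract refers to — this is the classical accelerated gradient method, not Katyusha itself, whose stochastic "negative momentum" construction is the paper's contribution; the quadratic test problem is an illustrative assumption:

```python
import numpy as np

def nesterov_agd(grad_f, x0, L, n_iter=300):
    """Nesterov's accelerated gradient descent for an L-smooth convex f:
    a gradient step taken from an extrapolated point y, followed by a
    momentum update of y using the classical t_k sequence."""
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad_f(y) / L                       # gradient step at y
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x
```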
- article, January 2017
Distributed stochastic variance reduced gradient methods by sampling extra data with replacement
We study the round complexity of minimizing the average of convex functions under a new setting of distributed optimization where each machine can receive two subsets of functions. The first subset is from a random partition and the second subset is ...
- research-article, January 2017
Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
SIAM Journal on Optimization (SIOPT), Volume 27, Issue 3, Pages 1459–1484. https://doi.org/10.1137/16M1082305
Motivated by big data applications, first-order methods have been extremely popular in recent years. However, naive gradient methods generally converge slowly. Hence, much effort has been made to accelerate various first-order methods. This paper ...
- research-article, January 2017
Invariant Domains Preserving Arbitrary Lagrangian Eulerian Approximation of Hyperbolic Systems with Continuous Finite Elements
SIAM Journal on Scientific Computing (SISC), Volume 39, Issue 2, Pages A385–A414. https://doi.org/10.1137/16M1063034
A conservative invariant domain preserving arbitrary Lagrangian Eulerian method for solving nonlinear hyperbolic systems is introduced. The method is explicit in time, works with continuous finite elements, and is first-order accurate in space. One original ...
- research-article, January 2016
Invariant Domains and First-Order Continuous Finite Element Approximation for Hyperbolic Systems
SIAM Journal on Numerical Analysis (SINUM), Volume 54, Issue 4, Pages 2466–2489. https://doi.org/10.1137/16M1074291
We propose a numerical method for solving general hyperbolic systems in any space dimension using forward Euler time stepping and continuous finite elements on nonuniform grids. The properties of the method are based on the introduction of an artificial ...
- research-article, January 2016
A Subgradient Method for Free Material Design
SIAM Journal on Optimization (SIOPT), Volume 26, Issue 4, Pages 2314–2354. https://doi.org/10.1137/15M1019660
A small improvement in the structure of a material could potentially lower manufacturing costs. Thus, free material design can be formulated as an optimization problem. However, due to its large scale, second-order methods cannot solve the free material ...
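As generic background for this entry, a minimal sketch of the classical subgradient method with diminishing steps on a toy nonsmooth objective — the step schedule, best-iterate tracking, and ℓ1 test problem are illustrative conventions, not the paper's specialized method for free material design:

```python
import numpy as np

def subgradient_method(subgrad, f, x0, c=1.0, n_iter=20000):
    """Subgradient method with diminishing steps a_k = c / sqrt(k + 1).
    The objective need not decrease monotonically along the iterates, so
    the best point seen so far is tracked and returned."""
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(n_iter):
        g = subgrad(x)                          # any subgradient at x
        x = x - (c / np.sqrt(k + 1.0)) * g
        fx = f(x)
        if fx < f_best:
            x_best, f_best = x.copy(), fx
    return x_best, f_best
```

On f(x) = ||x - b||_1 (with subgradient sign(x - b)) the iterates oscillate around b with shrinking amplitude, so the best objective value decays like O(log k / sqrt(k)).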
- research-article, January 2013
Hankel Matrix Rank Minimization with Applications to System Identification and Realization
SIAM Journal on Matrix Analysis and Applications (SIMAX), Volume 34, Issue 3, Pages 946–977. https://doi.org/10.1137/110853996
We introduce a flexible optimization framework for nuclear norm minimization of matrices with linear structure, including Hankel, Toeplitz, and moment structures, and catalog applications from diverse fields under this framework. We discuss various first-...