Yu-Hong Dai
Person information
- affiliation: Chinese Academy of Sciences, Institute of Computational Mathematics and Scientific/Engineering Computing
2020 – today
- 2024
- [j89] Rui-Jin Zhang, Xin-Wei Liu, Yu-Hong Dai: IPRSDP: a primal-dual interior-point relaxation algorithm for semidefinite programming. Comput. Optim. Appl. 88(1): 1-36 (2024)
- [j88] Zhihua Allen-Zhao, Fengmin Xu, Yu-Hong Dai, Sanyang Liu: Robust enhanced indexation optimization with sparse industry layout constraint. Comput. Oper. Res. 161: 106420 (2024)
- [j87] Jiang-Yao Luo, Liang Chen, Weikun Chen, Jian-Hua Yuan, Yu-Hong Dai: A cut-and-solve algorithm for virtual machine consolidation problem. Future Gener. Comput. Syst. 154: 359-372 (2024)
- [j86] Yu-Hong Dai, Jiani Wang, Liwei Zhang: Optimality Conditions and Numerical Algorithms for a Class of Linearly Constrained Minimax Optimization Problems. SIAM J. Optim. 34(3): 2883-2916 (2024)
- [j85] Zheyu Wu, Bo Jiang, Ya-Feng Liu, Mingjie Shao, Yu-Hong Dai: Efficient CI-Based One-Bit Precoding for Multiuser Downlink Massive MIMO Systems With PSK Modulation. IEEE Trans. Wirel. Commun. 23(5): 4861-4875 (2024)
- [i21] Na Huang, Yu-Hong Dai, Dominique Orban, Michael A. Saunders: An inexact augmented Lagrangian algorithm for unsymmetric saddle-point systems. CoRR abs/2404.14636 (2024)
- [i20] Wei-Kun Chen, Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: QoS-Aware and Routing-Flexible Network Slicing for Service-Oriented Networks. CoRR abs/2409.13943 (2024)
- 2023
- [j84] Juan Gao, Xinwei Liu, Yu-Hong Dai, Yakui Huang, Junhua Gu: Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization. Comput. Optim. Appl. 84(2): 531-572 (2023)
- [j83] Zhen-Yuan Ji, Yu-Hong Dai: Greedy PSB methods with explicit superlinear convergence. Comput. Optim. Appl. 85(3): 753-786 (2023)
- [j82] Liang Chen, Sheng-Jie Chen, Weikun Chen, Yu-Hong Dai, Tao Quan, Juan Chen: Efficient presolving methods for solving maximal covering and partial set covering location problems. Eur. J. Oper. Res. 311(1): 73-87 (2023)
- [j81] Fengmin Xu, Xuepeng Li, Yu-Hong Dai, Meihua Wang: New insights and augmented Lagrangian algorithm for optimal portfolio liquidation with market impact. Int. Trans. Oper. Res. 30(5): 2640-2664 (2023)
- [j80] Wei-Kun Chen, Liang Chen, Yu-Hong Dai: Lifting for the integer knapsack cover polyhedron. J. Glob. Optim. 86(1): 205-249 (2023)
- [j79] Jiawei Chen, Yu-Hong Dai: Multiobjective optimization with least constraint violation: optimality conditions and exact penalization. J. Glob. Optim. 87(2): 807-830 (2023)
- [j78] Rui-Jin Zhang, Xin-Wei Liu, Yu-Hong Dai: IPRQP: a primal-dual interior-point relaxation algorithm for convex quadratic programming. J. Glob. Optim. 87(2): 1027-1053 (2023)
- [j77] Zi Xu, Ziqi Wang, Jun-Lin Wang, Yu-Hong Dai: Zeroth-Order Alternating Gradient Descent Ascent Algorithms for A Class of Nonconvex-Nonconcave Minimax Problems. J. Mach. Learn. Res. 24: 313:1-313:25 (2023)
- [j76] Yu-Hong Dai, Fangfang Xu, Liwei Zhang: Alternating direction method of multipliers for linear hyperspectral unmixing. Math. Methods Oper. Res. 97(3): 289-310 (2023)
- [j75] Yu-Hong Dai, Liwei Zhang: The augmented Lagrangian method can approximately solve convex optimization with least constraint violation. Math. Program. 200(2): 633-667 (2023)
- [j74] Sheng-Jie Chen, Weikun Chen, Yu-Hong Dai, Jian-Hua Yuan, Hou-Shan Zhang: Efficient presolving methods for the influence maximization problem. Networks 82(3): 229-253 (2023)
- [j73] Na Huang, Yu-Hong Dai, Dominique Orban, Michael A. Saunders: Properties of semi-conjugate gradient methods for solving unsymmetric positive definite linear systems. Optim. Methods Softw. 38(5): 887-913 (2023)
- [j72] Na Huang, Yu-Hong Dai, Dominique Orban, Michael A. Saunders: On GSOR, the Generalized Successive Overrelaxation Method for Double Saddle-Point Problems. SIAM J. Sci. Comput. 45(5): 2185- (2023)
- [j71] Juan Gao, Xinwei Liu, Yu-Hong Dai, Yakui Huang, Peng Yang: A Family of Distributed Momentum Methods Over Directed Graphs With Linear Convergence. IEEE Trans. Autom. Control. 68(2): 1085-1092 (2023)
- [j70] Weikun Chen, Ya-Feng Liu, Fan Liu, Yu-Hong Dai, Zhi-Quan Luo: Towards Efficient Large-Scale Network Slicing: An LP Dynamic Rounding-and-Refinement Approach. IEEE Trans. Signal Process. 71: 615-630 (2023)
- [c11] Zheyu Wu, Ya-Feng Liu, Bo Jiang, Yu-Hong Dai: Efficient Quantized Constant Envelope Precoding for Multiuser Downlink Massive MIMO Systems. ICASSP 2023: 1-5
- [c10] Wei-Kun Chen, Ya-Feng Liu, Rui-Jin Zhang, Yu-Hong Dai, Zhi-Quan Luo: An Efficient Decomposition Algorithm for Large-Scale Network Slicing. SPAWC 2023: 171-175
- [i19] Wei-Kun Chen, Ya-Feng Liu, Rui-Jin Zhang, Yu-Hong Dai, Zhi-Quan Luo: An Efficient Decomposition Algorithm for Large-Scale Network Slicing. CoRR abs/2306.15247 (2023)
- 2022
- [j69] Juan Gao, Xin-Wei Liu, Yu-Hong Dai, Yakui Huang, Peng Yang: Achieving geometric convergence for distributed optimization with Barzilai-Borwein step sizes. Sci. China Inf. Sci. 65(4) (2022)
- [j68] Yakui Huang, Yu-Hong Dai, Xin-Wei Liu, Hongchao Zhang: On the acceleration of the Barzilai-Borwein method. Comput. Optim. Appl. 81(3): 717-740 (2022)
- [j67] Liang Chen, Yu-Hong Dai, Zhou Wei: Sufficient conditions for existence of global minimizers of functions on Hilbert spaces. J. Glob. Optim. 84(1): 137-147 (2022)
- [j66] Yakui Huang, Yu-Hong Dai, Xinwei Liu, Hongchao Zhang: On the Asymptotic Convergence and Acceleration of Gradient Methods. J. Sci. Comput. 90(1): 7 (2022)
- [j65] Xin-Wei Liu, Yu-Hong Dai, Yakui Huang: A primal-dual interior-point relaxation method with global and rapidly local convergence for nonlinear programs. Math. Methods Oper. Res. 96(3): 351-382 (2022)
- [j64] Xinwei Liu, Yu-Hong Dai, Yakui Huang, Jie Sun: A novel augmented Lagrangian method of multipliers for optimization with general inequality constraints. Math. Comput. 92(341): 1301-1330 (2022)
- [c9] Wei-Kun Chen, Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Optimal QoS-Aware Network Slicing for Service-Oriented Networks with Flexible Routing. ICASSP 2022: 5288-5292
- [c8] Zheyu Wu, Bo Jiang, Ya-Feng Liu, Yu-Hong Dai: A Novel Negative ℓ1 Penalty Approach for Multiuser One-Bit Massive MIMO Downlink with PSK Signaling. ICASSP 2022: 5323-5327
- [i18] Na Huang, Yu-Hong Dai, Dominique Orban, Michael A. Saunders: A semi-conjugate gradient method for solving unsymmetric positive definite linear systems. CoRR abs/2206.02951 (2022)
- [i17] Na Huang, Yu-Hong Dai, Dominique Orban, Michael A. Saunders: On GSOR, the Generalized Successive Overrelaxation Method for Double Saddle-Point Problems. CoRR abs/2208.07499 (2022)
- [i16] Zheyu Wu, Ya-Feng Liu, Bo Jiang, Yu-Hong Dai: Efficient Quantized Constant Envelope Precoding for Multiuser Downlink Massive MIMO Systems. CoRR abs/2210.14534 (2022)
- [i15] Zi Xu, Ziqi Wang, Jun-Lin Wang, Yu-Hong Dai: Zeroth-Order Alternating Gradient Descent Ascent Algorithms for a Class of Nonconvex-Nonconcave Minimax Problems. CoRR abs/2211.13668 (2022)
- [i14] Huiling Zhang, Jun-Lin Wang, Zi Xu, Yu-Hong Dai: Primal Dual Alternating Proximal Gradient Algorithms for Nonsmooth Nonconvex Minimax Problems with Coupled Linear Constraints. CoRR abs/2212.04672 (2022)
- [i13] Jiang-Yao Luo, Liang Chen, Wei-Kun Chen, Jian-Hua Yuan, Yu-Hong Dai: A Cut-and-solve Algorithm for Virtual Machine Consolidation Problem. CoRR abs/2212.12341 (2022)
- 2021
- [j63] Zhi-Long Dong, Jiming Peng, Fengmin Xu, Yu-Hong Dai: On some extended mixed integer optimization models of the Eisenberg-Noe model in systemic risk management. Int. Trans. Oper. Res. 28(6): 3014-3037 (2021)
- [j62] Liang Chen, Weikun Chen, Mu-Ming Yang, Yu-Hong Dai: An exact separation algorithm for unsplittable flow capacitated network design arc-set polyhedron. J. Glob. Optim. 81(3): 659-689 (2021)
- [j61] Tengteng Yu, Xinwei Liu, Yu-Hong Dai, Jie Sun: Stochastic Variance Reduced Gradient Methods Using a Trust-Region-Like Scheme. J. Sci. Comput. 87(1): 5 (2021)
- [j60] Hui Zhang, Yu-Hong Dai, Lei Guo, Wei Peng: Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions. Math. Oper. Res. 46(1): 61-81 (2021)
- [j59] Yakui Huang, Yu-Hong Dai, Xinwei Liu: Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property. SIAM J. Optim. 31(4): 3068-3096 (2021)
- [j58] Tengteng Yu, Xinwei Liu, Yu-Hong Dai, Jie Sun: A Minibatch Proximal Stochastic Recursive Gradient Algorithm Using a Trust-Region-Like Scheme and Barzilai-Borwein Stepsizes. IEEE Trans. Neural Networks Learn. Syst. 32(10): 4627-4638 (2021)
- [j57] Weikun Chen, Ya-Feng Liu, Antonio De Domenico, Zhi-Quan Luo, Yu-Hong Dai: Optimal Network Slicing for Service-Oriented Networks With Flexible Routing and Guaranteed E2E Latency. IEEE Trans. Netw. Serv. Manag. 18(4): 4337-4352 (2021)
- [c7] Weikun Chen, Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: An Efficient Linear Programming Rounding-and-Refinement Algorithm for Large-Scale Network Slicing Problem. ICASSP 2021: 4735-4739
- [i12] Weikun Chen, Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: An efficient linear programming rounding-and-refinement algorithm for large-scale network slicing problem. CoRR abs/2102.02563 (2021)
- [i11] Wei-Kun Chen, Ya-Feng Liu, Fan Liu, Yu-Hong Dai, Zhi-Quan Luo: Towards Efficient Large-Scale Network Slicing: An LP Rounding-and-Refinement Approach. CoRR abs/2107.14404 (2021)
- [i10] Wei-Kun Chen, Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Optimal QoS-Aware Network Slicing for Service-Oriented Networks with Flexible Routing. CoRR abs/2110.03915 (2021)
- [i9] Zheyu Wu, Bo Jiang, Ya-Feng Liu, Yu-Hong Dai: A Novel Negative ℓ1 Penalty Approach for Multiuser One-Bit Massive MIMO Downlink with PSK Signaling. CoRR abs/2110.04768 (2021)
- [i8] Zheyu Wu, Bo Jiang, Ya-Feng Liu, Yu-Hong Dai: CI-Based One-Bit Precoding for Multiuser Downlink Massive MIMO Systems with PSK Modulation: A Negative ℓ1 Penalty Approach. CoRR abs/2110.11628 (2021)
- 2020
- [j56] Zexian Liu, Hongwei Liu, Yu-Hong Dai: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization. Comput. Optim. Appl. 75(1): 145-167 (2020)
- [j55] Zhongwen Chen, Yu-Hong Dai, Jiangyan Liu: A penalty-free method with superlinear convergence for equality constrained optimization. Comput. Optim. Appl. 76(3): 801-833 (2020)
- [j54] Zhibao Li, Ka Fai Cedric Yiu, Yu-Hong Dai, Sven Nordholm: Distributed LCMV beamformer design by randomly permuted ADMM. Digit. Signal Process. 106: 102820 (2020)
- [j53] Zhi-Long Dong, Fengmin Xu, Yu-Hong Dai: Fast algorithms for sparse portfolio selection considering industries and investment styles. J. Glob. Optim. 78(4): 763-789 (2020)
- [j52] Xinwei Liu, Yu-Hong Dai: A globally convergent primal-dual interior-point relaxation method for nonlinear programs. Math. Comput. 89(323): 1301-1329 (2020)
- [j51] Yu-Hong Dai, Xin Liu, Jiawang Nie, Zaiwen Wen: Preface. Optim. Methods Softw. 35(2): 221-222 (2020)
- [j50] Yakui Huang, Yu-Hong Dai, Xinwei Liu, Hongchao Zhang: Gradient methods exploiting spectral properties. Optim. Methods Softw. 35(4): 681-705 (2020)
- [j49] Yu-Hong Dai, Florian Jarre, Felix Lieder: On the existence of affine invariant descent directions. Optim. Methods Softw. 35(5): 938-954 (2020)
- [j48] Wei Peng, Yu-Hong Dai, Hui Zhang, Lizhi Cheng: Training GANs with centripetal acceleration. Optim. Methods Softw. 35(5): 955-973 (2020)
- [i7] Weikun Chen, Ya-Feng Liu, Antonio De Domenico, Zhi-Quan Luo, Yu-Hong Dai: Optimal Network Slicing for Service-Oriented Networks with Flexible Routing and Guaranteed E2E Latency. CoRR abs/2006.13019 (2020)
- [i6] Tengteng Yu, Xinwei Liu, Yu-Hong Dai, Jie Sun: A variable metric mini-batch proximal stochastic recursive gradient algorithm with diagonal Barzilai-Borwein stepsize. CoRR abs/2010.00817 (2020)
2010 – 2019
- 2019
- [j47] Yu-Hong Dai, Yakui Huang, Xinwei Liu: A family of spectral gradient methods for optimization. Comput. Optim. Appl. 74(1): 43-65 (2019)
- [j46] Zhibao Li, Yu-Hong Dai, Huan Gao: Alternating projection method for a class of tensor equations. J. Comput. Appl. Math. 346: 490-504 (2019)
- [j45] Na Huang, Yu-Hong Dai, QiYa Hu: Uzawa methods for a class of block three-by-three saddle-point problems. Numer. Linear Algebra Appl. 26(6) (2019)
- 2018
- [j44] Fengmin Xu, Meihua Wang, Yu-Hong Dai, Dachuan Xu: A sparse enhanced indexation model with chance and cardinality constraints. J. Glob. Optim. 70(1): 5-25 (2018)
- [j43] Weikun Chen, Liang Chen, Mu-Ming Yang, Yu-Hong Dai: Generalized coefficient strengthening cuts for mixed integer programming. J. Glob. Optim. 70(1): 289-306 (2018)
- [j42] Wanyou Cheng, Yu-Hong Dai: Gradient-based method with active set strategy for ℓ1 optimization. Math. Comput. 87(311): 1283-1305 (2018)
- 2017
- [j41] Rui Diao, Ya-Feng Liu, Yu-Hong Dai: A new fully polynomial time approximation scheme for the interval subset sum problem. J. Glob. Optim. 68(4): 749-775 (2017)
- [i5] Rui Diao, Ya-Feng Liu, Yu-Hong Dai: A New Fully Polynomial Time Approximation Scheme for the Interval Subset Sum Problem. CoRR abs/1704.06928 (2017)
- 2016
- [j40] Zhongwen Chen, Yu-Hong Dai: A line search exact penalty method with bi-object strategy for nonlinear constrained optimization. J. Comput. Appl. Math. 300: 245-258 (2016)
- [j39] Ya-Feng Liu, Shiqian Ma, Yu-Hong Dai, Shuzhong Zhang: A smoothing SQP framework for a class of composite Lq minimization over polyhedron. Math. Program. 158(1-2): 467-500 (2016)
- [c6] Conghui Tan, Shiqian Ma, Yu-Hong Dai, Yuqiu Qian: Barzilai-Borwein Step Size for Stochastic Gradient Descent. NIPS 2016: 685-693
- [i4] Conghui Tan, Shiqian Ma, Yu-Hong Dai, Yuqiu Qian: Barzilai-Borwein Step Size for Stochastic Gradient Descent. CoRR abs/1605.04131 (2016)
- 2015
- [j38] Cai-Xia Kou, Yu-Hong Dai: A Modified Self-Scaling Memoryless Broyden-Fletcher-Goldfarb-Shanno Method for Unconstrained Optimization. J. Optim. Theory Appl. 165(1): 209-224 (2015)
- [j37] Bo Jiang, Yu-Hong Dai: A framework of constraint preserving update schemes for optimization on Stiefel manifold. Math. Program. 153(2): 535-575 (2015)
- [j36] Chun-Lin Hao, Chun-Feng Cui, Yu-Hong Dai: A sequential subspace projection method for extreme Z-eigenvalues of supersymmetric tensors. Numer. Linear Algebra Appl. 22(2): 283-298 (2015)
- [j35] Ya-Feng Liu, Yu-Hong Dai, Shiqian Ma: Joint Power and Admission Control: Non-Convex Lq Approximation and An Effective Polynomial Time Deflation Approach. IEEE Trans. Signal Process. 63(14): 3641-3656 (2015)
- 2014
- [j34] Yi-Qing Hu, Chun-Lin Hao, Yu-Hong Dai: Projected gradient algorithms for optimization over order simplices. Optim. Methods Softw. 29(5): 1090-1117 (2014)
- [j33] Chun-Feng Cui, Yu-Hong Dai, Jiawang Nie: All Real Eigenvalues of Symmetric Tensors. SIAM J. Matrix Anal. Appl. 35(4): 1582-1601 (2014)
- [j32] Ya-Feng Liu, Yu-Hong Dai: On the Complexity of Joint Subcarrier and Power Allocation for Multi-User OFDMA Systems. IEEE Trans. Signal Process. 62(3): 583-596 (2014)
- 2013
- [j31] Yu-Hong Dai: A perfect example for the BFGS method. Math. Program. 138(1-2): 501-530 (2013)
- [j30] Bo Jiang, Yu-Hong Dai: Feasible Barzilai-Borwein-like methods for extreme symmetric eigenvalue problems. Optim. Methods Softw. 28(4): 756-784 (2013)
- [j29] Yu-Hong Dai, Cai-Xia Kou: A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search. SIAM J. Optim. 23(1): 296-320 (2013)
- [j28] Ya-Feng Liu, Mingyi Hong, Yu-Hong Dai: Max-Min Fairness Linear Transceiver Design Problem for a Multi-User SIMO Interference Channel is Polynomial Time Solvable. IEEE Signal Process. Lett. 20(1): 27-30 (2013)
- [j27] Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Joint Power and Admission Control via Linear Programming Deflation. IEEE Trans. Signal Process. 61(6): 1327-1338 (2013)
- [j26] Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Max-Min Fairness Linear Transceiver Design for a Multi-User MIMO Interference Channel. IEEE Trans. Signal Process. 61(9): 2413-2423 (2013)
- [c5] Ya-Feng Liu, Yu-Hong Dai: Joint power and admission control via ℓp norm minimization deflation. ICASSP 2013: 4789-4793
- [i3] Ya-Feng Liu, Yu-Hong Dai: Joint power and admission control via ℓp norm minimization deflation. CoRR abs/1303.1633 (2013)
- [i2] Ya-Feng Liu, Yu-Hong Dai, Shiqian Ma: Joint Power and Admission Control: Non-Convex Approximation and An Efficient Polynomial Time Deflation Approach. CoRR abs/1311.3045 (2013)
- 2012
- [j25] Zi Xu, Yu-Hong Dai: New stochastic approximation algorithms with adaptive step sizes. Optim. Lett. 6(8): 1831-1846 (2012)
- [c4] Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Joint power and admission control via linear programming deflation. ICASSP 2012: 2873-2876
- [i1] Ya-Feng Liu, Yu-Hong Dai: On the Complexity of Joint Subcarrier and Power Allocation for Multi-User OFDMA Systems. CoRR abs/1212.5024 (2012)
- 2011
- [j24] Yu-Hong Dai: Convergence of conjugate gradient methods with constant stepsizes. Optim. Methods Softw. 26(6): 895-909 (2011)
- [j23] Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Coordinated Beamforming for MISO Interference Channel: Complexity Analysis and Efficient Algorithms. IEEE Trans. Signal Process. 59(3): 1142-1157 (2011)
- [c3] Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: Max-Min Fairness Linear Transceiver Design for a Multi-User MIMO Interference Channel. ICC 2011: 1-5
- 2010
- [j22] Yun-Shan Fu, Yu-Hong Dai: Improved Projected Gradient Algorithms for Singly Linearly Constrained Quadratic Programs Subject to Lower and Upper Bounds. Asia Pac. J. Oper. Res. 27(1): 71-84 (2010)
- [c2] Ya-Feng Liu, Yu-Hong Dai, Zhi-Quan Luo: On the complexity of optimal coordinated downlink beamforming. ICASSP 2010: 3274-3277
2000 – 2009
- 2009
- [j21] Gaohang Yu, Liqun Qi, Yu-Hong Dai: On Nonmonotone Chambolle Gradient Projection Algorithms for Total Variation Image Restoration. J. Math. Imaging Vis. 35(2): 143-154 (2009)
- [c1] Fuxin Li, Yun-Shan Fu, Yu-Hong Dai, Cristian Sminchisescu, Jue Wang: Kernel Learning by Unconstrained Optimization. AISTATS 2009: 328-335
- 2008
- [j20] Liping Wang, Yu-Hong Dai: Left conjugate gradient method for non-Hermitian linear systems. Numer. Linear Algebra Appl. 15(10): 891-909 (2008)
- 2007
- [j19] Yi-Qing Hu, Yu-Hong Dai: Inexact Barzilai-Borwein method for saddle point problems. Numer. Linear Algebra Appl. 14(4): 299-317 (2007)
- 2006
- [j18] Y. H. Dai, X. Q. Yang: A New Gradient Method with an Optimal Stepsize Property. Comput. Optim. Appl. 33(1): 73-88 (2006)
- [j17] Bin Zhou, Li Gao, Yu-Hong Dai: Gradient Methods with Adaptive Step-Sizes. Comput. Optim. Appl. 35(1): 69-86 (2006)
- [j16] Yu-Hong Dai, Roger Fletcher: New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds. Math. Program. 106(3): 403-421 (2006)
- [j15] Yu-Hong Dai: Fast Algorithms for Projection on an Ellipsoid. SIAM J. Optim. 16(4): 986-1006 (2006)
- 2005
- [j14] Yu-Hong Dai, Roger Fletcher: On the asymptotic behaviour of some new gradient methods. Math. Program. 103(3): 541-559 (2005)
- [j13] Yu-Hong Dai, Roger Fletcher: Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming. Numerische Mathematik 100(1): 21-47 (2005)
- 2004
- [j12] Yu-Hong Dai, Li-Zhi Liao, Duan Li: On Restart Procedures for the Conjugate Gradient Method. Numer. Algorithms 35(2-4): 249-260 (2004)
- 2003
- [j11] Yu-Hong Dai: A family of hybrid conjugate gradient methods for unconstrained optimization. Math. Comput. 72(243): 1317-1328 (2003)
- [j10] Yu-Hong Dai, José Mario Martínez, Jin Yun Yuan: An increasing-angle property of the conjugate gradient method and the implementation of large-scale minimization algorithms with line searches. Numer. Linear Algebra Appl. 10(4): 323-334 (2003)
- 2002
- [j9] Yu-Hong Dai, Jin Yun Yuan, Ya-Xiang Yuan: Modified Two-Point Stepsize Gradient Methods for Unconstrained Optimization. Comput. Optim. Appl. 22(1): 103-109 (2002)
- [j8] Yu-Hong Dai: Convergence Properties of the BFGS Algoritm. SIAM J. Optim. 13(3): 693-701 (2002)
- 2001
- [j7] Y. H. Dai, Ya-Xiang Yuan: An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization. Ann. Oper. Res. 103(1-4): 33-47 (2001)
- [j6] Yu-Hong Dai, Ya-Xiang Yuan: A three-parameter family of nonlinear conjugate gradient methods. Math. Comput. 70(235): 1155-1167 (2001)
- [j5] Yu-Hong Dai, Hongchao Zhang: Adaptive Two-Point Stepsize Gradient Algorithm. Numer. Algorithms 27(4): 377-385 (2001)
- [j4] Yu-Hong Dai: New properties of a nonlinear conjugate gradient method. Numerische Mathematik 89(1): 83-98 (2001)
- 2000
- [j3] Yu-Hong Dai, Jiye Han, Guanghui Liu, Defeng Sun, Hongxia Yin, Ya-Xiang Yuan: Convergence Properties of Nonlinear Conjugate Gradient Methods. SIAM J. Optim. 10(2): 345-358 (2000)
1990 – 1999
- 1999
- [j2] Yu-Hong Dai, Ya-Xiang Yuan: Global convergence of the method of shortest residuals. Numerische Mathematik 83(4): 581-598 (1999)
- [j1] Y. H. Dai, Ya-Xiang Yuan: A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property. SIAM J. Optim. 10(1): 177-182 (1999)
last updated on 2024-10-16 20:31 CEST by the dblp team
all metadata released as open data under CC0 1.0 license