Michal Derezinski
Publications
2020 – today
2024
- [j4] Michal Derezinski, Elizaveta Rebrova: Sharp Analysis of Sketch-and-Project Methods via a Connection to Randomized Singular Value Decomposition. SIAM J. Math. Data Sci. 6(1): 127-153 (2024)
- [c29] Michal Derezinski, Michael W. Mahoney: Recent and Upcoming Developments in Randomized Numerical Linear Algebra for Machine Learning. KDD 2024: 6470-6479
- [c28] Shabarish Chenakkod, Michal Derezinski, Xiaoyu Dong, Mark Rudelson: Optimal Embedding Dimension for Sparse Subspace Embeddings. STOC 2024: 1106-1117
- [c27] Michal Derezinski, Jiaming Yang: Solving Dense Linear Systems Faster Than via Preconditioning. STOC 2024: 1118-1129
- [i40] Yongyi Yang, Jiaming Yang, Wei Hu, Michal Derezinski: HERTA: A High-Efficiency and Rigorous Training Algorithm for Unfolded Graph Neural Networks. CoRR abs/2403.18142 (2024)
- [i39] Sachin Garg, Albert S. Berahas, Michal Derezinski: Second-order Information Promotes Mini-Batch Robustness in Variance-Reduced Gradients. CoRR abs/2404.14758 (2024)
- [i38] Sachin Garg, Kevin Tan, Michal Derezinski: Distributed Least Squares in Small Space via Sketching and Bias Reduction. CoRR abs/2405.05343 (2024)
- [i37] Michal Derezinski, Daniel LeJeune, Deanna Needell, Elizaveta Rebrova: Fine-grained Analysis and Faster Algorithms for Iteratively Solving Linear Systems. CoRR abs/2405.05818 (2024)
- [i36] Michal Derezinski, Christopher Musco, Jiaming Yang: Faster Linear Systems and Matrix Norm Approximation via Multi-level Sketched Preconditioning. CoRR abs/2405.05865 (2024)
- [i35] Ruichen Jiang, Michal Derezinski, Aryan Mokhtari: Stochastic Newton Proximal Extragradient Method. CoRR abs/2406.01478 (2024)
- [i34] Michal Derezinski, Michael W. Mahoney: Recent and Upcoming Developments in Randomized Numerical Linear Algebra for Machine Learning. CoRR abs/2406.11151 (2024)

2023
- [j3] Sen Na, Michal Derezinski, Michael W. Mahoney: Hessian averaging in stochastic Newton methods achieves superlinear convergence. Math. Program. 201(1): 473-520 (2023)
- [c26] Michal Derezinski: Algorithmic Gaussianization through Sketching: Converting Data into Sub-gaussian Random Designs. COLT 2023: 3137-3172
- [i33] Riley Murray, James Demmel, Michael W. Mahoney, N. Benjamin Erichson, Maksim Melnichenko, Osman Asif Malik, Laura Grigori, Piotr Luszczek, Michal Derezinski, Miles E. Lopes, Tianyu Liang, Hengrui Luo, Jack J. Dongarra: Randomized Numerical Linear Algebra: A Perspective on the Field With an Eye to Software. CoRR abs/2302.11474 (2023)
- [i32] Younghyun Cho, James Weldon Demmel, Michal Derezinski, Haoyun Li, Hengrui Luo, Michael W. Mahoney, Riley J. Murray: Surrogate-based Autotuning for Randomized Sketching Algorithms in Regression Problems. CoRR abs/2308.15720 (2023)
- [i31] Shabarish Chenakkod, Michal Derezinski, Xiaoyu Dong, Mark Rudelson: Optimal Embedding Dimension for Sparse Subspace Embeddings. CoRR abs/2311.10680 (2023)
- [i30] Michal Derezinski, Jiaming Yang: Solving Dense Linear Systems Faster than via Preconditioning. CoRR abs/2312.08893 (2023)

2022
- [j2] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Unbiased estimators for random design regression. J. Mach. Learn. Res. 23: 167:1-167:46 (2022)
- [c25] Nima Anari, Michal Derezinski, Thuy-Duong Vuong, Elizabeth Yang: Domain Sparsification of Discrete Distributions Using Entropic Independence. ITCS 2022: 5:1-5:23
- [i29] Sen Na, Michal Derezinski, Michael W. Mahoney: Hessian Averaging in Stochastic Newton Methods Achieves Superlinear Convergence. CoRR abs/2204.09266 (2022)
- [i28] Michal Derezinski: Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches. CoRR abs/2206.02702 (2022)
- [i27] Michal Derezinski: Algorithmic Gaussianization through Sketching: Converting Data into Sub-gaussian Random Designs. CoRR abs/2206.10291 (2022)
- [i26] Michal Derezinski, Elizaveta Rebrova: Sharp Analysis of Sketch-and-Project Methods via a Connection to Randomized Singular Value Decomposition. CoRR abs/2208.09585 (2022)

2021
- [c24] Xue Chen, Michal Derezinski: Query complexity of least absolute deviation regression via robust uniform convergence. COLT 2021: 1144-1179
- [c23] Michal Derezinski, Zhenyu Liao, Edgar Dobriban, Michael W. Mahoney: Sparse sketches with small inversion bias. COLT 2021: 1467-1510
- [c22] Michal Derezinski, Rajiv Khanna, Michael W. Mahoney: Improved Guarantees and a Multiple-descent Curve for Column Subset Selection and the Nystrom Method (Extended Abstract). IJCAI 2021: 4765-4769
- [c21] Michal Derezinski, Jonathan Lacotte, Mert Pilanci, Michael W. Mahoney: Newton-LESS: Sparsification without Trade-offs for the Sketched Newton Update. NeurIPS 2021: 2835-2847
- [c20] Vipul Gupta, Avishek Ghosh, Michal Derezinski, Rajiv Khanna, Kannan Ramchandran, Michael W. Mahoney: LocalNewton: Reducing communication rounds for distributed learning. UAI 2021: 632-642
- [i25] Xue Chen, Michal Derezinski: Query Complexity of Least Absolute Deviation Regression via Robust Uniform Convergence. CoRR abs/2102.02322 (2021)
- [i24] Vipul Gupta, Avishek Ghosh, Michal Derezinski, Rajiv Khanna, Kannan Ramchandran, Michael W. Mahoney: LocalNewton: Reducing Communication Bottleneck for Distributed Learning. CoRR abs/2105.07320 (2021)
- [i23] Michal Derezinski, Jonathan Lacotte, Mert Pilanci, Michael W. Mahoney: Newton-LESS: Sparsification without Trade-offs for the Sketched Newton Update. CoRR abs/2107.07480 (2021)
- [i22] Nima Anari, Michal Derezinski, Thuy-Duong Vuong, Elizabeth Yang: Domain Sparsification of Discrete Distributions using Entropic Independence. CoRR abs/2109.06442 (2021)

2020
- [c19] Mojmir Mutny, Michal Derezinski, Andreas Krause: Convergence Analysis of Block Coordinate Algorithms with Determinantal Sampling. AISTATS 2020: 3110-3120
- [c18] Michal Derezinski, Feynman T. Liang, Michael W. Mahoney: Bayesian experimental design using regularized determinantal point processes. AISTATS 2020: 3197-3207
- [c17] Nima Anari, Michal Derezinski: Isotropy and Log-Concave Polynomials: Accelerated Sampling and High-Precision Counting of Matroid Bases. FOCS 2020: 1331-1344
- [c16] Daniele Calandriello, Michal Derezinski, Michal Valko: Sampling from a k-DPP without looking at all items. NeurIPS 2020
- [c15] Michal Derezinski, Burak Bartan, Mert Pilanci, Michael W. Mahoney: Debiasing Distributed Second Order Optimization with Surrogate Sketching and Scaled Regularization. NeurIPS 2020
- [c14] Michal Derezinski, Rajiv Khanna, Michael W. Mahoney: Improved guarantees and a multiple-descent curve for Column Subset Selection and the Nystrom method. NeurIPS 2020
- [c13] Michal Derezinski, Feynman T. Liang, Zhenyu Liao, Michael W. Mahoney: Precise expressions for random projections: Low-rank approximation and randomized Newton. NeurIPS 2020
- [c12] Michal Derezinski, Feynman T. Liang, Michael W. Mahoney: Exact expressions for double descent and implicit regularization via surrogate random design. NeurIPS 2020
- [i21] Michal Derezinski, Rajiv Khanna, Michael W. Mahoney: Improved guarantees and a multiple-descent curve for the Column Subset Selection Problem and the Nyström method. CoRR abs/2002.09073 (2020)
- [i20] Nima Anari, Michal Derezinski: Isotropy and Log-Concave Polynomials: Accelerated Sampling and High-Precision Counting of Matroid Bases. CoRR abs/2004.09079 (2020)
- [i19] Michal Derezinski, Michael W. Mahoney: Determinantal Point Processes in Randomized Numerical Linear Algebra. CoRR abs/2005.03185 (2020)
- [i18] Michal Derezinski, Feynman T. Liang, Zhenyu Liao, Michael W. Mahoney: Precise expressions for random projections: Low-rank approximation and randomized Newton. CoRR abs/2006.10653 (2020)
- [i17] Daniele Calandriello, Michal Derezinski, Michal Valko: Sampling from a k-DPP without looking at all items. CoRR abs/2006.16947 (2020)
- [i16] Michal Derezinski, Burak Bartan, Mert Pilanci, Michael W. Mahoney: Debiasing Distributed Second Order Optimization with Surrogate Sketching and Scaled Regularization. CoRR abs/2007.01327 (2020)
- [i15] Michal Derezinski, Zhenyu Liao, Edgar Dobriban, Michael W. Mahoney: Sparse sketches with small inversion bias. CoRR abs/2011.10695 (2020)
2010 – 2019
2019
- [c11] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Correcting the bias in least squares regression with volume-rescaled sampling. AISTATS 2019: 944-953
- [c10] Michal Derezinski: Fast determinantal point processes via distortion-free intermediate sampling. COLT 2019: 1029-1049
- [c9] Michal Derezinski, Kenneth L. Clarkson, Michael W. Mahoney, Manfred K. Warmuth: Minimax experimental design: Bridging the gap between statistical and worst-case approaches to least squares regression. COLT 2019: 1050-1069
- [c8] Michal Derezinski, Michael W. Mahoney: Distributed estimation of the inverse Hessian by determinantal averaging. NeurIPS 2019: 11401-11411
- [c7] Michal Derezinski, Daniele Calandriello, Michal Valko: Exact sampling of determinantal point processes with sublinear time preprocessing. NeurIPS 2019: 11542-11554
- [i14] Michal Derezinski, Kenneth L. Clarkson, Michael W. Mahoney, Manfred K. Warmuth: Minimax experimental design: Bridging the gap between statistical and worst-case approaches to least squares regression. CoRR abs/1902.00995 (2019)
- [i13] Michal Derezinski, Michael W. Mahoney: Distributed estimation of the inverse Hessian by determinantal averaging. CoRR abs/1905.11546 (2019)
- [i12] Michal Derezinski, Daniele Calandriello, Michal Valko: Exact sampling of determinantal point processes with sublinear time preprocessing. CoRR abs/1905.13476 (2019)
- [i11] Michal Derezinski, Feynman T. Liang, Michael W. Mahoney: Bayesian experimental design using regularized determinantal point processes. CoRR abs/1906.04133 (2019)
- [i10] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Unbiased estimators for random design regression. CoRR abs/1907.03411 (2019)
- [i9] Mojmír Mutný, Michal Derezinski, Andreas Krause: Convergence Analysis of the Randomized Newton Method with Determinantal Sampling. CoRR abs/1910.11561 (2019)
- [i8] Michal Derezinski, Feynman T. Liang, Michael W. Mahoney: Exact expressions for double descent and implicit regularization via surrogate random design. CoRR abs/1912.04533 (2019)

2018
- [b1] Michal Derezinski: Volume sampling for linear regression. University of California, Santa Cruz, USA, 2018
- [j1] Michal Derezinski, Manfred K. Warmuth: Reverse Iterative Volume Sampling for Linear Regression. J. Mach. Learn. Res. 19: 23:1-23:39 (2018)
- [c6] Michal Derezinski, Manfred K. Warmuth: Subsampling for Ridge Regression via Regularized Volume Sampling. AISTATS 2018: 716-725
- [c5] Michal Derezinski, Dhruv Mahajan, S. Sathiya Keerthi, S. V. N. Vishwanathan, Markus Weimer: Batch-Expansion Training: An Efficient Optimization Framework. AISTATS 2018: 736-744
- [c4] Michal Derezinski, Khashayar Rohanimanesh, Aamer Hydrie: Discovering Surprising Documents with Context-Aware Word Representations. IUI 2018: 31-35
- [c3] Michal Derezinski, Manfred K. Warmuth, Daniel J. Hsu: Leveraged volume sampling for linear regression. NeurIPS 2018: 2510-2519
- [i7] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Tail bounds for volume sampled linear regression. CoRR abs/1802.06749 (2018)
- [i6] Michal Derezinski, Manfred K. Warmuth: Reverse iterative volume sampling for linear regression. CoRR abs/1806.01969 (2018)
- [i5] Michal Derezinski, Manfred K. Warmuth, Daniel Hsu: Correcting the bias in least squares regression with volume-rescaled sampling. CoRR abs/1810.02453 (2018)
- [i4] Michal Derezinski: Fast determinantal point processes via distortion-free intermediate sampling. CoRR abs/1811.03717 (2018)

2017
- [c2] Michal Derezinski, Manfred K. Warmuth: Unbiased estimates for linear regression via volume sampling. NIPS 2017: 3084-3093
- [i3] Michal Derezinski, Dhruv Mahajan, S. Sathiya Keerthi, S. V. N. Vishwanathan, Markus Weimer: Batch-Expansion Training: An Efficient Optimization Paradigm for Machine Learning. CoRR abs/1704.06731 (2017)
- [i2] Michal Derezinski, Manfred K. Warmuth: Unbiased estimates for linear regression via volume sampling. CoRR abs/1705.06908 (2017)
- [i1] Michal Derezinski, Manfred K. Warmuth: Subsampling for Ridge Regression via Regularized Volume Sampling. CoRR abs/1710.05110 (2017)

2014
- [c1] Michal Derezinski, Manfred K. Warmuth: The limits of squared Euclidean distance regularization. NIPS 2014: 2807-2815
last updated on 2024-10-07 21:22 CEST by the dblp team
all metadata released as open data under CC0 1.0 license