Arya Mazumdar
Associate Professor of Computer Science
Contact
College of Information and Computer Sciences
University of Massachusetts Amherst
CS 332 | Phone: (413) 545-0359 | email: arya@cs.umass.edu
Assistant: Rachel Lavery | email: lavery@cs.umass.edu
After five wonderful years at UMass, I am moving to UC San Diego soon. This webpage will no longer be maintained. Please proceed to my new webpage.
Research Interests
I work on a range of statistical reconstruction problems, error-correcting codes, information theory, and some foundational topics in machine learning. I am also interested in the trade-offs between computation, communication, and convergence rates of distributed algorithms.
See details.
I am an associate professor in the College of Information and Computer Sciences, an adjunct professor in the Department of Mathematics and Statistics, and affiliated with the Center for Data Science. In 2019-2020, I am on leave from UMass, working as a researcher at Amazon AI and Search in Berkeley, CA.
Updates
I gave talks on Learning Mixtures and Trace Reconstruction at the CCSP Seminar, the Workshop on Inference Problems: Algorithms and Lower Bounds, the TBSI Workshop on Learning Theory, and at SPCOM.
2020 EURASIP JASP Best Paper Award for this paper.
Foundations of Data Science Virtual Talk Series starting Feb 28, 2020. Thanks, NSF.
Four sessions in ITA on Federated Learning. See here.
The TRIPODS Institute for Theoretical Foundations of Data Science now has a website. Multiple postdoc positions are available.
A new NSF HDR TRIPODS award: Institute for Integrated Data Science: A Transdisciplinary Approach to Understanding Fundamental Trade-offs and Theoretical Foundations.
A new NSF award, NSF1909046: New Directions in Clustering: Interactive Algorithms and Statistical Models.
An oral presentation at the ICML 2019 Workshop on Security and Privacy in Machine Learning: Private vqSGD: Vector-Quantized Stochastic Gradient Descent.
A spotlight talk at the ICML 2019 Workshop on Coding Theory for Large-Scale Machine Learning: Reliable Distributed Clustering with Redundant Data Assignment.
For students interested in CMPSCI 690-OP: Optimization in Computer Science (Spring 2019), the course webpage is up.
I gave a talk at the Shannon Channel on Graph Clustering: Variations on the Block Models. Video Link.
I gave an invited talk at Frontiers of Coding Theory at the Information Theory Workshop, Guangzhou, based on this work.
We organized a session on New Directions in Clustering in the Allerton Conference where I gave a talk on Geometric Block Model. Thanks to all the other speakers.
The course webpage for CS 514: Algorithms for Data Science, fall 2018, is up.
I am part of the Workshop on Coding and Information Theory at the Center of Mathematical Sciences and Applications at Harvard, Apr 9-13, and am speaking about LRCs (locally repairable codes).
The Dagstuhl Workshop on Coding Theory for Inference, Learning and Optimization that we organized was a great success. Thanks, attendees.
An oral presentation at SysML Conference, Feb 16, 2018.
Invited talk at Institute for Advanced Study (IAS) on Locally repairable codes, Storage capacity and Index coding, Feb 5, 2018.
Invited talk at Bombay Information Theory Seminar (BITS) on Interactive Algorithms for Clustering and Community Detection, Jan 2018.
A spotlight paper, Semisupervised Clustering, AND-Queries and Locally Encodable Source Coding, is among three papers accepted at NIPS 2017.
Invited talk at AMS Fall Sectional Meeting on Locally repairable codes, Oct 28.
Invited talks at Google (Oct 11) and IMA (Oct 28) on Clustering with an Oracle.
We are organizing the CS Theory Seminar in the fall. Look for updates on this page.
A new NSF award, BSF-NSF1618512: Coding and Information-Theoretic Aspects of Local Data Recovery.
Invited talk at SIAM Discrete Math on locally repairable codes and index coding.
Invited talk at the Shannon Centenary Session of SPCOM and an invited paper, Local Partial Clique Covers and Index Coding.
Elevated to the grade of IEEE Senior Member.
Machine Learning and Friends Lunch talk, Neural Auto-associative Memory via Sparse Recovery, Jan 29, 2016.
New invited paper and talk at Information Theory Workshop, Jeju Island, Oct 11-15, 2015.
NSF CyberBridges Workshop, Aug 31-Sep 1, 2015.
A new NSF grant, NSF1642550 / NSF1526763: Ordinal Data Compression.
Invited Talk on “Recent Results in Local Repair” at Communication Theory Workshop (CTW), May 13, 2015.
Invited Talk on “Associative Memory via Linear Neural Networks” at ITW, Apr 30, 2015.
Invited Talk on “Memoryless Adversary,” at Between Shannon and Hamming: Network Information Theory and Combinatorics, Banff Workshop, Mar 2, 2015.
Invited Talk on “Local Repairability,” at Simons Institute Workshop on Coding, Feb 12, 2015.
Thanks, NSF, for the CAREER award, Reliability in Large-Scale Storage (2015-2020): UMN part, UMass part.
Representative Papers (All Papers)
A. Krishnamurthy, A. Mazumdar, A. McGregor, S. Pal, Sample Complexity of Learning Mixture of Sparse Linear Regressions, NeurIPS 2019. Also a sister paper: Algebraic and Analytic Approaches for Parameter Learning in Mixture Models, in Algorithmic Learning Theory (ALT), 2020. Complex analytic technique for mixtures.
S. Galhotra, A. Mazumdar, S. Pal, B. Saha, Part 1: The Geometric Block Model, AAAI Conference on Artificial Intelligence (AAAI-18), Part 2: Connectivity in Random Annulus Graphs and the Geometric Block Model, Randomization and Computation (RANDOM), 2019. Two papers that introduce the geometric block model.
P. Li, A. Mazumdar, O. Milenkovic, Efficient Rank Aggregation via Lehmer Codes, International Conference on Artificial Intelligence and Statistics (AISTATS), 2017. Shows a new way of rank aggregation.
A. Mazumdar, B. Saha, Part 1: Query Complexity of Clustering with Side Information, Part 2: Clustering with Noisy Queries, Neural Information Processing Systems (NIPS), 2017. Two papers characterizing clustering via crowdsourcing. Also see Part 3, with S. Pal, a spotlight paper in NIPS 2017, a Part 0 in AAAI 2017, and a new Part 4 in NeurIPS 2019.
A. Mazumdar, Nonadaptive Group Testing with Random Set of Defectives, IEEE Transactions on Information Theory, Dec 2016. Answers a basic question about group testing with a construction.
V. Cadambe, A. Mazumdar, Bounds on the Size of Locally Recoverable Codes, IEEE Transactions on Information Theory, Nov 2015. Introduces the shortening bound for locally recoverable codes.
A. Mazumdar, Storage Capacity of Repairable Networks, IEEE Transactions on Information Theory, Nov 2015. Introduces storage capacity.
A. Barg, A. Mazumdar, Rongrong Wang, Restricted Isometry Property of Random Subdictionaries, IEEE Transactions on Information Theory, Aug 2015. Deterministic matrices with statistical RIP properties.
A. Mazumdar, V. Chandar, G. Wornell, Update-Efficiency and Local Repairability Limits for Capacity Approaching Codes, IEEE Journal on Selected Areas in Communications, May 2014. Special issue on Next-Generation Storage.
A. Barg, A. Mazumdar, On the Number of Errors Correctable with Codes on Graphs, IEEE Transactions on Information Theory, Feb 2011. Special Ralf Koetter memorial issue.
A. Barg, A. Mazumdar, Codes in Permutations and Error Correction for Rank Modulation, IEEE Transactions on Information Theory, July 2010. The conference version won the ISIT Jack K. Wolf paper award.
Bio
I am a tenured associate professor in the College of Information and Computer Sciences at UMass, with additional affiliations with the Center for Data Science and the Department of Mathematics and Statistics. I am currently on leave from UMass, working at Amazon AI and Search in the Machine Intelligence and Decision Analytics (MIDAS) team in Berkeley, CA. My research is mainly supported by the NSF via multiple grants, including an NSF CAREER award. Before joining UMass, I was an assistant professor at the University of Minnesota – Twin Cities. I was also a member/long-term visitor (Sep 2014 - Jun 2015) at the Institute for Mathematics and its Applications (IMA) at UMN.
Even before that, I was a postdoctoral scholar (Aug 2011 - Dec 2012) in the Signals, Information and Algorithms Lab, led by Greg Wornell, at MIT.
I received my PhD in 2011 from the University of Maryland, College Park, under the guidance of Alexander Barg. My thesis, Combinatorial Methods in Coding Theory, won a distinguished dissertation fellowship award at UMD, as well as an ISIT Jack K. Wolf paper award. While a graduate student, I spent the summer of 2010 at IBM Almaden, San Jose, CA, and the summer of 2008 at HP Labs, Palo Alto, CA, as an intern. I am currently also serving as an Associate Editor of the IEEE Transactions on Information Theory and as an Area Editor of Foundations and Trends in Communications and Information Theory.
Back in a magical decade, I was born in the then-small town of Siliguri in Darjeeling, West Bengal, India. I spent the wonder years there and in Jadavpur, Kolkata, until I graduated with a B.E. in Electronics and Telecommunication Engineering.