
TensorLy: Tensor Learning in Python

Published: 01 January 2019

Abstract

Tensors are higher-order extensions of matrices. While matrix methods form the cornerstone of traditional machine learning and data analysis, tensor methods have been gaining increasing traction. However, software support for tensor operations is not on the same footing. To bridge this gap, we have developed TensorLy, a Python library that provides a high-level API for tensor methods and deep tensorized neural networks. TensorLy aims to follow the same standards adopted by the main projects of the Python scientific community and to integrate seamlessly with them. Its BSD license makes it suitable for both academic and commercial applications. TensorLy's backend system lets users perform computations with several libraries, such as NumPy or PyTorch, and scale them across multiple CPU or GPU machines. In addition, using a deep-learning framework as the backend makes it easy to design and train deep tensorized neural networks. TensorLy is available at https://github.com/tensorly/tensorly.
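
As a brief, hedged illustration of the backend system described in the abstract (this sketch is not taken from the paper, and exact API details such as the return type of parafac vary between TensorLy releases), the following Python snippet selects a backend and runs a CP decomposition:

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Select the computational backend; 'pytorch' (if installed) can replace
# 'numpy' to run the same code on GPU with automatic differentiation.
tl.set_backend('numpy')

# Wrap a random third-order array as a tensor in the active backend.
tensor = tl.tensor(np.random.random_sample((10, 10, 10)))

# Rank-5 CP (CANDECOMP/PARAFAC) decomposition. Depending on the release,
# parafac returns either a list of factor matrices or a
# (weights, factors) CP tensor object.
factors = parafac(tensor, rank=5)
print(factors)

With the 'pytorch' backend, the same script runs unchanged, which is what allows tensor decompositions to be combined with deep tensorized networks as described above.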



Published In

The Journal of Machine Learning Research  Volume 20, Issue 1
January 2019
3071 pages
ISSN:1532-4435
EISSN:1533-7928

Publisher

JMLR.org

Publication History

Published: 01 January 2019
Revised: 01 October 2018
Published in JMLR Volume 20, Issue 1



