
A Characterization of Quantum Generative Models

Published: 17 June 2024

Abstract

Quantum generative modeling is a growing area of interest for industry-relevant applications. This work systematically compares a broad range of techniques to guide quantum computing practitioners in deciding which models and methods to use in their applications. We compare two fundamentally different architectural ansätze for parametric quantum circuits: (1) a continuous architecture, which produces continuous-valued data samples, and (2) a discrete architecture, which samples on a discrete grid. We also compare the performance of two data transformations: the min-max transform and the probability integral transform. We use two popular training methods: (1) quantum circuit Born machines (QCBM), and (2) quantum generative adversarial networks (QGAN). We study their performance and tradeoffs as the number of model parameters increases, with a baseline comparison against similarly trained classical neural networks. The study is performed on six low-dimensional synthetic and two real financial data sets. Our two key findings are: (1) for all data sets, our quantum models require a similar number of parameters to their classical counterparts, or fewer; in the extreme case, the quantum models require two orders of magnitude fewer parameters. (2) We empirically find that a variant of the discrete architecture, which learns the copula of the probability distribution, outperforms all other methods.
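Both data transformations mentioned in the abstract map raw samples onto the unit interval before they are presented to the generative model; the probability integral transform additionally produces approximately uniform marginals, which is the representation a copula-based model learns. The sketch below is a minimal illustration of the two transforms in plain Python/NumPy, not the authors' code: the function names, the empirical-CDF estimator for the probability integral transform, and the toy lognormal data are our own illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of the two data
# transformations compared in the paper: min-max scaling and the
# probability integral transform (PIT) via an empirical CDF.
import numpy as np


def min_max_transform(x: np.ndarray) -> np.ndarray:
    """Rescale each feature linearly to the unit interval [0, 1]."""
    x_min, x_max = x.min(axis=0), x.max(axis=0)
    return (x - x_min) / (x_max - x_min)


def probability_integral_transform(x: np.ndarray) -> np.ndarray:
    """Map each feature through its empirical CDF, yielding approximately
    uniform marginals (the representation used by copula-based models)."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)  # ranks 0 .. n-1 per column
    return (ranks + 0.5) / x.shape[0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.lognormal(size=(1000, 2))            # toy heavy-tailed data
    u_mm = min_max_transform(samples)                  # linear rescaling to [0, 1]
    u_pit = probability_integral_transform(samples)    # ~uniform marginals in (0, 1)
    print(u_mm.min(), u_mm.max(), u_pit.mean())        # simple sanity checks
```

In this hypothetical setup, the min-max output preserves the shape of the original distribution (including heavy tails), whereas the PIT output flattens each marginal, leaving only the dependence structure for the model to learn.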


Cited By

  • (2024) On the sample complexity of quantum Boltzmann machine learning. Communications Physics 7(1). DOI: 10.1038/s42005-024-01763-x. Online publication date: 14-Aug-2024.
  • (2024) Benchmarking Quantum Generative Learning: A Study on Scalability and Noise Resilience using QUARK. KI - Künstliche Intelligenz. DOI: 10.1007/s13218-024-00864-7. Online publication date: 19-Aug-2024.


Published In

ACM Transactions on Quantum Computing, Volume 5, Issue 2
June 2024, 230 pages
EISSN: 2643-6817
DOI: 10.1145/3613676

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 17 June 2024
Online AM: 04 April 2024
Accepted: 24 March 2024
Revised: 08 March 2024
Received: 20 February 2023
Published in TQC Volume 5, Issue 2


Author Tags

  1. Quantum machine learning
  2. datasets
  3. neural networks

Qualifiers

  • Review-article


