Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content is summarized using Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

CTRQNets & LQNets: Continuous Time Recurrent and Liquid Quantum Neural Networks

Created by
  • Haebom

Authors

Alejandro Antonio Mayorga, Alexander Yuan, Andrew Yuan, Tyler Wooldridge, Xiaodi Wang

Outline

To overcome the limitations of conventional quantum neural networks, whose structures are static, this paper proposes the Liquid Quantum Neural Network (LQNet) and the Continuous-Time Recurrent Quantum Neural Network (CTRQNet), models designed with dynamic intelligence. Both significantly outperform conventional quantum neural networks, achieving up to a 40% accuracy gain on CIFAR-10 binary classification, which suggests they could contribute to understanding the black box of quantum machine learning.
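
The summary describes the architectures only at a high level, but the core pattern behind a continuous-time recurrent quantum model can be sketched as an ODE-style hidden-state update whose nonlinearity comes from a parameterized quantum circuit. The snippet below is a minimal sketch of that idea using PennyLane; the circuit ansatz (AngleEmbedding plus BasicEntanglerLayers), the Euler update rule, and all hyperparameters are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np
import pennylane as qml

N_QUBITS = 4
dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev)
def quantum_cell(angles, weights):
    """Parameterized circuit used as the recurrent nonlinearity (assumed ansatz)."""
    qml.AngleEmbedding(angles, wires=range(N_QUBITS))          # encode h + x as rotation angles
    qml.BasicEntanglerLayers(weights, wires=range(N_QUBITS))   # trainable entangling layers
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

def ctrq_step(h, x, weights, tau=1.0, dt=0.1):
    """One Euler step of an assumed continuous-time recurrent update:
       dh/dt = -h / tau + q(h + x), where q is the quantum circuit above."""
    q_out = np.array(quantum_cell(h + x, weights), dtype=float)
    return h + dt * (-h / tau + q_out)

# Toy rollout over a short random input sequence (illustration only, not CIFAR-10).
rng = np.random.default_rng(0)
weights = rng.normal(size=(2, N_QUBITS))   # shape (n_layers, n_wires) for BasicEntanglerLayers
h = np.zeros(N_QUBITS)
for x in rng.normal(size=(5, N_QUBITS)):
    h = ctrq_step(h, x, weights)
print("final hidden state:", h)
```

In this sketch the "liquid" behavior comes from the leaky continuous-time update (the -h / tau term) rather than a fixed-depth feed-forward pass; in practice the circuit weights would be trained with a gradient-based optimizer on the classification loss.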

Takeaways, Limitations

Takeaways:
  • Presents new models (LQNet, CTRQNet) that overcome the limitations of the static structure of existing quantum neural networks.
  • Achieves up to a 40% accuracy improvement over existing quantum neural networks on CIFAR-10 binary classification.
  • Suggests a potential contribution to understanding the black box of quantum machine learning.
Limitations:
  • Additional evaluation of the proposed models' generalization performance on a wider range of datasets is needed.
  • Implementation and performance verification on actual quantum computers are required.
  • Analysis of the models' complexity and computational cost is needed.