Post-variational classical quantum transfer learning for binary classification

Abstract We address the limitations of variational quantum circuits (VQCs) in hybrid classical-quantum transfer learning by introducing post-variational strategies, which reduce training overhead and mitigate optimization issues. Our approach, Post-Variational Classical-Quantum Transfer Learning (PVCQTL), includes three designs: (1) modified observable construction, (2) a hybrid approach, and (3) a variational-post-variational combination. We evaluate these on pre-trained models (VGG19, ResNet50, ResNet18, MobileNet) with 4 and 8 qubits, with ResNet50 performing best in deepfake detection. Compared with classical models (MLP, ResNet50) and the quantum baselines of a hybrid quantum-classical neural network (HQCNN) and classical-quantum transfer learning (CQTL), PVCQTL consistently achieves better accuracy. The modified-observable variant reaches 85% accuracy on the Deepfake dataset at lower computational cost. To evaluate generalizability, we tested PVCQTL on three additional binary classification datasets, observing improved accuracy on each. We conducted ablation studies to assess the effects of architectural choices in the quantum components, including the choice of quantum gates, the use of fixed ansatz circuits, and observable measurements. Robustness to input noise and sensitivity of the PVCQTL models were examined through ablation studies on learning rate, batch size, and number of qubits. These results demonstrate that PVCQTL offers a measurable improvement over traditional hybrid classical-quantum approaches.

Bibliographic Details
Main Authors: Kavitha Yogaraj, Brian Quanz, Tarun Vikas, Arijit Mondal, Samrat Mondal
Format: Article
Language:English
Published: Nature Portfolio 2025-07-01
Series:Scientific Reports
Subjects:
Online Access:https://doi.org/10.1038/s41598-025-08887-2
author Kavitha Yogaraj
Brian Quanz
Tarun Vikas
Arijit Mondal
Samrat Mondal
collection DOAJ
description Abstract We address the limitations of variational quantum circuits (VQCs) in hybrid classical-quantum transfer learning by introducing post-variational strategies, which reduce training overhead and mitigate optimization issues. Our approach, Post-Variational Classical-Quantum Transfer Learning (PVCQTL), includes three designs: (1) modified observable construction, (2) a hybrid approach, and (3) a variational-post-variational combination. We evaluate these on pre-trained models (VGG19, ResNet50, ResNet18, MobileNet) with 4 and 8 qubits, with ResNet50 performing best in deepfake detection. Compared with classical models (MLP, ResNet50) and the quantum baselines of a hybrid quantum-classical neural network (HQCNN) and classical-quantum transfer learning (CQTL), PVCQTL consistently achieves better accuracy. The modified-observable variant reaches 85% accuracy on the Deepfake dataset at lower computational cost. To evaluate generalizability, we tested PVCQTL on three additional binary classification datasets, observing improved accuracy on each. We conducted ablation studies to assess the effects of architectural choices in the quantum components, including the choice of quantum gates, the use of fixed ansatz circuits, and observable measurements. Robustness to input noise and sensitivity of the PVCQTL models were examined through ablation studies on learning rate, batch size, and number of qubits. These results demonstrate that PVCQTL offers a measurable improvement over traditional hybrid classical-quantum approaches.
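The core post-variational idea described in the abstract is to keep the quantum circuit fixed (no trainable gate parameters) and train only a classical layer over the circuit's measurement outcomes. The following is a minimal illustrative sketch of that idea, not the authors' implementation: all function names and the choice of fixed ansatz angle (0.5) are our own assumptions, and the statevector simulation is written in plain NumPy for self-containedness.

```python
import numpy as np

N_QUBITS = 4  # matches the smaller configuration mentioned in the abstract

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, wire):
    """Apply a one-qubit gate to `wire` of an n-qubit statevector."""
    n = int(np.log2(state.size))
    psi = np.moveaxis(state.reshape([2] * n), wire, 0)
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    return np.moveaxis(psi, 0, wire).reshape(-1)

def apply_cnot(state, control, target):
    """Apply CNOT by flipping the target axis on the control=1 block."""
    n = int(np.log2(state.size))
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1                                 # amplitudes with control in |1>
    t = target if target < control else target - 1   # axis shift after indexing
    psi[tuple(idx)] = np.flip(psi[tuple(idx)], axis=t)
    return psi.reshape(-1)

def quantum_features(x):
    """Fixed (non-trainable) circuit: angle encoding -> CNOT ring -> fixed RY layer.
    Returns the Pauli-Z expectation value of each qubit as a classical feature vector."""
    state = np.zeros(2 ** N_QUBITS)
    state[0] = 1.0
    for w in range(N_QUBITS):                        # encode classical inputs as angles
        state = apply_single(state, ry(x[w]), w)
    for w in range(N_QUBITS):                        # fixed entangling ring
        state = apply_cnot(state, w, (w + 1) % N_QUBITS)
    for w in range(N_QUBITS):                        # fixed ansatz angles (0.5 is arbitrary)
        state = apply_single(state, ry(0.5), w)
    probs = (np.abs(state) ** 2).reshape([2] * N_QUBITS)
    return np.array([np.moveaxis(probs, w, 0)[0].sum()
                     - np.moveaxis(probs, w, 0)[1].sum()
                     for w in range(N_QUBITS)])

def predict(x, weights, bias):
    """Post-variational head: only this classical linear layer is trained."""
    z = weights @ quantum_features(x) + bias
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid probability for binary classification
```

In a transfer-learning setting along the lines the abstract describes, `x` would be a (dimension-reduced) feature vector from a pre-trained CNN such as ResNet50, and `weights`/`bias` would be fit with an ordinary logistic-regression loss; richer post-variational variants measure many observables of the fixed circuit and learn their classical combination.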
format Article
id doaj-art-4a638f351c6f4b77ae55b37ccf4870fc
institution Kabale University
issn 2045-2322
language English
publishDate 2025-07-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj-art-4a638f351c6f4b77ae55b37ccf4870fc 2025-08-20T04:01:24Z
Author affiliations:
Kavitha Yogaraj: IBM Quantum, IBM Research
Brian Quanz: IBM Quantum, IBM Research
Tarun Vikas: Department of Computer Science and Engineering, Indian Institute of Technology
Arijit Mondal: Department of Computer Science and Engineering, Indian Institute of Technology
Samrat Mondal: Department of Computer Science and Engineering, Indian Institute of Technology
title Post-variational classical quantum transfer learning for binary classification
topic Post variational
Quantum transfer learning
Quantum binary classification
url https://doi.org/10.1038/s41598-025-08887-2