This repository contains the implementation of DyPP and its application to accelerating the convergence of QNNs on the MNIST classification problem, as described in:
- Paper : DyPP: Dynamic Parameter Prediction to Accelerate Convergence of Variational Quantum Algorithms
- Authors : Satwik Kundu, Debarshi Kundu and Swaroop Ghosh
To install, execute the following commands:
$ git clone https://github.com/satwik-kundu/dynamic-parameter-prediction.git
$ cd dynamic-parameter-prediction
$ pip install -r requirements.txt
In this work, we use a modified convolutional autoencoder (autoencoder.py) to reduce the feature dimension of the original images. To generate a reduced MNIST feature set of dimension 32, run:
$ python autoencoder.py --dimension 32
For d = 32, the above command generates the reduced dataset and stores it in a CSV file named latent_mnist_test_d32.csv.
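The generated CSV can then be loaded for training. The exact column layout of latent_mnist_test_d32.csv is an assumption here (class label first, followed by the latent features); the snippet below only sketches the parsing step using an inline mock file:

```python
import csv
import io

# Hypothetical layout for latent_mnist_test_d32.csv: one row per image,
# class label in the first column, latent features after it.
# This layout is an assumption for illustration, not confirmed by the repo.
mock_csv = "7,0.12,0.05,-0.33\n2,0.91,-0.27,0.08\n"  # 3 features for brevity

labels, features = [], []
for row in csv.reader(io.StringIO(mock_csv)):
    labels.append(int(row[0]))                    # class label
    features.append([float(x) for x in row[1:]])  # latent feature vector

print(labels)            # [7, 2]
print(len(features[0]))  # 3
```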
The QNN circuit builder has been modified from our previous work, and the individual circuit blocks are implemented in app. The PyTorch implementation of the QNN model is defined in qnn_torch.py, where:
| Parameter | Description | Value Range | 
|---|---|---|
| qubit | Number of qubits | Any | 
| enc | Features encoded per qubit | |
| pqc | Parameterized quantum circuit architectures | |
| layers | Number of pqc layers | Any | 
| meas | Measurement operations | | 
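The qubit and enc parameters are coupled: if each qubit encodes enc features, a d-dimensional input needs roughly d/enc qubits. The exact relation used in qnn_torch.py is an assumption; the arithmetic below is only illustrative:

```python
import math

def qubits_needed(dim: int, enc: int) -> int:
    """Qubits required to encode a dim-dimensional input,
    assuming `enc` features are packed into each qubit."""
    return math.ceil(dim / enc)

# For the d=32 reduced MNIST features:
print(qubits_needed(32, 1))  # 32
print(qubits_needed(32, 2))  # 16
print(qubits_needed(32, 3))  # 11 (rounded up)
```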
The hyperparameters for our parameter prediction technique are also defined in qnn_torch.py, where:
| Parameter | Description | Value Range | 
|---|---|---|
| p | Prediction interval | |
| d | Prediction distance | |
| k | Proportionality constant | 0.0001 (constant) | 
| dr | Decay rate | 0.95 (constant) | 
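The roles of these hyperparameters can be illustrated with a simplified sketch. The actual DyPP scheme fits a curve to the recent parameter trajectory; the plain linear extrapolation below is not the repository's implementation, only a way to see how d, k, and dr interact when a prediction is made every p epochs:

```python
import numpy as np

def predict_parameters(theta_prev, theta_curr, d, k, step, dr):
    """Simplified sketch of parameter prediction.

    Pushes each parameter along its recent update direction by a
    distance controlled by d (prediction distance) and k
    (proportionality constant), with the jump size decayed by dr
    at each successive prediction step. Illustrative only.
    """
    delta = theta_curr - theta_prev   # recent update direction
    scale = d * k * (dr ** step)      # decayed jump magnitude
    return theta_curr + scale * delta

theta_prev = np.array([0.10, 0.20])
theta_curr = np.array([0.15, 0.18])
theta_pred = predict_parameters(theta_prev, theta_curr,
                                d=10, k=0.0001, step=0, dr=0.95)
print(theta_pred)
```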
After setting the hyperparameters, train a QNN model on the dataset latent_mnist_test_d32.csv by choosing the encoding and qubit count accordingly and running:
$ python qnn_torch.py latent_mnist_test_d32.csv
If this project contributes to your research, please cite our paper:
@article{kundu2023dypp,
  title={DyPP: Dynamic Parameter Prediction to Accelerate Convergence of Variational Quantum Algorithms},
  author={Kundu, Satwik and Kundu, Debarshi and Ghosh, Swaroop},
  journal={arXiv preprint arXiv:2307.12449},
  year={2023}
}
