This repository contains the data and code for the paper "Understanding quantum machine learning also requires rethinking generalization", published in Nature Communications (also available on arXiv). The code relies on the following packages: TensorCircuit (GitHub) and Qibo (GitHub). Please ensure that these packages are installed before running the code.
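As a quick, illustrative sanity check (not part of the repository), you can confirm that both dependencies are importable before running any of the scripts:

```python
# Illustrative dependency check: confirms TensorCircuit and Qibo are installed.
import tensorcircuit as tc
import qibo

print("TensorCircuit version:", tc.__version__)
print("Qibo version:", qibo.__version__)
```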
The repository is organized into folders corresponding to the different experiments conducted. Three primary code files can be executed:
- `main_code.py`: This file trains and executes the quantum convolutional neural network from scratch for the different experiments. It accepts the following arguments (a sketch of this interface appears after the list):
  - `--training_data` (int): Training data size. It supports `5`, `8`, `10`, `14`, and `20`. Default = `5`.
  - `--accuracy_training` (int): Minimum training accuracy that must be achieved. The code performs a new random initialization and retrains if the accuracy falls below this threshold. Default = `100`.
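The exact argument handling lives in `main_code.py` itself; the following is only an illustrative sketch of the interface described above, using `argparse` with the documented names, supported values, and defaults (everything else is assumed):

```python
# Illustrative sketch of main_code.py's command-line interface as documented
# above; the actual parsing code in the repository may differ.
import argparse

parser = argparse.ArgumentParser(
    description="Train the quantum convolutional neural network from scratch."
)
parser.add_argument(
    "--training_data", type=int, default=5, choices=[5, 8, 10, 14, 20],
    help="Training data size.",
)
parser.add_argument(
    "--accuracy_training", type=int, default=100,
    help="Minimum training accuracy; re-initialize and retrain if the "
         "achieved accuracy falls below this threshold.",
)
args = parser.parse_args()
print(f"training_data={args.training_data}, accuracy_training={args.accuracy_training}")
```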
Note that executing `main_code.py` can be computationally demanding. To reproduce the results presented in the paper, consider running the following files instead:
- `accuracy_train.py`: Run this file to obtain the training accuracy using the best parameters determined by the authors. It accepts the same `--training_data` argument as `main_code.py`.
- `accuracy_test.py`: Run this file to obtain the test accuracy using the best parameters determined by the authors. Again, it accepts the same `--training_data` argument as `main_code.py` (see the example after this list).
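For example, both reproduction scripts could be driven from Python as follows. This is a minimal sketch, not part of the repository; it assumes the scripts sit in the current working directory and accept `--training_data` as described above.

```python
# Minimal sketch: run the authors' reproduction scripts for a single
# training-set size. Assumes the scripts are in the current directory.
import subprocess
import sys

n_train = 10  # any of the supported sizes: 5, 8, 10, 14, 20
for script in ("accuracy_train.py", "accuracy_test.py"):
    subprocess.run(
        [sys.executable, script, "--training_data", str(n_train)],
        check=True,
    )
```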
To run the code, follow these steps:

- Install the required packages: TensorCircuit and Qibo.
- Choose the appropriate code file based on your requirements:
  - To train and execute the quantum convolutional neural network from scratch, run `main_code.py`.
  - To reproduce the paper's results, execute `accuracy_train.py` on the training set and `accuracy_test.py` on the test set.
- Set the desired values for the arguments `--training_data` and `--accuracy_training` (if applicable) to customize the execution.
- Execute the chosen code file, ensuring the required packages are accessible. An example invocation is shown below.
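For instance, a from-scratch training run with custom argument values might be launched like this (again only a sketch; `main_code.py` is assumed to be in the current directory, and the argument names and values come from the descriptions above):

```python
# Illustrative launch of a from-scratch training run with custom arguments.
import subprocess
import sys

subprocess.run(
    [
        sys.executable,
        "main_code.py",
        "--training_data", "20",
        "--accuracy_training", "100",
    ],
    check=True,
)
```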
For further assistance or inquiries, please refer to the paper or contact the authors directly.