
[WIP] Quantum CNN

Gopal-Dahale opened this pull request 2 years ago • 15 comments

Summary

Quantum CNN using TorchConnector, following this tutorial.

Details and comments

@adekusar-drl @dlasecki Your help and feedback are much appreciated! This will fix #408 in the future.

Gopal-Dahale avatar Jun 29 '22 17:06 Gopal-Dahale

CLA assistant check
All committers have signed the CLA.

CLAassistant avatar Jun 29 '22 17:06 CLAassistant

@Gopal-Dahale Thanks for opening this PR! I ran the notebook for a few epochs and I don't see good convergence. See the screenshot attached. Could it be that the network is not correct? I have not looked at the QCNN paper though.

adekusar-drl avatar Jul 08 '22 10:07 adekusar-drl

Rechecked the network. Seems fine to me. I have used ListOp([Z]) as the observable, which shows up as PauliOp(Pauli('ZIII')) for 4 qubits. Does this mean that only the last qubit (q3) is measured? Also, shall I use softmax for the last layer, even though the Torch Connector and Hybrid QNNs tutorial does not do so?

Gopal-Dahale avatar Jul 08 '22 14:07 Gopal-Dahale

Yes, only one qubit affects the expectation value. You can try softmax, but you should adjust the network and the training loop accordingly. I noticed that you compute accuracy with gradients enabled, I guess this is a mistake? Although it does not affect the convergence of the model.

adekusar-drl avatar Jul 11 '22 09:07 adekusar-drl
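A minimal illustrative sketch (not code from this PR) of the point above: with Qiskit's little-endian qubit ordering, the observable `'ZIII'` applies Z to qubit 3 only, so the expectation value depends on that single qubit. The snippet uses `qiskit.quantum_info` rather than the opflow classes mentioned in the thread.

```python
# Illustrative only: show that the 'ZIII' observable measures only qubit 3
# in Qiskit's little-endian convention (leftmost character = highest qubit).
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp, Statevector

observable = SparsePauliOp("ZIII")  # Z on qubit 3, identity on qubits 0-2

qc = QuantumCircuit(4)
qc.x(0)  # flipping qubit 0 does not change <ZIII>
print(Statevector.from_instruction(qc).expectation_value(observable))  # (1+0j)

qc = QuantumCircuit(4)
qc.x(3)  # flipping qubit 3 flips the sign of <ZIII>
print(Statevector.from_instruction(qc).expectation_value(observable))  # (-1+0j)
```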

@adekusar-drl Please review the new architecture.

Gopal-Dahale avatar Jul 31 '22 16:07 Gopal-Dahale

@Gopal-Dahale I don't think we can accept a tutorial based on PennyLane. Thanks.

adekusar-drl avatar Aug 02 '22 21:08 adekusar-drl

@adekusar-drl Please review the new architecture, which uses Qiskit and PyTorch only.

Gopal-Dahale avatar Aug 04 '22 06:08 Gopal-Dahale

I see that you have one-qubit circuits/networks. Can you please explain what you have implemented?

adekusar-drl avatar Aug 04 '22 10:08 adekusar-drl

In brief, I am sliding a (3,3) kernel with strides (2,2) over an image of size (8,8). The kernel therefore has 9 inputs. The dot product of the inputs with 9 weights is added to a bias and passed through an $R_y$ gate, i.e. $$R_y\left(b+\sum_{i=0}^{8}w_ix_i\right)$$ This $R_y$ block can then be repeated on the single qubit, or on more qubits (in that case we can have CZ entanglement and measure the last qubit).

Reference: Data re-uploading for a universal quantum classifier

Gopal-Dahale avatar Aug 04 '22 11:08 Gopal-Dahale
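A minimal sketch of the single-qubit convolution described above (not the PR's notebook code; the helper name and the random test image are illustrative): each 3x3 patch is flattened, combined with 9 weights and a bias, and fed into an $R_y$ rotation whose $\langle Z\rangle$ expectation becomes one output pixel.

```python
# Illustrative sketch of a single-qubit "quantum convolution" with a
# data re-uploading style rotation: R_y(b + w . x) per image patch.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Pauli, Statevector

def qconv_patch(patch, weights, bias):
    """Expectation <Z> after R_y(b + w . x) on a single qubit."""
    angle = bias + float(np.dot(weights, patch.flatten()))
    qc = QuantumCircuit(1)
    qc.ry(angle, 0)
    return Statevector.from_instruction(qc).expectation_value(Pauli("Z")).real

# Slide a 3x3 kernel with strides (2, 2) over an 8x8 image.
rng = np.random.default_rng(0)
image = rng.random((8, 8))
weights, bias = rng.standard_normal(9), 0.1

feature_map = np.array(
    [[qconv_patch(image[i : i + 3, j : j + 3], weights, bias)
      for j in range(0, 6, 2)]
     for i in range(0, 6, 2)]
)
print(feature_map.shape)  # (3, 3) output feature map
```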

@adekusar-drl Any feedback?

Gopal-Dahale avatar Aug 06 '22 08:08 Gopal-Dahale

@Gopal-Dahale Sorry for the delay, did not have time to review after you posted the explanation. Will be back shortly.

adekusar-drl avatar Aug 08 '22 11:08 adekusar-drl

@adekusar-drl Any feedback?

The new notebook looks like an example of the power of one qubit rather than a real QCNN, honestly. I don't think this is a good idea for a QCNN tutorial.

adekusar-drl avatar Aug 09 '22 16:08 adekusar-drl

I used one qubit due to the long training time. Would using more than one qubit and a different ansatz (data re-uploading or not) be acceptable? Please also tell me what your expectations are. Also, can we modify this notebook to create a new tutorial on the data re-uploading circuit?

Gopal-Dahale avatar Aug 09 '22 16:08 Gopal-Dahale

I used one qubit due to the long training time. Would using more than one qubit and a different ansatz (data re-uploading or not) be acceptable? Please also tell me what your expectations are.

Classical CNN is a network where you alternate convolutional and pooling layers. Quantum CNN looks similar, but such networks are more flexible in my opinion. Indeed, the expectation is to see some similarities in both types of networks.

Also, can we modify this notebook to create a new tutorial on the data re-uploading circuit?

Not a subject matter expert here, can't advise on this. What can be the goal?

adekusar-drl avatar Aug 09 '22 17:08 adekusar-drl

Classical CNN is a network where you alternate convolutional and pooling layers. Quantum CNN looks similar, but such networks are more flexible in my opinion. Indeed, the expectation is to see some similarities in both types of networks.

It is similar to classical CNNs, but the kernel here is replaced by a quantum circuit. The pooling operation still needs to be classical in my opinion (max or average pooling). So, in the case of a QCNN the network would be: QConv2D $\rightarrow$ Pool $\rightarrow$ QConv2D $\rightarrow$ .... We can also allow skip connections like ResNet (see the sketch after this comment).

Not a subject matter expert here, can't advise on this. What can be the goal?

We can compare the performance of an ansatz with a data re-uploading circuit to demonstrate the high expressibility of data re-uploading circuits. This will also demonstrate that, when aided by a classical subroutine, a single qubit offers sufficient processing power to build a universal quantum classifier. The paper Data re-uploading for a universal quantum classifier can be used as a reference.

Gopal-Dahale avatar Aug 09 '22 17:08 Gopal-Dahale
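A hedged sketch of the proposed QConv2D $\rightarrow$ Pool $\rightarrow$ QConv2D ordering. `QConv2d` here is a hypothetical placeholder: in the actual PR it would evaluate a parameterized circuit per patch (e.g. via TorchConnector), but a classical convolution stands in so the example runs as-is.

```python
# Illustrative layer ordering only; QConv2d is a placeholder for a
# quantum convolution and is NOT the PR's implementation.
import torch
import torch.nn as nn

class QConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size, stride=1):
        super().__init__()
        # A real quantum kernel would run a circuit per patch; a classical
        # Conv2d stands in here to keep the sketch self-contained.
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride=stride)

    def forward(self, x):
        return self.conv(x)

model = nn.Sequential(
    QConv2d(1, 4, kernel_size=3, stride=2),  # quantum "kernel" layer
    nn.AvgPool2d(2),                         # classical pooling, as proposed
    QConv2d(4, 8, kernel_size=2),
    nn.Flatten(),
    nn.Linear(8 * 2 * 2, 2),                 # classical head, 2 classes
)

print(model(torch.rand(1, 1, 16, 16)).shape)  # torch.Size([1, 2])
```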

@adekusar-drl Please give some feedback.

Gopal-Dahale avatar Aug 22 '22 17:08 Gopal-Dahale

@Gopal-Dahale Sorry for the delay. So far I don't like how the QCNN tutorial looks, honestly. It does not look like what a reader may expect from a QCNN; it is not obvious. As for data re-uploading, I don't get what exactly you want to show. And, perhaps, this is a separate topic.

adekusar-drl avatar Aug 22 '22 18:08 adekusar-drl

The QCNN tutorial has been contributed in #462.

adekusar-drl avatar Aug 29 '22 21:08 adekusar-drl