- title: Quantum Variational Rewinding for Time Series Anomaly Detection
author: Jack S. Baker
date: 19/01/2023
code: https://github.com/AgnostiqHQ/QuantumVariationalRewinding
paper: https://arxiv.org/abs/2210.16438
color: heavy-rain-gradient
description: |
In this demo, we walk through a select number of examples from the paper Quantum Variational Rewinding for Time
Series Anomaly Detection. Specifically, we demonstrate the detection of anomalous behaviour in bivariate
cryptocurrency time series data and in synthetically generated univariate time series data.
The goal of the tutorial is to have others experiment with the code to create new algorithmic variations, as well
as to expose the advantages of using the heterogeneous workflow manager Covalent in quantum machine learning workflows.
- title: SO2 Emission Prediction from Diesel Engines with Quantum Technology (5G)
author: Emirhan Bulut
date: 16/11/2022
code: https://github.com/emirhanai/SO2-Emission-Prediction-from-Diesel-Engines-with-Quantum-Technology-5G
color: heavy-rain-gradient
description: |
A worldwide study was conducted on the SO2 emission values of diesel engines worldwide (class 1 if emissions
increased compared to the previous year, class 0 if they decreased compared to the previous year, and class 0
for the starting years). In this research, I designed 5G-compatible quantum algorithms. A quantum computer was
used for the process, with the minimum number of qubits so that the model can run on all
computers. Finally, the same data was tested with a classical deep neural network and
a Random Forest algorithm. Based on test accuracy, the quantum algorithm was found to be
more performant than both.
- title: Generalization of Quantum Metric Learning Classifiers
author: Jonathan Kim and Stefan Bekiranov
date: 09/11/2022
code: https://github.com/Rlag1998/Embedding_Generalization
paper: https://doi.org/10.3390/biom12111576
color: heavy-rain-gradient
description: |
This demo is a fork of the previously discontinued Embeddings & Metric Learning demo authored by Maria Schuld and
Aroosa Ijaz in 2020. This new demo uses the ImageNet ants/bees image dataset and the UCI ML Breast Cancer
(Diagnostic) Dataset to assess the generalization limits and performance of quantum metric learning.
Schuld and Ijaz's original code was adapted in numerous ways to attempt to produce good test set results for both
datasets. The ants/bees dataset, which had a high number of initial features per sample, did not lead to good
generalization. Models generalized best on test data when fewer features per sample were used
(as seen in the breast cancer dataset), particularly after feature reduction through principal component analysis.
Ultimately, this demo illustrates that quantum metric learning can lead to accurate test set classification given a
suitable dataset and appropriate data preparation.
- title: Optimizing a Variational Quantum Circuit via Simulated Annealing
author: Mahnoor Fatima
date: 24/08/2022
code: https://github.com/maxwell04-wq/simulated-annealing-pennylane/blob/main/Simulated_Annealing_Tutorial_Pennylane.ipynb
paper: https://projecteuclid.org/journals/statistical-science/volume-8/issue-1/Simulated-Annealing/10.1214/ss/1177011077.full
color: heavy-rain-gradient
description: |
In this tutorial, a variational quantum circuit is optimized using simulated annealing, a stochastic algorithm that approximates the global optimum of the optimization problem.
- title: Continuous Variable Quantum Classifiers - MNIST
author: Sophie Choe
date: 04/06/2022
code: https://github.com/sophchoe/Continous-Variable-Quantum-MNIST-Classifiers
paper: https://github.com/sophchoe/Continous-Variable-Quantum-MNIST-Classifiers/blob/main/MNIST_QNN.pdf
color: heavy-rain-gradient
description: |
We built 8 MNIST classifiers using 2 to 8 qumodes. This family of MNIST classifiers consists of
classical-quantum hybrid circuits built with Keras and PennyLane. The quantum circuit is composed of a data
encoding circuit and a quantum neural network circuit, as proposed in the paper "Continuous variable quantum
neural networks" by Killoran et al. The PennyLane-TensorFlow interface converts the quantum circuit into a Keras
layer, and the whole network is treated as a Keras network, to which Keras' built-in loss functions and optimizers
can be applied for parameter updates. Categorical cross-entropy is used as the loss function and stochastic
gradient descent as the optimizer.
Author affiliation: Portland State University, Electrical and Computer Engineering.
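As a rough illustration of this pattern (not the demo's actual continuous-variable circuit, which runs on qumodes with a Fock-backend device), a minimal qubit-based sketch of wrapping a QNode as a Keras layer might look like the following; the circuit, layer sizes, and hyperparameters are placeholders:
<pre><code>
import pennylane as qml
import tensorflow as tf

n_qubits = 4  # stand-in for the qumodes used in the demo
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    # data-encoding circuit followed by a trainable block
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (3, n_qubits, 3)}
qlayer = qml.qnn.KerasLayer(qnode, weight_shapes, output_dim=n_qubits)

# the quantum layer is treated like any other Keras layer
model = tf.keras.models.Sequential([
    qlayer,
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.SGD(0.1),
              loss="categorical_crossentropy")
</code></pre>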
- title: Weighted SubSpace VQE to find kth excited state energies
author: Jay Patel, Siddharth Patel, and Amit Hirpara
date: 03/07/2022
code: https://github.com/Jay-Patel-257/Qhack-2022/blob/main/SSVQE.ipynb
paper: https://arxiv.org/pdf/1810.09434.pdf
color: heavy-rain-gradient
description: |
The variational quantum eigensolver (VQE) is generally used for finding
the ground-state energy of a given Hamiltonian. To find the kth excited-state
energy of the Hamiltonian, we need to run the VQE optimization
process at least k+1 times, and each time we also need to recompute
the Hamiltonian, taking into account the states found in the previous iterations.
Even then, the accuracy decreases as the value of k increases.
The Subspace Search VQE (SSVQE) algorithm finds the kth excited-state energy of a Hamiltonian in
two subsequent optimization processes.
Research on a more generalized version, namely Weighted SSVQE, shows that by using the weights as hyperparameters we can find the
kth excited-state energy in just a single optimization process. There are two variants of this algorithm:
Weighted SSVQE to find the kth excited-state energy, and Weighted SSVQE to find all
energies up to the kth excited state.
- title: Implementing a unitary quantum perceptron with quantum computing
author: Katerina Gratsea
date: 03/02/2022
code: https://github.com/KaterinaGratsea/Unitary_quantum_perceptron
paper: https://iopscience.iop.org/article/10.1209/0295-5075/125/30004
color: heavy-rain-gradient
description: |
Here, we simulate a unitary quantum perceptron with quantum computing.
The quantum perceptron can be implemented as a single (fast) adiabatic
passage in a model of interacting spins. To demonstrate the learning ability
of the quantum perceptron, we train it to perform the XOR gate and discuss
its power consumption. Author affiliation: ICFO.
- title: quantum Case-Based Reasoning (qCBR) learning by cases
author: Parfait Atchade
date: 12/22/2021
code: https://github.com/pifparfait/qCBR-learning-by-cases
paper: https://arxiv.org/abs/2104.00409
color: heavy-rain-gradient
description: |
A supervised classifier is a program that can predict a label (class) for a new
input object, based on the value of its attributes and on a training set.
The training set consists of labelled data. The main idea of quantum Case-Based Reasoning (qCBR)
is to interpret the statement of the problem as an input object, and the solution to the problem as an output (label).
Therefore, if we have a series of situations (inputs) with their outcomes (labels),
we can train our classifier to determine the solution given a new problem.
- title: Exploring quantum models with a teacher-student scheme
author: Katerina Gratsea and Patrick Huembeli
date: 11/22/2021
code: https://github.com/KaterinaGratsea/Teacher-student_scheme
paper: https://arxiv.org/abs/2105.01477
color: heavy-rain-gradient
description: |
Using PennyLane, we introduce a teacher-student scheme to systematically compare
different Quantum Neural Network (QNN) architectures and to evaluate their relative expressive power.
This scheme avoids training with a specific dataset and compares the learning
capacity of different quantum models.
- title: Hybrid Neural Network using Data-Reuploading technique
author: Nikolaos Schetakis (nikschet)
date: 11/04/2021
code: https://github.com/nsansen/Quantum-Machine-Learning/blob/main/Pennylane%20DEMO%20v4.ipynb
paper: https://link.springer.com/content/pdf/10.1038/s41598-022-14876-6.pdf
color: heavy-rain-gradient
description: |
We combine a standard Variational Classifier with a Data-Reuploading Classifier,
and integrate the resulting QNode as a quantum layer in a Hybrid Neural Network.
- title: Fraud Detection
author: Sophie Choe
date: 10/04/2021
code: https://github.com/sophchoe/Binary_Classification_Pennylane_Keras/blob/main/fraud_detection_Pennylane_Keras.ipynb
paper: https://github.com/sophchoe/Hybrid-Quantum-Classical-MNIST-Classfication-Model/blob/main/QNN.pdf
color: heavy-rain-gradient
description: |
This is a binary classification hybrid model, as proposed in the paper
"Continuous Variable Quantum Neural Networks", composed of 2 classical feed-forward
layers and 4 quantum neural network layers. Using the PennyLane TensorFlow plugin,
the whole network is wrapped as a Keras sequential network, whose parameters are updated via
Keras' built-in loss functions and optimizers.
- title: Quantum-Classical MNIST Classification Model
author: Sophie Choe
date: 09/28/2021
code: https://github.com/sophchoe/Hybrid-Quantum-Classical-MNIST-Classfication-Model
paper: https://github.com/sophchoe/Hybrid-Quantum-Classical-MNIST-Classfication-Model/blob/main/QNN.pdf
color: heavy-rain-gradient
description: |
Keras-PennyLane hybrid model for MNIST classification, inspired by the "Supervised
learning with hybrid networks" section of the paper "Continuous-variable
quantum neural networks".
- title: Hybrid quantum-classical auto encoder
author: Sophie Choe
date: 09/28/2021
code: https://github.com/sophchoe/QML/blob/main/auto_encoder_Pennylane_Keras.ipynb
paper: https://github.com/sophchoe/Hybrid-Quantum-Classical-MNIST-Classfication-Model/blob/main/QNN.pdf
color: heavy-rain-gradient
description: |
Keras-PennyLane implementation of the hybrid quantum-classical auto encoder proposed in the paper
"Continuous-variable quantum neural networks". The loss function used here is the mean squared error,
unlike the paper, which requires state-vector retrieval.
- title: Quantum circuit learning to compute option prices and their sensitivities
author: Takayuki Sakuma
date: 09/16/2021
code: https://github.com/ta641/option_QCL/blob/master/qclop_tutorial.ipynb
paper: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3922040
color: heavy-rain-gradient
description: |
Quantum circuit learning is applied to computing option prices and their sensitivities.
The advantage of this method is that a suitable choice of quantum circuit architecture
makes it possible to compute the sensitivities analytically by applying parameter-shift rules.
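A minimal sketch of the parameter-shift idea, assuming a toy one-qubit model in which a hypothetical input variable <code>s</code> (standing in for the underlying asset price) is angle-encoded and the circuit output plays the role of a rescaled price:
<pre><code>
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, diff_method="parameter-shift")
def price(s, weights):
    qml.RY(s, wires=0)          # encode the input variable
    qml.Rot(*weights, wires=0)  # trainable part of the circuit
    return qml.expval(qml.PauliZ(0))

weights = np.array([0.1, 0.2, 0.3], requires_grad=False)
s = np.array(0.5, requires_grad=True)

# sensitivity d(price)/d(s), evaluated analytically via parameter-shift rules
delta = qml.grad(price, argnum=0)(s, weights)
</code></pre>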
- title: Subspace Search Variational Quantum Eigensolver
author: Shah Ishman Mohtashim, Turbasu Chatterjee, Arnav Das
date: 08/23/2021
code: https://github.com/LegacYFTw/SSVQE
paper: https://arxiv.org/abs/1810.09434
color: heavy-rain-gradient
description: |
The variational quantum eigensolver (VQE) is an algorithm for searching the
ground state of a quantum system. The SSVQE uses a simple technique to find
the excited energy states by transforming the |0⋯0⟩ state to the ground state,
another orthogonal basis state |0⋯1⟩ to the first excited state, and so on.
As a demonstration, the weighted SSVQE is used to find the excited states
of a transverse Ising model with 4 spins and those of the hydrogen molecule.
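The core idea can be sketched as follows; this is a toy two-qubit Hamiltonian and ansatz chosen for illustration, not the Ising or hydrogen systems studied in the demo:
<pre><code>
import pennylane as qml
from pennylane import numpy as np

H = qml.Hamiltonian([1.0, 0.5], [qml.PauliZ(0) @ qml.PauliZ(1), qml.PauliX(0)])
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def energy(params, basis_state):
    # orthogonal input states |00>, |01>, ... are mapped to orthogonal outputs
    qml.BasisState(np.array(basis_state), wires=[0, 1])
    qml.StronglyEntanglingLayers(params, wires=[0, 1])
    return qml.expval(H)

def weighted_cost(params, states=((0, 0), (0, 1)), weights=(2.0, 1.0)):
    # larger weights push lower-lying eigenstates: one run yields the ground and first excited states
    return sum(w * energy(params, s) for w, s in zip(weights, states))

shape = qml.StronglyEntanglingLayers.shape(n_layers=3, n_wires=2)
params = np.random.random(size=shape, requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.1)
for _ in range(100):
    params = opt.step(weighted_cost, params)
</code></pre>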
- title: Quantum PPO/TRPO - LSTMs and memory proximal policy optimization for black-box quantum control
author: Abhilash Majumder
date: 08/10/2021
code: https://colab.research.google.com/drive/1wkZpEpOuZHUdI-vRxQAiDlQD455diFSs?usp=sharing
color: heavy-rain-gradient
description: |
Reinforcement learning for quantum control leverages quantum hybrid circuits
(QHCs) to build and optimize policy networks for deep RL.
Policy-gradient-based reinforcement learning (RL) algorithms are well suited
for optimizing the variational parameters of QAOA in a noise-robust fashion,
opening up the way for developing RL techniques for continuous quantum control.
This is advantageous for mitigating and monitoring the potentially unknown
sources of errors in modern quantum simulators. This demo aims to provide an
implementation of the on-policy PPO algorithm with QHCs for continuous control.
- title: EVA (Exponential Value Approximation) algorithm
author: Guillermo Alonso-Linaje
date: 07/30/2021
code: https://github.com/KetpuntoG/EVA_Tutorial/blob/main/EVA.ipynb
paper: https://arxiv.org/abs/2106.08731
color: heavy-rain-gradient
description: |
VQE is currently one of the most widely used algorithms for optimization
problems on quantum computers. A necessary step in this algorithm is
calculating the expectation value of a given state, which is obtained by
decomposing the Hamiltonian into Pauli operators and measuring this value for
each of them. In this work, we have designed an algorithm capable of estimating
this value using a single circuit. A time-cost study has been carried out, and
it has been found that for certain more complex Hamiltonians it is possible to
outperform the current methods.
- title: Meta-Variational Quantum Eigensolver
author: Nahum Sá
date: 03/27/2021
code: https://github.com/nahumsa/pennylane-notebooks/blob/main/Meta-VQE%20Pennylane.ipynb
color: heavy-rain-gradient
description: |
In this tutorial I follow the <a href="https://arxiv.org/abs/2009.13545" target="_blank">Meta-VQE paper</a>.
The Meta-VQE algorithm is a variational quantum algorithm that is suited for NISQ devices
and encodes parameters of a Hamiltonian into a variational ansatz. We can obtain good
estimations of the ground state of the Hamiltonian by changing only those encoded parameters.
- title: Feature maps for kernel-based quantum classifiers
author: Semyon Sinchenko
date: 03/03/2021
code: https://github.com/SemyonSinchenko/PennylaneQuantumFeatureMaps
color: heavy-rain-gradient
description: |
In this tutorial we implement a few examples of feature maps for kernel-based quantum
machine learning. We'll see how quantum feature maps can make linearly inseparable data
separable after applying a kernel and measuring an observable. We
<a href="https://arxiv.org/abs/1906.10467" target="_blank">follow an article</a>
and implement all the kernel functions with PennyLane.
- title: Variational Quantum Circuits for Deep Reinforcement Learning
author: Samuel Yen-Chi Chen
date: 03/03/2021
code: https://github.com/ycchen1989/Var-QuantumCircuits-DeepRL
paper: https://ieeexplore.ieee.org/abstract/document/9144562
color: heavy-rain-gradient
description: |
This work explores variational quantum circuits for deep reinforcement learning.
Specifically, we recast classical deep reinforcement learning techniques, such as
experience replay and target networks, into a representation of variational quantum
circuits. Moreover, we use a quantum information encoding scheme to reduce the number of
model parameters compared to classical neural networks. To the best of our knowledge,
this work is the first proof-of-principle demonstration of variational quantum circuits
approximating the deep Q-value function for decision-making and policy-selection
reinforcement learning with experience replay and target networks. In addition, our
variational quantum circuits can be deployed on many near-term NISQ machines.
- title: QCNN for Speech Commands Recognition
author: C.-H. Huck Yang
date: 02/03/2021
code: https://github.com/huckiyang/QuantumSpeech-QCNN
paper: https://arxiv.org/abs/2010.13309
color: heavy-rain-gradient
description: |
We train a hybrid quantum convolutional neural network (QCNN) on acoustic data with up to 10,000
features. This model uses layers of random quantum gates to efficiently encode convolutional
features. We perform a neural saliency analysis to provide a classical activation mapping to
compare classical and quantum models, illustrating that the QCNN self-attention model did learn
meaningful representations. An additional connectionist temporal classification (CTC) loss on
character recognition is also provided for continuous speech recognition.
- title: Layerwise learning for quantum neural networks
author: Felipe Oyarce Andrade
date: 26/01/2021
code: https://github.com/felipeoyarce/layerwise-learning
color: heavy-rain-gradient
description: |
In this project we’ve implemented a strategy presented by <a
href="https://arxiv.org/abs/2006.14904" target="_blank">Skolik et al., 2020</a> for
effectively training quantum neural networks. In layerwise learning the
strategy is to gradually increase the number of parameters by adding a few
layers and training them while freezing the parameters of previous layers
already trained. An easy way to understand this technique is to think of it
as dividing the problem into smaller circuits in order to avoid
falling into barren plateaus. We provide a proof-of-concept
implementation of this technique in PennyLane's PyTorch interface for binary
classification on the MNIST dataset.
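A minimal sketch of the freezing pattern in the PyTorch interface, using a toy objective (the circuit's expectation value) rather than the demo's MNIST classification loss; the ansatz and hyperparameters are placeholders:
<pre><code>
import pennylane as qml
import torch

n_qubits, total_layers = 4, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def circuit(inputs, weights):
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))

# one trainable tensor per layer, so earlier layers can be frozen individually
weights = [torch.randn(1, n_qubits, requires_grad=True) for _ in range(total_layers)]
x = torch.rand(n_qubits)

for depth in range(1, total_layers + 1):
    for i, w in enumerate(weights[:depth]):
        w.requires_grad_(i == depth - 1)      # only the newest layer stays trainable
    opt = torch.optim.Adam([weights[depth - 1]], lr=0.1)
    for _ in range(50):
        opt.zero_grad()
        loss = circuit(x, torch.cat(weights[:depth]))  # stack the active layers
        loss.backward()
        opt.step()
</code></pre>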
- title: A Quantum-Enhanced Transformer
author: Riccardo Di Sipio
date: 20/01/2021
code: https://github.com/rdisipio/qtransformer
blog: https://towardsdatascience.com/toward-a-quantum-transformer-a51566ed42c2
color: heavy-rain-gradient
description: |
The Transformer neural network architecture revolutionized the analysis of
text. Here we show an example of a Transformer with quantum-enhanced
multi-headed attention. In the quantum-enhanced version, dense layers are
replaced by simple Variational Quantum Circuits. An implementation based on
PennyLane and TensorFlow-2.x illustrates the basic concept.
- title: A Quantum-Enhanced LSTM Layer
author: Riccardo Di Sipio
date: 18/12/2020
code: https://github.com/rdisipio/qlstm/blob/main/POS_tagging.ipynb
blog: https://towardsdatascience.com/a-quantum-enhanced-lstm-layer-38a8c135dbfa
color: heavy-rain-gradient
description: |
In Natural Language Processing, documents are usually presented as sequences
of words. One of the most successful techniques to manipulate this kind of
data is the Recurrent Neural Network architecture, and in particular a
variant called Long Short-Term Memory (LSTM). Using the PennyLane library
and its PyTorch interface, one can easily define an LSTM network where
Variational Quantum Circuits (VQCs) replace linear operations. An
application to Part-of-Speech tagging is presented in this tutorial.
- title: Quantum Machine Learning Model Predictor for Continuous Variables
author: Roberth Saénz Pérez Alvarado
date: 16/12/2020
code: https://github.com/roberth2018/Quantum-Machine-Learning/blob/main/Quantum_Machine_Learning_Model_Predictor_for_Continuous_Variable_.ipynb
color: heavy-rain-gradient
description: |
According to the paper "Predicting toxicity by quantum machine learning"
(Teppei Suzuki, Michio Katouda 2020), it is possible to predict continuous
variables—like those in the continuous-variable quantum neural network model
described in Killoran et al. (2018)—using 2 qubits per feature. This is
done by applying encodings, variational circuits, and some linear
transformations on expectation values in order to predict values close to
the real target. Based on an <a
href="https://pennylane.ai/qml/demos/quantum_neural_net.html">example</a>
from PennyLane, and using a small dataset which consists of a
one-dimensional feature and one output (so that the processing does not take
too much time), the algorithm showed reliable results.
- title: Trainable Quanvolutional Neural Networks
author: Denny Mattern, Darya Martyniuk, Fabian Bergmann, and Henri Willems
date: 26/11/2020
code: https://github.com/PlanQK/TrainableQuantumConvolution
color: heavy-rain-gradient
description: |
We implement a trainable version of Quanvolutional Neural Networks using
parametrized <code>RandomCircuits</code>. Parameters are optimized using
standard gradient descent. Our code is based on the <a
href="https://pennylane.ai/qml/demos/tutorial_quanvolution.html">Quanvolutional
Neural Networks</a> demo by Andrea Mari. This demo results from our research
as part of the <a href="https://www.planqk.de">PlanQK consortium</a>.
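A rough sketch of the idea for a single 2x2 image patch, assuming <code>qml.RandomLayers</code> as the parametrized random circuit; the shapes and seed are placeholders:
<pre><code>
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def quanv_patch(patch, weights):
    # encode a 2x2 image patch, then apply a parametrized random circuit
    qml.AngleEmbedding(np.pi * patch.flatten(), wires=range(n_qubits))
    qml.RandomLayers(weights, wires=range(n_qubits), seed=42)
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

# unlike a fixed random circuit, these weights are trainable via gradient descent
weights = np.random.uniform(0, 2 * np.pi, size=(1, 4), requires_grad=True)
patch = np.random.random(size=(2, 2), requires_grad=False)
out = quanv_patch(patch, weights)  # four output channels for this patch
</code></pre>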
- title: Using a Keras optimizer for Iris classification with a QNode and loss function
author: Hemant Gahankari
date: 09/11/2020
code: https://colab.research.google.com/drive/17Qri3jUBpjjkhmO6ZZZNXwm511svSVPw?usp=sharing
color: heavy-rain-gradient
description: |
Using PennyLane, we explain how to create a quantum function and train it
using a Keras optimizer directly, i.e., without using a Keras
layer. The objective is to train a quantum function to predict classes of
the Iris dataset.
- title: Linear regression using angle embedding and a single qubit
author: Hemant Gahankari
date: 09/11/2020
code: https://colab.research.google.com/drive/1ABVtBjwcGNNIfmiwEXRdFdZ47K1vZ978?usp=sharing
color: heavy-rain-gradient
description: |
In this example, we create a hybrid neural network (mix of classical and
quantum layers), train it, and get predictions from it. The data set
consists of temperature readings in degrees Centigrade and the corresponding
Fahrenheit readings. The objective is to train a neural network that predicts
Fahrenheit values given Centigrade values.
- title: Amplitude embedding in Iris classification with PennyLane's KerasLayer
author: Hemant Gahankari
date: 09/11/2020
code: https://colab.research.google.com/drive/12ls_GkSD2t0hr3Mx9-qzVvSWxR3-N0WI#scrollTo=4PQTkXpv52vZ
color: heavy-rain-gradient
description: |
Using amplitude embedding from PennyLane, this demonstration aims to explain
how to pass classical data into the quantum function and convert it to quantum
data. It also shows how to create a PennyLane KerasLayer from a QNode, train it
and check the performance of the model.
- title: Angle embedding in Iris classification with PennyLane's KerasLayer
author: Hemant Gahankari
date: 09/11/2020
code: https://colab.research.google.com/drive/13PvS2D8mxBvlNw6_5EapUU2ePKdf_K53#scrollTo=1fJWDX5LxfvB
color: heavy-rain-gradient
description: |
Using angle embedding from PennyLane, this demonstration aims to explain
how to pass classical data into the quantum function and convert it to
quantum data. It also shows how to create a PennyLane KerasLayer from a
QNode, train it and check the performance of the model.
- title: Characterizing the loss landscape of variational quantum circuits
author: Patrick Huembeli and Alexandre Dauphin
date: 30/09/2020
code: https://github.com/PatrickHuembeli/vqc_loss_landscapes
paper: https://arxiv.org/abs/2008.02785
color: heavy-rain-gradient
description: |
Using PennyLane and PyTorch with complex-number support, we compute the Hessian of the loss
function of VQCs and show how to characterize the loss landscape with it. We
show how the Hessian can be used to escape flat regions of the loss
landscape.
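A minimal sketch of computing such a Hessian with PyTorch, for a toy two-parameter circuit rather than the VQCs analyzed in the demo:
<pre><code>
import pennylane as qml
import torch

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="torch", diff_method="backprop")
def cost(params):
    qml.RY(params[0], wires=0)
    qml.RX(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

params = torch.tensor([0.4, 0.9], requires_grad=True)
# 2x2 Hessian of the cost function; its eigenvalues characterize the local loss landscape
hess = torch.autograd.functional.hessian(cost, params)
</code></pre>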