# Quantum Autoencoder

This note describes the quantum autoencoder workflow implemented in `qml.autoencoder`.

The current implementation is intentionally compact and package-oriented:

• structured four-qubit input state families
• a trainable encoder/decoder ansatz
• latent and trash subsystem separation
• compression and reconstruction fidelity reporting

---

# Overview

A quantum autoencoder learns a unitary compression map that preserves the
informative degrees of freedom of a quantum state in a smaller latent subspace.

Rather than predicting labels directly, it learns a transformation that moves
discardable information into a trash subsystem.

---

# Model structure

Let the input state be

$$
|\psi(x)\rangle.
$$

The encoder applies a trainable unitary

$$
|\phi(x,\theta)\rangle
=
U(\theta)|\psi(x)\rangle.
$$

If compression succeeds, the state factorizes approximately as

$$
|\phi(x,\theta)\rangle
\approx
|\tilde{\psi}(x)\rangle_{\mathrm{latent}}
\otimes
|0\rangle_{\mathrm{trash}}.
$$

The implementation retains a configurable number of latent qubits and measures
how often the trash subsystem lands in the all-zero basis state.
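As a rough illustration of this factorization check (plain NumPy, not the package's actual API; the function name and qubit ordering are assumptions), the measurement amounts to summing the probability of every basis state whose trash qubits read all zeros:

```python
import numpy as np

def trash_zero_probability(state, n_qubits=4, latent_qubits=2):
    """Probability that the trash qubits (the last n - latent) are all |0>.

    `state` is a length-2**n_qubits statevector with qubit 0 taken as the
    most significant bit of the basis index.
    """
    amps = state.reshape([2] * n_qubits)  # one axis per qubit
    n_trash = n_qubits - latent_qubits
    # Keep only amplitudes where every trash qubit is 0, then sum |amp|^2.
    trash_zero = amps[(slice(None),) * latent_qubits + (0,) * n_trash]
    return float(np.sum(np.abs(trash_zero) ** 2))

# A product state |00> ⊗ |00> trivially has trash probability 1.
psi = np.zeros(16)
psi[0] = 1.0
print(trash_zero_probability(psi))  # 1.0
```

A perfectly compressed state returns 1.0; any amplitude leaking into the trash subsystem lowers the value.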

---

# Training objective

The training signal is the probability of measuring the trash subsystem in
$|0\rangle^{\otimes k}$, where $k$ is the number of trash qubits.

If

$$
p_{\mathrm{trash}}(0 \cdots 0 \mid x,\theta)
$$

denotes that probability, the loss is

$$
\mathcal{L}(\theta)
=
1 - \mathbb{E}_x \left[p_{\mathrm{trash}}(0 \cdots 0 \mid x,\theta)\right].
$$

Minimizing this loss encourages the encoder to compress the structured state
family into the latent subsystem.
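A minimal sketch of this loss over a batch of encoded statevectors (plain NumPy; the batching convention and function name are illustrative, not the package's API):

```python
import numpy as np

def autoencoder_loss(states, n_qubits=4, latent_qubits=2):
    """1 - mean trash-zero probability over a batch of statevectors."""
    n_trash = n_qubits - latent_qubits
    probs = []
    for state in states:
        amps = state.reshape([2] * n_qubits)
        # Amplitudes whose trash qubits (the last n_trash axes) are all |0>.
        trash_zero = amps[(slice(None),) * latent_qubits + (0,) * n_trash]
        probs.append(np.sum(np.abs(trash_zero) ** 2))
    return 1.0 - float(np.mean(probs))

# A perfectly compressed batch gives zero loss:
# |0000> and |0100> both have their trash qubits in |00>.
batch = [np.eye(16)[0], np.eye(16)[4]]
print(autoencoder_loss(batch))  # 0.0
```

Because the expectation is over the data distribution, in practice the mean is taken over a minibatch of sampled states at each optimization step.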

---

# Reconstruction fidelity

To assess whether useful information is preserved, the workflow also computes a
reconstruction fidelity by applying the decoder

$$
U(\theta)^\dagger
$$

after the encoder, with the trash subsystem reset to $|0\rangle^{\otimes k}$
(without this reset, the decoder would trivially invert the encoder), and
comparing the resulting state to the original state.

This yields two complementary metrics:

• compression fidelity on the trash subsystem
• reconstruction fidelity on the full state
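The reconstruction metric can be sketched as follows (NumPy; this assumes the trash reset is implemented by projecting the trash qubits onto $|0\cdots 0\rangle$ and renormalizing, which is one common convention and may differ from the package's implementation):

```python
import numpy as np

def reconstruction_fidelity(psi, U, n_qubits=4, latent_qubits=2):
    """Encode with U, reset the trash qubits, decode with U^dag,
    and return the overlap fidelity with the input state."""
    n_trash = n_qubits - latent_qubits
    phi = U @ psi                                    # encode
    amps = phi.reshape([2] * n_qubits)
    # Project trash qubits onto |0...0>: keep only those amplitudes.
    keep = (slice(None),) * latent_qubits + (0,) * n_trash
    projected = np.zeros_like(amps)
    projected[keep] = amps[keep]
    projected = projected.reshape(-1)
    norm = np.linalg.norm(projected)
    if norm == 0:
        return 0.0                                   # nothing survives the reset
    recon = U.conj().T @ (projected / norm)          # decode
    return float(abs(np.vdot(psi, recon)) ** 2)

# With U = identity and psi = |0000>, reconstruction is exact.
psi = np.zeros(16)
psi[0] = 1.0
print(reconstruction_fidelity(psi, np.eye(16)))  # 1.0
```

Compression fidelity and reconstruction fidelity agree when compression is perfect, but the reconstruction metric additionally detects whether the latent content itself was distorted.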

---

# Example usage

```python
from qml.autoencoder import run_quantum_autoencoder

result = run_quantum_autoencoder(
    family="correlated",
    n_samples=200,
    n_layers=2,
    latent_qubits=2,
    steps=50,
)
```

Outputs include:

• train/test compression fidelity
• train/test reconstruction fidelity
• learned ansatz parameters
• loss history

When `save=True`, the workflow writes JSON results and generated figures to:

• `results/autoencoder/`
• `images/autoencoder/`

---

# State families

The current implementation provides several synthetic state families:

• `correlated`
• `entangled`
• `hybrid`

These are structured, low-dimensional state families, chosen so that they make
meaningful compression targets for a small autoencoder.
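As a purely hypothetical sketch of what a family like `correlated` might look like (the actual construction inside `qml.autoencoder` may differ): sample a single angle and prepare a product state in which every qubit shares that one parameter, giving a one-dimensional family that is easy to compress.

```python
import numpy as np

def correlated_state(x, n_qubits=4):
    """Hypothetical 'correlated' family: every qubit rotated by the same angle.

    |psi(x)> = (cos(x)|0> + sin(x)|1>)^{⊗ n} is a one-parameter family,
    so it is highly compressible into a small latent subspace.
    """
    single = np.array([np.cos(x), np.sin(x)])
    state = single
    for _ in range(n_qubits - 1):
        state = np.kron(state, single)
    return state

psi = correlated_state(0.3)
print(psi.shape)  # (16,)
```

The key property is that the family's intrinsic dimension is far smaller than the Hilbert space dimension, which is what makes lossless-looking compression into a few latent qubits plausible.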