This is an improved implementation of the paper [Auto-Encoding Variational Bayes](https://arxiv.org/abs/1312.6114) by Kingma and Welling. It uses ReLU activations and the Adam optimizer instead of sigmoids and Adagrad. These changes make the network converge much faster.
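At the heart of the paper's objective is the closed-form KL divergence between the diagonal-Gaussian posterior and a standard-normal prior, combined with the reparameterization trick that keeps sampling differentiable. The sketch below illustrates both in plain Python (the actual `main.py` operates on PyTorch tensors; this is only a scalar-list illustration of the math):

```python
import math

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian,
    as derived in Appendix B of Kingma & Welling:
    -0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2)."""
    return -0.5 * sum(1 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, logvar))

def reparameterize(mu, logvar, eps):
    """z = mu + sigma * eps: sampling expressed as a deterministic
    function of (mu, logvar) plus external noise eps, so gradients
    can flow through mu and logvar."""
    return [m + math.exp(0.5 * lv) * e for m, lv, e in zip(mu, logvar, eps)]

# A posterior identical to the prior (mu = 0, sigma = 1) incurs zero KL cost:
print(kl_to_standard_normal([0.0, 0.0], [0.0, 0.0]))
```

In training, this KL term is added to a reconstruction loss (binary cross-entropy for MNIST pixels) to form the negative evidence lower bound that the optimizer minimizes.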
Install the required dependencies:

```bash
pip install -r requirements.txt
```

To run the example, execute:

```bash
python main.py
```

If a hardware accelerator device is detected, the example will execute on the accelerator; otherwise, it will run on the CPU.

To force execution on the CPU, use the `--no-accel` command line argument:

```bash
python main.py --no-accel
```

The `main.py` script accepts the following optional arguments:
- `--batch-size`: input batch size for training (default: 128)
- `--epochs`: number of epochs to train (default: 10)
- `--no-accel`: disables the accelerator
- `--seed`: random seed (default: 1)
- `--log-interval`: how many batches to wait before logging training status
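The flags above are typical of an `argparse`-based CLI. The sketch below shows roughly how they might be declared; it is illustrative only, not the actual `main.py` (in particular, the `--log-interval` default of 10 is an assumed placeholder, since the list above does not state one):

```python
import argparse

# Hypothetical reconstruction of the argument parser for illustration.
parser = argparse.ArgumentParser(description="VAE MNIST example")
parser.add_argument("--batch-size", type=int, default=128,
                    help="input batch size for training (default: 128)")
parser.add_argument("--epochs", type=int, default=10,
                    help="number of epochs to train (default: 10)")
parser.add_argument("--no-accel", action="store_true",
                    help="disables the accelerator")
parser.add_argument("--seed", type=int, default=1,
                    help="random seed (default: 1)")
parser.add_argument("--log-interval", type=int, default=10,  # assumed default
                    help="how many batches to wait before logging")

# argparse maps "--batch-size" to args.batch_size, "--no-accel" to args.no_accel, etc.
args = parser.parse_args(["--batch-size", "64", "--no-accel"])
print(args.batch_size, args.no_accel, args.epochs)
```

Flags not given on the command line fall back to their defaults, so `python main.py` alone trains with the values listed above.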