- Yanghao Zhang [email protected]
- Zhiruo Zhang [email protected]
- Yigang Zhou [email protected]
This project presents a probabilistic binary neural network based on the paper published at ICLR. We implemented all of the components described in the paper, including embracing stochasticity during training, a stochastic version of Batch Normalization, and sampling of binary activations. Our experiments reproduce results similar to those reported in the original paper.
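The snippet below is a minimal PyTorch sketch of the two ideas mentioned above, not the repository's actual code: a linear layer with stochastic binary weights whose pre-activations are sampled with the local reparameterization trick, and a straight-through Gumbel-sigmoid sampler for binary activations. The names `StochasticBinaryLinear` and `sample_binary_activation` are illustrative assumptions, not identifiers from this project.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticBinaryLinear(nn.Module):
    """Linear layer with stochastic binary weights in {-1, +1}.

    A real-valued parameter theta defines P(w = +1) = sigmoid(2 * theta),
    so E[w] = tanh(theta) and Var[w] = 1 - E[w]^2. Instead of sampling
    each binary weight, the pre-activation is sampled from a Gaussian
    whose mean and variance follow from the central limit theorem
    (local reparameterization trick).
    """

    def __init__(self, in_features, out_features):
        super().__init__()
        self.theta = nn.Parameter(torch.randn(out_features, in_features) * 0.1)

    def forward(self, x):
        mean_w = torch.tanh(self.theta)      # E[w] = tanh(theta)
        var_w = 1.0 - mean_w ** 2            # Var[w] of a +/-1 weight
        mu = F.linear(x, mean_w)             # pre-activation mean
        sigma2 = F.linear(x ** 2, var_w)     # pre-activation variance
        eps = torch.randn_like(mu)
        return mu + torch.sqrt(sigma2 + 1e-8) * eps


def sample_binary_activation(logits, tau=1.0, hard=True):
    """Sample a {-1, +1} activation with a straight-through
    Gumbel-sigmoid (binary concrete) relaxation so gradients flow."""
    u = torch.rand_like(logits)
    noise = torch.log(u) - torch.log1p(-u)           # logistic noise
    soft = torch.sigmoid((logits + noise) / tau)
    if hard:
        hard_sample = (soft > 0.5).float()
        soft = hard_sample + (soft - soft.detach())  # straight-through estimator
    return 2.0 * soft - 1.0                          # map {0, 1} -> {-1, +1}


if __name__ == "__main__":
    layer = StochasticBinaryLinear(784, 256)
    x = torch.randn(32, 784)
    pre_act = layer(x)                        # stochastic pre-activation
    h = sample_binary_activation(pre_act)     # sampled binary activation
    print(h.shape, h.unique())
```

In this sketch the forward pass stays differentiable: gradients reach `theta` through the Gaussian mean and variance, and the straight-through estimator passes gradients through the hard binarization of the activations.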
Peters, J.W.T., Genewein, T. and Welling, M., 2019. Probabilistic binary neural networks. https://openreview.net/forum?id=B1fysiAqK7
Shayer, O., Levi, D. and Fetaya, E., 2017. Learning discrete weights using the local reparameterization trick. arXiv preprint arXiv:1710.07739. https://openreview.net/pdf?id=BySRH6CpW
Peters, J.W. and Welling, M., 2018. Probabilistic Binary Neural Networks. arXiv preprint arXiv:1809.03368. https://arxiv.org/pdf/1809.03368.pdf
Kingma, D.P., Salimans, T. and Welling, M., 2015. Variational dropout and the local reparameterization trick. In Advances in Neural Information Processing Systems (pp. 2575-2583). https://arxiv.org/pdf/1506.02557.pdf
Binarized Neural Network (BNN) for PyTorch: https://github.com/itayhubara/BinaryNet.pytorch/