/neuralmagic/ Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations

Code Link
https://github.com/neuralmagic/sparseml
Description
Quantized recurrent neural networks were evaluated on the Penn Treebank dataset and achieved accuracy comparable to their 32-bit counterparts while using only 4 bits for weights and activations.
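To illustrate the idea behind low-precision weights, here is a minimal sketch of symmetric uniform quantization to 4 bits. This is a generic scheme for illustration only, not the exact method used in the paper or the sparseml repository; the function name and scaling choice are assumptions.

```python
import numpy as np

def quantize_uniform(x, num_bits=4):
    """Symmetric uniform quantize-dequantize (illustrative sketch only;
    not the exact scheme from the paper or sparseml)."""
    qmax = 2 ** (num_bits - 1) - 1            # 7 for 4 bits
    scale = max(np.max(np.abs(x)) / qmax, 1e-8)
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)  # 16 integer levels
    return q * scale                          # back to float for comparison

w = np.array([0.9, -0.35, 0.02, -0.7])
w_q = quantize_uniform(w, num_bits=4)
# Each value lands on one of at most 16 levels, within half a step of the original.
```

With 4 bits there are only 16 representable levels, so the quantization error per weight is bounded by half the step size; the paper's result is that this coarse representation still matches 32-bit accuracy on Penn Treebank.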
Retrieved
2022/06/24
Stars
1133