
# Fixed-Point-Training

Low-bit & Hardware-aware Quantization Training

## Experiment Results

### CIFAR10

| Method | Model | Acc (%) | Precision | Dataset |
| --- | --- | --- | --- | --- |
| FP | ResNet-20 | 92.32 | fp32 | CIFAR10 |
| FP8 | ResNet-20 | 92.21 | fp8 | CIFAR10 |
| Unified | ResNet-20 | 91.95 | int8 | CIFAR10 |
| Distributed | ResNet-20 | 92.76 | int8 | CIFAR10 |

| Method | Model | Acc (%) | Precision | Dataset |
| --- | --- | --- | --- | --- |
| FP | MobileNetV2 | 94.39 | fp32 | CIFAR10 |
| DoReFa | MobileNetV2 | 91.03 | int8 | CIFAR10 |
| WAGEUBN | MobileNetV2 | 92.32 | int8 | CIFAR10 |
| SBM | MobileNetV2 | 93.57 | int8 | CIFAR10 |
| CPT | MobileNetV2 | 93.76 | int8 | CIFAR10 |
| Unified | MobileNetV2 | 93.38 | int8 | CIFAR10 |
| Distributed | MobileNetV2 | 94.37 | int8 | CIFAR10 |

| Method | Model | Acc (%) | Precision | Dataset |
| --- | --- | --- | --- | --- |
| FP | InceptionV3 | 94.89 | fp32 | CIFAR10 |
| Unified | InceptionV3 | 95.00 | int8 | CIFAR10 |
| Distributed | InceptionV3 | 95.21 | int8 | CIFAR10 |

### CIFAR100

| Method | Model | Acc (%) | Precision | Dataset |
| --- | --- | --- | --- | --- |
| DoReFa | MobileNetV2 | 70.17 | int8 | CIFAR100 |
| WAGEUBN | MobileNetV2 | 71.45 | int8 | CIFAR100 |
| SBM | MobileNetV2 | 75.28 | int8 | CIFAR100 |
| CPT | MobileNetV2 | 75.65 | int8 | CIFAR100 |

| Method | Model | Acc (%) | Precision | Dataset |
| --- | --- | --- | --- | --- |
| DoReFa | ResNet-74 | 69.31 | int8 | CIFAR100 |
| WAGEUBN | ResNet-74 | 69.61 | int8 | CIFAR100 |
| SBM | ResNet-74 | 71.44 | int8 | CIFAR100 |
| CPT | ResNet-74 | 72.35 | int8 | CIFAR100 |

### ImageNet

## Todo-List

- Full-precision baseline for CIFAR10/CIFAR100
- A simple int8 quantization training framework (see the sketch below)
- CPT baseline for CIFAR10/CIFAR100

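The int8 quantization training listed above is commonly implemented as fake quantization with a straight-through estimator (STE): weights and activations are rounded to int8 and dequantized in the forward pass, while gradients pass through the rounding step unchanged. Below is a minimal PyTorch sketch of that idea; the `FakeQuantize` and `QuantConv2d` names are illustrative assumptions, not this repository's API.

```python
# Minimal sketch of symmetric per-tensor int8 fake-quantization training
# with a straight-through estimator (illustrative; not this repo's API).
import torch
import torch.nn as nn
import torch.nn.functional as F


class FakeQuantize(torch.autograd.Function):
    """Quantize to int8 and dequantize back to float, with an STE backward."""

    @staticmethod
    def forward(ctx, x, num_bits=8):
        qmax = 2 ** (num_bits - 1) - 1                  # 127 for int8
        scale = x.abs().max().clamp(min=1e-8) / qmax    # per-tensor scale
        q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax)
        return q * scale                                # dequantized result

    @staticmethod
    def backward(ctx, grad_output):
        # STE: pass the gradient straight through the rounding/clamping.
        return grad_output, None


class QuantConv2d(nn.Conv2d):
    """Conv2d that fake-quantizes both its input and its weights to int8."""

    def forward(self, x):
        x_q = FakeQuantize.apply(x)
        w_q = FakeQuantize.apply(self.weight)
        return F.conv2d(x_q, w_q, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


if __name__ == "__main__":
    layer = QuantConv2d(3, 16, kernel_size=3, padding=1)
    out = layer(torch.randn(2, 3, 32, 32))   # CIFAR-sized input
    out.sum().backward()                     # gradients flow via the STE
    print(out.shape, layer.weight.grad is not None)
```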