Releases · VainF/Torch-Pruning
v1.1.9: Bug Fixes
- Bug fixes for ViT, max_sparsity, etc.
v1.1.8: Serialization for Pruned Models
- Experimental feature: save and load pruned models with tp.state_dict and tp.load_state_dict.
- Bug fixes

Full Changelog: v1.1.7...v1.1.8
v1.1.7: Pruning via Taylor Expansion
- Added tp.importance.TaylorImportance
- Support for PruningHistory
- Pruning & post-training for YOLOv8
- Bug fixes
v1.1.6
v1.1.5: Fixed a backward error
Update setup.py
v1.1.4
v1.1.3: BatchNorm / InstanceNorm / LayerNorm / GroupNorm
- Complete support for widely used normalization layers: Batch Normalization, Instance Normalization, Layer Normalization, and Group Normalization.
- Covers 90.6% of the models in Torchvision v0.13.1.
- Improved compatibility with PyTorch<=1.8.
- Bug fixes in GroupNormPruner and the benchmarks
v1.1.2: Automatic pruning of unwrapped nn.Parameters
v1.1.1
- Torchvision Compatibility: https://github.com/VainF/Torch-Pruning/tree/master/benchmarks/prunability
- Reshape Support: Support for common reshape operations such as .view, .reshape, and .flatten.
- User-Friendly Interfaces: Easy-to-use interfaces for users of varying skill levels.
- Robustness: Improved the robustness of Torch-Pruning
- Yolov7: https://github.com/VainF/Torch-Pruning/tree/master/benchmarks/prunability/yolov7_detect_pruned.py
- Bug fixes
v1.0.0
Features:
- Channel pruning for CNNs (e.g. ResNet, DenseNet, Deeplab) and Transformers (e.g. ViT)
- High-level pruners: MagnitudePruner, BNScalePruner, GroupPruner, etc.
- Graph tracing and dependency fixing
- Supported modules: Conv, Linear, BatchNorm, LayerNorm, Transposed Conv, PReLU, Embedding, MultiheadAttention, nn.Parameters and customized modules.
- Supported operations: split, concatenation, skip connection, flatten, etc.
- Pruning strategies: Random, L1, L2, etc.
- Low-level pruning functions
- Benchmarks and tutorials