Neural Network Pruning Using Dependency Measures
Neural network compression with SVD
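As a rough illustration of SVD-based compression in general (a generic sketch, not this repository's code): a weight matrix can be factored as W ≈ U S Vᵀ and truncated to rank r, replacing one linear layer with two smaller ones. The function name and the chosen rank below are arbitrary.

```python
# Minimal sketch of rank-r SVD compression of a linear layer (PyTorch).
# Illustrative only; names and the rank are arbitrary choices.
import torch
import torch.nn as nn

def svd_compress_linear(layer: nn.Linear, rank: int) -> nn.Sequential:
    W = layer.weight.data                       # shape: (out_features, in_features)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    U_r = U[:, :rank] * S[:rank]                # fold singular values into U
    V_r = Vh[:rank, :]
    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
    first.weight.data = V_r
    second.weight.data = U_r
    if layer.bias is not None:
        second.bias.data = layer.bias.data
    return nn.Sequential(first, second)

layer = nn.Linear(512, 512)
compressed = svd_compress_linear(layer, rank=64)  # ~4x fewer parameters
```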
Compressed CNNs for airplane classification in satellite images (APoZ-based parameter pruning, INT8 weight quantization)
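APoZ (Average Percentage of Zeros) scores a filter by how often its post-ReLU activations are zero; high-APoZ filters contribute little and are candidates for removal. A minimal sketch of the scoring step, with assumed tensor shapes and names (not this repository's code):

```python
# Sketch: score conv filters by Average Percentage of Zeros (APoZ)
# over a batch of activations; high-APoZ filters are pruning candidates.
import torch

@torch.no_grad()
def apoz_scores(activations: torch.Tensor) -> torch.Tensor:
    # activations: post-ReLU feature maps, shape (N, C, H, W)
    zeros = (activations == 0).float()
    return zeros.mean(dim=(0, 2, 3))            # one score per channel

acts = torch.relu(torch.randn(32, 64, 14, 14))  # dummy feature maps
scores = apoz_scores(acts)
prune_candidates = torch.argsort(scores, descending=True)[:8]
```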
Use a meta-network to learn the importance and correlation of neural network weights
Code for reproducing the results in the NNCodec ICML Workshop paper, plus a demo prepared for the Neural Compression Workshop (NCW).
Master's thesis project: https://davidturner94.github.com/nncompression
ESPN: Extreme Sparse Pruned Network
Implementation of various neural network pruning methods in PyTorch.
Compact representations of convolutional neural networks via weight pruning and quantization
Code for our WACV 2021 paper "Exploiting the Redundancy in Convolutional Filters for Parameter Reduction"
Official PyTorch implementation of "Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming" (ICML'23)
Code for testing DCT plus Sparse (DCTpS) networks
Bayesian Optimization-Based Global Optimal Rank Selection for Compression of Convolutional Neural Networks, IEEE Access
[ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective
[ICLR 2022] "Audio Lottery: Speech Recognition Made Ultra-Lightweight, Noise-Robust, and Transferable", by Shaojin Ding, Tianlong Chen, Zhangyang Wang
Hyperspectral CNN compression and band selection
This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
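For context, the lottery ticket procedure prunes by magnitude after training and rewinds the surviving weights to their initial values before retraining. A toy sketch of one round, under assumed names and a placeholder training step (not the paper's code):

```python
# Sketch of one round of lottery-ticket-style magnitude pruning:
# train, mask the smallest weights, rewind survivors to their initial values.
import copy
import torch
import torch.nn as nn

def magnitude_mask(model: nn.Module, sparsity: float) -> dict:
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:                          # prune weight matrices only
            k = int(sparsity * p.numel())
            threshold = p.abs().flatten().kthvalue(k).values
            masks[name] = (p.abs() > threshold).float()
    return masks

model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
init_state = copy.deepcopy(model.state_dict())   # save the initialization
# ... train `model` here ...
masks = magnitude_mask(model, sparsity=0.8)
model.load_state_dict(init_state)                # rewind to init
with torch.no_grad():
    for name, p in model.named_parameters():
        if name in masks:
            p.mul_(masks[name])                  # apply the winning-ticket mask
```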
Tools and libraries to run neural networks in Minecraft ⛏️
[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"
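The weight-sharing idea behind deep k-means clusters a layer's weights so each weight stores only a cluster index into a small codebook of centroids. A toy sketch of the clustering step using plain scikit-learn k-means (not the paper's re-training procedure):

```python
# Toy sketch: k-means weight sharing for one weight tensor.
# Stores k centroids plus one small index per weight instead of full floats.
import numpy as np
from sklearn.cluster import KMeans

def kmeans_share(weights: np.ndarray, k: int = 16):
    flat = weights.reshape(-1, 1)
    km = KMeans(n_clusters=k, n_init=10).fit(flat)
    codebook = km.cluster_centers_.ravel()       # k shared values
    indices = km.labels_                         # per-weight cluster index
    shared = codebook[indices].reshape(weights.shape)
    return shared, codebook, indices

W = np.random.randn(256, 256).astype(np.float32)
W_shared, codebook, idx = kmeans_share(W, k=16)  # 4-bit indices + 16 floats
```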