Use a meta-network to learn the importance and correlation of neural network weights
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
Hyperspectral CNN compression and band selection
Compressed CNNs for airplane classification in satellite images (APoZ-based parameter pruning, INT8 weight quantization)
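APoZ (Average Percentage of Zeros) ranks channels by how often their post-ReLU activations are zero; mostly-inactive channels are candidates for removal. The sketch below is a minimal, hypothetical illustration of that scoring idea in plain Python, not code from the repository above.

```python
# Hypothetical sketch of APoZ (Average Percentage of Zeros) scoring for
# channel pruning: a channel whose post-ReLU activations are mostly zero
# contributes little and can be pruned first. Names/shapes are illustrative.

def apoz(activations):
    """Fraction of zero entries among a channel's post-ReLU activation samples."""
    zeros = sum(1 for a in activations if a == 0.0)
    return zeros / len(activations)

def rank_channels(channel_activations):
    """Return channel indices sorted from most prunable (highest APoZ) down."""
    scores = {ch: apoz(acts) for ch, acts in channel_activations.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Channel 0 is almost always inactive (APoZ 0.75), channel 1 rarely is (0.25).
acts = {0: [0.0, 0.0, 0.0, 0.2], 1: [0.5, 0.1, 0.0, 0.3]}
print(rank_channels(acts))  # channel 0 ranked first
```

In practice the activations would be collected over a validation set and the lowest-ranked channels removed, followed by fine-tuning.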
Master's Thesis Project https://davidturner94.github.com/nncompression
Implementation of various neural network pruning methods in PyTorch.
Code for our WACV 2021 paper "Exploiting the Redundancy in Convolutional Filters for Parameter Reduction"
Bayesian Optimization-Based Global Optimal Rank Selection for Compression of Convolutional Neural Networks, IEEE Access
Code for testing DCT plus Sparse (DCTpS) networks
Neural Network Pruning Using Dependency Measures
[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"
[ICLR 2022] "Audio Lottery: Speech Recognition Made Ultra-Lightweight, Noise-Robust, and Transferable", by Shaojin Ding, Tianlong Chen, Zhangyang Wang
This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
ESPN: Extreme Sparse Pruned Network
Embedded and mobile deep learning research resources
Tools and libraries to run neural networks in Minecraft ⛏️
Compact representations of convolutional neural networks via weight pruning and quantization
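The two steps named above can be illustrated in a few lines: magnitude-based pruning zeroes the smallest weights, and symmetric INT8-style quantization maps the survivors to integers with a shared scale. This is a hedged, self-contained sketch of the general technique, with an illustrative sparsity level and scale rule, not the implementation from any repository listed here.

```python
# Minimal sketch of magnitude pruning followed by symmetric INT8 quantization.
# Threshold choice and scale factor are illustrative assumptions.

def prune_by_magnitude(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with smallest magnitude."""
    k = int(len(weights) * sparsity)
    cutoff = sorted(abs(w) for w in weights)[k - 1] if k else -1.0
    return [0.0 if abs(w) <= cutoff else w for w in weights]

def quantize_int8(weights):
    """Map floats to integers in [-127, 127] using one shared scale factor."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

w = [0.02, -0.9, 0.4, -0.01, 0.7]
pruned = prune_by_magnitude(w, 0.4)  # two smallest-magnitude weights -> 0.0
q, s = quantize_int8(pruned)
print(pruned)  # [0.0, -0.9, 0.4, 0.0, 0.7]
print(q)       # [0, -127, 56, 0, 99]
```

A real pipeline would store the integer tensor plus the scale, and dequantize as `q * s` at inference time; pruned positions can additionally be stored in a sparse format.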
This repository is for reproducing the results shown in the NNCodec ICML Workshop paper. Additionally, it includes a demo, prepared for the Neural Compression Workshop (NCW).
Official PyTorch implementation of "Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming" (ICML'23)