An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
Embedded and mobile deep learning research resources
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
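The paper above bridges the capacity gap between a large teacher and a small student with a mid-sized "teacher assistant," distilling teacher → assistant → student, where each step uses the usual soft-label distillation loss. A minimal NumPy sketch of that loss follows; the temperature value and the T² gradient-scaling convention are assumptions from standard Hinton-style distillation, not details taken from the repository:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Mean KL divergence KL(teacher || student) at temperature T,
    scaled by T^2 (the soft-label term of Hinton-style distillation)."""
    p = softmax(teacher_logits, T)  # teacher's softened targets
    q = softmax(student_logits, T)  # student's softened predictions
    batch = p.shape[0]
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T / batch)

# Toy batch of logits for a 2-example, 5-class problem.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((2, 5))
student = rng.standard_normal((2, 5))
loss = distillation_loss(student, teacher)
```

In the teacher-assistant scheme, the same loss is applied twice: once to train the assistant against the teacher, then again to train the student against the assistant.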
[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"
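Deep k-Means builds on classic k-means weight sharing: cluster a layer's weights and store only a small codebook of centroids plus per-weight indices. A generic 1-D k-means sharing sketch in NumPy; this illustrates plain weight sharing only, not the paper's re-training objective or its "harder" cluster assignments:

```python
import numpy as np

def kmeans_share(W, k=16, iters=20, seed=0):
    """Cluster the entries of W into k centroids and replace each weight
    by its nearest centroid. Storage drops to a k-entry codebook plus a
    log2(k)-bit index per weight."""
    rng = np.random.default_rng(seed)
    w = W.ravel()
    centroids = rng.choice(w, size=k, replace=False)  # init from the weights
    for _ in range(iters):
        assign = np.abs(w[:, None] - centroids[None, :]).argmin(axis=1)
        for j in range(k):
            members = w[assign == j]
            if members.size:                 # keep old centroid if cluster empties
                centroids[j] = members.mean()
    assign = np.abs(w[:, None] - centroids[None, :]).argmin(axis=1)
    W_shared = centroids[assign].reshape(W.shape)
    return W_shared, centroids, assign.reshape(W.shape)

rng = np.random.default_rng(1)
W = rng.standard_normal((32, 32))
W_shared, codebook, indices = kmeans_share(W, k=16)
```

With k=16, each weight index fits in 4 bits, an 8x reduction over 32-bit floats before any entropy coding.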
This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
Tools and libraries to run neural networks in Minecraft ⛏️
Hyperspectral CNN compression and band selection
Bayesian Optimization-Based Global Optimal Rank Selection for Compression of Convolutional Neural Networks, IEEE Access
Compact representations of convolutional neural networks via weight pruning and quantization
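The pruning-plus-quantization recipe named above can be sketched as magnitude pruning followed by symmetric per-tensor INT8 quantization. The sparsity level, rounding scheme, and scale choice here are illustrative assumptions, not any particular repository's implementation:

```python
import numpy as np

def prune_magnitude(W, sparsity):
    """Zero out the smallest-magnitude entries until roughly `sparsity`
    fraction of W is zero."""
    k = int(W.size * sparsity)
    if k == 0:
        return W.copy()
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]  # k-th smallest |w|
    return np.where(np.abs(W) > thresh, W, 0.0)

def quantize_uniform(W, bits=8):
    """Symmetric uniform quantization: map W to signed integers in
    [-(2^(bits-1)-1), 2^(bits-1)-1] with a single per-tensor scale."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(W).max() / qmax
    q = np.clip(np.round(W / scale), -qmax, qmax).astype(np.int8)
    return q, scale

rng = np.random.default_rng(2)
W = rng.standard_normal((64, 64))
W_pruned = prune_magnitude(W, sparsity=0.9)
q, scale = quantize_uniform(W_pruned)
W_hat = q.astype(np.float32) * scale  # dequantized reconstruction
```

The two steps compose naturally: pruning makes most indices zero (compressible with sparse formats), and the survivors are stored as 8-bit integers plus one float scale.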
[ICLR 2022] "Audio Lottery: Speech Recognition Made Ultra-Lightweight, Noise-Robust, and Transferable", by Shaojin Ding, Tianlong Chen, Zhangyang Wang
Code for our WACV 2021 paper "Exploiting the Redundancy in Convolutional Filters for Parameter Reduction"
[ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective
ESPN: Extreme Sparse Pruned Network
Neural Network Pruning Using Dependency Measures
Compressed CNNs for airplane classification in satellite images (APoZ-based parameter pruning, INT8 weight quantization)
Code for testing DCT plus Sparse (DCTpS) networks
Use a meta-network to learn the importance and correlation of neural network weights
Neural network compression with SVD
Official PyTorch implementation of "Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming" (ICML'23)