An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
Embedded and mobile deep learning research resources
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
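The teacher-assistant approach trains an intermediate-sized network between the large teacher and the small student and distills in two hops. Below is a minimal sketch of the per-hop distillation loss; the temperature and mixing weight are illustrative defaults, not values from the paper.

```python
# Sketch of the distillation loss applied at each hop of the
# teacher -> teacher-assistant -> student chain. T and alpha are placeholders.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft-target KL divergence with the usual cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # scale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```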
[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"
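Several of these compression repos build on k-means weight sharing: cluster a layer's weights and store only a small centroid codebook plus per-weight indices. The sketch below shows plain 1-D k-means over a weight tensor, not the harder cluster assignments proposed in the Deep k-Means paper; the cluster count is a placeholder.

```python
# Plain k-means weight sharing: replace each weight with its cluster centroid
# so only the codebook and per-weight indices need to be stored.
import torch

def kmeans_share_weights(weight: torch.Tensor, k: int = 16, iters: int = 20):
    flat = weight.reshape(-1)
    # Initialize centroids evenly across the weight range.
    centroids = torch.linspace(flat.min().item(), flat.max().item(), k)
    for _ in range(iters):
        assign = (flat[:, None] - centroids[None, :]).abs().argmin(dim=1)
        for j in range(k):
            mask = assign == j
            if mask.any():
                centroids[j] = flat[mask].mean()
    # Quantized weights, codebook, and per-weight cluster indices.
    return centroids[assign].reshape(weight.shape), centroids, assign
```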
Tools and libraries to run neural networks in Minecraft ⛏️
This repository contains code to replicate the experiments in the NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
Hyperspectral CNN compression and band selection
[ICLR 2022] "Audio Lottery: Speech Recognition Made Ultra-Lightweight, Noise-Robust, and Transferable", by Shaojin Ding, Tianlong Chen, Zhangyang Wang
[ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective
Code for testing DCT plus Sparse (DCTpS) networks
Bayesian Optimization-Based Global Optimal Rank Selection for Compression of Convolutional Neural Networks, IEEE Access
Official PyTorch implementation of "Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming" (ICML'23)
Code for our WACV 2021 paper "Exploiting the Redundancy in Convolutional Filters for Parameter Reduction"
Compact representations of convolutional neural networks via weight pruning and quantization
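A minimal sketch of such a pruning-plus-quantization pipeline using PyTorch's built-in utilities is shown below; the toy model, the 50% sparsity target, and dynamic INT8 quantization are illustrative choices, not any particular repo's recipe.

```python
# Illustrative pipeline: global magnitude pruning with torch.nn.utils.prune,
# then dynamic INT8 quantization of the remaining dense layers.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Prune the 50% smallest-magnitude weights across all Linear layers.
params_to_prune = [(m, "weight") for m in model.modules() if isinstance(m, nn.Linear)]
prune.global_unstructured(params_to_prune, pruning_method=prune.L1Unstructured, amount=0.5)
for module, name in params_to_prune:
    prune.remove(module, name)  # make the sparsity permanent

# Quantize the surviving weights to INT8 for a more compact model.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
```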
Implementation of various neural network pruning methods in PyTorch.
ESPN: Extreme Sparse Pruned Network
Compressed CNNs for airplane classification in satellite images (APoZ-based parameter pruning, INT8 weight quantization)
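APoZ (Average Percentage of Zeros) ranks convolutional channels by how often their post-ReLU activations are zero; high-APoZ channels are pruned first. Below is a rough sketch of the scoring step, with a hypothetical layer and data loader, not the code from this repo.

```python
# Sketch of per-channel APoZ scoring for a convolutional layer.
import torch
import torch.nn as nn

@torch.no_grad()
def apoz_scores(model, layer: nn.Conv2d, data_loader, device="cpu"):
    zeros, total = None, 0

    def hook(_, __, out):
        nonlocal zeros, total
        act = torch.relu(out)                      # post-activation feature maps
        z = (act == 0).float().sum(dim=(0, 2, 3))  # zero count per channel
        zeros = z if zeros is None else zeros + z
        total += act.numel() // act.shape[1]       # elements per channel this batch

    handle = layer.register_forward_hook(hook)
    for x, _ in data_loader:
        model(x.to(device))
    handle.remove()
    return zeros / total  # higher APoZ -> stronger pruning candidate
```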
Use a meta-network to learn the importance and correlation of neural network weights
Image classification using compressed deep neural networks ported to resource-constrained platforms.