This is an official implementation of the paper "Toward INT4 Fixed-Point Training via Exploring Quantization Error for Gradients", accepted to ECCV 2024.
For more information, check out the project website [website].
- Python >= 3.6
- PyTorch >= 1.8.0
You can adjust the bit-widths of the forward and backward passes in `models/modules.py`. A hedged sketch of what such settings might look like is shown below.
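The snippet below is a minimal, illustrative sketch only; the names `FORWARD_BITS`, `BACKWARD_BITS`, and `quantize_uniform` are assumptions and may not match the actual code in `models/modules.py`.

```python
import torch

# Hypothetical bit-width settings; the actual variables in models/modules.py may differ.
FORWARD_BITS = 4   # bit-width for weights/activations in the forward pass
BACKWARD_BITS = 4  # bit-width for gradients in the backward pass


def quantize_uniform(x: torch.Tensor, num_bits: int) -> torch.Tensor:
    """Symmetric uniform quantization of a tensor to `num_bits` bits (illustrative only)."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = x.abs().max().clamp(min=1e-8) / qmax
    return torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale
```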
To start training, run:
python train.py --config configs/resnet20_cifar100.yml
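During training, the backward pass quantizes gradients to the configured low bit-width. The sketch below illustrates the general idea with a custom autograd function; `GradQuant` is a hypothetical name, it uses plain deterministic rounding, and it is not the authors' actual scheme from the paper.

```python
import torch


class GradQuant(torch.autograd.Function):
    """Identity in the forward pass; quantizes the incoming gradient in the backward pass."""

    @staticmethod
    def forward(ctx, x, num_bits=4):
        ctx.num_bits = num_bits
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Symmetric uniform quantization of the gradient (illustrative only).
        qmax = 2 ** (ctx.num_bits - 1) - 1
        scale = grad_output.abs().max().clamp(min=1e-8) / qmax
        grad_q = torch.clamp(torch.round(grad_output / scale), -qmax - 1, qmax) * scale
        return grad_q, None


# Usage (hypothetical): y = GradQuant.apply(x, 4) inside a module's forward pass.
```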
@inproceedings{kim2024toward,
  author    = {Kim, Dohyung and Lee, Junghyup and Jeon, Jeimin and Moon, Jaehyeon and Ham, Bumsub},
  title     = {Toward INT4 Fixed-Point Training via Exploring Quantization Error for Gradients},
  booktitle = {European Conference on Computer Vision},
  year      = {2024},
}
- ResNet-20 model: [ResNet on CIFAR100]
- Quantized modules: [DSQ]