This repo is a PyTorch implementation of M2T: Masking Transformers Twice for Faster Decoding.
The code was last tested on Ubuntu 18.04 LTS with CUDA 11.7, PyTorch 1.9, and Python 3.7. Running it additionally requires the constriction, compressai, and timm libraries.
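One way to install the additional dependencies (assuming the PyPI package names match the library names above):

```bash
pip install constriction compressai timm
```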
python train.py --config config/mt.yaml # --wandb (if you want to use wandb)
Model checkpoints and logs will be saved in ./history/MT.
python train.py --config config/mt.yaml --test-only --eval-dataset-path 'path_to_kodak'
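For reference, evaluation on Kodak is usually reported as PSNR versus bits per pixel; below is a minimal sketch of those two metrics (hypothetical helpers for illustration, not functions from this repo):

```python
import torch
import torch.nn.functional as F

def psnr(x: torch.Tensor, x_hat: torch.Tensor) -> torch.Tensor:
    # Peak signal-to-noise ratio for images scaled to [0, 1].
    mse = F.mse_loss(x_hat.clamp(0, 1), x)
    return -10 * torch.log10(mse)

def bits_per_pixel(num_bits: int, height: int, width: int) -> float:
    # Total bitstream length divided by the number of pixels.
    return num_bits / (height * width)
```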
The red dot is our reproduction at the corresponding distortion lambda (the weight trading distortion against rate in the training loss).
To be released.
We use constriction for the actual entropy coding. Thanks to Fabian Mentzer for clarifying details of the paper.
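For illustration, here is a minimal sketch of range coding with constriction on toy data; the quantized-Gaussian model and the symbol/parameter values are stand-ins for whatever entropy model this repo actually uses:

```python
import numpy as np
import constriction

# Toy example: four symbols with per-symbol Gaussian parameters
# (stand-ins for predicted latent statistics; not this repo's model).
symbols = np.array([3, -2, 0, 5], dtype=np.int32)
means = np.array([2.5, -1.8, 0.1, 4.6], dtype=np.float64)
stds = np.array([1.2, 0.9, 1.5, 2.0], dtype=np.float64)

# Gaussian quantized to integer symbols in [-64, 63].
model = constriction.stream.model.QuantizedGaussian(-64, 63)

encoder = constriction.stream.queue.RangeEncoder()
encoder.encode(symbols, model, means, stds)
compressed = encoder.get_compressed()  # numpy array of uint32 words

decoder = constriction.stream.queue.RangeDecoder(compressed)
decoded = decoder.decode(model, means, stds)  # decodes len(means) symbols
assert np.all(decoded == symbols)
```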