wsxtyrdd/NTSCC_plus

Python >=3.7 PyTorch >=1.7

Improved Nonlinear Transform Source-Channel Coding to Catalyze Semantic Communications [pdf]

This is the repository of the paper "Improved Nonlinear Transform Source-Channel Coding to Catalyze Semantic Communications".

Pipeline

Evaluation on the Kodak dataset and the CLIC 2021 test set

Requirements

Clone the repo and create a conda environment (we use PyTorch 1.9, CUDA 11.1).

The dependencies include CompressAI, Natten, and timm.
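The setup above can be sketched as follows. The environment name and the exact package versions are illustrative assumptions (only PyTorch 1.9 / CUDA 11.1 is stated by the authors), not pinned by the repository:

```shell
# Create and activate a conda environment (name "ntscc_plus" is an assumption)
conda create -n ntscc_plus python=3.7 -y
conda activate ntscc_plus

# PyTorch 1.9 built against CUDA 11.1, as used by the authors
pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 \
    -f https://download.pytorch.org/whl/torch_stable.html

# Dependencies listed above
pip install compressai natten timm
```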

Trained Models

Download the pre-trained models from Google Drive or Baidu Netdisk (extraction code: deye).

Note: The code has been reorganized, so the results may differ slightly from those reported in the paper.

Train & Evaluate & Compress & Decompress

Train:

sh scripts/pretrain.sh 0.3
sh scripts/train.sh [tradeoff_lambda(e.g. 0.02)]
(You may use your own dataset by modifying the train/test data path.)

Evaluate:

# Kodak
sh scripts/test.sh [/path/to/kodak] [model_path]
(sh test_parallel.sh [/path/to/kodak] [model_path])

Compress:

sh scripts/compress.sh [original.png] [model_path]
(sh compress_parallel.sh [original.png] [model_path])

Decompress:

sh scripts/decompress.sh [original.bin] [model_path]
(sh decompress_parallel.sh [original.bin] [model_path])

Acknowledgement

Codebase from CompressAI, TinyLIC, and Swin Transformer

Citation

If you find this code useful for your research, please cite our paper:

@article{wang2023improved,
      title={Improved Nonlinear Transform Source-Channel Coding to Catalyze Semantic Communications},
      author={Sixian Wang and Jincheng Dai and Xiaoqi Qin and Zhongwei Si and Kai Niu and Ping Zhang},
      journal={IEEE Journal of Selected Topics in Signal Processing},
      note={early access},
      year={2023},
}
