[IJCNN 2024] Implicit Multi-Spectral Transformer: A Lightweight and Effective Visible to Infrared Image Translation Model

Yijia Chen, Pinghua Chen 📮, Xiangxin Zhou, Yingtie Lei, Ziyang Zhou, Mingxian Li (📮 Corresponding Author)

Guangdong University of Technology, University of Macau, Huizhou University

In the International Joint Conference on Neural Networks 2024 (IJCNN 2024)

⚙️ Usage

Installation

git clone https://github.com/CXH-Research/IRFormer.git
cd IRFormer
pip install -r requirements.txt

Training

Please first specify TRAIN_DIR, VAL_DIR and SAVE_DIR in the TRAINING section of traning.yml.
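
As a rough illustration only (the exact keys and layout in traning.yml may differ; the paths below are placeholders), the TRAINING section could look like:

TRAINING:
  TRAIN_DIR: /path/to/train_data   # paired visible/infrared training images
  VAL_DIR: /path/to/val_data       # validation images used during training
  SAVE_DIR: checkpoints/           # where model weights and logs are saved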

For single-GPU training:

python train.py

For multi-GPU training:

accelerate config
accelerate launch train.py

If you have difficulties using accelerate, please refer to the Accelerate documentation.
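
If you prefer to skip the interactive accelerate config step, accelerate launch also accepts its launch options directly on the command line. As an illustration (assuming a single machine with two GPUs):

accelerate launch --multi_gpu --num_processes 2 train.py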

Inference

Please first specify TRAIN_DIR, VAL_DIR and SAVE_DIR in the TESTING section of traning.yml.

python test.py
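
Analogously to training, a minimal sketch of the TESTING section (again, placeholder paths and an assumed key layout; check traning.yml for the actual structure):

TESTING:
  TRAIN_DIR: /path/to/train_data   # placeholder, as referenced above
  VAL_DIR: /path/to/test_data      # visible images to translate
  SAVE_DIR: results/               # where translated infrared outputs are written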

💗 Acknowledgements

This work was supported in part by the Guangdong Provincial Key R&D Programme under Grants No. 2023B1111050010 and No. 2020B0101100001, and in part by the Huizhou Daya Bay Science and Technology Planning Project under Grant No. 2020020003.

🛎 Citation

If you find our work helpful for your research, please cite:
