Parameter Efficient Adaptation for Image Restoration with Heterogeneous Mixture-of-Experts

Hang Guo, Tao Dai, Yuanchao Bai, Bin Chen, Xudong Ren, Zexuan Zhu, Shu-Tao Xia

Abstract: Designing single-task image restoration models for specific degradations has seen great success in recent years. To achieve generalized image restoration, all-in-one methods have recently been proposed and shown the potential to handle multiple restoration tasks with a single model. Despite the promising results, the existing all-in-one paradigm still suffers from high computational costs as well as limited generalization on unseen degradations. In this work, we introduce an alternative solution to improve the generalization of image restoration models. Drawing inspiration from recent advancements in Parameter Efficient Transfer Learning (PETL), we aim to tune only a small number of parameters to adapt pre-trained restoration models to various tasks. However, current PETL methods fail to generalize across varied restoration tasks due to the homogeneous nature of their representations. To this end, we propose AdaptIR, a Mixture-of-Experts (MoE) with an orthogonal multi-branch design that captures local spatial, global spatial, and channel representation bases, followed by an adaptive base combination to obtain heterogeneous representations for different degradations. Extensive experiments demonstrate that our AdaptIR achieves stable performance on single-degradation tasks and excels on hybrid-degradation tasks, while fine-tuning only 0.6% of the parameters for 8 hours.
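To make the idea above concrete, here is a minimal, self-contained sketch of a three-branch adapter with local spatial, global spatial, and channel bases plus an adaptive combination. The layer choices, sizes, and the AdaptIRSketch name are illustrative assumptions, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class AdaptIRSketch(nn.Module):
    """Illustrative three-branch adapter; NOT the paper's exact design."""

    def __init__(self, dim: int):
        super().__init__()
        # Local spatial basis: depthwise 3x3 conv (each channel sees a local window).
        self.local = nn.Conv2d(dim, dim, 3, padding=1, groups=dim)
        # Global spatial basis: per-position scores for pooling the whole spatial extent.
        self.global_score = nn.Conv2d(dim, 1, 1)
        # Channel basis: squeeze-and-excitation-style channel gating.
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(dim, dim, 1), nn.Sigmoid()
        )
        # Learned adaptive combination of the three bases.
        self.gate = nn.Parameter(torch.ones(3))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local = self.local(x)
        # Softmax over all spatial positions -> globally pooled feature, broadcast back.
        attn = torch.softmax(self.global_score(x).flatten(2), dim=-1).view(b, 1, h, w)
        global_ = (x * attn).sum(dim=(2, 3), keepdim=True).expand(-1, -1, h, w)
        chan = x * self.channel(x)
        wts = torch.softmax(self.gate, dim=0)
        return wts[0] * local + wts[1] * global_ + wts[2] * chan

# Usage, e.g. on features from a frozen backbone:
# y = AdaptIRSketch(64)(torch.randn(1, 64, 48, 48))
```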

⭐ If this work is helpful for you, please help star this repo. Thanks! 🤗

👀 Visual Results On Different Restoration Tasks

🆕 News

  • 2023-12-12: arXiv paper available.
  • 2023-12-16: This repo is released.
  • 2024-09-28: 😊 Our AdaptIR was accepted by NeurIPS 2024!
  • 2024-10-19: 🔈 The code is available now, enjoy yourself!
  • 2025-01-13: Updated the README file with detailed instructions.

☑️ TODO

  • arXiv version
  • Release code
  • More detailed introduction in the README file
  • Further improvements

🥇 Results

We achieve state-of-the-art adaptation performance on various downstream image restoration tasks. Detailed results can be found in the paper.

  • Evaluation on Second-order Degradation (LR4 & Noise30)
  • Evaluation on Classic SR
  • Evaluation on Denoise & DerainL
  • Evaluation on Heavy Rain Streak Removal
  • Evaluation on Low-light Image Enhancement
  • Evaluation on Model Scalability

Datasets & Models Preparation

Datasets

Since this work involves various restoration tasks, you may collect the training and testing datasets you need from existing repos such as BasicSR, Restormer, and PromptIR.

Pre-trained weights

  • IPT: download the IPT_pretrain weights via the link provided in the IPT repo.

  • EDT: download the SRx2x3x4_EDTB_ImageNet200K.pth weights via the link provided in the EDT repo.

Training

Our AdaptIR can adapt the pre-trained models to various unseen downstream tasks, including hybrid degradation (lr4_noise30, lr4_jpeg30), image SR (sr_2, sr_3, sr_4), image denoising (denoise_30, denoise_50), image deraining (derainL, derainH), and low-light image enhancement (low_light).

You can adjust the de_type parameter in the ./options.py file to train a model for the specific downstream task (an illustrative sketch follows below). Note that only the very lightweight AdaptIR module is tuned, so downstream adaptation takes only about 8 hours.
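As a rough illustration of what that option might look like (the exact argument definitions in options.py are assumptions, not the repo's actual code):

```python
# Hypothetical excerpt in the spirit of ./options.py; the real file may differ.
import argparse

parser = argparse.ArgumentParser(description='AdaptIR downstream adaptation')
# Pick one task identifier from those listed above, e.g. 'lr4_noise30',
# 'sr_2', 'denoise_30', 'derainL', 'derainH', or 'low_light'.
parser.add_argument('--de_type', type=str, default='sr_4',
                    help='downstream degradation/task to adapt to')
args = parser.parse_args()
```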

A single RTX 3090 with 24GB of memory is enough for training.

You can simply run the following command to start training, with our default params:

python train.py
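
If de_type is exposed as a command-line flag (an assumption consistent with the sketch above), the task can also be selected at launch, e.g. python train.py --de_type derainH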

Testing

After training, the downstream weights can be found under the ./train_ckpt path. You can load this checkpoint to evaluate performance on the downstream unseen tasks:

python test.py
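
The actual loading logic lives in test.py; as a minimal sketch of the idea, assuming the checkpoint stores a plain state dict and using a hypothetical function name and path:

```python
# Minimal sketch only; names and paths are assumptions, not the repo's actual API.
import torch

def load_downstream_weights(model: torch.nn.Module, ckpt_path: str) -> torch.nn.Module:
    state = torch.load(ckpt_path, map_location='cpu')
    # strict=False: the checkpoint holds only the ~0.6% of parameters that
    # AdaptIR tunes; the rest of the backbone keeps its pre-trained weights.
    model.load_state_dict(state, strict=False)
    return model.eval()

# e.g. model = load_downstream_weights(model, './train_ckpt/derainH.pth')  # hypothetical path
```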

🥰 Citation

Please cite us if our work is useful for your research.

@inproceedings{guoparameter,
  title={Parameter Efficient Adaptation for Image Restoration with Heterogeneous Mixture-of-Experts},
  author={Guo, Hang and Dai, Tao and Bai, Yuanchao and Chen, Bin and Ren, Xudong and Zhu, Zexuan and Xia, Shu-Tao},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems}
}

License

This project is released under the Apache 2.0 license.

Acknowledgement

This code is based on AirNet, IPT and EDT. Thanks for their awesome work.

Contact

If you have any questions, feel free to contact me at [email protected]
