# Bayesian Parameter-Efficient Fine-Tuning for Overcoming Catastrophic Forgetting

This repository contains the source code for the following paper by Haolin Chen and Philip N. Garner:

```bibtex
@misc{chen2024bayesian,
      title={Bayesian Parameter-Efficient Fine-Tuning for Overcoming Catastrophic Forgetting},
      author={Haolin Chen and Philip N. Garner},
      year={2024},
      eprint={2402.12220},
      archivePrefix={arXiv},
      primaryClass={eess.AS}
}
```

It comprises three components:

  1. `peft`: a customized Python package based on Hugging Face PEFT version 0.6.0. It implements the Bayesian transfer learning techniques with LoRA and supports the language modeling experiments (see the usage sketch after this list).
  2. `lm`: code and scripts for the language modeling experiments, adapted from Hugging Face Transformers version 4.34.0. It depends on the customized `peft` package.
  3. `tts`: code and scripts for the speech synthesis experiments, based on the official implementation of StyleTTS 2.
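
As a rough orientation, the sketch below shows how a LoRA adapter is attached to a pretrained model with the standard Hugging Face PEFT API that the customized `peft` package is based on. The model name and hyperparameter values are illustrative placeholders, and the Bayesian-specific options added by this repository are not shown; refer to the `peft` and `lm` READMEs for the actual configuration.

```python
# Minimal sketch using the standard Hugging Face PEFT 0.6.0 API that the
# customized `peft` package extends. The model name and hyperparameters are
# placeholders, not the settings used in the paper.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, TaskType

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,               # LoRA rank (placeholder value)
    lora_alpha=16,     # LoRA scaling factor (placeholder value)
    lora_dropout=0.1,
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the LoRA parameters are trainable
```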

Please refer to the README.md in each directory for instructions.

Audio samples from the speech synthesis experiments are available.