
Bayesian Parameter-Efficient Fine-Tuning for Overcoming Catastrophic Forgetting

This repository contains the source code for the following paper by Haolin Chen and Philip N. Garner:

@misc{chen2024bayesian,
      title={Bayesian Parameter-Efficient Fine-Tuning for Overcoming Catastrophic Forgetting}, 
      author={Haolin Chen and Philip N. Garner},
      year={2024},
      eprint={2402.12220},
      archivePrefix={arXiv},
      primaryClass={eess.AS}
}

It comprises three components:

  1. peft: a customized Python package based on Hugging Face PEFT version 0.6.0. It implements the Bayesian transfer learning techniques on top of LoRA and supports the language modeling experiments (a minimal LoRA usage sketch follows this list).
  2. lm: code and scripts for the language modeling experiments, adapted from Hugging Face Transformers version 4.34.0. This component depends on the customized peft package.
  3. tts: code and scripts for the speech synthesis experiments, based on the official implementation of StyleTTS 2.
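
Since the customized peft package is built on Hugging Face PEFT 0.6.0, attaching a LoRA adapter follows the upstream PEFT interface. The sketch below is a minimal illustration using the standard PEFT 0.6.0 API only; the base model name (roberta-base) and the hyperparameters are illustrative, and the Bayesian-specific configuration options added by this repository are defined in the customized package and documented in its README.

    # Minimal sketch: attaching a LoRA adapter via the standard Hugging Face
    # PEFT 0.6.0 API, on which the customized package in ./peft is based.
    # The Bayesian extensions provided by this repository are not shown here.
    from transformers import AutoModelForSequenceClassification
    from peft import LoraConfig, TaskType, get_peft_model

    base_model = AutoModelForSequenceClassification.from_pretrained(
        "roberta-base", num_labels=2
    )

    lora_config = LoraConfig(
        task_type=TaskType.SEQ_CLS,         # sequence classification task
        r=8,                                 # LoRA rank (illustrative value)
        lora_alpha=16,                       # LoRA scaling factor
        lora_dropout=0.1,
        target_modules=["query", "value"],   # attach adapters to attention projections
    )

    model = get_peft_model(base_model, lora_config)
    model.print_trainable_parameters()       # only the LoRA parameters are trainable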

Please refer to the README.md in each directory for instructions.

Audio samples are available.
