Questions about KAdaptation implementation #6

Open
vishaal27 opened this issue Sep 10, 2023 · 1 comment

@vishaal27

Hi, thanks for the great work and for releasing the code to reproduce it.

I have a few questions regarding the Kronecker adaptation forward pass through the adapter modules:

(1) The scaling factor you use for the KAdaptation is 5 times the scaling used in standard LoRA:

scale_factor = self.lora_attn_alpha / self.lora_attn_dim * 5

Is there a justification for this, or is it simply an empirical magic number?
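
For reference, here is a minimal sketch (hypothetical names, not the repo's code) of where such a scale factor would enter a LoRA-style forward pass; standard LoRA scales the low-rank update by alpha / r, and the quoted line additionally multiplies by 5:

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    # Hypothetical LoRA-style layer; only illustrates the scaling, not KAdaptation.
    def __init__(self, in_features, out_features, r=4, alpha=8, extra_scale=5.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.02)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))  # zero init: no update at start
        self.scale_factor = alpha / r * extra_scale  # mirrors the quoted line

    def forward(self, x):
        # frozen base projection plus the scaled low-rank update
        return x @ self.weight.T + self.scale_factor * (x @ self.lora_A.T @ self.lora_B.T)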

(2) While forwarding through your adapter for the value matrix, it seems that you reuse the query weight matrix (the matrix A as defined in the paper, as I understand it). Is this a typo/bug?

"Perform kronecker adaptation to Q and K matrices"
if matrix == 'q':
if self.factorized_phm_rule:
phm_rule1 = torch.bmm(self.phm_rule1_left, self.phm_rule1_right)
H = kronecker_product_einsum_batched(phm_rule1, Wq).sum(0)
elif matrix == 'v':
if self.factorized_phm_rule:
phm_rule2 = torch.bmm(self.phm_rule2_left, self.phm_rule2_right)
H = kronecker_product_einsum_batched(phm_rule2, Wq).sum(0)

Shouldn't line 580 be H = kronecker_product_einsum_batched(phm_rule2, Wv).sum(0) instead?
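
For context, here is a minimal sketch of what a function with this name plausibly computes, under the shapes implied by the snippet (the actual repo implementation may differ):

import torch

def kronecker_product_einsum_batched(A: torch.Tensor, B: torch.Tensor) -> torch.Tensor:
    # Batched Kronecker product: A is (n, a, b), B is (n, p, q) -> (n, a*p, b*q),
    # with (A_i ⊗ B_i)[x*p + k, y*q + l] = A_i[x, y] * B_i[k, l].
    n, a, b = A.shape
    _, p, q = B.shape
    out = torch.einsum('nab,npq->napbq', A, B)
    return out.reshape(n, a * p, b * q)

# In the quoted code, .sum(0) then collapses the n Kronecker summands into one
# update matrix: H = sum_i rule_i ⊗ W_i.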

@jkooy
Collaborator

jkooy commented Sep 17, 2023

Hi, many thanks for your interest! The scaling factor is a hyperparameter; you can adjust it manually, but in my experience it does not affect performance much. For the value matrix, we actually share the same decomposition, which is why it is reused.
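
In other words (a minimal sketch with made-up shapes, based on this reply), the Q and V updates differ only in their phm_rule factors and deliberately reuse the same Wq:

import torch

n, a, b, p, q = 4, 2, 2, 32, 32    # made-up sizes: n Kronecker summands
phm_rule1 = torch.randn(n, a, b)   # factor specific to the Q update
phm_rule2 = torch.randn(n, a, b)   # factor specific to the V update
Wq = torch.randn(n, p, q)          # shared factor, deliberately reused for V

def kron_sum(rule, W):
    # H = sum_i rule_i ⊗ W_i, as in the quoted forward pass
    n, a, b = rule.shape
    _, p, q = W.shape
    return torch.einsum('nab,npq->napbq', rule, W).reshape(n, a * p, b * q).sum(0)

H_q = kron_sum(phm_rule1, Wq)  # query update
H_v = kron_sum(phm_rule2, Wq)  # value update: same Wq, different phm_rule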
