Please add clarity to code #31
Comments
sure, submit a PR!
so Phil - I love your work - I wish you could go the extra few steps to help out users.
I found this class by François-Guillaume @frgfm, which adds clear math comments.
I want to merge it, but there's a bit of code drift and I don't want to introduce any bugs.
I beseech you to go the extra step to help users bridge from papers to code.
https://github.com/frgfm/Holocron/blob/bcc3ea19a477e4b28dc5973cdbe92a9b05c690bb/holocron/nn/modules/lambda_layer.py
e.g.
Please articulate return types:
def forward(self, x: torch.Tensor) -> torch.Tensor:
Please give clarity on the arguments.
# Project input and context to get queries, keys & values
Throw in some maths as a comment / this is great as it bridges the paper to the code.
B x (num_heads * dim_k) x H x W -> B x num_heads x dim_k x (H * W)
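To make the request concrete, here is a minimal sketch of the annotation style being asked for: a typed signature plus shape comments that mirror the reshape above. This is not the library's actual class; the names (`dim`, `dim_k`, `num_heads`, `to_q`) are assumptions modeled on common lambda-layer implementations.

```python
import torch
from torch import nn


class AnnotatedQueryProjection(nn.Module):
    """Illustrative sketch (not the real LambdaLayer) showing the
    requested style: typed signatures + shape comments."""

    def __init__(self, dim: int, dim_k: int, num_heads: int) -> None:
        super().__init__()
        self.num_heads = num_heads
        self.dim_k = dim_k
        # 1x1 conv projecting input channels to num_heads * dim_k query channels
        self.to_q = nn.Conv2d(dim, num_heads * dim_k, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, _, h, w = x.shape
        q = self.to_q(x)  # B x (num_heads * dim_k) x H x W
        # B x (num_heads * dim_k) x H x W -> B x num_heads x dim_k x (H * W)
        return q.view(b, self.num_heads, self.dim_k, h * w)
```

Comments like these let a reader line up each tensor against the corresponding symbols in the LambdaNetworks paper without re-deriving the shapes by hand.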