adding_some_attention_modules #117

Open · wants to merge 1 commit into master
Conversation

@gongyan1 commented on Jun 26, 2024

Title

Update Attention Modules

Description

Purpose

This update introduces six new attention modules, broadening the attention mechanisms available in the project and making models more versatile and adaptable.

Changes

  • New Attention Modules:

    • Frequency Channel Attention: Enhances channel attention using frequency-domain (DCT) information rather than plain global average pooling.
    • Attention Augmented Convolutional Networks: Augments convolutional feature maps with multi-head self-attention to improve feature extraction.
    • Global Context Attention: Incorporates a lightweight global context block to strengthen the model's long-range perception.
    • Linear Context Transform Attention: Strengthens the use of context information through a linear transform of channel statistics.
    • Gated Channel Transformation: Introduces a gating mechanism that dynamically adjusts the importance of each channel (a minimal sketch follows this list).
    • Gaussian Context Attention: Uses a Gaussian function over normalized channel statistics to model contextual relationships (a minimal sketch follows this list).
  • Readme Adjustments:

    • Made minor formatting adjustments to the Readme file to improve document readability.
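
To make the list above more concrete, here is a minimal PyTorch sketch of two of the new blocks, Gated Channel Transformation and Gaussian Context Attention. These are simplified re-implementations written for this description, not the code added by this PR; the class names, the Gaussian width `c`, and the epsilon values are illustrative defaults.

```python
import torch
import torch.nn as nn


class GatedChannelTransform(nn.Module):
    """Sketch of GCT-style channel gating (illustrative, not the PR's code)."""

    def __init__(self, channels, eps=1e-5):
        super().__init__()
        self.alpha = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.gamma = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.eps = eps

    def forward(self, x):
        # Global context embedding: per-channel L2 norm over spatial positions.
        embedding = (x.pow(2).sum(dim=(2, 3), keepdim=True) + self.eps).sqrt() * self.alpha
        # Channel normalization: compare each channel's embedding to the others.
        norm = self.gamma / (embedding.pow(2).mean(dim=1, keepdim=True) + self.eps).sqrt()
        # Gate centred at 1 so the block starts out close to an identity mapping.
        gate = 1.0 + torch.tanh(embedding * norm + self.beta)
        return x * gate


class GaussianContextAttention(nn.Module):
    """Sketch of Gaussian-function channel attention over pooled context."""

    def __init__(self, c=2.0, eps=1e-5):
        super().__init__()
        self.c = c    # width of the Gaussian excitation (assumed default)
        self.eps = eps

    def forward(self, x):
        # Global average pooling gives a per-channel context vector.
        context = x.mean(dim=(2, 3), keepdim=True)
        # Normalize the context across channels (zero mean, unit variance).
        mean = context.mean(dim=1, keepdim=True)
        std = context.std(dim=1, keepdim=True) + self.eps
        normed = (context - mean) / std
        # Gaussian excitation: channels near the mean receive weights close to 1.
        attn = torch.exp(-normed.pow(2) / (2 * self.c ** 2))
        return x * attn
```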

Impact

  • A wider selection of attention modules gives community members more flexible options for model construction, both for everyday use and for research.

Testing

  • All newly introduced attention modules were tested locally in detail and performed as expected (an illustrative shape check is sketched below).
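
For reference, a basic local check for blocks like these can be a shape-preservation test along the lines of the snippet below. This is only an illustrative example using the sketch classes above, not the actual test script used for this PR.

```python
import torch

x = torch.randn(2, 64, 32, 32)  # (batch, channels, height, width)
for block in (GatedChannelTransform(64), GaussianContextAttention()):
    y = block(x)
    # Channel-attention blocks only reweight features, so the shape must be unchanged.
    assert y.shape == x.shape
```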

Checklist

  • Confirm that all newly introduced modules have been thoroughly tested.
  • Ensure that the Readme file has been accurately updated without omissions.

Notes for Reviewers

  • Please pay special attention to the implementation code of the newly introduced attention modules.
  • Suggestions for further improvements to the Readme formatting are welcome.
