
Some questions about reproducibility #17

Open
JennyVanessa opened this issue May 12, 2023 · 0 comments
Dear author,

As mentioned in the supplementary materials, the hyperparameters were set as follows:

To train our TANet, we used Microsoft's neural network intelligence (NNI) tuning tool, where the learning rate search space from L1 to L6 was set as [0.000001, 0.0000001, 0.0000003], without any decay rate strategy. Specifically, we set the input size to 40 and the number of training epochs to 200. We used 224 × 224 crops from 256 × 256 fixed images as input.

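For reference, here is a minimal sketch of how the quoted search space might be expressed for NNI. The parameter names (`lr_L1` … `lr_L6`) are my own assumption, since the actual tuning configuration is not included in the repository:

```python
# Hypothetical sketch of the NNI search space quoted above; the key names
# lr_L1 ... lr_L6 are assumptions, not the authors' actual config.
import json
import nni

search_space = {
    f"lr_L{i}": {"_type": "choice", "_value": [1e-6, 1e-7, 3e-7]}
    for i in range(1, 7)
}

# Written out so it could be passed to an NNI experiment as search_space.json.
with open("search_space.json", "w") as f:
    json.dump(search_space, f, indent=2)

# Inside the trial script, NNI would then supply one sampled configuration:
params = nni.get_next_parameter()  # e.g. {"lr_L1": 1e-6, ..., "lr_L6": 3e-7}
```
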
However, without any decay rate strategy, I have found it difficult to reproduce the reported results in a single training run with this particular combination of hyperparameters. I wonder whether you performed multiple training stages, manually adjusting the learning rate between them, to obtain the best results; for example, training for a certain number of epochs with one setting and then continuing from those weights with another, so that the total number of epochs reaches 200, rather than training directly for 200 epochs with a single fixed setting. If so, how were the ablation experiments carried out so that the comparisons remain fair? Additionally, the random seeds used were not provided.

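To make the question concrete, here is a rough sketch of the kind of staged schedule I mean. Everything in it is illustrative: the stage boundaries, learning rates, seed, and the stand-in model and training step are my own placeholders, not your actual setup:

```python
# Illustrative only: stage split, learning rates, seed, and the stand-in model /
# training step are placeholders, not the authors' actual TANet pipeline.
import random
import numpy as np
import torch
import torch.nn as nn

def set_seed(seed: int) -> None:
    # The seeds that would need to be fixed (and reported) for reproducibility.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

set_seed(42)  # hypothetical seed; the supplementary material does not state one

model = nn.Linear(8, 1)  # stand-in for the actual model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-6)

def train_one_epoch(model: nn.Module, optimizer: torch.optim.Optimizer) -> None:
    # Stand-in for the real training loop over the 224 x 224 crops.
    x, y = torch.randn(40, 8), torch.randn(40, 1)  # assuming "input size 40" means batch size
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()

# Stage 1 at the initial learning rate, then continue from the same weights at a
# manually lowered rate, so the epochs sum to the reported 200 in total.
stages = [(120, 1e-6), (80, 3e-7)]  # (epochs, learning rate) -- illustrative split

for num_epochs, lr in stages:
    for group in optimizer.param_groups:
        group["lr"] = lr
    for _ in range(num_epochs):
        train_one_epoch(model, optimizer)
```
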
I would greatly appreciate your insights and guidance on this matter.

Warm regards,

Vanessa
