Allowing per-parameters options for the optimizer #85

Open
lorenzosquadrani opened this issue Jul 1, 2021 · 1 comment
Labels: new feature


lorenzosquadrani commented Jul 1, 2021

Hi! I hope I'm not mistaken, but it seems like I cannot set an optimizer with different per-layer learning rates. My model really needs this, and I couldn't find an easy workaround. Maybe you could modify the code so that the argument model.parameters() is not implicitly obtained, but explicitly passed to the function set_optimizer. That way, one could pass something like {params1: lr1, params2: lr2} when needed.
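For reference, plain PyTorch optimizers already accept per-parameter options as a list of parameter groups (a list of dicts, rather than a single dict). A minimal sketch of the behavior being requested:

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

# PyTorch's per-parameter options: one dict per group, each group
# overriding the optimizer defaults (here, a distinct lr per layer).
optimizer = optim.SGD(
    [
        {"params": model[0].parameters(), "lr": 1e-2},  # first layer
        {"params": model[2].parameters(), "lr": 1e-3},  # last layer
    ],
    momentum=0.9,
)
```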

xuyxu (Member) commented Jul 1, 2021

Hi @lorenzosquadrani, thanks for your suggestion. I agree that we should support specifying different learning rates for different model parameters. However, in some ensembles such as gradient boosting, it does not make sense to use model.parameters() in set_optimizer, since the parameters do not exist until the fit method is called.

Could we support this feature request by adding a new utility function?
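One possible shape for such a utility, purely as a sketch (the name make_optimizer and its signature are hypothetical, not existing torchensemble API): a helper invoked after the ensemble's parameters exist, which builds the optimizer from per-submodule options while leaving the current string-based set_optimizer interface untouched.

```python
import torch.nn as nn
import torch.optim as optim

def make_optimizer(model, group_cfgs, optimizer_cls=optim.SGD, **defaults):
    """Hypothetical helper: build an optimizer from per-submodule options.

    group_cfgs maps a submodule name (as listed by model.named_modules())
    to per-group options, e.g. {"lr": 1e-3}. For boosting-style ensembles
    this must run after fit, once the parameters actually exist.
    """
    modules = dict(model.named_modules())
    groups = [
        {"params": modules[name].parameters(), **cfg}
        for name, cfg in group_cfgs.items()
    ]
    return optimizer_cls(groups, **defaults)

# Usage on a plain module; an ensemble would call this from within fit:
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
opt = make_optimizer(
    model,
    {"0": {"lr": 1e-2}, "2": {"lr": 1e-3}},  # per-layer learning rates
    lr=1e-3,  # default for anything a group does not override
    momentum=0.9,
)
```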

xuyxu added the new feature label on Jul 1, 2021