Hi! I hope I'm not mistaken, but it seems I cannot set an optimizer with different per-layer learning rates. My model really needs this, and I couldn't find an easy workaround. Maybe you could modify the code so that the argument model.parameters() is not implicitly obtained but explicitly passed to set_optimizer. That way one could pass per-parameter groups, e.g. {params1: lr1, params2: lr2}, if needed.
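For reference, PyTorch optimizers already accept a list of parameter groups, each with its own learning rate, so this is the kind of object one would want to forward to set_optimizer. A minimal sketch with a toy model (the model and layer choices here are purely illustrative):

```python
import torch
import torch.nn as nn

# A toy two-layer model, just to illustrate the idea.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))

# PyTorch optimizers accept a list of parameter groups instead of a
# single model.parameters() iterable; each group can carry its own
# learning rate (and other hyperparameters).
optimizer = torch.optim.Adam(
    [
        {"params": model[0].parameters(), "lr": 1e-3},  # first layer
        {"params": model[2].parameters(), "lr": 1e-4},  # last layer
    ]
)
```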
Hi @lorenzosquadrani, thanks for your suggestion. I agree that we should support specifying different learning rates for different model parameters. However, for some ensembles such as gradient boosting, it makes no sense to use model.parameters() in set_optimizer, since the parameters do not exist until the fit method is called.
Could we support this feature request by adding a new utility function?
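A minimal sketch of what such a utility could look like, assuming it is invoked lazily inside fit() once each base estimator's parameters exist. The helper name make_param_groups, the "head" submodule convention, and the learning-rate values are all hypothetical, not part of the library:

```python
import torch
import torch.nn as nn

def make_param_groups(estimator, base_lr=1e-3, head_lr=1e-4):
    """Build per-layer parameter groups for one base estimator.

    Hypothetical helper: intended to be called lazily inside fit(),
    after the estimator's parameters have been created, which avoids
    the problem that model.parameters() is unavailable earlier.
    """
    groups = []
    for name, module in estimator.named_children():
        # Assumed convention: a submodule named "head" gets a smaller lr.
        lr = head_lr if name == "head" else base_lr
        groups.append({"params": module.parameters(), "lr": lr})
    return groups

# Toy demonstration with a stand-in base estimator.
estimator = nn.Sequential()
estimator.add_module("body", nn.Linear(10, 32))
estimator.add_module("head", nn.Linear(32, 1))

optimizer = torch.optim.Adam(make_param_groups(estimator))
for group in optimizer.param_groups:
    print(group["lr"])  # 0.001 for "body", 0.0001 for "head"
```

One possible design is for the ensemble to store such a callback instead of a fixed optimizer configuration, and evaluate it per estimator inside fit(); that keeps set_optimizer usable even when no parameters exist yet.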