Dynamic loss function (changes over generations) #162
Very interesting idea, I could definitely see this being useful for regularizations! One difficulty with implementing this is that the loss is recalculated only when expressions change, so some stored losses may be out of date, especially in the hall of fame. Do you want the absolute loss value to change, or just for the search to favor different things over time? If the latter, it will be much easier. You could modify this line: `SymbolicRegression.jl/src/Population.jl`, line 103 at commit `a093714` (the `L2Loss` call).
I got time to think about this. The idea could (?) be used to: 1) pause the search, 2) change the loss metric, 3) re-evaluate the loss of each member in the saved_state, 4) re-evaluate the scores, and 5) then continue.

That would help me re-evaluate all the scores, but when I try to replicate the score of a member I don't get the same value. If all this were possible, a dynamic loss could be done externally without changing too much of the code, just readjusting the population as described, right? And to answer the question: yes, I'm more interested in adjusting the search to favor different things :)
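The pause/change/continue loop described above can be sketched as an external driver that restarts the search with an updated loss while passing the previous state back in. This is only a sketch under assumptions: keyword names such as `saved_state`, `return_state`, and `elementwise_loss` vary between SymbolicRegression.jl versions (older releases used `EquationSearch` and `Options(loss=...)`), and `phase_loss` and the weight schedule here are hypothetical.

```julia
using SymbolicRegression

X = randn(Float32, 3, 100)
y = X[1, :] .^ 2 .- X[2, :]

state = nothing
for alfa in (1.0, 0.5, 0.1)
    # Rebuild the options with the new loss weight for this phase.
    phase_loss(prediction, target) =
        (prediction - target)^2 + alfa * abs(prediction - target)
    options = Options(; elementwise_loss=phase_loss)

    # Resume from the previous populations. Note the caveat above:
    # existing members keep stale losses until their expressions
    # change, so step 3 (re-evaluating saved members) is not automatic.
    global state = equation_search(
        X, y;
        niterations=10,
        options=options,
        saved_state=state,
        return_state=true,
    )
end
```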
Hi Miles!
Most of this idea builds upon the one discussed here:
#92 (comment)
But it is different so I decided to make a new issue.
I've been using the idea of 'custom_loss_functions' and it has worked great so far. It's really problem-specific, so I haven't made any pull requests since I don't see it as a generalizable idea, but this one might be.
Is it possible to make a loss function with a variable hyperparameter that changes after X generations? Something like:

```julia
return L2loss + alfa * custom_loss
```

so that `alfa` changes over generations (perhaps decreasing or increasing), or would this have to be fixed?
So far I've been saving the state and restarting it, changing the loss parameters as needed, but I was wondering whether this could be done from the beginning.
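One hedged way to get a weight that changes over generations without editing the library is a callable struct whose coefficient is mutated by an external schedule. Everything here is hypothetical and not part of the SymbolicRegression.jl API: the `ScheduledLoss` name, the `custom_penalty` stand-in, and the decay factor.

```julia
# Hypothetical sketch: L2 plus a weighted custom term, with the
# weight `alfa` held in a mutable struct so it can be decayed
# (or grown) from outside between batches of generations.
mutable struct ScheduledLoss
    alfa::Float64
end

# Stand-in for the problem-specific penalty term.
custom_penalty(prediction, target) = abs(prediction - target)

# Make the struct callable, matching an elementwise loss signature.
function (sl::ScheduledLoss)(prediction, target)
    return (prediction - target)^2 + sl.alfa * custom_penalty(prediction, target)
end

loss = ScheduledLoss(1.0)
loss(1.5, 1.0)      # evaluate under the current weight
loss.alfa *= 0.9    # decay the weight between restarts
```

As the maintainer points out above, losses already stored in the hall of fame would not be refreshed automatically when `alfa` changes; they only update once expressions are re-evaluated.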