[BUG] Consecutive calls to PSO are not necessarily the same as running with multiple generations #486
Indeed it's a bug. I just noted the equivalence is not tested for pso. Have a look at the test for cmaes here: Line 266 in 16fced2.
A similar test should be added for pso as well, and it should pass. Anything that needs to be remembered from previous generations should indeed be in the algo memory. When memory=true it should be remembered. A PR would indeed be appreciated ...
In case you do open a PR, could you also make the symmetric modifications to pso_gen?
@darioizzo I didn't make the … Also note that I had to make further modifications in order to make all variations of … I'm planning on making similar changes (and adding a similar test) to … Thanks in advance!
According to this discussion, all algorithms should behave the same with `gen=N` vs N calls with `gen=1` (did I understand correctly?). I tried with PSO in pygmo, and couldn't make it work (I will explain better after showing the code):
Results:
pso_run_a: 35.042803177459504
pso_run_b: 14.44962519803876
It seems to be a bug related to the way the population evolves. AFAIU, when returning from the algorithm, PSO will return the best fit (important code here), and then, in the next run, `X` will be in relation to `lbX`. Whereas when running the code iteratively with multiple generations (and assuming that one generation didn't perform better than the previous one), `X` and `lbX` will be different (important code here).

There are several ways of solving this. The simplest, I believe, is to keep an internal `m_X` in the `PSO` algorithm class. What are your thoughts on this? Are you interested in a specific fix just to the PSO algorithm (I can open a PR)?

Thanks in advance!